How do I unzip a .zip file in google cloud storage?


Here is some code I created to run as a Firebase Cloud Function. It listens for objects uploaded to a bucket with the content type 'application/zip' and extracts them in place.

const functions = require('firebase-functions');
const admin = require("firebase-admin");
const path = require('path');
const fs = require('fs');
const os = require('os');
const unzip = require('unzipper')

admin.initializeApp();

const storage = admin.storage();


const runtimeOpts = {
  timeoutSeconds: 540,
  memory: '2GB'
}

exports.unzip = functions.runWith(runtimeOpts).storage.object().onFinalize((object) => {

    return new Promise((resolve, reject) => {
        //console.log(object)
        if (object.contentType !== 'application/zip') {
          // Not a zip archive, so there is nothing to extract.
          resolve();
        } else {
          const bucket = storage.bucket(object.bucket)
          const remoteFile = bucket.file(object.name)
          const remoteDir = object.name.replace('.zip', '')

          console.log(`Downloading ${object.name}`)

          remoteFile.createReadStream()
            .on('error', err => {
              console.error(err)
              reject(err);
            })
            .on('response', response => {
              // Server connected and responded with the specified status and headers.
              //console.log(response)
            })
            .on('end', () => {
              // The file is fully downloaded.
              console.log("Finished downloading.")
              resolve();
            })
            .pipe(unzip.Parse())
            .on('entry', entry => {
              if (entry.type === 'Directory') {
                // Directory entries have no contents to upload; discard them.
                entry.autodrain();
                return;
              }

              const file = bucket.file(`${remoteDir}/${entry.path}`)

              // Stream each zip entry straight into a new object in the bucket.
              entry.pipe(file.createWriteStream())
              .on('error', err => {
                console.error(err)
                reject(err);
              })
              .on('finish', () => {
                console.log(`Finished extracting ${remoteDir}/${entry.path}`)
              });

            });
        }
    })

});
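If it helps, here is a hedged sketch of how a function like this is typically deployed with the Firebase CLI; the exact project layout may differ, and the function name unzip simply matches the export above:

# From the Cloud Functions directory of the Firebase project
cd functions
npm install unzipper firebase-admin firebase-functions

# Deploy only this function
firebase deploy --only functions:unzip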

You can use Python, e.g. from a Cloud Function:

from google.cloud import storage
from zipfile import ZipFile
from zipfile import is_zipfile
import io

def zipextract(bucketname, zipfilename_with_path):

    storage_client = storage.Client()
    bucket = storage_client.get_bucket(bucketname)

    destination_blob_pathname = zipfilename_with_path

    blob = bucket.blob(destination_blob_pathname)
    zipbytes = io.BytesIO(blob.download_as_string())

    if is_zipfile(zipbytes):
        with ZipFile(zipbytes, 'r') as myzip:
            for contentfilename in myzip.namelist():
                contentfile = myzip.read(contentfilename)
                blob = bucket.blob(zipfilename_with_path + "/" + contentfilename)
                blob.upload_from_string(contentfile)

zipextract("mybucket", "path/file.zip") # if the file is gs://mybucket/path/file.zip

Unfortunately, there is no built-in mechanism in GCS to unzip files. A feature request for this has already been forwarded to the Google development team.

As an alternative, you can upload the ZIP files to the GCS bucket, download them to a persistent disk attached to a VM instance, unzip them there, and upload the unzipped files back to the bucket using the gsutil tool.
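A minimal sketch of that VM-based workflow, assuming the archive lives at gs://my-bucket/path/archive.zip (bucket and paths are placeholders) and that gsutil and unzip are installed on the instance:

# Download the archive from the bucket to the VM's disk
gsutil cp gs://my-bucket/path/archive.zip .

# Extract it locally
unzip archive.zip -d archive/

# Upload the extracted files back to the bucket in parallel
gsutil -m cp -r archive/ gs://my-bucket/path/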

I'm afraid that by default in Google Cloud there is no program that can do this, but you can get this functionality, for example, using Python.

A universal method, available on any machine where Python is installed (so also on Google Cloud):

You just need to enter the following commands:

python

or if you need administrator rights:

sudo python

and then in the Python Interpreter:

>>> from zipfile import ZipFile
>>> zip_file = ZipFile('path_to_file/t.zip', 'r')
>>> zip_file.extractall('path_to_extract_folder')

and finally, press Ctrl+D to exit the Python Interpreter.

The unpacked files will be placed in the location you specified (provided, of course, that you have the appropriate permissions for that location).

The above method works identically for Python 2 and Python 3.

Enjoy it to the fullest! :)

In a shell, you can use the command below to decompress a gzip-compressed object by streaming it through zcat, without downloading it to disk first:

gsutil cat gs://bucket/obj.csv.gz | zcat |  gsutil cp - gs://bucket/obj.csv

Another fast way to do it using Python in version 3.2 or higher:

import shutil
shutil.unpack_archive('filename')

The method also allows you to indicate the destination folder:

shutil.unpack_archive('filename', 'extract_dir')

The above method works not only for zip archives, but also for tar, gztar, bztar, or xztar archives.

If you need more options, look into the documentation of the shutil module: shutil.unpack_archive

There are Dataflow templates in Google Cloud Dataflow that help to compress/decompress files in Cloud Storage.

This template stages a batch pipeline that decompresses files on Cloud Storage to a specified location. This functionality is useful when you want to use compressed data to minimize network bandwidth costs. The pipeline automatically handles multiple compression modes during a single execution and determines the decompression mode to use based on the file extension (.bzip2, .deflate, .gz, .zip).

Pipeline requirements

The files to decompress must be in one of the following formats: Bzip2, Deflate, Gzip, Zip.

The output directory must exist prior to pipeline execution.
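If it helps, here is a hedged example of launching the Bulk Decompress template from the gcloud CLI; the job name, region, and bucket paths are placeholders, and the template location and parameter names should be verified against the current Dataflow template documentation:

gcloud dataflow jobs run unzip-job \
  --gcs-location gs://dataflow-templates/latest/Bulk_Decompress_GCS_Files \
  --region us-central1 \
  --parameters "inputFilePattern=gs://my-bucket/compressed/*.zip,outputDirectory=gs://my-bucket/decompressed,outputFailureFile=gs://my-bucket/decompressed/failures.csv"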
