FTP to Google Storage

野趣味 2020-12-31 10:46

Some files get uploaded on a daily basis to an FTP server, and I need those files in Google Cloud Storage. I don't want to bug the users who upload the files to install any extra software.

4 Answers
  •  北海茫月
    2020-12-31 11:01

    You could write yourself an FTP server which uploads to GCS, for example based on pyftpdlib.

    Define a custom handler which stores the file to GCS when it is received:

    import os
    from pyftpdlib.handlers import FTPHandler
    from pyftpdlib.servers import FTPServer
    from pyftpdlib.authorizers import DummyAuthorizer
    from google.cloud import storage
    
    class MyHandler(FTPHandler):
        def on_file_received(self, file):
            storage_client = storage.Client()
            bucket = storage_client.get_bucket('your_gcs_bucket')
            blob = bucket.blob(file[5:])  # strip the leading '/tmp/' (5 characters)
            blob.upload_from_filename(file)
            os.remove(file)
        # override other FTPHandler events (e.g. on_incomplete_file_received) as needed
    
    def main():
        authorizer = DummyAuthorizer()
        authorizer.add_user('user', 'password', homedir='/tmp', perm='elradfmw')
    
        handler = MyHandler
        handler.authorizer = authorizer
        handler.masquerade_address = 'your.public.ip'  # replace with your server's public IP
        handler.passive_ports = range(60000, 61000)  # passive ports 60000-60999 inclusive
    
        server = FTPServer(('0.0.0.0', 21), handler)  # bind all interfaces, not just loopback
        server.serve_forever()
    
    if __name__ == "__main__":
        main()
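    The slice file[5:] in the handler above assumes every upload lands directly under /tmp/. A more robust way to derive the object name is os.path.relpath, which also preserves subdirectory structure; a minimal sketch, assuming the same /tmp home directory:

    ```python
    import os

    def blob_name_for(path, homedir="/tmp"):
        """Derive a GCS object name from a received file's path,
        keeping any subdirectories under the FTP home directory."""
        return os.path.relpath(path, homedir)

    print(blob_name_for("/tmp/report.csv"))        # report.csv
    print(blob_name_for("/tmp/daily/report.csv"))  # daily/report.csv
    ```

    Using the relative path means users who upload into subfolders get matching "folders" (object name prefixes) in the bucket instead of an upload error.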
    

    I've successfully run this on Google Container Engine (it takes some effort to get passive FTP working properly), but it should be pretty simple to do on Compute Engine. With the configuration above, open port 21 and ports 60000-60999 on the firewall.

    To run it, execute python my_ftp_server.py. Note that listening on port 21 requires root privileges.
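    To check the server end to end, a client can upload a file with Python's standard ftplib. A sketch, where the host, credentials, and file names are placeholders:

    ```python
    import ftplib

    def upload(host, user, password, local_path, remote_name):
        """Upload a local file to the FTP server in passive mode."""
        with ftplib.FTP(host) as ftp:
            ftp.login(user, password)
            ftp.set_pasv(True)  # passive mode, matching the server's passive_ports
            with open(local_path, "rb") as f:
                ftp.storbinary(f"STOR {remote_name}", f)

    # Example call (placeholder host and credentials):
    # upload("ftp.example.com", "user", "password", "report.csv", "report.csv")
    ```

    If the transfer succeeds, the file should appear in the GCS bucket shortly after, since on_file_received fires as soon as the server finishes writing it.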
