upload

Python urllib2 file upload problems

烂漫一生 submitted on 2019-11-30 05:23:57
I'm currently trying to initiate a file upload with urllib2 and the urllib2_file library. Here's my code:

```python
import sys
import urllib2_file
import urllib2

URL = 'http://aquate.us/upload.php'
d = [('uploaded', open(sys.argv[1:]))]
req = urllib2.Request(URL, d)
u = urllib2.urlopen(req)
print u.read()
```

I've placed this .py file in my My Documents directory and placed a shortcut to it in my Send To folder (the shortcut URL is ). When I right-click a file, choose Send To, and select Aquate (my Python script), it opens a command prompt for a split second and then closes it. Nothing gets uploaded. I knew there…
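One plainly visible bug, offered as a hedged sketch rather than a confirmed fix: open(sys.argv[1:]) passes a list where open() expects a single path string, so the script most likely dies with a TypeError, and the console window closes before the traceback can be read. Assuming urllib2_file accepts the list-of-tuples form shown in the question, a corrected version might look like this:

```python
# Hedged sketch in Python 2 (urllib2 is Python 2 only); assumes the
# third-party urllib2_file module is installed and patches urllib2
# to handle file-upload request bodies.
import sys
import urllib2_file  # imported for its side effect on urllib2
import urllib2

URL = 'http://aquate.us/upload.php'

# open() needs a single path string; sys.argv[1:] is a list.
d = [('uploaded', open(sys.argv[1], 'rb'))]
req = urllib2.Request(URL, d)
u = urllib2.urlopen(req)
print u.read()

# Keep the console window open so a traceback stays visible when the
# script is launched via Send To.
raw_input('Press Enter to exit...')
```

Running the script from an already-open command prompt would also reveal the traceback without the pause.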

Uploading binary file on Node.js

雨燕双飞 submitted on 2019-11-30 05:12:51
I am using Flash to record and upload audio to a Node server. The Flash client is a variation of jrecorder. When the user is done recording, the audio is uploaded using a POST request (not a form, because Flash cannot create files) with the audio ByteArray as the data of the POST request (see more here). I am able to receive the file correctly in Node-land using the code below, but the audio that comes out is mangled and you cannot hear anything. That said, the content of the file can be played by VLC and other players, and SoX is able to encode it as an MP3. Here is my code when using Node:
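The question's Node code is cut off here, but a frequent cause of mangled audio is treating the request body as a string. A minimal sketch, assuming a plain http server and that the POST body is the raw audio bytes (the file name is a placeholder), which keeps every chunk as a Buffer:

```javascript
// Minimal sketch: collect the POST body as Buffers and write it out in
// one piece. Calling toString() on the chunks (or setting a string
// encoding on the request) is what usually corrupts binary uploads.
var http = require('http');
var fs = require('fs');

http.createServer(function (req, res) {
  var chunks = [];
  req.on('data', function (chunk) {
    chunks.push(chunk);  // chunk is a Buffer; leave it that way
  });
  req.on('end', function () {
    fs.writeFile('recording.wav', Buffer.concat(chunks), function (err) {
      res.end(err ? 'error' : 'ok');
    });
  });
}).listen(8080);
```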

How to create destination (Folder) in PHP while using move_uploaded_file()?

可紊 submitted on 2019-11-30 05:08:24
I want to upload files with PHP, and I use move_uploaded_file() to copy them to the destination folder I want. Everything works fine with this:

```php
if (move_uploaded_file($_FILES['uploadfile']['tmp_name'], './uploades/'))
    die("success");
else
    die("error");
```

But when I try this:

```php
$rand = chr(rand(97, 122)) . chr(rand(97, 122)) . chr(rand(97, 122));
if (move_uploaded_file($_FILES['uploadfile']['tmp_name'], './uploades/' . $rand))
    die("success");
else
    die("error");
```

I get an error, and it looks like move_uploaded_file() cannot create folders. How can I do this? Basically I am looking for a way to do it like…
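move_uploaded_file() indeed does not create missing directories. A hedged sketch of one way to do it, creating the folder with mkdir() first (paths follow the question; appending the original file name is an assumption):

```php
<?php
$rand = chr(rand(97, 122)) . chr(rand(97, 122)) . chr(rand(97, 122));
$dir  = './uploades/' . $rand;

// Create the destination folder first; the third argument makes mkdir()
// create nested directories recursively.
if (!is_dir($dir) && !mkdir($dir, 0755, true)) {
    die("error");
}

$dest = $dir . '/' . basename($_FILES['uploadfile']['name']);
if (move_uploaded_file($_FILES['uploadfile']['tmp_name'], $dest)) {
    die("success");
}
die("error");
```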

How do I bulk upload to s3?

折月煮酒 submitted on 2019-11-30 05:03:09
Question: I recently refactored some of my code to stuff rows into a db using LOAD DATA, and it works great. However, for each record I have, I must upload 2 files to S3, which totally destroys the magnificent speed upgrade I was obtaining. Whereas I was able to process 600+ of these documents/second, they are now trickling in at 1/second because of S3. What are your workarounds for this? Looking at the API, I see that it is mostly RESTful, so I'm not sure what to do; maybe I should just stick…
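The question does not name a language, so as an assumption this sketch uses Python with the boto library; the point is simply to overlap the per-record S3 PUTs with a thread pool instead of uploading them serially:

```python
# Hedged sketch: parallel S3 uploads with a thread pool (boto 2 API).
# 'my-bucket' and paths_to_upload are placeholders for your own values.
from multiprocessing.pool import ThreadPool
import boto

conn = boto.connect_s3()               # credentials from env/boto config
bucket = conn.get_bucket('my-bucket')

def upload(path):
    key = bucket.new_key(path)         # key name mirrors the local path
    key.set_contents_from_filename(path)

paths_to_upload = []                   # fill with the 2 file paths per record
ThreadPool(20).map(upload, paths_to_upload)
```

Since each upload is network-bound, even modest concurrency should recover most of the lost throughput.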

Node reading file in specified chunk size

余生长醉 submitted on 2019-11-30 04:02:18
Question: The goal: upload large files to AWS Glacier without holding the whole file in memory. I'm currently uploading to Glacier using fs.readFileSync() and things are working. But I need to handle files larger than 4 GB, and I'd like to upload multiple chunks in parallel. This means moving to multipart uploads. I can choose the chunk size, but then Glacier needs every chunk to be the same size (except the last). This thread suggests that I can set a chunk size on a read stream, but that I'm not…
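A minimal sketch of the read-stream approach (the Glacier upload call itself is left as a hypothetical placeholder, and the file name is an assumption): for fs read streams, an explicit highWaterMark in practice yields chunks of exactly that size, except for the final one, which fits Glacier's equal-sized-parts rule.

```javascript
// Hedged sketch: stream a large file in fixed-size chunks without
// holding the whole file in memory.
var fs = require('fs');

var chunkSize = 1024 * 1024;  // 1 MiB; pick a part size Glacier accepts
var stream = fs.createReadStream('huge-archive.tar', { highWaterMark: chunkSize });

var partNumber = 0;
stream.on('data', function (chunk) {
  partNumber += 1;
  // uploadPart(chunk, partNumber);  // hypothetical multipart-upload call
  console.log('part', partNumber, chunk.length, 'bytes');
});
stream.on('end', function () {
  console.log('done after', partNumber, 'parts');
});
```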

Rails 3 - Amazon S3 Paperclip EU Problem

你说的曾经没有我的故事 submitted on 2019-11-30 03:53:47
I'm using:

- Paperclip 2.3.16
- Rails 3.0.9
- Ruby 1.9.2
- aws-s3 0.6.2

I'm trying to use Paperclip to upload to an EU (Ireland) based bucket. I have the following in my model:

```ruby
has_attached_file :image,
  :styles => { :grid => '90x128#', :list => '140x200#', :original => '400x548' },
  :storage => :s3,
  :s3_credentials => "#{RAILS_ROOT}/config/s3.yml",
  :url => 'flyers/:id/:style/:basename.:extension',
  :path => 'flyers/:id/:style/:basename.:extension',
  :bucket => 'fsight'
```

In my environment.rb I have tried to point the aws-s3 default host at the relevant EU one by using:

```ruby
require "aws/s3"
AWS::S3::…
```
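The excerpt is cut off mid-statement, so the following is an assumption rather than the poster's exact code: the workaround commonly used with the aws-s3 gem was to replace its DEFAULT_HOST constant with the EU endpoint before Paperclip opens a connection.

```ruby
# Hedged sketch: point the aws-s3 gem at the EU (Ireland) endpoint.
# String#replace mutates the constant in place, so every subsequent
# connection targets the EU host.
require "aws/s3"
AWS::S3::DEFAULT_HOST.replace("s3-eu-west-1.amazonaws.com")
```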

Laravel - file path to UploadedFile instance

送分小仙女□ submitted on 2019-11-30 03:50:14
Question: I have a Laravel 4.2 API that, when creating a resource, accepts file uploads. The file is retrieved with Input::file('file'). Now I want to write a script (also in Laravel) that will batch-create some resources (so I can't use an HTML form that POSTs to the API's endpoint). How can I translate a file path into an instance of UploadedFile so that Input::file('file') will pick it up in the API? Answer 1: Just construct an instance yourself. The API is: http://api.symfony.com/2.0/Symfony/Component…
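Following the answer's pointer, a hedged sketch of constructing the Symfony 2.0 UploadedFile directly (the file path and names here are hypothetical):

```php
<?php
use Symfony\Component\HttpFoundation\File\UploadedFile;

$path = '/tmp/batch/photo.jpg';  // hypothetical local file

// The last constructor argument ($test = true) makes Symfony skip the
// is_uploaded_file() check, which would otherwise reject a file that
// did not arrive through an actual HTTP upload.
$file = new UploadedFile(
    $path,            // full path to the file
    'photo.jpg',      // original client-side name
    'image/jpeg',     // mime type
    filesize($path),  // size in bytes
    null,             // upload error code
    true              // $test: bypass is_uploaded_file()
);
```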

C# Upload whole directory using FTP

空扰寡人 submitted on 2019-11-30 03:03:23
Question: What I'm trying to do is upload a website using FTP in C#. So I need to upload all files and folders within a folder, keeping its structure. For the actual uploading I'm using this FTP class: http://www.codeproject.com/Tips/443588/Simple-Csharp-FTP-Class. I have come to the conclusion that I need to write a recursive method that goes through every sub-directory of the main directory and uploads all files and folders in it. This should make an exact copy of my folder on the FTP server.
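A hedged sketch of that recursive method, written against the standard FtpWebRequest/WebClient APIs rather than the CodeProject class from the question (server URI and credentials are placeholders):

```csharp
// Hedged sketch: walk the local directory tree recursively, creating
// each remote folder before uploading the files inside it.
using System.IO;
using System.Net;

class FtpTreeUploader
{
    static readonly NetworkCredential Creds =
        new NetworkCredential("user", "password");   // hypothetical credentials

    static void UploadDirectory(string localDir, string remoteUri)
    {
        foreach (string file in Directory.GetFiles(localDir))
        {
            using (var client = new WebClient { Credentials = Creds })
                client.UploadFile(remoteUri + "/" + Path.GetFileName(file), file);
        }

        foreach (string dir in Directory.GetDirectories(localDir))
        {
            string target = remoteUri + "/" + Path.GetFileName(dir);
            var mkdir = (FtpWebRequest)WebRequest.Create(target);
            mkdir.Method = WebRequestMethods.Ftp.MakeDirectory;
            mkdir.Credentials = Creds;
            try { mkdir.GetResponse().Close(); }
            catch (WebException) { /* folder probably exists already */ }
            UploadDirectory(dir, target);            // recurse into subfolder
        }
    }

    static void Main()
    {
        UploadDirectory(@"C:\site", "ftp://example.com/htdocs");
    }
}
```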

Google App Engine (Python) - Uploading a file (image)

安稳与你 submitted on 2019-11-30 02:29:39
I want the user to be able to upload images to Google App Engine. I have the following (Python):

```python
class ImageData(ndb.Model):
    name = ndb.StringProperty(indexed=False)
    image = ndb.BlobProperty()
```

Information is submitted by the user using a form (HTML):

```html
<form name="input" action="/register" method="post">
  name: <input type="text" name="name">
  image: <input type="file" name="image">
</form>
```

Which is then processed by:

```python
class AddProduct(webapp2.RequestHandler):
    def post(self):
        imagedata = ImageData(parent=image_key(image_name))
        imagedata.name = self.request.get('name')
        imagedata.image
```
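The handler is truncated above, so here is a hedged completion: the final assignment and the put() call are assumptions (image_key and image_name are kept as in the question). Note also that the form as shown is missing enctype="multipart/form-data"; without it, self.request.get('image') receives only the file name rather than the file bytes.

```python
# Hedged sketch completing the truncated handler. The form must declare
# enctype="multipart/form-data" for the raw image bytes to arrive.
class AddProduct(webapp2.RequestHandler):
    def post(self):
        imagedata = ImageData(parent=image_key(image_name))
        imagedata.name = self.request.get('name')
        imagedata.image = self.request.get('image')  # raw bytes into the BlobProperty
        imagedata.put()  # persist the entity to the datastore
```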

Best way to upload a blob with a huge size in GBs to azure in the fastest time [closed]

喜夏-厌秋 submitted on 2019-11-30 01:55:34
Can someone please suggest the best way to upload/download a video blob of multiple GBs in size to Azure storage in the fastest possible time? Answer: I'm a Microsoft Technical Evangelist and I have developed a sample and free tool (no support/no guarantee) to help in these scenarios. The binaries and source code are available here: https://blobtransferutility.codeplex.com/ The Blob Transfer Utility is a GUI tool to upload and download thousands of small/large files to/from Windows Azure Blob Storage. Features:

- Create batches to upload/download
- Set the Content-Type
- Transfer files in parallel
- Split large…
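Separately from the answer's tool, a hedged sketch of what parallel transfer can look like with the classic Azure Storage client library, which parallelizes the block uploads of a single large blob when ParallelOperationThreadCount is raised (connection string, container, and file names are placeholders):

```csharp
using System.IO;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class ParallelBlobUpload
{
    static void Main()
    {
        var account = CloudStorageAccount.Parse("<your-connection-string>");
        var container = account.CreateCloudBlobClient()
                               .GetContainerReference("videos");   // placeholder
        var blob = container.GetBlockBlobReference("movie.mp4");    // placeholder

        var options = new BlobRequestOptions
        {
            // Upload up to 8 blocks of the blob concurrently.
            ParallelOperationThreadCount = 8
        };

        using (var stream = File.OpenRead(@"C:\movie.mp4"))
        {
            blob.UploadFromStream(stream, null, options);
        }
    }
}
```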