blobstore

How to upload multiple files to BlobStore?

£可爱£侵袭症+ posted on 2019-12-01 08:15:23
I'm trying to upload multiple files in a form to the Blobstore. Form:

<form action="{{upload_url}}" method="POST" enctype="multipart/form-data">
  <label>Key Name</label><input type="text" name="key_name" size="50"><br/>
  <label>name</label><input type="text" name="name" size="50"><br/>
  <label>image</label><input type="file" name="image" size="50"><br/>
  <label>thumb</label><input type="file" name="thumb" size="50"><br/>
  <input type="submit" name="submit" value="Submit">
</form>

I'm then trying to fetch the BlobInfo objects for each of those files uploaded:

def post(self):
    image_upload_files =
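A minimal sketch of one way to fetch the BlobInfo objects for each upload field, assuming a Python BlobstoreUploadHandler and the same field names as the form above (the handler name and redirect URL are made up for illustration):

from google.appengine.ext.webapp import blobstore_handlers

class MultiUploadHandler(blobstore_handlers.BlobstoreUploadHandler):
    def post(self):
        # get_uploads(field_name) returns the list of BlobInfo objects
        # created for one form field
        image_blobs = self.get_uploads('image')
        thumb_blobs = self.get_uploads('thumb')
        image_key = image_blobs[0].key()
        thumb_key = thumb_blobs[0].key()
        # store the keys on your own entity, then redirect; Blobstore upload
        # handlers should answer the POST with a redirect
        self.redirect('/view?image=%s&thumb=%s' % (image_key, thumb_key))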

If you're storing an image Blob in App Engine, should you put it in the Blobstore or Google Cloud Storage?

穿精又带淫゛_ posted on 2019-12-01 07:04:26
Google Cloud Storage seems more cost-effective than the App Engine Blobstore. At the moment I am storing user-uploaded image files as Blob type fields in the Datastore (App Engine Java API, by the way). But I'm debating whether to switch to either the Blobstore or Google Cloud Storage to permit image sizes greater than 1MB. Google Cloud Storage seems more cost-effective than Blobstore. Which one would be better for storing large images? The Blobstore is tightly integrated with App Engine, while Cloud Storage is offered stand-alone. Otherwise, they look like different interfaces to the same
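The question uses the Java API, but for illustration here is a rough Python sketch of the GCS route, assuming the App Engine GCS client library (cloudstorage) and a hypothetical bucket name; objects stored this way are not subject to the 1MB Datastore entity limit:

import cloudstorage as gcs
from google.appengine.ext import blobstore
from google.appengine.api import images

def store_image_in_gcs(data, filename, content_type='image/jpeg'):
    # '/bucket/object'-style path; 'my-bucket' is a placeholder
    gcs_path = '/my-bucket/images/%s' % filename
    with gcs.open(gcs_path, 'w', content_type=content_type) as f:
        f.write(data)
    # a GCS object can still be addressed through Blobstore-style APIs
    # (and the Images service) via a blob key derived from its path
    blob_key = blobstore.create_gs_key('/gs' + gcs_path)
    return images.get_serving_url(blob_key)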

GAE java.lang.IllegalStateException: Must call one of set*BlobStorage() first

风流意气都作罢 posted on 2019-12-01 05:41:58
Question: I am trying to upload a file in GAE using the Blobstore API. I am getting the following exception when running the GAE server locally (dev mode):

WARNING: /_ah/upload/ag10cmlwc2NoZWR1bGVychsLEhVfX0Jsb2JVcGxvYWRTZXNzaW9uX18YFQw
java.lang.IllegalStateException: Must call one of set*BlobStorage() first.
    at com.google.appengine.api.blobstore.dev.BlobStorageFactory.getBlobStorage(BlobStorageFactory.java:24)
    at com.google.appengine.api.blobstore.dev.UploadBlobServlet.init(UploadBlobServlet.java:88)

Using AFNetworking to post both text and multiple images to Google Blobstore

天大地大妈咪最大 posted on 2019-12-01 01:09:36
For reference, here is the working Android/Java version of what I am trying to do in iOS/Objective-C:

public static void saveTextsAndImagesOnServer(List<byte[]> images, long someID1, String servingUrl, boolean someFlag)
        throws ClientProtocolException, IOException {
    Log.d(TAG, "saveTextsAndImagesOnServer started ");
    HttpClient httpClient = new DefaultHttpClient();
    HttpPost postRequest = new HttpPost(servingUrl);
    MultipartEntity reqEntity = new MultipartEntity(HttpMultipartMode.BROWSER_COMPATIBLE);
    AdditionData extr = AdditionData.getInstance();
    reqEntity.addPart("red", new ByteArrayBody(("" +

Writing to an appengine blob asynchronously and finalizing it when all tasks complete

空扰寡人 posted on 2019-11-30 20:20:00
Question: I have a difficult problem. I am iterating through a set of URLs parameterized by date and fetching them. For example, here is one:

somewebservice.com?start=01-01-2012&end=01-10-2012

Sometimes the content returned from the URL gets truncated (missing random results, with a 'truncated error' message attached) because I've defined too large a range, so I have to split the query into two URLs:

somewebservice.com?start=01-01-2012&end=01-05-2012
somewebservice.com?start=01-06-2012&end
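The experimental Files API (later deprecated) allowed exactly this append-from-many-requests pattern. A very rough sketch, assuming a made-up bookkeeping entity that counts outstanding tasks; a real version would need a transaction or sharded counter to avoid races on the decrement:

from google.appengine.api import files
from google.appengine.ext import ndb

class BlobJob(ndb.Model):
    # hypothetical bookkeeping entity: one per output blob
    file_name = ndb.StringProperty()
    pending = ndb.IntegerProperty()

def task_done(job_id, chunk):
    job = BlobJob.get_by_id(job_id)
    # append this task's results; the file stays writable across requests
    with files.open(job.file_name, 'a') as f:
        f.write(chunk)
    # decrement the outstanding-task counter; the last task finalizes the blob
    job.pending -= 1
    job.put()
    if job.pending == 0:
        files.finalize(job.file_name)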

Create CSV file and save to Blobstore

℡╲_俬逩灬. posted on 2019-11-30 19:53:12
Question: Is it possible to create a CSV file and store and update it in the App Engine Blobstore? Also, would it be possible to email this Blobstore CSV file as an attachment? If yes, are there any sample docs available to accomplish this?

Answer 1: Yes, you can write data to the Blobstore, including CSV data (which is just text). You can also read data from the Blobstore and create mail with an attachment.

Update: You can append to the file until it's finally closed, meaning you can append to it via a series of requests. But
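A rough sketch of both halves in Python, assuming the (experimental, later deprecated) Files API for writing and the Mail API for the attachment; the rows, addresses, and file name are placeholders:

import csv
import StringIO
from google.appengine.api import files, mail
from google.appengine.ext import blobstore

def write_csv_to_blobstore(rows):
    # build the CSV text in memory, then write it into a blobstore file
    buf = StringIO.StringIO()
    csv.writer(buf).writerows(rows)
    file_name = files.blobstore.create(mime_type='text/csv')
    with files.open(file_name, 'a') as f:
        f.write(buf.getvalue())
    files.finalize(file_name)
    return files.blobstore.get_blob_key(file_name)

def mail_csv(blob_key):
    # read the finalized blob back and attach it to an outgoing mail
    data = blobstore.BlobReader(blob_key).read()
    mail.send_mail(sender='reports@example.com',
                   to='someone@example.com',
                   subject='CSV report',
                   body='Report attached.',
                   attachments=[('report.csv', data)])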

Comparing Blobstore and Google Cloud Storage

随声附和 posted on 2019-11-30 17:00:27
Question: For a GAE application, what are the tradeoffs between using the Blobstore and GCS?

- As of Aug 2015, the price of Blobstore and GCS is the same ($0.312 per GB-year).
- GCS has a nicer code interface (data referenced by things that look like file paths).
- GCS has console commands and a web UI for uploading/accessing data.

Are there some kind of advantages to Blobstore that I'm missing?

Answer 1: Right now with my startup we are using the Blobstore service and we are planning to move to GCS. The only

combining blob servlet with endpoint api

雨燕双飞 posted on 2019-11-30 15:27:00
Question: Here is my web.xml:

<?xml version="1.0" encoding="utf-8" standalone="no"?>
<web-app xmlns="http://java.sun.com/xml/ns/javaee"
         xmlns:web="http://java.sun.com/xml/ns/javaee/web-app_2_5.xsd"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         version="2.5"
         xsi:schemaLocation="http://java.sun.com/xml/ns/javaee http://java.sun.com/xml/ns/javaee/web-app_2_5.xsd">
  <servlet>
    <servlet-name>Upload</servlet-name>
    <servlet-class>Upload</servlet-class>
  </servlet>
  <servlet-mapping>
    <servlet-name>Upload<

How to write Big files into Blobstore using experimental API?

余生长醉 posted on 2019-11-30 12:14:32
Question: I have a dilemma. I'm uploading files to both the Scribd store and the Blobstore, using tipfy as the framework. I have a web form whose action is not created by blobstore.create_upload_url (I'm just using url_for('myhandler')). I did it this way because if I use a Blobstore upload handler, the POST response is parsed and I cannot use the normal python-scribd API to upload the file into the Scribd store. Now I have a working Scribd saver:

class UploadScribdHandler(RequestHandler, BlobstoreUploadMixin):
    def post(self):
        uploaded_file = self
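For context, writing a large payload into the Blobstore by hand with the experimental Files API looked roughly like the sketch below; the chunk size and helper name are illustrative, and writes were commonly sliced to stay under the per-RPC payload limit:

from google.appengine.api import files

CHUNK = 900 * 1024  # keep each write comfortably under the ~1 MB RPC limit

def save_big_file(data, mime_type='application/octet-stream'):
    file_name = files.blobstore.create(mime_type=mime_type)
    with files.open(file_name, 'a') as f:
        for start in range(0, len(data), CHUNK):
            f.write(data[start:start + CHUNK])
    files.finalize(file_name)
    return files.blobstore.get_blob_key(file_name)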

How to clean an Azure storage Blob container?

北慕城南 posted on 2019-11-30 07:58:15
I just want to clean (dump, zap, del) an Azure Blob container. How can I do that? Note: the container is used by IIS (running in a web role) for its logs (wad-iis-logfiles).

A one-liner using the Azure CLI 2.0:

az storage blob delete-batch --account-name <storage_account_name> --source <container_name>

Substitute <storage_account_name> and <container_name> with the appropriate values in your case. You can see the help for the command by running:

az storage blob delete-batch -h

There is only one way to bulk delete blobs, and that is by deleting the entire container. As you've said, there is a delay between
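If you'd rather do it from code than the CLI, here is a minimal sketch using the azure-storage-blob Python package (v12-style client); the connection string is a placeholder, and this deletes every blob while keeping the container itself:

from azure.storage.blob import ContainerClient

# placeholder connection string; the question's container is wad-iis-logfiles
container = ContainerClient.from_connection_string(
    conn_str='<your_connection_string>',
    container_name='wad-iis-logfiles')

# list every blob in the container and delete them one by one
for blob in container.list_blobs():
    container.delete_blob(blob.name)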