cloudfiles

Scaling images stored at S3

这一生的挚爱 submitted on 2019-12-18 11:53:49
Question: I'm in a situation where I need to push image storage for a number of websites out to a service that can scale indefinitely (S3, CloudFiles, etc.). Up until this point we've been able to allow our users to generate custom thumbnail sizes on the fly using the Python Imaging Library with some help from sorl-thumbnail in Django. By moving our images to something like S3, the ability to quickly create thumbnails on the fly is lost. We can either: do it slowly by downloading the source from S3 and…
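A minimal sketch of that first option (fetch the original from S3 and thumbnail it on request), assuming boto and PIL as in the excerpt; the bucket name, key name, and size below are placeholders, not details from the question:

    from io import BytesIO

    import boto
    from PIL import Image

    def thumbnail_from_s3(bucket_name, key_name, size=(200, 200)):
        # Download the full original from S3 (the slow, bandwidth-heavy part).
        conn = boto.connect_s3()  # credentials come from the environment/boto config
        key = conn.get_bucket(bucket_name).get_key(key_name)
        source = BytesIO(key.get_contents_as_string())
        # Build the thumbnail locally and hand back a file-like object to serve or cache.
        img = Image.open(source).convert("RGB")
        img.thumbnail(size)  # in place, preserves aspect ratio
        out = BytesIO()
        img.save(out, format="JPEG")
        out.seek(0)
        return out

Caching the result (locally or back to S3) is what makes this route tolerable, since the full-size download otherwise happens on every request.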

Error creating BlobContext using jclouds in a Spring MVC application

倾然丶 夕夏残阳落幕 submitted on 2019-12-13 14:41:50
Question: I have a Spring MVC 4.0.1 web application that needs to upload files to Rackspace Cloud Files. I am using Apache jclouds in order to do this. When trying to create the BlobStoreContext with the following code:

    BlobStoreContext context = ContextBuilder.newBuilder("cloudfiles-us")
        .credentials("username", "password")
        .buildView(BlobStoreContext.class);

I get the following exception:

    com.google.inject.CreationException: Guice creation errors:
    1) No implementation for com.google.common.base.Supplier<java…

Changing django-storages backend from S3 to CloudFiles and dealing with old files

£可爱£侵袭症+ submitted on 2019-12-12 15:07:11
Question: I've got a Django app that I'm moving to Rackspace. I have a model that uses FileFields, and I'm using the django-storages library's S3/boto backend. I want to use Cloud Files for storage, and I need to be able to serve up the old S3 content. On a template page where I provide links to the files, I do this:

    href="{{ static_url }}{{ article.code_archive_file }}"

static_url is set from the view and equals settings.STATIC_URL. Clearly this isn't going to work, since settings.STATIC_URL is going to…
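One hedged way to keep old S3 links working while new uploads go to Cloud Files is to resolve the base URL per file instead of using a single settings.STATIC_URL prefix. This is only an illustrative sketch; LEGACY_S3_URL, CLOUDFILES_URL, and the prefix check are assumptions, not part of the original setup:

    from django.conf import settings

    def file_url(file_field):
        # Return a full URL for a FileField value, whichever store holds it.
        name = file_field.name
        # Assumption: files uploaded before the migration keep their old paths,
        # so anything matching the legacy prefix is still served from the S3 bucket.
        if name.startswith(getattr(settings, "LEGACY_S3_PREFIX", "s3/")):
            return settings.LEGACY_S3_URL + name
        return settings.CLOUDFILES_URL + name

If every file were bound to the storage backend that actually holds it, the usual {{ article.code_archive_file.url }} would resolve correctly on its own (FileField delegates url() to its storage); with one field split across two stores, a helper like the above is one workaround.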

OpenStack Rackspace Cloud Files .net SDK

被刻印的时光 ゝ submitted on 2019-12-08 02:51:05
Question: I am trying to save an XML file to a non-CDN container from Sydney:

    public void Save(XDocument document)
    {
        using (MemoryStream ms = new MemoryStream())
        {
            document.Save(ms);
            ms.Position = 0;
            RackspaceCloudIdentity identity = new RackspaceCloudIdentity
            {
                Username = "username",
                APIKey = "xxxxxxxxxxx",
                CloudInstance = CloudInstance.Default
            };
            CloudFilesProvider provider = new CloudFilesProvider(identity);
            provider.CreateObject("XMLFiles", ms, "xmlFile1.xml", region: "syd");
        }
    }

For a 1 MB file, it…

OpenStack Rackspace Cloud Files .net SDK

时光毁灭记忆、已成空白 submitted on 2019-12-06 11:56:57
I am trying to save an XML file to a non-CDN container from Sydney:

    public void Save(XDocument document)
    {
        using (MemoryStream ms = new MemoryStream())
        {
            document.Save(ms);
            ms.Position = 0;
            RackspaceCloudIdentity identity = new RackspaceCloudIdentity
            {
                Username = "username",
                APIKey = "xxxxxxxxxxx",
                CloudInstance = CloudInstance.Default
            };
            CloudFilesProvider provider = new CloudFilesProvider(identity);
            provider.CreateObject("XMLFiles", ms, "xmlFile1.xml", region: "syd");
        }
    }

For a 1 MB file, it takes about 50 seconds to upload (very long), and trying to download the file back returns an empty…

Scaling images stored at S3

我是研究僧i submitted on 2019-11-30 05:03:07
I'm in a situation where I need to push image storage for a number of websites out to a service that can scale indefinitely (S3, CloudFiles, etc.). Up until this point we've been able to allow our users to generate custom thumbnail sizes on the fly using the Python Imaging Library with some help from sorl-thumbnail in Django. By moving our images to something like S3, the ability to quickly create thumbnails on the fly is lost. We can either:

Do it slowly, by downloading the source from S3 and creating the thumbnail locally (con: it is slow and bandwidth intensive).

Do it upfront, by creating a pre…
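A rough sketch of that second, upfront option (pre-generating a fixed set of renditions at upload time and pushing them to S3 alongside the original), again assuming boto and PIL; the bucket name, key layout, and sizes are illustrative only:

    from io import BytesIO

    import boto
    from PIL import Image

    THUMB_SIZES = [(100, 100), (200, 200), (400, 400)]

    def upload_with_thumbnails(bucket_name, key_name, image_bytes):
        conn = boto.connect_s3()
        bucket = conn.get_bucket(bucket_name)

        # Store the original as-is.
        bucket.new_key(key_name).set_contents_from_string(image_bytes)

        # Store one rendition per predefined size, next to the original.
        for w, h in THUMB_SIZES:
            img = Image.open(BytesIO(image_bytes)).convert("RGB")
            img.thumbnail((w, h))
            out = BytesIO()
            img.save(out, format="JPEG")
            thumb_key = bucket.new_key("%s_%dx%d.jpg" % (key_name, w, h))
            thumb_key.set_contents_from_string(out.getvalue())

The trade-off is the mirror image of the first option: uploads get slower and storage grows with every size kept, but serving a thumbnail later is just a plain S3 GET.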

Ruby on Rails 3: Streaming data through Rails to client

喜夏-厌秋 submitted on 2019-11-26 17:13:29
I am working on a Ruby on Rails app that communicates with Rackspace Cloud Files (similar to Amazon S3 but lacking some features). Due to the lack of per-object access permissions and query-string authentication, downloads to users have to be mediated through the application. In Rails 2.3, it looks like you can dynamically build a response as follows:

    # Streams about 180 MB of generated data to the browser.
    render :text => proc { |response, output|
      10_000_000.times do |i|
        output.write("This is line #{i}\n")
      end
    }

(from http://api.rubyonrails.org/classes/ActionController/Base…

How can I use boto to stream a file out of Amazon S3 to Rackspace Cloudfiles?

风格不统一 submitted on 2019-11-26 10:29:24
Question: I'm copying a file from S3 to Cloudfiles, and I would like to avoid writing the file to disk. The Python-Cloudfiles library has an object.stream() call that looks to be what I need, but I can't find an equivalent call in boto. I'm hoping that I would be able to do something like:

    shutil.copyfileobj(s3Object.stream(), rsObject.stream())

Is this possible with boto (or, I suppose, any other S3 library)?

Answer 1: The Key object in boto, which represents an object in S3, can be used like an iterator…
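A short sketch of what that iterator approach could look like; the bucket and key names are placeholders, and destination_write stands in for whatever Cloud Files upload interface is used on the other side (an assumption here, not something the answer excerpt specifies):

    import boto

    def stream_s3_to(destination_write, bucket_name, key_name):
        # A boto Key is iterable and yields the object's bytes in chunks,
        # so each chunk can be forwarded without ever touching local disk.
        conn = boto.connect_s3()
        key = conn.get_bucket(bucket_name).get_key(key_name)
        for chunk in key:
            destination_write(chunk)

Whether the Cloud Files side actually streams those chunks or buffers them before the PUT depends on the client library being used.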

Ruby on Rails 3: Streaming data through Rails to client

柔情痞子 submitted on 2019-11-26 05:19:46
Question: I am working on a Ruby on Rails app that communicates with Rackspace Cloud Files (similar to Amazon S3 but lacking some features). Due to the lack of per-object access permissions and query-string authentication, downloads to users have to be mediated through the application. In Rails 2.3, it looks like you can dynamically build a response as follows:

    # Streams about 180 MB of generated data to the browser.
    render :text => proc { |response, output|
      10_000_000.times do |i|…