fog

How to upload S3 metadata with a file in Fog?

眉间皱痕 submitted on 2019-12-06 13:30:17
I have tried: my_directory.files.create(key: key, body: body, metadata: { custom: "x" }) and: my_directory.files.create(key: key, body: body, custom: "x") but the "custom" metadata is not showing up in the S3 web interface. What am I doing wrong? How should I do it? According to Programming Amazon Web Services by James Murty (O'Reilly), page 74: S3 does not allow you to set arbitrary metadata items to be returned as HTTP headers; only some header names are recognized as legal HTTP headers. Any header with a name the service does not recognize is discarded. According to the properties
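A minimal sketch of one common fix, assuming the fog-aws provider: S3 only round-trips user metadata sent under the x-amz-meta- header prefix, so spell out the full header name as the key of the metadata hash. The bucket name and file key here are hypothetical.

```ruby
require 'fog/aws'  # assumes the fog-aws gem is installed

storage = Fog::Storage.new(
  provider: 'AWS',
  aws_access_key_id: ENV['AWS_ACCESS_KEY_ID'],
  aws_secret_access_key: ENV['AWS_SECRET_ACCESS_KEY']
)

directory = storage.directories.get('my-bucket')  # hypothetical bucket

# Keys without the x-amz-meta- prefix are not recognized by S3 as
# user metadata and are silently dropped, which matches the symptom above.
directory.files.create(
  key: 'example.txt',
  body: 'hello',
  metadata: { 'x-amz-meta-custom' => 'x' }
)
```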

How to upload custom S3 metadata with Carrierwave

一世执手 submitted on 2019-12-06 07:14:01
Question: I want to add a Content-Disposition header to a file I'm uploading with CarrierWave (it's not an option to do it afterwards via a query param in the URL). Is there something I can add to the AttachmentUploader model that would help me accomplish this, before the file is uploaded? Thanks! Answer 1: You can set attributes either globally in your CarrierWave config - CarrierWave.configure do |config| config.fog_attributes = {'Content-Disposition' => ...} end - or you can define it on the uploader class
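A sketch of both variants the answer mentions, assuming a fog-backed CarrierWave setup; the filename in the header value is illustrative:

```ruby
# Globally, in an initializer (applies to every uploader):
CarrierWave.configure do |config|
  config.fog_attributes = { 'Content-Disposition' => 'attachment' }
end

# Or per uploader class, by overriding fog_attributes:
class AttachmentUploader < CarrierWave::Uploader::Base
  def fog_attributes
    # Sent as object headers when the file is uploaded to S3
    { 'Content-Disposition' => %(attachment; filename="#{file.filename}") }
  end
end
```

The per-class override has the advantage of computing the header from the file being uploaded, which the global config cannot do.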

Rails image_tag rotates image

老子叫甜甜 submitted on 2019-12-06 06:24:28
Question: I am using Amazon's S3 for image storage with CarrierWave and fog configured. The images seem to store correctly; however, when I have a 'portrait' image (smaller width than height) it is not displayed correctly, but rather rotated on its side. Any pointers in the right direction would be much appreciated! uploaders/image_uploader.rb class ImageUploader < CarrierWave::Uploader::Base include CarrierWave::RMagick include Sprockets::Helpers::RailsHelper include Sprockets::Helpers:
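This symptom usually means the camera stored the rotation in EXIF orientation data, which browsers and processors can ignore. A minimal sketch of the usual fix, assuming RMagick (which this uploader already includes): apply the orientation before any resize step.

```ruby
class ImageUploader < CarrierWave::Uploader::Base
  include CarrierWave::RMagick

  # Run before resizing so versions inherit the corrected orientation
  process :auto_orient

  def auto_orient
    manipulate! do |img|
      img.auto_orient!  # bang form returns nil when nothing changed,
      img               # so return the image explicitly
    end
  end
end
```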

Set content_type of Fog storage files on s3

放肆的年华 submitted on 2019-12-06 06:18:29
I'm working with Fog and Amazon S3 to manage video and image files. I've been running into a lot of trouble with setting the content_type for my files. When working from the console, I am able to go through and individually update each file's content_type, and then run save. However, when I try to run an update on all of the files within a specific directory, I don't get an error, but nothing gets updated. I've tried several different methods, all with the same basic idea, and all set to print "saved!" if the file saves. The methods run properly and print out "saved!", but when I go back and
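A hedged sketch of one reliable approach with fog-aws: S3 cannot edit an object's headers in place, so copy each object onto itself with the x-amz-metadata-directive set to REPLACE, which rewrites its Content-Type. The bucket name and .mp4 filter are assumptions for illustration.

```ruby
require 'fog/aws'

storage = Fog::Storage.new(
  provider: 'AWS',
  aws_access_key_id: ENV['AWS_ACCESS_KEY_ID'],
  aws_secret_access_key: ENV['AWS_SECRET_ACCESS_KEY']
)

bucket = 'my-bucket'  # hypothetical
directory = storage.directories.get(bucket)

directory.files.each do |file|  # .each pages through all keys
  next unless file.key.end_with?('.mp4')
  # Self-copy with REPLACE rewrites the stored headers
  storage.copy_object(
    bucket, file.key, bucket, file.key,
    'Content-Type' => 'video/mp4',
    'x-amz-metadata-directive' => 'REPLACE'
  )
  puts "saved! #{file.key}"
end
```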

Digest::Digest is deprecated; use Digest

落爺英雄遲暮 submitted on 2019-12-05 22:24:28
I am getting the error "Digest::Digest is deprecated; use Digest" when I try to boot my Rails server. I searched my source code for Digest::Digest but I am not using it anywhere. Any idea how to solve this? The only places I use digests are <% digest = OpenSSL::Digest.new('sha1') %> and @alias = Digest::MD5.hexdigest(phone) It is most likely used by one of the gems your app depends on. Install the ack tool (unless already installed) and run the following command: # of course, the path to your gems will be different ack Digest::Digest /Users/username/.rbenv/versions/2.3.1/lib/ruby/gems/2.3.1
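For reference, the asker's own two usages are already the non-deprecated forms; the offending call is the old Digest::Digest.new constructor, which maps directly onto OpenSSL::Digest. A small stdlib-only sketch of the replacement (the phone number is a made-up example value):

```ruby
require 'openssl'
require 'digest'

# Deprecated (and removed in newer Rubies):
#   sha1 = Digest::Digest.new('sha1')
# Replacement:
sha1 = OpenSSL::Digest.new('sha1')
puts sha1.hexdigest('hello')

# Plain Digest::MD5 was never deprecated and is fine as-is:
puts Digest::MD5.hexdigest('555-0100')  # hypothetical phone number
```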

Need to change the storage “directory” of files in an S3 Bucket (Carrierwave / Fog)

随声附和 submitted on 2019-12-05 17:43:47
I am using CarrierWave with 3 separate models to upload photos to S3. I kept the default settings for the uploader, which was to store photos in a root S3 bucket. I then decided to store them in sub-directories named after the model, like avatars/, items/, etc., based on the model they were uploaded from... Then I noticed that files of the same name were being overwritten, and when I deleted a model record, the photo wasn't being deleted. I've since changed the store_dir from an uploader-specific setup like this: def store_dir "items" end to a generic one which stores photos under the model ID
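A sketch of the per-model store_dir the question is heading toward. In a real CarrierWave uploader, model and mounted_as are supplied by the mount; the stub constructor and Item struct below exist only to make the path logic runnable on its own.

```ruby
# Namespacing by class, mount point, and record id means same-named
# files from different records no longer collide, and deleting a
# record's uploader removes only that record's directory.
class ImageUploader # < CarrierWave::Uploader::Base in a real app
  attr_reader :model, :mounted_as

  def initialize(model, mounted_as)   # stub; CarrierWave does this
    @model, @mounted_as = model, mounted_as
  end

  def store_dir
    "uploads/#{model.class.to_s.downcase}/#{mounted_as}/#{model.id}"
  end
end

Item = Struct.new(:id)
puts ImageUploader.new(Item.new(42), :photo).store_dir
# => uploads/item/photo/42
```

Note that changing store_dir only affects new uploads; existing objects stay at their old keys until moved.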

How can I use fog to edit a file on s3?

不羁岁月 submitted on 2019-12-05 12:11:03
I have a bunch of files on S3. I have Fog set up with a .fog config file, so I can fire up fog and get a prompt. Now how do I access and edit a file on S3, if I know its path? The easiest thing to do is probably to use IRB or Pry to get a local copy of the file, or write a simple script to download, edit and then re-upload it. Assume you have a file named data.txt. You can use the following script to initialize a connection to S3. require 'fog' connection = Fog::Storage.new({ :provider => 'AWS', :aws_access_key_id => YOUR_ACCESS_KEY_ID, :aws_secret_access_key => YOUR_SECRET_ACCESS_KEY })
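Continuing that script, a sketch of the download-edit-reupload cycle; the bucket name and the gsub edit are placeholders for whatever change you actually need:

```ruby
require 'fog/aws'

connection = Fog::Storage.new(
  provider: 'AWS',
  aws_access_key_id: ENV['AWS_ACCESS_KEY_ID'],
  aws_secret_access_key: ENV['AWS_SECRET_ACCESS_KEY']
)

directory = connection.directories.get('my-bucket')  # hypothetical
file = directory.files.get('data.txt')

edited = file.body.gsub('foo', 'bar')  # any local edit

# S3 objects are immutable, so "editing" means re-uploading the body:
file.body = edited
file.save
```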

Duplicated key at line 80 ignored: “name” rvm

醉酒当歌 submitted on 2019-12-05 06:08:29
This doesn't seem to have affected anything; it's just irritating in my terminal. I regularly receive the following warning (sometimes I receive multiple, calling out different lines, and sometimes the path after /gems/ varies, but otherwise this is the output): /Users/alecwilson/.rvm/gems/ruby-2.2.1/gems/fog-1.23.0/lib/fog/rackspace/mock_data.rb:42: warning: duplicated key at line 80 ignored: "name" It's most common when bundling and running rake test. Any idea how to fix it? I'm generally pretty wary of editing files in my .rvm directory, as I've royally screwed it up before, and

How to check if image version exists on S3 with Carrierwave and Fog?

会有一股神秘感。 submitted on 2019-12-05 03:42:43
I'm uploading my images with CarrierWave and Fog to S3. On upload I also create a thumbnail version of the image: version :thumb do process :resize_to_limit => [90, 80], if: :is_resizable? end Now I need a method to check whether the thumbnail version exists. The documentation lists the exists? method. This actually works if I want to check the existence of the original version: asset.file.exists? # => true But when I use the "thumb" version like this: asset.url(:thumb).file.exists? I get: undefined method 'exists?' for #<String:0x007fcd9f9d9620> : Use this: asset.thumb.file.exists? instead of:
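A short sketch of why the error occurs and the shape of the fix, with hypothetical asset/thumb names matching the question:

```ruby
# asset.url(:thumb) returns a String (the URL), and String has no
# exists? method - hence the NoMethodError. Go through the version's
# mounted uploader instead:
asset.thumb.file.exists?              # => true or false

# Since a never-processed version may have no file object at all,
# guarding against nil may be wise:
asset.thumb.file && asset.thumb.file.exists?
```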

Ruby - Append content at the end of the existing s3 file using fog

我的未来我决定 submitted on 2019-12-05 00:29:03
Question: How do I append text to an existing or newly created file in S3? I am using fog, and I have the following code: require 'fog' file = "abc.csv" bucket = 'my_bucket' storage = Fog::Storage.new(:provider => 'AWS', :aws_access_key_id => 'XXXXXXXX', :aws_secret_access_key => 'YYYYYYYY') dir = storage.directories.new(:key => bucket) # no harm if this bucket already exists; if not, create one buffer = ["big_chunk1", "big_chunk2", "big_chunk3", "big_chunk4", "big_chunk5"] # I need help after this line. No
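S3 objects cannot be appended to in place, so the usual workaround is either download-concatenate-reupload or, for chunked writes like the buffer above, a multipart upload that sends each chunk as a part. A hedged sketch of the multipart route with fog-aws, reusing the question's variable names; note that real S3 rejects parts under 5 MB except the last one, so these tiny example chunks would need padding or grouping in practice:

```ruby
require 'fog/aws'

storage = Fog::Storage.new(
  provider: 'AWS',
  aws_access_key_id: 'XXXXXXXX',
  aws_secret_access_key: 'YYYYYYYY'
)

bucket = 'my_bucket'
file = 'abc.csv'
buffer = ["big_chunk1", "big_chunk2", "big_chunk3", "big_chunk4", "big_chunk5"]

# Start the multipart upload, stream each chunk as a numbered part,
# then stitch the parts together with the collected ETags.
upload_id = storage.initiate_multipart_upload(bucket, file).body['UploadId']

etags = buffer.each_with_index.map do |chunk, i|
  storage.upload_part(bucket, file, upload_id, i + 1, chunk).headers['ETag']
end

storage.complete_multipart_upload(bucket, file, upload_id, etags)
```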