amazon-s3

Spark 1.6 DirectFileOutputCommitter

青春壹個敷衍的年華 submitted on 2020-01-06 19:59:37
Question: I am having a problem saving text files to S3 using pyspark. I am able to save to S3, but it first uploads to a _temporary folder on S3 and then proceeds to copy to the intended location. This increases the job's run time significantly. I have attempted to compile a DirectFileOutputCommitter which should write directly to the intended S3 URL, but I cannot get Spark to use this class. Example: someRDD.saveAsTextFile("s3a://somebucket/savefolder") this creates an s3a://somebucket/savefolder/
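A minimal pyspark sketch of how a direct committer is usually wired in on Spark 1.6: configuration keys prefixed with spark.hadoop.* are forwarded to the Hadoop Configuration, and saveAsTextFile goes through the old mapred API, so the committer is set via mapred.output.committer.class. The fully qualified class name below is only a placeholder for whatever DirectFileOutputCommitter was actually compiled, and it is assumed to be on both the driver and executor classpaths.

```python
from pyspark import SparkConf, SparkContext

# Sketch only: the committer class name is illustrative and must match the
# class you actually built and shipped on the classpath.
conf = (SparkConf()
        .setAppName("direct-s3-output")
        # saveAsTextFile uses the old mapred API, so the committer is chosen
        # through the mapred (not mapreduce) configuration key; this is what
        # lets the job skip the _temporary copy step on S3.
        .set("spark.hadoop.mapred.output.committer.class",
             "org.apache.hadoop.mapred.DirectFileOutputCommitter"))

sc = SparkContext(conf=conf)
sc.parallelize(["line one", "line two"]) \
  .saveAsTextFile("s3a://somebucket/savefolder")
```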

Spring Cloud AWS with TransferManager: Unable to complete transfer: Connection pool shut down

最后都变了- submitted on 2020-01-06 18:34:10
Question: I am using Spring Boot 1.5.1.RELEASE with Spring Cloud AWS 1.1.3.RELEASE to upload files to an AWS S3 bucket. I wanted to use the TransferManager to upload the files to S3, but unfortunately I am getting the following error message during the upload and the files are not uploaded to S3: 2017-02-26 12:36:27.004 ERROR 32696 --- [ask-scheduler-8] o.s.integration.handler.LoggingHandler : com.amazonaws.AmazonClientException: Unable to complete transfer: Connection pool shut down at com

Site domain redirecting to the URL of Amazon Web Services bucket

穿精又带淫゛_ submitted on 2020-01-06 18:08:34
Question: I have a static website which is currently being hosted in an AWS S3 bucket: mysite.com. I used GoDaddy to purchase the domain for mysite.com and I'm in the process of configuring this URL so that it points to my AWS S3 bucket. I believe I've configured the S3 bucket correctly, as mysite.com.s3-website-us-east-1.amazonaws.com serves my static site. However, I have a problem with my AWS Route 53. I set up a hosted zone with an alias, nameservers, and a start of authority. My problem is that when
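For reference, a hedged boto3 sketch of the alias record such a setup usually needs: an A record in the mysite.com hosted zone whose alias target is the regional S3 website endpoint (not the bucket URL), with the record name matching the bucket name. The hosted zone ID placeholder and domain are assumptions; the S3 website endpoint zone ID shown is the published value for us-east-1 and should be verified for other regions.

```python
import boto3

route53 = boto3.client("route53")

HOSTED_ZONE_ID = "ZXXXXXXXXXXXXX"  # hypothetical placeholder: the mysite.com hosted zone

route53.change_resource_record_sets(
    HostedZoneId=HOSTED_ZONE_ID,
    ChangeBatch={
        "Changes": [{
            "Action": "UPSERT",
            "ResourceRecordSet": {
                "Name": "mysite.com.",
                "Type": "A",
                "AliasTarget": {
                    # S3 website endpoints are only usable as an alias when the
                    # record name matches the bucket name (mysite.com here).
                    "HostedZoneId": "Z3AQBSTGFYJSTF",  # s3-website-us-east-1 (verify per region)
                    "DNSName": "s3-website-us-east-1.amazonaws.com.",
                    "EvaluateTargetHealth": False,
                },
            },
        }]
    },
)
```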

Control uploads to S3 using BFTask and AWS SDK iOS v2

一个人想着一个人 submitted on 2020-01-06 18:05:44
Question: I am using BFTask together with AWS SDK v2 for iOS to upload and download files to AWS S3 storage. The following code works very well, but I am wondering if anyone knows how I can gain more control over the maximum number of uploads to allow, and also a better approach to receiving feedback on upload progress. I have read the documentation for AWS SDK v2, its source code, and the BFTask readme, but I am still uncertain how to gain that control. For example, how would I edit the following code to

Error setting up Tachyon with S3 as the under filesystem

こ雲淡風輕ζ submitted on 2020-01-06 16:46:19
Question: I am trying to set up Tachyon on the S3 filesystem. I am completely new to Tachyon and am still reading what I can find on it. My tachyon-env.sh is given below: #!/usr/bin/env bash # This file contains environment variables required to run Tachyon. Copy it as tachyon-env.sh and # edit it to configure Tachyon for your site. At a minimum, # the following variables should be set: # # - JAVA_HOME, to point to your JAVA installation # - TACHYON_MASTER_ADDRESS, to bind the master to a

Django Heroku media files 404 error with Amazon S3

浪子不回头ぞ submitted on 2020-01-06 13:56:54
Question: So I have followed this question, How to set-up a Django project with django-storages and Amazon S3, but with different folders for static files and media files?, in order to get my Django app uploading media files to my Amazon S3 bucket. I am using django-oscar, by the way. Everything seemed to work fine right after I uploaded the image, but when I reload the page the images disappear and I get a 404 error. My static files work fine; I have found no problems there. UPDATE: I have changed my bucket
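For context, here is a hedged sketch of the split static/media configuration the linked question describes, using django-storages; the bucket name, region, and the custom_storages module are illustrative, not taken from the post. If uploaded images only 404 after some time, disabling querystring authentication (so media URLs are plain, non-expiring links) is one commonly suggested setting to check.

```python
# settings.py -- sketch of the split static/media layout; values are placeholders.
AWS_ACCESS_KEY_ID = "..."               # usually pulled from env vars on Heroku
AWS_SECRET_ACCESS_KEY = "..."
AWS_STORAGE_BUCKET_NAME = "my-oscar-bucket"
AWS_S3_CUSTOM_DOMAIN = "%s.s3.amazonaws.com" % AWS_STORAGE_BUCKET_NAME
AWS_QUERYSTRING_AUTH = False            # serve plain, non-expiring media URLs

STATICFILES_LOCATION = "static"
MEDIAFILES_LOCATION = "media"
STATIC_URL = "https://%s/%s/" % (AWS_S3_CUSTOM_DOMAIN, STATICFILES_LOCATION)
MEDIA_URL = "https://%s/%s/" % (AWS_S3_CUSTOM_DOMAIN, MEDIAFILES_LOCATION)

STATICFILES_STORAGE = "custom_storages.StaticStorage"
DEFAULT_FILE_STORAGE = "custom_storages.MediaStorage"

# custom_storages.py -- hypothetical module holding the two storage backends,
# each pinned to its own prefix inside the bucket.
from storages.backends.s3boto3 import S3Boto3Storage

class StaticStorage(S3Boto3Storage):
    location = "static"

class MediaStorage(S3Boto3Storage):
    location = "media"
```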

AFNetworking loading images from Amazon S3

左心房为你撑大大i submitted on 2020-01-06 12:42:44
Question: We were using AFNetworking only to load images, like this: [thumbnailImage setImageWithURL:_item.thumbnailUrl]; and it worked fine. But now that we've pulled in the rest of the project, it has stopped working; it just shows the background color. So I tried the following, and it loads the placeholder but never loads the image. UIImage* placeholder = [UIImage imageNamed:@"placeholder"]; [thumbnailImage setImageWithURL:_item.thumbnailUrl placeholderImage:placeholder]; I thought I might be able to see

Adding S3 trigger to Lambda function using CloudFormation

若如初见. submitted on 2020-01-06 09:03:13
Question: I'm trying to add an S3 trigger to a Lambda function using CloudFormation. From what I've read about circular references, the Lambda function and S3 bucket need to be created first, which I've done with a template, and they get created successfully. Then I go into "Update Stack" and enter the template: "Resources": { "MyBucket": { "Type": "AWS::S3::Bucket", "NotificationConfiguration": { "LambdaConfigurations": [ { "Event": "s3:ObjectCreated:*", "Function": "arn:aws:lambda:ap-southeast-2

S3 bucket with credentials error

半世苍凉 submitted on 2020-01-06 08:06:12
Question: I'm having trouble using the Meteor Slingshot component with S3 using temporary AWS credentials. I keep getting the error Exception while invoking method 'slingshot/uploadRequest' InvalidClientTokenId: The security token included in the request is invalid. I have absolutely no idea what I'm doing wrong. If I use Slingshot normally, without the temporary-credentials setup, it works fine. import { Meteor } from 'meteor/meteor'; import moment from 'moment'; const cryptoRandomString = require('crypto-random
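As a Python aside (the question itself is Meteor/JavaScript), the sketch below shows the shape of the temporary-credential flow that the Slingshot directive relies on. An InvalidClientTokenId error from STS generally means the long-lived access key handed to the STS call is itself wrong or inactive, so that is worth checking first; the code assumes valid keys are available in the environment.

```python
import boto3

# Request short-lived credentials from STS (fails with InvalidClientTokenId if
# the long-lived access key used for this call is invalid or inactive).
sts = boto3.client("sts")
creds = sts.get_session_token(DurationSeconds=900)["Credentials"]

# Temporary credentials are a triple; all three values, including the session
# token, must reach whatever ends up signing the S3 request.
s3 = boto3.client(
    "s3",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
)
print([b["Name"] for b in s3.list_buckets()["Buckets"]])
```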

Error Updating Stack to Add S3 Trigger

故事扮演 submitted on 2020-01-06 07:20:41
Question: I successfully created a Lambda function and an S3 bucket using a CloudFormation stack. I then ran an update to the stack to add a trigger to the S3 bucket to invoke the Lambda function. When I run the update, it gives the following error: Unable to validate the following destination configurations (Service: Amazon S3; Status Code: 400; Error Code: InvalidArgument; Request ID: XXXXX; S3 Extended Request ID: XXXXX. This is the update JSON I'm using to add the trigger to the S3 bucket: "MyBucket":
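One commonly reported cause of this 400 is that S3 has not yet been granted permission to invoke the function; in CloudFormation that permission is a separate AWS::Lambda::Permission resource that must exist before the bucket's NotificationConfiguration is applied. Below is a hedged boto3 sketch of the equivalent manual wiring, with all names, ARNs, and the account ID as placeholders.

```python
import boto3

lambda_client = boto3.client("lambda")
s3 = boto3.client("s3")

bucket = "my-trigger-bucket"
function_arn = "arn:aws:lambda:ap-southeast-2:123456789012:function:MyFunction"

# 1. Allow S3 to invoke the function; the absence of this grant is what
#    typically produces the "Unable to validate the following destination
#    configurations" error when the notification is attached.
lambda_client.add_permission(
    FunctionName=function_arn,
    StatementId="s3-invoke",
    Action="lambda:InvokeFunction",
    Principal="s3.amazonaws.com",
    SourceArn="arn:aws:s3:::%s" % bucket,
)

# 2. Only then attach the ObjectCreated notification to the bucket.
s3.put_bucket_notification_configuration(
    Bucket=bucket,
    NotificationConfiguration={
        "LambdaFunctionConfigurations": [{
            "LambdaFunctionArn": function_arn,
            "Events": ["s3:ObjectCreated:*"],
        }]
    },
)
```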