> My files are currently being uploaded to an S3 bucket according to the tutorials provided. I have a Post type with a file field pointing to an S3Object. S3Object ha…
I believe you need to wrap `$util.dynamodb.fromS3ObjectJson($ctx.result.file)` in a call to `$util.toJson()`. Can you kindly change the response mapping template to `$util.toJson($util.dynamodb.fromS3ObjectJson($ctx.result.file))` and see if that works?
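In other words, assuming your Post.file resolver currently ends with the bare `fromS3ObjectJson` call, the full response mapping template would become just this one line:

```vtl
## Post.file response mapping template (DynamoDB data source)
## fromS3ObjectJson deserializes the stored S3 object string into a map,
## and toJson serializes that map so it can be returned as the field value.
$util.toJson($util.dynamodb.fromS3ObjectJson($ctx.result.file))
```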
As a side note, I think you can achieve the desired effect without making a second call to DynamoDB from the Post.file resolver. Create a "None" data source and change the Post.file resolver to use it. You can provide a bare-bones request mapping template such as:
```json
{
  "version": "2017-02-28",
  "payload": {}
}
```
You can then change your response mapping template to use the source instead of the result:

```vtl
$util.toJson($util.dynamodb.fromS3ObjectJson($ctx.source.file))
```
Since the Post will already have been fetched by the time the Post.file field is resolved, the information will already be available in the source. If you needed to fetch the S3Link from a different table than the Post objects, then you would need a second DynamoDB call.
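One caveat I'd add (this is an assumption about your data, not something from your setup): if some posts have no file attached, `fromS3ObjectJson` would be handed a null value. A guarded version of the same response mapping template might look like the sketch below.

```vtl
## Post.file response mapping template (None data source) -- sketch only
## Assumes the file attribute may be missing on some posts.
#if($util.isNull($ctx.source.file))
  ## No file stored on this post, so resolve the field to null.
  null
#else
  $util.toJson($util.dynamodb.fromS3ObjectJson($ctx.source.file))
#end
```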
Hope this helps, and let me know if the call to `$util.toJson` fixes the issue.
Thanks