Overview
This library provides a very simple API binding for using Amazon S3 from Google Apps Script. It supports putting/deleting buckets and putting/getting/deleting objects (basic CRUD). Almost none of the advanced S3 options for either bucket or object operations are supported.
I wrote this library to have a key/value store capable of handling objects larger than the 100 KB size limit of Google’s ScriptDB. Maximum object size is probably bounded by the request size Google allows for the UrlFetch service, though I haven’t tested that.
It provides an S3 object, representing the S3 service, which must be constructed with your AWS key and secret. That object has methods to put/delete buckets (by bucket name) and put/get/delete objects (by bucket name + object name). Object content is passed and returned as strings (keep track of the encodings yourself). Getting a non-existent object (AWS error code "NoSuchKey") returns null.
Unexpected AWS errors result in exceptions (objects with name="AwsError").
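To make those read semantics concrete, here is a hedged sketch of a caller-side helper. The function name and the "default value" idea are mine, not part of the library; the only library behavior it relies on is the documented one above (null for NoSuchKey, thrown "AwsError" otherwise).

```javascript
// Sketch of the read semantics described above: getObject returns null
// for a missing key (NoSuchKey), so callers can treat that as a cache
// miss; any unexpected AWS error still surfaces as a thrown "AwsError".
// `s3` is an instance from S3.getInstance(key, secret).
function readWithDefault(s3, bucket, key, defaultValue) {
  var value = s3.getObject(bucket, key); // string, or null if NoSuchKey
  return value === null ? defaultValue : value;
}
```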
To look at the code itself, you can read it in the Apps Script editor.
Usage Example
var s3 = S3.getInstance(awsAccessKeyId, awsSecretKey);
var blob = UrlFetchApp.fetch("http://www.google.com").getBlob();
s3.putObject("bucket", "googlehome", blob, {logRequests: true});

// and to get it back
var fromS3 = s3.getObject("bucket", "googlehome");

Installation
You can link this library into a Google Apps Script project using its project key:
MB4837UymyETXyn8cv3fNXZc9ncYTrHL9

If you don’t know how to link a library from the Apps Script editor, read Google Apps Scripts: Libraries – Using a Library.
Alternatively, you can make a copy of the project by opening it and choosing File –> Make a Copy.
As of 2023, you might need to use “Script ID”:
1Qx-smYQLJ2B6ae7Pncbf_8QdFaNm0f-br4pbDg0DXsJ9mZJPdFcIEkw_

Documentation
I find in-source documentation to be more reliable and easier to maintain than a webpage, so I’ll leave you with the comments in the source. I’ve tried to document all the major bits, using Google’s jsDoc style.
License
The full version of the license can be found with the code, but here’s a quick summary: This code is provided as-is, with absolutely no warranty or assurance that it will not delete, corrupt, misplace, or expose your data. If you’re concerned about those risks, don’t use this library. You can also read the code and, if you find something you don’t like, change it. That’s why they call it open source.
Contributing
Given limitations of Google’s version control system for Apps Script, I’ve put a copy of the source on Github at S3-for-Google-Apps-Script. Send pull requests / create issues there and I’ll try to get to them, and periodically copy the changes back into the library hosted in Apps Script.
Testing
Run the library’s tests from a separate Apps Script instance as follows (replacing the environment properties with your own AWS access key/secret):
function runTests() {
  S3.setTestEnv({
    "awsAccessKeyId": "{{replace this with your access key}}",
    "awsSecretKey": "{{replace this with your secret}}"
  });
  S3.runTests();
  // if running in Spreadsheet context, you can uncomment this
  // Browser.msgBox("All tests passed");
}

Check View –> Logs to verify everything went OK, or to see where in the tests it died. If someone wants to point me to a JSUnit equivalent for Google Apps Script, that’d be awesome (or better yet, get Jasmine/Karma working for this).
Love this script!!
Just what I was looking for!
This is great Erik,
Is it possible to use assumed roles with these libraries?
There is some information here: http://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_use_switch-role-api.html
This is how I manage it in the AWS CLI
[profile XXX]
role_arn = arn:aws:iam::000000000000:role/
source_profile = default
Hi Todd – I haven’t tried that. From a quick look at those docs, it would seem that you could implement an STS client to retrieve the credentials for the assumed role, and then have to pass those credentials (accessKey, secret, possibly sessionToken) when you construct the S3 instance. Those credentials thereafter would be used to construct the auth header for the request.
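To make that suggestion a bit more concrete: an STS AssumeRole response carries a Credentials element, and the fields you would pull out of it look roughly like the sketch below. The function name is hypothetical, and note the caveat that the published library’s getInstance only takes a key and secret — temporary credentials also require sending the session token (an x-amz-security-token header), which the auth-header code would have to be taught to do.

```javascript
// Hypothetical glue for the assume-role approach sketched above. Given
// the parsed Credentials object from an STS AssumeRole response, extract
// the pieces you would hand to the S3 client. The sessionToken field is
// extra: the library as published does not send it, so this is only a
// sketch of the shape of the data, not working assume-role support.
function credentialsFromAssumeRole(stsCredentials) {
  return {
    accessKeyId: stsCredentials.AccessKeyId,
    secretAccessKey: stsCredentials.SecretAccessKey,
    sessionToken: stsCredentials.SessionToken // not yet supported
  };
}
```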
Exactly what I was looking for. Thank you
Thanks for the script.
I get the following error though when I try to `getObject()`. Same issue reported on your repo.
https://github.com/eschultink/S3-for-Google-Apps-Script/issues/1#issue-131449236
> The authorization mechanism you have provided is not supported. Please use AWS4-HMAC-SHA256.
Someone has attempted a patch for this, but I don’t have a reliable test env to verify it so YMMV: https://github.com/eschultink/S3-for-Google-Apps-Script/pull/2
Hi Erik,
I get this error: AWS Error – InvalidRequest: The authorization mechanism you have provided is not supported. Please use AWS4-HMAC-SHA256. (Line 184, File "S3Request", Project "S3")
Any Ideas?
Regards, Ingo
Someone has attempted a patch for this, but I don’t have a reliable test env to verify it so YMMV: https://github.com/eschultink/S3-for-Google-Apps-Script/pull/2
What needs to be done to change the region? I’m getting: AWS Error – SignatureDoesNotMatch: The request signature we calculated does not match the signature you provided. Check your key and signing method.
I think this is because the signing method I used when I originally wrote this is now out-of-date. Someone has attempted a patch for this, but I don’t have a reliable test env to verify it so YMMV: https://github.com/eschultink/S3-for-Google-Apps-Script/pull/2
I am using the script to attempt to upload a json of a google sheet into my S3 bucket. I am connecting to the bucket fine, and even get a 200 response code for the putObject, though the response has an empty body. The object is failing to show up in my bucket. Any ideas what might be the issue or how to further troubleshoot it? Thanks
Nevermind, I resolved the issue. The url for the put was doubling up so it was incorrect. It now is working, thanks for the great script!
Was anyone able to confirm this working with the updated signing method?
I’ve tried to upload a file with AWS Signature Version 4, but unfortunately, it doesn’t work for me.
https://github.com/eschultink/S3-for-Google-Apps-Script/pull/2
so I made a simple HTTP post upload to s3 from google script.
If anyone needs to upload a file to s3, this may help you.
https://github.com/jadewon/s3-http-post-gs
This is very much helpful. Worked like a charm. Thank you very much for this.
I am getting the error below, can someone please help!
AWS Error – InvalidArgument: Requests specifying Server Side Encryption with AWS KMS managed keys require AWS Signature Version 4.
I tried the below as well, but it still doesn’t work:

var s3 = S3.getInstance('AccessKey', 'awsSecretKey', 'com.amazonaws.services.s3.enableV4');

and

var s3 = S3.getInstance('AccessKey', 'awsSecretKey', {'com.amazonaws.services.s3.enableV4': true});
This is great. It looks like the github repo has the change for v4, but the library in github is older. Should I be using this instead?
https://github.com/dxdc/aws-sdk-google-apps
not related to mine, but looks like it would support s3 ops from apps script, if you link the library
When I copy-paste MB4837UymyETXyn8cv3fNXZc9ncYTrHL9 as the script ID, it returns ‘Unable to look up library’. Please help
try 1Qx-smYQLJ2B6ae7Pncbf_8QdFaNm0f-br4pbDg0DXsJ9mZJPdFcIEkw_ ?
possibly Google has migrated this stuff since 2014 …
Perfect, it works. Appreciate your help.
This is a perfect library, and works for most situations. But when I test with a large Google Sheet, it returns the error ‘Exception: Limit Exceeded: URLFetch POST Size’. Is there any way to work around this?
Sounds like a limit in the Apps Script URLFetch service
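One possible workaround, assuming you control both the writer and the reader: split the payload into parts small enough for UrlFetch and store each part under a suffixed key, then concatenate them back together on read. This is not a feature of the library — the helper and the suffixed-key scheme below are entirely hypothetical.

```javascript
// Hypothetical chunking helper for the UrlFetch POST size limit above.
// Split a large string into parts; each part could then be stored as
// s3.putObject(bucket, name + ".part" + i, parts[i]) and the reader
// would fetch parts in order and join them. Sketch only, not a library
// feature; maxChunkChars counts characters, not bytes.
function splitForUpload(content, maxChunkChars) {
  var parts = [];
  for (var i = 0; i < content.length; i += maxChunkChars) {
    parts.push(content.substring(i, i + maxChunkChars));
  }
  return parts;
}
```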
Ha! This works fine for Yandex Cloud Object Storage too. You need to change http to https and the S3 host to storage.yandexcloud.net