
I run a daily Python script that currently outputs its data to Google Sheets using gspread. I then feed that Google Sheet into Sheetsu, which creates a JSON API for an app. However, because the feed gets many requests, Sheetsu can get expensive ($25+ a month).

So I am going to tweak my Python script to output a JSON file instead. However, I need to host this JSON data somewhere. It needs to be fast, and ideally cached too (I currently use Sheetsu's caching, which is available on request).
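For reference, a minimal sketch of the JSON-output step I have in mind (the records and the file name are placeholders for whatever the script currently pushes to Sheets via gspread):

    import json

    def write_feed(records, path="feed.json"):
        """Dump the daily records to a local JSON file instead of Google Sheets."""
        with open(path, "w", encoding="utf-8") as f:
            json.dump(records, f, ensure_ascii=False, indent=2)

    write_feed([{"id": 1, "name": "example row"}])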

What AWS service options are there for this? I see there is AWS API Gateway, and I have seen people mention hosting JSON on S3, but I am unsure about caching and speed with that approach.

So I would appreciate advice on the AWS options and the code needed to implement the best one.

EC2 to S3 links I have found so far:

How to transfer files between AWS S3 and AWS EC2: http://tecadmin.net/install-s3cmd-manage-amazon-s3-buckets/#

How to move files from Amazon EC2 to S3 from the command line: https://serverfault.com/questions/285905/how-to-upload-files-from-amazon-ec2-server-to-s3-bucket

me9867

1 Answer


Create an S3 bucket with static website hosting enabled, then copy the JSON file from EC2 to the S3 bucket using the AWS SDK for Python (Boto3) or the AWS CLI.
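A minimal Boto3 sketch of the upload step (the bucket name and key are assumptions; replace them with your own). Setting Content-Type and Cache-Control on upload means the object is served as JSON and downstream caches get a TTL:

    import boto3

    BUCKET = "my-feed-bucket"   # assumed bucket name
    KEY = "feed.json"           # assumed object key

    s3 = boto3.client("s3")
    s3.upload_file(
        "feed.json",            # local file produced by the daily script
        BUCKET,
        KEY,
        ExtraArgs={
            "ContentType": "application/json",
            # Let clients and any CDN cache the feed for 5 minutes (tune as needed).
            "CacheControl": "max-age=300",
        },
    )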

You mentioned you are concerned about caching and the speed of hosting on S3. You can enable S3 Transfer Acceleration, or place a Content Delivery Network (CDN) such as CloudFront in front of your S3 bucket.
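As a sketch, Transfer Acceleration can be enabled on the bucket with one Boto3 call (again, the bucket name is an assumption; a CloudFront distribution pointing at the bucket is set up separately and is not shown here):

    import boto3

    s3 = boto3.client("s3")

    # Enable S3 Transfer Acceleration on the bucket. Requests sent to the
    # <bucket>.s3-accelerate.amazonaws.com endpoint are then routed over
    # AWS edge locations for lower latency.
    s3.put_bucket_accelerate_configuration(
        Bucket="my-feed-bucket",
        AccelerateConfiguration={"Status": "Enabled"},
    )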

Mark B