2

I have a lot of images to load from Amazon S3 on a single page, and sometimes it takes quite a while to load them all. I've heard that splitting the images across different sub-domains helps parallelize downloads, but what does the actual implementation look like? While it's easy to split across sub-domains like static, image, etc., should I make, say, 10 sub-domains (image1, image2, ...) to load 100 images? Or is there some cleverer way to do it?

(By the way, I am considering using memcache to cache the S3 images; I am not sure if that is possible.) I would be grateful for any further comments. Thanks a lot!

StCee
  • 241
  • 3
  • 14
  • With subdomain do you mean buckets or different regions? To speed up direct loading of S3 objects, have a look at Amazon CloudFront – golja Oct 19 '12 at 14:07
  • All my images are in the same bucket; but I mean using CNAMEs so that the images are downloaded through different subdomains like image1.domain.com, image2.domain.com; but they ultimately all point to the same bucket. – StCee Oct 19 '12 at 14:17

3 Answers

4

The biggest speed-up you're going to get is moving to CloudFront, Amazon's CDN, which can sit in front of S3 and make things dramatically faster for users.

Once you've done that, if you want, you can make multiple CloudFront subdomains that all point at the same S3 bucket, which you could then randomly assign to images so they load in parallel (essentially your image1, image2 idea).
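The "randomly assign" step above works better if the assignment is deterministic per image, so that the same image always resolves to the same host and the browser cache is never split across shards. A minimal sketch, assuming four hypothetical CloudFront CNAMEs (the host names and URL scheme here are illustrative, not real endpoints):

```python
import hashlib

# Hypothetical shard hosts -- replace with your own CloudFront CNAMEs,
# all pointing at the same S3 bucket / distribution.
SHARD_HOSTS = [
    "img1.example.com",
    "img2.example.com",
    "img3.example.com",
    "img4.example.com",
]

def shard_url(image_key: str) -> str:
    """Pick a sub-domain deterministically from the image key.

    Hashing the key (rather than choosing randomly on each page load)
    guarantees a given image is always served from the same host, so
    it stays cacheable across page views.
    """
    digest = hashlib.md5(image_key.encode("utf-8")).digest()
    host = SHARD_HOSTS[digest[0] % len(SHARD_HOSTS)]
    return f"https://{host}/{image_key}"

print(shard_url("photos/cat.jpg"))
```

You would then emit `<img src="...">` tags using `shard_url()` when rendering the page, spreading the 100 images roughly evenly across the four hosts.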

ceejayoz
  • 32,910
  • 7
  • 82
  • 106
1

I'll try to sort my answer into points which can hopefully help with your situation:

  • As we are talking about static content, I suggest using Nginx as a reverse proxy to serve your static content. Slow downloads may be because you are hitting Apache's limit of concurrent connections.
  • Using multiple sub-domains might help on the browser side, but on the other hand it might impact your page's overall speed and rank, as each sub-domain adds another DNS lookup, which is not a good thing.
  • Take a look at something like Cloudflare.com; they have some very good free services to benefit from.
  • Make use of browser-caching headers; this can be a great help if your content does not change very dynamically.
  • Keep your static assets (e.g. CSS, JS) minified as much as possible.
  • Try a service like http://tools.pingdom.com/fpt/ or http://gtmetrix.com/, as they give you detailed page-load performance data plus valuable hints for better results.
  • Minimize the requests your pages generate (e.g. combine different style or layout images into fewer sprite images and control them with CSS).

Humm.. this is what I had in mind atm.
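The Nginx reverse-proxy and browser-caching points above can be sketched as a config fragment. The domain, document root, and Apache port are assumptions for illustration:

```nginx
# Serve static assets directly from Nginx; proxy everything else to Apache.
server {
    listen 80;
    server_name example.com;          # hypothetical domain

    # Long-lived browser caching for static content
    location /static/ {
        root /var/www;                # assumed document root
        expires 30d;                  # sets Expires and Cache-Control: max-age
        add_header Cache-Control "public";
    }

    # Dynamic requests still go to Apache, assumed on a local port
    location / {
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
    }
}
```

This way Apache's connection limit only applies to dynamic requests, while Nginx handles the many parallel image fetches cheaply.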

1

Your idea of splitting the images across multiple CNAMEs is a good one and is best described in Google's best practices: https://developers.google.com/speed/docs/best-practices/rtt#ParallelizeDownloads

Alastair McCormack
  • 2,184
  • 1
  • 15
  • 22