
I have two models that I want CarrierWave to interact with: an Item model and an Image model.

What I want to happen: once a user uploads their item to S3 via CarrierWave - that part is pretty straightforward - whenever another user wants to download the item, they press a button that triggers a method that dynamically generates a download link. That link expires when either of two conditions is met: the item has been downloaded X times, or X hours have passed (say, 24).

The idea being that there isn't a static download link to that file floating around on the internet.

How do I do that?

marcamillion

2 Answers


CarrierWave allows you to set the fog_public and fog_authenticated_url_expiration options either for every uploader (through an initializer) or on a specific uploader. In the latter case, you just place self.fog_public = false and self.fog_authenticated_url_expiration = 123 inside your uploader class definition. With these two options set, any call to model.uploader.url will return a specially built URL that expires after the set number of seconds.
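For illustration, a minimal uploader along those lines might look like this (the ItemUploader name, the mount column, and the 24-hour figure are assumptions, not part of the answer):

```ruby
class ItemUploader < CarrierWave::Uploader::Base
  storage :fog

  # Objects are uploaded as private, so the plain S3 URL is not usable.
  self.fog_public = false

  # Lifetime, in seconds, of the signed URL that #url returns.
  self.fog_authenticated_url_expiration = 24 * 60 * 60 # 24 hours
end
```

With this in place, item.file.url (assuming the uploader is mounted as file) returns a signed URL that S3 stops honoring after 24 hours. Note that this covers the time condition only; the download-count condition from the question still needs tracking on your side.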

Douwe Maan
  • Yeah, we totally didn't just reach this conclusion together on IRC. :) – Douwe Maan May 05 '13 at 22:20
  • This is also what you need if you want to set "Content-Disposition: attachment" on the download link. – Jonathan Allard Aug 15 '14 at 19:48
  • For anyone looking for a more detailed walkthrough on how to implement this in CarrierWave (as well as other file upload gems), here's a tutorial: https://medium.com/@carlosramireziii/secure-file-download-urls-in-rails-d52128b24311#.tubaua8va – Carlos Ramirez III Jul 10 '16 at 00:38

I'm pretty sure you can only limit the amount of time an S3 URL is valid; you can't limit the number of downloads.

http://docs.aws.amazon.com/AmazonS3/latest/dev/S3_QSAuth.html covers this in some detail. If you generate the link over HTTPS it will be hard to sniff, and you will most likely be safe. You could build your own front end to S3, but then you would have to track the URL, count the downloads, and cut it off yourself. I'd think hard before doing that; bytes are not that expensive (IMHO).
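For reference, here is a minimal sketch of generating such a time-limited URL directly, using the current aws-sdk-s3 gem (the bucket and key are hypothetical, and this is the modern v3 API rather than the SDK that was current when this answer was written):

```ruby
require "aws-sdk-s3"

# Region and credentials are picked up from the usual AWS config.
signer = Aws::S3::Presigner.new

# S3 enforces the expiry itself, so the time limit needs no
# server-side bookkeeping - unlike a download count.
url = signer.presigned_url(
  :get_object,
  bucket: "my-app-uploads",           # hypothetical bucket
  key: "uploads/item/42/archive.zip", # hypothetical key
  expires_in: 24 * 60 * 60            # 24 hours, in seconds
)
```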

Peter Kellner
  • Hrmm....I was thinking of creating some sort of bridge - e.g. I have a `Downloads` table where the S3 link is one column, and I create a new record in that table for each new download request, with a unique URL per request (see the sketch after these comments). Surely there must be a way to do that. – marcamillion May 03 '13 at 23:33
  • Yes, very straightforward - you could even redirect them to it, but it would still be exposed. If you wanted it secure, you would need to pump the bytes out yourself while reading the S3 blob yourself. If you did that hosted inside Amazon, you would not pay for the traffic to the S3 storage, just to your end user. Otherwise, you end up paying for it twice. – Peter Kellner May 04 '13 at 02:46
  • Actually, just to be clear: with a redirect you only pay once. If you download the blob to your server outside the Amazon cloud and then serve it up, you pay twice. If you host inside the Amazon cloud, you only pay once. – Peter Kellner May 04 '13 at 03:05
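A rough sketch of the bridge idea from this comment thread, assuming a download_links table with token, item_id, created_at, and a download_count defaulting to 0 (every name here is hypothetical, not something from the answers above):

```ruby
class DownloadLink < ActiveRecord::Base
  belongs_to :item

  MAX_DOWNLOADS = 3 # assumed per-link download limit
  TTL = 24.hours    # assumed time limit

  before_create { self.token ||= SecureRandom.urlsafe_base64(32) }

  def expired?
    download_count >= MAX_DOWNLOADS || created_at < TTL.ago
  end

  # Counts the download and returns a fresh signed S3 URL
  # (CarrierWave signs it because fog_public is false), or nil
  # once the link has expired.
  def consume!
    return nil if expired?
    increment!(:download_count)
    item.file.url
  end
end
```

A controller action would then look up the link by token, call consume!, and either redirect_to the returned URL or respond with 410 Gone. As noted above, redirecting means the bytes never pass through your server, so you pay for the S3 transfer only once.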