
So, I'm using Paperclip and aws-s3, which is awesome. And it works great. Just one problem, though: I need to upload really large files, as in over 50 megabytes. And so, nginx dies. So apparently Paperclip stores files to disk before sending them on to S3?

I found this really cool article, but it also seems to be going to disk first, and then doing everything else in the background.

Ideally, I'd be able to upload the file in the background... I have a small amount of experience doing this with PHP, but nothing with Rails as of yet. Could anyone point me in a general direction, even?

Simone Carletti
Steve Klabnik

4 Answers


You can bypass the server entirely and upload directly to S3, which will prevent the timeout. The same thing happens on Heroku. If you are using Rails 3, please check out my sample projects:

Sample project using Rails 3, Flash and MooTools-based FancyUploader to upload directly to S3: https://github.com/iwasrobbed/Rails3-S3-Uploader-FancyUploader

Sample project using Rails 3, Flash/Silverlight/GoogleGears/BrowserPlus and jQuery-based Plupload to upload directly to S3: https://github.com/iwasrobbed/Rails3-S3-Uploader-Plupload

By the way, you can do post-processing with Paperclip using something like what this blog post (that Nico wrote) describes:

http://www.railstoolkit.com/posts/fancyupload-amazon-s3-uploader-with-paperclip
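The core of any direct-to-S3 browser upload (FancyUploader, Plupload, and the like) is a signed POST policy that your Rails app hands to the client. Here is a minimal sketch of that signing step using only the Ruby standard library; the bucket name, key prefix, ACL, and size cap below are illustrative placeholders, and modern AWS SDKs generate presigned posts for you:

```ruby
require "base64"
require "json"
require "openssl"
require "time"

# Build the Base64-encoded policy document and its HMAC-SHA1 signature
# that a browser-based uploader posts to S3 alongside the file.
# All parameter values here are placeholders for illustration.
def s3_upload_policy(bucket:, key_prefix:, secret_access_key:, expires_in: 3600)
  policy_document = {
    "expiration" => (Time.now.utc + expires_in).iso8601,
    "conditions" => [
      { "bucket" => bucket },
      ["starts-with", "$key", key_prefix],
      { "acl" => "private" },
      ["content-length-range", 0, 500 * 1024 * 1024] # allow up to 500 MB
    ]
  }

  policy = Base64.strict_encode64(JSON.generate(policy_document))
  signature = Base64.strict_encode64(
    OpenSSL::HMAC.digest("sha1", secret_access_key, policy)
  )
  { policy: policy, signature: signature }
end
```

Because the file goes browser-to-S3, it never touches your Rails process or nginx, so server timeouts and body-size limits stop mattering for the upload itself.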

iwasrobbed

Maybe you have to increase the timeout in the nginx configs?
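For large uploads it is usually not just the timeout: nginx also caps the request body size. A sketch of the relevant directives (the values here are illustrative; tune them to your largest expected upload):

```nginx
http {
    client_max_body_size 100m;    # default is 1m; larger bodies get a 413 error
    client_body_timeout  300s;    # allow slow clients to finish sending the body

    server {
        location / {
            proxy_read_timeout 300s;         # don't drop the proxied Rails app mid-upload
            proxy_pass http://127.0.0.1:3000;
        }
    }
}
```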

BvuRVKyUVlViVIc7

You might be interested in my post here:

http://www.railstoolkit.com/posts/fancyupload-amazon-s3-uploader-with-paperclip

It's about uploading multiple files (simultaneously, with progress bars) directly to S3 without hitting the server.

  • Thanks for the link! The only problem I can see with this is that FancyUpload is Flash-based, and Flash has to load the entire file into memory before starting the upload. So if I want to upload a 300MB file, I have to have that much RAM... the Flash uploaders I tested all made my Firefox crash, and I have 4GB in my machine. However, the article is still interesting, and I'll be sure to refer to it later... – Steve Klabnik Aug 28 '09 at 17:34
  • Oh, that's indeed a disadvantage! I didn't know about that. –  Aug 29 '09 at 14:36
  • The comment about Flash loading the entire file into memory prior to uploading is no longer true; Flash now buffers only what it needs. – iwasrobbed May 27 '11 at 17:18

I was having a similar problem, but with Paperclip, Passenger and Apache.
Like nginx, Apache has a Timeout directive, which I increased to solve my problem.
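The corresponding change is a one-liner in the Apache config; the value below is illustrative (the default is 300 seconds, which a large upload over a slow link can easily exceed):

```apache
# httpd.conf -- raise the I/O timeout so long uploads aren't cut off
Timeout 1200
```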

There's also an interesting thing Passenger does when uploading large files: anything over 8 KB is written to /tmp/passenger, and if Apache doesn't have permission to write there, you get 500 errors as well.
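A quick way to check for that permissions problem (the `www-data` user is the Debian/Ubuntu default for Apache; adjust for your distro):

```shell
# Inspect ownership of Passenger's upload buffer directory,
# then hand it to the Apache user if it isn't writable.
ls -ld /tmp/passenger
sudo chown -R www-data /tmp/passenger
```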

Here's the article.
http://tinyw.in/fwVB

jacklin