What is the best way to let visitors to a website download files that are tens of gigabytes in size? In this case the files are movies, so they are already fairly well compressed.
Here are some assumptions:
Only one visitor will want to download a given file at any time, and nobody other than our servers holds the file at any one time.
There is a high probability that the transfer will be interrupted by network or other problems, so some form of resume must be available (a sketch of what I mean follows this list).
The visitor will get a download link either on the website or via email.
The solution should work at least for visitors running Windows or OS X. It would be a bonus if it also works on some Linux distribution.
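For what it's worth, the resume behavior I have in mind is what HTTP range requests (RFC 7233) provide, assuming the server supports them, as nginx and Apache do for static files. A minimal client-side sketch in Python, with a placeholder URL and file name:

```python
import os
import requests

URL = "https://example.com/movies/big-file.mkv"  # placeholder
DEST = "big-file.mkv"
CHUNK = 1024 * 1024  # stream in 1 MiB pieces

def resume_download(url: str, dest: str) -> None:
    """Download url to dest, resuming from a partial file if one exists."""
    offset = os.path.getsize(dest) if os.path.exists(dest) else 0
    headers = {"Range": f"bytes={offset}-"} if offset else {}
    with requests.get(url, headers=headers, stream=True, timeout=30) as resp:
        if offset and resp.status_code != 206:
            # Server ignored the Range header and sent the whole file;
            # fall back to overwriting from the start.
            offset = 0
        resp.raise_for_status()  # an already-complete file surfaces as 416 here
        with open(dest, "ab" if offset else "wb") as f:
            for chunk in resp.iter_content(CHUNK):
                f.write(chunk)

resume_download(URL, DEST)
```

Tools like wget -c and curl -C - do essentially this, so on the client side no custom code is strictly needed.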
My experience with FTP and plain HTTP is that some visitors manage to download what they believe is the entire file, only to find that it does not work because it is incomplete or corrupted.
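The incomplete case at least is cheap to detect: compare the local size against the server's Content-Length, which a HEAD request returns without downloading anything. A sketch, assuming the server sends that header for static files (same placeholder names as above):

```python
import os
import requests

URL = "https://example.com/movies/big-file.mkv"  # placeholder
DEST = "big-file.mkv"

# HEAD fetches only the headers; Content-Length is the full file size.
expected = int(requests.head(URL, timeout=30).headers["Content-Length"])
actual = os.path.getsize(DEST)
print("complete" if actual == expected else f"truncated: {actual}/{expected} bytes")
```

This catches truncation but not bit-level corruption, which is where checksums come in.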
Publishing the MD5 sum (or similar) on the website and asking the visitor to run md5 on the download and compare the results is too complicated for most website visitors.
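To be concrete about what I am asking of them, the verification step amounts to something like this (the expected digest is a placeholder; the real one would be published next to the download link):

```python
import hashlib

EXPECTED_MD5 = "d41d8cd98f00b204e9800998ecf8427e"  # placeholder digest
FILENAME = "big-file.mkv"

def md5_of(path: str) -> str:
    """Stream the file through MD5 so multi-gigabyte files fit in memory."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(1024 * 1024), b""):
            digest.update(block)
    return digest.hexdigest()

actual = md5_of(FILENAME)
print("OK" if actual == EXPECTED_MD5 else f"MISMATCH: {actual}")
```

Simple enough for a developer, but not something I can expect ordinary visitors to run.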
Zipping the file adds a layer of error and completeness checking, since each zip member carries a CRC, but it often adds manual steps for the visitor.
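For reference, the check a zip buys you can be run programmatically; in Python's standard library it looks like this (archive name is a placeholder):

```python
import zipfile

ARCHIVE = "big-file.zip"  # placeholder archive name

try:
    with zipfile.ZipFile(ARCHIVE) as zf:
        # testzip() re-reads every member and verifies its stored CRC;
        # it returns the first corrupt member's name, or None if all pass.
        bad = zf.testzip()
        print("archive OK" if bad is None else f"corrupt member: {bad}")
except zipfile.BadZipFile:
    # A truncated download typically fails here, because the central
    # directory at the end of the archive is missing or unreadable.
    print("archive incomplete or not a zip file")
```

But visitors would be running a GUI unzip tool rather than anything like this, and the extra extraction step is exactly the friction I want to avoid.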
Is there some better solution?