I've created an API in Laravel that allows users to upload zip archives containing images.
Once an archive is uploaded, it's sent to S3 and then picked up by another service for processing.
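For context, the endpoint is roughly this shape (a simplified sketch rather than my exact code; the controller name, field name, and disk configuration are placeholders, using Laravel's Storage facade over the AWS SDK):

```php
<?php

namespace App\Http\Controllers;

use Illuminate\Http\Request;
use Illuminate\Support\Facades\Storage;

class ArchiveController extends Controller
{
    // Accepts a multipart upload and hands the zip to the s3 disk;
    // the separate processing service picks it up from the bucket.
    public function store(Request $request)
    {
        $request->validate([
            'archive' => 'required|file|mimes:zip|max:102400', // limit is in kilobytes
        ]);

        $path = Storage::disk('s3')->putFile('archives', $request->file('archive'));

        return response()->json(['path' => $path], 201);
    }
}
```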
I'm finding that with larger archives, PHP keeps hitting its memory limit. I know I could raise the limit, but that feels like a slippery slope, especially once multiple users start uploading large files at the same time.
My current workaround has been to bypass my server entirely and have clients upload directly to S3. But that feels insecure and open to spamming/DDoSing.
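Concretely, the direct upload works off a URL signed server-side, something along these lines (a sketch with the AWS SDK for PHP; the bucket, key, region, and expiry are placeholder values):

```php
<?php

require 'vendor/autoload.php';

use Aws\S3\S3Client;

$client = new S3Client([
    'version' => 'latest',
    'region'  => 'eu-west-1',
]);

// Pre-sign a PUT for one specific key; the client uploads straight to S3
// with this URL and my server never touches the file body.
$command = $client->getCommand('PutObject', [
    'Bucket' => 'my-upload-bucket',
    'Key'    => 'uploads/' . uniqid('', true) . '.zip',
]);

$presigned = $client->createPresignedRequest($command, '+10 minutes');
$uploadUrl = (string) $presigned->getUri();
```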
I guess what I'm really hoping for is a discussion about how this could be handled elegantly.
Is there a language better suited to this sort of processing/concurrency? I could easily hand the upload step off to a separate service.
Are my concerns about S3 unfounded? I know every request needs to be signed, but the tokens generated are reusable, so they seem exploitable.
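The only mitigations I've come across so far are a short expiry and the policy conditions you can attach to a pre-signed POST, roughly like this (the bucket, key prefix, size cap, and expiry below are just example values):

```php
<?php

require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\PostObjectV4;

$client = new S3Client([
    'version' => 'latest',
    'region'  => 'eu-west-1',
]);

// Constrain what this signature allows: keys must sit under uploads/,
// and the body can be at most ~100 MB.
$options = [
    ['bucket' => 'my-upload-bucket'],
    ['starts-with', '$key', 'uploads/'],
    ['content-length-range', 1, 104857600],
];

$formInputs = ['key' => 'uploads/' . uniqid('', true) . '.zip'];

// The signed policy expires after 10 minutes.
$postObject = new PostObjectV4($client, 'my-upload-bucket', $formInputs, $options, '+10 minutes');

// These get rendered into the form the browser POSTs directly to S3.
$attributes = $postObject->getFormAttributes();
$inputs     = $postObject->getFormInputs();
```

Even then, as far as I can tell the same signed values can be reused until the policy expires, which is exactly the sort of thing I'm worried about.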
Resources online point to NGINX as a better fit, since its upload module writes uploads directly to disk, whereas Apache appears to do a lot of the work in memory (I'm not 100% sure about this).
To be honest, I'm pretty unclear about the whole PHP upload process. Is a request body held entirely in memory? i.e. would ten 50 MB uploads exhaust my 500 MB of RAM and cause memory limit exceptions?
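Part of my confusion is that I can't tell how much of the cost is the request handling itself versus what I then do with the file. The distinction I mean is something like this (illustrative only; field and path names are made up):

```php
<?php

use Illuminate\Http\Request;
use Illuminate\Support\Facades\Storage;

// Two ways I could hand the same uploaded file to the s3 disk.
function storeArchive(Request $request): void
{
    $upload = $request->file('archive');

    // (a) Reads the whole archive into a PHP string first, so a 50 MB zip
    //     costs at least 50 MB against memory_limit while it's held.
    $contents = file_get_contents($upload->getRealPath());
    Storage::disk('s3')->put('archives/' . $upload->hashName(), $contents);

    // (b) Passes a stream resource instead, which as I understand it gets
    //     copied through in chunks rather than buffered whole in memory.
    $stream = fopen($upload->getRealPath(), 'rb');
    Storage::disk('s3')->put('archives/' . $upload->hashName(), $stream);

    if (is_resource($stream)) {
        fclose($stream);
    }
}
```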