
Apologies for the rookie question; I would really like some advice on best practice regarding my issue.

I have ffmpeg installed on my box, and I am running a simple script that converts an .mp3 to a .wav file using shell_exec in PHP.
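For context, the conversion boils down to something like this (a minimal sketch; buildConvertCommand is a hypothetical helper, and the escapeshellarg quoting assumes a Unix shell):

```php
<?php
// Hypothetical helper: build the ffmpeg command, assuming ffmpeg is on the PATH.
// escapeshellarg() keeps a crafted filename from injecting shell commands.
function buildConvertCommand(string $in, string $out): string {
    return 'ffmpeg -y -i ' . escapeshellarg($in) . ' ' . escapeshellarg($out);
}

// Typical use: shell_exec() runs the command and blocks until ffmpeg finishes.
// $cmd = buildConvertCommand('song.mp3', 'song.wav');
// shell_exec($cmd);
```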

This works fine when not abused. Recently, a user of my service wrote a macro script that started converting nearly 10,000 MP3s, which wiped out my server until I pulled the plug.

Now, I fully understand I should have something in place to prevent this problem. What is the best method to prevent this kind of situation?

I checked the EC2 instance and had to stop and restart it; the CPU was peaking over 90% and everything crashed.

Does it make sense to have a PHP script check the box's CPU load and, if it is somewhere over 50%, just not run the conversion?
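Something like this is what I had in mind (a rough sketch only; sys_getloadavg() is Unix-only, nproc is a Linux command, and serverTooBusy is just an illustrative name):

```php
<?php
// Hypothetical guard: refuse new conversions when the 1-minute load
// average exceeds half the core count (the 50% threshold above).
function serverTooBusy(float $threshold = 0.5): bool {
    $load  = sys_getloadavg()[0];        // 1-minute load average (Unix only)
    $cores = (int) shell_exec('nproc');  // CPU core count (Linux only)
    return ($load / max($cores, 1)) > $threshold;
}

// Typical use at the top of the conversion endpoint:
// if (serverTooBusy()) { http_response_code(503); exit('Server busy, try later'); }
```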

I am still learning and would appreciate some advice to give me peace of mind that this issue won't happen again.

Thanks

user1503606

1 Answer


Two things: 1) rate limiting on your API for both paying customers and free-tier users, and 2) the usual AWS architecture pattern for this, which is to queue the conversion requests and scale workers based on the size of the queue. You could even have separate queues for paid users and free users, so that the workers check the paid queue first, or more often than the free-tier one.
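For the first point, a minimal per-user rate limiter could look like this (a hedged sketch, not production code; allowRequest and the file-based fixed-window counter are illustrative only — in production you would use something like Redis or APCu instead of temp files):

```php
<?php
// Hypothetical fixed-window rate limiter: allow at most $limit conversions
// per user within each $window-second window, tracked in a temp file.
function allowRequest(string $userId, int $limit, int $window): bool {
    $file = sys_get_temp_dir() . '/rate_' . md5($userId);
    $now  = time();
    $data = @json_decode(@file_get_contents($file), true)
          ?: ['start' => $now, 'count' => 0];
    if ($now - $data['start'] >= $window) {   // window expired: reset counter
        $data = ['start' => $now, 'count' => 0];
    }
    if ($data['count'] >= $limit) {
        return false;                          // over the limit: reject
    }
    $data['count']++;
    file_put_contents($file, json_encode($data));
    return true;
}
```

The conversion endpoint would call allowRequest() before shelling out to ffmpeg and return a 429 when it comes back false; 10,000 back-to-back requests from one user would then be cut off after the first few.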

http://docs.aws.amazon.com/autoscaling/latest/userguide/as-using-sqs-queue.html

strongjz