I have a process that works in the following way:
- downloading data from the internet
- executing a program on the downloaded data → creating output A
- output A → executing a program → creating output B
- output B → executing another program → creating output C
- output C → executing yet another program → creating output D
All of this is automated in a bash script, roughly like the sketch below. I already know how to use crontab to schedule execution.
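Here is a minimal sketch of the script; the URL, program names, and file names are just placeholders for my actual tools:

```bash
#!/usr/bin/env bash
set -euo pipefail

# URL, program names, and file names below are placeholders
curl -o raw_data "https://example.com/source-data"   # download data from the internet
program_a raw_data  > output_A
program_b output_A  > output_B
program_c output_B  > output_C
program_d output_C  > output_D
```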
I now want it to run every 6 hours and upload output D to an FTP server that is accessible over the internet. I do not need a nice-looking HTML website, just FTP. I already have a domain.
My questions are: What is the least costly way to do this? I basically need a machine running 24/7. How do I get output D onto the FTP server? Does the FTP server have to run on the same machine, or on a second one?
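For context, this is roughly what I imagine the cron entry and the upload step would look like; the host name, credentials, and remote path are placeholders, and I have no idea whether this is the right approach:

```
# crontab entry: run the pipeline at minute 0, every 6 hours
0 */6 * * * /home/me/pipeline.sh

# upload step I imagine adding at the end of pipeline.sh
# (host, credentials, and remote path are placeholders)
curl -T output_D "ftp://ftp.example.com/outgoing/" --user myuser:mypassword
```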
As you can see, I do not know a lot about web stuff. I know a bit about Amazon EC2.