I'm building a web application, and I need an architecture that lets me run it across two servers. The application scrapes information from other sites, both periodically and on input from the end user. To do this I'm using PHP + cURL to scrape the information, and PHP or Python to parse it and store the results in a MySQL DB.
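For the parse-and-store step, this is a minimal sketch of what I have in mind on the Python side. It assumes the HTML has already been fetched by the PHP + cURL spider; the `scraped_items` table, the CSS selector, and the connection details are all placeholders, not an existing schema:

```python
# Sketch: parse fetched HTML and store the results in MySQL.
# Table name, selector, and credentials are placeholders.
from bs4 import BeautifulSoup
import mysql.connector

def parse_and_store(html, source_url):
    soup = BeautifulSoup(html, "html.parser")
    # Hypothetical selector -- depends on the site being scraped.
    items = [el.get_text(strip=True) for el in soup.select("div.item")]

    conn = mysql.connector.connect(
        host="db-server", user="scraper", password="secret", database="scrapedb"
    )
    cur = conn.cursor()
    cur.executemany(
        "INSERT INTO scraped_items (source_url, content) VALUES (%s, %s)",
        [(source_url, item) for item in items],
    )
    conn.commit()
    cur.close()
    conn.close()
```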
Then I will use Python to run some algorithms on the data; this will also happen both periodically and on input from the end user. I'm going to cache some of the results in the MySQL DB, and when a result is specific to one user I'll skip storing it and just serve it to that user.
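For the caching part, here is roughly what I mean. The `result_cache` table and the hash-based key are assumptions I've made up for the sketch, not a schema I already have:

```python
# Sketch: cache algorithm results in MySQL, keyed by a hash of the inputs.
# The result_cache table (cache_key, result) is a hypothetical schema.
import hashlib
import json
import mysql.connector

def cached_result(conn, algorithm_name, params, compute):
    key = hashlib.sha256(
        (algorithm_name + json.dumps(params, sort_keys=True)).encode()
    ).hexdigest()
    cur = conn.cursor()
    cur.execute("SELECT result FROM result_cache WHERE cache_key = %s", (key,))
    row = cur.fetchone()
    if row is not None:
        return json.loads(row[0])  # cache hit: reuse the stored result

    result = compute(params)  # cache miss: run the actual algorithm
    cur.execute(
        "INSERT INTO result_cache (cache_key, result) VALUES (%s, %s) "
        "ON DUPLICATE KEY UPDATE result = VALUES(result)",
        (key, json.dumps(result)),
    )
    conn.commit()
    return result
```

For user-specific results I would call `compute` directly and skip the table entirely.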
I'm thinking of using PHP for the website front end on a separate web server, with the PHP spider, the MySQL DB, and the Python code running on the other server.
What framework(s) should I use for this kind of job? Are MVC and CakePHP a good solution? If so, will I be able to control and monitor the Python code from it? One way I've imagined the control part working is sketched below.
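The idea is a shared jobs table in MySQL: the PHP front end inserts a row to trigger work, and a Python worker on the other server polls it and writes status back. The table, its columns, and `run_algorithm` are all hypothetical:

```python
# Sketch: a Python worker polling a shared MySQL jobs table that the
# PHP front end writes to. Table, columns, and run_algorithm are
# hypothetical; a real version would also need to guard against two
# workers claiming the same job.
import time
import mysql.connector

def run_algorithm(params):
    # Placeholder for the actual analysis code.
    pass

def worker_loop(conn):
    cur = conn.cursor(dictionary=True)
    while True:
        cur.execute(
            "SELECT id, params FROM jobs WHERE status = 'pending' "
            "ORDER BY id LIMIT 1"
        )
        job = cur.fetchone()
        if job is None:
            time.sleep(5)  # nothing to do; poll again shortly
            continue
        cur.execute("UPDATE jobs SET status = 'running' WHERE id = %s", (job["id"],))
        conn.commit()
        try:
            run_algorithm(job["params"])
            cur.execute("UPDATE jobs SET status = 'done' WHERE id = %s", (job["id"],))
        except Exception:
            cur.execute("UPDATE jobs SET status = 'failed' WHERE id = %s", (job["id"],))
        conn.commit()
```

The PHP side would then just read the `status` column to monitor progress. Is something like this reasonable, or is there a better mechanism?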
Thanks