
To give a brief idea of my current setup:

An HTML form collects the data for the Python script

PHP inserts the form data (POST) into the input MySQL table

PHP exec()s the script: python3.4 script.py row_id

Python collects the row ID from sys.argv[1], runs the analysis, and inserts the results into the output MySQL table. When it completes, PHP displays a report (PHP waits for Python to finish, then I use header() to redirect to the report URL).

The report pages are HTML, with the blanks filled in by PHP from the output MySQL table.
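For concreteness, here is a stripped-down sketch of the script.py flow described above. The table and column names are placeholders, and I am assuming the PyMySQL driver, though any MySQL DB-API driver would do:

```python
import sys
import pymysql  # assumption: any MySQL DB-API driver would work

def run_analysis(payload):
    """Placeholder for the real statistical work (numpy etc.)."""
    return payload

row_id = int(sys.argv[1])  # row ID passed in by PHP's exec()

conn = pymysql.connect(host="localhost", user="user",
                       password="secret", db="mydb")
try:
    with conn.cursor() as cur:
        # 'input_jobs' / 'output_results' are placeholder table names
        cur.execute("SELECT payload FROM input_jobs WHERE id = %s", (row_id,))
        payload = cur.fetchone()[0]
        result = run_analysis(payload)
        cur.execute(
            "INSERT INTO output_results (job_id, result) VALUES (%s, %s)",
            (row_id, result))
    conn.commit()
finally:
    conn.close()
```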

This worked fine while I was testing and developing. The problem came when I released it into the wild for more general testing within the company - it is hosted on a VPS, and when the server ran more than one instance of Python it ran like a dog (as we say in Scotland, because to say slow would be too easy).

Of course, the hosting company suggested upgrading, which we did, and that doubled the capacity (from 1, sometimes 2, simultaneous runs to 2, sometimes 3). I have now reached the conclusion that this method is not scalable; we are always going to hit some kind of limit.

So I have been looking at Amazon Web Services, but doing this requires a change of philosophy in the Python design - it must run all the time and have data fed to it somehow (as far as I can see).

The Amazon suggestion is Flask or a similar framework, but this would mean I either have two web servers and some kind of cross-domain transfer of data, or I scrap what I have done - or at least heavily modify the PHP/HTML/JS that runs on the web server so that it can be served by Flask and all be hosted on AWS.
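To illustrate the shape of that change, a rough sketch of the Flask route (the endpoint, field name, and run_analysis are all placeholders of mine, not anything AWS prescribes):

```python
from flask import Flask, request, jsonify

app = Flask(__name__)

def run_analysis(row_id):
    """Placeholder for the real statistical work."""
    return {"row_id": row_id, "status": "done"}

@app.route("/run", methods=["POST"])
def run():
    # PHP would POST the row ID here instead of calling exec()
    row_id = int(request.form["row_id"])
    return jsonify(run_analysis(row_id))

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```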

Unless someone can suggest another means of communication that doesn't require a web framework, I have been considering two options: polling the MySQL database at intervals and processing any new data (sketched below), or migrating the database to PostgreSQL, which, I believe, allows Python code to run inside the database (PL/Python) and would enable me to trigger on insert (I believe user-defined functions in MySQL only support C/C++).
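The polling option would amount to a long-running worker along these lines (again, the table names and status column are my own invention):

```python
import time
import pymysql

def process_row(row_id):
    """Placeholder: fetch the input row, run the analysis, store the output."""

def poll_forever(interval=5):
    conn = pymysql.connect(host="localhost", user="user",
                           password="secret", db="mydb", autocommit=True)
    while True:
        with conn.cursor() as cur:
            # claim any unprocessed rows, then work through them
            cur.execute("SELECT id FROM input_jobs WHERE status = 'new'")
            for (row_id,) in cur.fetchall():
                cur.execute(
                    "UPDATE input_jobs SET status = 'running' WHERE id = %s",
                    (row_id,))
                process_row(row_id)
        time.sleep(interval)

if __name__ == "__main__":
    poll_forever()
```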

Thanks for any suggestions or pointing out of things I am missing.

Blair

  • Whatever the Python script is doing, I would bet the PHP could do. Wouldn't that be an easier approach? –  Feb 03 '17 at 03:04
  • You can use a task queue such as Celery from PHP (using something like https://github.com/gjedeer/celery-php) to run the tasks in series, instead of simultaneously. – Selcuk Feb 03 '17 at 03:06
  • You might want to look at using Python Pyro to set up a client/server daemon that keeps your Python code running in memory. It is a little tricky to set up, but if you're interested check out [How do I properly call a Python Pyro client using PHP and Apache web server?](http://stackoverflow.com/questions/41516827/how-do-i-properly-call-a-python-pyro-client-using-php-and-apache-web-server) – infinigrove Feb 03 '17 at 04:07
  • Thanks for the advice - I'll look into those. @nogad: unfortunately, I do statistical analysis on the input, which runs to hundreds of thousands of calculations and requires numpy and various other modules. –  Feb 03 '17 at 13:21

1 Answer


Further to infinigrove's suggestion, Pyro4 works well for my use case; there is an excellent tutorial that provides an easy learning curve.
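A minimal sketch of the shape this takes (the object name, port, and method are mine, not from the tutorial): the analysis object lives in a permanently running daemon, so numpy and friends are imported once, and PHP exec()s only a tiny client script.

```python
# server.py - runs permanently, keeping the heavy imports in memory
import Pyro4

@Pyro4.expose
class Analysis(object):
    def run(self, row_id):
        """Placeholder: fetch the row, do the statistics, store the output."""
        return "done"

daemon = Pyro4.Daemon(host="localhost", port=9090)
uri = daemon.register(Analysis(), objectId="analysis")
print("Analysis server ready:", uri)
daemon.requestLoop()
```

```python
# client.py - what PHP exec()s instead of the full analysis
import sys
import Pyro4

proxy = Pyro4.Proxy("PYRO:analysis@localhost:9090")
print(proxy.run(int(sys.argv[1])))
```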

Selcuk's suggestion would probably also have worked (a rough sketch of the worker side is below), but I would still have been left with the problem of running the code on other servers in a distributed environment. Also, if there is a choice between writing Python and writing PHP, I am afraid I take Python every time!
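For anyone weighing that option, the Celery worker would be roughly this (the broker URL, module name, and task body are my assumptions; the PHP side would enqueue jobs through celery-php):

```python
# tasks.py - start a worker with: celery -A tasks worker
from celery import Celery

# assumption: a Redis broker; RabbitMQ would work just as well
app = Celery("tasks", broker="redis://localhost:6379/0")

@app.task
def run_job(row_id):
    """Placeholder: fetch the input row, do the numpy work, store the output."""
```

Jobs then queue up and run at whatever concurrency the workers are given, rather than all at once.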