
This is more of an architectural question.

I'm about to code a bunch of Import implementations. They all expect some parameters (e.g. a CSV file) and then take quite some time to run. In my previous project, I used to send those Imports to the background using a shell_exec() call and then monitored a logfile in the browser to report on the status. My big hope now is that Laravel takes over here and streamlines all that manual work.

For now, my question would be about the proposed class architecture behind this.


My requirements for a bunch of imports are:

  • Each Import needs to run as a background process
  • Monitor progress in the browser (and a logfile)
  • Start imports from the console and via HTTP

Right now I plan to use a Job in Laravel 5.1 to implement the basic Import. What I'm struggling with is the implementation of some kind of progress bar and monitoring of the most recent log messages in the browser. I do not need a real "live" view via sockets, but it should be possible to regularly refresh the progress view of a running Import.

  • Does anybody have hints on how to implement this progress tracking?

My approach so far: read the CSV file, push each line onto a queue as its own element, and monitor the queue. The log messages could trigger an event that populates a stack of the most recent log messages. (I may run into race conditions, because some lines may depend on another line having been processed first.)
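The progress part of this idea can be sketched roughly as follows with a Laravel 5.1 queued Job that writes its progress to the cache, where an HTTP controller can poll it. This is only a sketch under my own assumptions: the class name ImportCsvJob, the cache key scheme, and the importLine() helper are hypothetical, not anything Laravel ships with.

```php
<?php

namespace App\Jobs;

use Illuminate\Contracts\Bus\SelfHandling;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Support\Facades\Cache;

// Hypothetical job: reads a CSV file line by line and stores its
// progress (percent) in the cache under a key derived from an import id.
class ImportCsvJob extends Job implements SelfHandling, ShouldQueue
{
    use InteractsWithQueue;

    protected $path;
    protected $importId;

    public function __construct($path, $importId)
    {
        $this->path = $path;
        $this->importId = $importId;
    }

    public function handle()
    {
        $lines = file($this->path, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
        $total = count($lines);

        foreach ($lines as $i => $line) {
            $this->importLine(str_getcsv($line));

            // Persist progress so a controller can poll it, e.g.
            // Cache::get("import:{$id}:progress") from an AJAX endpoint.
            Cache::put(
                'import:' . $this->importId . ':progress',
                (int) round(($i + 1) / $total * 100),
                60 // minutes (Laravel 5.1 semantics)
            );
        }
    }

    protected function importLine(array $row)
    {
        // ... actual import logic for one CSV row ...
    }
}
```

A browser view could then hit a small route every few seconds that returns the cached percentage as JSON, which avoids sockets entirely.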


1 Answer


I would create an ActiveBackgroundTask model like this:

  • handler_class_name
  • state
  • progress
  • latest_log_messages
  • result

Then create a cron task in your system that periodically checks this table and starts tasks that appear in the created state. Each task is passed its id in the table so that it can periodically update the progress, result and latest_log_messages fields.
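In Laravel terms, the model above could look something like this migration plus a minimal cron-side dispatcher. Treat it as a sketch: the table name, the state values (created / running / finished / failed) and the run() method on the handler class are my own assumptions, not part of the answer or of Laravel.

```php
<?php

use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

// Migration sketch for the proposed task table.
Schema::create('active_background_tasks', function (Blueprint $table) {
    $table->increments('id');
    $table->string('handler_class_name');
    $table->string('state')->default('created'); // created / running / finished / failed
    $table->unsignedTinyInteger('progress')->default(0); // percent
    $table->text('latest_log_messages')->nullable();
    $table->text('result')->nullable();
    $table->timestamps();
});

// Cron-side dispatcher sketch: pick up one created task and run it.
// Assumes each handler class exposes a run($taskId) method that updates
// its own row as it progresses.
$task = ActiveBackgroundTask::where('state', 'created')->first();

if ($task) {
    $task->update(['state' => 'running']);
    $handler = new $task->handler_class_name;
    $handler->run($task->id);
}
```

The dispatcher itself could be an Artisan command invoked from crontab (or the Laravel scheduler) every minute.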

You can expand on this idea by, for example, standardizing the location of the log files for each task, so that not only the latest messages can be extracted but the full task log can also be downloaded.

In this case the state of each task can be checked very easily from any script in your system.

There will be a problem of detecting dead tasks that aborted due to a PHP error or an uncaught exception. If you need this, you can store the PHP process PIDs so that the cron script can check whether tasks in the running state are actually still running.
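The PID liveness check could be sketched like this (assuming a pid column was added to the table when the task recorded getmypid() at startup; the column and state names are hypothetical). It relies on the POSIX convention that sending signal 0 only tests for the process's existence.

```php
<?php

// Cron-side sweep: mark running tasks whose process no longer exists
// as failed. posix_kill($pid, 0) sends no signal; it returns true only
// if the process exists and we may signal it (requires ext-posix).
foreach (ActiveBackgroundTask::where('state', 'running')->get() as $task) {
    if ($task->pid && !posix_kill($task->pid, 0)) {
        $task->update([
            'state'               => 'failed',
            'latest_log_messages' => 'Process ' . $task->pid . ' died unexpectedly',
        ]);
    }
}
```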

Does that suit your needs?

  • Yes, that should do it. With my question I was also curious about what Laravel brings with it out of the box, and which design patterns are recommended here. – patriziotomato Oct 12 '15 at 07:26
  • @redless81 I am not sure you should look at things this way ;) Always choose the simplest solution possible. Patterns are for making complex things simpler. I dislike Laravel Jobs in 5.1. They seem artificial to me. – Vladislav Rastrusny Oct 12 '15 at 07:33
  • I agree and disagree ;) I need to import about 50 MB of CSV data and distribute it over several tables. The import process may take up to several hours, so I think I need to consider aspects like scalability. I also want to take advantage of the architecture provided by the framework, because it establishes some kind of standard for other developers and makes the code more testable. That said, I agree that you should avoid using patterns if you do not need them. – patriziotomato Oct 12 '15 at 16:02