
I would like to write a Linux service in Perl, but I don't know how to do it. (I do know Perl.)

I have a MySQL database and need to check it for new records so I can process them. The service should respond almost instantly, so cron is not a good choice, and I don't want to use workarounds just to run my script every second. Besides, the script is threaded and may take some time to finish, which makes cron even less suitable.

So, I have two options:

1. A service that queries the database every second and processes new records as they arrive.
2. Something like a MySQL trigger that fires the script when a new record is inserted. However, this must not slow down the other software that inserts the records.

What do you suggest? What is the best way to achieve this?

Edit: For example, do I need to do something like this when writing a service,

    while (1) {
        # do stuff
        sleep(1);
    }

and write a simple init script? Is there a more elegant way?

Thanks in advance,

5 Answers


There are several CPAN modules that let you write a daemon process, e.g. Daemon::Daemonize.
I am not 100% sure it is such a good idea, depending on how the MySQL database is handled, since you need to manage your database connection carefully, but it makes things a lot easier for your admins. Also be careful with implicit and explicit locks, since the database is used by another application that might not work with locked tables.
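
For a concrete picture, here is a minimal sketch of that approach; it assumes Daemon::Daemonize's daemonize() together with DBI/DBD::mysql, and the DSN, credentials, and "jobs" table are all made up:

    # Minimal sketch only -- Daemon::Daemonize and DBD::mysql assumed installed;
    # the DSN, credentials, and the "jobs" table are hypothetical.
    use strict;
    use warnings;
    use Daemon::Daemonize qw( daemonize );
    use DBI;

    daemonize();    # detach from the terminal and keep running in the background

    # Connect after daemonizing so the handle is not shared across the fork.
    my $dbh = DBI->connect(
        'DBI:mysql:database=mydb;host=localhost',    # hypothetical DSN
        'dbuser', 'dbpass',
        { RaiseError => 1, AutoCommit => 1 },
    );

    my $last_id = 0;
    while (1) {
        # Poll for rows added since the last pass (illustrative query).
        my $rows = $dbh->selectall_arrayref(
            'SELECT id, payload FROM jobs WHERE id > ? ORDER BY id',
            { Slice => {} },
            $last_id,
        );
        for my $row (@$rows) {
            $last_id = $row->{id};
            # ... process $row->{payload} here ...
        }
        sleep 1;
    }

A long-running daemon like this is exactly where the careful connection handling mentioned above matters; an idle connection can be dropped by the server, and DBD::mysql's mysql_auto_reconnect attribute is one way to deal with that.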

weismat

FWIW, I like the 'non-forking infinite loop' solution combined with a process supervisor like DJB's daemontools. I think Ubuntu's Upstart also provides this. The key feature is rigging your supervisor to restart your service automatically should it fail.

In the case of daemontools, it also lets you delegate logging to a separate dedicated program. All your service needs to do is print to stdout. When you set up the logging service for your script, you can use multilog to keep a private log, or /usr/bin/logger to send the output to syslog, all without modifying your script. (Upstart may provide this too, but I don't know; my primary experience is with daemontools.)

And if you hate DJB's distribution, consider runit, which is modeled on DJB's daemontools and provides much of the same functionality.
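
As a rough illustration of that setup (everything here is a placeholder): the script never daemonizes itself, it just loops in the foreground and prints to stdout, and the supervisor's run script execs it so multilog or logger can capture the output.

    # Sketch of a non-forking service meant to run under daemontools or runit.
    # It stays in the foreground and simply prints to stdout; the supervisor
    # handles restarting and logging. The actual "check for new records" work
    # is a placeholder.
    use strict;
    use warnings;

    $| = 1;    # unbuffered stdout so log lines reach the log service promptly

    while (1) {
        # ... check the database and process new records here ...
        print scalar(localtime), " poll complete\n";
        sleep 1;
    }

    # A daemontools-style run script for it could be as small as:
    #   #!/bin/sh
    #   exec perl /service/myjob/poller.pl 2>&1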

Jason

If you have many record insertions, triggering an external service for every insertion will kill your application, so in that case you will want to batch-process the records on a schedule. For a low number of insertions, however, you may not want to wait for the batch run, and then a trigger may be the better fit.

To keep it running when it fails, you may want to configure it to respawn itself. There are a number of ways to do this, including using daemontools as mentioned, respawning it from inittab, or even wrapping it in a shell script that runs an infinite loop; a sketch of that last option follows below.

Nested infinite loops!
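
Purely as an illustration of that wrapper idea: the suggestion above is a shell loop, but the same thing expressed in Perl (with a made-up worker path) looks like this.

    # Respawn wrapper: the answer suggests a shell script with an infinite
    # loop; this is the equivalent idea in Perl. The worker path is made up.
    use strict;
    use warnings;

    while (1) {
        # Run the worker; when it exits or dies, restart it after a short pause.
        system( $^X, '/usr/local/bin/db-worker.pl' );
        warn "worker exited with status $?, restarting in 5 seconds\n";
        sleep 5;
    }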

sybreon

I would also suggest you do this in C if possible. If speed is required, you can't beat it. Check out:

http://www.google.co.uk/search?rlz=1C1CHMA_en-GBGB359GB359&sourceid=chrome&ie=UTF-8&q=writing+a+linux+service+in+C

Khushil

I would suggest having the Perl script monitor the MySQL binary log and act on any change in its size (which would indicate rows being added, deleted, or updated anywhere in the database).

You could use the Time::HiRes module to sleep a quarter of a second at a time, check for a change in the log file size, and then run the appropriate SQL.
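
A rough sketch of that polling idea; the binlog path is made up (and note that the active binlog file name changes as the log rotates), and the "run the appropriate SQL" step is left as a placeholder:

    # Illustrative only: binary logging must be enabled, the binlog path is
    # hypothetical, and the SQL to run on a change is left as a placeholder.
    use strict;
    use warnings;
    use Time::HiRes qw( sleep );    # fractional-second sleep()

    my $binlog    = '/var/lib/mysql/mysql-bin.000001';    # hypothetical path
    my $last_size = ( -s $binlog ) || 0;

    while (1) {
        my $size = ( -s $binlog ) || 0;
        if ( $size != $last_size ) {
            $last_size = $size;
            # ... something changed: run the appropriate SQL here ...
        }
        sleep 0.25;    # quarter-second poll, as suggested above
    }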

PP.