
I am running a Perl script on an HP-UX box. The script executes every 15 minutes and needs to compare its results with the results of the previous execution.

I need to store two variables (IsOccuring and ErrorCount) between executions. What is the best way to do this?

Edit clarification:
It only compares the most recent execution to the current execution.
It doesn't matter if the value is lost between reboots.
And touching the filesystem is pretty much off limits.

Malfist
  • Do you have to keep a historical record of the variables, or just compare against the most recent? That would make the difference between using a CSV/XML file or a database ;) – Dave Lasley Sep 13 '11 at 13:49
  • Linux, or HPUX? Two totally different OSs -- although the differences aren't really material here. – Ernest Friedman-Hill Sep 13 '11 at 13:50
  • Just to make sure I understand: you're saying (1) you can't use the filesystem and also (2) you can't keep the connection open (your answer to @tMC)? If so, I suppose you will want to connect to a database or a file on a different machine (though that strikes me as very ugly). – Telemachus Sep 13 '11 at 14:00
  • There are ways to store things other than a filesystem or a database. I was thinking the best way might be an environment variable or IPC shared memory, but I wanted to see what the community thought. – Malfist Sep 13 '11 at 14:04
  • @Malfist Fair enough I suppose. As far as I know, you can't export environment variables from a Perl program back into the parent environment (at least not without something fairly roundabout). – Telemachus Sep 13 '11 at 14:25

7 Answers

5

If you can't touch the file system, try using a shared memory segment. There are helper modules for that, such as IPC::ShareLite, or you can use shmget and the related functions directly.
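
A minimal sketch of that approach with IPC::ShareLite (the key value is arbitrary; every run just has to use the same one):

    use strict;
    use warnings;
    use IPC::ShareLite;    # CPAN wrapper around SysV shared memory

    my $share = IPC::ShareLite->new(
        -key     => 1971,     # arbitrary key shared by all runs
        -create  => 'yes',    # create the segment if it doesn't exist yet
        -destroy => 'no',     # leave it behind when this run exits
    ) or die "Cannot attach shared memory: $!";

    # Pull the values left by the previous run (empty on the very first run).
    my ( $is_occuring, $error_count ) = split /\|/, ( $share->fetch || '' );
    $is_occuring = 0 unless defined $is_occuring;
    $error_count = 0 unless defined $error_count;

    # ... compare against this run's results here ...

    # Store the current values for the run 15 minutes from now.
    $share->store( join '|', $is_occuring, $error_count );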

Zaid
Mat
4

You'll have to store them in a file. This sort of file is often kept in /tmp, but any place where the user running the cron job has access would do. Make sure your script can handle the case where the file is missing.
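
For example (a sketch; the path and the one-line format are just placeholders):

    use strict;
    use warnings;

    my $state_file = '/tmp/myscript.state';    # any writable location will do

    my ( $is_occuring, $error_count ) = ( 0, 0 );

    # Load the previous run's values if the file is there.
    if ( -e $state_file ) {
        open my $in, '<', $state_file or die "read $state_file: $!";
        chomp( my $line = <$in> );
        ( $is_occuring, $error_count ) = split /\s+/, $line;
        close $in;
    }

    # ... do the comparison ...

    # Overwrite the file with this run's values for next time.
    open my $out, '>', $state_file or die "write $state_file: $!";
    print $out "$is_occuring $error_count\n";
    close $out;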

Ernest Friedman-Hill
2

You could create a separate process running a "remember stuff" service over your choice of IPC mechanism. It sounds like a rather tortured solution to "I don't want to touch the disk", but if it's important enough to offset a couple of days of development work (realistically, if you are new to IPC and HP-SUX continues to live up to its name), then by all means read man perlipc for a start.
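
A bare-bones sketch of such a service, assuming a loopback TCP socket on an arbitrary port rather than a Unix-domain socket (which would itself touch the filesystem):

    use strict;
    use warnings;
    use IO::Socket::INET;

    # The long-running "remember stuff" daemon.
    my $server = IO::Socket::INET->new(
        LocalAddr => '127.0.0.1',
        LocalPort => 7777,     # arbitrary free port
        Listen    => 5,
        Reuse     => 1,
    ) or die "listen: $!";

    my %state = ( IsOccuring => 0, ErrorCount => 0 );

    while ( my $client = $server->accept ) {
        chomp( my $line = <$client> || '' );
        if ( $line =~ /^SET (\w+) (\S+)$/ && exists $state{$1} ) {
            $state{$1} = $2;
            print $client "OK\n";
        }
        elsif ( $line =~ /^GET (\w+)$/ && exists $state{$1} ) {
            print $client "$state{$1}\n";
        }
        close $client;
    }

The cron-driven script then talks to it in a couple of lines:

    my $sock = IO::Socket::INET->new( PeerAddr => '127.0.0.1', PeerPort => 7777 )
        or die "connect: $!";
    print $sock "GET ErrorCount\n";
    chomp( my $error_count = <$sock> );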

tripleee
1

Does it have to be completely re-executed? Can you just have it running in a loop and sleeping for 15 minutes between iterations? Then you don't have to worry about saving the values externally; the program never stops.
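
Something along these lines, assuming the actual work lives in a (hypothetical) check_and_compare() routine:

    use strict;
    use warnings;

    # The previous run's values simply stay in memory between iterations.
    my ( $is_occuring, $error_count ) = ( 0, 0 );

    while (1) {
        ( $is_occuring, $error_count ) =
            check_and_compare( $is_occuring, $error_count );
        sleep 15 * 60;    # wait 15 minutes before the next pass
    }

    sub check_and_compare {
        my ( $was_occuring, $old_errors ) = @_;
        # ... compare against the previous values, update counters ...
        return ( $was_occuring, $old_errors );    # placeholder
    }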

tMC
1

I definitely think IPC is the way to go here.

torstenvl
0

I'd save off the data in a file. Then, inside the script I'd load the last results if the file exists.

bsegraves
0

Use the Storable module to serialize your Perl data structures, save them anywhere you want, and deserialize them during the next script execution.
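
For instance (a sketch; the file location is only an example, and like the other file-based answers it does touch the filesystem):

    use strict;
    use warnings;
    use Storable qw(store retrieve);

    my $state_file = '/tmp/myscript.storable';    # example path

    # Load last run's state if present, otherwise start fresh.
    my $state = -e $state_file
        ? retrieve($state_file)
        : { IsOccuring => 0, ErrorCount => 0 };

    # ... compare $state->{IsOccuring} and $state->{ErrorCount} against this run ...

    # Persist for the next run.
    store( $state, $state_file );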