
What I want to do:

So my question is: how can I automatically delete entries exactly 24 hours after their creation, in an optimized manner?

What you need to know:

I am looking for a solution that will also allow me to delete the files associated with each database entry.

My idea:

My initial idea was a Perl script that checks every hour whether the earliest entry was created 24 hours ago and, if so, deletes the entry plus its associated file. But I find this solution a bit ugly, and I am sure there is a better one: more optimized and more elegant!

The table:

UID, COUNT, VALUE_1, VALUE_2, ..., TIMESTAMP, LINK

If you have any inspiration please tell me!

jaypal singh
torr
    sounds like a cron job to me – Sterling Archer Jul 05 '13 at 22:43
  • I do not fully get the question. You have an SQL database with links to files? I mean, you have to crawl your database for old stuff, delete the row, delete the file. Is this a very high-load system, so that it has to be specifically optimized? – Alex Jul 05 '13 at 22:46
  • For each entry created there is a file created too, and the absolute link to this file is stored in the entry. There will be a lot of entries. A cron job seems ill-suited: if an entry is created at 00:01 am, an hourly cron job will do 23 useless checks and, worse, will act up to an hour too late. – torr Jul 05 '13 at 22:52

4 Answers


You could use MySQL's event scheduler either:

  • to automatically delete such records when they expire:

    CREATE EVENT delete_expired_101
    ON SCHEDULE AT CURRENT_TIMESTAMP + INTERVAL 24 HOUR DO
    DELETE FROM my_table WHERE id = 101;
    
  • to run an automatic purge of all expired records on a regular basis:

    CREATE EVENT delete_all_expired
    ON SCHEDULE EVERY 1 HOUR DO
    DELETE FROM my_table WHERE expiry < NOW();
    
Stephen R
  • Do you know if I can execute a file with the MySQL scheduler? Your solution seems perfect; I just need to erase the associated file as well and it would be done! – torr Jul 05 '13 at 23:08
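
Note that MySQL's event scheduler must itself be switched on (SET GLOBAL event_scheduler = ON;), and an event can only run SQL statements: it cannot remove files from disk. One hedged workaround is a small shell wrapper, run hourly from cron, that first deletes the files and then the rows. The table, column and database names below are assumptions taken from the question, and the file paths are read from stdin so the removal loop itself can be tested without a database:

```shell
#!/bin/sh
# Sketch: hourly purge of rows older than 24 hours, plus their files.
# my_table(TIMESTAMP, LINK) and the database name "mydb" are assumptions.

# Remove every file whose absolute path arrives on stdin, one per line.
purge_files() {
  while IFS= read -r f; do
    rm -f -- "$f"
  done
}

# In production the paths would come from the database, e.g.:
#   mysql -N -B mydb -e "SELECT LINK FROM my_table
#       WHERE TIMESTAMP < NOW() - INTERVAL 24 HOUR" | purge_files
#   mysql mydb -e "DELETE FROM my_table
#       WHERE TIMESTAMP < NOW() - INTERVAL 24 HOUR"
```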

Write your SQL query in a PHP file and then set up a cron job to run that file.

cPanel has this feature built in, and it is simple to use:

http://docs.cpanel.net/twiki/bin/view/AllDocumentation/CpanelDocs/CronJobs

The command to run will look something like this:

0   0   *   *   *   /usr/bin/lynx http://www.yourdomain.com/thefilexxx.php >/dev/null 2>&1
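
Fetching the page through lynx ties the job to the web server being up; invoking the PHP CLI directly is a common alternative. A hedged crontab sketch, where the interpreter path and the script location are assumptions:

```shell
# Run the purge script every hour, on the hour, via the PHP CLI.
# /usr/bin/php and the script location are assumptions for illustration.
0 * * * * /usr/bin/php /var/www/thefilexxx.php >/dev/null 2>&1
```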

If you don't have cPanel, then take a look at this:

http://www.wikihow.com/Set-up-a-Crontab-File-on-Linux

Another post related to cron jobs: Running cron job on linux every 6 hours

I saw your reply to PRPGFerret only now. Do your SQL entries have timestamps? Run the cron job more often and let the PHP check the timestamp; if the entry's 24 hours have passed, delete it.

If a cron job still does not serve your purpose, take a look at this:

http://php.net/manual/en/function.set-time-limit.php

Llewellyn
  • This could be a good solution, but what if there are, for example, 100,000 entries? Will there be 100,000 cron jobs? That is a lot, isn't it? – torr Jul 06 '13 at 00:53
  • You work with the entries in the PHP file; the cron job just runs the file. So the cron job gives automation, like a timer, while the PHP interacts with the SQL data. Handling 100,000 entries could require some advanced PHP scripting, yet I can't see what other way will delete your files, since an SQL query like Stephen R's will not do that. – Llewellyn Jul 06 '13 at 04:33

You could use an at job.

What you could do is set up the at job when you initially load the entries, so that it fires 24 hours later.

xxx:~ $ echo 'ps -ef |grep atd' > x

xxx:~ $ at -f x now + 1 day
job 4 at Sun Jul  7 09:08:00 2013

xxx:~ $ at -l 4
4   Sun Jul  7 09:08:00 2013

xxx:~ $ at -c 4
#!/bin/sh
# atrun uid=501 gid=20
# mail xxxx 0
umask 22
GRAILS_HOME=/usr/share/grails; export GRAILS_HOME
<more environment>
_=/usr/bin/at; export _
cd /Users/xxx || {
 echo 'Execution directory inaccessible' >&2
 exit 1
}
OLDPWD=/Users/xxx;        export OLDPWD
ps -ef |grep atd
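
Tying this back to the question: at the moment an entry is inserted, the application could queue a one-shot at job that removes both the row and its file 24 hours later. A hedged sketch, where the table, database, UID value and file path are illustrative assumptions; the command is built by a small helper so it can be inspected before being handed to at:

```shell
#!/bin/sh
# Build the one-shot cleanup command for a newly created entry.
# $1 = entry UID, $2 = absolute path of the associated file.
# The my_table/mydb names are assumptions.
build_cleanup_cmd() {
  printf 'mysql mydb -e "DELETE FROM my_table WHERE UID = %s;" && rm -f -- %s\n' \
    "$1" "$2"
}

# At insertion time, queue the cleanup to fire 24 hours later:
#   build_cleanup_cmd 101 /var/data/entry-101.dat | at now + 24 hours
```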
KeepCalmAndCarryOn

I have thought about this, and the entries are stored in chronological order:

  • If entry 25 must be removed at 00:01 am, entry 26 will be removed at or after 00:01 am.

  • I do not want to create one cron job per entry, because I know that there will be a lot of entries.

  • My idea supposes that the script checks at least once, to get the timestamp and the ID of the entry.

I know this is probably not the most optimized way, but I want to share this approach.

The idea:

Get the ID and TIMESTAMP of entry e.
Compute the delay (TIMESTAMP + 24 h) - NOW, and schedule a job after that delay to erase the file and remove the entry,
then start this script again for entry e+1 (identified by ID + 1).

So, while each entry eventually gets its own job, only one job is scheduled at any given time.
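
The waiting time in this scheme should be measured from the current moment, i.e. (TIMESTAMP + 24 h) - NOW. A minimal shell sketch of that computation; the epoch values are passed in as parameters so it can be checked without a database:

```shell
#!/bin/sh
# seconds_until_expiry CREATED_EPOCH NOW_EPOCH
# Prints how long to wait before deleting an entry created at
# CREATED_EPOCH, given a 24-hour lifetime; 0 if already expired.
seconds_until_expiry() {
  created=$1
  now=$2
  delay=$(( created + 86400 - now ))    # 86400 s = 24 h
  [ "$delay" -lt 0 ] && delay=0
  echo "$delay"
}

# Example: entry created at epoch 1000, current time epoch 40000
seconds_until_expiry 1000 40000   # prints 47400
```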

torr