
I've been wrestling with this issue for a while now, and I have tried many of the solutions from questions such as:

  1. Why can't DBD::SQLite insert into a database through my Perl CGI script?

  2. Why do I get sqlite error, “unable to open database file”?

Brief

A few weeks ago, I migrated from my old server running Laravel 4.0 to a new server running the latest version, Laravel 5.0.

On the old server, I have a Perl scraper called getListOfClasses.pl, which I run via crontab every 30 minutes.

On my OLD server, I ran it with the following crontab entry:

0,30 * * * * /var/www/loop/storage/scripts/getListOfClasses.pl  >> /var/www/loop/storage/logs/laravel-scraper.log 2>&1

This executes the scraper in /var/www/loop/storage/scripts/getListOfClassesFromSubjects.pl and writes to my database at /var/www/loop/storage/database.sqlite

After my move, since Laravel 5.0 changed the default database location from storage to database, I edited my script to reflect that change, updating the database path from:

my $dbFile = '../storage/database.sqlite';

to the new file path location

my $dbFile = '../../database/database.sqlite';

The Issue

If I run my scraper manually at:

/var/www/schedulizer/storage/scripts/getListOfClasses.pl

I am able to scrape just fine. However, if I rely on the crontab to execute the script, I receive the following errors:

DBI connect('dbname=../../database/database.sqlite','',...) failed: unable to open database file at /var/www/schedulizer/storage/scripts/getListOfClasses.pl line 22.

Line 22 is `my $dbh = DBI->connect($dsn, $user, $password, {`. I don't believe this line of code itself is at fault; I suspect my server has issues opening or writing to that database file.
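One thing I can try, following general cron debugging advice, is to have the cron job log which user and working directory it actually gets before invoking the script. A minimal sketch (the log path /tmp/cron-debug.log is made up for illustration):

```shell
# Hypothetical diagnostic: record the effective user and current working
# directory that the cron job actually runs with (made-up log path).
echo "user=$(id -un) cwd=$(pwd)" >> /tmp/cron-debug.log
cat /tmp/cron-debug.log
```

Prepending this to the crontab command (joined with `&&`) would show whether cron runs the job as a different user or from a different directory than a manual run.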

My SQLite database file has the following permissions:

-rwxrwxrwx 1 www-data root 8845312 Nov  3 00:05 database.sqlite

The folder in which the database exists has the following permissions:

drwxr-xr-x  5 www-data root       4096 Nov  3 00:05 database

These permission levels are all the same as on my old server, for both the database file and the folder. I've also tried chown and chmod 777 on the database file so that it has every permission possible. Still no luck.

Anyone have clues as to why?

theGreenCabbage
  • What user does cron run as? Different from the user that runs the script manually? – JRD Nov 03 '15 at 06:05
  • @JRD How do I check this? – theGreenCabbage Nov 03 '15 at 06:05
  • Well, do you edit the crontab file with `crontab -e` or some other way? Also, can you post the crontab entry on the new server? – JRD Nov 03 '15 at 06:11
  • You could add a logging statement like `print STDERR "UID=$< EUID=$> ", qx/pwd/;` and check what you get back from the cron job (you usually get STDERR by mail). The `qx/pwd/` also checks the current working directory in case that should have changed. – mbethke Nov 03 '15 at 06:12
  • A couple of quick tips: make sure that the folder has execute permissions for the user and that the file also has execute permissions for the user. You haven't shown the full error message, which could be helpful (i.e. is the Perl module not found, or is there a permissions issue with the files or directories?). Does the Perl executable environment have access to the modules as that user, or has the module been installed in a different ENV? – Peter Scott Nov 03 '15 at 07:17
  • 2
    Use absolute pathes instead of relative to make sure you access the correct and existing directory. Include a `cd /some/where ;` before your command in the crontab. Is it roots crontab or www-data's? In the latter case, `su` to `www-data` and try to execute the file from where it is run. – syck Nov 03 '15 at 10:10
  • Maybe the first "../" in $dbFile is superfluous (in the old value there was only one "../")? Try testing the script with the full path to the database file in the $dbFile variable. – drvtiny Nov 03 '15 at 12:52
  • 2
    There are three key differences when running from cron - user, environment, path. Check all of these are as you expect. – Sobrique Nov 03 '15 at 14:17
  • @mbethke do I put that in my Perl script? Is it possible to direct the error message to my scraper-specific log file? – theGreenCabbage Nov 03 '15 at 16:12
  • @PeterScott As far as I know, that's all the error messages I've received (the failed to connect one) – theGreenCabbage Nov 03 '15 at 16:12
  • @syck Oh wow.. Using absolute paths worked..! I am so surprised!! I thought this wouldn't matter at all considering executing it manually yields no issue!! Thanks! Could you post the answer so I could mark it as right? – theGreenCabbage Nov 03 '15 at 16:17

1 Answer


When testing manually, you probably ran the script from the correct working directory. Starting it from any other location would most likely have caused the same failure.
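The effect can be sketched with a throwaway directory tree that mirrors the question's storage/scripts and database layout (hypothetical /tmp/cwddemo paths, not the real server):

```shell
# Throwaway tree mirroring the question's layout (hypothetical paths).
mkdir -p /tmp/cwddemo/storage/scripts /tmp/cwddemo/database
touch /tmp/cwddemo/database/database.sqlite

# Manual run: started from the scripts directory, so the relative path works.
cd /tmp/cwddemo/storage/scripts
ls ../../database/database.sqlite

# Cron-like run: started from somewhere else, so the same relative path
# points at a different location and the open fails.
cd /tmp/cwddemo/storage
ls ../../database/database.sqlite 2>/dev/null || echo "unable to open database file"
```

Either of the following avoids that dependency on the working directory: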

  • Use absolute paths instead of relative ones to make sure you access the correct, existing directory.
  • Include a cd /some/where ; before your command in the crontab. Cron starts jobs in your home directory, no matter where the called program sits.

The second proposal is, in my opinion, the more portable one, because it does not require script changes when the location or machine changes; you simply adapt the (machine-specific) crontab entry.
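Following the second proposal, the crontab entry might look something like this (directory names taken from the question, which mixes loop and schedulizer, and a hypothetical log path, so adjust both to the real layout):

```
0,30 * * * * cd /var/www/schedulizer/storage/scripts && ./getListOfClasses.pl >> /var/www/schedulizer/storage/logs/laravel-scraper.log 2>&1
```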

syck