
I'm running a cron job that executes mysqldump via a PHP script, and the dump requires the RELOAD privilege. Using the MySQL admin account doesn't feel right, but neither does creating a new user with admin privileges.

My main concern is the security aspect: I'm loading the db attributes (username, password, etc.) into a protected array property of the class I'm using.

I'm wondering which approach makes more sense or if there's another way to achieve the same results.


Overview:
LAMP Server: CentOS 5.8, Apache 2.2.3, MySQL 5.0.95, PHP 5.3.3

Cron job outline:

  1. Dump raw stats data from two InnoDB tables in the website db; the tables have a foreign key relationship.
  2. Load the data into tables in the stats db
  3. Get the last value of the auto-incrementing primary key that was transferred
  4. Use that primary key value in a query that deletes the copied data from the website db (a rough SQL sketch follows this list)
  5. Process the transferred data in the stats db to populate the reports tables
  6. When processing completes, delete the raw stats data from the stats db
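
Roughly what steps 3 and 4 look like in SQL (a sketch only: stats_db and the event_id column are placeholders for the real schema, and the stats db copy is assumed to keep the same table names):

-- Step 3: highest primary key value that made it into the stats copy.
SELECT MAX(event_id) INTO @last_id FROM stats_db.stats_event;

-- Step 4: delete only the transferred rows from the website db,
-- child table first because of the foreign key relationship.
DELETE FROM website_db.stats_event_attributes WHERE event_id <= @last_id;
DELETE FROM website_db.stats_event WHERE event_id <= @last_id;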

The website database is configured as a master with binary logging, and the slave will be set up once the stats data is no longer stored and processed in the website database (replicating the website database was the impetus for moving the stats to their own database).

All files accessed during the cron job are located outside the DocumentRoot directory.


The nitty-gritty:
The mysqldump performed in the first step requires the RELOAD privilege; here's the command:

<?php
// mysqldump options: dump the data only (no CREATE TABLE statements), include
// routines, triggers, and the master's binary log coordinates, and take the
// dump inside a single consistent transaction.
$SQL1 = "--no-create-info --routines --triggers --master-data ";
$SQL1 .= "--single-transaction --quick --add-locks --default-character-set=utf8 ";
$SQL1 .= "--compress --tables stats_event stats_event_attributes";

$OUTPUT_FILENAME = "/var/stats/daily/daily-stats-18.tar.gz";

// Pipe the dump through gzip and write it outside the DocumentRoot.
$cmd1 = "/usr/bin/mysqldump -u website_user -pXXXXXX website_db $SQL1 | gzip -9 > $OUTPUT_FILENAME";

exec( $cmd1 );
?>

The error message:

mysqldump: Couldn't execute 'FLUSH TABLES': Access denied; you need the RELOAD privilege for this operation (1227)

It works fine if I use the MySQL admin credentials.
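
For reference, what the cron job's user is actually allowed to do can be checked with SHOW GRANTS (the @'localhost' host part below is an assumption about how the account was created):

SHOW GRANTS FOR 'website_user'@'localhost';
-- RELOAD is a global privilege, so it would only show up in a GRANT ... ON *.* line.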

codewaggle

2 Answers


I'm wondering which approach makes more sense or if there's another way to achieve the same results.

The bottom line is that you need a user with certain privileges to run that mysqldump command. While it may seem silly to create a new user just for this one cron job, it's the most straightforward and simple approach you can take, and limiting that user to just the privileges the dump needs is at least better than putting the admin credentials in a script.
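
Something along these lines would cover that particular mysqldump invocation. This is only a sketch: the account name and password are placeholders, and the exact privilege list depends on your dump options (SELECT for the table data, LOCK TABLES as a safety net, and the global RELOAD and REPLICATION CLIENT that --master-data relies on).

CREATE USER 'stats_dump'@'localhost' IDENTIFIED BY 'pick-a-strong-password';

-- Table-level access needed to read and dump the website db.
GRANT SELECT, LOCK TABLES ON website_db.* TO 'stats_dump'@'localhost';

-- Global privileges used by --master-data: RELOAD for FLUSH TABLES,
-- REPLICATION CLIENT for SHOW MASTER STATUS.
GRANT RELOAD, REPLICATION CLIENT ON *.* TO 'stats_dump'@'localhost';

-- --routines reads the mysql.proc table on 5.0, so it may also need:
GRANT SELECT ON mysql.proc TO 'stats_dump'@'localhost';

Binding the account to localhost means the credentials in the script are only usable from the box that runs the cron job.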

Given that this is a stopgap measure until you can get replication up and running, there's no real harm being done here. Doing this by replication is totally the way to go, and the stopgap measure seems sane.

Also, when it comes time to get replication going, xtrabackup is your friend. It includes binary log naming and position information with the snapshot it takes, which makes setting up new slaves a breeze.
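
For instance, once the restored snapshot is in place on the new slave, the binary log file and position that xtrabackup recorded can be dropped straight into the usual setup statements (every value below is a placeholder):

CHANGE MASTER TO
    MASTER_HOST='master.example.com',
    MASTER_USER='repl_user',
    MASTER_PASSWORD='repl_password',
    MASTER_LOG_FILE='mysql-bin.000123',
    MASTER_LOG_POS=456;

START SLAVE;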

Charles
  • I wasn't clear about the replication: it's for the website database. We wanted to move the stats off first because they take up more space than everything else combined and the processing sometimes bogs things down. I'm just going to save dumps of the stats, since a little downtime of the reports isn't critical, so it's not temporary. I guess my concern is about the security more than anything else; I'll stress that in the question. Any thoughts on security in particular? – codewaggle Dec 19 '12 at 04:53
  • I'm blindly assuming that this isn't on shared hosting from the context. Make sure that the user can only connect from the desired host(s) and pick a secure password? There's probably no real threat here. – Charles Dec 19 '12 at 04:55
  • Just what I wanted to hear. We own the server, and I figured I'd set the user to @localhost; I'm thinking about moving the stats to a different server, but localhost will work for now. – codewaggle Dec 19 '12 at 05:03
  • I've been looking at the Percona site; lots of good articles for a non-DBA like myself. I saw mention of xtrabackup but just took a closer look. I'll plan to use it, thanks for the tip. – codewaggle Dec 19 '12 at 05:09

I just ran across this same error (probably on the same site you were working on :)), even when running as the MySQL root user. I managed to get around it by not specifying --skip-add-locks; e.g., this worked:

/usr/bin/mysqldump -u USERNAME -pPW DATABASE_NAME --skip-lock-tables --single-transaction --flush-logs --hex-blob
kmoser