Backup MySQL database on webserver

Drakkon

Diamond Member
Aug 14, 2001
8,401
1
0
I have a domain with a rather large, growing MySQL database that would be nice to back up daily in some fashion. The only way I've found to get a good backup has been to use phpMyAdmin and its "Export" tab to dump to SQL.
That being said, is there any way to turn this into a cron job that will run daily? Or is there another PHP script I can use that has this same functionality on a daily basis?
 
Aug 25, 2004
11,151
1
81
A rudimentary solution would be a Perl script like this one, which was written with cPanel in mind. You can change it to meet your needs, and then set it up as a cron job.

#!/usr/bin/perl

# This is a simple script to place in your cron jobs that will email you
# backups of your databases. By doing this, you will have automated backups
# of your databases sent to you on a routine basis. Then, you will have peace
# of mind knowing that if your databases vanish, you have recent backups.

# Installation and Usage:
# you will have to rename the file sql_backup.pl and make sure it has
# executable permission set (mode 755 will work)
# To install:
# download and save this script
# open the file in a text editor (like Notepad)
# edit the variables $user,$password,@dbs,$email and $tmp_dir
# (make sure you leave off your username from the database names. for a
# complete list, check out your backup section on cPanel)
# save the document and then rename it to sql_backup.pl
# upload the file to your home directory
# make sure that it has been renamed to sql_backup.pl
# make sure that it is executable (mode 755)
# make sure that you have a tmp/ directory in your home directory
# NOTE: if you want to have a backup compatible with cPanel's restore
# mechanism, leave the $dump_opts alone. Only change this if you know what
# you are doing.
# Usage:
# Go to your cPanel and click on "Cron jobs"
# Click on standard
# Enter a valid email address in the top entry. This is where any output
# from the script will be sent in case the script fails.
# In the entry box "Command to run" enter: /home/yourusername/sql_backup.pl
# The default configuration is to have the script run every day at 3am
# If you want to change this, simply alter the config you see on your
# screen
# For instance, if you want it to run every Sunday, just click on Sunday in
# the weekday selection box.
#

# Test the script out by setting it to run once every minute. After you
# are confident that it will work correctly, set it to once a day/once a
# week.
#

# Click "Save Crontab"
#

# Depending on when you set it up to run, you will receive an email with all
# of your databases you listed at that time.
# To restore your databases:

#

# Enjoy!

# October 2003 - David Sierkowski (scripts@phonism.net)

# (from the BSD license. in short, I'm not accountable for anything this script does to you.)
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
# DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
# FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
# DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
# SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
# CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
# OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

use MIME::Lite;

# make sure this is your cPanel username
$user = "yourusername";

# enter your cPanel password here
$password = "yourpassword";

# enter the list of databases you want to have sent to you
# leave off your username. use the database names listed on your backup page
# in cPanel
@dbs = qw(database1 phpwebsite misc etc);

# the email address you want these backups sent to
# make sure you place a '\' before the @ in your email address
# otherwise, the script will fail
$email = "you\@mailhost.net";

# this should be /home/yourusername/tmp or /home2/yourusername/tmp
# go with /home/ and if the script fails, switch it to /home2/
$tmp_dir = "/home/yourusername/tmp";

# mysqldump options to make a cPanel restore compatible dump
$dump_opts = "-c --add-drop-table";

##############################
# DO NOT EDIT BELOW THIS LINE!!!!!!!!!!
##############################

$msg_file = $tmp_dir . "/msg.txt";
chomp($date = `date +'%m-%d-%Y'`);

open(MSG, ">$msg_file");
print MSG "Here are the MySQL dumps, generated by SQL_Backup:\n";
print MSG "\nThe following databases are enclosed:\n\n";

foreach $db (@dbs) {
    # cPanel prefixes database names with your username, hence ${user}_$db
    system("mysqldump $dump_opts --user=$user --password=$password ${user}_$db > $tmp_dir/$db.sql");
    system("gzip -f $tmp_dir/$db.sql");
    print MSG "\t$db\n";
    push(@atts, "$tmp_dir/$db.sql.gz");
}

print MSG "\n";
close(MSG);

# usage of MIME::Lite taken from:
# http://www3.primushost.com/~kylet/mail-att.txt

# use MIME::Lite to send a multipart message
$subject = "MySQL Backup Dumps For $date";
$msg = new MIME::Lite
    From    => "$ENV{LOGNAME}",
    To      => "$email",
    Subject => "$subject",
    Type    => 'multipart/mixed';

attach $msg
    Type => 'text/plain',
    Path => "$msg_file";

foreach $db (@dbs) {
    attach $msg
        Type     => "application/x-gzip",
        Encoding => 'base64',
        Path     => "$tmp_dir/$db.sql.gz",
        Filename => "$db.sql.gz";
}

open(SENDMAIL,"| /usr/sbin/sendmail -t -oi");
$msg->print(*SENDMAIL);
close(SENDMAIL);
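
If you'd rather skip cPanel's cron interface, the cron entry itself is a single line. A sketch, assuming the script was saved as /home/yourusername/sql_backup.pl (adjust the path to your own account):

```shell
# Run the backup script every day at 3:00 AM (add via `crontab -e`)
# Fields: minute hour day-of-month month day-of-week command
0 3 * * * /home/yourusername/sql_backup.pl
```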
 

manlymatt83

Lifer
Oct 14, 2005
10,051
44
91
mysqldump can be hefty for a larger MySQL database. I usually use it for databases less than 1-2 GB in size. For larger databases, I would definitely do a filesystem backup if possible. If this is a shared MySQL database (as in, you don't have permission to view the filesystem, let alone read it), then your only option is mysqldump.

You can set up replication to a slave and then back up from the slave --- this slave can run anywhere --- even your local home machine. If you want help setting up replication, PM me.
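
Once the slave is caught up, the dump can be taken from it so the master never sees the backup load. A rough sketch only — the hostname, user, and database name here are placeholders, not anything from the OP's setup:

```shell
# Dump from the replica instead of the master. For InnoDB tables,
# --single-transaction gives a consistent snapshot without locking.
mysqldump --host=replica.example.com --user=backupuser -p \
    --single-transaction mydatabase > /backups/mydatabase.sql
```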
 

ScottMac

Moderator, Networking, Elite Member
Mar 19, 2001
5,471
2
0
Is this hosted, or do you have access to the machine and a CLI?

Probably the easiest way to get and save your data from MySQL is the "mysqldump" command. It extracts the structure as well as the data, so you can completely restore the database from scratch.

It's also CLI, so a cron job is trivial.

To back up: mysqldump -A -p >/path/filename.ext

It should prompt you for an admin password; if no path is specified, the current directory is assumed. The "-A" says to dump ALL databases and all tables within each database. You can also specify individual databases.
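
For example, dumping a single named database, or compressing the full dump as it's written (the database and path names below are made up for illustration):

```shell
# One database only (--databases includes the CREATE DATABASE statement):
mysqldump -p --databases mydatabase > /path/mydatabase.sql

# Everything, gzipped, with the date in the filename:
mysqldump -A -p | gzip > /path/all-dbs-$(date +%F).sql.gz
```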


To restore the database from the file:

mysql -p </path/filename.ext

This command will overwrite all of the information in the specified database & tables. If the DB/tables exist, it will drop them first, then re-create them. If the DB/tables don't exist, they will be created, then the data will be restored.

There are a number of other command flags & options, check 'em out.

Good Luck

Scott
 

mundane

Diamond Member
Jun 7, 2002
5,603
8
81
We also use a cron job - mysqldump the databases, and move those (and other backup) files to a pre-designated location. Then rsnapshot pushes them onto an external drive.
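
That workflow might look roughly like this as a cron-driven script — the paths, database names, and credentials below are placeholders, and it assumes rsnapshot is configured separately to copy the staging directory:

```shell
#!/bin/sh
# Nightly dump into a staging directory that rsnapshot later picks up.
BACKUP_DIR=/var/backups/mysql
mkdir -p "$BACKUP_DIR"

for db in database1 database2; do
    mysqldump --user=backupuser --password=secret "$db" \
        | gzip > "$BACKUP_DIR/$db-$(date +%F).sql.gz"
done

# Prune dumps older than a week so the staging area doesn't grow forever.
find "$BACKUP_DIR" -name '*.sql.gz' -mtime +7 -delete
```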
 

manlymatt83

Lifer
Oct 14, 2005
10,051
44
91
Originally posted by: mundane
We also use a cron job - mysqldump the databases, and move those (and other backup) files to a pre-designated location. Then rsnapshot pushes them onto an external drive.

Again, these work, but it's time-consuming, locks the tables (if MyISAM), and isn't good for high-end/high-traffic databases.
 
Aug 25, 2004
11,151
1
81
Originally posted by: mjuszczak
Originally posted by: mundane
We also use a cron job - mysqldump the databases, and move those (and other backup) files to a pre-designated location. Then rsnapshot pushes them onto an external drive.

Again, these work, but it's time-consuming, locks the tables (if MyISAM), and isn't good for high-end/high-traffic databases.

Agreed. For a real-time, high traffic production environment, it is a bad idea. OP needs to clarify what kind of setup he's running.
 

yllus

Elite Member & Lifer
Aug 20, 2000
20,577
432
126
I'm definitely not a server operations guy, but if this webserver is local to you, what about setting up a second database server and doing master-master replication?

Then not only do you achieve high availability (the second master will take over and keep your website functioning if your first master DB goes down), but you can also back up the data from the second master even if it causes table-level locking. At worst you'd have a few inconsistencies with the primary master DB.
 

troytime

Golden Member
Jan 3, 2006
1,996
1
0
we have master-master replication (offsite)

any insert/update/delete sql is sent to both servers (one in our office, the other in the datacenter)

the slaves (reads) are kept in sync with the datacenter master