I've got a fairly small MySQL database (a Textpattern install) on a server that I don't have SSH access to (I have FTP access only). I need to regularly download the live database to my local dev server, i.e., I want to run a script and/or have a cron job running. What are good methods for doing this?

Some things to note:

  • Live server is running Linux, Apache 2.2, PHP 5.2 and MySQL 4.1
  • Local server is running the same (so using PHP is an option), but the OS is Windows
  • Local server has Ruby on it (so using Ruby is a valid option)
  • The live MySQL db can accept remote connections from different IPs
  • I cannot enable replication on the remote server

Update: I've accepted BlaM's answer; it's wonderfully simple. Can't believe I didn't think of that. There was one problem, though: I wanted to automate the process, but the suggested solution prompts the user for a password. Here is a slightly modified version of the mysqldump command that passes in the password:

mysqldump -u USER --password=MYPASSWORD DATABASE_TO_DUMP -h HOST > backup.sql
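One caveat with passing `--password` on the command line is that the password can show up in the process list. mysqldump can instead read credentials from an option file via `--defaults-extra-file` (which must be the first option); a sketch, with a hypothetical filename:

```shell
# backup.cnf (hypothetical filename) -- restrict permissions, e.g. chmod 600
# Contents of the file:
#   [client]
#   user=USER
#   password=MYPASSWORD

# mysqldump reads the credentials from the option file, so the password
# never appears on the command line or in the process list:
mysqldump --defaults-extra-file=backup.cnf -h HOST DATABASE_TO_DUMP > backup.sql
```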

Since you can access your database remotely, you can use mysqldump from your Windows machine to fetch the remote database. From the command line:

cd "into mysql directory"
mysqldump -u USERNAME -p -h YOUR_HOST_IP DATABASE_TO_MIRROR >c:\backup\database.sql

This program will prompt you for the database password and then create a file c:\backup\database.sql that you can run on your Windows machine to insert the data.

With a small database that should be fairly fast.

Here's what I use. This dumps the database from the live server while loading it into the local server:

mysqldump -hlive_server_address -ulive_server_user -plive_server_password --opt --compress live_server_db | mysql -ulocal_server_user -plocal_server_password local_server_db

You can run this from a .bat file. You could also use a scheduled task.
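On the Windows side, the one-liner can go in a .bat file and be registered with the Task Scheduler. A sketch, with hypothetical paths and credentials (the exact schtasks syntax varies by Windows version):

```bat
@echo off
rem mirror_db.bat -- hypothetical names; adjust paths, hosts and credentials
cd /d "C:\Program Files\MySQL\MySQL Server 4.1\bin"
mysqldump -h LIVE_HOST -u LIVE_USER -pLIVE_PASSWORD --opt --compress LIVE_DB | mysql -u LOCAL_USER -pLOCAL_PASSWORD LOCAL_DB

rem To run it nightly, register it once from a command prompt, e.g.:
rem   schtasks /create /tn "MirrorDB" /tr "C:\scripts\mirror_db.bat" /sc daily /st 02:00
```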

Is MySQL replication an option? You could even turn it on and off if you didn't need it constantly replicating.

Here is a good article on replication.

I'd create a (Ruby) script to do a SELECT * FROM ... on all the databases on the server and then do a DROP DATABASE ... followed by a series of new INSERTs on the local copy. You can do a SHOW DATABASES query to list the databases dynamically. Now, this assumes that the table structure doesn't change, but if you want to support table changes as well, you could add a SHOW CREATE TABLE ... query and a corresponding CREATE TABLE statement for each table in each database. To get a list of all the tables in a database you do a SHOW TABLES query.
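As a sketch of the INSERT-generation step such a script would need (the helper below is hypothetical, assumes each row arrives as a Hash of column name to value, and only handles single-quote escaping, not binary data):

```ruby
# Build an INSERT statement for one row fetched from the live server.
def insert_sql(table, row)
  cols = row.keys.join(", ")
  vals = row.values.map do |v|
    v.nil? ? "NULL" : "'#{v.to_s.gsub("'", "''")}'"  # double up single quotes
  end.join(", ")
  "INSERT INTO #{table} (#{cols}) VALUES (#{vals});"
end

# Example:
puts insert_sql("textpattern", "ID" => 1, "Title" => "O'Brien's post")
# => INSERT INTO textpattern (ID, Title) VALUES ('1', 'O''Brien''s post');
```

The connection/fetch side would sit in a loop over SHOW DATABASES and SHOW TABLES, writing each generated statement to a .sql file to run locally.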

Once you have the script, you can set it up as a scheduled job to run as often as you need.

@Mark Biek

Is MySQL replication an option? You could even turn it on and off if you didn't need it constantly replicating.

Good suggestion, but I cannot enable replication on the server. It's a shared server with very little room for maneuver. I've updated the question to note this.

Depending on how often you need to copy down live data and how quickly you need to do it, installing phpMyAdmin on both machines might be an option. You can export and import DBs, but you'd have to do it manually. If it's a small DB (and it sounds like it is), and you don't need live data copied over too often, it could work well for what you need.