I've got a mysqldump file of multiple databases (5). One of the databases takes a very long time to load. Is there a way to either split the mysqldump file by database, or tell mysql to load only one of the specified databases?


This Perl script should have the desired effect.

#!/usr/bin/perl -w
# splitmysqldump - split mysqldump file into per-database dump files.

use strict;
use warnings;

my $dbfile;
my $dbname = q{};
my $header = q{};

while (<>) {

    # Beginning of a new database section:
    # close currently open file and start a new one
    if (m/-- Current Database\: \`([-\w]+)\`/) {
        if (defined $dbfile && tell $dbfile != -1) {
            close $dbfile or die "Could not close file!";
        }
        $dbname = $1;
        open $dbfile, ">>", "$1_dump.sql" or die "Could not create file!";
        print $dbfile $header;
        print "Writing file $1_dump.sql ...\n";
    }

    if (defined $dbfile && tell $dbfile != -1) {
        print $dbfile $_;
    }

    # Catch dump file header in the beginning
    # to be printed to each separate dump file.
    if (! $dbname) { $header .= $_; }
}

close $dbfile or die "Could not close file!";

Run it on the dump file that contains all the databases:

./splitmysqldump < all_databases.sql
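If you don't want to split the file at all, the mysql client's `--one-database` option covers the second part of the question: it replays the whole dump but only executes statements that belong to the named default database. A sketch (the database name `db1` is hypothetical, and this needs a running MySQL server):

```shell
# Replay the combined dump, applying only the statements for db1.
# (Requires a live MySQL server; db1 is a made-up database name.)
mysql -u root -p --one-database db1 < all_databases.sql
```

Note that `--one-database` still reads the entire file, so it saves disk juggling rather than parse time.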

A "mysqldump file" is simply a text file full of SQL statements. As a result, you can use any number of text-processing tools to split it up however you see fit.
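For example, since each database's section in the dump begins with a `-- Current Database:` comment, a short awk one-liner can pull out a single database (the file and database names below are made up for illustration):

```shell
# Build a tiny sample dump (hypothetical database names) to demonstrate.
cat > all_databases.sql <<'EOF'
-- MySQL dump header
-- Current Database: `db1`
CREATE DATABASE db1;
USE db1;
-- Current Database: `db2`
CREATE DATABASE db2;
EOF

# Print lines only while inside the db1 section: the flag is re-evaluated
# at every "-- Current Database:" marker, depending on the name it carries.
awk -v db='db1' '
    /^-- Current Database: / { in_db = ($0 ~ "`" db "`") }
    in_db
' all_databases.sql > db1_dump.sql
```

The resulting `db1_dump.sql` contains only the `db1` section of the original dump.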

You may be better off using a more selective dump to begin with (one database per file, etc.). If you no longer have access to the original database, you could perform a full restore, then use mysqldump again to create dumps of the individual databases.
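Once the data is back in a server, a shell loop along these lines produces one dump per database. This is a sketch only: it assumes credentials are supplied elsewhere (e.g. `~/.my.cnf`) and skips the built-in system schemas:

```shell
# Dump each non-system database to its own file.
# (Requires a running MySQL server; credentials assumed in ~/.my.cnf.)
for db in $(mysql -N -B -e 'SHOW DATABASES' |
            grep -Ev '^(information_schema|performance_schema|mysql|sys)$'); do
    mysqldump --databases "$db" > "${db}_dump.sql"
done
```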

If you just want a quick-and-dirty solution, a quick web search turns up a couple of tools that may also be helpful.

Check out this blog post I usually refer back to for doing this kind of thing with a mysqldump.


It's easy to extend it to extract individual databases.