I run php7/mysql on a localhost for personal php apps.
I have a function I've used for years. It dumps a database to a .sql file in a directory; it is essentially a manual DB backup tool, used in conjunction with jQuery ajax and a button.
I did not write this function. I found it many, many years ago and it's been working well for probably a decade or more.
I've recently upgraded my OS and am having some issues I can't seem to solve. The primary difference may be PHP 7; I think I was running PHP 5 before. I migrated this function from mysql to mysqli a few years ago, and it was still functional after that change.
The function:
```php
function backup($host, $user, $pass, $dbname, $tables = '*') {
    $link = new mysqli($host, $user, $pass, $dbname);

    //get all tables
    if ($tables == '*') {
        $tables = array();
        $result = mysqli_query($link, 'SHOW TABLES');
        while ($row = mysqli_fetch_row($result)) {
            $tables[] = $row[0];
        }
    } else {
        $tables = is_array($tables) ? $tables : explode(',', $tables);
    }

    //cycle
    $return = '';
    foreach ($tables as $table) {
        $result = mysqli_query($link, 'SELECT * FROM ' . $table);
        $num_fields = mysqli_num_fields($result);
        $return .= 'DROP TABLE IF EXISTS ' . $table . ';';
        $row2 = mysqli_fetch_row(mysqli_query($link, 'SHOW CREATE TABLE ' . $table));
        $return .= "\n\n" . $row2[1] . ";\n\n";
        for ($i = 0; $i < $num_fields; $i++) {
            while ($row = mysqli_fetch_row($result)) {
                $return .= 'INSERT INTO ' . $table . ' VALUES(';
                for ($j = 0; $j < $num_fields; $j++) {
                    $row[$j] = addslashes($row[$j]);
                    //$row[$j] = ereg_replace("\n","\\n",$row[$j]);
                    $row[$j] = preg_replace("#\n#", "\\n", $row[$j]);
                    if (isset($row[$j])) { $return .= '"' . $row[$j] . '"'; } else { $return .= '""'; }
                    if ($j < ($num_fields - 1)) { $return .= ','; }
                }
                $return .= ");\n";
            } //while
        } //for
        $return .= "\n\n\n";
        //echo $return; //This shows $return is populated when uncommented.
    } //foreach

    //save file
    $handle = fopen('../../z_db/' . $dbname . '_db_' . date('Y-m-d-ha') . '.sql', 'w+');
    fwrite($handle, $return);
    fclose($handle);
    mysqli_close($link);
} //func close
```
I initially realized it was the `ereg_replace` line causing a failure. I replaced that with `preg_replace`. Note the commented `echo` statement in the `foreach` loop.
- If that `echo` statement is uncommented and functional, it shows the `$return` variable is populated with all the db contents.
- If that `echo` statement is commented out, as above, I get an HTTP 500 error when calling the page directly in a browser. The page should function completely when called directly.
- Ajax errors either way.
I think it may be due to the `$return` variable not being populated outside the `foreach` loop, and therefore the `fwrite` call failing. I'm honestly just guessing at this point, though.
No matter what I try, the ajax never returns a success. I've verified paths, checked that the target directory is writable, etc., and have spent considerable time searching and trying to figure out where the issue may lie.
The ajax hasn't changed since the working state and merely returns an error in all cases, never a success. It does appear to be firing, though, based on other jQuery items tied to the button. I can include the ajax if necessary.
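For anyone chasing a similar bare HTTP 500, one way to surface the underlying fatal error is to enable error display and strict mysqli reporting at the top of the script. This is a sketch under the assumption that the server suppresses errors by default; remove it once the problem is found:

```php
<?php
// Temporarily show fatal errors in the response instead of a bare HTTP 500.
// Remove these lines once the problem is found.
ini_set('display_errors', '1');
ini_set('display_startup_errors', '1');
error_reporting(E_ALL);

// Make mysqli raise errors instead of failing silently.
mysqli_report(MYSQLI_REPORT_ERROR | MYSQLI_REPORT_STRICT);
```

With `display_errors` on, the exact error behind the 500 appears in the browser, or in the ajax response body, which is usually visible in the browser's network tab.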
I'm hoping someone with more experience and prowess can take a look to see what I'm missing or where an error may be.
CodePudding user response:
The thing you should think about is writing to the file in smaller chunks and then releasing memory / emptying variables.
Or you may go with a `mysqldump` command, as shown here.
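For example, a minimal sketch of invoking `mysqldump` from PHP, reusing the same timestamped naming scheme as the function above. The credentials and paths here are assumptions; substitute your own:

```php
<?php
// Hypothetical credentials -- substitute your own.
$host   = 'localhost';
$user   = 'root';
$pass   = 'secret';
$dbname = 'mydb';

// Same timestamped naming scheme as the backup() function.
$outFile = '../../z_db/' . $dbname . '_db_' . date('Y-m-d-ha') . '.sql';

// Build the command with every argument shell-escaped.
$cmd = sprintf(
    'mysqldump --host=%s --user=%s --password=%s %s > %s',
    escapeshellarg($host),
    escapeshellarg($user),
    escapeshellarg($pass),
    escapeshellarg($dbname),
    escapeshellarg($outFile)
);

exec($cmd, $output, $exitCode);
if ($exitCode !== 0) {
    // mysqldump failed; check credentials and that the directory is writable.
    error_log('mysqldump exited with code ' . $exitCode);
}
```

This offloads all the escaping and memory concerns to `mysqldump` itself, which is generally the more robust option when shell access is available.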
If you still want to use your custom code for it you can do something like this:
```php
function backup( $host, $user, $pass, $dbname, $tables = '*' ) {
    $link = new mysqli( $host, $user, $pass, $dbname );

    //get all tables
    if ( $tables == '*' ) {
        $tables = array();
        $result = mysqli_query( $link, 'SHOW TABLES' );
        while ( $row = mysqli_fetch_row( $result ) ) {
            $tables[] = $row[0];
        }
    } else {
        $tables = is_array( $tables ) ? $tables : explode( ',', $tables );
    }

    $handle = fopen( $dbname . '_db_' . date( 'Y-m-d-ha' ) . '.sql', 'w+' );

    //cycle
    array_walk( $tables, function( $table ) use ( $link, $handle ) {
        walk_table( $table, $link, $handle );
    } );

    fclose( $handle );
    mysqli_close( $link );
} //func close
```
And then, in the `walk_table` function:
```php
function walk_table( $table, $link, $handle ) {
    $return = '';
    $result = mysqli_query( $link, 'SELECT * FROM ' . $table );
    $num_fields = mysqli_num_fields( $result );
    $return .= 'DROP TABLE IF EXISTS ' . $table . ';';
    $row2 = mysqli_fetch_row( mysqli_query( $link, 'SHOW CREATE TABLE ' . $table ) );
    $return .= "\n\n" . $row2[1] . ";\n\n";
    fwrite( $handle, $return );
    unset( $return ); // empty return
    for ( $i = 0; $i < $num_fields; $i++ ) {
        while ( $row = mysqli_fetch_row( $result ) ) {
            $return = 'INSERT INTO ' . $table . ' VALUES(';
            for ( $j = 0; $j < $num_fields; $j++ ) {
                $row[ $j ] = addslashes( $row[ $j ] );
                $row[ $j ] = preg_replace( "#\n#", "\\n", $row[ $j ] );
                if ( isset( $row[ $j ] ) ) {
                    $return .= '"' . $row[ $j ] . '"';
                } else {
                    $return .= '""';
                }
                if ( $j < ( $num_fields - 1 ) ) {
                    $return .= ',';
                }
            }
            $return .= ");\n";
            fwrite( $handle, $return );
        } //while
    } //for
    fwrite( $handle, "\n\n\n" );
}
```
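A minimal call site for the refactored version, with hypothetical credentials (substitute your own):

```php
<?php
// Assumes backup() and walk_table() from above are already defined.
// Credentials here are placeholders -- substitute your own.
backup( 'localhost', 'root', 'secret', 'mydb' );                  // dump every table
backup( 'localhost', 'root', 'secret', 'mydb', 'users,orders' );  // or a comma-separated subset
```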
Also, to see how much memory and time it takes to dump a database, you can add this bit of code right before your `backup` call:
```php
$startTime = time();

function onDie() {
    echo "\n\nExecution Completed.\n";
    $startTime = $GLOBALS['startTime'];
    $bytes = memory_get_peak_usage();
    if ( $bytes >= 1073741824 ) {
        $memusage = round( ( $bytes / 1073741824 ), 2 ) . "GB";
    } elseif ( $bytes >= 1048576 ) {
        $memusage = round( ( $bytes / 1048576 ), 2 ) . "MB";
    } else {
        $memusage = round( ( $bytes / 1024 ), 2 ) . "KB";
    }
    echo 'Memory Usage: ' . $memusage . "\n";
    $runTime = ( time() - $startTime );
    echo 'Processing Time: ' .
        str_pad( floor( $runTime / 60 ), 2, "0", STR_PAD_LEFT ) . ":" .
        str_pad( ( $runTime % 60 ), 2, "0", STR_PAD_LEFT ) . "\n\n";
}

register_shutdown_function( 'onDie' );
```
You can find other similar examples on SO and all over the internet; many are far more complex than this one. The one above will produce output like:

```
Execution Completed.
Memory Usage: 38.68MB
Processing Time: 00:39
```
The changes I made to your code brought a DB dump from 600 MB of peak memory usage down to under 200 MB. It can still be optimized further, but this is a good start.