LinuxQuestions.org
Old 02-29-2012, 03:31 PM   #1
Basher52
Member
 
Registered: Mar 2004
Location: .SE
Distribution: Arch
Posts: 401

Rep: Reputation: 22
Question php memory limit


I'm an admin at a forum that uses http://www.one.com/en/ as its web host.
I want to make a "remote" backup of the MySQL database, because that is much faster than using
phpMyAdmin, and likewise with all the web files. (The latter alone takes over an hour to download.)

one.com's php memory_limit is 65M, so when I try to run the php script I get this error:
"PHP Fatal error: Out of memory (allocated 16777216) (tried to allocate 21979956 bytes) in /customers/xxxx/httpd.www/forum/admin/backup/backup.mysql.php on line 34"

I tried to use
Code:
ini_set('memory_limit', '256M');
but this didn't work.

First, is there a way to list the memory_limit from a php script?
That way I can list it, change it, and list it again to see whether my value was actually set.

Second, if the above won't work, is there a way to run the script with limited memory use so it stays within the limit?

These are the scripts:

The script I run from the web browser:
Code:
<?php
ini_set('track_errors', true);
ini_set('log_errors', true);
ini_set('error_log', dirname(__FILE__) . '/error_log.txt');
ini_set('memory_limit', '256M');

include "backup.mysql.php";
$z=new backupmysql();
echo $z->backup_tables('localhost','database','password','database_name');
?>

The script that makes the backup:
Code:
<?php
ini_set('track_errors', true);
ini_set('log_errors', true);
ini_set('error_log', dirname(__FILE__) . '/error_log.txt');
ini_set('memory_limit', '256M');

class backupmysql
{
/* backup the db OR just a table */
function backup_tables($host,$user,$pass,$name,$tables = '*')
{
  $link = mysql_connect($host,$user,$pass);
  mysql_select_db($name,$link);

  //get all of the tables
  if($tables == '*')
  {
    $tables = array();
    $result = mysql_query('SHOW TABLES');
    while($row = mysql_fetch_row($result))
    { 
      $tables[] = $row[0];
    }
  }
  else
  {
    $tables = is_array($tables) ? $tables : explode(',',$tables);
  }

  //cycle through
  $return = "";
  foreach($tables as $table)
  {
    $result = mysql_query('SELECT * FROM '.$table);
    $num_fields = mysql_num_fields($result);

    $return.= 'DROP TABLE '.$table.';';
    $row2 = mysql_fetch_row(mysql_query('SHOW CREATE TABLE '.$table));
    $return.= "\n\n".$row2[1].";\n\n";

    while($row = mysql_fetch_row($result))
    {
      $return.= 'INSERT INTO '.$table.' VALUES(';
      for($j=0; $j<$num_fields; $j++)
      {
        // Test for NULL before escaping; str_replace stands in for the
        // deprecated ereg_replace and does the same job here.
        if (isset($row[$j]))
        {
          $row[$j] = addslashes($row[$j]);
          $row[$j] = str_replace("\n","\\n",$row[$j]);
          $return.= '"'.$row[$j].'"';
        }
        else { $return.= '""'; }
        if ($j<($num_fields-1)) { $return.= ','; }
      }
      $return.= ");\n";
    }
    $return.="\n\n\n";
  }

  //save file
  $handle = fopen('./db-backup-'.time().'-'.(md5(implode(',',$tables))).'.sql','w+');
  fwrite($handle,$return);
  fclose($handle);
}
}
?>
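One way to stay under a tight memory_limit, assuming the host will not raise it, is to stream each row to the dump file as it is read instead of accumulating everything in one big string. A minimal sketch of that approach, keeping the same (deprecated) mysql_* API the script above already uses; the function name and parameters are made up for illustration:
Code:
```php
<?php
// Sketch: write each INSERT to disk immediately instead of building
// one huge string in memory; only one row is held at a time.
function backup_table_streaming($link, $table, $handle)
{
  $result = mysql_query('SELECT * FROM '.$table, $link);
  $num_fields = mysql_num_fields($result);

  fwrite($handle, 'DROP TABLE '.$table.";\n");
  $row2 = mysql_fetch_row(mysql_query('SHOW CREATE TABLE '.$table, $link));
  fwrite($handle, $row2[1].";\n\n");

  while ($row = mysql_fetch_row($result)) {
    $vals = array();
    for ($j = 0; $j < $num_fields; $j++) {
      $vals[] = isset($row[$j])
        ? '"'.str_replace("\n", "\\n", addslashes($row[$j])).'"'
        : '""';
    }
    fwrite($handle, 'INSERT INTO '.$table.' VALUES('.implode(',', $vals).");\n");
  }
  fwrite($handle, "\n");
}
?>
```
With this shape, peak memory stays roughly at the size of the largest single row rather than the size of the whole database.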
 
Old 02-29-2012, 04:34 PM   #2
anomie
Senior Member
 
Registered: Nov 2004
Location: Texas
Distribution: RHEL, Scientific Linux, Debian, Fedora
Posts: 3,935
Blog Entries: 5

Rep: Reputation: Disabled
Quote:
Originally Posted by Basher52
First, is there a way to list the memory_limit with a php script?
I believe a simple phpinfo() will do that for you.
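For a script-friendly check, ini_get() returns the current value, so you can print it before and after the ini_set() call. A sketch; on many shared hosts the second value will simply be unchanged, and some hosts block ini_set() entirely:
Code:
```php
<?php
// Print the limit, try to raise it, then print it again.
echo 'memory_limit before: ' . ini_get('memory_limit') . "\n";
ini_set('memory_limit', '256M');
echo 'memory_limit after:  ' . ini_get('memory_limit') . "\n";
// If the two values match, the host is ignoring or blocking the change.
?>
```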
 
Old 03-01-2012, 07:56 AM   #3
resolv_25
Member
 
Registered: Aug 2011
Location: Croatia
Distribution: Debian 10/Ubuntu 20.04
Posts: 64

Rep: Reputation: 15
Yes, phpinfo() is the way to see the allowed memory.
If you are on a shared server, you may not be able to get 256 MB; it depends on the hosting.
However, I prefer to put a single command in a bash script and run it from a cron job. It works faster.
The command may look like this:
Code:
mysqldump --user=dbUser --password=dbPassword databaseName > databaseNameBackup.sql
 
Old 03-01-2012, 08:40 AM   #4
sundialsvcs
LQ Guru
 
Registered: Feb 2004
Location: SE Tennessee, USA
Distribution: Gentoo, LFS
Posts: 10,659
Blog Entries: 4

Rep: Reputation: 3939Reputation: 3939Reputation: 3939Reputation: 3939Reputation: 3939Reputation: 3939Reputation: 3939Reputation: 3939Reputation: 3939Reputation: 3939Reputation: 3939
Don't ask php to do such a thing! Don't ask a httpd server to do such a thing!

You need a background script, unrelated to the web server, to be performing these operations.

Yes, you can of course write that script in php if you want to ... or in any other programming language that you prefer. That's your choice. It's not the language's fault, nor its limitation. (PHP is every bit as capable as any other full-featured programming language, and, like any of them, it can be used outside of a web-page context.) Rather, it is a highly inappropriate task to be done by a web server.

Good production systems will have some kind of "batch job" processing system that is capable of farming out tasks to other computers or at least to other processes. You could, if you wish, build a web-page administrative interface to such an external system so that authorized users could start these jobs and monitor their progress.

Even though Unix/Linux does not always ship with a batch-job monitor (unlike the venerable and batch-centric IBM MVS system of yore .. which still exists, by the way), plenty of good ones are available. You don't even need to "roll your own."

Last edited by sundialsvcs; 03-01-2012 at 08:43 AM.
 
Old 03-01-2012, 11:18 AM   #5
Basher52
Member
 
Registered: Mar 2004
Location: .SE
Distribution: Arch
Posts: 401

Original Poster
Rep: Reputation: 22
@anomie: OK, thanks for the phpinfo(); I'll keep that in mind.

@resolv_25: Can I really use mysqldump in php on a server that is hosted by a company? Don't you think that is locked down? And if I can, this would be run as a bash shell script from within php, right? Testing here at home is very easy, since I can do whatever I like and use SSH for this.

@sundialsvcs: This I know, and while testing on my own test server I saw php using almost 100%, with just some percentage left for mysql. But I didn't even think that 'resolv's' version would work, and as you can see from my question to him/her, I still don't.

Last edited by Basher52; 03-01-2012 at 01:48 PM.
 
Old 03-01-2012, 01:44 PM   #6
Basher52
Member
 
Registered: Mar 2004
Location: .SE
Distribution: Arch
Posts: 401

Original Poster
Rep: Reputation: 22
Just tested a shell script, but as I figured, I got this: PHP Warning: shell_exec() has been disabled for security reasons in......
 
Old 03-01-2012, 03:08 PM   #7
Basher52
Member
 
Registered: Mar 2004
Location: .SE
Distribution: Arch
Posts: 401

Original Poster
Rep: Reputation: 22
The other thing I was talking about: copying all the web files. This takes over an hour to get "home" to my place, so I tried to use ZipArchive in php for that.
The max size of the created file (with a temporary file name, though) is 128.00 MB, and then php stops with an "Internal Server Error".

What I wonder is: is there a way to create multiple smaller files instead, with parameters or such?
Or do I have to do that myself, checking the file size after every file added and then using ::addFile to add more if it's not "big" enough?
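As far as I know, ZipArchive in the PHP versions of that era has no multi-volume or size-splitting parameter, so inside php you would indeed have to track the size yourself. If you can produce the archive at all, another option is to cut the finished file into FTP-friendly chunks and reassemble it at home with cat. A sketch using split (the file names are made up, and a dummy file stands in for the real backup):
Code:
```shell
# Create a dummy 256 KB "archive" to stand in for the real backup file.
dd if=/dev/zero of=site-backup.zip bs=1024 count=256 2>/dev/null
# Cut it into 64 KB numbered pieces: site-backup.zip.part-00, -01, ...
split -b 64k -d site-backup.zip site-backup.zip.part-
# After downloading the pieces, reassemble and verify the result:
cat site-backup.zip.part-* > rejoined.zip
cmp site-backup.zip rejoined.zip && echo OK
```
For a real multi-gigabyte archive you would use a chunk size like 100m instead of 64k; the reassembly step is the same.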
 
Old 03-02-2012, 06:05 AM   #8
resolv_25
Member
 
Registered: Aug 2011
Location: Croatia
Distribution: Debian 10/Ubuntu 20.04
Posts: 64

Rep: Reputation: 15
Quote:
Originally Posted by Basher52 View Post
@anomie: OK, thanks for the phpinfo() I'll keep that in mind.

@resolv_25: Can I really use mysqldump in php on a server that is hosted by a company? Don't you think that is locked down? And if I can, this would be run as a bash shell script from within php, right? Testing here at home is very easy, since I can do whatever I like and use SSH for this.
There are two solutions. As sundialsvcs proposed, it is better to keep the script out of php.
Hosting plans typically allow cron jobs. Just paste the previous command line into a cron job and pick the execution interval.
Alternatively, put the line in a bash script and put that script in a cron job. In the script you can back up different databases separately.
This is simpler to handle, and the process is automatic.
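As a sketch of that setup (dbUser, dbPassword, the database name, and all paths are placeholders for your own values):
Code:
```shell
#!/bin/sh
# /home/you/backup-db.sh - dump and compress one database; run from cron.
STAMP=$(date +%Y%m%d)
mysqldump --user=dbUser --password=dbPassword databaseName \
  | gzip > /home/you/backups/databaseName-$STAMP.sql.gz

# Matching crontab entry (crontab -e): run the script every night at 03:15
# 15 3 * * * /home/you/backup-db.sh
```
Compressing the dump in the same pipeline keeps the stored backups small, and the date stamp keeps each night's dump as a separate file.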
 
Old 03-02-2012, 12:39 PM   #9
Basher52
Member
 
Registered: Mar 2004
Location: .SE
Distribution: Arch
Posts: 401

Original Poster
Rep: Reputation: 22
one.com doesn't offer anything other than FTP or phpMyAdmin, so that sucks.
When I read this just recently I got furious, because I had not a CLUE about this.
So now not one bit of change is gonna happen until we move this place.

one.com has no 'cpanel' and no SSH or such for the MySQL DB administration, nor for the web files.
For the web files there is only FTP to download them "home", and that sure sucks.

I'm already out looking for other hosts.

.....after the move I think it'll all be smooth as glass :P
 
  

