LinuxQuestions.org
Programming This forum is for all programming questions.
The question does not have to be directly related to Linux and any language is fair game.

Old 06-13-2005, 11:41 PM   #1
carlosruiz
Member
 
Registered: Jul 2003
Location: Japan
Distribution: Mandrake
Posts: 53

Rep: Reputation: 15
PHP: huge functions file vs multiple small files


Hello everybody. I am currently developing a PHP application, and now that I'm getting close to finishing it, I have noticed my functions.php file is almost 1500 lines long (without comments). I wonder what the performance implications are of including such a huge file in each of the application's scripts, even when some scripts don't use most of the functions. I'm thinking of cutting the huge file into one file per function, and then using a script to include only the files required by the script being executed. This would add a couple of MySQL queries to look up which functions should be included in each script. So, what do you guys think: does that approach justify the work, or should I just include the huge file?

Thanks a lot,

P.S. By the way, how does PHP handle functions? Does it keep them all in memory, or does it just save a reference to them and load them only when they are called?

Last edited by carlosruiz; 06-13-2005 at 11:43 PM.
 
Old 06-14-2005, 02:08 AM   #2
CroMagnon
Member
 
Registered: Sep 2004
Location: New Zealand
Distribution: Debian
Posts: 900

Rep: Reputation: 33
There are a couple of middle-ground options you could take as well. You need to decide which approach matches your design best (which may be one of the options you've already outlined).

1) Group related functions together, so instead of having one file per function, you have five or six that are likely to be used together. This could cut down on the number of include statements you need to maintain, at the penalty of sometimes including unused functions. This would be the least refactoring work, and should have a reasonable performance improvement over the monolithic file approach.
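A sketch of option 1. The file and function names below are purely illustrative (your own groupings will differ); the point is that each page pulls in only the groups it actually uses:

```php
<?php
// page1.php - hypothetical page that only needs the DB and session groups.
// Unrelated groups (image handling, reports, ...) are never parsed here.
require_once 'includes/db_functions.php';      // e.g. connect_db(), run_query()
require_once 'includes/session_functions.php'; // e.g. start_session(), check_login()

$link = connect_db();
// ... rest of the page ...
?>
```

Using require_once rather than include also protects you from "function already defined" errors if two pages happen to pull in overlapping groups.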

2) I think adding an SQL query at each load might possibly be slower than just parsing the 1500 line file, especially once the file is cached. I understand the desire to save yourself some administrative effort by only needing to update the SQL database, but is it really saving that much time to edit a tuple for "page1.php" as opposed to just editing page1.php directly? I can't see the difference between that and manually adding include lines when the code changes, because you'd normally have to change (or at least read) the file to make sensible alterations anyway.

3) If you're determined to go with the SQL idea, you could make a script that generates the relevant include lines from your SQL table - that way, you edit the table and run your custom update, and you avoid the performance penalty of making an extra SQL request for every page load. If you're worried about forgetting to run the script, build it into a trigger on the table so that modifying the tuple for page1.php regenerates the correct include statements for just that file. This assumes that your version of MySQL supports triggers, of course. The downside is that this introduces an extra step if you don't have triggers (and one more thing you have to remember when migrating/upgrading the server).
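A minimal sketch of the generator idea in option 3. It assumes a hypothetical table page_functions(page, include_file) mapping each page to the files it needs, and uses the standard mysql_* extension; the credentials, table, and output paths are all placeholders:

```php
<?php
// generate_includes.php - run this after editing the page_functions table.
// Writes one auto-generated include list per page (e.g. includes/page1.inc.php).
$db = mysql_connect('localhost', 'user', 'password');
mysql_select_db('myapp', $db);

$result = mysql_query("SELECT page, include_file FROM page_functions ORDER BY page");
$includes = array();
while ($row = mysql_fetch_assoc($result)) {
    $includes[$row['page']][] = $row['include_file'];
}

foreach ($includes as $page => $files) {
    $out = "<?php\n// Auto-generated by generate_includes.php - do not edit.\n";
    foreach ($files as $file) {
        $out .= "require_once '" . addslashes($file) . "';\n";
    }
    file_put_contents('includes/' . $page . '.inc.php', $out);
}
?>
```

Each page then does a single require_once of its generated .inc.php file, so the per-request SQL lookup disappears entirely.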

No matter which option you decide to try, be sure to do some performance benchmarking before you start, so you can be sure there is actually an improvement. Print the processing time at the bottom of each page, then load each of your test pages five to ten times and take the average.
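The timing advice above can be sketched like this (note that microtime(true) requires PHP 5; on PHP 4 you have to split and add the "msec sec" string that microtime() returns):

```php
<?php
// Put this at the very top of the page, before any includes.
$page_start = microtime(true);

// ... all includes and page logic here ...

// And this at the very bottom, emitted as an HTML comment so it
// doesn't disturb the rendered page.
$elapsed = microtime(true) - $page_start;
printf("<!-- page generated in %.4f seconds -->\n", $elapsed);
?>
```

Run the comparison on the same machine, back to back, since filesystem and opcode caches will make the first load of each variant slower than the rest.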
 
Old 06-14-2005, 03:06 AM   #3
carlosruiz
Member
 
Registered: Jul 2003
Location: Japan
Distribution: Mandrake
Posts: 53

Original Poster
Rep: Reputation: 15

CroMagnon, thank you very much for your reply. I hadn't considered the fact that the file will be cached, so now that you mention it, it could actually get slower if I run a query for every loaded script; I think I will abandon that approach. I am considering, though, as you suggested, chopping the file into five or six groups of functions and loading them accordingly. I will run some load-average tests to see how it goes. Again, I appreciate your valuable help.

Regards.
 