LinuxQuestions.org (/questions/)
-   Programming (https://www.linuxquestions.org/questions/programming-9/)
-   -   PHP: huge functions file vs multiple small files (https://www.linuxquestions.org/questions/programming-9/php-huge-functions-file-vs-multiple-small-files-333310/)

carlosruiz 06-13-2005 11:41 PM

PHP: huge functions file vs multiple small files
 
Hello everybody, I am currently developing a PHP application. Now that I'm getting close to finishing it, I have noticed my functions.php file is almost 1500 lines (without comments). I wonder what the performance implications are of including such a huge file in each of the application's scripts, even when some scripts don't use most of the functions. I'm thinking of cutting the huge file into one file per function and then using a script to include only the required files depending on the script being executed; this would add a couple of MySQL queries to look up which functions should be included on each script. So what do you guys think, does that approach justify the work? Or should I just include the huge file?

Thanks a lot,

P.S. By the way, how does PHP handle functions? Does it keep them all in memory, or does it just save a reference to them and load them when they are called?

CroMagnon 06-14-2005 02:08 AM

There are a couple of middle-ground options you could take as well. You need to decide which approach matches your design best (which may be one of the options you've already outlined).

1) Group related functions together, so instead of having one file per function, you have five or six that are likely to be used together. This could cut down on the number of include statements you need to maintain, at the penalty of sometimes including unused functions. This would be the least refactoring work, and should give a reasonable performance improvement over the monolithic file approach (there's a short sketch of this below the list).

2) I think adding an SQL query at each load may well be slower than just parsing the 1500-line file, especially once the file is cached. I understand the desire to save yourself some administrative effort by only needing to update the SQL database, but is it really saving that much time to edit a tuple for "page1.php" as opposed to just editing page1.php directly? I can't see the difference between that and manually adding include lines when the code changes, because you'd normally have to change (or at least read) the file to make sensible alterations anyway.

3) If you're determined to go with the SQL idea, you could make a script that generates the relevant include lines from your SQL table - that way, you edit the table and run your custom update, and you avoid the performance penalty of making an extra SQL request for every page load. If you're worried about forgetting to run the script, build it into a trigger on the table so that modifying the tuple for page1.php regenerates the correct include statements for just that file. This assumes that your version of MySQL supports triggers, of course. The downside is that this introduces an extra step if you don't have triggers (and one more thing you have to remember when migrating/upgrading the server).
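
For option 1, a minimal sketch of what the grouped includes might look like - the file and group names here are just assumptions about how your functions might split up, not anything from your code:

Code:
<?php
// Hypothetical layout for option 1: functions grouped by what they do,
// rather than one monolithic functions.php or one file per function.
//   lib/db_functions.php      - connection and query helpers
//   lib/user_functions.php    - login/session helpers
//   lib/report_functions.php  - reporting helpers

// A page that only needs the database and user helpers pulls in just those:
require_once 'lib/db_functions.php';
require_once 'lib/user_functions.php';

// ... rest of page1.php ...
?>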

No matter which option you decide to try, be sure to do some performance benchmarking before you start, so you can be sure there is actually an improvement. Print the processing time at the bottom of each page, then load your test pages five to ten times each and take the average value.
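
Something along these lines works as a rough timer (just a sketch; microtime(true) only exists from PHP 5 onwards, so the string form of microtime() is used here to be safe):

Code:
<?php
// Rough per-page timer: note the start time, include the function files and
// do the normal page work, then print the elapsed time at the bottom.
list($usec, $sec) = explode(' ', microtime());
$start = (float)$sec + (float)$usec;

require_once 'functions.php';   // or the smaller grouped files

// ... normal page processing ...

list($usec, $sec) = explode(' ', microtime());
$elapsed = ((float)$sec + (float)$usec) - $start;
printf("<!-- page generated in %.4f seconds -->\n", $elapsed);
?>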

carlosruiz 06-14-2005 03:06 AM

CroMagnon, thank you very much for your reply. I didn't consider the fact that the file will be cached, so now that you mention it, it could get even slower if I run a query for every loading script; I think I will abandon that approach. I am considering, though, as you suggested, chopping the file into five or six groups of functions and loading them accordingly. I will run some load average tests to see how it goes. Again, I appreciate your valuable help.

Regards.
