Programming
This forum is for all programming questions. The question does not have to be directly related to Linux, and any language is fair game.
Hello everybody, I am currently developing a PHP application. Now that I'm getting close to finishing it, I have noticed my functions.php file is almost 1500 lines (without comments). I wonder what the performance implications are of including such a huge file in every one of the application's scripts, even when some scripts don't use most of the functions. I'm thinking of cutting the huge file into one file per function and then using a script to include only the required files depending on the script being executed. This would add a couple of MySQL queries per request to look up which functions each script needs. So what do you guys think: does that approach justify the work, or should I just include the huge file?
Thanks a lot,
P.S. By the way, how does PHP handle functions? Does it keep them all in memory, or does it just save a reference to them and load them when they are called?
Last edited by carlosruiz; 06-13-2005 at 11:43 PM.
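For what it's worth, the per-function split being proposed could be sketched roughly like this. A plain array stands in for the MySQL lookup so the sketch is self-contained, and all file and page names are made up for illustration:

```php
<?php
// Hypothetical map of which function files each page needs. In the
// proposed design this would come from a MySQL query; an array stands
// in for the query result here.
$functionMap = array(
    'page1.php' => array('func_db_connect.php', 'func_validate.php'),
    'page2.php' => array('func_validate.php'),
);

// Resolve the list of files to include for a given script.
function files_for_script($script, $map, $dir = 'functions/')
{
    $files = isset($map[$script]) ? $map[$script] : array();
    $paths = array();
    foreach ($files as $f) {
        $paths[] = $dir . $f;   // each file would define one function
    }
    return $paths;
}

// A page would then do something like:
// foreach (files_for_script(basename($_SERVER['SCRIPT_NAME']), $functionMap) as $p) {
//     require_once $p;
// }
```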
There are a couple of middle-ground options you could take as well. You need to decide which approach matches your design best (which may be one of the options you've already outlined).
1) Group related functions together, so instead of having one file per function, you have five or six that are likely to be used together. This could cut down on the number of include statements you need to maintain, at the penalty of sometimes including unused functions. This would be the least refactoring work, and should have a reasonable performance improvement over the monolithic file approach.
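A minimal sketch of option 1: a couple of small themed files instead of one monolithic functions.php. Real code would have these files on disk already; here they are written to a temp directory only so the sketch runs as-is, and the group/function names are invented for illustration:

```php
<?php
// Create two hypothetical "group" files, each holding related helpers.
$dir = sys_get_temp_dir() . '/inc_demo';
@mkdir($dir);
file_put_contents("$dir/string_functions.php",
    '<?php function make_slug($s) { return strtolower(str_replace(" ", "-", $s)); }');
file_put_contents("$dir/db_functions.php",
    '<?php function quote_ident($s) { return "`" . str_replace("`", "``", $s) . "`"; }');

// A page that only manipulates strings includes just that group and
// never pays to parse the database helpers:
require_once "$dir/string_functions.php";
echo make_slug("Hello World");   // hello-world
```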
2) I think adding an SQL query at each load might possibly be slower than just parsing the 1500 line file, especially once the file is cached. I understand the desire to save yourself some administrative effort by only needing to update the SQL database, but is it really saving that much time to edit a tuple for "page1.php" as opposed to just editing page1.php directly? I can't see the difference between that and manually adding include lines when the code changes, because you'd normally have to change (or at least read) the file to make sensible alterations anyway.
3) If you're determined to go with the SQL idea, you could make a script that generates the relevant include lines from your SQL table - that way, you edit the table and run your custom update, and you avoid the performance penalty of making an extra SQL request for every page load. If you're worried about forgetting to run the script, build it into a trigger on the table so that modifying the tuple for page1.php regenerates the correct include statements for just that file. This assumes that your version of MySQL supports triggers, of course. The downside is that this introduces an extra step if you don't have triggers (and one more thing you have to remember when migrating/upgrading the server).
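A generator along the lines of option 3 might look something like this. The rows of the (hypothetical) page_functions table are simulated with an array so the sketch is self-contained; in practice they would come from a MySQL query:

```php
<?php
// Pretend this came from: SELECT page, func_file FROM page_functions
$rows = array(
    array('page' => 'page1.php', 'func_file' => 'func_db_connect.php'),
    array('page' => 'page1.php', 'func_file' => 'func_validate.php'),
    array('page' => 'page2.php', 'func_file' => 'func_validate.php'),
);

// Build the block of include statements for one page.
function build_include_block($page, $rows)
{
    $block = "<?php\n// generated - do not edit by hand\n";
    foreach ($rows as $row) {
        if ($row['page'] === $page) {
            $block .= "require_once 'functions/" . $row['func_file'] . "';\n";
        }
    }
    return $block;
}

// The generated block would normally be written out, e.g. to
// includes/page1.inc.php, and included from page1.php:
echo build_include_block('page1.php', $rows);
```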
No matter which option you decide to try, be sure to do some performance benchmarking before you start, so you can be sure there is actually an improvement. Print the processing time at the bottom of each page, then load your test pages five to ten times each and take the average.
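The timing suggestion above can be done with a couple of lines: record the time at the top of the page and print the elapsed time at the bottom (the usleep call is just a stand-in for real page work):

```php
<?php
// Record the start time before any includes or page logic run.
$page_start = microtime(true);

// ... includes and page logic would go here ...
usleep(1000);  // stand-in for real work (1 ms)

// Print the elapsed wall-clock time at the bottom of the page.
$elapsed = microtime(true) - $page_start;
printf("page generated in %.4f seconds\n", $elapsed);
```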
CroMagnon, thank you very much for your reply. I hadn't considered the fact that the file will be cached, so now that you mention it, things could get even slower if I run a query for every loading script. I think I will abandon that approach. I am considering, though, as you suggested, chopping the file into five or six groups of functions and loading them accordingly. I will run some load-average tests to see how it goes. Again, I appreciate your valuable help.