Programming - This forum is for all programming questions. The question does not have to be directly related to Linux and any language is fair game.
I am curious if there is some sort of built-in function in PHP that sets up multi-page results from a database query.
The best example I can think of is when you search on Google: you see the 1 - 2 - 3 - 4 - 5 - 6 pages of results... I'm not really sure why I think there is a premade function that already does this, but I could have sworn a college professor of mine showed it to me a few years back. Perhaps I just dreamt it up...
Anyway, if there is no built-in function (which I suspect is the case), how would you suggest doing this? I don't need to see any code, just the theory. Thanks for your replies.
You could execute something like wget to fetch the results from Google, read the content of the webpage into an array, parse it for links that don't contain "google" so you keep only the external links, then dump the contents of the array into a database. It isn't that hard. It's a similar way of handling raw streams from something like del.icio.us - you read the content, parse it for the data you want, then store it.
lol no. I didn't mean I want to parse Google. I meant I want to query my own database, but when there are more than X results I need to separate them into pages, just like Google does with its results. So I'm curious how to select only, say, results 11-20 out of 400 results in a database... that kind of thing. Thanks for replying though.
The way I would attempt it would be to divide the number of results the database returns for the query by the number of results you want per page, which gives you the number of pages the results will be spread across. Using this information, you should be able to display the rows you want on each page with a simple formula, e.g.
display results from
pagenumber*numberofresultsperpage
to
((pagenumber+1)*numberofresultsperpage) - 1
I don't know if there is any built in function. Hope this helps!
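A minimal PHP sketch of that arithmetic, assuming zero-based page numbers; $totalResults, $resultsPerPage and $pageNumber are hypothetical names chosen for illustration:

<?php
// Page-count arithmetic from the formula above (pages numbered from 0).
$totalResults   = 400;  // e.g. value returned by a COUNT(*) query
$resultsPerPage = 10;
$pageNumber     = 2;    // the page the visitor asked for

$numberOfPages = (int) ceil($totalResults / $resultsPerPage);

$firstResult = $pageNumber * $resultsPerPage;               // inclusive
$lastResult  = ($pageNumber + 1) * $resultsPerPage - 1;     // inclusive

echo "Page $pageNumber of $numberOfPages: results $firstResult-$lastResult\n";
?>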
Guess I didn't understand your question - I see you were simply drawing a comparison to Google. Within your query, specify the range of results, such as "LIMIT 0, 30" for the first 30 records, then "LIMIT 30, 30" for the next 30, "LIMIT 60, 30" for the next 30, etc.
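A short sketch of that LIMIT-based approach with PDO; the DSN, credentials and the table/column names ("articles", "title", "id") are assumptions for illustration:

<?php
// LIMIT-based paging: the offset is page number * results per page.
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'password');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$perPage = 30;
$page    = isset($_GET['page']) ? max(0, (int) $_GET['page']) : 0;
$offset  = $page * $perPage;   // page 0 -> LIMIT 0, 30; page 1 -> LIMIT 30, 30; ...

// Both values are cast to int, so they are safe to interpolate directly.
$sql  = sprintf('SELECT title FROM articles ORDER BY id LIMIT %d, %d', $offset, $perPage);
$rows = $pdo->query($sql)->fetchAll(PDO::FETCH_ASSOC);

foreach ($rows as $row) {
    echo $row['title'], "\n";
}
?>

A separate COUNT(*) query gives you the total number of rows, from which you can compute how many page links to show.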
1/ You maintain a cache on the server.
On first request, you run the whole query, and store the result in $_SESSION. Then you serve the correct subset of the result as requested.
2/ You run dynamic queries (better IMO).
On first request, you run a query that will get all the keys (should be much smaller than the whole result).
Then for each request, you run the full query with an additional WHERE clause (key IN the subset of keys for the requested page) and serve the result.
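A minimal sketch of option 2 under assumed names (the "articles" table, its "id" primary key and the connection details are illustrative only): the key list is fetched once and cached, and each page re-queries only its own keys.

<?php
session_start();
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'password');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$perPage = 10;
$page    = isset($_GET['page']) ? max(0, (int) $_GET['page']) : 0;

// First request: fetch only the primary keys of every matching row.
if (!isset($_SESSION['result_keys'])) {
    $_SESSION['result_keys'] = $pdo
        ->query('SELECT id FROM articles ORDER BY id')
        ->fetchAll(PDO::FETCH_COLUMN);
}

// Keys belonging to the requested page.
$keys = array_slice($_SESSION['result_keys'], $page * $perPage, $perPage);

if ($keys) {
    // Re-run the full query restricted to this page's keys.
    $placeholders = implode(',', array_fill(0, count($keys), '?'));
    $stmt = $pdo->prepare("SELECT id, title FROM articles WHERE id IN ($placeholders) ORDER BY id");
    $stmt->execute($keys);

    foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
        echo $row['title'], "\n";
    }
}
?>

Option 1 works the same way, except the whole result set (not just the keys) is stored in $_SESSION and each page is served with array_slice(), with no further database queries.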