LinuxQuestions.org > Programming
Web Application to grab large files from web addresses
(https://www.linuxquestions.org/questions/programming-9/web-application-to-grab-large-files-from-web-addresses-373429/)

farmerjoe 10-15-2005 05:19 PM

Web Application to grab large files from web addresses
 
Anyone have any idea for a method to grab large files from other HTTP locations from within a web control panel environment? Or know of an existing way to do it? Something similar to Linux's wget, but for web applications.

Thanks,
farmerjoe

btmiller 10-16-2005 12:04 AM

Not sure exactly what you need it to do, but if you can't find any other solution and you know Perl, you can use the LWP module. It will let you craft custom HTTP requests and send them to a server, which is very flexible and powerful. Do you want to do this from within a web CP environment, or do you want to use it to make requests to and get responses from a remote CP?

farmerjoe 10-16-2005 01:46 AM

well...
I have been looking into using cURL in PHP, since that's what the CP is programmed in. However, my main problem is figuring out a way to display the progress of a large file download while it's being downloaded. I assume one would have to loop while the contents are being grabbed, constantly checking the file size or something like that. Does anyone know how to create some sort of web-based download progress meter when grabbing something with cURL in PHP?

BuckRogers01 10-16-2005 08:49 AM

If it's in PHP, then A) you will have to make max_execution_time in php.ini big enough to allow the complete download, and B) while downloading the other file, you cannot do anything else, because the script is not threaded. Also, you cannot change what has already been output to the browser, so a progress bar would have to come before the footer of the page; you would just see half a page with a = slowly moving across, which isn't too helpful.
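For point A, lifting the limit from inside the script itself looks roughly like this (untested sketch; note that set_time_limit() has no effect when safe_mode is on):

Code:

<?php
// Untested: lift the execution time limit for this request only,
// instead of (or as well as) raising max_execution_time in php.ini.
set_time_limit(0);                      // 0 = no limit
ini_set('max_execution_time', '0');     // the same thing via ini_set()
?>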

It is, however, possible to download files using cURL, and the script execution time can be changed on the fly (I think). One way to fake a progress bar is to load a page with an animated GIF saying something like "Downloading...", where the dots slowly increase, and then immediately redirect to the PHP script that actually downloads the file. Don't echo anything to the browser, and the browser should hold the last page until something is echoed, giving a pseudo progress bar.
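The downloading script itself might look something like this (completely untested sketch; grab.php, the /tmp paths and the $_GET['url'] parameter are just made-up names, and not every server sends a Content-Length, so the total size may come back as -1):

Code:

<?php
// grab.php -- untested sketch: pull a remote file to disk with cURL and
// stash the expected total size where a progress script can read it.
set_time_limit(0);                      // point A above: no time limit for this request

$url  = $_GET['url'];                   // made-up input; validate it properly in real code
$dest = '/tmp/download.part';           // where the file lands while it is downloading
$meta = '/tmp/download.size';           // total size, for the progress script to read

// HEAD request first, to learn the Content-Length (servers don't always send one)
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_NOBODY, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_exec($ch);
$total = curl_getinfo($ch, CURLINFO_CONTENT_LENGTH_DOWNLOAD);   // -1 if unknown
curl_close($ch);
file_put_contents($meta, $total);

// Now the real transfer, written straight to disk instead of into memory
$fp = fopen($dest, 'w');
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_exec($ch);
curl_close($ch);
fclose($fp);
?>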

The only way to do a real progress bar is with AJAX (Asynchronous JavaScript and XML). You would need a separate PHP/CGI script that finds the size of the data saved so far and works that out against the total size of the file you are downloading (either passed via $_GET[] or saved in a temporary file by the original script doing the download; I still don't know how you would get hold of the total size, but it must be possible). The JavaScript then repeatedly queries that separate PHP/CGI script for the amount downloaded and the total size, and updates the web page using CSS and the DOM.
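That separate progress-check script could be as simple as this (again untested; it assumes the downloader sketch above is writing the file to /tmp/download.part and has stashed the total size in /tmp/download.size):

Code:

<?php
// progress.php -- untested sketch: called repeatedly from the page,
// reports how much of the file has arrived so far.
$dest = '/tmp/download.part';           // partial file being written by the downloader
$meta = '/tmp/download.size';           // total size saved by the downloader

clearstatcache();                       // PHP caches stat() results between calls
$done  = file_exists($dest) ? filesize($dest) : 0;
$total = file_exists($meta) ? (int) trim(file_get_contents($meta)) : 0;

header('Content-Type: text/plain');
if ($total > 0) {
    echo round($done / $total * 100);   // percent complete, for the page to display
} else {
    echo $done;                         // no Content-Length known: just report bytes so far
}
?>

The JavaScript on the page would just hit that script with XMLHttpRequest every second or two and drop the returned number into a div.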

Sorry for explaining it in such a complicated manner, but it's hard to describe.

Buck

