Hi,
I've been searching endlessly for a command-line tool/utility that downloads all the components of a webpage, given a URL. As in,
$ <some_command> http://someurl.com
should fetch the base HTML and every component required to render the page: images, CSS, JS, advertisements, and so on. Essentially, it should emulate a browser. The reason I'm looking for such a tool is to measure a website's response time for a given URL from the command line. I know of several GUI tools, like HttpFox and Ethereal/Wireshark, that serve this purpose, but none on the CLI.
There are wget and curl, but from what I understand they just fetch the contents of the given URL and don't parse the HTML to download the other components.
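For a single request, curl's timing variables get me part of the way there; something like this (with http://someurl.com as a placeholder) prints the total time for the base HTML alone:
$ curl -s -o /dev/null -w '%{time_total}\n' http://someurl.com
But that measures only the one response, not everything the page needs.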
wget does do recursive downloads, but the problem is that it also goes ahead and fetches all the <a href> pages, which I don't want (example below). Given a URL, a browser gets the HTML first, parses it, and then downloads each component (CSS, JS, images) it needs to render the page. Is there a command-line tool or script that can accomplish the same task?
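For reference, my recursive attempt looked something like this, which pulls in the linked pages as well, not just the page's own images/CSS/JS:
$ wget -r -l 1 http://someurl.com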
TIA!