To ease the sorting by name, you could pipe the listing through sort.
sort -n orders the entries by their numerical value (ascending); add -r if you want to reverse the order (i.e. descending).
Numerical sorting is safer than the alphabetical ordering done by "ls" in cases where the filenames consist only of numbers, like your examples.
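As a quick sketch (run in the directory that holds the numerically named files):

```shell
# Alphabetical ordering would give 1, 10, 2, ...; numerical gives 1, 2, 10.
ls | sort -n

# Descending order instead (highest number first):
ls | sort -rn
```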
If I understand you correctly, you want to be able to see which files were downloaded most recently, right? That means you can't simply "touch" all your files each time wget finishes, because that would overwrite all the last-modification times, making every file appear equally recent.
Instead, you only want to change the timestamps of the newly downloaded files.
For this, I propose the following solution:
1. Turn on wget's timestamping (-N) to make sure it downloads only new or updated files from the remote server.
2. Use wget's -o option to write the log to a file. You may also want to use -nv (non-verbose) to keep the amount of log information limited.
3. Use standard tools like grep, cut, awk, and sed, or simple scripting (e.g. Perl), to parse the log file.
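Combining the flags from steps 1 and 2, the invocation could look something like this (the URL and log name are placeholders, and -r (recursive) is just an example; keep whatever options you already pass):

```shell
# -N  : timestamping; only fetch files that are new or updated on the server
# -nv : non-verbose logging, roughly one line per retrieved file
# -o  : write the log to the given file instead of stderr
wget -N -nv -o wget.log -r http://example.com/files/
```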
I used a similar parsing method once, but don't have the script at hand here, unfortunately.
If I remember correctly, though, you could simply run "grep saved /your/log/file" to get a list of all saved (i.e. downloaded) files. But this could be different now, since I was using a rather old version of wget at the time.
So, a good approach would be to run wget with -N and -o (and possibly -nv), take a look at the resulting log file to see which lines you need, and then filter those lines out with grep or awk. Finally, you may want to extract the parts of those lines that represent the actual filenames, using tools like cut or sed.
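Off the top of my head, a parsing sketch could look like the following. The sample log line is an assumption about the -nv format — verify it against your own log first, since the format has changed between wget releases:

```shell
# Sample wget -nv log (an assumption -- check your real log before relying on it):
cat > wget.log <<'EOF'
2024-05-01 10:00:00 URL:http://example.com/files/101 [512/512] -> "101" [1]
2024-05-01 10:00:01 URL:http://example.com/files/102 [640/640] -> "102" [1]
EOF

# Keep only lines for retrieved files, then extract the local filename
# (the part between the quotes after "->"):
grep -- '-> "' wget.log | sed 's/.*-> "\([^"]*\)".*/\1/' > filelist

cat filelist
```

The resulting filelist holds one filename per line, ready for the touch step below.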
Make sure to test this thoroughly.
Once you get a list of retrieved files from parsing the log, you can simply call "touch" on each of those files via:
Code:
for i in $(cat filelist); do
    touch "$i"
done
or something similar.