Have old website, how to only collect what's still actually being used to recreate it
I have an old website running under lampp that is a shell of its former self; it's probably only using a small portion of all the files within the lampp setup.
I'm thinking the best approach might be to spider the website to see and pull down what's actually being used, then move that to a new server. Any suggestions/tips on how to do that? I could do it from a Windows machine or from the server itself. |
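A hedged sketch of the spidering idea using GNU wget's mirroring options. The URL `http://oldserver.example/` is a placeholder for the real site address, and `site-copy` is an arbitrary output directory; adjust both to taste:

```shell
# Crawl the live site and keep only what it actually serves.
# --mirror            recurse through the site, preserving timestamps
# --page-requisites   also fetch the CSS/JS/images each page needs
# --convert-links     rewrite links so the local copy browses offline
# --adjust-extension  save dynamic output (e.g. from .php) with an .html suffix
# --no-parent         don't wander above the starting directory
wget --mirror --page-requisites --convert-links --adjust-extension \
     --no-parent --directory-prefix=site-copy http://oldserver.example/
```

One caveat with this approach: a crawler only finds pages that are linked from somewhere, so anything reached via forms, JavaScript, or bookmarked deep links can be missed.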
You might look recursively through /var/www/html (or /opt/lampp/htdocs for a XAMPP install) with
"# ls -lRt --time=atime" and check the last access date. With -t the most recently accessed will be at the top of the pile in each directory. Bear in mind that filesystems mounted with relatime or noatime may not keep access times fully up to date. |
very cool thanks
|