Backing up MediaWiki 1.16.0
I had a script (see below) which used to work for making an HTML backup of my MediaWiki. I just upgraded to 1.16.0 and now it's broken: it won't authenticate, so this part must have changed.
Of course I also use mysqldump, but I like to make an HTML snapshot of the wiki as well. Does anyone know what has changed and how to authenticate once more? Code:
WIKI_USERNAME=WikiSysop |
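The original script is truncated above, but the pre-1.16 approach it describes was typically a single credentials POST followed by a cookie-authenticated mirror pass. A hypothetical sketch of that pattern (the URL, username, and password here are placeholders, not the poster's actual values):

```shell
#!/bin/bash
# Hypothetical reconstruction of the pre-1.16 pattern: POST the
# credentials directly to Special:Userlogin, keep the session cookie,
# then mirror the wiki as HTML. All values are placeholders.
WIKI_URL="http://example.com/w"
WIKI_USERNAME="WikiSysop"
WIKI_PASSWORD="secret"

# Assemble the login form data; before 1.16 no login token was required.
build_post_data() {
    echo "wpName=${WIKI_USERNAME}&wpPassword=${WIKI_PASSWORD}&wpLoginattempt=Log+in"
}

do_backup() {
    # Log in, saving the session cookie for the mirror pass.
    wget --save-cookies cookies.txt --keep-session-cookies \
         --post-data "$(build_post_data)" -O /dev/null \
         "${WIKI_URL}/index.php?title=Special:Userlogin&action=submitlogin&type=login"

    # Mirror the wiki as browsable HTML using that cookie.
    wget --load-cookies cookies.txt --mirror --page-requisites \
         --convert-links "${WIKI_URL}/index.php?title=Special:Allpages"
}

# Only touch the network when explicitly invoked with "run".
if [ "${1:-}" = "run" ]; then
    do_backup
fi
```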
Is it possible for you to provide the error you get while running this script as well?
Regards, Alunduil |
I made a typo in the script above which has been corrected.
It's not saving the cookie into cookies.txt, which makes me wonder if the authentication URL has changed with 1.16.0. Here is the first wget with debug output (-d); certain values have been changed for obvious reasons. Code:
DEBUG output created by Wget 1.10.2 (Red Hat modified) on linux-gnu. |
This seems like kind of an indirect method of backing up your wiki. How would you picture the recovery process if you needed to use this backup?
Why not use mysqldump or mysqlhotcopy -- http://dev.mysql.com/doc/refman/5.1/...p-methods.html -- and then use gtar, rsync, rdiff-backup or something else to maintain a backup of your webtree and mysqldumps on another server or drive? Although it seems slightly more complex, recovery would be more straightforward. |
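The dump-then-archive approach suggested above can be sketched in a few lines. This is only an illustration under assumed paths and names (the database name, user, webtree, and backup directory are placeholders):

```shell
#!/bin/bash
# Sketch of the mysqldump + archive approach: dump the wiki database
# and tar up the webtree with a shared timestamped name. All paths and
# credentials are placeholders.
DB_NAME="wikidb"
DB_USER="wikiuser"
WEB_ROOT="/var/www/wiki"
BACKUP_DIR="/backup"

# Build a timestamped base name so successive backups never collide.
backup_name() {
    echo "wiki-$(date +%Y%m%d-%H%M%S)"
}

do_backup() {
    local base
    base=$(backup_name)
    # Dump the wiki database (-p prompts for the password).
    mysqldump -u "${DB_USER}" -p "${DB_NAME}" > "${BACKUP_DIR}/${base}.sql"
    # Archive the webtree alongside the dump.
    tar czf "${BACKUP_DIR}/${base}-webtree.tar.gz" "${WEB_ROOT}"
}

# Only run the dump when explicitly invoked with "run".
if [ "${1:-}" = "run" ]; then
    do_backup
fi
```

The resulting pair of files can then be shipped to another host with rsync or rdiff-backup, as suggested.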
For backup and recovery I use mysqldump. I realised that I had used the wrong wording above, which may have misled (sorry).
Using wget in the past has allowed me to view the wiki offline, i.e. on my mobile I can carry the content and view it offline. For me that is extremely useful, especially if you have no data plan or no access to wifi. With MediaWiki 1.16.0 you now have to use the wpLoginToken to authenticate, so that you can download a MediaWiki page using wget inside a script: http://labs.creativecommons.org/2011...-to-mediawiki/ I've made some small modifications to the script at the URL above to make it easier to use. It's a very handy script. Code:
#!/bin/bash |
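The script body is truncated above, but the 1.16-style login it implements is a two-step handshake: fetch the login form first to obtain the hidden wpLoginToken (and a session cookie), then submit the credentials together with that token. A minimal sketch of that flow, assuming a placeholder wiki URL and credentials:

```shell
#!/bin/bash
# Sketch of the wpLoginToken login flow required by MediaWiki 1.16:
# request the login form, extract the hidden token, then POST the
# credentials plus token using the same session cookie. The URL,
# username, password, and page name are placeholders.
WIKI_URL="http://example.com/w"
WIKI_USERNAME="WikiSysop"
WIKI_PASSWORD="secret"
PAGE="Main_Page"

# Pull the hidden wpLoginToken value out of the login form HTML.
extract_token() {
    sed -n 's/.*name="wpLoginToken" value="\([^"]*\)".*/\1/p' | head -n1
}

do_backup() {
    local token
    # First request: get the login form and its session cookie.
    token=$(wget --save-cookies cookies.txt --keep-session-cookies -q -O - \
        "${WIKI_URL}/index.php?title=Special:UserLogin" | extract_token)

    # Second request: log in with the token, reusing the same cookie jar.
    wget --load-cookies cookies.txt --save-cookies cookies.txt \
         --keep-session-cookies \
         --post-data "wpName=${WIKI_USERNAME}&wpPassword=${WIKI_PASSWORD}&wpLoginToken=${token}&wpLoginattempt=Log+in" \
         -O /dev/null \
         "${WIKI_URL}/index.php?title=Special:UserLogin&action=submitlogin&type=login"

    # Authenticated requests now work, e.g. saving a page for offline use.
    wget --load-cookies cookies.txt -O "${PAGE}.html" \
         "${WIKI_URL}/index.php?title=${PAGE}"
}

# Only touch the network when explicitly invoked with "run".
if [ "${1:-}" = "run" ]; then
    do_backup
fi
```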