Yeah, wget in recursive mode would be a killer for their system, so I can see why they block it. I'll script it via Perl and LWP; they allow that as long as you don't abuse it. Just need to figure out the regex, but it looks like they make it pretty easy.
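The regex the poster has in mind would pull article links out of the wiki's HTML. A minimal sketch of that idea — in Python rather than the Perl + LWP the poster planned, and run on a hypothetical HTML snippet since the wiki's actual URL layout isn't shown in the thread:

```python
import re

# Hypothetical fragment of a MediaWiki Special:AllPages listing.
# The "/wiki/index.php?title=" path is an assumption, not taken from the thread.
SAMPLE_HTML = """
<a href="/wiki/index.php?title=Twilight_2000" title="Twilight 2000">Twilight 2000</a>
<a href="/wiki/index.php?title=Merc_2000" title="Merc 2000">Merc 2000</a>
<a href="/wiki/index.php?title=Special:Export" title="Special:Export">Export</a>
"""

def article_titles(html):
    """Extract page titles from MediaWiki article links, skipping Special: pages."""
    titles = re.findall(r'href="/wiki/index\.php\?title=([^"&]+)"', html)
    return [t for t in titles if not t.startswith("Special:")]

print(article_titles(SAMPLE_HTML))  # ['Twilight_2000', 'Merc_2000']
```

The same pattern would work in Perl with `LWP::UserAgent` fetching each page and a polite delay between requests, per the site's "don't abuse it" condition.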
The T2K/Merc WIKI
Originally posted by kato13:
Yeah, wget in recursive mode would be a killer for their system, so I can see why they block it. I'll script it via Perl and LWP; they allow that as long as you don't abuse it. Just need to figure out the regex, but it looks like they make it pretty easy.
Russ
Originally posted by avantman42:
Cool. If you can get images & uploaded files as well as the articles, that would be very cool - the export that I did doesn't include uploaded files.
Russ
Originally posted by avantman42:
I've been looking into that (I'm paranoid about backups). It's not as simple as I'd like, but here's my checklist:
- Go to the Special:AllPages page and open each link (3 at the moment) in a new tab
- Copy/paste the list of articles into a text file
- Use a regexp search & replace to convert tabs to newlines, giving a list of articles, one per line
- Copy/paste the list of articles into the Special:Export page
- Hit the Export button
The end result is an XML file that can be imported into another MediaWiki installation. I did this earlier today, and got a 1.5MB file. I've zipped it and attached it to this message so that everyone reading this forum has access to a copy.
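The tab-to-newline step in the checklist above can be sketched as follows — a minimal Python sketch with made-up article titles, since a copy/paste from Special:AllPages typically arrives tab-separated:

```python
import re

# Hypothetical clipboard contents after copying a Special:AllPages listing:
# titles come out separated by tabs.
pasted = "Twilight_2000\tMerc_2000\tHouse_Rules"

# Convert tabs to newlines so Special:Export gets one title per line.
one_per_line = re.sub(r"\t", "\n", pasted)
print(one_per_line)
```

The resulting list pastes straight into the Special:Export text box, which then produces the importable XML dump described above.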
Russ
Originally posted by headquarters:
You flatter me to speak my nick in the company of such esteemed RPGers.
Yes General - a little praise for you too. Will definitely make a scenario that will likely kill your PC next FtF, though. (Anything less would be an insult, I feel.)