Related to How to export text from all pages of a MediaWiki?, but I want the output to be individual text files, each named after its page title.
SELECT page_title, page_touched, old_text
FROM revision,page,text
WHERE revision.rev_id=page.page_latest
AND text.old_id=revision.rev_text_id;
works to dump all pages to stdout in one go.
How can I split the output and write each page to its own file?
SOLVED
First, dump everything into a single file:
SELECT page_title, page_touched, old_text
FROM revision, page, text
WHERE revision.rev_id = page.page_latest
  AND text.old_id = revision.rev_text_id
  AND page_namespace NOT IN (6, 8, 12)  -- skip the File, MediaWiki and Help namespaces
INTO OUTFILE '/tmp/wikipages.csv'
FIELDS TERMINATED BY '\n'
ESCAPED BY ''
LINES TERMINATED BY '\n@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@\n';
Then split it into individual files with a short Python script:
import os

# Make sure the output directory exists.
os.makedirs('/tmp/wikipages', exist_ok=True)

with open('wikipages.csv', 'r') as f:
    alltxt = f.read().split('\n@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@\n')

for row in alltxt:
    lines = row.split('\n')
    if len(lines) < 3:
        # Skip empty or malformed chunks (e.g. the trailing record).
        continue
    # First line is the page title; make it filesystem-safe.
    name = lines[0].replace('/', '-')
    # Drop the page_title and page_touched lines, keep only the wikitext.
    txt = '\n'.join(lines[2:])
    with open('/tmp/wikipages/' + name + '.txt', 'w') as out:
        out.write(txt)
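As an aside, a variant that skips the intermediate dump (and the need for a delimiter that never appears in page text) is to query MySQL directly from Python and write each page as it is fetched. This is only a sketch, not something I ran for this answer: it assumes the pymysql package and uses placeholder connection credentials (host, user, password, database) that you would replace with your wiki's settings.

import os
import pymysql  # assumed driver; any MySQL client library would work similarly

# Placeholder credentials -- substitute your wiki's database settings.
conn = pymysql.connect(host='localhost', user='wikiuser',
                       password='secret', db='wikidb', charset='utf8mb4')
os.makedirs('/tmp/wikipages', exist_ok=True)

with conn.cursor() as cur:
    cur.execute("""
        SELECT page_title, old_text
        FROM revision, page, text
        WHERE revision.rev_id = page.page_latest
          AND text.old_id = revision.rev_text_id
          AND page_namespace NOT IN (6, 8, 12)
    """)
    for title, old_text in cur:
        # MediaWiki stores titles and text in binary columns, so decode if needed.
        name = (title.decode('utf-8') if isinstance(title, bytes) else title).replace('/', '-')
        text = old_text.decode('utf-8') if isinstance(old_text, bytes) else old_text
        with open('/tmp/wikipages/' + name + '.txt', 'w') as out:
            out.write(text)

conn.close()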