wikiq: a simple and fast stream-based MediaWiki XML dump parser
authors: Erik Garrison <erik@hypervolu.me>
         Benjamin Mako Hill <mako@atdot.cc>
wikiq is written in C++ using expat. It is designed to enable
researchers to rapidly extract revision histories (minus text and
comments) from large XML datasets.
To use, first make sure you have libexpat and libpcrecpp installed, then:

% ./wikiq -h   # prints usage
% 7za e -so hugewikidatadump.xml | ./wikiq >hugewikidatadump.tsv
In addition to parsing MediaWiki XML data dumps into a tab-separated
tabular format, wikiq can match Perl-compatible regular expressions
against revision content, extract article diffs, and match regexes
against the additions and deletions between revisions. Any number of
regular expressions may be supplied on the command line, and each may
be tagged using the '-n' and '-N' options.
MD5 checksums of revision text are computed at runtime and used to
detect reversions to earlier revisions.
wikiq generates these fields for each revision:

title, articleid, revid, timestamp, anon, editor, editorid, minor,
text_length, text_md5, reversion, additions_size, deletions_size
.... and additional fields for each regex executed against revision
content or against the additions and deletions between revisions.
Boolean fields are TRUE/FALSE, except for reversion, which is blank
unless the revision is a revert to a previous revision, in which case
it contains the ID of the revision that was reverted to.