Legacy Version, Wikistats 1
Wikistats 1, available at stats.wikimedia.org, consists of two almost independent clusters of data and scripts.
- Data about edits, editors and content (harvested from xml dumps)
- Data about pageviews, per wiki, per month, globally or per country (nowadays harvested with Hadoop/Hive)
New Version, Wikistats 2
Dec 2017: The WMF Analytics Team is happy to announce the first release of Wikistats 2. Wikistats has been redesigned for architectural simplicity, faster data processing, and a more dynamic and interactive user experience. The data used in the reports will also be made available for external processing.
The first goal is to match the numbers of the current system and to provide the most important reports, as decided by the Wikistats community (see survey). Over time, we will continue to migrate reports and add new ones that you find useful. We can also analyze the data in new and interesting ways, and we look forward to hearing your feedback and suggestions.
Data for Wikistats 2
The data behind Wikistats 2 (at least the data on edits, editors and content) is based on Analytics/Data_Lake/Edits. It is processed in multiple steps:
- Extracted as raw data from the Labs database replicas on a monthly basis, by a cron job.
- Processed by an Oozie job running Scala, and labeled with the month's snapshot (e.g. 2017-06).
- Prepared and loaded into Druid, to allow for fast slicing and dicing.
- Made accessible over the internet through AQS (the Analytics Query Service), which queries Druid.
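As a rough illustration of the last step, a client can reach this data through the public REST API in front of AQS. The sketch below builds a URL for aggregate edit counts on one wiki; the endpoint path and parameter names follow the wikimedia.org REST API conventions, but treat the exact values as an assumption for illustration rather than a definitive reference.

```python
# Minimal sketch: construct an AQS-style URL for aggregate edit counts.
# The path segments (project, editor type, page type, granularity,
# start/end timestamps) are assumptions based on the public REST API
# conventions, not a guaranteed contract.

AQS_BASE = "https://wikimedia.org/api/rest_v1/metrics"

def edits_aggregate_url(project,
                        editor_type="all-editor-types",
                        page_type="all-page-types",
                        granularity="monthly",
                        start="20170101",
                        end="20170701"):
    """Build an AQS URL for aggregate edit counts on one wiki."""
    return "/".join([AQS_BASE, "edits", "aggregate", project,
                     editor_type, page_type, granularity, start, end])

url = edits_aggregate_url("en.wikipedia.org")
print(url)
```

Fetching that URL with any HTTP client would return a JSON payload of monthly edit counts served by AQS from Druid.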