The [[Analytics/Data Lake|Analytics Data Lake]] contains a number of '''editing datasets'''.


To access this data, see [[SRE/Production access|how to request and set up access]]. For the rules and guidelines that govern access, see the [[Analytics/Data access guidelines|Data Access Guidelines]] and [[Analytics/Data access|accessing sensitive data]]. For recipes that work with lots of data, see [[Analytics/Data Lake/Cookbook]].


'''Note''': In comparison to [[Analytics/Data Lake/Traffic|traffic datasets]], edit datasets are ''not'' continuously updated. They are regularly updated by fully re-importing/re-building them, creating a new '''<code>snapshot</code>'''. This '''<code>snapshot</code>''' notion is key when querying the Edits datasets, since including multiple snapshots doesn't make sense for most queries. As of 2017-04, snapshots are provided monthly. When we import, we grab all the data available from all tables except the <code>revision</code> table, for which we filter by <code>where rev_timestamp <= <<snapshot-date>></code>. If the snapshot is a little late because of processing problems, then by the time it finishes it may have more data in tables like logging, archive, etc. These should not affect history reconstruction because we base everything on revisions, but they'll affect any queries you may run on those tables separately.
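For example, a minimal sketch of a query that pins a single snapshot, using the <code>mediawiki_history</code> table from the <code>wmf</code> database (described under Processed data below); the snapshot value is only an illustration, so check which snapshots actually exist before running it:

<syntaxhighlight lang="sql">
-- Minimal sketch: always restrict to one snapshot partition.
-- '2021-08' is an example value, not necessarily an existing snapshot.
SELECT wiki_db,
       COUNT(*) AS events
FROM wmf.mediawiki_history
WHERE snapshot = '2021-08'
GROUP BY wiki_db
ORDER BY events DESC
LIMIT 10;
</syntaxhighlight>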


The pipeline used to generate these edits datasets is described at [[Analytics/Systems/Data Lake/Edits/Pipeline]].


== Datasets ==


=== Reference Data ===
* <code>[[Analytics/Data Lake/Edits/Mediawiki project namespace map|wmf_raw.mediawiki_project_namespace_map]]</code>


=== Raw Mediawiki data ===
These are unprocessed copies of the [[MariaDB]] [[mw:Manual:Database layout|application tables]] (most of them publicly available) that back our MediaWiki installations. They are stored in the <code>wmf_raw</code> database. The main difference from the original tables is that the import bundles all wikis together in every table, which facilitates cross-wiki queries. Every table therefore contains an extra field, <code>wiki_db</code>, that lets you choose which wikis to query. This field is also a Hive [[Analytics/Systems/Cluster/Hive/Queries#Always restrict queries to a date range (partitioning)|partition]], so restricting on it makes queries much faster because data for other wikis is not read at all (see the example query after the table list below).
* <code>[[mw:Manual:Archive table|mediawiki_archive]]</code>
* <code>[[mw:Extension:CheckUser/cu changes table|mediawiki_cu_changes]]</code> (from the [[mw:Extension:CheckUser|CheckUser]] extension)
* <code>[[mw:Manual:ipblocks table|mediawiki_ipblocks]]</code>
* <code>[[mw:Manual:logging table|mediawiki_logging]]</code>
* <code>[[mw:Manual:page table|mediawiki_page]]</code>
* <code>[[mw:Manual:pagelinks table|mediawiki_pagelinks]]</code>
* <code>[[mw:Manual:redirect table|mediawiki_redirect]]</code>
* <code>[[mw:Manual:revision table|mediawiki_revision]]</code>
* <code>[[mw:Manual:user table|mediawiki_user]]</code>
* <code>[[mw:Manual:user groups table|mediawiki_user_groups]]</code>
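The sketch below shows the kind of query these tables are meant for, assuming the raw tables are partitioned by <code>snapshot</code> as well as <code>wiki_db</code>; the snapshot value is an example, and the <code>rev_timestamp</code> format is assumed to be the usual MediaWiki <code>yyyyMMddHHmmss</code> string:

<syntaxhighlight lang="sql">
-- Sketch: count 2021 revisions on two wikis from one raw snapshot.
-- Restricting snapshot and wiki_db (both partitions) keeps the scan small.
SELECT wiki_db,
       COUNT(*) AS revisions
FROM wmf_raw.mediawiki_revision
WHERE snapshot = '2021-08'                -- example snapshot value
  AND wiki_db IN ('enwiki', 'dewiki')
  AND rev_timestamp LIKE '2021%'          -- assumes yyyyMMddHHmmss strings
GROUP BY wiki_db;
</syntaxhighlight>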


=== Processed data ===
These are preprocessed datasets, usually stored in Parquet format and sometimes containing additional fields. The tables can be found in the <code>wmf</code> database. A sketch query after the list below shows a typical use of <code>mediawiki_history</code>.


* <code>[[Analytics/Data Lake/Edits/Mediawiki history|mediawiki_history]]</code>: fully denormalized dataset containing user, page and revision processed data
* <code>[[Analytics/Data Lake/Edits/Mediawiki history_dumps|mediawiki_history dumps]]</code>: TSV dump of the fully denormalized Mediawiki history dataset, available for download from [https://dumps.wikimedia.org/other/mediawiki_history Mediawiki Dumps]
* <code>[[Analytics/Data Lake/Edits/Mediawiki history reduced|mediawiki_history_reduced]]</code>: a reduced version of <code>mediawiki_history</code>, with fewer fields and specific precomputed events, so that the [[Analytics/Systems/Druid|druid]] datastore can compute by-page and by-user activity levels
* <code>[[Analytics/Data Lake/Edits/Mediawiki user history|mediawiki_user_history]]</code>: a subset of <code>mediawiki_history</code> containing only user events
* <code>[[Analytics/Data Lake/Edits/Mediawiki page history|mediawiki_page_history]]</code>: a subset of <code>mediawiki_history</code> containing only page events
* <code>[[Analytics/Data Lake/Edits/Metrics|mediawiki_metrics]]</code>: Dataset providing precomputed metrics over edits data (e.g. monthly new registered users or daily edits by anonymous users)
* <code>[[Analytics/Data_Lake/Content/XMLDumps/Mediawiki_wikitext_current|mediawiki_wikitext_current]]</code>: Avro version of the current-pages XML dumps (updated monthly, in the middle of the month). It contains the text of each page's latest revision as well as some page and user information.
* <code>[[Analytics/Data_Lake/Content/XMLDumps/Mediawiki_wikitext_history|mediawiki_wikitext_history]]</code>: Avro version of the full revision-history XML dumps (updated monthly, late in the month). It contains the text of every non-deleted revision as well as some page and user information.
* <code>[[Analytics/Data Lake/Edits/Edit hourly|edit_hourly]]</code>: Cube-like data set focused on edits. Its structure resembles the one from [[Analytics/Data Lake/Traffic/Pageview hourly|pageview_hourly]]. It has an hourly granularity and is partitioned by snapshot (as it is computed from [[Analytics/Data Lake/Edits/Mediawiki history|mediawiki_history]]).
* [[Analytics/Data_Lake/Edits/Geoeditors|Geoeditors]]: Counts of editors by project by country at different activity levels.  For reference, this is migrated from the old [[Analytics/Systems/Geowiki]].
** <code>mediawiki_geoeditors_daily</code>
** <code>mediawiki_geoeditors_monthly</code>
** <code>mediawiki_geoeditors_edits_monthly</code>
** [[Analytics/Data_Lake/Edits/Geoeditors/Public|Public bucketed version of geoeditors monthly]]
* <code>[[Analytics/Data_Lake/Edits/Wikidata_entity|Wikidata entity]]</code>: A parquet version of the wikidata json dumps. Updated weekly, partitioned by snapshot.
* <code>[[Analytics/Data_Lake/Edits/Wikidata_item_page_link|Wikidata item page link]]</code>: Links between Wikidata items and wiki pages (wiki_db, page_id). It is computed every week from the <code>wikidata_entity</code>, <code>mediawiki_page_history</code> and <code>project_namespace_map</code> tables. Warning: the page-history table is only updated monthly, so as the month progresses, item-to-page links become less precise.
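As a sketch of typical use of <code>mediawiki_history</code>, the query below counts monthly edits for one wiki from a single snapshot. The field names <code>event_entity</code>, <code>event_type</code> and <code>event_timestamp</code> are assumptions based on the table's documentation, so verify them against the actual schema before relying on this:

<syntaxhighlight lang="sql">
-- Sketch: monthly edit counts for English Wikipedia from one snapshot.
-- Field names are assumed from the mediawiki_history documentation;
-- check the actual schema with: DESCRIBE wmf.mediawiki_history;
SELECT substr(event_timestamp, 1, 7) AS edit_month,
       COUNT(*)                      AS edits
FROM wmf.mediawiki_history
WHERE snapshot = '2021-08'            -- example snapshot value
  AND wiki_db = 'enwiki'
  AND event_entity = 'revision'
  AND event_type = 'create'
GROUP BY substr(event_timestamp, 1, 7)
ORDER BY edit_month;
</syntaxhighlight>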


=== Public dataset ===
Download from https://dumps.wikimedia.org/other/analytics/


=== Limitations of the historical datasets ===
Users of this data should be aware that the reconstruction process is not perfect. The resulting data is not 100% complete throughout all of wiki history. In some specific slices of the data set, some fields may be missing (null) or approximated (inferred values).

==== Why? ====
* MediaWiki databases are not meant to store history (revisions, yes, of course; but not user history or page history). They hold part of the history in the logging table, but it is incomplete and formatted in many different ways depending on the software version. This makes the reconstruction of MediaWiki history a really complex task. Sometimes the data is simply not there and cannot be reconstructed.
* The size of the data is considerably large. The reconstruction algorithm needs to reprocess the whole database(s) at every run, from the beginning of time, because MediaWiki constantly updates old records of the logging table. This presents hard performance challenges for the reconstruction job and makes the code much more complex. We need to balance the complexity of the job with data quality: at some point we would have to add a lot of complexity to "maybe" improve quality for a small percentage of the data. For example, if only 0.5% of pages have field X missing and getting the information to fix the field would make reconstruction twice as complex, it will not be corrected but rather documented as not present. This is a balance of requirements, so please let us know if we are missing something important.


==== How much/Which data is missing? ====
After vetting the data for some time, we estimated that the recoverable data we failed to recover represented less than 1%. This data corresponded mostly to the earlier years of reconstructed history (2007-2009), and especially to deleted pages. We do not yet have an in-depth analysis of the completeness of the data; it is in our backlog, see [[phab:T155507]].


==== Will there be improvements in the future to correct this missing data? ====
Yes, if we know that the improvement will have enough benefit. The task mentioned above would help in measuring that.


==== Examples ====
''History of deleted pages that are (re)created:'' Correctly identifying a page as deleted and recreated might be straightforward for small sets of pages. It is also simpler when "recreated" does not mean the page was undeleted by an administrator. As mentioned above, the way MediaWiki logs data has changed over time, which further complicates identification, particularly at the scale of all wikis. You might therefore find pages that were recreated with the same page ID, namespace, and title, whose creation and deletion timestamps in the history table appear to be incorrect. If you want to analyze such cases, further narrowing the dataset (e.g. by time) might allow them to be processed correctly.
