Revision as of 14:47, 1 May 2017 by Nuria (→ Pagecounts (legacy data))

This page documents the Pageview API (v1), a public API developed and maintained by the Wikimedia Foundation that serves analytical data about article pageviews of Wikipedia and its sister projects. With it, you can get pageview trends for specific articles or projects, filter by agent type or access method, and choose different time ranges and granularities; you can also get the most viewed articles of a given project and timespan. Have fun!

Quick start

Technical Documentation (includes interactive examples).

Pageview counts by article

Daily counts

Get a pageview count timeseries of en.wikipedia's article Albert Einstein for the month of October 2015:
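The example link did not survive in this copy; based on the Pageview API's documented per-article URL structure, the request can be built like this (a sketch, not the page's original link):

```python
# Per-article endpoint:
#   /per-article/{project}/{access}/{agent}/{article}/{granularity}/{start}/{end}
base = "https://wikimedia.org/api/rest_v1/metrics/pageviews"
url = (f"{base}/per-article/en.wikipedia/all-access/all-agents/"
       "Albert_Einstein/daily/2015100100/2015103100")
print(url)
```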


Get a pageview count timeseries of de.wikipedia's article Johann Wolfgang von Goethe from October 13th 2015 to October 27th 2015 counting only the pageviews generated by human users:
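A sketch of the corresponding request, assuming the same documented URL structure; passing user as the agent parameter restricts counts to human traffic:

```python
# "user" as the agent counts only pageviews generated by human users.
base = "https://wikimedia.org/api/rest_v1/metrics/pageviews"
url = (f"{base}/per-article/de.wikipedia/all-access/user/"
       "Johann_Wolfgang_von_Goethe/daily/2015101300/2015102700")
print(url)
```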


Get the number of pageviews of es.wiktionary's entry hoy generated via mobile web on November 1st, 2015:
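Sketched the same way, with mobile-web as the access method and a one-day range (start equals end):

```python
# A single day is requested by making start and end the same timestamp.
base = "https://wikimedia.org/api/rest_v1/metrics/pageviews"
url = (f"{base}/per-article/es.wiktionary/mobile-web/all-agents/"
       "hoy/daily/2015110100/2015110100")
print(url)
```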


Monthly counts

Get a monthly pageview count of de.wikipedia's article Barack_Obama for the year 2016:
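A sketch of the request using the monthly granularity of the per-article endpoint (the original link is missing from this copy):

```python
# "monthly" granularity returns one item per month in the range.
base = "https://wikimedia.org/api/rest_v1/metrics/pageviews"
url = (f"{base}/per-article/de.wikipedia/all-access/all-agents/"
       "Barack_Obama/monthly/2016010100/2016123100")
print(url)
```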

Slice and dice pageview counts

Get a daily pageview count timeseries of all projects for the month of October 2015:
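This uses the aggregate endpoint; a sketch of the request, per the documented URL structure:

```python
# Aggregate endpoint:
#   /aggregate/{project}/{access}/{agent}/{granularity}/{start}/{end}
# "all-projects" sums pageviews across every Wikimedia project.
base = "https://wikimedia.org/api/rest_v1/metrics/pageviews"
url = f"{base}/aggregate/all-projects/all-access/all-agents/daily/2015100100/2015103100"
print(url)
```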


Get an hourly timeseries of all projects' pageviews belonging to human users visiting via the mobile app on October 1st, 2015:
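A sketch with hourly granularity; aggregate timestamps are YYYYMMDDHH, so one day spans hours 00 through 23:

```python
# Hourly granularity over a single day: hours 00 to 23.
base = "https://wikimedia.org/api/rest_v1/metrics/pageviews"
url = f"{base}/aggregate/all-projects/mobile-app/user/hourly/2015100100/2015100123"
print(url)
```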


Get the number of pageviews of ca.wikipedia generated by spiders on mobile web on November 1st, 2015:
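A sketch of the request filtering by both access method (mobile-web) and agent (spider):

```python
# "spider" as the agent counts crawler traffic only.
base = "https://wikimedia.org/api/rest_v1/metrics/pageviews"
url = f"{base}/aggregate/ca.wikipedia/mobile-web/spider/daily/2015110100/2015110100"
print(url)
```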


Most viewed articles

Get the top 1000 most visited articles from en.wikipedia for October 10th, 2015:
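The top endpoint takes the date as separate year/month/day path segments; a sketch of the request (the original link is missing from this copy):

```python
# Top endpoint: /top/{project}/{access}/{year}/{month}/{day}
base = "https://wikimedia.org/api/rest_v1/metrics/pageviews"
url = f"{base}/top/en.wikipedia/all-access/2015/10/10"
print(url)
```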


Get the top 1000 articles from pt.wikipedia visited via the mobile app on November 1st, 2015:
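A sketch of the same endpoint filtered to the mobile app:

```python
# "mobile-app" as the access parameter restricts the top list to app traffic.
base = "https://wikimedia.org/api/rest_v1/metrics/pageviews"
url = f"{base}/top/pt.wikipedia/mobile-app/2015/11/01"
print(url)
```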


Get the top 1000 most visited articles from en.wikisource for all days in October, 2015:
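Passing all-days in the day slot asks for the whole month; a sketch:

```python
# "all-days" in the day position returns the monthly top list.
base = "https://wikimedia.org/api/rest_v1/metrics/pageviews"
url = f"{base}/top/en.wikisource/all-access/2015/10/all-days"
print(url)
```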


Pageviews for ALL projects





Pagecounts (legacy data)


What is it?

The Pageview API is a collection of REST endpoints that serve analytical data about pageviews in Wikimedia's projects. It's developed and maintained by WMF's Analytics and Services teams, and is implemented using Analytics' Hadoop cluster and RESTBase. This API is meant to be used by anyone interested in pageview statistics on Wikimedia wikis: Foundation, communities, and the rest of the world.

How to access

The API is public and accessible over HTTPS; it doesn't need authentication, and it supports CORS. The URLs are structured like this:

/metrics/pageviews/{endpoint}/{parameter 1}/{parameter 2}/.../{parameter N}


Please see AQS's RESTBase docs for a complete and interactive technical reference on the Pageview API endpoints.
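As an illustration of the URL structure above, path parameters can be joined onto the base URL programmatically (a sketch; the helper name here is invented for this example, not part of any client library):

```python
BASE = "https://wikimedia.org/api/rest_v1/metrics/pageviews"

def pageviews_url(endpoint, *parts):
    """Join an endpoint name and its path parameters onto the base URL."""
    return "/".join([BASE, endpoint, *parts])

# Example: the per-article query for Albert Einstein, October 2015.
url = pageviews_url("per-article", "en.wikipedia", "all-access", "all-agents",
                    "Albert_Einstein", "daily", "2015100100", "2015103100")
print(url)
```

Fetching the URL with any HTTP client returns a JSON body whose items carry the counts.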

Updates and backfilling

The data is loaded at the end of the timespan in question: data for 2015-12-01 will be loaded on 2015-12-02 00:00:00 UTC; data for 2015-11-10 18:00:00 UTC will be loaded on 2015-11-10 19:00:00 UTC; and so on. Loading can take a long time: usually a few hours, but sometimes 24 hours or more if there are problems. See #Gotchas below for more details. The API serves data starting at 2015-08-01.

As for the date range available, we only have the quality source data we need going back to May 1st, 2015. We will finish back-filling to that date but we can't go further back since we delete the more sensitive raw logs that we generate this data from (for privacy reasons).


Gotchas

Very high number of views of the "-" page
The dash is used as a special value meaning "no page title found" when extracting titles from URLs, so the page titled "-" may appear to receive an unusually high number of pageviews.
404 means zero or not loaded yet
At some point you may get a 404 Not Found response from the API. Sometimes this means that there are 0 pageviews for the project, timespan, and filters you specified in the query. It can also happen when your client requests data for today and the corresponding data has not yet been loaded into the API's database (see #Updates_and_backfilling). For implementation reasons, the API cannot distinguish between an actual zero and data that simply hasn't been loaded yet. For now, it's up to the user to handle that.
404s within timeseries
Because of the same caveat (404 means zero or not loaded yet), if you request a timeseries from the API, you may get no data for the dates that have 0 pageviews. This can create holes in the timeseries and break charting libraries. For now, it's up to the user to detect the missing dates and fill in the zeros.
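A minimal sketch of that client-side gap filling, assuming the documented response shape (an "items" list whose entries carry a YYYYMMDDHH "timestamp" and a "views" count); the sample numbers here are invented:

```python
from datetime import date, timedelta

# `items` mimics the API's response shape; Oct 2 is absent (a 404/zero day).
items = [
    {"timestamp": "2015100100", "views": 120},
    {"timestamp": "2015100300", "views": 95},
]
views = {it["timestamp"]: it["views"] for it in items}

# Walk every day in the requested range, defaulting missing days to 0.
start, end = date(2015, 10, 1), date(2015, 10, 3)
series = []
d = start
while d <= end:
    ts = d.strftime("%Y%m%d") + "00"
    series.append((ts, views.get(ts, 0)))
    d += timedelta(days=1)
print(series)  # Oct 2 now appears with a zero instead of a hole
```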
429 throttling
The client has made too many requests and is being throttled. This happens when the storage cannot keep up with the request rate from a given IP. Throttling is enforced at the storage layer, so if you request data that is already in the cache (because another client requested it earlier), there is no throttling. Throttling will be enabled in late May 2016.
Pageviews for yesterday not available
Data loads into the pageview API from a large stream of data. This process can take a while. It usually is done within a few hours, but can last 24 hours or more if there are problems. We sometimes also have to re-load data if we find bugs or problems. We will try to announce any and all such significant problems on the analytics-l mailing list.

Sample app

Here is a simple web application sample that shows how to access the Analytics Query Service via JavaScript.

Clients and Tools

As of 2016 the API is fairly new, but there are a few clients already available:

Some tools:


Initial release. Featuring 3 endpoints for pageview metrics: per-article, aggregate and top. Some endpoints do not support all granularities yet.
Remove "-" from the top pageviews. The "-" page is both a redirect to the Hyphen-minus page and the way the Analytics team flags pages with unknown titles (such as search pages, diff pages, and other action-specific pages). The community globally asked for this title to be removed from the top list.
Strip out 'www' if it's passed in the project parameter; this was confusing when people tried to look up www.mediawiki. Fix a decoding bug where articles with % in their titles caused a 500 error. Fix the date range to include both start and end in the results.

Issues with data

Issues with data are documented in the hive data store from which data is extracted: Analytics/Data/Pageview_hourly#Changes_and_known_problems_since_2015-06-16

See also