
Add a wiki

Revision as of 18:11, 11 August 2016 by imported>Dzahn (→‎After you finished: === Tell wikistats labs project to add the wiki ===)

This page will walk you through how to add a new wiki project. This includes new languages on sister projects, and wikis for committees, chapters etc.


  • Tell the Ops list, jcrespo (jynus), or Coren in advance that this is happening so the storage layer can be prepared and checked (labs, backups, dumps). In particular, it needs to be made clear whether the wiki will be public: if so, ops will arrange for the wiki to be replicated to labs; if not, ops will need to add the wiki to $private_wikis in the puppet repository's manifests/realm.pp.

This can be fixed after the wiki creation, but it is more painful/involves custom queries.


  • First of all, the DNS entry for the new wiki has to be added. Make the following changes to your local checkout of operations/dns.git and submit to gerrit for review.
    • If it is a language project, add the language code to /templates/helpers/langs.tmpl in operations/dns repo. If it is not a new language, then it has probably been done already as these are shared by all the sister projects.
    • If it is a wiki in the * domain (a chapter wiki or special wiki), add it to /templates/. Make sure to add a mobile entry if necessary.
  • Merge the change in gerrit and run authdns-update. Query the DNS servers to make sure it has been correctly deployed. See DNS#HOWTO for details.
  • For new languages, there is also a need to regenerate zones. Run on ns0, ns1 and ns2: authdns-gen-zones -f /srv/authdns/git/templates /etc/gdnsd/zones && gdnsd checkconf && gdnsd reload-zones
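To double-check the deployment, you can query each authoritative server for the new records. A minimal sketch that only prints the verification commands, assuming a hypothetical Spanish Wikinews (es.wikinews.org):

```shell
# Print dig commands to verify the new records on each authoritative server.
# The language/project values are placeholders for the wiki being created.
lang="es"
project="wikinews"
for ns in ns0 ns1 ns2; do
  echo "dig +short ${lang}.${project}.org @${ns}.wikimedia.org"
done
```

Run the printed commands from any host with DNS access; each server should return the expected addresses.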

Apache configuration

First, you will have to log in to our deployment host. At the time of writing this is the host tin; you will have to go via bast1001. Then gather all relevant information for the new project, including all configuration variables and entries. Each wiki will have different requirements (project name, namespaces, extensions to be enabled, etc.).

You now have to determine if this is a new language project (say something like Spanish Wikibooks) or a chapter wiki, or a special wiki (anything else, including private wikis). Then choose the correct part of this manual:

Language Project


IMPORTANT: For Private Wikis

  • Private wiki databases must not be replicated to the labsdb MySQL instances!
    • Before creating a database for a new private wiki, make sure to add the db name to the puppet global array $private_wikis in manifests/realm.pp.
    • Deploy this config change with puppet and manually restart the Prelabsdb-db (Sanitarium) MySQL instance that will house this wiki's db (most likely s3).
    • If you need help with this, please ask a member of the Ops team for help. This is very important.
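The realm.pp change amounts to appending the new database name to the array. A rough sketch (the surrounding entry and the new name are placeholders, not real wikis):

```
$private_wikis = [
    'existingprivatewiki',   # ...existing entries stay as they are...
    'newprivatewiki',        # <- add the new wiki's db name here
]
```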

MediaWiki configuration

The database lists and their purposes:

Database lists of wikis in each MySQL database cluster:
  • all.dblist – All wikis should be listed here
  • closed.dblist – Any closed (no write access, full read access) wikis
  • deleted.dblist – Wiki databases which MediaWiki is no longer configured to access

Database lists of wikis arranged into their relevant size:
  • flaggedrevs.dblist – All wikis running the FlaggedRevs extension
  • securepollglobal.dblist – $wgSecurePollCreateWikiGroups wikis: Board Election wikis
  • visualeditor-default.dblist – All wikis where VisualEditor is enabled by default
  • commonsuploads.dblist – All wikis which should have local uploading soft-disabled; uploads go to Commons instead
  • fishbowl.dblist – All fishbowl (restricted write access, full read access) wikis
  • private.dblist – All private (read and write restricted) wikis
  • special.dblist – All special wikis
  • wikidata.dblist – All wikis running the Wikidata repo
  • wikidataclient.dblist – All wikis running the Wikidata client (most new language-project wikis should start off like this)
  • wikimania.dblist – All Wikimania wikis
  • wikimedia.dblist – All chapter wikis
  • wikibooks.dblist – All Wikibooks wikis
  • wikinews.dblist – All Wikinews wikis
  • wikipedia.dblist – All Wikipedia wikis
  • wikiquote.dblist – All Wikiquote wikis
  • wikisource.dblist – All Wikisource wikis
  • wikiversity.dblist – All Wikiversity wikis
  • wikivoyage.dblist – All Wikivoyage wikis
  • wiktionary.dblist – All Wiktionary wikis
  • Now it is time to actually create the database (this script also performs other tasks, such as notifying the 'newprojects' mailing list). This has to be done on terbium, not mira (mira is in codfw, so it has no DB write access). If the wiki is going to run the Wikidata client, make sure it is added to the dblist on terbium **before** running this script, or things will break. The generic invocation is:
    • mwscript extensions/WikimediaMaintenance/addWiki.php --wiki=aawiki languagecode projectname databasename domain
  • There are three different ways to do this, depending on whether you did Part A, B, or C above:
  • Part A is for standard language additions to projects, like adding a Spanish Wikinews, which will be my example.
    • EG: mwscript extensions/WikimediaMaintenance/addWiki.php --wiki=aawiki es wikinews eswikinews to add a Spanish Wikinews.
  • Part B is for chapter wikis. My example will be adding a Finnish chapter wiki:
    • EG: mwscript extensions/WikimediaMaintenance/addWiki.php --wiki=aawiki fi wikimedia fiwikimedia
  • Part C is for non-standard special wikis such as committees, chapters, and the like. My example will be adding the strategy wiki:
    • EG: mwscript extensions/WikimediaMaintenance/addWiki.php --wiki=aawiki en wikimedia strategywiki

(Note the 'aawiki' in these examples: aawiki is in s3, so this will create the database in s3, which is usually the right place for new small wikis.)
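The arguments follow a regular naming scheme, which can be made explicit. A sketch assuming the standard lang.project.org domain scheme (all values are example placeholders):

```shell
# Compose an addWiki.php invocation for a new language project (Part A).
# Values are placeholders; the domain assumes the lang.project.org scheme.
langcode="es"
project="wikinews"
dbname="${langcode}${project}"        # eswikinews
domain="${langcode}.${project}.org"   # es.wikinews.org
echo mwscript extensions/WikimediaMaintenance/addWiki.php \
  --wiki=aawiki "$langcode" "$project" "$dbname" "$domain"
```

The echo makes this a dry run; drop it to actually execute the command on terbium.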

  • Merge the config change in Gerrit, and pull it onto tin
  • Check *.dblist files now contain the new wiki
  • Run sync-dir dblists to synchronize all database lists
  • Check wikiversions.json is sane
  • Run sync-wikiversions to synchronize the version number to use for this wiki
  • Run sync-file wmf-config/InitialiseSettings.php
  • Run sync-file w/static/images/project-logos/path-to-logo.png
  • Run sync-file langlist (this must be done before the interwiki cache below)
  • Unless it's a language project, update the meta:Interwiki map
  • Deploy the interwiki update (for all new wiki creations):
    • mwscript extensions/WikimediaMaintenance/dumpInterwiki.php --protocolrelative > wmf-config/interwiki.php
    • Commit change, upload to gerrit for review
    • Review and merge
    • sync-file wmf-config/interwiki.php
    • Update meta:Interwiki map date of last update
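The sync steps above can be sketched as one sequence. This is only an echo of the commands (the logo path is the placeholder from above), not a definitive deploy script:

```shell
#!/bin/sh
# Echo the deploy sequence run from tin; replace 'echo' with real execution.
set -e
run() { echo "+ $*"; }

run sync-dir dblists
run sync-wikiversions
run sync-file wmf-config/InitialiseSettings.php
run sync-file w/static/images/project-logos/path-to-logo.png
run sync-file langlist
run sync-file wmf-config/interwiki.php
```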


RESTBase

RESTBase is a service providing a RESTful API for the projects' wikis. To enable it to serve the new wiki as well, create a patch for ops/puppet adding its domain to RESTBase's configuration.


Parsoid

Parsoid has its own copy of SiteMatrix, which needs updating using tools/fetch-sitematrix.js in mediawiki/services/parsoid.git.

Then, once merged, it must be deployed; otherwise, switching to the visual editor will trigger "Failed to load resource: the server responded with a status of 404 ()" errors.


Search

You shouldn't have to do much (maybe adjust shard sizes preemptively if the wiki will be big); the index should be created automatically by addWiki.php, and all wikis now opt into Cirrus/Elastic by default. See Search/New#Adding_new_wikis for more information.


Swift

  • Public wikis:
    • Create the container for thumbnails. mwscript extensions/WikimediaMaintenance/filebackend/setZoneAccess.php databasename --backend=local-multiwrite
  • Private wikis:
    • Create the container for thumbnails. mwscript extensions/WikimediaMaintenance/filebackend/setZoneAccess.php databasename --backend=local-multiwrite --private

Labs DNS

Create a patch for ops/puppet to add the database to the correct list in modules/role/manifests/labs/dnsrecursor.pp


Wikidata

Only needed if the new wiki is supposed to be a Wikidata client.

In order to be able to link the new wiki from Wikidata and have interwiki links from Wikidata to that wiki appear on other wikis, extensions/Wikidata/extensions/Wikibase/lib/maintenance/populateSitesTable.php --force-protocol https needs to be run on at least all Wikidata clients (that includes Wikidata and testwikidata).

That script is known to be troublesome; you might want to ask Marius (hoo) or Katie (aude) to run it for you, or just create a ticket (that can be done any time after the wiki has been created).

Beware: the script sometimes fails with a duplicate key conflict. In that case, go to the wiki's master database and empty the sites and site_identifiers tables, then run the script again. It's probably also wise to back up these tables from Wikidata and at least one Wikipedia before running the script across the whole fleet. Breaking the sites and site_identifiers tables will break page rendering on a wiki!
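As a precaution, the backup mentioned above might look like this. A sketch that only prints the command; the database name is a hypothetical placeholder, and it should be run against the wiki's master:

```shell
# Sketch: back up the sites tables before running populateSitesTable.php.
# The database name is a placeholder.
db="wikidatawiki"
backup="${db}-sites-backup.sql"
echo "mysqldump --single-transaction ${db} sites site_identifiers > ${backup}"
```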

Also, make sure that the language code appears in the file client/config/WikibaseClient.default.php in the mediawiki/extensions/Wikibase repo.


Translatable project name

Add a message with the wiki name to extensions/WikimediaMessages/i18n/wikimediaprojectnames/en.json and qqq.json. The message keys should include the wiki database name (WIKI_DBNAME below) and official "human readable" name (WIKI_NAME below) as follows:

In en.json:

Key: project-localized-name-WIKI_DBNAME
Message: WIKI_NAME

For example, "project-localized-name-enwiki": "English Wikipedia",

In qqq.json:

Key: project-localized-name-WIKI_DBNAME
Message: {{ProjectNameDocumentation|url=WIKI_URL|name=WIKI_NAME|language=WIKI_LANG}}

For example, "project-localized-name-enwiki": "{{ProjectNameDocumentation|url=|name=English Wikipedia|language=en}}",
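Putting the two together for a hypothetical Spanish Wikinews (eswikinews), the additions would look roughly like this (the URL and wording are illustrative, not taken from the actual files):

```
en.json:   "project-localized-name-eswikinews": "Spanish Wikinews",
qqq.json:  "project-localized-name-eswikinews": "{{ProjectNameDocumentation|url=https://es.wikinews.org|name=Spanish Wikinews|language=es}}",
```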

Interwiki search result title

Add a message with the wiki name to extensions/WikimediaMessages/i18n/wikimediainterwikisearchresults/en.json and qqq.json in the list of search-interwiki-results messages.


If the wiki is not private, not a Wikimania wiki, and not a misc wiki (usability/outreach/login/vote/strategy/etc.), send a change proposal to analytics/refinery.git to add the wiki to static_data/pageview/whitelist/whitelist.tsv

After you finished


If there's something to import (as is often the case in new Language wikis), someone will do so, following the process described at MetaWikipedia:incubator:Incubator:Importing from Incubator (logged at MetaWikipedia:incubator:Incubator:Site creation log).


Then, when that's done, add the language code to the ContentTranslation registry in the mediawiki/services/cxserver repository, in the files registry.yaml and registry.wikimedia.yaml, in the source and target sections of each.

Once merged to master, ping the project to deploy the change. That requires syncing the repositories, i.e. updating mediawiki/services/cxserver/deploy to match mediawiki/services/cxserver. See [1] for an example commit.

Clean up interwiki links

After finishing, inform the community and create a Phabricator task for removing old interwiki links and migrating them to Wikidata (for example phab:T134991, for edits such as d:Special:Diff/336584053 and w:jam:Special:Diff/12330). You can do this yourself using pywikibot:

python scripts/ -lang:LANGCODE -clean -start:! -always

Tell wikistats labs project to add the wiki

Create a Phabricator ticket with the tag "Labs-project-wikistats" and just ask for the wiki to be added. (What needs to be done can be seen, for example, in

See also