Add a wiki
{{Warn|This guide may be incomplete. Some cases don't need Apache configs or a docroot. Look at existing similar wikis to see what is required.}}
This page documents the process for '''adding a new wiki'''. This includes new languages on sister projects, and wikis for committees, chapters, etc.
== Preparation ==
The following steps need to be completed before the wiki database may be created.
=== Notify ===
* Create a Phabricator task for the wiki creation (if one doesn't exist already).
** Ensure [[Phab:T18976|T18976]] is the parent task.
** It should be tagged with "Wikimedia-Site-requests".
* Create (if not already created) a subtask "Prepare and check storage layer for <new wiki>".
** It should be tagged with "DBA" and "Labs".
* Notify the Operations list. In particular, it needs to be made clear whether the wiki should be public or private. If public, ops will arrange for the wiki to be replicated to labs. If private, ops will need to add the wiki to <code>$private_wikis</code> in <code>operations/puppet.git:/manifests/realm.pp</code>.
=== DNS ===
First of all, ensure the relevant domain names exist for the new wiki. Make the following changes in a commit for the [https://phabricator.wikimedia.org/diffusion/ODNS/ operations/dns.git] repo and submit to Gerrit for review.
* If it is a language project, ensure the language code is present in <code>[https://phabricator.wikimedia.org/diffusion/ODNS/browse/master/templates/helpers/langs.tmpl /templates/helpers/langs.tmpl]</code>. This is shared between all sister projects, so if another sister project already has the same language, this has probably been done already.
* If it is a subdomain of the ".wikimedia.org" domain (chapter wiki or special wiki):
** Add it to <code>[https://phabricator.wikimedia.org/diffusion/ODNS/browse/master/templates/wikimedia.org /templates/wikimedia.org]</code>.
** Make sure to also add a mobile entry.
* Merge the change in Gerrit and run <code>authdns-update</code>.
* Query the DNS servers to make sure the change has been correctly deployed. See [[DNS#HOWTO]] for details.
* For new languages, the zones also need to be regenerated. Run on ns0, ns1 and ns2: <code>authdns-gen-zones -f /srv/authdns/git/templates /etc/gdnsd/zones && gdnsd checkconf && gdnsd reload-zones</code>
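For illustration, a chapter-wiki entry in the <code>wikimedia.org</code> zone template might look like the sketch below. This is hypothetical: the "fi" labels and the <code>geoip!text-addrs</code> resource name are assumptions modeled on how gdnsd GeoIP records are commonly written, so copy the exact shape from neighbouring entries in the real template rather than from here.

```
; Hypothetical sketch for a new Finnish chapter wiki (fi.wikimedia.org).
; DYNA delegates resolution to a gdnsd plugin; geoip!text-addrs is an
; assumed resource name, not copied from the real template.
fi      600 IN DYNA geoip!text-addrs
fi.m    600 IN DYNA geoip!text-addrs
```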
=== Apache configuration ===
Apache configuration is located at <code>[https://phabricator.wikimedia.org/diffusion/OPUP/browse/production/modules/mediawiki/files/apache/sites/ /modules/mediawiki/files/apache/sites/]</code> in the [https://phabricator.wikimedia.org/diffusion/OPUP/ operations/puppet.git] repo.
* Common configuration:
** For a new language project, this step is usually not needed, as the shared configuration already covers it.
** For a new chapter wiki, add a <code>ServerAlias</code> to the "wikimedia-chapter" virtual host in <code>/wikimedia.conf</code>.
** For Wikimania wikis, add a <code>ServerAlias</code> to <code>/wikimania.conf</code>.
* After the change is merged in Gerrit, deploy the configuration change and (if needed) gracefully restart the app servers. See [[Apache#Deploying config]] for details.
* If there are additional domains that should point to the same wiki, add them to <code>[https://phabricator.wikimedia.org/diffusion/OPUP/browse/production/modules/mediawiki/files/apache/sites/redirects/redirects.dat /redirects/redirects.dat]</code> and regenerate <code>[https://phabricator.wikimedia.org/diffusion/OPUP/browse/production/modules/mediawiki/files/apache/sites/redirects.conf /redirects.conf]</code>.
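As a sketch of the redirects.dat format, a new entry might look like the following. The specific hostname here is invented for illustration; verify the directive style against existing lines in the file.

```
# Hypothetical example: send an extra domain to the canonical chapter wiki.
# "funnel" redirects every path on the source host to one target URL,
# while "rewrite" preserves the request path. This particular entry
# is an invented illustration, not a real production rule.
funnel  suomi.wikimedia.org  //fi.wikimedia.org
```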
=== Language committee ===
Determine whether this is a new language project (for example, a Spanish Wikibooks) or something else (a chapter wiki, special wiki, or private wiki).
For a new language project, follow the steps below. For all other wikis, go directly to [[#Install]].
* Make sure the language has been approved by the [[m:Language committee|Language committee]] at [[m:Requests for new languages|Requests for new languages]] on Meta-Wiki. Usually, the Phabricator task will contain a link to the approval page.
* Make sure that the language code appears in [https://github.com/wikimedia/jquery.uls jquery.uls]'s langdb (file data/langdb.yaml).
* Go to [[#Install]].
== Install ==
=== IMPORTANT: For Private Wikis ===
* Private wiki databases must not be replicated to the Labs DB MySQL instances!
** Before creating a database for a private wiki, make sure to add the db name to the puppet global array <code>$private_wikis</code> in <code>operations/puppet.git:/manifests/realm.pp</code>.
** Deploy this config change with puppet and manually restart the <code>Prelabsdb-db</code> MySQL instance (Sanitarium) on the server that will house this wiki's db (most likely s3).
** If you need help with this, please ask a member of the Ops team. This is very important.
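The realm.pp change is just an append to an existing Puppet array. A minimal sketch, where <code>examplewiki</code> is a hypothetical database name and the neighbouring entry is only illustrative:

```puppet
# manifests/realm.pp (sketch): the array already exists in the file;
# only the new db name is appended. 'examplewiki' is a placeholder.
$private_wikis = [
    'arbcom_enwiki',   # existing entries remain untouched
    'examplewiki',     # <-- the new private wiki's database name
]
```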
=== MediaWiki configuration ===
Gather all relevant information about the new project. Each wiki has different requirements (project name, namespaces, extensions to be enabled, etc.).
Make the following changes on your own checkout of [https://phabricator.wikimedia.org/diffusion/OMWC/ operations/mediawiki-config.git] and submit to Gerrit for review:
* For a new language project, add the language code ([[:en:List of ISO 639-1 codes|ISO 639 code]]: usually provided in the task) to [https://phabricator.wikimedia.org/diffusion/OMWC/browse/master/langlist langlist]. This will be used at [[Special:SiteMatrix]], for interwiki linking, etc.
* Update the configuration files (located in <code>[https://phabricator.wikimedia.org/diffusion/OMWC/browse/master/wmf-config /wmf-config]</code>). Some settings have sufficient defaults, but the following are often wiki-specific in InitialiseSettings.php:
** [[mw:Manual:$wgServer|$wgServer]], [[mw:Manual:$wgCanonicalServer|$wgCanonicalServer]] - for language projects, the defaults work fine.
** [[mw:Manual:$wgLogo|$wgLogo]], [[mw:Manual:$wgSitename|$wgSitename]], [[mw:Manual:$wgExtraNamespaces|$wgExtraNamespaces]], [[mw:Manual:$wgLocaltimezone|$wgLocaltimezone]].
** <code>groupOverrides</code>.
** Ensure [[mw:Manual:$wgCategoryCollation|$wgCategoryCollation]] is set to the appropriate sorting for this wiki's language. If you are not sure, ask on the task.
** If an extension needs to be enabled, this is usually done by setting a wmg-variable to true in InitialiseSettings.php, along with any custom configuration there. Some extensions' configuration is located in dedicated files in the <code>/wmf-config</code> directory.
* For "*wikimedia" suffix databases, [https://github.com/wikimedia/operations-mediawiki-config/commit/8639c3500ac7327e762f77f1eeac1d3bbf035bec add the subdomain to the list in MWMultiVersion::setSiteInfoForWiki].
* Add an entry to [https://phabricator.wikimedia.org/diffusion/OMWC/browse/master/wikiversions.json wikiversions.json]. See [[Deployments]] ([[Deployments/One week|one week]]) and [[mw:Deployment Train|Deployment Train]].
* Add the wiki to the relevant dblists (in the <code>dblists</code> directory).
{| class="wikitable sortable"
|-
! Database list !! Purpose
|-
| s1.dblist<br />s2.dblist<br />s3.dblist<br />s4.dblist<br />s5.dblist<br />s6.dblist<br />s7.dblist
| '''Every wiki must be in one of these.'''
Database lists of wikis in each [[MySQL]] database cluster.
|-
| all.dblist || '''All wikis must be listed here.'''
|-
| closed.dblist || Any closed (no write access, full read access) wikis
|-
| deleted.dblist || Wiki databases which MediaWiki is no longer configured to access
|-
| small.dblist<br />medium.dblist<br />large.dblist || '''Every wiki must be in one of these.'''
Database lists of wikis arranged into their relevant size.
|-
| flaggedrevs.dblist || All wikis running the FlaggedRevs extension
|-
| securepollglobal.dblist || $wgSecurePollCreateWikiGroups wikis: Board Election wikis
|-
| visualeditor-default.dblist || All wikis where VisualEditor is enabled by default
|-
| commonsuploads.dblist || All wikis which should have local uploading soft-disabled. Uploads go to Commons instead.
|-
| fishbowl.dblist || All fishbowl (restricted write access, full read access) wikis
|-
| private.dblist || All private (read and write restricted) wikis
|-
| wikidata.dblist || All wikis running the Wikidata repo
|-
| wikidataclient.dblist || All wikis running the Wikidata client (most new language-project wikis should start off like this)
|-
| wikimania.dblist<br />wikimedia.dblist<br />wikibooks.dblist<br />wikinews.dblist<br />wikipedia.dblist<br />wikiquote.dblist<br />wikisource.dblist<br />wikiversity.dblist<br />wikivoyage.dblist<br />wiktionary.dblist<br />special.dblist
| '''Every wiki must be in one of these.'''
Sister project, Wikimania, chapter, or special.
NOTE: Some wikis may be in special and one other list.
|}
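The per-wiki overrides in InitialiseSettings.php are keyed by database name. A minimal sketch for a hypothetical <code>examplewiki</code>; the variable names shown are real, but all values and the wiki itself are placeholders:

```php
// wmf-config/InitialiseSettings.php (sketch): each setting maps
// database names to values; 'default' supplies the fallback.
'wgSitename' => [
    'default' => 'Wikipedia',        // illustrative default
    'examplewiki' => 'Example Wiki', // hypothetical new wiki
],
'wmgUseFlaggedRevs' => [
    'default' => false,
    'examplewiki' => true, // enabling an extension via a wmg-variable
],
```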
=== Database creation ===
Once the above is reviewed, merged, and pulled down on the deployment host, also pull it down on terbium (<code>scap pull</code>), but '''do not yet deploy''' to the main app servers until after the database is created.
Now it's time to actually create the database. The installation script also performs other tasks, such as notifying the "newprojects" mailing list. The installation script must be run from [[terbium]] ('''not''' the deployment host). If the wiki is going to be a Wikidata client, make sure it's present in the <code>wikidataclient.dblist</code> file and that the new version of that file has been pulled down on terbium '''before''' running the installation script, or things will break.
The syntax is as follows:
<code>mwscript extensions/WikimediaMaintenance/addWiki.php --wiki=aawiki <languagecode> <projectname> <databasename> <domain></code>
* For a new language project - for example, a Spanish Wikinews:<br><code>mwscript extensions/WikimediaMaintenance/addWiki.php --wiki=aawiki es wikinews eswikinews es.wikinews.org</code>
* For a new chapter wiki - for example, a Finnish chapter wiki:<br><code>mwscript extensions/WikimediaMaintenance/addWiki.php --wiki=aawiki fi wikimedia fiwikimedia fi.wikimedia.org</code>
* For non-standard special wikis (such as committees, or unique projects like meta or commons) - for example, strategy.wikimedia.org:<br><code>mwscript extensions/WikimediaMaintenance/addWiki.php --wiki=aawiki en wikimedia strategywiki strategy.wikimedia.org</code>
'''Note: In these examples <code>aawiki</code> is used because it is a wiki on s3''' - which means the new wiki will also be created on s3. This is usually the right place for new small wikis.
# Merge the config change in Gerrit, and [[Heterogeneous deployment#In_your_own_repo_via_gerrit|pull it onto tin]].
# Verify that the *.dblist files now contain the new wiki.
# Run <code>scap sync-dir dblists</code> to synchronize all database lists.
# Verify that wikiversions.json is sane.
# Run <code>scap sync-wikiversions</code> to synchronize the version number to use for this wiki.
# Run <code>scap sync-file wmf-config/InitialiseSettings.php</code>.
# Run <code>scap sync-dir w/static/images/project-logos/</code>.
# Run <code>scap sync-file langlist</code> (this must be done '''before''' deploying the updated interwiki cache - step 10).
# Unless it's a language project, add the project to the [[meta:Interwiki map]].
# Regenerate the interwiki cache and deploy it. (For all new wikis, not just new language projects.)
#* <code>mwscript extensions/WikimediaMaintenance/dumpInterwiki.php --protocolrelative > wmf-config/interwiki.php</code>
#* Commit the change and upload it to Gerrit for review.
#* Review and merge.
#* <code>sync-file wmf-config/interwiki.php</code>
#* Edit [[meta:Interwiki map]] and update the date of the last update.
=== RESTBase ===
[[mw:RESTBase|RESTBase]] is a service providing a RESTful API for the projects' wikis. To enable it to serve the new wiki as well, create a patch for operations/puppet adding its domain to RESTBase's [[phab:diffusion/OPUP/browse/production/modules/restbase/templates/config.yaml.erb;ca4eb23316acb56d46fb6921b62b3e1300235374$200|configuration (restbase/templates/config.yaml.erb)]].
=== Parsoid ===
Parsoid has its own copy of SiteMatrix, which needs updating using <code>tools/fetch-sitematrix.js</code> in mediawiki/services/parsoid.git.
Then, once merged, it must be deployed; if not, loading VisualEditor will fail with errors such as "Failed to load resource: the server responded with a status of 404".
=== Search ===
You shouldn't have to do much (though you may want to adjust shard sizes preemptively if the wiki will be big): the index should be created automatically by addWiki.php, and all wikis now opt into Cirrus/Elastic by default. See [[Search/New#Adding new wikis]] for more information.
=== Swift ===
* Public wikis:
** Create the container for thumbnails: <code>mwscript extensions/WikimediaMaintenance/filebackend/setZoneAccess.php databasename --backend=local-multiwrite</code>
* Private wikis:
** Create the container for thumbnails: <code>mwscript extensions/WikimediaMaintenance/filebackend/setZoneAccess.php databasename --backend=local-multiwrite --private</code>
=== Labs ===
* Create a patch for operations/puppet to add the database to the correct list in <code>modules/role/manifests/labs/dnsrecursor.pp</code>.
* Once it is replicating to the labsdb servers:
** Run <code>maintain-views --databases $wiki --debug</code> on each.
** Insert a new row in <code>meta_p.wiki</code> for the new wiki.
== Post-install ==
=== Wikidata ===
<small>''Only needed if the new wiki is supposed to be a Wikidata client.''</small>
In order to be able to link the new wiki from Wikidata, and to allow interwiki links from Wikidata to the new wiki, run <code>extensions/Wikidata/extensions/Wikibase/lib/maintenance/populateSitesTable.php --force-protocol https</code> on at least all Wikidata clients (including wikidatawiki itself and testwikidata).
'''That script is known to be troublesome''': you might want to ask {{ircnick|hoo|Marius}} or {{ircnick|aude|Katie}} to run it for you, or just create a ticket (that may be done any time after the wiki was created).
<span style="color: red; font-face: bold;">Beware</span>: The script sometimes fails with a duplicate key conflict. In that case, go to the wiki's master database and empty the sites and site_identifiers tables, then run the script again. It's probably also wise to back up these tables from Wikidata and at least one Wikipedia before running the script across the whole fleet. '''Breaking the sites and site_identifiers tables will break page rendering on many wikis!'''
Make sure that the language code appears in the file <code>client/config/WikibaseClient.default.php</code> in the mediawiki/extensions/Wikibase repo. (Example: https://gerrit.wikimedia.org/r/288097.)
=== WikimediaMessages ===
Add a message with the wiki name to <code>extensions/WikimediaMessages/i18n/wikimediainterwikisearchresults/en.json</code> and <code>qqq.json</code> in the list of search-interwiki-results messages.
=== Analytics ===
If the wiki is not private, not a Wikimania conference wiki, and not a special wiki (usability/outreach/login/vote/strategy/etc.), send a change proposal to <code>analytics/refinery.git</code> that adds the wiki to <code>static_data/pageview/whitelist/whitelist.tsv</code>.
=== Incubator ===
If there's something to import (as is often the case for new language projects), someone will do so. The process is described at [[incubator:Project:Importing_from_Incubator|Incubator:Importing from Incubator]] (logged at [[incubator:Project:Site_creation_log|Incubator:Site creation log]]).
=== cxserver ===
For new Wikipedia projects only: add the language code to the ContentTranslation registry in the mediawiki/services/cxserver repository - files <code>registry.yaml</code> (included by config.dev.yaml) and <code>registry.wikimedia.yaml</code> (included by config.prod.yaml), in both the source and target sections of each.
Once merged to master, ping the project to deploy the change. That requires syncing the repositories, i.e. updating mediawiki/services/cxserver/deploy to match mediawiki/services/cxserver. See https://gerrit.wikimedia.org/r/#/c/303763/ for an example commit.
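The registry change follows this shape (a hypothetical sketch: "fi" stands in for the new language code, and the surrounding layout is an assumption, so check it against the real registry files):

```yaml
# registry.wikimedia.yaml (sketch): add the new language code to both
# the source and target lists. 'fi' is a hypothetical example here.
source:
  - en
  - fi   # new language code
target:
  - en
  - fi   # new language code
```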
=== Clean up interwiki links ===
After any import from Incubator is completed, inform the community and create a Phabricator task for removing old interwiki links and migrating them to Wikidata (for example [[phab:T134991|T134991]], for edits such as [[:d:Special:Diff/336584053]] and [[w:jam:Special:Diff/12330|jam:Special:Diff/12330]]). You can do it yourself using [https://github.com/wikimedia/pywikibot-core/tree/master/scripts/interwikidata.py interwikidata.py] in pywikibot.
<pre>
python pwb.py scripts/interwikidata.py -lang:LANGCODE -clean -start:! -always
</pre>
=== Tell wikistats labs project to add the wiki ===
Create a Phabricator task (a subtask of the main task to create the wiki) with the tag "Labs-project-wikistats" and just ask for the wiki to be added. (What needs to be done can be seen, e.g., in https://phabricator.wikimedia.org/T140970#2531977.)
== See also ==
Revision as of 00:11, 3 December 2016
![]() | This guide may be incomplete. Some cases don't need Apache configs or a docroot. Look at the existing similar wikis to see what is required. |
This page documents the proces for adding a new wiki. This includes new languages on sister projects, and wikis for committees, chapters etc.
Preparation
The following steps need to be completed before the wiki database may be created.
Notify
- Create a Phabricator task for the wiki creation (if one doesn't exist already).
- Ensure T18976 is the parent task.
- It should be tagged with "Wikimedia-Site-requests".
- Create (if not already) a sub task "Prepare and check storage layer for <new wiki>".
- It should be tagged with "DBA" and "Labs".
- Notify the Operations list. In particular, it needs to be made clear whether the wiki should be public, or private. If public, ops will arrange for the wiki to be replicated to labs. If private, ops will need to add the wiki to
$private_wikis
inoperations/puppet.git:/manifests/realm.pp
.
DNS
First of all, ensure the relevant domain names exist for the new wiki. Make the following changes in a commit for the operations/dns.git repo and submit to Gerrit for review.
- If it is a language project, ensure the language code is present in
/templates/helpers/langs.tmpl
. This is shared between all sister projects. If another sister project has the same language already, then this has probably been done already. - If it is a subdomain of ".wikimedia.org" domain (chapter wiki or special wiki)
- Add it to
/templates/wikimedia.org
. - Make sure to also add a mobile entry.
- Add it to
- Merge the change in Gerrit and run
authdns-update
. - Query the DNS servers to make sure it has been correctly deployed. See DNS#HOWTO for details.
- For new languages, there is also a need to regenerate zones. Run on ns0, ns1 and ns2:
authdns-gen-zones -f /srv/authdns/git/templates /etc/gdnsd/zones && gdnsd checkconf && gdnsd reload-zones
Apache configuration
Apache configuration is located at /modules/mediawiki/files/apache/sites/
in the operations/puppet.git repo.
- Common configuration:
- For a new language project, this step is usually not needed as shared configuration already covers it.
- For a new chapter wiki, add a
ServerAlias
to the "wikimedia-chapter" virtual host in/wikimedia.conf
. - For Wikimania wikis, add a
ServerAlias
to/wikimania.conf
.
- After the change is merged in Gerrit, deploy the configuration change and (if needed) gracefully restart app servers. See Apache# Deploying config for details.
- If there are additional domains that should point to the same wiki, add it to
/redirects/redirects.dat
and regenerate/redirects.conf
.
Language committee
Determine if this is a new language project or chapter wiki (say something like Spanish Wikibooks), or something else (special wiki or private wiki).
For a new language project, follow the steps below. For all other wikis, go directly to #Install.
- Make sure the language has been approved by the Language committee at Requests for new languages on Meta-Wiki. Usually, the Phabricator task will contain a link to the approval page.
- Make sure that the language code appears in jquery.uls's langdb (file data/langdb.yaml).
- Go to #Install.
Install
IMPORTANT: For Private Wikis
- Private wiki databases must not be replicated to the Labs DB MySQL instances!
- Before creating a database for a private wiki, make sure to add the db name to the puppet global array
$private_wikis
inoperations/puppet.git:/manifests/realm.pp
. - Deploy this config change with puppet and manually restart the
Prelabsdb-db
MySQL instance (Sanitarium) on the server that will house this wiki's db (most likely s3). - If you need help with this, please ask a member of the Ops team for help. This is very important.
- Before creating a database for a private wiki, make sure to add the db name to the puppet global array
MediaWiki configuration
Gather all relevant information about the new project. Each wiki has different requirements (project name, namespaces, extensions to be enabled etc).
Make the following changes on your own checkout of operations/mediawiki-config.git and submit to Gerrit for review:
- For a new language project, add the language code (ISO 639 code: usually provided in the task) to langlist. This will be used at Special:SiteMatrix and for interwiki linking etc.
- Update the configuration files (located in
/wmf-config
). Some may have sufficient defaults but the following are often wiki-specific in InitialiseSettings.php:- $wgServer, $wgCanonicalServer - For language projects, the defaults work fine.
- $wgLogo, $wgSitename, $wgExtraNamespaces, $wgLocaltimezone.
groupOverrides
.- Ensure $wgCategoryCollation is set to the appropiate sorting for this wiki's language. If you are not sure, ask on the task.
- If an extension needs to be enabled, this is usually done by setting a wmg-variable to true in InitialiseSettings along with custom configuration there. Some extensions' configuration are located in a dedicated files in the
/wmf-config
directory.
- For "*wikimedia" suffix databases, add the subdomain to the list in MWMultiVersion::setSiteInfoForWiki
- Add an entry to wikiversions.json. See Deployments (one week) and Deployment Train.
- Add the wiki to the relevant dblists (in the
dblists
directory).
Database list | Purpose |
---|---|
s1.dblist s2.dblist s3.dblist s4.dblist s5.dblist s6.dblist s7.dblist |
Every wiki must be in one of these.
Database lists of wikis in each MySQL database cluster |
all.dblist | All wikis must be listed here. |
closed.dblist | Any closed (no write access, full read access) wikis |
deleted.dblist | Wiki databases which MediaWiki is no longer configured to access |
small.dblist medium.dblist large.dblist |
Every wiki must be in one of these.
Database lists of wikis arranged into their relevant size. |
flaggedrevs.dblist | All wikis running the FlaggedRevs extension |
securepollglobal.dblist | $wgSecurePollCreateWikiGroups wikis: Board Election wikis |
visualeditor-default.dblist | All wikis where VisualEditor is enabled by default |
commonsuploads.dblist | All wikis which should have local uploading soft-disabled. Uploads go to Commons instead. |
fishbowl.dblist | All fishbowl (restricted write access, full read access) wikis |
private.dblist | All private (read and write restricted) wikis |
wikidata.dblist | All wikis running the Wikidata repo |
wikidataclient.dblist | All wikis running the Wikidata client (most new language-project wikis should start off like this) |
wikimania.dblist
wikimedia.dblist wikibooks.dblist wikinews.dblist wikipedia.dblist wikiquote.dblist wikisource.dblist wikiversity.dblist wikivoyage.dblist wiktionary.dblist special.dblist |
Every wiki must be in one of these.
Sister project, Wikimania, chapter, or special. NOTE: Some wikis maybe in special and one other list. |
Database creation
Once the above is reviewed, merged and pull down on the deployment host also pull it down on terbium (scap pull), but do not yet deploy to main app servers until after the database is created.
Now it's time to actually create the database. The installation script also performs other tasks, such as notifying the "newprojects" mailing list. The installation script must be run from terbium (not the deployment host). If the wiki is going to be a Wikidata client, make sure it's present in the wikidataclient.dblist
file and that the new version of that file has been pulled down on terbium **before** running the installation script, or things will break.
The syntax is as follows:
mwscript extensions/WikimediaMaintenance/addWiki.php --wiki=aawiki <languagecode> <projectname> <databasename> <domain>
- For a new language projects - for example a Spanish Wikinews:
mwscript extensions/WikimediaMaintenance/addWiki.php --wiki=aawiki es wikinews eswikinews es.wikinews.org
- For a new chapters wikis - for example a Finnish chapter wiki:
mwscript extensions/WikimediaMaintenance/addWiki.php --wiki=aawiki fi wikimedia fiwikimedia fi.wikimedia.org
- For non-standard special wikis (such as committees, or unique projects like meta or commons) - for example strategy.wikimedia.org:
mwscript extensions/WikimediaMaintenance/addWiki.php --wiki=aawiki en wikimedia strategywiki strategy.wikimedia.org
Note: In these examples 'aawiki'
is used because it is a wiki on s3 - which means the new wiki will also be created on s3. This is usually the right place for new small wikis.
- Merge the config change in Gerrit, and pull it onto tin.
- Verify that the *.dblist files now contain the new wiki.
- Run
scap sync-dir dblists
to synchronize all database lists - Verify wikiversions.json is sane.
- Run
scap sync-wikiversions
to synchronize the version number to use for this wiki - Run
scap sync-file wmf-config/InitialiseSettings.php
- Run
scap sync-dir w/static/images/project-logos/
- Run
scap sync-file langlist
(this must be done before deploying the updated interwiki cache - step 10) - Unless it's a language project, add the project to the meta:Interwiki map.
- Regenerate the interwiki cache and deploy it. (For all new wikis, not just new language projects.)
mwscript extensions/WikimediaMaintenance/dumpInterwiki.php --protocolrelative > wmf-config/interwiki.php
- Commit change, upload to Gerrit for review
- Review and merge.
sync-file wmf-config/interwiki.php
- Edit meta:Interwiki map and update when the last update was.
RESTBase
RESTBase is a service providing a RESTful API for the projects' wikis. To enable it to serve the new wiki as well, create a patch for operations/puppet adding its domain to RESTBase's configuration (restbase/templates/config.yaml.erb).
Parsoid
Parsoid has its own copy of SiteMatrix, which needs updating using tools/fetch-sitematrix.js
in mediawiki/services/parsoid.git.
Then, once merged, it must be deployed; if not, loading VisualEditor will fail with errors such as "Failed to load resource: the server responded with a status of 404".
Search
You shouldn't have to do much (you may want to adjust shard sizes preemptively if the wiki is expected to be large); the index should be created automatically by addWiki.php, and all wikis now opt into Cirrus/Elastic by default. See Search/New#Adding new wikis for more information.
Swift
- Public wikis: create the containers for thumbnails:
mwscript extensions/WikimediaMaintenance/filebackend/setZoneAccess.php databasename --backend=local-multiwrite
- Private wikis: create the containers for thumbnails:
mwscript extensions/WikimediaMaintenance/filebackend/setZoneAccess.php databasename --backend=local-multiwrite --private
Labs
- Create a patch for operations/puppet to add the database to the correct list in modules/role/manifests/labs/dnsrecursor.pp
- Once it is replicating to the labsdb servers:
- Run maintain-views --databases $wiki --debug on each labsdb server.
- Insert a new row into meta_p.wiki for the new wiki.
Post-install
Wikidata
Only needed if the new wiki is supposed to be a Wikidata client.
In order to be able to link the new wiki from Wikidata, and to allow interwiki links from Wikidata to the new wiki, run extensions/Wikidata/extensions/Wikibase/lib/maintenance/populateSitesTable.php --force-protocol https
on at least all Wikidata clients (including wikidatawiki itself and testwikidata).
That script is known to be troublesome; you might want to ask Marius (hoo) or Katie (aude) to run it for you, or just create a ticket (this may be done any time after the wiki has been created).
Beware: the script sometimes fails with a duplicate key conflict. In that case, go to the wiki's master database and empty the sites and site_identifiers tables, then run the script again. It is probably also wise to back up these tables from Wikidata and at least one Wikipedia before running the script across the whole fleet. Breaking the sites and site_identifiers tables will break page rendering on many wikis!
Make sure that the language code appears in the file client/config/WikibaseClient.default.php in the mediawiki/extensions/Wikibase repo. (Example: https://gerrit.wikimedia.org/r/288097.)
WikimediaMessages
Translatable project name
Add a message with the wiki name to extensions/WikimediaMessages/i18n/wikimediaprojectnames/en.json and qqq.json. The message keys should include the wiki database name (WIKI_DBNAME below) and the official "human readable" name (WIKI_NAME below) as follows:
In en.json:
Key: project-localized-name-WIKI_DBNAME
Message: WIKI_NAME
For example, "project-localized-name-enwiki": "English Wikipedia",
In qqq.json:
Key: project-localized-name-WIKI_DBNAME
Message: {{ProjectNameDocumentation|url=WIKI_URL|name=WIKI_NAME|language=WIKI_LANG}}
For example, "project-localized-name-enwiki": "{{ProjectNameDocumentation|url=https://en.wikipedia.org|name=English Wikipedia|language=en}}",
Interwiki search result title
Add a message with the wiki name to extensions/WikimediaMessages/i18n/wikimediainterwikisearchresults/en.json and qqq.json in the list of search-interwiki-results messages.
Analytics
If the wiki is not private, not a Wikimania conference wiki, and not a special wiki like usability/outreach/login/vote/strategy/etc., send a change proposal to analytics/refinery.git that adds the wiki to static_data/pageview/whitelist/whitelist.tsv.
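The exact column layout of whitelist.tsv should be copied from the existing entries in the file; as a purely hypothetical illustration (both the column names and the wiki value here are assumptions), a new tab-separated line might look like:

```tsv
project	fi.wikimedia
```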
Incubator
If there's something to import (as is often the case in new language projects), someone will do so. Their process is described at Incubator:Importing from Incubator (logged at Incubator:Site creation log).
cxserver
For new Wikipedia projects only: add the language code to the ContentTranslation registry in the mediawiki/services/cxserver repository, in the files registry.yaml (included by config.dev.yaml) and registry.wikimedia.yaml (included by config.prod.yaml), in both the source and target sections of each.
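A hypothetical sketch of that registry change (the real files nest the language lists under more structure; mirror the surrounding entries and keep the lists sorted):

```yaml
# Hypothetical excerpt of registry.yaml / registry.wikimedia.yaml.
source:
  - en
  - fi   # new language code
target:
  - en
  - fi   # new language code
```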
Once the change is merged to master, ping the project owners to deploy it. That requires syncing the repositories, i.e. updating mediawiki/services/cxserver/deploy to match mediawiki/services/cxserver. See https://gerrit.wikimedia.org/r/#/c/303763/ for an example commit.
Clean up interwiki links
After any import from Incubator is completed, inform the community and create a Phabricator task for removing old interwiki links and migrating them to Wikidata (for example T134991, for edits such as d:Special:Diff/336584053 and jam:Special:Diff/12330). You can do this yourself using interwikidata.py in Pywikibot:
python pwb.py scripts/interwikidata.py -lang:LANGCODE -clean -start:! -always
Tell wikistats labs project to add the wiki
Create a Phabricator task (a subtask of the main task to create the wiki) with the tag "Labs-project-wikistats" and ask for the wiki to be added. (What needs to be done can be seen, for example, in https://phabricator.wikimedia.org/T140970#2531977.)