Add a wiki
=== Notify ===
* Create a Phabricator task for the wiki creation (if one doesn't exist already).
** Ensure [[Phab:T18976|T18976]] is the parent task.
** It should be tagged with <code>[[phab:project/view/2942/|#Wiki-Setup (Create)]]</code>.
* Create (if not already) a sub-task "Prepare and check storage layer for <new wiki>".  
** It should be tagged with <code>#DBA</code>, <code>#wmcs-kanban</code> and <code>#Data-Services</code>.
* Notify the Operations list. In particular, it needs to be made clear whether the wiki should be public, or private. If public, ops will arrange for the wiki to be replicated to Cloud Services. If private, ops will need to add the wiki to <code>$private_wikis</code> in <code>operations/puppet.git:/manifests/realm.pp</code>.
* IMPORTANT: If the wiki is a regular public wiki to appear on Cloud Services - you can continue. '''If the wiki is private and should not be replicated to Cloud Services DO NOT CONTINUE UNTIL YOU HAVE THE OK FROM OPS/DBAs''' to check no private data is leaked. There are mechanisms in place to prevent that by default, but those should be manually checked.


=== DNS ===
First of all, ensure the relevant domain names exist for the new wiki. Make the following changes in a commit for the [https://phabricator.wikimedia.org/diffusion/ODNS/ operations/dns.git] repo and submit to Gerrit for review.
*If it is a language project, ensure the language code is present in <code>[https://phabricator.wikimedia.org/diffusion/ODNS/browse/master/templates/helpers/langlist.tmpl /templates/helpers/langlist.tmpl]</code>. This is shared between all sister projects. If another sister project has the same language already, then this has probably been done already.
*If it is a subdomain of ".wikimedia.org" domain (chapter wiki or special wiki)
**Add it to <code>[https://phabricator.wikimedia.org/diffusion/ODNS/browse/master/templates/wikimedia.org /templates/wikimedia.org]</code>.
*Merge the change in Gerrit and run <code>authdns-update</code>.
*Query the DNS servers to make sure it has been correctly deployed. See [[DNS#HOWTO]] for details.
*For new languages, there is also a need to regenerate zones. Run on ns0, ns1 and ns2: <code>authdns-gen-zones -f /srv/authdns/git/templates /etc/gdnsd/zones && gdnsdctl reload-zones</code>
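Before merging, it's worth sanity-checking that the language code actually made it into the template. A minimal local sketch of that check (file contents here are invented for illustration; the real template lives in operations/dns.git, and "lfn" stands in for the new code):

```shell
# Stand-in for templates/helpers/langlist.tmpl: one language code per line
# (contents are illustrative, not the real template).
printf 'es\nfi\nlfn\n' > /tmp/langlist.tmpl

# Exact-match grep for the new code: -x avoids matching substrings
# of longer codes.
grep -qx 'lfn' /tmp/langlist.tmpl && echo "lfn present"
```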


=== Apache configuration ===
Apache configuration is located in the [https://phabricator.wikimedia.org/diffusion/OPUP/ operations/puppet.git] repo.
*Common configuration:
**For a new language project, this step is usually not needed as shared configuration already covers it.
**For a new chapter wiki, add it to {{Puppet manifest|::mediawiki:web::prod_sites}} under <code>wikimedia-chapter</code>
**For Wikimania wikis, add a <code>ServerAlias</code> to <code>/wikimania.conf</code>.
*After the change is merged in Gerrit, deploy the configuration change and (if needed) gracefully restart app servers. See [[Application_servers/Runbook#Deploying_config]] for details.
* If there are additional domains that should point to the same wiki, add it to <code>[[phab:diffusion/OPUP/browse/production/modules/mediawiki/files/apache/sites/redirects/redirects.dat|redirects/redirects.dat]]</code>.


=== Language configuration ===
Check the following for every wiki, even if another wiki in this language already exists:


# Make sure that the wiki's content language code appears in the [https://github.com/wikimedia/language-data language-data repository] (file data/langdb.yaml; if you're adding it, make sure to follow the instructions in README).
# Make sure that the wiki's content language code is fully configured to be supported by core MediaWiki. The code must appear in the following files:
#*<code>languages/data/Names.php</code> (precisely this code, and not a variant)
#*The file <code>languages/messages/Messages''Abc''.php</code> must exist (replace ''Abc'' with the language code).
# If the language is written from right to left, make sure it is configured as such in the following files:
#*In the core MediaWiki repository, the file <code>languages/messages/Messages''Abc''.php</code> must explicitly say <code>$rtl = true;</code> near the top.
#*In the [[gerrit:#/admin/projects/mediawiki/extensions/MobileFrontend|MobileFrontend]] extension repository: src/mobile.languages.structured/rtlLanguages.js
#*In the [https://github.com/wikimedia/apps-android-wikipedia Android app] repository: app/src/main/java/org/wikipedia/util/L10nUtil.java (the RTL_LANGS list)
#*In the [https://github.com/wikimedia/wikipedia-ios iOS app] repository: Wikipedia/Code/MWLanguageInfo.m (the rtlLanguages list)
#Check whether the language code appears in any extra configuration for Wikibase and Wikidata. If it appears in any of the following files, it should be removed (verify with Wikidata developers):
#*In the Wikibase repository: [[git:mediawiki/extensions/Wikibase/+/master/lib/includes/WikibaseContentLanguages.php|lib/includes/WikibaseContentLanguages.php]]
#*In the WikibaseLexeme repository:
#**[[git:mediawiki/extensions/WikibaseLexeme/+/master/WikibaseLexeme.mediawiki-services.php|WikibaseLexeme.mediawiki-services.php]]
#**If the key <code>wikibase-lexeme-language-name-''ABC''</code> exists in [[git:mediawiki/extensions/WikibaseLexeme/+/master/i18n/en.json|i18n/en.json]] and [[git:mediawiki/extensions/WikibaseLexeme/+/master/i18n/qqq.json|i18n/qqq.json]], remove it from these two files.
#*In the Wikimedia's mediawiki-config repository: the <code>wmgExtraLanguageNames</code> variable in [[git:operations/mediawiki-config/+/master/wmf-config/InitialiseSettings.php|wmf-config/InitialiseSettings.php]]
#Check that usernames in the writing system of this language are possible and aren't filtered out. If they are blocked, add support for this writing system to AntiSpoof. [[gerrit:c/mediawiki/extensions/AntiSpoof/+/575783/|Example patch]].
 
Determine if this is a new language project (like Spanish Wikibooks) or a chapter wiki (like Wikimedia Denmark), or something else (special wiki or private wiki).
 
For a new language project, make sure the language has been approved by the [[m:Language committee|Language committee]] at [[m:Requests for new languages|Requests for new languages]] on Meta-Wiki. Usually, the Phabricator task will contain a link to the approval page. For all other wikis, go directly to [[#Install]].
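The RTL check in core lends itself to a mechanical test. A sketch under invented paths and contents (''Arq'' is a hypothetical language code; the real file sits in a MediaWiki core checkout):

```shell
# Stand-in for languages/messages/MessagesArq.php (contents invented).
mkdir -p /tmp/languages/messages
printf '<?php\n$rtl = true;\n' > /tmp/languages/messages/MessagesArq.php

# The check: the Messages file exists and sets $rtl = true near the top.
grep -q '^\$rtl = true;' /tmp/languages/messages/MessagesArq.php \
  && echo "RTL configured"
```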


== Install ==


=== IMPORTANT: For Private Wikis ===
* Private wiki databases '''must not''' be replicated to the Cloud Services DB MySQL instances!
** Before creating a database for a private wiki, make sure to add the db name to the puppet global array <code>$private_wikis</code> in <code>operations/puppet.git:/manifests/realm.pp</code>.
** Deploy this config change with puppet and manually restart the <code>Prelabsdb-db</code> MySQL instance (Sanitarium) on the server that will house this wiki's db (most likely <code>s5</code>).
** If you need help with this, please ask a member of the Ops team for help. This is very important.


Gather all relevant information about the new project. Each wiki has different requirements (project name, namespaces, extensions to be enabled etc).


Make the following changes on your own checkout of [[phab:diffusion/OMWC/|operations/mediawiki-config.git]] and submit to Gerrit for review:
* For a new language project, add the language code ([[:en:List of ISO 639-1 codes|ISO 639 code]]: usually provided in the task) to [[phab:diffusion/OMWC/browse/master/langlist|langlist]]. This will be used at [[Special:SiteMatrix]] and for interwiki linking etc.
* Create the YAML definition of the wiki in <code>wmf-config/config</code>, including making it inherit from relevant settings profiles.


* Add the wiki to the relevant dblists (in the <code>dblists</code> directory).
{| class="wikitable sortable"
!Wiki grouping
!Group options
!Purpose
|-
!All
<u>All wikis must be tagged against 'all'</u>
|all.dblist
|The primary listing of what wikis exist, used as the basis of all tools.
|-
!DB cluster
<u>Every wiki must be in exactly one of these</u>
|s1.dblist<br />s2.dblist<br />s3.dblist<br />s4.dblist<br />s5.dblist<br />s6.dblist<br />s7.dblist<br />s8.dblist<br />s10.dblist<br />s11.dblist
|Database lists of wikis in each [[MySQL]] database cluster.<br />In most cases, wikis should just be added to <code>s5</code>.<br /><code>s10</code> was previously known as <code>wikitech</code>.
|-
!Wiki size
<u>Every wiki must be in exactly one of these</u>
|small.dblist<br />medium.dblist<br />large.dblist
|Database lists of wikis arranged into their relevant size.
|-
!Wiki family
<u>Every wiki must be in exactly one of these (or one plus in special)</u>
|wikimania.dblist<br />wikimedia.dblist<br />wikibooks.dblist<br />wikinews.dblist<br />wikipedia.dblist<br />wikiquote.dblist<br />wikisource.dblist<br />wikiversity.dblist<br />wikivoyage.dblist<br />wiktionary.dblist<br />special.dblist
|Sister project, Wikimania, chapter, or special.
NOTE: Some wikis may be in special and one other list.
|-
!Closed
|closed.dblist
|Any closed (no write access, full read access) wikis
|-
!Deleted
|deleted.dblist
|Wiki databases which MediaWiki is no longer configured to access
|-
! rowspan="2" |Wiki privacy
|fishbowl.dblist
|All fishbowl (restricted write access, full read access) wikis
|-
|private.dblist
|All private (read and write restricted) wikis
|-
! rowspan="4" |Extension configurations
|flaggedrevs.dblist
|All wikis running the FlaggedRevs extension
|-
|securepollglobal.dblist
|$wgSecurePollCreateWikiGroups wikis: Board Election wikis
|-
|visualeditor-nondefault.dblist
|All wikis where VisualEditor is not enabled by default
|-
|commonsuploads.dblist
|All wikis which should have local uploading soft-disabled. Uploads go to Commons instead.
|-
! rowspan="2" |Wikidata-related wiki groups
|wikidata.dblist
|All wikis running the Wikidata repo
|-
|wikidataclient.dblist
|All wikis running the Wikidata client (most new language-project wikis should start off like this)
|}
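The "exactly one DB cluster list" rule can be checked mechanically. A sketch with an invented dblists directory (the real one lives in operations/mediawiki-config.git; "lfnwiki" stands in for the new wiki):

```shell
# Stand-in dblists: one database name per line, as in mediawiki-config.
mkdir -p /tmp/dblists
printf 'enwiki\n' > /tmp/dblists/s1.dblist
printf 'aawiki\n' > /tmp/dblists/s3.dblist
printf 'muswiki\nlfnwiki\n' > /tmp/dblists/s5.dblist

# Every wiki must appear in exactly one sN.dblist:
# -l lists matching files, -x requires a whole-line match.
count=$(grep -lx 'lfnwiki' /tmp/dblists/s*.dblist | wc -l)
[ "$count" -eq 1 ] && echo "lfnwiki in exactly one cluster list"
```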
* Once your YAML file is created, add a dummy entry to [[phab:diffusion/OMWC/browse/master/wikiversions.json|<code>wikiversions.json</code>]] mapping the database name to a version string, then run <code>composer test</code> which will update the automatic "dblist" files and spot any issues. Alternatively, you can run <code>composer buildDBLists</code> which only builds the dblists.
* If the new wiki is created at a different shard than s3 (which is ''likely'', as new wikis are created at s5 as of August 2020), you ''need'' to add the wiki to <code>wmf-config/db-eqiad.php</code> and <code>wmf-config/db-codfw.php</code>. Check with DBAs if you're unsure how to properly edit these files; this is very important.
* Update the configuration files (located in <code>[[phab:diffusion/OMWC/browse/master/wmf-config|/wmf-config]]</code>). Some may have sufficient defaults but the following are often wiki-specific in InitialiseSettings.php:
** [[mw:Manual:$wgServer|$wgServer]], [[mw:Manual:$wgCanonicalServer|$wgCanonicalServer]] - For language projects, the defaults work fine.
** [[mw:Manual:$wgLogos|$wgLogos]] (see below), [[mw:Manual:$wgSitename|$wgSitename]], [[mw:Manual:$wgExtraNamespaces|$wgExtraNamespaces]], [[mw:Manual:$wgLocaltimezone|$wgLocaltimezone]].
** <code>groupOverrides</code>.
** Ensure [[mw:Manual:$wgCategoryCollation|$wgCategoryCollation]] is set to the appropriate sorting for this wiki's language. If you are not sure, ask on the task.
** If an extension needs to be enabled, this is usually done by setting a wmg-variable to true in InitialiseSettings.php along with custom configuration there. Some extensions' configuration is located in dedicated files in the <code>/wmf-config</code> directory.
* If you added a new language code to the langlist (see above), you probably need to add it to the <code>InterwikiSortingOrder.php</code> file too.
* For "*wikimedia" suffix databases, [https://github.com/wikimedia/operations-mediawiki-config/commit/8639c3500ac7327e762f77f1eeac1d3bbf035bec add the subdomain to the list in MWMultiVersion::setSiteInfoForWiki]
* Adjust the entry in <code>wikiversions.json</code> to the current production version. See [[Deployments]] ([[Deployments/One week|one week]]) and [[Deployment Train]].
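The wikiversions.json entry is a plain JSON map from database name to version string. A local sketch of validating it (contents and version string are invented for illustration):

```shell
# Stand-in for wikiversions.json (entries are illustrative).
cat > /tmp/wikiversions.json <<'EOF'
{
  "aawiki": "php-1.36.0-wmf.10",
  "lfnwiki": "php-1.36.0-wmf.10"
}
EOF

# Validate the JSON and confirm the new wiki is mapped to a version.
python3 -c 'import json; d = json.load(open("/tmp/wikiversions.json")); assert "lfnwiki" in d' \
  && echo "lfnwiki mapped"
```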


==== Logo ====
 
Projects, in particular Wikipedias, usually have a localized logo.


Check the following:
* The project has a full localized logo in Commons (image + wordmark). For Wikipedia, the filename is usually File:Wikipedia-logo-v2-LANGUAGE.svg, for example [[:File:Wikipedia-logo-v2-fr.svg]] for French. If there is no localized logo, ask the project's editors to provide the localized text and get the logo created and uploaded to Commons.
* The project has versions of the logo in all the necessary sizes and resolutions uploaded. If not, create, upload, and configure them according to the instructions at [[Wikimedia site requests#Change the logo of a Wikimedia wiki]].
*The project has a localized wordmark without the image. This is shown at the bottom of the mobile site and in some other places. This is not necessary if the name of the project is completely identical to the default English name, but necessary if it's not identical. For Wikipedia, the filename is usually File:Wikipedia-wordmark-LANGUAGE.svg, for example [[:File:Wikipedia-wordmark-fr.svg|File:Wikipedia-wordmark-fr.svg]] for French.


=== Database creation ===
Once the above is reviewed, merged, and pulled down on the deployment host, also pull it down on [[mwmaint1002]] (<code>scap pull</code>), but '''do not yet deploy''' to main app servers until after the database is created.


Now it's time to actually create the database. The installation script also performs other tasks, such as notifying the "newprojects" mailing list. The installation script must be run from [[mwmaint1002]] ('''not''' the deployment host). If the wiki is going to be a Wikidata client, make sure it's present in the <code>wikidataclient.dblist</code> file and that the new version of that file has been pulled down on mwmaint1002 '''''before''''' running the installation script, or things will break.
 
The dummy wiki (noted as <code>muswiki</code> in the code) is shard-dependent. Wikis at <code>s5</code> should be created using <code>muswiki</code>, while wikis at <code>s3</code> were created via <code>aawiki</code>. Wiktionaries need to be created via <code>mhwiktionary</code> (s5) and <code>aawiktionary</code> (s3).


The syntax is as follows:


<code>mwscript extensions/WikimediaMaintenance/addWiki.php --wiki=muswiki <''languagecode''> <''projectname''> <''databasename''> <''domain''></code>
* For a new Wikipedia - for example Lingua Franca Nova Wikipedia:<br /><code>mwscript extensions/WikimediaMaintenance/addWiki.php --wiki=muswiki lfn wikipedia lfnwiki lfn.wikipedia.org</code>
* For another new language project - for example a Spanish Wikinews:<br /><code>mwscript extensions/WikimediaMaintenance/addWiki.php --wiki=muswiki es wikinews eswikinews es.wikinews.org</code>
* For a new chapter wiki - for example a Finnish chapter wiki:<br /><code>mwscript extensions/WikimediaMaintenance/addWiki.php --wiki=muswiki fi wikimedia fiwikimedia fi.wikimedia.org</code>
* For non-standard special wikis (such as committees, or unique projects like meta or commons) - for example strategy.wikimedia.org:<br /><code>mwscript extensions/WikimediaMaintenance/addWiki.php --wiki=muswiki en wikimedia strategywiki strategy.wikimedia.org</code>
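The examples above all follow one pattern: the database name is the language code plus the project family (Wikipedias use <code>&lt;lang&gt;wiki</code> instead), and the domain is <code>&lt;lang&gt;.&lt;family&gt;.org</code>. A sketch that assembles the invocation from its parts (purely illustrative; this is not an existing helper script):

```shell
# Assemble an addWiki.php invocation from its parts (illustrative only).
lang=es
family=wikinews
dbname="${lang}${family}"       # a Spanish Wikipedia would instead be "eswiki"
domain="${lang}.${family}.org"
echo "mwscript extensions/WikimediaMaintenance/addWiki.php --wiki=muswiki $lang $family $dbname $domain"
```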


# Merge the config change in Gerrit, and [[Heterogeneous deployment#In your own repo via gerrit|pull it onto deploy1001]].
# Verify that the *.dblist files now contain the new wiki.
# Pull the whole configuration onto mwdebug1002 and mwmaint1002 with <code>scap pull</code>
# Run the addWiki maintenance script, as described above (important to do before any sync of dblists or wikiversions, as some components like login use that)
# Run <code>scap sync-file dblists</code> to synchronize all database lists
# Verify wikiversions.json is sane.
# Run <code>scap sync-wikiversions</code> to synchronize the version number to use for this wiki
# Run <code>scap sync-file multiversion/MWMultiVersion.php</code> when needed, i.e. if you add a new *.wikimedia.org site
# Run <code>scap sync-file wmf-config/InitialiseSettings.php</code>
# Run <code>scap sync-file static/images/project-logos/</code>
# If adding a new language code, run <code>scap sync-file langlist</code> (this must be done '''before''' deploying the updated interwiki cache - step 13)
# Unless it's a language project, add the project to the [[metawiki:Interwiki map|meta:Interwiki map]].
# Regenerate the interwiki cache and deploy it. (For all new wikis, not just new language projects.)
#* <code>scap update-interwiki-cache</code> (and then follow the instructions)
#* Edit [[metawiki:Interwiki map|meta:Interwiki map]] and update the date of the last update.


=== RESTBase ===


[[RESTBase]] is a service providing a RESTful API for the projects' wikis. To enable it to serve the new wiki as well, create a patch for <code>mediawiki/services/restbase/deploy</code> adding its domain to [[phab:diffusion/GRBD/browse/master/scap/vars.yaml;c86e94f937cc26e40dd522949b69bf2c6d771318$59|the appropriate section]] in RESTBase's configuration. Add Ppchelko to review the patch, and wait for it to get merged, or create a task tagged with <code>#platform-team</code>. This will then require a restart of the RB service across the cluster, which can be done by ops or restbase-admins/restbase-roots.


=== Parsoid ===
Parsoid is now integrated with core, and should pick up the wiki automatically.
 
=== cxserver ===
For project families in which ContentTranslation is installed (as of 2018, only Wikipedia projects): add the language code to the ContentTranslation registry - <code>mediawiki/services/cxserver</code> repository, file <code>config/languages.yaml</code> (included by config.dev.yaml and config.prod.yaml).


Once merged to master, ping the ContentTranslation developers to deploy the change. That requires syncing the repositories, i.e. updating mediawiki/services/cxserver/deploy to match mediawiki/services/cxserver. See https://gerrit.wikimedia.org/r/#/c/303763/ for an example commit.
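The registry change itself is a one-line addition to a YAML list. A local sketch of verifying it (contents invented; the real file is <code>config/languages.yaml</code> in the cxserver repository, and "lfn" stands in for the new code):

```shell
# Stand-in for cxserver's config/languages.yaml: one "- <code>" entry
# per supported language (contents are illustrative).
cat > /tmp/languages.yaml <<'EOF'
- es
- fi
- lfn
EOF

# Confirm the new language code was added as a whole-line list entry.
grep -qx -- '- lfn' /tmp/languages.yaml && echo "lfn registered"
```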


=== Search ===
 
Search indices need to be initialized for new wikis starting with 1.36.0-wmf.10. For a typical wiki creation only the following command is necessary. See [[Search#Adding new wikis]] for more information.
 
<syntaxhighlight lang="shell-session">
$ mwscript extensions/CirrusSearch/maintenance/UpdateSearchIndexConfig.php --wiki=$wiki --cluster=all
</syntaxhighlight>


=== Swift ===
Necessary changes are made automatically by the addWiki.php script.


=== Cloud Services ===
 
Ensure a DBA has created the <code>${wiki}_p</code> database and granted access to <code>labsdbuser</code> (otherwise a bug in MariaDB will cause the scripts to fail). This is what ''they'' will probably do:
<syntaxhighlight lang="shell-session">
$ sudo -i mysql --skip-ssl
mysql:root@localhost [(none)]> GRANT SELECT, SHOW VIEW ON `$wiki\_p`.* TO 'labsdbuser';
mysql:root@localhost [(none)]> FLUSH PRIVILEGES;
</syntaxhighlight>
 
==== Best method ====
When the task is ready for WMCS, run the cookbook <code>sre.wikireplicas.add-wiki</code> on any active cumin server (currently <code>cumin1001.eqiad.wmnet</code> or <code>cumin2002.codfw.wmnet</code>).
The following example is for smnwiki and its task. Replace those values with the wiki database name you are adding and the correct task ID:
<syntaxhighlight lang="shell-session">
$ sudo /usr/local/bin/secure-cookbook sre.wikireplicas.add-wiki --task-id T264900 smnwiki
START - Cookbook sre.wikireplicas.add-wiki
Generating views...
Adding DNS
Finalizing meta_p
Added views for new wiki: smnwiki T264900
END (PASS) - Cookbook sre.wikireplicas.add-wiki (exit_code=0)
</syntaxhighlight>
{{Note|The "Adding DNS" step takes quite a while to complete. Do not panic!}}
 
If you are adding more than one wiki that day or several tickets have been opened to add wikis, you may find that the DNS step creates all of them at once. After that has been done, it doesn't need to be done again. You can save yourself ''lots'' of time by using the <code>--skip-dns</code> option on the command line so it doesn't sit there scanning for new wikis.
 
If in doubt, you can always use the <code>--dry-run</code> option on the secure-cookbook command so that you just see what it would do. For example:
<syntaxhighlight lang="shell-session">
$ sudo secure-cookbook --dry-run sre.wikireplicas.add-wiki  --task-id T260551 thankyouwiki
DRY-RUN: Executing cookbook sre.wikireplicas.add-wiki with args: ['--task-id', 'T260551', 'thankyouwiki']
DRY-RUN: START - Cookbook sre.wikireplicas.add-wiki
DRY-RUN: Generating views...
DRY-RUN: Executing commands ['/usr/local/sbin/maintain-replica-indexes --database thankyouwiki', '/usr/local/sbin/maintain-views --databases thankyouwiki'] on 4 hosts: labsdb[1009-1012].eqiad.wmnet
DRY-RUN: Adding DNS
DRY-RUN: Executing commands ['source /root/novaenv.sh; wmcs-wikireplica-dns --aliases'] on 1 hosts: cloudcontrol1003.wikimedia.org
DRY-RUN: Finalizing meta_p
DRY-RUN: Executing commands ['/usr/local/sbin/maintain-meta_p --databases thankyouwiki'] on 4 hosts: labsdb[1009-1012].eqiad.wmnet
DRY-RUN: Added views for new wiki: thankyouwiki T260551
DRY-RUN: END (PASS) - Cookbook sre.wikireplicas.add-wiki (exit_code=0)
</syntaxhighlight>
 
==== Manual method in case the best method fails ====
Once it is replicating to the [[Portal:Data Services/Admin/Wiki Replicas#Physical layer|labsdb* servers]] (e.g. labsdb1009, labsdb1010, labsdb1011 and labsdb1012) and the new clouddb1* servers (clouddb10[13-20]), run the following 2 commands on each replica server:
 
<syntaxhighlight lang="shell-session">
localhost:~$ ssh labsdb10xx.eqiad.wmnet
labsdb10xx:~$ sudo /usr/local/sbin/maintain-replica-indexes --database $wiki --debug
labsdb10xx:~$ sudo /usr/local/sbin/maintain-views --databases $wiki --debug
</syntaxhighlight>
 
Note the section (shard) for use with the wikireplica DNS in the next step:
<syntaxhighlight lang="shell-session">
labsdb10xx:~$ grep $wiki /usr/local/lib/mediawiki-config/dblists/s*.dblist* | grep -o '\w[0-9]'
:# Should return a shard, like s3, s5, etc.
</syntaxhighlight>
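The same shard lookup can be sketched off-host. This minimal Python sketch assumes the dblist layout shown above (one database name per line in files named <code>s1.dblist</code>, <code>s2.dblist</code>, …) inside a local mediawiki-config checkout; it is illustrative only:

```python
import re
from pathlib import Path
from typing import Optional

def find_shard(wiki: str, dblists_dir: str) -> Optional[str]:
    """Return the DB cluster (e.g. 's5') whose dblist names the wiki, if any."""
    for dblist in sorted(Path(dblists_dir).glob("s*.dblist")):
        if not re.fullmatch(r"s\d+", dblist.stem):
            continue  # skip small.dblist, special.dblist, etc.
        if wiki in dblist.read_text().split():
            return dblist.stem  # 's5.dblist' -> 's5'
    return None
```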
 
 
From a cloud control host, add the wikidb alias in the [[Portal:Data Services/Admin/Wiki Replica DNS|Wiki Replicas service name]]:
 
<syntaxhighlight lang="shell-session">
localhost:~$ ssh cloudcontrol1003.wikimedia.org
cloudcontrol1003:~$ sudo -i
cloudcontrol1003:~$ source novaenv.sh
cloudcontrol1003:~$ /usr/local/sbin/wmcs-wikireplica-dns --aliases --shard <sN>
:# Use the shard arg only if you know the shard (which you can get on a replica server as noted above)
:# If the shard is s3 or a full rebuild is done it will take quite a while to run
</syntaxhighlight>


Insert a new row in <code>meta_p.wiki</code> for the new wiki by running the following on each of the replica servers that host the s7 instance.
<syntaxhighlight lang="shell-session">
localhost:~$ ssh labsdb10xx.eqiad.wmnet
labsdb10xx:~$ sudo /usr/local/sbin/maintain-meta_p --database $wiki
</syntaxhighlight>


==== Finish up for either method ====
Before resolving the ticket, log into a Toolforge bastion as yourself and run:
<syntaxhighlight lang="shell-session">
localhost:~$ ssh login.toolforge.org
tools-sgebastion-07:~$ sql $wiki
Reading table information for completion of table and column names
You can turn off this feature to get a quicker startup with -A

Welcome to the MariaDB monitor.  Commands end with ; or \g.
Your MariaDB connection id is 470351380
Server version: 10.1.43-MariaDB MariaDB Server

Copyright (c) 2000, 2018, Oracle, MariaDB Corporation Ab and others.

Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.

MariaDB [$wiki_p]> select * from page limit 2;
</syntaxhighlight>
If you get an error from the <code>select * from page limit 2;</code> statement, you may have missed a step or need to do some troubleshooting. You should just get a couple of records.


=== WikimediaMessages ===
==== Translatable project name ====
Add a message with the wiki name to <code>extensions/WikimediaMessages/i18n/wikimediaprojectnames/en.json</code> and <code>qqq.json</code>. The message keys should include the wiki database name (<code>WIKI_DBNAME</code> below) and official "human readable" name (<code>WIKI_NAME</code> below) as follows:


{| class="wikitable"
|+en.json
|-
|Key
|Message
|-
|project-localized-name-''<code>WIKI_DBNAME</code>''
|''WIKI_NAME''
|}


{| class="wikitable"
|+qqq.json
|-
|Key
|Message
|-
|project-localized-name-''<code>WIKI_DBNAME</code>''
|<code><nowiki>{{ProjectNameDocumentation|url=WIKI_URL|name=WIKI_NAME|language=WIKI_LANG}}</nowiki></code>
|}


For example, <code><nowiki>"project-localized-name-enwiki": "{{ProjectNameDocumentation|url=https://en.wikipedia.org|name=English Wikipedia|language=en}}",</nowiki></code>
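As a rough sketch of what the paired entries look like together (assuming, per the tables above, that en.json carries the plain human-readable name and qqq.json the documentation template; the helper name is hypothetical):

```python
def project_name_messages(dbname: str, name: str, url: str, lang: str):
    """Build the en.json and qqq.json entries described above."""
    key = f"project-localized-name-{dbname}"
    en = {key: name}
    qqq = {key: f"{{{{ProjectNameDocumentation|url={url}|name={name}|language={lang}}}}}"}
    return en, qqq

# e.g. for English Wikipedia:
en, qqq = project_name_messages(
    "enwiki", "English Wikipedia", "https://en.wikipedia.org", "en")
```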


==== Cross-wiki (a.k.a. interwiki) search result title ====
If this is a new language Wikipedia, add a message with the wiki name to <code>extensions/WikimediaMessages/i18n/wikimediainterwikisearchresults/en.json</code> and <code>qqq.json</code> in the list of search-interwiki-results messages.
 
==== Wikipedia mobile apps ====
If you are creating a new Wikipedia, add an entry for the new language.
 
For the [https://github.com/wikimedia/apps-android-wikipedia Android app], you must run the following two scripts, which will update the list of languages known to the app (including the corresponding language codes, autonyms, English names, etc.):
 
<syntaxhighlight lang="shell-session">
$ cd scripts
$ ./generate_wiki_languages.py
$ ./make-templates.py
</syntaxhighlight>
 
This should update several <code>.java</code> and <code>.xml</code> files that you may then submit in a pull request to our repository.
 
(TODO: Is there anything to do for the iOS app?)
 
== Post-install ==
 
=== Wikidata ===
<small>''Only needed if the new wiki is supposed to be a Wikidata client.''</small>
 
In order to be able to link the new wiki from Wikidata, and to allow interwiki links from Wikidata to the new wiki, run <code>extensions/Wikibase/lib/maintenance/populateSitesTable.php --force-protocol https</code> on at least all Wikidata clients (including wikidatawiki itself and testwikidata).
 
<syntaxhighlight lang="shell-session">
$ foreachwikiindblist wikidataclient extensions/Wikibase/lib/maintenance/populateSitesTable.php --force-protocol https
</syntaxhighlight>
 
'''That script is known to be troublesome''', you might want to ask {{ircnick|hoo|Marius}} or {{ircnick|Amir1|Amir Sarabadani}} to run it for you, or just create a ticket (that may be done any time after the wiki is created).
 
<span style="color: red; font-face: bold;">Beware</span>: The script sometimes fails with a duplicate key conflict. In that case, go to the wiki's master database and empty the sites and site_identifiers tables, then run the script again. It's probably also wise to backup these tables from Wikidata and at least one Wikipedia before running the script across the whole fleet. '''Breaking the sites, site_identifiers tables will break page rendering of many wikis!'''


=== Analytics ===
If the wiki is not private, not a Wikimania conference wiki, and not a special wiki like usability/outreach/login/vote/strategy/etc., send a change proposal to <code>analytics/refinery.git</code> that adds the wiki to [[phab:diffusion/ANRE/browse/master/static_data/pageview/whitelist/whitelist.tsv|<code>static_data/pageview/whitelist/whitelist.tsv</code>]].
If pageviews to the wiki need to be included in the pageview definition, check the code or ask the Analytics team. See the [[phab:diffusion/ANRS/browse/master/refinery-core/src/main/java/org/wikimedia/analytics/refinery/core/PageviewDefinition.java$77|definition and regular expressions used to include or exclude domains as official pageviews]].
 
=== Incubator ===
If there's something to import (as is often the case in new language projects), someone will do so. Their process is described at [[incubator:Project:Importing_from_Incubator|Incubator:Importing from Incubator]] (logged at [[incubator:Project:Site_creation_log|Incubator:Site creation log]]).


=== Clean up interwiki links ===
After any import from Incubator is completed, inform the community and make a Phabricator task for removing old interwiki links and migrating them to Wikidata (for example [[phab:T134991|T134991]] for edits such as [[wikidata:Special:Diff/336584053|d:Special:Diff/336584053]] and [[:en:jam:Special:Diff/12330|jam:Special:Diff/12330]]). You can do it yourself using [https://github.com/wikimedia/pywikibot-core/tree/master/scripts/interwikidata.py interwikidata.py] in pywikibot.
<pre>
python pwb.py scripts/interwikidata.py -lang:LANGCODE -clean -start:! -always
</pre>


=== Tell wikistats Cloud VPS project to add the wiki ===
If the wiki is a public wiki, create a Phabricator task (subtask of the main task to create the wiki) with the tag "VPS-project-wikistats" and just ask for the wiki to be added. (What needs to be done can be seen, for example, in https://phabricator.wikimedia.org/T140970#2531977.)
 
=== Tell Pywikibot project to add the wiki ===
Create a Phabricator task (subtask of the main task to create the wiki) with the tag "pywikibot" and just ask for the wiki to be added.
 
=== Mobile apps ===
Wikipedia has mobile apps for Android and iOS. A Wikipedia in a new language must work in the app after the import from the Incubator is complete. Report a bug under the apps tags in Phabricator if any of the following doesn't work:
 
* You are supposed to see the language in the user preferences.
** Android: Settings -> Wikipedia language.
** iOS: Settings -> My languages
* You see the language in the interlanguage links list. Find an article that exists in the new Wikipedia and in English. Go to the English Wikipedia, tap the 文Α icon at the bottom and find the article in the list. In particular:
** The article name must appear.
** The language name (autonym) must appear.
** The item must be findable using the language name in English and the autonym. (If you don't know how to type the autonym, try pasting the autonym or ask somebody who writes in that Wikipedia.)
** Tapping the item must show the article.


== See also ==

* [[Close a wiki]]
* [[Delete a wiki]]

[[Category:How-To]]

Revision as of 06:34, 27 August 2021

This page documents the process for adding a new wiki. This includes new languages on sister projects, and wikis for committees, chapters etc.

Preparation

The following steps need to be completed before the wiki database may be created.

Notify

  • Create a Phabricator task for the wiki creation (if one doesn't exist already).
  • Create (if not already) a sub-task "Prepare and check storage layer for <new wiki>".
    • It should be tagged with #DBA, #wmcs-kanban and #Data-Services.
  • Notify the Operations list. In particular, it needs to be made clear whether the wiki should be public, or private. If public, ops will arrange for the wiki to be replicated to Cloud Services. If private, ops will need to add the wiki to $private_wikis in operations/puppet.git:/manifests/realm.pp.
  • IMPORTANT: If the wiki is a regular public wiki to appear on Cloud Services - you can continue. If the wiki is private and should not be replicated to Cloud Services DO NOT CONTINUE UNTIL YOU HAVE THE OK FROM OPS/DBAs to check no private data is leaked. There are mechanisms in place to prevent that by default, but those should be manually checked.

DNS

First of all, ensure the relevant domain names exist for the new wiki. Make the following changes in a commit for the operations/dns.git repo and submit to Gerrit for review.

  • If it is a language project, ensure the language code is present in /templates/helpers/langlist.tmpl. This is shared between all sister projects. If another sister project has the same language already, then this has probably been done already.
  • If it is a subdomain of ".wikimedia.org" domain (chapter wiki or special wiki)
  • Merge the change in Gerrit and run authdns-update.
  • Query the DNS servers to make sure it has been correctly deployed. See DNS#HOWTO for details.
  • For new languages, there is also a need to regenerate zones. Run on ns0, ns1 and ns2: authdns-gen-zones -f /srv/authdns/git/templates /etc/gdnsd/zones && gdnsdctl reload-zones
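A quick sanity check that a language code is present in the langlist template can be sketched as follows; the exact markup of <code>langlist.tmpl</code> varies, so this token-level check is illustrative only:

```python
import re

def code_in_langlist(tmpl_text: str, code: str) -> bool:
    """Crude check that a language code appears as a standalone token."""
    return re.search(rf"(?<![\w-]){re.escape(code)}(?![\w-])", tmpl_text) is not None
```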

Apache configuration

Apache configuration is located in the operations/puppet.git repo.

  • Common configuration:
    • For a new language project, this step is usually not needed as shared configuration already covers it.
    • For a new chapter wiki, add it to ::mediawiki::web::prod_sites under wikimedia-chapter
  • After the change is merged in Gerrit, deploy the configuration change and (if needed) gracefully restart app servers. See Application_servers/Runbook#Deploying_config for details.
  • If there are additional domains that should point to the same wiki, add it to redirects/redirects.dat.

Language configuration

Check the following for every wiki, even if another wiki in this language already exists:

  1. Make sure that the wiki's content language code appears in the language-data repository (file data/langdb.yaml; if you're adding it, make sure to follow the instructions in README).
  2. Make sure that the wiki's content language code is fully configured to be supported by core MediaWiki. The code must appear in the following files:
    • languages/data/Names.php (precisely this code, and not a variant)
    • The file languages/messages/MessagesAbc.php must exist (replace Abc with the language code).
  3. If the language is written from right to left, make sure it is configured as such in the following files:
    • In the core MediaWiki repository, the file languages/messages/MessagesAbc.php must explicitly say $rtl = true; near the top.
    • In the MobileFrontend extension repository: src/mobile.languages.structured/rtlLanguages.js
    • In the Android app repository: app/src/main/java/org/wikipedia/util/L10nUtil.java (the RTL_LANGS list)
    • In the iOS app repository: Wikipedia/Code/MWLanguageInfo.m (the rtlLanguages list)
  4. Check whether the language code appears in any extra configuration for Wikibase and Wikidata. If it appears in any of the following files, it should be removed (verify with Wikidata developers):
  5. Check that usernames in the writing system of this language are possible and aren't filtered out. If they are blocked, add support for this writing system to AntiSpoof. Example patch.
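Checks 2 and 3 above can be automated roughly as follows; the Messages file naming (first letter upper-cased, hyphens to underscores) and the literal $rtl = true; match are assumptions based on MediaWiki conventions:

```python
import re
from pathlib import Path

def check_language_support(core_dir: str, code: str, rtl: bool) -> list:
    """Return a list of problems with core MediaWiki support for a language code."""
    problems = []
    core = Path(core_dir)
    names = (core / "languages" / "data" / "Names.php").read_text()
    if not re.search(rf"'{re.escape(code)}'\s*=>", names):
        problems.append(f"{code} missing from languages/data/Names.php")
    # MessagesAbc.php: e.g. 'lfn' -> MessagesLfn.php, 'zh-min-nan' -> MessagesZh_min_nan.php
    msg_name = "Messages" + code.replace("-", "_").capitalize() + ".php"
    msg = core / "languages" / "messages" / msg_name
    if not msg.exists():
        problems.append(f"{msg_name} does not exist")
    elif rtl and "$rtl = true;" not in msg.read_text():
        problems.append(f"{msg_name} does not set $rtl = true;")
    return problems
```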

Determine if this is a new language project (like Spanish Wikibooks) or a chapter wiki (like Wikimedia Denmark), or something else (special wiki or private wiki).

For a new language project, make sure the language has been approved by the Language committee at Requests for new languages on Meta-Wiki. Usually, the Phabricator task will contain a link to the approval page. For all other wikis, go directly to #Install.

Install

IMPORTANT: For Private Wikis

  • Private wiki databases must not be replicated to the Cloud Services DB MySQL instances!
    • Before creating a database for a private wiki, make sure to add the db name to the puppet global array $private_wikis in operations/puppet.git:/manifests/realm.pp.
    • Deploy this config change with puppet and manually restart the Prelabsdb-db MySQL instance (Sanitarium) on the server that will house this wiki's db (most likely s5).
    • If you need help with this, please ask a member of the Ops team for help. This is very important.

MediaWiki configuration

Gather all relevant information about the new project. Each wiki has different requirements (project name, namespaces, extensions to be enabled etc).

Make the following changes on your own checkout of operations/mediawiki-config.git and submit to Gerrit for review:

  • For a new language project, add the language code (ISO 639 code: usually provided in the task) to langlist. This will be used at Special:SiteMatrix and for interwiki linking etc.
  • Create the YAML definition of the wiki in wmf-config/config, including making it inherit from relevant settings profiles, as follows:
{| class="wikitable"
|-
! Wiki grouping !! Group options !! Purpose
|-
| All (all wikis must be tagged against 'all')
| all.dblist
| The primary listing of what wikis exist, used as the basis of all tools.
|-
| DB cluster (every wiki must be in exactly one of these)
| s1.dblist, s2.dblist, s3.dblist, s4.dblist, s5.dblist, s6.dblist, s7.dblist, s8.dblist, s10.dblist, s11.dblist
| Database lists of wikis in each MySQL database cluster. In most cases, wikis should just be added to s5. s10 was previously known as wikitech.
|-
| Wiki size (every wiki must be in exactly one of these)
| small.dblist, medium.dblist, large.dblist
| Database lists of wikis arranged into their relevant size.
|-
| Wiki family (every wiki must be in exactly one of these, or one plus special)
| wikimania.dblist, wikimedia.dblist, wikibooks.dblist, wikinews.dblist, wikipedia.dblist, wikiquote.dblist, wikisource.dblist, wikiversity.dblist, wikivoyage.dblist, wiktionary.dblist, special.dblist
| Sister project, Wikimania, chapter, or special. NOTE: Some wikis may be in special and one other list.
|-
| Closed
| closed.dblist
| Any closed (no write access, full read access) wikis.
|-
| Deleted
| deleted.dblist
| Wiki databases which MediaWiki is no longer configured to access.
|-
| Wiki privacy
| fishbowl.dblist, private.dblist
| fishbowl.dblist: all fishbowl (restricted write access, full read access) wikis; private.dblist: all private (read and write restricted) wikis.
|-
| Extension configurations
| flaggedrevs.dblist, securepollglobal.dblist, visualeditor-nondefault.dblist, commonsuploads.dblist
| flaggedrevs.dblist: all wikis running the FlaggedRevs extension; securepollglobal.dblist: $wgSecurePollCreateWikiGroups wikis (Board Election wikis); visualeditor-nondefault.dblist: all wikis where VisualEditor is not enabled by default; commonsuploads.dblist: all wikis which should have local uploading soft-disabled (uploads go to Commons instead).
|-
| Wikidata-related wiki groups
| wikidata.dblist, wikidataclient.dblist
| wikidata.dblist: all wikis running the Wikidata repo; wikidataclient.dblist: all wikis running the Wikidata client (most new language-project wikis should start off like this).
|}
  • Once your YAML file is created, add a dummy entry to wikiversions.json mapping the database name to a version string, then run composer test which will update the automatic "dblist" files and spot any issues. Alternatively, you can run composer buildDBLists which only builds the dblists.
  • If the new wiki is created at a different shard than s3 (which is likely, as new wikis are created at s5 as of August 2020), you need to add the wiki to wmf-config/db-eqiad.php and wmf-config/db-codfw.php. Check with DBAs if you're unsure how to properly edit the file; this is very important.
  • Update the configuration files (located in /wmf-config). Some may have sufficient defaults but the following are often wiki-specific in InitialiseSettings.php:
    • $wgServer, $wgCanonicalServer - For language projects, the defaults work fine.
    • $wgLogos (see below), $wgSitename, $wgExtraNamespaces, $wgLocaltimezone.
    • groupOverrides.
    • Ensure $wgCategoryCollation is set to the appropriate sorting for this wiki's language. If you are not sure, ask on the task.
    • If an extension needs to be enabled, this is usually done by setting a wmg-variable to true in InitialiseSettings along with custom configuration there. Some extensions' configuration is located in dedicated files in the /wmf-config directory.
  • If you added a new language code to the langlist (see above), you probably need to add it to the InterwikiSortingOrder.php file too
  • For "*wikimedia" suffix databases, add the subdomain to the list in MWMultiVersion::setSiteInfoForWiki
  • Adjust the entry in wikiversions.json to the current production version. See Deployments (one week) and Deployment Train.
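The grouping rules from the table above ('all' mandatory, exactly one DB cluster, exactly one size) can be expressed as a small validator; memberships are passed in already parsed, purely for illustration:

```python
import re

SIZES = {"small", "medium", "large"}

def check_groupings(wiki: str, memberships: set) -> list:
    """Check one wiki's dblist memberships against the grouping rules above.

    `memberships` is the set of dblist names (without .dblist) containing the wiki.
    Returns a list of rule violations; empty means the grouping looks sane.
    """
    problems = []
    if "all" not in memberships:
        problems.append(f"{wiki} missing from all.dblist")
    clusters = {m for m in memberships if re.fullmatch(r"s\d+", m)}
    if len(clusters) != 1:
        problems.append(f"{wiki} must be in exactly one DB cluster list, found {sorted(clusters)}")
    sizes = memberships & SIZES
    if len(sizes) != 1:
        problems.append(f"{wiki} must be in exactly one size list, found {sorted(sizes)}")
    return problems
```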

Projects, in particular Wikipedias, usually have a localized logo.

Check the following:

  • The project has a full localized logo in Commons (image + wordmark). For Wikipedia, the filename is usually File:Wikipedia-logo-v2-LANGUAGE.svg, for example File:Wikipedia-logo-v2-fr.svg for French. If there is no localized logo, ask the project's editor to provide the localized text and get the logo created and uploaded to Commons.
  • The project has versions of the logo in all the necessary sizes and resolutions uploaded. If not, create, upload, and configure them according to the instructions at Wikimedia site requests#Change the logo of a Wikimedia wiki.
  • The project has a localized wordmark without the image. This is shown at the bottom of the mobile site and in some other places. This is not necessary if the name of the project is completely identical to the default English name, but necessary if it's not identical. For Wikipedia, the filename is usually File:Wikipedia-wordmark-LANGUAGE.svg, for example File:Wikipedia-wordmark-fr.svg for French.

Database creation

Once the above is reviewed, merged and pulled down on the deployment host, also pull it down on mwmaint1002 (scap pull), but do not yet deploy to the main app servers until after the database is created.

Now it's time to actually create the database. The installation script also performs other tasks, such as notifying the "newprojects" mailing list. The installation script must be run from mwmaint1002 (not the deployment host). If the wiki is going to be a Wikidata client, make sure it's present in the wikidataclient.dblist file and that the new version of that file has been pulled down on mwmaint1002 **before** running the installation script, or things will break.

The dummy wiki (noted as muswiki in the code) is shard-dependent. Wikis at s5 should be created using muswiki, while wikis at s3 were created via aawiki. Wiktionaries need to be created via mhwiktionary (s5) and aawiktionary (s3).

The syntax is as follows:

mwscript extensions/WikimediaMaintenance/addWiki.php --wiki=muswiki <languagecode> <projectname> <databasename> <domain>

  • For a new Wikipedia - for example Lingua Franca Nova Wikipedia:
    mwscript extensions/WikimediaMaintenance/addWiki.php --wiki=muswiki lfn wikipedia lfnwiki lfn.wikipedia.org
  • For other new language projects - for example a Spanish Wikinews:
    mwscript extensions/WikimediaMaintenance/addWiki.php --wiki=muswiki es wikinews eswikinews es.wikinews.org
  • For new chapter wikis - for example a Finnish chapter wiki:
    mwscript extensions/WikimediaMaintenance/addWiki.php --wiki=muswiki fi wikimedia fiwikimedia fi.wikimedia.org
  • For non-standard special wikis (such as committees, or unique projects like meta or commons) - for example strategy.wikimedia.org:
    mwscript extensions/WikimediaMaintenance/addWiki.php --wiki=muswiki en wikimedia strategywiki strategy.wikimedia.org
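The pattern behind the language code, project name, database name and domain in the examples above can be sketched as follows; the hyphen-to-underscore handling for database names is an assumption, and non-standard special wikis do not fit this pattern:

```python
def addwiki_args(lang: str, project: str):
    """Derive (databasename, domain) as in the addWiki.php examples above."""
    suffix = "wiki" if project == "wikipedia" else project
    dbname = lang.replace("-", "_") + suffix   # lfn -> lfnwiki, es -> eswikinews
    domain = f"{lang}.{project}.org"           # fi.wikimedia.org, es.wikinews.org
    return dbname, domain
```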
  1. Merge the config change in Gerrit, and pull it onto deploy1001.
  2. Verify that the *.dblist files now contain the new wiki.
  3. Pull the whole configuration on mwdebug1002 and mwmaint1002 through scap pull
  4. Run the addWiki maintenance script, as described above (important to do before any sync of dblists or wikiversions, as some components like login use that)
  5. Run scap sync-file dblists to synchronize all database lists
  6. Verify wikiversions.json is sane.
  7. Run scap sync-wikiversions to synchronize the version number to use for this wiki
  8. Run scap sync-file multiversion/MWMultiVersion.php when needed, ie if you add a new *.wikimedia.org site
  9. Run scap sync-file wmf-config/InitialiseSettings.php
  10. Run scap sync-file static/images/project-logos/
  11. If adding a new language code, run scap sync-file langlist (this must be done before deploying the updated interwiki cache - step 13)
  12. Unless it's a language project, add the project to the meta:Interwiki map.
  13. Regenerate the interwiki cache and deploy it. (For all new wikis, not just new language projects.)
    • scap update-interwiki-cache (and then follow the instructions)
    • Edit meta:Interwiki map and update when the last update was.

RESTBase

RESTBase is a service providing a RESTful API for the projects' wikis. To enable it to serve the new wiki as well, create a patch for mediawiki/services/restbase/deploy adding its domain to the appropriate section in RESTBase's configuration. Add Ppchelko to review the patch, and wait for it to get merged, or create a task tagged with #platform-team. This will then require a restart of the RB service across the cluster, which can be done by ops or restbase-admins/restbase-roots.

Parsoid

Parsoid is now integrated with core, and should pick up the wiki automatically.

cxserver

For project families in which the ContentTranslation is installed (as of 2018—only Wikipedia projects): Add the language code to the ContentTranslation registry - mediawiki/services/cxserver repository, file config/languages.yaml (included by config.dev.yaml and config.prod.yaml).

Once merged to master, ping the ContentTranslation developers to deploy the change. That requires to sync repositories, ie to update mediawiki/services/cxserver/deploy to match mediawiki/services/cxserver. See: https://gerrit.wikimedia.org/r/#/c/303763/ for an example commit.

Search

Search indices need to be initialized for new wikis starting with 1.36.0-wmf.10. For a typical wiki creation only the following command is necessary. See Search#Adding new wikis for more information.

$ mwscript extensions/CirrusSearch/maintenance/UpdateSearchIndexConfig.php --wiki=$wiki --cluster=all

Swift

Necessary changes are made automatically by the addWiki.php script.

Cloud Services

Ensure a DBA has created the ${wiki}_p database and granted access to labsdbuser (otherwise a bug in MariaDB will cause the scripts to fail). This what they will probably do:

$ sudo -i mysql --skip-ssl
mysql:root@localhost [(none)]> GRANT SELECT, SHOW VIEW ON `$wiki\_p`.* TO 'labsdbuser';
mysql:root@localhost [(none)]> FLUSH PRIVILEGES;

Best method

When the task is ready for WMCS, run the cookbook sre.wikireplicas.add-wiki on the any active cumin server (currently cumin1001.eqiad.wmnet/cumin2002.codfw.wmnet). The following example is for smnwiki and it's task. Replace those values with the wiki database name you are adding and the correct task ID:

$ sudo /usr/local/bin/secure-cookbook sre.wikireplicas.add-wiki --task-id T264900 smnwiki
START - Cookbook sre.wikireplicas.add-wiki
Generating views...
Adding DNS
Finalizing meta_p
Added views for new wiki: smnwiki T264900
END (PASS) - Cookbook sre.wikireplicas.add-wiki (exit_code=0)

If you are adding more than one wiki that day or several tickets have been opened to add wikis, you may find that the DNS step creates all of them at once. After that has been done, it doesn't need to be done again. You can save yourself lots of time by using the --skip-dns option on the command line so it doesn't sit there scanning for new wikis.

If in doubt you can always use the --dry-run option on the secure-cookbook command so that you just see what it would do. For example:

$ sudo secure-cookbook --dry-run sre.wikireplicas.add-wiki  --task-id T260551 thankyouwiki
DRY-RUN: Executing cookbook sre.wikireplicas.add-wiki with args: ['--task-id', 'T260551', 'thankyouwiki']
DRY-RUN: START - Cookbook sre.wikireplicas.add-wiki
DRY-RUN: Generating views...
DRY-RUN: Executing commands ['/usr/local/sbin/maintain-replica-indexes --database thankyouwiki', '/usr/local/sbin/maintain-views --databases thankyouwiki'] on 4 hosts: labsdb[1009-1012].eqiad.wmnet
DRY-RUN: Adding DNS
DRY-RUN: Executing commands ['source /root/novaenv.sh; wmcs-wikireplica-dns --aliases'] on 1 hosts: cloudcontrol1003.wikimedia.org
DRY-RUN: Finalizing meta_p
DRY-RUN: Executing commands ['/usr/local/sbin/maintain-meta_p --databases thankyouwiki'] on 4 hosts: labsdb[1009-1012].eqiad.wmnet
DRY-RUN: Added views for new wiki: thankyouwiki T260551
DRY-RUN: END (PASS) - Cookbook sre.wikireplicas.add-wiki (exit_code=0)

Manual method in case the best method fails

Once it is replicating to the labsdb* servers (e.g. labsdb1009, labsdb1010, labsdb1011 and labsdb1012) and the new clouddb1* servers (clouddb10[13-20]), run the following 2 commands on each replica server:

localhost:~$ ssh labsdb10xx.eqiad.wmnet
labsdb10xx:~$ sudo /usr/local/sbin/maintain-replica-indexes --database $wiki --debug
labsdb10xx:~$ sudo /usr/local/sbin/maintain-views --databases $wiki --debug

Note the section (shard) for use with the wikireplica DNS step next:

labsdb10xx:~$ grep $wiki /usr/local/lib/mediawiki-config/dblists/s*.dblist* | grep -o '\w[0-9]'
:# Should return a shard, like s3, s5, etc.
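The grep above just pulls the sN shard name out of the matching dblist filename. The same lookup can be sketched in Python (the paths and toy file contents below are illustrative; the real dblist files live in /usr/local/lib/mediawiki-config/dblists/ on the replica servers):

```python
# Illustrative sketch: find which sN shard's dblist file lists a given wiki,
# mirroring the grep over /usr/local/lib/mediawiki-config/dblists/s*.dblist.
import re

def find_shard(dblist_files, wiki):
    """Return the sN shard whose dblist contains the wiki, or None."""
    for path, contents in dblist_files.items():
        m = re.search(r"(s\d+)\.dblist", path)
        # dblist files are one database name per line
        if m and wiki in contents.split():
            return m.group(1)
    return None

# Toy data standing in for the real dblist files:
dblists = {
    "/usr/local/lib/mediawiki-config/dblists/s3.dblist": "aawiki\nsmnwiki\n",
    "/usr/local/lib/mediawiki-config/dblists/s5.dblist": "dewiki\n",
}
print(find_shard(dblists, "smnwiki"))  # s3
```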


From a cloud control host, add the wikidb alias in the Wiki Replicas service name:

localhost:~$ ssh cloudcontrol1003.wikimedia.org
cloudcontrol1003:~$ sudo -i
cloudcontrol1003:~$ source novaenv.sh
cloudcontrol1003:~$ /usr/local/sbin/wmcs-wikireplica-dns --aliases --shard <sN>
:# Use the shard arg only if you know the shard (which you can get on a replica server as noted above)
:# If the shard is s3 or a full rebuild is done it will take quite a while to run

Insert a new row in meta_p.wiki for the new wiki by running the following on each of the replica servers that host the s7 instance.

localhost:~$ ssh labsdb10xx.eqiad.wmnet
labsdb10xx:~$ sudo /usr/local/sbin/maintain-meta_p --databases $wiki

Finish up for either method

Before resolving the ticket, log into a Toolforge bastion as yourself and run:

localhost:~$ ssh login.toolforge.org
tools-sgebastion-07:~$ sql $wiki
Reading table information for completion of table and column names
You can turn off this feature to get a quicker startup with -A

Welcome to the MariaDB monitor.  Commands end with ; or \g.
Your MariaDB connection id is 470351380
Server version: 10.1.43-MariaDB MariaDB Server

Copyright (c) 2000, 2018, Oracle, MariaDB Corporation Ab and others.

Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.

MariaDB [$wiki_p]> select * from page limit 2;

If the select * from page limit 2; statement returns an error, you may have missed a step or need to do some troubleshooting; it should simply return a couple of records.

WikimediaMessages

Translatable project name

Add a message with the wiki name to extensions/WikimediaMessages/i18n/wikimediaprojectnames/en.json and qqq.json. The message keys should include the wiki database name (WIKI_DBNAME below) and official "human readable" name (WIKI_NAME below) as follows:

en.json:

  "project-localized-name-WIKI_DBNAME": "WIKI_NAME"

For example, "project-localized-name-enwiki": "English Wikipedia",

qqq.json:

  "project-localized-name-WIKI_DBNAME": "{{ProjectNameDocumentation|url=WIKI_URL|name=WIKI_NAME|language=WIKI_LANG}}"

For example, "project-localized-name-enwiki": "{{ProjectNameDocumentation|url=https://en.wikipedia.org|name=English Wikipedia|language=en}}",
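Since the two entries are derived mechanically from the same four values, they can be composed with a small helper. This is an illustrative sketch (not an official script); it just reproduces the key format and template call shown above:

```python
# Illustrative only: build the en.json and qqq.json entries for a new wiki's
# project-localized-name message, following the format shown above.
def project_name_messages(dbname, name, url, lang):
    key = f"project-localized-name-{dbname}"
    en = {key: name}
    qqq = {key: f"{{{{ProjectNameDocumentation|url={url}|name={name}|language={lang}}}}}"}
    return en, qqq

en, qqq = project_name_messages(
    "enwiki", "English Wikipedia", "https://en.wikipedia.org", "en"
)
print(en)   # {'project-localized-name-enwiki': 'English Wikipedia'}
```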

Cross-wiki (a.k.a. interwiki) search result title

If this is a new language Wikipedia, add a message with the wiki name to extensions/WikimediaMessages/i18n/wikimediainterwikisearchresults/en.json and qqq.json in the list of search-interwiki-results messages.

Wikipedia mobile apps

If you are creating a new Wikipedia, add an entry for the new language.

For the Android app, you must run the following two scripts, which will update the list of languages known to the app (including the corresponding language codes, autonyms, English names, etc.):

$ cd scripts
$ ./generate_wiki_languages.py
$ ./make-templates.py

This should update several .java and .xml files that you may then submit in a pull request to our repository.

(TODO: Is there anything to do for the iOS app?)

Post-install

Wikidata

Only needed if the new wiki is supposed to be a Wikidata client.

In order to be able to link the new wiki from Wikidata, and to allow interwiki links from Wikidata to the new wiki, run extensions/Wikibase/lib/maintenance/populateSitesTable.php --force-protocol https on at least all Wikidata clients (including wikidatawiki itself and testwikidata).

foreachwikiindblist wikidataclient extensions/Wikibase/lib/maintenance/populateSitesTable.php --force-protocol https

That script is known to be troublesome; you might want to ask Marius (hoo) or Amir Sarabadani (Amir1) to run it for you, or just create a ticket (that may be done any time after the wiki was created).

Beware: the script sometimes fails with a duplicate key conflict. In that case, go to the wiki's master database and empty the sites and site_identifiers tables, then run the script again. It's probably also wise to back up these tables from Wikidata and at least one Wikipedia before running the script across the whole fleet. Breaking the sites and site_identifiers tables will break page rendering on many wikis!

Analytics

If the wiki is not private, not a Wikimania conference wiki, and not a special wiki like usability/outreach/login/vote/strategy/etc., send a change proposal to analytics/refinery.git that adds the wiki to static_data/pageview/whitelist/whitelist.tsv. If pageviews to the wiki need to be included in the pageview definition, check the code or ask the Analytics team; the pageview definition code uses regular expressions to include or exclude domains as official pageviews.

Incubator

If there's something to import (as is often the case in new language projects), someone will do so. Their process is described at Incubator:Importing from Incubator (logged at Incubator:Site creation log).

Clean up interwiki links

After any import from Incubator is completed, inform the community and make a Phabricator task for removing old interwiki links and migrating them to Wikidata (for example T134991, for edits such as d:Special:Diff/336584053 and jam:Special:Diff/12330). You can do this yourself using interwikidata.py in Pywikibot.

python pwb.py scripts/interwikidata.py -lang:LANGCODE -clean -start:! -always

Tell wikistats Cloud VPS project to add the wiki

If the wiki is a public wiki, create a Phabricator task (a subtask of the main task to create the wiki) with the tag "VPS-project-wikistats" and just ask for the wiki to be added. (What needs to be done can be seen e.g. in https://phabricator.wikimedia.org/T140970#2531977.)

Tell Pywikibot project to add the wiki

Create a Phabricator task (subtask of the main task to create the wiki) with the tag "pywikibot" and just ask for the wiki to be added.

Mobile apps

Wikipedia has mobile apps for Android and iOS. A Wikipedia in a new language must work in the app after the import from the Incubator is complete. Report a bug under the apps tags in Phabricator if any of the following doesn't work:

  • You are supposed to see the language in the user preferences.
    • Android: Settings -> Wikipedia language.
    • iOS: Settings -> My languages
  • You see the language in the interlanguage links list. Find an article that exists in the new Wikipedia and in English. Go to the English Wikipedia, tap the 文Α icon at the bottom and find the article in the list. In particular:
    • The article name must appear.
    • The language name (autonym) must appear.
    • The item must be findable using the language name in English and the autonym. (If you don't know how to type the autonym, try pasting the autonym or ask somebody who writes in that Wikipedia.)
    • Tapping the item must show the article.

See also