User:ArielGlenn/CentralAuth walkthrough
Setup of the CentralAuth extension
DRAFT IN PROGRESS
This is for testing environments only. It is an 0.01 draft, which is saved here only so I don't lose it off my laptop. DO NOT USE YET.
TODOS:
- move out all the bits that are not the centralauth config to separate subpages with links
- make code be collapsible elements so they don't fill the entire page
- write the selinux crap down, plus fedora package names and install commands
- fill in any missing pieces, get the order right
- do a clean install on desktop and check the instructions so we know it all works
- make sure we add links to any other install instructions and docs, clean up existing links
- remove any setting we don't absolutely need for this
- describe AntiSpoof and other dependencies
Overview
This documents my setup for testing login, session and cookie behavior in a small wikifarm environment. It is current as of July 29, 2024.
I do not use docker images for this setup; if you do, then you will want to manage hostnames and networking accordingly, as well as volume mounts for MediaWiki source and for databases.
I do everything on linux. The specific distribution should not matter, but for that reason I do not specify package manager details, since they may vary. I do provide information for dealing with selinux, in case you run a distribution with selinux enabled.
I run PHP 8.3 locally; that is not a requirement. You should make sure that the version of PHP you have available meets the minimum required for MediaWiki and a minimal group of extensions; see the compatibility notes for details.
Installation of a wiki farm
Prerequisites
You'll need an appropriate version of PHP installed, along with the following php modules:
- php-mysqlnd (mariadb bindings)
- php-fpm
- php-gmp
- php-intl
- php-mbstring
- php-xml
You'll need to adjust the PHP configuration settings: FIXME (memory settings, enabling modules, and others)
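As a sketch only (the drop-in filename and all values below are assumptions for a typical Fedora-style layout, not tested recommendations), something like this in /etc/php.d/99-wikifarm.ini covers the usual MediaWiki needs:

```ini
; illustrative values only; adjust for your machine
memory_limit = 256M
upload_max_filesize = 100M
post_max_size = 100M
; opcache is generally recommended for tolerable page load times
opcache.enable = 1
```

Modules installed from packages are normally enabled automatically by their own drop-in files, so usually only the limits need touching.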
You'll also need mariadb installed, preferably the latest version supported by your linux distribution. The default settings should generally be ok. You'll only need one server instance for this setup; all wiki databases will be hosted on the one database server.
For session caching Wikimedia production servers currently use a custom-written extension that writes to kafka. Although I prefer in most cases to keep our setup relatively close to that of production, I don't feel that the cost of installation and maintenance of such a setup is worth it in this case. So I recommend that you install memcached and use the default config for it (port 11211, listening address 127.0.0.1, memory limit of 64 MB). See Manual:Memcached#Setup for details, but ignore the MediaWiki configuration settings; we'll come back to that later.
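On many distributions the defaults mentioned above are exactly what ships in the daemon's config file; as an illustrative (unverified) example, a Fedora-style /etc/sysconfig/memcached looks like:

```
PORT="11211"
USER="memcached"
MAXCONN="1024"
CACHESIZE="64"
OPTIONS="-l 127.0.0.1"
```

If your distribution uses a different config location or format, the things to verify are the same: port 11211, listening on 127.0.0.1 only, and a 64 MB cache.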
You'll want a web server. Wikimedia uses apache in production and so that's what I use. You do not need to enable SSL for CentralAuth testing, so the setup of a pki and SSL-enabled virtual hosts will not be discussed here. You will want apache 2.4. The apache rewrite module is not required, but I find it convenient and I recommend you install it as well.
You will need php-fpm installed and configured. Strictly speaking, you could do without it, but Wikimedia production uses it, and it's not hard to manage. Configuration: (max/min servers, etc...)
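The stock pool configuration is close to usable as-is; the settings below are the ones worth checking. The file path and every value here are assumptions for a typical Fedora-style layout (e.g. /etc/php-fpm.d/www.conf), not tested recommendations:

```ini
; modest process limits are plenty for a single-user test farm (illustrative values)
pm = dynamic
pm.max_children = 10
pm.start_servers = 3
pm.min_spare_servers = 2
pm.max_spare_servers = 4
; apache will proxy php requests to this address (a unix socket also works)
listen = 127.0.0.1:9000
```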
Web server configuration
I use /var/www/html/wikifarm/w as the document root. Sample configuration for wikis in the testing farm:
UseCanonicalName Off
Define WIKIFARMCONF "/etc/httpd/conf/localconf/wikifarm.conf"
Define WIKIFARMDIR "/var/www/html/wikifarm/w"

<VirtualHost *:80>
    Define VHOSTDOMAIN "wikipedia.test"
    ServerName "testen.${VHOSTDOMAIN}"
    DocumentRoot "${WIKIFARMDIR}/testenwiki"
    <Directory "${WIKIFARMDIR}/testenwiki">
        Include "${WIKIFARMCONF}"
    </Directory>
</VirtualHost>

<VirtualHost *:80>
    Define VHOSTDOMAIN "wikipedia.test"
    ServerName "test2en.${VHOSTDOMAIN}"
    DocumentRoot "${WIKIFARMDIR}/test2enwiki"
    <Directory "${WIKIFARMDIR}/test2enwiki">
        Include "${WIKIFARMCONF}"
    </Directory>
</VirtualHost>
and so on, where /etc/httpd/conf/localconf/wikifarm.conf contains
# options for every virtual wikifarm host
Options Indexes FollowSymLinks
LogLevel trace6
RewriteEngine On
# map a bunch of things to the index.php entry point for convenience
RewriteRule ^$ /mw/index.php
RewriteRule ^/$ /mw/index.php
RewriteRule ^/mw$ /mw/index.php
# map bare requests to index.php to the main page for convenience
RewriteRule ^/mw/index.php$ /mw/index.php?title=Main_page [L]
# forbid direct web access to php files in extensions
RewriteRule ^/mw/extensions/.*\.php - [F,L]
MediaWiki source
Make a directory tree that your web server will be able to access. I chose /var/www/html/wikifarm/mediawiki
because my linux distribution has a default DocumentRoot at /var/www/html
, and I have multiple local web sites for testing apart from the wikifarm.
You can use the same directory for your MediaWiki installation for all wikis in your wikifarm. While Wikimedia production has one or two specific branches of MediaWiki deployed at any given time, for testing patches you will likely want to be running against the most recently updated code, in the master branch. And if you want to be able to submit those patches for review to gerrit, our code review system, you will want to have created a gerrit username and uploaded ssh keys for use there. See the gerrit guide on Wikitech for more details. I have multiple copies of MediaWiki in separate directories so that I can run multiple versions simultaneously, but even if you do not, I recommend that you choose a checkout directory name that identifies the version of MediaWiki you will be running. For example,
git clone ssh://your-gerrit-username-here@gerrit.wikimedia.org:29418/mediawiki/core.git php-master
Once this completes, you will need to clone a minimal group of extensions for base functionality. The following extensions need virtually no configuration and are either convenient to have for testing or must-haves:
- Cite
- ConfirmEdit
- Interwiki
- ParserFunctions
- TemplateData
- TocTree
- UniversalLanguageSelector
- WikiEditor
- WikimediaMaintenance
You can clone each one of them by entering the extensions directory in your MediaWiki core repo and running
git clone ssh://your-gerrit-username-here@gerrit.wikimedia.org:29418/mediawiki/extensions/ExtensionNameHere.git
Note that you will want to update these extensions from time to time via git pull, since they are not pinned to any specific version of MediaWiki core, unlike the MediaWiki branches we run in production.
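Rather than typing the clone command nine times, the extension list above can be driven from a small loop. The sketch below only prints the commands, so you can eyeball the URLs first; pipe the output to sh (or drop the echo) to run them for real:

```shell
# Generate the clone commands for the base extension set; review the output,
# then pipe it to "sh" to actually clone.
GERRIT_USER="your-gerrit-username-here"
clone_cmds() {
    for ext in Cite ConfirmEdit Interwiki ParserFunctions TemplateData \
               TocTree UniversalLanguageSelector WikiEditor WikimediaMaintenance; do
        echo git clone "ssh://${GERRIT_USER}@gerrit.wikimedia.org:29418/mediawiki/extensions/${ext}.git"
    done
}
clone_cmds
```

The same loop with `git -C "$ext" pull` in place of the clone works for the periodic updates mentioned above.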
Various MediaWiki vendor dependencies are managed by composer. Make sure you have composer installed on your system.
You'll want composer to manage dependencies not just for MediaWiki but for all of its extensions. To do that, copy the file composer.local.json-sample in your MediaWiki core repository to composer.local.json, run composer require wikimedia/composer-merge-plugin and then composer update.
See [1] for more information.
Databases
Each wiki in your wikifarm should have a separate database with a separate group of database tables. For full testing of so-called "edge login" (autologin from one wiki family to another), as well as autologin on a wiki which is not the so-called "representative wiki" for that wiki family, and a separate wiki where central logins take place, you will need at least 5 wikis. I have:
- testen.wikipedia.test (wiki family "wikipedia")
- test2en.wikipedia.test (wiki family "wikipedia")
- meta.wikipedia.test (global stewards)
- login.wikipedia.test (central logins)
- testen.wiktionary.test (wiki family "wiktionary")
- test2en.wiktionary.test (wiki family "wiktionary")
and a few others. You can choose any hostnames you like, but you may find it convenient to have a consistent naming convention for these. In my case, that is "test", then "en" since all of these wikis use en-US as their default language, followed by the wiki project name ("wikipedia" or "wiktionary") and finally the top level domain. I recommend using the .test TLD, since you can map local IP addresses to it, and you can also enable HTTPS should you wish.
Once you have decided on hostnames, you'll need to make sure they all can be resolved to 127.0.0.1 and to ::1. Typically this means adding entries in /etc/hosts, but this depends on your linux distribution.
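For the hostnames used in this walkthrough, the /etc/hosts entries would look something like the following (adjust if you chose different names):

```
127.0.0.1  testen.wikipedia.test test2en.wikipedia.test meta.wikipedia.test login.wikipedia.test testen.wiktionary.test test2en.wiktionary.test
::1        testen.wikipedia.test test2en.wikipedia.test meta.wikipedia.test login.wikipedia.test testen.wiktionary.test test2en.wiktionary.test
```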
Now that you have your list of hostnames for wikis, you'll need a database name to go with each one. These can be anything you like; I have the names
- testenwiki (for testen.wikipedia.test)
- test2enwiki (for test2en.wikipedia.test)
- metawiki (for meta.wikipedia.test)
- loginwiki (for login.wikipedia.test)
- testenwikt (for testen.wiktionary.test)
- test2enwikt (for test2en.wiktionary.test)
again using a consistent naming scheme so I can easily keep track of which db name goes with which wiki.
You will now need to create these databases, make an initial user, and add grants for that user. For each wiki database, I originally created separate regular and admin users, but this created extra work when it came to CentralAuth functionality.
Here is a sample stanza for two wiki databases testenwiki and test2enwiki:
CREATE DATABASE testenwiki;
CREATE DATABASE test2enwiki;
CREATE USER 'wiki_admin'@'localhost' IDENTIFIED BY '<password-for-user-here>';
GRANT ALL PRIVILEGES ON testenwiki.* TO 'wiki_admin'@'localhost';
GRANT ALL PRIVILEGES ON test2enwiki.* TO 'wiki_admin'@'localhost';
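Rather than writing the stanza out six times, a short loop can emit the statements for every database in the farm; review the output and then pipe it into the mysql client (the password placeholder is, as above, yours to fill in):

```shell
# Emit CREATE DATABASE and GRANT statements for every wiki db in the farm;
# review the output, then feed it to: mysql -u root -p
gen_db_sql() {
    echo "CREATE USER 'wiki_admin'@'localhost' IDENTIFIED BY '<password-for-user-here>';"
    for db in testenwiki test2enwiki metawiki loginwiki testenwikt test2enwikt; do
        echo "CREATE DATABASE ${db};"
        echo "GRANT ALL PRIVILEGES ON ${db}.* TO 'wiki_admin'@'localhost';"
    done
}
gen_db_sql
```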
Installation of each wiki
Now that the databases, database users and grants are set up for each wiki, you'll need to do the installation of each wiki via the web browser. You don't need CentralAuth to be set up for this part.
(Add notes about this)
After you've done this and saved LocalSettings.php to a directory someplace, you are ready to proceed with configuration.
MediaWiki Configuration
Next you'll need a LocalSettings.php file, which should be the same for all wikis in the farm; it includes a php file that has stanzas that vary depending on the wiki database name. Settings that are common to all wikis in the farm can be in a second file, also included from LocalSettings.php. WMF production uses CommonSettings.php and InitialiseSettings.php for these two files, and I recommend you do something similar.
I find it convenient to have LocalSettings.php be a symbolic link to a file of the same name in a configs directory, where CommonSettings.php and InitialiseSettings.php and other configuration files live.
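The layout can be sketched like this, using a scratch directory so nothing real is touched (substitute the actual wikifarm paths in practice):

```shell
# Demonstrate the symlink layout: LocalSettings.php in the MediaWiki checkout
# points at the real file in the shared configs directory.
scratch=$(mktemp -d)
mkdir -p "$scratch/configs" "$scratch/mediawiki/php-master"
echo '<?php # farm settings live here' > "$scratch/configs/LocalSettings.php"
# relative link: from php-master, ../../configs is the shared configs directory
ln -s ../../configs/LocalSettings.php "$scratch/mediawiki/php-master/LocalSettings.php"
# the link resolves to the shared config file
cat "$scratch/mediawiki/php-master/LocalSettings.php"
```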
My main LocalSettings content, with /var/www/html/wikifarm/configs containing both CommonSettings.php and InitialiseSettings.php:
<?php
$farmpath = '/var/www/html/wikifarm';
require $farmpath . "/configs/CommonSettings.php";
$wgShowExceptionDetails = true;
My main CommonSettings.php content:
<?php
# settings and configuration go here that are the same for all wikis in your farm
# error_reporting( -1 );
ini_set( 'display_errors', 1 );
if ( !defined( 'MEDIAWIKI' ) ) {
    exit;
}

$wgServer = "http://localhost";

# this file contains an array $wgDbMapping with keys being the string that appears
# in the hostname for the wiki and values being the wiki database name
require __DIR__ . "/db_mapping.php";

$wgDBname = ''; # NOT ANY MORE or only sometimes

# get the wiki dbname by mapping the first subdir
# of the url to the dbname, if we have a url.
$subdir = '';
# paths look like /wikinamehere/mw/index.php?...
if (isset($_SERVER['REQUEST_URI'])) {
    $subdir = explode('/', $_SERVER['REQUEST_URI'])[1];
    $templog = fopen('/var/www/html/wikifarm/logs/templog.txt', 'a');
    $temprequri = $_SERVER['REQUEST_URI'];
    fwrite($templog, "CommonSettings: (old approach) subdir is $subdir and requesturi is $temprequri\n");
    fclose($templog);
    $project = "none";
}
# subdir is the first part of the fqdn for the given project (where project means
# wikipedia, wiktionary and so on)
# example fqdn: testen.wikipedia.test
if (isset($_SERVER['SERVER_NAME'])) {
    $subdir = explode('.', $_SERVER['SERVER_NAME'])[0];
    $project = explode('.', $_SERVER['SERVER_NAME'])[1];
    if ( $subdir == '' || $project == '' ) {
        header( "HTTP/1.1 500 Bad Request, unknown wiki <$subdir> and <$project>" );
        exit(1);
    }
    if ( $subdir != '' ) {
        $wgDBname = $wgDbMapping[$project][$subdir];
        $templog = fopen('/var/www/html/wikifarm/logs/templog.txt', 'a');
        fwrite($templog, "CommonSettings: (new approach) subdir is $subdir and wgDBname is $wgDBname\n");
        fclose($templog);
    }
} else {
    $wgDBname = $wgDbMapping['none'][$subdir];
}

# we've been called from the cli, get the dbname out of $argv
if (!$wgDBname) {
    # code lightly adapted from MWMultiversion.php
    # The --wiki param must be the second argument, to avoid
    # any "options with args" ambiguity (see Maintenance.php).
    $index = 1;
    if ( isset( $argv[$index] ) && $argv[$index] === '--wiki' ) {
        // "script.php --wiki dbname"
        $wgDBname = isset( $argv[$index+1] ) ? $argv[$index+1] : '';
        $templog = fopen('/var/www/html/wikifarm/logs/templog.txt', 'a');
        fwrite($templog, "CommonSettings: cli (1) and wgDBname is $wgDBname\n");
        fclose($templog);
    } elseif ( isset( $argv[$index] ) && substr( $argv[$index], 0, 7 ) === '--wiki=' ) {
        // "script.php --wiki=dbname"
        $wgDBname = substr( $argv[$index], 7 );
        $templog = fopen('/var/www/html/wikifarm/logs/templog.txt', 'a');
        fwrite($templog, "CommonSettings: cli (2) and wgDBname is $wgDBname\n");
        fclose($templog);
    } else {
        $envwiki = getenv('MW_PHPUNIT_WIKI');
        if ( $envwiki !== false ) {
            $wgDBname = $envwiki;
        }
    }
    if ( $wgDBname === '' ) {
        echo "Missing dbname. Why? wiki component is $subdir\n";
        echo "Usage: php scriptName.php --wiki=dbname\n";
        exit(1);
    }
}

# we don't do email ever.
$wgEnableEmail = false;
$wgEnableUserEmail = false; # UPO
$wgEmergencyContact = "apache@wikifarm.invalid";
$wgPasswordSender = "apache@wikifarm.invalid";
$wgEnotifUserTalk = false; # UPO
$wgEnotifWatchlist = false; # UPO
$wgEmailAuthentication = true;

## Database settings
$wgDBtype = "mysql";
$wgDBserver = "localhost";
# MySQL specific settings
$wgDBprefix = "";
# MySQL table options to use during installation or update
$wgDBTableOptions = "ENGINE=InnoDB, DEFAULT CHARSET=binary";

## Shared memory settings
$wgMainCacheType = CACHE_MEMCACHED;
$wgParserCacheType = CACHE_MEMCACHED;
$wgMessageCacheType = CACHE_MEMCACHED;
$wgMemCachedServers = [ '127.0.0.1:11211' ];

## To enable image uploads, make sure the 'images' directory
## is writable, then set this to true:
$wgUploadDirectory = __DIR__ . '/../w/images';
$wgUploadPath = '/images';
$wgEnableUploads = true;
$wgUseImageMagick = true;
$wgImageMagickConvertCommand = "/usr/bin/convert";
$wgFileExtensions[] = 'svg';
$wgSVGConverter = 'ImageMagick';

# InstantCommons allows wiki to use images from https://commons.wikimedia.org
$wgUseInstantCommons = false;

# Periodically send a pingback to https://www.mediawiki.org/ with basic data
# about this MediaWiki instance. The Wikimedia Foundation shares this data
# with MediaWiki developers to help guide future development efforts.
$wgPingback = false;

## If you use ImageMagick (or any other shell command) on a
## Linux server, this will need to be set to the name of an
## available UTF-8 locale. This should ideally be set to an English
## language locale so that the behaviour of C library functions will
## be consistent with typical installations. Use $wgLanguageCode to
## localise the wiki.
$wgShellLocale = "en_US.utf8";

## For attaching licensing metadata to pages, and displaying an
## appropriate copyright notice / icon. GNU Free Documentation
## License and Creative Commons licenses are supported so far.
$wgRightsPage = ""; # Set to the title of a wiki page that describes your license/copyright
$wgRightsUrl = "https://creativecommons.org/licenses/by-sa/4.0/";
$wgRightsText = "Creative Commons Attribution-ShareAlike";
$wgRightsIcon = "$wgResourceBasePath/resources/assets/licenses/cc-by-sa.png";

# Path to the GNU diff3 utility. Used for conflict resolution.
$wgDiff3 = "/usr/bin/diff3";

## Default skin: you can change the default skin. Use the internal symbolic
## names, ie 'vector', 'monobook':
$wgDefaultSkin = "vector";

# Enabled skins.
wfLoadSkin( 'Vector' );

wfLoadExtension( 'Cite' );
wfLoadExtension( 'ConfirmEdit' );
wfLoadExtension( 'Interwiki' );
wfLoadExtension( 'ParserFunctions' );
wfLoadExtension( 'TemplateData' );
wfLoadExtension( 'TocTree' );
wfLoadExtension( 'UniversalLanguageSelector' );
wfLoadExtension( 'WikiEditor' );
wfLoadExtension( 'WikimediaMaintenance' );

# this setting is so that revision text will be compressed before being stored in
# the database, useful if you plan on any large scale testing of content.
$wgCompressRevisions = true;

# This gives you full control over when jobs are run; they must be started from
# the command line.
$wgJobRunRate = 0;

# if you have different virtual hostnames, adjust these accordingly
$wgLocalVirtualHosts = [
    'testen.wikipedia.test',
    'test2en.wikipedia.test',
    'login.wikipedia.test',
    'meta.wikipedia.test',
    'testen.wiktionary.test',
    'test2en.wiktionary.test',
];

$wgDBuser = "wiki_admin";
$wgDBpassword = "<db-admin-password-here>";

# my logging setup, using MediaWiki\Logger\MonologSpi; set up whatever logging you like
require __DIR__ . "/logging.php";
# a selection of things from includes/DevelopmentSettings.php, choose the ones that are useful for you
require __DIR__ . "/ATGDevSettings.php";
# all settings that vary per wiki
require __DIR__ . "/InitialiseSettings.php";
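CommonSettings.php above requires a db_mapping.php that is not reproduced in this draft. Its shape is implied by the lookups ($wgDbMapping[$project][$subdir]), so a hedged sketch, using only the hostnames and database names from this walkthrough, would be:

```php
<?php
# sketch only: map project name, then hostname prefix, to wiki database name
$wgDbMapping = [
    'wikipedia' => [
        'testen' => 'testenwiki',
        'test2en' => 'test2enwiki',
        'meta' => 'metawiki',
        'login' => 'loginwiki',
    ],
    'wiktionary' => [
        'testen' => 'testenwikt',
        'test2en' => 'test2enwikt',
    ],
];
```

The 'none' key used by the REQUEST_URI fallback path would need entries of its own if you rely on that code path.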
My main InitialiseSettings.php content:
<?php
# per-wiki settings go here.
$basedir = "/";
$wgDebugLogFile = __DIR__ . "/../logs/$wgDBname/debugging.log";

switch ($wgDBname) {
    # all wikis for CentralAuth testing
    case 'testenwiki':
    case 'test2enwiki':
    case 'metawiki':
    case 'loginwiki':
    case 'testenwikt':
    case 'test2enwikt':
        putenv( 'MW_INSTALL_PATH=' . '/var/www/html/wikifarm/mediawiki/php-master' );
        if ($wgDBname == 'testenwiki') {
            $wgServer = "http://testen.wikipedia.test";
            $wgSitename = "TestEnWiki";
            # edit for your installation
            $wgSecretKey = "<put-some-key-here>";
            # Changing this will log out all existing sessions.
            $wgAuthenticationTokenVersion = "1";
            # Site upgrade key. Must be set to a string (default provided) to turn on the
            # web installer while LocalSettings.php is in place
            # edit for your installation
            $wgUpgradeKey = "<put-another-key-here>";
        } else if ($wgDBname == 'test2enwiki') {
            ... etc
        }
        # common to all wikis for central auth testing
        $wgDebugDumpSql = true;
        $wgScriptPath = $basedir . "mw";
        $wgResourceBasePath = $wgScriptPath;
        # put your own logo someplace and link to it here
        $wgLogos = [
            '1x' => "$wgResourceBasePath/resources/assets/Goatification_logo.svg",
            'icon' => "$wgResourceBasePath/resources/assets/Goatification_logo.svg.png"
        ];
        $wgEnotifUserTalk = false; # UPO
        $wgEnotifWatchlist = false; # UPO
        $wgEmailAuthentication = true;
        $wgDBssl = false;
        $wgLanguageCode = "en";
        # Time zone
        $wgLocaltimezone = "UTC";
        ## Set $wgCacheDirectory to a writable directory on the web server
        # this is for localization cache stuff, lives on disk
        $wgCacheDirectory = "$IP/cache/$wgDBname";
        # this is new
        $wgDefaultSkin = "vector-2022";
        break;
}
Setup of CentralAuth
Getting the source
Clone the extension via
git clone "ssh://gerrit-username-here@gerrit.wikimedia.org:29418/mediawiki/extensions/CentralAuth"
If you don't have a gerrit username, clone via
git clone "https://gerrit.wikimedia.org/r/mediawiki/extensions/CentralAuth"
instead.
See [2] for more options.
Copy the repository into the extensions subdirectory of your mediawiki repository copy, for example php-master/extensions/. Make sure that the ownership and permissions of the directory are suitable for web service.
You'll then want to run composer update from the mediawiki repository.
Configuration
I have my CentralAuth settings split up into two files: central_auth_wgconf_settings.php, which contains changes to the global configuration variable $wgConf, and centralauth_settings.php, which has the content
<?php
# the SameSite: None setting allows cookies to be marked for cross-site use
# (but in theory, $wgForceHTTPS should be set to true for this to be effective).
# see https://www.mediawiki.org/wiki/Manual:SameSite_cookies for more details
$wgCookieSameSite = 'None';

# instead of specifying the name of the centralauth database directly
# in a global config variable, assign this to an element in an array
# that can be looked up under the key 'virtual-centralauth';
# this enables schema updates to run automatically at installation
# time, or via the maintenance script update.php.
# see https://www.mediawiki.org/wiki/Manual:$wgVirtualDomainsMapping
# for more information
$wgVirtualDomainsMapping['virtual-centralauth'] = [ 'db' => 'centralauth' ];
# deprecated
$wgCentralAuthDatabase = 'centralauth';

# these are the default settings, which can be changed if testing of
# these features is needed
#
# local accounts will get attached to an existing global user entry
# at local wiki login, if their password matches the global user password
$wgCentralAuthAutoMigrate = true;
# if an account is only local with no entry in the central auth tables,
# do not try to create a global user and attach the local account to it
# during a local wiki login
$wgCentralAuthAutoMigrateNonGlobalAccounts = false;
# allow login of unattached accounts, i.e. accounts that exist only
# on the local wiki and have no presence in the centralauth tables
$wgCentralAuthStrict = false;
# allow Special:MergeAccount to function (probably irrelevant for a development
# environment, but you never know when you might need to test this)
$wgCentralAuthDryRun = false;

# these are settings you should check and adjust if needed;
# without a correct setup, automatic login will not work properly
#
# allow automatic login for wikis in a common wiki family (e.g. "wikipedia"
# or "wiktionary"); this entails the use of a global session cookie
$wgCentralAuthCookies = true;
# name of the wiki database where central login occurs, after a successful login on
# the local wiki
$wgCentralAuthLoginWiki = 'loginwiki';
# global session cookies will be set for this domain rather than just for the
# current wiki, enabling autologin for wikis in the same family
$wgCentralAuthCookieDomain = '';
if ( in_array( $wgDBname, [ 'testenwiki', 'test2enwiki', 'metawiki' ] ) ) {
    $wgCentralAuthCookieDomain = '.wikipedia.test';
} elseif ( in_array( $wgDBname, [ 'testenwikt', 'test2enwikt' ] ) ) {
    $wgCentralAuthCookieDomain = '.wiktionary.test';
}
# global session cookie names will begin with this string
$wgCentralAuthCookiePrefix = 'centralauth_';
# global session cookies will be restricted to this path only
$wgCentralAuthCookiePath = '/';
# when a user locally logs in on a wiki, the user will be logged in on
# a specific named wiki in each wiki family, with session cookies set
# for each domain, allowing transparent autologin across domains
$wgCentralAuthAutoLoginWikis = [
    '*.wikipedia.test' => 'metawiki',
    '*.wiktionary.test' => 'testenwikt',
];
# this is for autocreation of local accounts when a global one is created
# and it should be the same as the above list, so that local login on
# these specific named wikis can happen
$wgCentralAuthAutoCreateWikis = $wgCentralAuthAutoLoginWikis;

# these settings are mostly for things we don't care about
# much/at all in a testing environment
#
# it wants a 20x20 icon, I have none, so we just skip.
$wgCentralAuthLoginIcon = false;
# this has a default with a lot lot lot of user prefs, skip
#$wgCentralAuthPrefsForUIReload = [];
# This is for recent changes appearing in irc channels. skip
# $wgCentralAuthRC = [];
# number of wikis on which we will suppress a user at a time; something small
# for a test environment
$wgCentralAuthWikisPerSuppressJob = 4;
# set the centralauth db to read only mode. Nope.
$wgCentralAuthReadOnly = false;
# allow global renaming. only if you're testing that; we aren't.
$wgCentralAuthEnableGlobalRenameRequest = false;
# you can set a password policy here for things like minimum password length
# but we don't care for testing
$wgCentralAuthGlobalPasswordPolicies = [];
# full url to a wiki page containing regexes; matching usernames cannot request
# global renames. we absolutely do not care
$wgGlobalRenameDenylist = null;
# a user who globally suppresses a user (so suppression on all local wikis in the farm)
# will show up as global>username-here in the logs, to indicate that this was via
# a nonlocal user
$wgCentralAuthGlobalBlockInterwikiPrefix = "global";
My central_auth_wgconf_settings.php file content:
<?php
# this has the wgconf settings for central auth so we can just include it
# in the loginwiki stanza in InitialiseSettings

# this is probably ignored, since we never set $wgLocalHTTPProxy, so that is false
# by default
# see https://www.mediawiki.org/wiki/Manual:$wgLocalVirtualHosts for more info
# $wgConf->localVirtualHosts = [ 'localhost' ];

# list of wiki databases in the wiki farm
$wgLocalDatabases = [
    'testenwiki',
    'test2enwiki',
    'metawiki',
    'loginwiki',
    'testenwikt',
    'test2enwikt'
];
# must be the same as the list in $wgLocalDatabases
$wgConf->wikis = $wgLocalDatabases;

# we don't use $wgConf->siteFromDB() but let's set this nicely anyways;
# a suffix is the name of a wiki family like wikipedia, wiktionary and so on.
# the wiki database name is presumed to be <language code><project name>
# but we have weird prefixes like "test2en" and so on. Anyways...
# project names should go into this list; any project name that maps to something
# else in the db name (as in our case, wikipedia project dbs end in "wiki") should have
# that name as the value, as in the below.
$wgConf->suffixes = [ 'wikipedia' => 'wiki', 'wiktionary' => 'wikt' ];

# see https://www.mediawiki.org/wiki/Manual:$wgServer for more info
# map of wiki database names to their base url; relative urls are allowed but
# we won't bother
$wgConf->settings['wgServer'] = [
    'default' => 'http://localhost',
    'testenwiki' => 'http://testen.wikipedia.test',
    'test2enwiki' => 'http://test2en.wikipedia.test',
    'loginwiki' => 'http://login.wikipedia.test',
    'metawiki' => 'http://meta.wikipedia.test',
    'testenwikt' => 'http://testen.wiktionary.test',
    'test2enwikt' => 'http://test2en.wiktionary.test',
];
# $wgCanonicalServer should be the same as $wgServer, in the case
# where $wgServer is not a relative url
$wgConf->settings['wgCanonicalServer'] = $wgConf->settings['wgServer'];
# base url to get to any given page; none of ours use fancy short urls
# see https://www.mediawiki.org/wiki/Manual:$wgArticlePath
$wgConf->settings['wgArticlePath'] = [
    'default' => "/mw/index.php?title=$1",
];
Migration of existing accounts
For each wiki in the wiki farm, after installation, run the CentralAuth migration scripts. Be in the top level directory of the mediawiki repository, and run
php extensions/CentralAuth/maintenance/migratePass0.php --wiki=<wiki-db-name-here>
for all wiki db names, and then
php extensions/CentralAuth/maintenance/migratePass1.php --wiki=<wiki-db-name-here>
for all wiki db names.
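With six wikis that is twelve invocations, so a loop helps; note the ordering above, pass 0 for every wiki before pass 1 for any of them. The sketch below only prints the commands, so you can check them before piping the output to sh (or dropping the echo):

```shell
# Emit the migration commands in the correct order: all of pass 0, then all of pass 1.
WIKIS="testenwiki test2enwiki metawiki loginwiki testenwikt test2enwikt"
migrate_cmds() {
    for pass in 0 1; do
        for db in $WIKIS; do
            echo php "extensions/CentralAuth/maintenance/migratePass${pass}.php" --wiki="$db"
        done
    done
}
migrate_cmds
```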