Installing mediawiki

From Newroco Tech Docs
Latest revision as of 07:42, 23 November 2021

MediaWiki install notes for Ubuntu 16.04; needs tidying and formatting

Basics

(For upgrading an existing installation, see the official page: https://www.mediawiki.org/wiki/Manual:Upgrading)

apt-get install screen software-properties-common

Enable universe (for some of the php libraries)

add-apt-repository universe && apt-get update

Install base software

apt-get install apache2 mysql-server mysql-client php php-mysql php-gd php-curl php-intl php-json imagemagick unzip libapache2-mod-php php-xml php-mbstring

Configure a password for mysql root and store securely.

Extended search

If you want extended search capability in your wiki, also install Sphinx Search (check whether SphinxSearch is still supported for the MediaWiki version you are installing; if not, skip this step)

apt-get install sphinxsearch

edit php.ini to set longer timeouts, in line with a working day

vi /etc/php/7.0/apache2/php.ini

and set session.gc_maxlifetime to 43200
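The php.ini edit can be scripted; a minimal sketch (the sed pattern is an assumption about your php.ini layout, and a demo file is used here so the sketch is safe to run anywhere; point INI at /etc/php/7.0/apache2/php.ini on the server itself):

```shell
# Demo copy; on the server use INI=/etc/php/7.0/apache2/php.ini
INI=/tmp/php.ini.demo
printf 'session.gc_maxlifetime = 1440\n' > "$INI"
# Raise the session timeout to 43200 seconds (12 hours)
sed -i 's/^;\?session\.gc_maxlifetime *=.*/session.gc_maxlifetime = 43200/' "$INI"
grep 'session.gc_maxlifetime' "$INI"   # prints: session.gc_maxlifetime = 43200
```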

If making a VM to be a slave, jump to replication section now

Testing DB

If you want to test connection to your MySQL DB try:

vi /var/www/html/phpinfo.php

<?php phpinfo(); ?>

vi /var/www/html/phpmysql.php

add:

<?php
        // Note: mysqli_* is used here; the old mysql_* API was removed in PHP 7
        $con = mysqli_connect("localhost","root","yourMySQLrootPassword");
        if (!$con)
        {
         die('Could not connect: ' . mysqli_connect_error());
        }
        else
        {
         echo "Congrats! connection established successfully";
        }
        mysqli_close($con);
        ?>

Create DB and user

mysql -u root -p

CREATE USER 'username'@'localhost' IDENTIFIED WITH mysql_native_password BY 'password';
CREATE DATABASE dbname;
GRANT ALL PRIVILEGES ON dbname.* TO 'username'@'localhost';
FLUSH PRIVILEGES;

Installing mediawiki files

Download latest version of mediawiki (1.28 at time of writing) from https://www.mediawiki.org/wiki/Download

wget <link>

Then untar it

tar -xzf mediawiki-<version>.tar.gz

Assuming the server will only be serving a wiki, copy the contents of the expanded archive to the web root (otherwise move the whole directory)

cp -r mediawiki-<version>/* /var/www/html/

Remove Ubuntu's default index page

rm /var/www/html/index.html

Configuration and setup

Visit the server via a web browser and follow the installation steps. Ignore the warning regarding cache: it is built in to PHP 5.5 and later and will work regardless. Leave defaults if it is not clear which option to use.

Configure admin user/password to suit

Configure the user email to suit (maybe should be a systems one or a wikiadmin group, depending on IS team setup)

Follow steps in the later configuration pages as desired, setting the default license to CC-BY-SA (Creative Commons Attribution-ShareAlike)

Copy the generated LocalSettings.php to /var/www/html

If wanted, place copy of organisation logo (square) in /var/www/html, then

vi LocalSettings.php

and set

$wgLogo="$wgResourceBasePath/<logoname>.png"

$wgServer = "https://<whatever the FQDN will be>" 

(can use something temporary if this will be a migration and change once the DB and image content is migrated over)

CAS (SSO) Authentication

Using the Auth remoteuser extension: https://www.mediawiki.org/wiki/Extension:Auth_remoteuser

Download the extension by selecting the MediaWiki version you need: https://www.mediawiki.org/wiki/Special:ExtensionDistributor/Auth_remoteuser and place it in /var/www/html/extensions

wget <plugin-version-link>
tar -xzf Auth_remoteuser-REL1_35-6f570b8.tar.gz
mv Auth_remoteuser /var/www/html/extensions/

Add these lines to /var/www/html/LocalSettings.php

wfLoadExtension( 'Auth_remoteuser' );
$wgAuthRemoteuserUserUrls = [
    'logout' => $wgServer.'/logout.php'
];
$wgGroupPermissions['*']['autocreateaccount'] = true;

Create file /var/www/html/logout.php

<?php
if (isset($_SERVER['HTTP_COOKIE'])) {
    $cookies = explode(';', $_SERVER['HTTP_COOKIE']);
    foreach($cookies as $cookie) {
        $parts = explode('=', $cookie);
        $name = trim($parts[0]);
        setcookie($name, '', 1);
        setcookie($name, '', 1, '/');
    }
}

header('Location: https://cas.domain.com/cas/logout');
?>

Install the apache2 mod auth cas package and enable it. Note: it is normal for the apache2 service to fail after installing the package, because you have not configured the vhost yet.

apt-get install libapache2-mod-auth-cas
a2enmod auth_cas

Add these lines to the apache2 vhost. Note: it is important to set the ServerName even if it is the only vhost on the server

        ServerName wiki.domain.com

        CASVersion 2
        CASLoginURL https://cas.domain.com/cas/login
        CASValidateURL https://cas.domain.com/cas/serviceValidate
        CASTimeout 43200

        <Location "/">
                AuthType CAS
                AuthName "CAS Authentication"
                require valid-user
        </Location>

Restart apache2

systemctl restart apache2

CAS Auth (Deprecated)

Using https://github.com/CWRUChielLab/CASAuth for CAS

Create folder $WIKI/extensions/CASAuth/

mkdir /var/www/html/extensions/CASAuth/
cd /var/www/html/extensions/CASAuth/

Download CASAuth and phpCAS

wget https://github.com/konfuzed/CASAuth/archive/master.zip
wget https://github.com/apereo/phpCAS/archive/master.zip

(latest version of phpCAS can be found here: https://wiki.jasig.org/display/CASC/phpCAS)

Extract and install

tar -xzf CAS-1.3.5.tgz
mv phpCAS-master/ CAS
unzip master.zip
mv CASAuth-master/* ./
cp CASAuthSettings.php.template CASAuthSettings.php

Edit file CASAuthSettings.php

$CASAuth["Server"]="cas.example.com";
$CASAuth["Url"]="/cas/";    (or the path that cas can be found at)

And replace the other example.com config lines with oxfordarchaeology.com

$CASAuth["CreateAccounts"]=true;     (set to false by default)

And would normally set

$CASAuth["RestrictUsers"]=false;

Although you may want to set this to true and enable individual users in some cases, for example where the content should be readable by anyone but only editable by some.
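For that restricted case, the relevant CASAuthSettings.php lines would look something like this (the AllowedUsers key name is taken from the CASAuthSettings.php.template shipped with CASAuth, so verify it against your copy; the usernames are hypothetical):

```php
// Only let listed CAS users log in; everyone else is rejected
$CASAuth["RestrictUsers"]=true;
// Hypothetical CAS usernames allowed to log in
$CASAuth["AllowedUsers"]=array("jsmith", "adoe");
```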

Add following line to LocalSettings.php

require_once( "$IP/extensions/CASAuth/CASAuth.php" );

If you get the wiki login page after you log in through CAS, add this to the apache vhost, most probably in /etc/apache2/sites-enabled/000-default.conf

RedirectMatch ^/$ https://<domain>/index.php/Main_Page

Enabling SphinxSearch (Deprecated)

Create directory under $wiki/extensions/ and download the extension to it

mkdir SphinxSearch
cd SphinxSearch

Download

wget https://github.com/wikimedia/mediawiki-extensions-SphinxSearch/archive/master.tar.gz

Extract the resulting file, then delete it.

tar -xzf master.tar.gz
rm master.tar.gz

move the resulting sphinx.conf to a place where it will be readable

cp sphinx.conf /etc/sphinxsearch/

Edit sphinx.conf to suit e.g. database name, login details, log file location and create a data directory for sphinx to create indexes in as per the conf file entries. Ensure you change ownership of the data directory to sphinxsearch:sphinxsearch
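The parts of sphinx.conf to adjust look roughly like this (a hedged excerpt; the exact section and key names come from the sphinx.conf shipped with the extension, and the credentials and paths here are placeholders):

```
source wiki_main
{
    type     = mysql
    sql_host = localhost
    sql_user = wikiuser
    sql_pass = password
    sql_db   = wikidb
}

index wiki_main
{
    path = /var/data/sphinx/wiki_main
}

searchd
{
    log = /var/log/sphinx/searchd.log
}
```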

mkdir -p /var/data/sphinx
chown sphinxsearch:sphinxsearch /var/data/sphinx
mkdir /var/log/sphinx
chown sphinxsearch:sphinxsearch /var/log/sphinx

Edit /etc/default/sphinxsearch

START=yes

Run the indexer for the first time:

indexer --config /etc/sphinxsearch/sphinx.conf --all

Then start sphinxsearch

To keep it updated, create a file in cron.daily and add

indexer --quiet --config /etc/sphinxsearch/sphinx.conf wiki_main --rotate >/dev/null  2>&1
indexer --quiet --config /etc/sphinxsearch/sphinx.conf wiki_incremental --rotate >/dev/null 2>&1

And if your wiki is active, also create a file in cron.hourly

indexer --quiet --config /etc/sphinxsearch/sphinx.conf wiki_incremental --rotate >/dev/null 2>&1

Download a copy of sphinxapi.php and place in extensions/SphinxSearch/

https://raw.githubusercontent.com/romainneutron/Sphinx-Search-API-PHP-Client/master/sphinxapi.php

In the file LocalSettings.php append the following lines:

$wgSearchType = 'SphinxMWSearch';
require_once "$IP/extensions/SphinxSearch/SphinxSearch.php";

Enabling CirrusSearch

The CirrusSearch extension implements searching for MediaWiki using Elasticsearch.

Dependencies

1. Elasticsearch

You will need to install Elasticsearch

   * MediaWiki 1.29.x and 1.30.x require Elasticsearch 5.3.x or 5.4.x.
   * MediaWiki 1.31.x and 1.32.x require Elasticsearch 5.5.x or 5.6.x.
   * MediaWiki 1.33.x, 1.34.x and 1.35.x require Elasticsearch 6.5.x (6.5.4 recommended).

Take note that a Java installation like OpenJDK is needed in addition.

sudo apt -y install apt-transport-https
wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -
echo "deb https://artifacts.elastic.co/packages/oss-6.x/apt stable main" | sudo tee  /etc/apt/sources.list.d/elastic-6.x.list
sudo apt update
sudo apt install apt-transport-https default-jdk default-jre
sudo apt-get update && sudo apt-get install elasticsearch-oss

2. After you have modified the configuration, you can start Elasticsearch:

sudo systemctl daemon-reload
sudo systemctl enable elasticsearch.service
sudo systemctl restart elasticsearch.service

3. Limit Elasticsearch RAM usage:
Edit /etc/elasticsearch/jvm.options, and set min and max RAM usage:

-Xms128m
-Xmx128m

Restart the service:

sudo systemctl daemon-reload
sudo systemctl restart elasticsearch.service

4. Elastica. Elastica is a PHP library to talk to Elasticsearch. Install Elastica per the instructions below.

Installation

Get Elasticsearch up and running somewhere. Place the CirrusSearch extension in your extensions directory. You also need to install the Elastica MediaWiki extension. Add this to LocalSettings.php:

wfLoadExtension( 'Elastica' );
wfLoadExtension( 'CirrusSearch' );
$wgDisableSearchUpdate = true;

Configure your search servers in LocalSettings.php if you aren't running Elasticsearch on localhost:

$wgCirrusSearchServers = [ 'elasticsearch0', 'elasticsearch1', 'elasticsearch2', 'elasticsearch3' ];

There are other $wgCirrusSearch variables that you might want to change from their defaults.

Now run this script to generate your elasticsearch index:

php $MW_INSTALL_PATH/extensions/CirrusSearch/maintenance/UpdateSearchIndexConfig.php

Now remove $wgDisableSearchUpdate = true from LocalSettings.php. Updates should start heading to Elasticsearch.

Next bootstrap the search index by running:

 php $MW_INSTALL_PATH/extensions/CirrusSearch/maintenance/ForceSearchIndex.php --skipLinks --indexOnSkip
 php $MW_INSTALL_PATH/extensions/CirrusSearch/maintenance/ForceSearchIndex.php --skipParse

Note that this can take some time.

Once that is complete, add this to LocalSettings.php to funnel queries to Elasticsearch:

 $wgSearchType = 'CirrusSearch';

Create a crontab to update the elasticsearch indexes:

0 * * * * /usr/bin/php /var/www/html/maintenance/runJobs.php --maxtime=3600 > /var/log/runJobs.log 2>&1

Adding extended <code> tag support

Download geshi:

wget https://github.com/GeSHi/geshi-1.0/archive/master.zip
unzip master.zip
cp -r geshi-1.0-master /var/www/html/extensions/SyntaxHighlight_GeSHi/

Create a file and copy the code below in it:

mkdir /var/www/html/extensions/Code
vi /var/www/html/extensions/Code/Code.php
<?php
if( !defined( 'MEDIAWIKI' ) ) {
        echo( "This is an extension to the MediaWiki package and cannot be run standalone.\n" );
        die( -1 );
}
$wgExtensionCredits['other'][] = array(
        'path'           => __FILE__,
        'name'           => 'Code',
        'version'        => '0.9',
        'author'         => 'Paul Grinberg',
        'url'            => 'https://www.mediawiki.org/wiki/Extension:Code',
        'description'    => 'Allows syntax highlighting using GeSHi'
);

$wgHooks['ParserFirstCallInit'][] = 'efCodeExtensionInit';

function efCodeExtensionInit(Parser &$parser) {
    $parser->setHook( "Code", "efCodeExtensionRenderCode" );
    return true;
}

function efCodeExtensionRenderCode($input, $argv, $parser) {
    global $wgShowHideDivi, $wgOut;

    // default values
    $language = 'text';
    $showLineNumbers = false;
    $showDownloadLink = false;
    $source = $input;
    $tabwidth = 4;

    foreach ($argv as $key => $value) {
        switch ($key) {
            case 'lang':
                $language = $value;
                break;
            case 'linenumbers':
                $showLineNumbers = true;
                break;
            case 'tabwidth':
                $tabwidth = $value;
                break;
            case 'download':
                $showDownloadLink = true;
                break;
            case 'fileurl':
                $html = $parser->unstrip($parser->recursiveTagParse($value),$parser->mStripState);
                $i = preg_match('/<a.*?>(.*?)<\/a>/', $html, $matches);
                $url = $matches[1];
                //print("URL is '$url'");
                #$source = "file_get_contents disabled! Contact your wiki admin with questions.";
                $source =  file_get_contents($url);
                break;
            default :
                wfDebug( __METHOD__.": Requested '$key ==> $value'\n" );
                break;
        }
    }
        if (!defined('GESHI_VERSION')) {
        include('extensions/SyntaxHighlight_GeSHi/src/geshi.php'); // include only once or else wiki dies
    }
    $geshi = new GeSHi($source, $language);
    $error = $geshi->error();           // die gracefully if errors found
    if ($error) {
        return "Code Extension Error: $error";
    }
    $geshi->enable_line_numbers(GESHI_FANCY_LINE_NUMBERS); // always display line numbers
    $geshi->set_tab_width($tabwidth);
    $code = $geshi->parse_code();
    $code_pieces = preg_split('/\<ol/', $code );

    $output = '';
    $ol_tag = '<ol';
    if (!$showLineNumbers) {
        // if not asked to show line numbers, then we should hide them. This is the preferred method
        // because this allows for a means of a block of code in the middle of a numbered list
        $output .= "<style type='text/css'><!-- ol.codelinenumbers { list-style: none; margin-left: 0; padding-left: 0em;} --></style>";
        $ol_tag = "<ol class='codelinenumbers'";
    }
    $output .= $code_pieces[0];
    if ($showDownloadLink) {
        $output .= "<a href=\"javascript:win3 = window.open('', 'code', 'width=320,height=210,scrollbars=yes');win3.document.writeln('$source');\"  style=\"float:right\">Download Code</a>\n";
    }
    $output .= $ol_tag . $code_pieces[1];

    return $output;
}

Add the following line to /var/www/html/LocalSettings.php:

require_once "$IP/extensions/Code/Code.php";

Public pages on secured server

If you want to secure your server but have some pages available publicly, then you can add them to $wgWhitelistRead

vi /var/www/html/LocalSettings.php

Find

$wgWhitelistRead

And add in a comma separated list the pages you want to make public, in quotes " " so you get something like

$wgWhitelistRead =  array ( "Special:Search", "Special:Random", "Installing mediawiki" );

Any pages added will be public as soon as the file is saved.

Apache vhost example for a reverse proxy

NB in virtual host declaration on proxy, use 301 redirect to force SSL - this prevents problems with CAS URL redirects e.g.

<VirtualHost *:80>

       ServerName blahwiki.oxfordarchaeology.com
       ServerAlias blahwiki.thehumanjourney.net

Redirect 301 / https://blahwiki.oxfordarchaeology.com

       ProxyPass / http://192.168.98.45/
       ProxyPassReverse / http://192.168.98.45/
       CustomLog /var/log/apache2/blahwiki.oxfordarchaeology.com.access.log combined
       ErrorLog /var/log/apache2/blahwiki.oxfordarchaeology.com.error.log

</VirtualHost>

<VirtualHost *:443>
       ServerName blahwiki.oxfordarchaeology.com
       SSLEngine on
       SSLCertificateFile /etc/apache2/ssl/oxfordarchaeology.crt
       SSLCertificateKeyFile /etc/apache2/ssl/oxfordarchaeology.com.key
       SSLCertificateChainFile /etc/apache2/ssl/oxfordarchaeology.intermediate.crt
       ProxyPass / http://192.168.98.45/
       ProxyPassReverse / http://blahwiki.oxfordarchaeology.com/
       ProxyPassReverse / http://192.168.98.45/
      CustomLog /var/log/apache2/blahwiki.oxfordarchaeology.com.access.log combined
      ErrorLog /var/log/apache2/blahwiki.oxfordarchaeology.com.error.log
</VirtualHost>

Replication

Following Wikipedia, with master/slave DB replication and then using rsync with the --delete flag to keep the mediawiki dir synched

On the master:

vi /etc/mysql/my.cnf

set bind-address to the LAN IP of the server

set server-id to something unique for the replication group. Easiest choice is LAN IP (without the dots)

uncomment log_bin

set binlog_do_db to the name of the wiki DB (as per LocalSettings.php)
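Taken together, the edits above leave a my.cnf fragment roughly like this (illustrative values: the server-id is derived from a LAN IP of 192.168.98.44 and the DB name is wikiDB; substitute your own):

```ini
bind-address = 192.168.98.44
server-id    = 1921689844
log_bin      = /var/log/mysql/mysql-bin.log
binlog_do_db = wikiDB
```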

Restart mysql

service mysql restart

Enter mysql root

mysql -u root -p

Create slave user (username/password as you want) and grant access

GRANT REPLICATION SLAVE ON *.* TO 'username'@'%' IDENTIFIED BY 'password';

FLUSH PRIVILEGES;

Switch to the wikiDB (name as per LocalSettings.php)

USE wikiDB

Lock the database to prevent any new changes during setup:

FLUSH TABLES WITH READ LOCK;

Then enter

SHOW MASTER STATUS;

Copy the output into a text file for reference (i.e. on your own PC)

+------------------+----------+--------------+------------------+
| File             | Position | Binlog_Do_DB | Binlog_Ignore_DB |
+------------------+----------+--------------+------------------+
| mysql-bin.000001 |      428 | itwiki       |                  |
+------------------+----------+--------------+------------------+
1 row in set (0.00 sec)


In a new console start a new shell into the master wiki and dump the DB

mysqldump -u root -p --opt nameofwikiDB > /tmp/nameofwikiDB.sql

Now return to original console and unlock the DB


UNLOCK TABLES; EXIT;

On the slave

Goto mysql root

mysql -u root -p

and create the DB to be replicated (name as per previous)

CREATE DATABASE wikiDB;

EXIT;

Copy the SQL file dumped earlier to the slave VM, then import it

mysql -u root -p wikiDB < wikiDB.sql

vi /etc/mysql/my.cnf

set server-id to something unique for the replication group. Easiest choice is LAN IP (without the dots)

uncomment log_bin

set binlog_do_db to the name of the wiki DB (as per LocalSettings.php)

Add line below log_bin

relay-log = /var/log/mysql/mysql-relay-bin.log

Restart mysql

service mysql restart

Goto mysql root

mysql -u root -p

Enter the following command, changing values as appropriate

CHANGE MASTER TO MASTER_HOST='ip.ad.re.ss', MASTER_USER='usernamefromabove',MASTER_PASSWORD='passwordfromabove',MASTER_LOG_FILE='mysql-bin.xxxx',MASTER_LOG_POS=xxx;
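After this, standard MySQL practice is to start replication and confirm it is running (a hedged follow-up; these statements are not in the original notes):

```sql
START SLAVE;
SHOW SLAVE STATUS\G
```

In the status output, Slave_IO_Running and Slave_SQL_Running should both read Yes.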

Replication (files)

Create user for replication on slave

adduser --disabled-password <username>

su <username>

Create key as per sshauth pages but without a passphrase
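A sketch of the key generation (the ed25519 key type and the demo path are assumptions; on the real slave, generate into ~/.ssh as the sync user, per the sshauth page):

```shell
# Demo path so the sketch can run anywhere; use ~/.ssh/id_ed25519 for real
KEYFILE=/tmp/wikisync_demo_key
rm -f "$KEYFILE" "$KEYFILE.pub"
# -N '' means no passphrase, as required for unattended rsync
ssh-keygen -t ed25519 -N '' -f "$KEYFILE" -C 'wikisync replication key' >/dev/null
cat "$KEYFILE.pub"
```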

change to the master server, create the account and add key auth as per the sshauth page, but prepend the key (still all on one line) with


from="slave.ip.addr.ess",no-X11-forwarding,no-agent-forwarding,no-port-forwarding

chown html dir on slave to <username>:root

chown -R <username>:root /var/www/html

on slave server change to wiki replication user

su <username>

and run

rsync -a --delete <master.ip.addr.ess>:/var/www/html/ /var/www/html/|more

checking output for errors. NB this command will recursively copy from source to target, deleting any files on the target that do not exist on the source

Assuming no errors, create a cron.daily script to keep it up to date and log output (exit from <username> shell)

vi /etc/cron.daily/

#!/bin/bash

FULLDATE=`date +%Y-%m-%d:%H:%M:%S`
DATE=`date +%Y%m%d`
echo $FULLDATE > /tmp/rsynclog
chmod +w /tmp/rsynclog
sudo -i -u wikisync rsync -av --delete 192.168.98.46:/var/www/html/ /var/www/html/ >> /tmp/rsynclog
mv /tmp/rsynclog /var/log/rsynclog-$DATE


Making the slave into the master copy of the wiki may (a) require changing permissions on the web dir, (b) changing master/slave settings in mysql (though should work without, just no write access) and (c) removing the cron.daily script (though it should just mark an error in the log)

---need notes on adding '<code>' tag




Troubleshooting CAS

Enable debug by editing CASAuth.php and adding the line

phpCAS::setDebug();

after

// Load phpCAS
                       require_once($CASAuth["phpCAS"]."/CAS.php");

This creates a logfile in /tmp of the server attempting CAS authentication with detailed information about the CAS attempt and failures.

Remember to comment out or delete the line once the issue has been resolved.

Migration

On the old server dump all pages into an xml file

php /var/www/html/maintenance/dumpBackup.php --full > wikidump.xml

Copy the file to the new server and import it

php /var/www/html/maintenance/importDump.php < wikifile.xml
php /var/www/html/maintenance/rebuildrecentchanges.php

If you also have images, copy the directory /var/www/html/images from the old server to the new one (not in the same location) and import the images. The command will need to run as the www-data user so the images are imported with the right ownership.

sudo -u www-data php /var/www/html/maintenance/importImages.php --search-recursively /path/to/import/images

Session timeout

To set the session timeout put this in /var/www/html/LocalSettings.php

$wgCookieExpiration = 28800;
$wgExtendedLoginCookieExpiration = null;

Troubleshoot

Lock the database (put the wiki into read-only mode) by adding this to /var/www/html/LocalSettings.php:

$wgReadOnly = "We are upgrading MediaWiki, please be patient. This wiki will be back in a few minutes.";

Db connection error:

Cannot access the database: :real_connect(): (HY000/1698): Access denied for user 'root'@'localhost'. Check the host, username and password and try again. If using "localhost" as the database host, try using "127.0.0.1" instead (or vice versa).

To fix this error, run the command below in the MySQL shell:

ALTER USER '<username>'@'localhost' IDENTIFIED WITH mysql_native_password BY '<password>';