Installing mediawiki

From Newroco tech docs

MediaWiki installation notes for Ubuntu 16.04

Basics

(For upgrading an existing installation, see the official guide: https://www.mediawiki.org/wiki/Manual:Upgrading)

  1. apt-get install screen software-properties-common

Enable universe (for some of the php libraries)

  1. add-apt-repository universe && apt-get update

Install base software

  1. apt-get install apache2 mysql-server mysql-client php php-mysql php-gd php-curl php7.0-intl php7.0-json imagemagick unzip php-mcrypt libapache2-mod-php7.0 php-xml php-mbstring

Configure a password for the MySQL root user and store it securely.

Extended search

If you want extended search capability in your wiki, also install Sphinx Search

  1. apt-get install sphinxsearch

Edit php.ini to set longer session timeouts, in line with a working day

  1. vi /etc/php/7.0/apache2/php.ini

and set session.gc_maxlifetime to 43200
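This edit can also be scripted. The following is a minimal sketch using sed, demonstrated on a throwaway sample file rather than the live php.ini (point it at /etc/php/7.0/apache2/php.ini when doing it for real):

```shell
#!/bin/sh
# Sketch: raise session.gc_maxlifetime to 43200 (12 hours) with sed instead of
# editing by hand. Demonstrated on a sample file; 1440 is the stock default.
printf 'session.gc_maxlifetime = 1440\n' > /tmp/php.ini.sample
sed -i 's/^session\.gc_maxlifetime = .*/session.gc_maxlifetime = 43200/' /tmp/php.ini.sample
cat /tmp/php.ini.sample
```

Restart Apache afterwards so the new value takes effect.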

If this VM is to be a replication slave, jump to the Replication section now

Testing DB

If you want to test the connection to your MySQL DB, try:

  1. vi /var/www/html/phpinfo.php

and add:

<?php phpinfo(); ?>

then:

  1. vi /var/www/html/phpmysql.php

and add (note the mysqli_* functions; the old mysql_* functions were removed in PHP 7):

<?php
$con = mysqli_connect("localhost", "root", "yourMySQLrootPassword");
if (!$con) {
    die('Could not connect: ' . mysqli_connect_error());
} else {
    echo "Congrats! Connection established successfully";
}
mysqli_close($con);
?>

Installing mediawiki files

Download latest version of mediawiki (1.28 at time of writing) from https://www.mediawiki.org/wiki/Download

  1. wget <link>

Then untar it

  1. tar -xzf mediawiki-<version>.tar.gz

Assuming the server will only be serving a wiki, copy the contents of the expanded archive to the web root (otherwise move the whole directory)

  1. cp -r mediawiki-<version>/* /var/www/html/

Remove Ubuntu's default index page

  1. rm /var/www/html/index.html

Configuration and setup

Visit the server via a web browser and follow the installation steps. Ignore the warning regarding the cache, as caching is built into PHP 5.5 and later and will work regardless. Leave options at their defaults if it is not clear which to use.

Configure the admin user/password to suit.

Configure the user email to suit (maybe a systems address or wikiadmin group, depending on IS team setup).

Follow the steps on the later configuration pages as desired, setting the default licence to CC-BY-SA (Creative Commons Attribution-ShareAlike).

Copy the generated LocalSettings.php to /var/www/html.

If wanted, place copy of organisation logo (square) in /var/www/html, then

  1. vi LocalSettings.php
and set
  1. $wgLogo = "$wgResourceBasePath/<logoname>.png";
  2. $wgServer = "https://<whatever the FQDN will be>";
(can use something temporary if this will be a migration and change once the DB and image content is migrated over)

CAS (SSO) Authentication

Using https://github.com/CWRUChielLab/CASAuth for CAS

Create folder $WIKI/extensions/CASAuth/

  1. mkdir /var/www/html/extensions/CASAuth/
  2. cd /var/www/html/extensions/CASAuth/

Download CASAuth and phpCAS

  1. wget https://github.com/CWRUChielLab/CASAuth/archive/master.zip
  2. wget https://developer.jasig.org/cas-clients/php/current/CAS-1.3.4.tgz

(latest version of phpCAS can be found here: https://wiki.jasig.org/display/CASC/phpCAS)

Extract and install

  1. tar -xzf CAS-1.3.4.tgz
  2. mv CAS-1.3.4 CAS
  3. unzip master.zip
  4. mv CASAuth-master/* ./
  5. cp CASAuthSettings.php.template CASAuthSettings.php

Edit file CASAuthSettings.php

  1. $CASAuth["Server"]="cas.example.com";
  2. $CASAuth["Url"]="/cas/"; (or the path that cas can be found at)

And replace the other example.com config lines with your own domain (e.g. oxfordarchaeology.com)

  1. $CASAuth["CreateAccounts"]=true; (set to false by default)

And would normally set

  1. $CASAuth["RestrictUsers"]=false;

Although you may want to set this to true and enable individual users in some cases, for example where the content should be readable by anyone but only editable by some.

Add following line to LocalSettings.php

  1. require_once( "$IP/extensions/CASAuth/CASAuth.php" );
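After editing LocalSettings.php it is worth lint-checking it, since a syntax error there takes the whole wiki down. A minimal sketch, run against a throwaway snippet here (point php -l at /var/www/html/LocalSettings.php for real; guarded in case the php CLI is not installed):

```shell
#!/bin/sh
# Write a snippet containing the new require_once line, then syntax-check it.
printf '<?php require_once( "$IP/extensions/CASAuth/CASAuth.php" );\n' > /tmp/settings-snippet.php
if command -v php >/dev/null 2>&1; then
    php -l /tmp/settings-snippet.php
else
    echo "php CLI not found; skipping lint"
fi
```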

Enabling SphinxSearch

Create a directory under $wiki/extensions/ and download the extension into it

  1. mkdir /var/www/html/extensions/SphinxSearch
  2. cd /var/www/html/extensions/SphinxSearch

Download

  1. wget https://github.com/wikimedia/mediawiki-extensions-SphinxSearch/archive/master.tar.gz

Extract the resulting file, then delete it.

  1. tar -xzf master.tar.gz
  2. rm master.tar.gz

Copy the supplied sphinx.conf to a place where the service will read it

  1. cp sphinx.conf /etc/sphinxsearch/

Edit sphinx.conf to suit (e.g. database name, login details, log file location) and create a data directory for sphinx to build its indexes in, as per the conf file entries. Ensure you change ownership of the data directory to sphinxsearch:sphinxsearch

  1. mkdir -p /var/data/sphinx
  2. chown sphinxsearch:sphinxsearch /var/data/sphinx
  3. mkdir /var/log/sphinx
  4. chown sphinxsearch:sphinxsearch /var/log/sphinx

Edit /etc/default/sphinxsearch

  1. START=yes

Run the indexer for the first time:

  1. indexer --config /etc/sphinxsearch/sphinx.conf --all

Then start sphinxsearch

  1. service sphinxsearch start

To keep it updated, create a file in cron.daily and add

  1. indexer --quiet --config /etc/sphinxsearch/sphinx.conf wiki_main --rotate >/dev/null 2>&1
  2. indexer --quiet --config /etc/sphinxsearch/sphinx.conf wiki_incremental --rotate >/dev/null 2>&1
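The two indexer commands above can be wrapped into the cron.daily script like so. This sketch writes the script to /tmp for illustration; the real target is a file in /etc/cron.daily/, where the name sphinx-reindex is a hypothetical choice. Note that run-parts skips scripts that are not executable or that contain a dot in the filename.

```shell
#!/bin/sh
# Sketch: generate the daily Sphinx re-index script. Written to /tmp here;
# move it to /etc/cron.daily/sphinx-reindex (hypothetical name) for real use.
cat > /tmp/sphinx-reindex <<'EOF'
#!/bin/sh
CONF=/etc/sphinxsearch/sphinx.conf
indexer --quiet --config "$CONF" wiki_main --rotate >/dev/null 2>&1
indexer --quiet --config "$CONF" wiki_incremental --rotate >/dev/null 2>&1
EOF
chmod +x /tmp/sphinx-reindex
sh -n /tmp/sphinx-reindex && echo "cron script syntax OK"
```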

And if your wiki is active, also create a file in cron.hourly

  1. indexer --quiet --config /etc/sphinxsearch/sphinx.conf wiki_incremental --rotate >/dev/null 2>&1

Download a copy of sphinxapi.php and place in extensions/SphinxSearch/

  1. https://raw.githubusercontent.com/romainneutron/Sphinx-Search-API-PHP-Client/master/sphinxapi.php

In the file LocalSettings.php append the following lines:

  1. $wgSearchType = 'SphinxMWSearch';
  2. require_once "$IP/extensions/SphinxSearch/SphinxSearch.php";

Adding extended <code> tag support

Download geshi:

  1. wget https://github.com/GeSHi/geshi-1.0/archive/master.zip
  2. unzip master.zip
  3. cp -r geshi-1.0-master /var/www/html/extensions/SyntaxHighlight_GeSHi/

Create a file and copy the code below in it:

  1. mkdir /var/www/html/extensions/Code
  2. vi /var/www/html/extensions/Code/Code.php
<?php
if( !defined( 'MEDIAWIKI' ) ) {
    echo( "This is an extension to the MediaWiki package and cannot be run standalone.\n" );
    die( -1 );
}

$wgExtensionCredits['other'][] = array(
    'path' => __FILE__,
    'name' => 'Code',
    'version' => '0.9',
    'author' => 'Paul Grinberg',
    'url' => 'https://www.mediawiki.org/wiki/Extension:Code',
    'description' => 'Allows syntax highlighting using GeSHi'
);

$wgHooks['ParserFirstCallInit'][] = 'efCodeExtensionInit';

function efCodeExtensionInit(Parser &$parser) {
    $parser->setHook( "Code", "efCodeExtensionRenderCode" );
    return true;
}

function efCodeExtensionRenderCode($input, $argv, $parser) {
    global $wgShowHideDivi, $wgOut;

    // default values
    $language = 'text';
    $showLineNumbers = false;
    $showDownloadLink = false;
    $source = $input;
    $tabwidth = 4;

    foreach ($argv as $key => $value) {
        switch ($key) {
            case 'lang':
                $language = $value;
                break;
            case 'linenumbers':
                $showLineNumbers = true;
                break;
            case 'tabwidth':
                $tabwidth = $value;
                break;
            case 'download':
                $showDownloadLink = true;
                break;
            case 'fileurl':
                $html = $parser->unstrip($parser->recursiveTagParse($value), $parser->mStripState);
                $i = preg_match('/<a.*?>(.*?)<\/a>/', $html, $matches);
                $url = $matches[1];
                //print("URL is '$url'");
                #$source = "file_get_contents disabled! Contact your wiki admin with questions.";
                $source = file_get_contents($url);
                break;
            default:
                wfDebug( __METHOD__.": Requested '$key ==> $value'\n" );
                break;
        }
    }

    if (!defined('GESHI_VERSION')) {
        include('extensions/SyntaxHighlight_GeSHi/geshi-1.0-master/src/geshi.php'); // include only once or else wiki dies
    }
    $geshi = new GeSHi($source, $language);
    $error = $geshi->error(); // die gracefully if errors found
    if ($error) {
        return "Code Extension Error: $error";
    }
    $geshi->enable_line_numbers(GESHI_FANCY_LINE_NUMBERS); // always display line numbers
    $geshi->set_tab_width($tabwidth);
    $code = $geshi->parse_code();
    $code_pieces = preg_split('/\<ol/', $code );

    $output = '';
    $ol_tag = '<ol';
    if (!$showLineNumbers) {
        // if not asked to show line numbers, then we should hide them. This is the preferred method
        // because this allows for a means of a block of code in the middle of a numbered list
        $output .= "<style type='text/css'><!-- ol.codelinenumbers { list-style: none; margin-left: 0; padding-left: 0em;} --></style>";
        $ol_tag = "<ol class='codelinenumbers'";
    }
    $output .= $code_pieces[0];
    if ($showDownloadLink) {
        $output .= "<a href=\"javascript:win3 = window.open('', 'code', 'width=320,height=210,scrollbars=yes');win3.document.writeln('$source');\" style=\"float:right\">Download Code</a>\n";
    }
    $output .= $ol_tag . $code_pieces[1];

    return $output;
}

Add the following line to /var/www/html/LocalSettings.php:

  1. require_once "$IP/extensions/Code/Code.php";

Apache vhost example for a reverse proxy

NB in the virtual host declaration on the proxy, use a 301 redirect to force SSL; this prevents problems with CAS URL redirects. The proxy host also needs mod_proxy, mod_proxy_http and mod_ssl enabled (a2enmod proxy proxy_http ssl). For example:

<VirtualHost *:80>
       ServerName blahwiki.oxfordarchaeology.com
       ServerAlias blahwiki.thehumanjourney.net
       Redirect 301 / https://blahwiki.oxfordarchaeology.com
       ProxyPass / http://192.168.98.45/
       ProxyPassReverse / http://192.168.98.45/
       CustomLog /var/log/apache2/blahwiki.oxfordarchaeology.com.access.log combined
       ErrorLog /var/log/apache2/blahwiki.oxfordarchaeology.com.error.log
</VirtualHost>

<VirtualHost *:443>
       ServerName blahwiki.oxfordarchaeology.com
       SSLEngine on
       SSLCertificateFile /etc/apache2/ssl/oxfordarchaeology.crt
       SSLCertificateKeyFile /etc/apache2/ssl/oxfordarchaeology.com.key
       SSLCertificateChainFile /etc/apache2/ssl/oxfordarchaeology.intermediate.crt
       ProxyPass / http://192.168.98.45/
       ProxyPassReverse / http://blahwiki.oxfordarchaeology.com/
       ProxyPassReverse / http://192.168.98.45/
       CustomLog /var/log/apache2/blahwiki.oxfordarchaeology.com.access.log combined
       ErrorLog /var/log/apache2/blahwiki.oxfordarchaeology.com.error.log
</VirtualHost>


Replication

Master/slave DB replication for the database, then rsync with the --delete flag to keep the mediawiki directory in sync

On the master:

vi /etc/mysql/my.cnf

set bind-address to the LAN IP of the server

set server-id to something unique for the replication group. Easiest choice is LAN IP (without the dots)

uncomment log_bin

set binlog_do_db to the name of the wiki DB (as per LocalSettings.php)

Restart mysql

service mysql restart

Enter mysql root

mysql -u root -p

Create a slave user (username/password as you want) and grant access

GRANT REPLICATION SLAVE ON *.* TO 'username'@'%' IDENTIFIED BY 'password';

FLUSH PRIVILEGES;

Switch to the wiki DB (name as per LocalSettings.php)

USE wikiDB;

Lock the database to prevent any new changes during setup:

FLUSH TABLES WITH READ LOCK;

Then enter

SHOW MASTER STATUS;

Copy the output into a text file for reference (i.e. on your own PC)

+------------------+----------+--------------+------------------+
| File             | Position | Binlog_Do_DB | Binlog_Ignore_DB |
+------------------+----------+--------------+------------------+
| mysql-bin.000001 |      428 | itwiki       |                  |
+------------------+----------+--------------+------------------+
1 row in set (0.00 sec)


In a new console, open a shell on the master wiki server and dump the DB

mysqldump -u root -p --opt nameofwikiDB > /tmp/nameofwikiDB.sql

Now return to the original console and unlock the DB


UNLOCK TABLES; EXIT;

On the slave

Go to the mysql root prompt

mysql -u root -p

and create the DB to be replicated (name as per previous)

CREATE DATABASE wikiDB;

EXIT;

Copy the SQL file dumped earlier to the slave VM, then import it

mysql -u root -p wikiDB < wikiDB.sql

vi /etc/mysql/my.cnf

set server-id to something unique for the replication group. Easiest choice is LAN IP (without the dots)

uncomment log_bin

set binlog_do_db to the name of the wiki DB (as per LocalSettings.php)

Add the following line below log_bin

relay-log = /var/log/mysql/mysql-relay-bin.log

Restart mysql

service mysql restart

Go to the mysql root prompt

mysql -u root -p

Enter the following command, changing values as appropriate

CHANGE MASTER TO MASTER_HOST='ip.ad.re.ss', MASTER_USER='usernamefromabove', MASTER_PASSWORD='passwordfromabove', MASTER_LOG_FILE='mysql-bin.xxxx', MASTER_LOG_POS=xxx;

(MASTER_LOG_FILE and MASTER_LOG_POS are the File and Position values recorded from SHOW MASTER STATUS on the master.)

Then start replication and check that both Slave_IO_Running and Slave_SQL_Running report Yes:

START SLAVE;

SHOW SLAVE STATUS\G

Replication (files)

Create user for replication on slave

adduser --disabled-password <username>

su <username>

Create key as per sshauth pages but without a passphrase

Change to the master server, create the account and add key auth as per the sshauth page, but prepend the key (still all one line) with


from="slave.ip.addr.ess",no-X11-forwarding,no-agent-forwarding,no-port-forwarding
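Assembling the restricted authorized_keys entry can be sketched as below (the key and slave address are dummies for illustration; the options restrict where the key may be used from and disable all forwarding):

```shell
#!/bin/sh
# Sketch: prepend the from= restriction and forwarding bans to a public key
# before appending it to ~/.ssh/authorized_keys. Dummy key/IP; written to /tmp here.
OPTS='from="slave.ip.addr.ess",no-X11-forwarding,no-agent-forwarding,no-port-forwarding'
PUBKEY='ssh-rsa AAAAB3...dummykey wikisync@slave'
printf '%s %s\n' "$OPTS" "$PUBKEY" > /tmp/authorized_keys.sample
cat /tmp/authorized_keys.sample
```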

Chown the html dir on the slave to <username>:root

chown -R <username>:root /var/www/html

on slave server change to wiki replication user

su <username>

and run

rsync -av --delete <master.ip.addr.ess>:/var/www/html/ /var/www/html/ | more

checking the output for errors. NB this command will recursively copy from source to target, deleting any files on the target that do not exist on the source

Assuming no errors, create a cron.daily script to keep it up to date and log the output (exit from the <username> shell first)

  1. vi /etc/cron.daily/<scriptname>

and add:

#!/bin/bash
FULLDATE=`date +%Y-%m-%d:%H:%M:%S`
DATE=`date +%Y%m%d`
echo $FULLDATE > /tmp/rsynclog
chmod +w /tmp/rsynclog
sudo -i -u wikisync rsync -av --delete 192.168.98.46:/var/www/html/ /var/www/html/ >> /tmp/rsynclog
mv /tmp/rsynclog /var/log/rsynclog-$DATE

Remember to make the script executable (chmod +x), or run-parts will skip it.


Making the slave into the master copy of the wiki may require (a) changing permissions on the web dir, (b) changing master/slave settings in mysql (though it should work without, just with no write access) and (c) removing the cron.daily script (though it would just record an error in the log)





Troubleshooting CAS

Enable debug by editing CASAuth.php and adding the line

phpCAS::setDebug();

after

// Load phpCAS
                       require_once($CASAuth["phpCAS"]."/CAS.php");

This creates a logfile in /tmp on the server attempting CAS authentication, with detailed information about the CAS attempt and any failures.

Remember to comment out or delete the line once the issue has been resolved.

Migration

On the old server dump all pages into an xml file

  1. php /var/www/html/maintenance/dumpBackup.php --full > wikidump.xml

Copy the file to the new server and import it

  1. php /var/www/html/maintenance/importDump.php < wikidump.xml
  2. php /var/www/html/maintenance/rebuildrecentchanges.php

If you also have images, copy the directory /var/www/html/images from the old server to the new one (not to the same location) and import the images. The command needs to run as the www-data user so the images are imported with the right ownership.

  1. sudo -u www-data php /var/www/html/maintenance/importImages.php --search-recursively /path/to/import/images