WordPress on a LAMP stack with Ubuntu

Install the LAMP Stack

This approach uses tasksel, which installs all of the software and configuration required for a specific task. Tasksel should only be used during the initial setup of a server, not for ongoing maintenance and configuration.

  • sudo apt-get update

  • sudo apt-get install tasksel

  • sudo tasksel

    • Check LAMP server

    • Enter new MySQL root password when asked
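
If you prefer to skip the interactive menu, tasksel can also install the same task directly from the command line:

sudo tasksel install lamp-server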

Test the LAMP

Test Apache

  • Browse to your ip_address or hostname (servername) to view the default Apache page

  • Enable mod_rewrite (needed for WordPress)

    • sudo a2enmod rewrite

    • sudo service apache2 restart
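
You can also confirm Apache is answering from the server itself (assuming curl is installed):

curl -I http://localhost/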

Test PHP

  • sudo nano /var/www/html/phpinfo.php

    • Add <?php phpinfo(); ?>

    • Save

    • Browse to servername/phpinfo.php

  • Once it is working, remove the phpinfo.php file, since it publicly discloses server details.

    • sudo rm /var/www/html/phpinfo.php

Test MySQL

An easy way to test MySQL is to install phpMyAdmin.

  • sudo apt-get install phpmyadmin php-gettext

  • Browse to servername/phpmyadmin

  • Log in as root with the password you set earlier
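
MySQL can also be checked quickly from the shell, using the root password set during the tasksel run:

mysql -u root -p -e "SHOW DATABASES;"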

Test a Virtual Host

A virtual host allows Apache to serve websites for multiple domains from a single IP address or interface. I’ll demonstrate this using a simple configuration for the atest.com domain.

  • Review the default config, less /etc/apache2/sites-available/000-default.conf

  • sudo nano /etc/apache2/sites-available/atest.com.conf

  • Add the following to this new configuration

# start atest.COM
<VirtualHost *:80>
    ServerName atest.com
    ServerAlias *.atest.com
    ServerAdmin admin@atest.com

    DocumentRoot /var/www/atest.com/html
    <Directory />
        Options FollowSymLinks
        AllowOverride None
    </Directory>
    <Directory /var/www/atest.com/html/>
        Options Indexes FollowSymLinks MultiViews
        AllowOverride None
        Order allow,deny
        allow from all
    </Directory>

</VirtualHost>
# end atest.COM
  • Note the DocumentRoot is /var/www/atest.com/html. This is where all the website files will go.

  • Create the document root and a web viewable file

    • sudo mkdir -p /var/www/atest.com/html

    • sudo nano /var/www/atest.com/html/index.html

<html>
  <head>
    <title>Welcome to A Test!</title>
  </head>
  <body>
    <h1>Success! This is a test page of the atest.com virtual host settings.</h1>
  </body>
</html>
  • Change the file permissions of the web accessible files. The first command resets everything to 644, removing every execute bit; the second then sets files to 664 and directories to 775 so directories remain traversable. (Caution: this removes the execute bit from any executable files.)

    • sudo chmod -R 644 /var/www

    • sudo chmod -R ug=rwX,o=rX /var/www

  • Enable the virtual host, sudo a2ensite atest.com.conf

    • To disable a site, use `sudo a2dissite atest.com.conf`

  • sudo service apache2 reload
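
Before any DNS is in place, you can check the new virtual host from the server itself by forcing the Host header (assuming curl is installed); the response should be the test page created above:

curl -H "Host: atest.com" http://localhost/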

Next, point the domain at the server’s IP address. This is not covered in detail in this document, but there are two methods.

  1. Use the hosts file (a sample entry is shown after this list)

    • /etc/hosts on Linux

    • C:\Windows\System32\drivers\etc\hosts on Windows

  2. Set your domain using DNS. This could be your domain registrar or a third party like DNS Made Easy.
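
For quick testing with the hosts file, a single line on the client machine mapping the name to the server is enough (203.0.113.10 is a placeholder; substitute your server’s IP address):

203.0.113.10    atest.com www.atest.com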

Install WordPress

I prefer not to use the apt system for installing and maintaining WordPress. This application receives frequent updates, some of which may be critical, so install it from [WordPress](https://wordpress.org/download/) and enable automatic updates. This example uses the amaker.com domain.

Set your DNS or hosts file to point your domain to the server

  • sudo mkdir /var/www/amaker.com

  • cd /var/www/amaker.com

  • sudo wget "https://wordpress.org/latest.tar.gz"

  • sudo tar -xzvf latest.tar.gz

  • Use WordPress as the root for amaker.com

    • sudo mv wordpress html

  • WordPress itself will need to modify files for configuration and updates

    • sudo chown -R www-data:www-data html

Configure a Virtual Host

  • This is similar to the previous virtual host, except that we now allow .htaccess overrides so WordPress can use mod_rewrite.

    • sudo nano /etc/apache2/sites-available/amaker.com.conf

  • Add the following to this new configuration

# start amaker.COM
<VirtualHost *:80>
    ServerName amaker.com
    ServerAlias *.amaker.com
    ServerAdmin admin@amaker.com

    DocumentRoot /var/www/amaker.com/html
    <Directory /var/www/amaker.com/html>
            Options -Indexes +FollowSymLinks +MultiViews
            AllowOverride All
            Order allow,deny
            allow from all
    </Directory>
</VirtualHost>
# end amaker.COM
  • sudo a2ensite amaker.com.conf

  • sudo service apache2 restart

Add a database

Here we will add a user and database to be used by our WordPress installation. For this example the user and database are named amaker.

  • Log in to phpMyAdmin

  • Click the User Accounts tab

  • Add a new user

    • Make certain to check the box Create database with same name and grant all privileges
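
If you prefer the MySQL command line to phpMyAdmin, a roughly equivalent sketch (database and user both named amaker; substitute a real password):

mysql -u root -p
mysql> CREATE DATABASE amaker;
mysql> CREATE USER 'amaker'@'localhost' IDENTIFIED BY 'choose-a-strong-password';
mysql> GRANT ALL PRIVILEGES ON amaker.* TO 'amaker'@'localhost';
mysql> FLUSH PRIVILEGES;
mysql> EXIT;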

Configure WordPress

The rest of the configuration is completed in the browser.

  • Browse to your domain, amaker.com in this example

  • Follow the prompts and enter the required information

  • Login and browse your new WordPress site

Backup WordPress Files

Backing up files is an important part of maintaining and recovering your WordPress site. Plugins are available to help, but I prefer to automate the task with scripts. First, I will run through the manual process and then share the scripts.

Assumptions:

  • WordPress sites are located in /var/www. This example uses the site test.

  • Git cloned repositories are located at /var/local/repos. The example site uses test.

  • Offsite backups are stored in an AWS S3 bucket. This example uses bucket-repos.

Manual Initializing Run

Git and Clone the Site

  • sudo -i

  • cd /var/www/test

  • git init /var/www/test

  • git add --all

  • git commit -m "Initial commit of site"

  • mkdir /var/local/repos

  • git clone /var/www/test /var/local/repos/test

  • exit

Since some files contain sensitive information (particularly wp-config.php), the clone should not be public.
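
Since the repositories stay on the server, one simple safeguard (a minimal sketch) is to strip read access for other local users:

sudo chmod -R o-rwx /var/local/repos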

Sync the Clone to AWS S3

You now have a local clone of your current WordPress files. Now let’s sync the clone to AWS S3. This method uses the AWS CLI; see https://aws.amazon.com/documentation/cli/ for more.

Some of the python scripts also make use of boto3. See https://github.com/boto/boto3 and https://boto3.readthedocs.io/en/latest/.

  • Install boto3

    • sudo pip install boto3

  • Since the AWS CLI has already been configured, boto3 picks up the same credentials. (If it has not, see the sketch below.)
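
If the AWS CLI still needs to be set up, a minimal sketch (the sync commands below run it under sudo, so configure credentials for root as well):

sudo apt-get install awscli
sudo aws configure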

Setup the S3 bucket

  • Make a new bucket, aws s3 mb s3://bucket-repos

  • List all buckets, aws s3 ls

  • List the contents of a bucket, aws s3 ls s3://bucket-repos

Synchronize the contents of the local repository with the S3 bucket.

  • sudo aws s3 sync /var/local/repos/test s3://bucket-repos/test --delete

  • Verify, aws s3 ls s3://bucket-repos/test/html/

Creating a Scripted Initializing Run

Now let’s automate that whole process by creating a shell script. First we’ll create some text files that are used as input to the script. All of the text files are most easily created automatically, though each may be edited manually. The text files hold one site name per line.

Already Initialized Sites

The first file is s3ed_sites.txt. This is a list of sites already run through this process. The entries are auto-generated using the following make_s3ed.py script. Call the script with python make_s3ed.py > s3ed_sites.txt

make_s3ed.py
#!/usr/bin/env python
"""
make_s3ed.py: Connects to the S3 bucket used for repositories and lists all currently synced sites. The output is usually redirected to s3ed_sites.txt
How to call:
python make_s3ed.py > s3ed_sites.txt
"""

import boto3

S3BUCKET = "bucket-repos"

client = boto3.client('s3')
paginator = client.get_paginator('list_objects')
result = paginator.paginate(Bucket=S3BUCKET, Delimiter='/')
for prefix in result.search('CommonPrefixes'):
    # An empty bucket yields None instead of a prefix dictionary
    if prefix:
        # Each CommonPrefix is a top-level "directory"; strip the trailing slash
        print(prefix.get('Prefix')[:-1])

All Sites

The second file is all_sites.txt, a list of all sites in /var/www that need to be initialized. This can be auto-generated by using a command such as ls /var/www > all_sites.txt.

ToDo Sites

The third file is todo_sites.txt. It holds the sites that appear in all_sites.txt but not in s3ed_sites.txt, i.e. the sites that have not yet been initialized. The following python script, all-s3ed.py, creates this third file. Call the script with python all-s3ed.py

all-s3ed.py
#!/usr/bin/env python
"""
all-s3ed.py: Generates the set difference (sites on the server minus sites already synced to S3) and writes it to the todo_sites file.
How to call:
python all-s3ed.py
"""

file1 = "s3ed_sites.txt"
file2 = "all_sites.txt"
file3 = "todo_sites.txt"

with open(file1) as f:
    done = f.read().splitlines()

with open(file2) as f:
    all_sites = f.read().splitlines()

# Sites present on the server but not yet synced to S3
todo = set(all_sites).difference(done)
with open(file3, 'w') as f:
    f.write('\n'.join(todo))

All Sites Intersection With S3 Sites

A list of sites that are available on the server and are also synced to S3 will be needed. The output of the python script make_all_intersect_s3.py is redirected to create the file all_intersect_s3.txt. Call it with python make_all_intersect_s3.py > all_intersect_s3.txt

make_all_intersect_s3.py
#!/usr/bin/env python
"""
make_all_intersect_s3.py: Generates the set intersection and prints the resulting site list.
How to call:
python make_all_intersect_s3.py > all_intersect_s3.txt
"""

file1 = "s3ed_sites.txt"
file2 = "all_sites.txt"

with open(file1) as f:
    done = f.read().splitlines()

with open(file2) as f:
    all_sites = f.read().splitlines()

# Sites that exist on the server and are already synced to S3
intersect = set(all_sites).intersection(done)
for item in intersect:
    print(item)

Perform the Initial Git and Sync to S3

Now we use a bash script to control the flow and call the various commands. The script is init_git_then_s3.sh. Call it with sudo ./init_git_then_s3.sh $(cat todo_sites.txt)

init_git_then_s3.sh
#!/bin/bash
#
# This script must be run as sudo.
# This script expects site names input from a file.
# Call this script thusly:
# sudo ./init_git_then_s3.sh $(cat todo_sites.txt)

# Some variables that you may choose to change.
SITEDIR="/var/www/"
REPOSDIR="/var/local/repos/"
S3BUCKET="ggis-repos"

SITES="$@"
for f in $SITES
do
    cd $SITEDIR$f
    git init $SITEDIR$f
    git add --all
    git commit -m "Initial commit of site"
    git clone $SITEDIR$f $REPOSDIR$f
    aws s3 sync $REPOSDIR$f s3://$S3BUCKET/$f --delete

done

Performing the Initializing Run

Let’s put it all together.

  • python make_s3ed.py > s3ed_sites.txt

  • ls /var/www > all_sites.txt

  • python ./all-s3ed.py

  • sudo ./init_git_then_s3.sh $(cat todo_sites.txt)

  • Manually verify that all of your sites are now synced to S3.

  • Update the list of synced sites, python make_s3ed.py > s3ed_sites.txt

  • python make_all_intersect_s3.py > all_intersect_s3.txt

Later, we will create a cron backup script that will run automatically.

Ongoing File Backup

  • Perform an initializing run to capture any new sites that have been added.

Then

  • python make_s3ed.py > s3ed_sites.txt

  • ls /var/www > all_sites.txt

  • python make_all_intersect_s3.py > all_intersect_s3.txt

  • sudo ./regular_git_then_s3.sh $(cat all_intersect_s3.txt)

    • this script is listed below

  • Verify on S3 that everything worked as planned

regular_git_then_s3.sh
#!/bin/bash
#
# This script must be run as sudo.
# This script expects site names input from a file.
# Call this script thusly:
# sudo ./regular_git_then_s3.sh $(cat all_intersect_s3.txt)

# Some variables that you may choose to change.
SITEDIR="/var/www/"
REPOSDIR="/var/local/repos/"
S3BUCKET="ggis-repos"

SITES="$@"
for f in $SITES
do
    cd $SITEDIR$f
    git add --all
    git commit -m "Regular commit of site"
    cd $REPOSDIR$f
    git pull
    aws s3 sync $REPOSDIR$f s3://$S3BUCKET/$f --delete

done

Backup WordPress Databases

Here we perform many of the same tasks, only for databases.

Assumptions

  • WordPress sites are located in /var/www. This example uses the site test.

  • Database dumps are saved to /var/local/dbdumps. The example site uses test.

  • Database names, usernames, and passwords are stored in the various site wp-config.php files.

  • Offsite backups are stored in an AWS S3 bucket. This example uses bucket-repos.

Manual First Run

  • sudo -i

  • mkdir /var/local/dbdumps/

  • less /var/www/test/html/wp-config.php

    • Find the lines which define the following:

/** The name of the database for WordPress */
define('DB_NAME', 'dbtest');

/** MySQL database username */
define('DB_USER', 'usertest');

/** MySQL database password */
define('DB_PASSWORD', 'password');
  • Dump the database, mysqldump --user=usertest --password=password --opt --result-file=/var/local/dbdumps/dbtest.sql dbtest

  • Compress the file, gzip /var/local/dbdumps/dbtest.sql

  • Sync it to S3, aws s3 sync /var/local/dbdumps s3://bucket-repos/dbdumps --delete

  • Verify with aws s3 ls s3://bucket-repos/dbdumps/

Creating a Scripted Run

The hardest part of this script is getting the database information from wp-config.php. A python script should do the trick.

  • Find all files named wp-config.php, find /var/www -name wp-config.php -print

  • Use that find command piped into the python script. sudo find /var/www -name wp-config.php | xargs sudo python mysql2s3.py

    • the script is listed below

mysql2s3.py
#!/usr/bin/env python
"""
mysql2s3.py: Pulls credentials from the files passed via command line.
Then it calls the mysqldump command to backup databases. Finally,
it syncs the backed up databases to an AWS S3 bucket
How to call:
sudo find /var/www -name wp-config.php  | xargs sudo python mysql2s3.py
"""

import sys
import re
import datetime
import subprocess

# Some variables that you may choose to change.
SITEDIR = "/var/www/"
DUMPDIR = "/var/local/dbdumps/"
S3BUCKET = "s3://bucket-repos/"

# Compile our regexes
DB_NAME = re.compile("DB_NAME.*'(.*)'")
DB_USER = re.compile("DB_USER.*'(.*)'")
DB_PASSWORD = re.compile("DB_PASSWORD.*'(.*)'")

# Build date string
todaystring = datetime.date.today().isoformat()

# Files to process passed in as command line arguments
thefiles = sys.argv[1:]

for filename in thefiles:
    # The with block closes the file automatically
    with open(filename) as f:
        text = f.read()
    name_match = DB_NAME.findall(text)
    user_match = DB_USER.findall(text)
    password_match = DB_PASSWORD.findall(text)
    # print(name_match, user_match, password_match)
    result_file = DUMPDIR + name_match[0] + todaystring + '.sql'
    arg1 = '--user=' + user_match[0]
    arg2 = '--password=' + password_match[0]
    arg3 = '--opt'
    arg4 = '--result-file=' + result_file
    subprocess.call(['mysqldump', arg1, arg2, arg3, arg4, name_match[0]])
    subprocess.call(['gzip', result_file])

subprocess.call(['aws', 's3', 'sync', DUMPDIR, S3BUCKET, '--delete'])

Upgrade WordPress

Keeping up-to-date with the latest releases of WordPress and its various plugins and themes is important. WordPress is a very popular blogging platform with a history of vulnerabilities but also of quick patches. I have a tickler system that notifies me to check for updates every 3 weeks.

Always back up your files and databases before performing a WordPress version upgrade.
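
For example, using the backup scripts from the previous sections (run from the directory holding the scripts and their site-list files), a pre-upgrade backup looks like this:

python make_s3ed.py > s3ed_sites.txt
ls /var/www > all_sites.txt
python make_all_intersect_s3.py > all_intersect_s3.txt
sudo ./regular_git_then_s3.sh $(cat all_intersect_s3.txt)
sudo find /var/www -name wp-config.php | xargs sudo python mysql2s3.py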

Via Admin Web Interface

Using the web interface is the easiest way for many users to upgrade their WordPress sites. It does require admin access to the site and also proper file mode settings. After each update step, verify that the site still functions as it should.

  • Log in to your site as an Admin user

  • Navigate to Dashboard > Updates

  • Select which plugins to upgrade (all?)

  • Click Update

  • Return to the WordPress updates page

  • Select which themes to upgrade (all?)

  • Click Update

  • Return to the WordPress updates page

  • Under the heading “An updated version of WordPress is available”, click Update Now

  • Review and address any update messages that have been generated

Using WP-CLI

WP-CLI, http://wp-cli.org/, is a set of command-line tools for managing WordPress installations. You can update plugins, configure multisite installs and much more, without using a web browser.

Follow the installation instructions at https://wp-cli.org/docs/installing/

Running wp-cli under sudo causes it to complain (hence the --allow-root flag below). Instead of running the commands as the www-data user (which on my machines has no login privileges), I run them with sudo and then chown all of the files back to www-data.

Plugins

  • Return list of plugin updates available, sudo wp plugin update --allow-root --all --dry-run --path=/var/www/tosamakers.com/html/

  • Update all plugins, sudo wp plugin update --allow-root --all --path=/var/www/tosamakers.com/html/

  • Chown files back to www-data, sudo chown -R www-data:www-data /var/www/tosamakers.com/html/

Themes

  • Return list of theme updates available, sudo wp theme update --allow-root --all --dry-run --path=/var/www/tosamakers.com/html/

  • Update all themes, sudo wp theme update --allow-root --all --path=/var/www/tosamakers.com/html/

  • Chown files back to www-data, sudo chown -R www-data:www-data /var/www/tosamakers.com/html/

Core

  • Check for core updates, sudo wp core check-update --allow-root --path=/var/www/tosamakers.com/html/

  • Update the core, sudo wp core update --allow-root --path=/var/www/tosamakers.com/html/

  • Update the database, sudo wp core update-db --allow-root --path=/var/www/tosamakers.com/html/

  • Chown files back to www-data, sudo chown -R www-data:www-data /var/www/tosamakers.com/html/

Scripted Use of WP-CLI

WP-CLI greatly eases updating many WordPress sites, and scripts make it even easier. First, we run a dry-run script. If you notice any problems or anything that needs special attention in the dry run, handle it manually. For the sites not requiring special attention, we then run a full update script.

Dry Run

Execute the dry run script to determine what needs to be updated and whether any special attention (via unexpected messages) should be given to a site. Call this script via sudo find /var/www -name wp-config.php | xargs sudo python update_wordpress_dryrun.py.

update_wordpress_dryrun.py
#!/usr/bin/env python
"""
update_wordpress_dryrun.py: Dry runs updates on WordPress installations. The WordPress installation paths are passed via argument pipe.
How to call:
sudo find /var/www -name wp-config.php  | xargs sudo python update_wordpress_dryrun.py
"""

import sys
import subprocess

thefiles = sys.argv[1:]

for filepath in thefiles:
    print("----------")
    # Strip the trailing "wp-config.php" (13 characters) to get the installation path
    sitepath = filepath[:-13]
    print("Updates available on " + sitepath)
    subprocess.call('wp plugin update --allow-root --all --dry-run --path=' + sitepath, shell=True)
    subprocess.call('wp theme update --allow-root --all --dry-run --path=' + sitepath, shell=True)
    subprocess.call('wp core check-update --allow-root --path=' + sitepath, shell=True)
    print("----------")

Automatic Update with WP-CLI

If any sites require special attention, handle them manually; otherwise, proceed to updating the remaining sites via script: sudo find /var/www -name wp-config.php | xargs sudo python update_wordpress.py

update_wordpress.py
#!/usr/bin/env python
"""
update_wordpress.py: Runs updates on WordPress installations. The WordPress installation paths are passed
via argument pipe. After completion, it changes ownership back to www-data
How to call:
sudo find /var/www -name wp-config.php  | xargs sudo python update_wordpress.py
"""

import sys
import subprocess

thefiles = sys.argv[1:]

for filepath in thefiles:
    print("----------")
    # Strip the trailing "wp-config.php" (13 characters) to get the installation path
    sitepath = filepath[:-13]
    print("Updating " + sitepath)
    subprocess.call('wp plugin update --allow-root --all --path=' + sitepath, shell=True)
    subprocess.call('wp theme update --allow-root --all --path=' + sitepath, shell=True)
    subprocess.call('wp core update --allow-root --path=' + sitepath, shell=True)
    subprocess.call('wp core update-db --allow-root --path=' + sitepath, shell=True)
    subprocess.call('chown -R www-data:www-data ' + sitepath, shell=True)
    print("----------")

Assuming the special attention sites have already been handled manually, this should run to completion without issue.

Cron Scripts
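
The backup scripts above lend themselves to cron. A rough sketch of the entries, assuming the scripts and their site-list text files live in a hypothetical /usr/local/backup-scripts directory; add them to root’s crontab with sudo crontab -e so no sudo is needed in the commands:

# Refresh the site lists and back up files nightly at 02:30 (hypothetical script location)
30 2 * * * cd /usr/local/backup-scripts && python make_s3ed.py > s3ed_sites.txt && ls /var/www > all_sites.txt && python make_all_intersect_s3.py > all_intersect_s3.txt && ./regular_git_then_s3.sh $(cat all_intersect_s3.txt)
# Dump and sync the databases nightly at 03:00
0 3 * * * find /var/www -name wp-config.php | xargs python /usr/local/backup-scripts/mysql2s3.py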

Next up

Additional References