A practical guide to continuous integration in PHP

At TheCodingMachine, we strive to develop projects with the best possible quality. In this article, we present the tools we use internally for continuous integration. Continuous integration is the process of running a set of automated tools on each commit to track average code quality and to warn developers of potential errors in their code.
Written by David Négrier

When it comes to continuous integration, there are many solutions out there. Many open-source developers will be used to a stack made of GitHub, Travis, Scrutinizer and Coveralls. This stack is absolutely great, but it is made of closed-source tools. In this blog article, we will present a 100% open-source stack. It works for us, but depending on the project, we may use completely different stacks. I’m sure there are other equally interesting options, so do not hesitate to share yours with us.

 

Continuous integration at TheCodingMachine

At TheCodingMachine, we mostly use Gitlab for version control.

Gitlab has a very nice feature: it can do continuous integration natively, with the help of Gitlab CI.

We set up several « workers », which are configured to use Docker. This means that on each commit, Gitlab will start a Docker container, clone the Git repository into this container and launch a set of tests.

 

A basic example

Our first example will be very simple: run unit tests, using PHPUnit, at each commit.

The first task is therefore to set up unit tests.

For this, we will first add PHPUnit as a dependency of our project:

 

$ composer require --dev phpunit/phpunit

 

Now, let’s configure PHPUnit.

At the root of our project, let’s write a phpunit.xml.dist file.

phpunit.xml.dist

<?xml version="1.0" encoding="UTF-8"?>

<phpunit xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:noNamespaceSchemaLocation="http://schema.phpunit.de/4.1/phpunit.xsd"
         backupGlobals="false"
         colors="true"
         bootstrap="vendor/autoload.php"
         verbose="true"
        >
    <testsuites>
        <testsuite name="Test suite">
            <directory>./tests/</directory>
        </testsuite>
    </testsuites>
    <filter>
        <whitelist processUncoveredFilesFromWhitelist="true">
            <directory suffix=".php">src</directory>
        </whitelist>
    </filter>
</phpunit>

Done? Now, we need to write some unit tests for our project, in the tests/ folder.
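For instance, a minimal test file could look like this (Greeter is a hypothetical class from our src/ folder; the sketch assumes Composer’s autoloader picks it up through the bootstrap configured in phpunit.xml.dist):

```php
<?php
// tests/GreeterTest.php — a minimal example test.
// Greeter is a hypothetical class living in src/, autoloaded by Composer.

use PHPUnit\Framework\TestCase;

class GreeterTest extends TestCase
{
    public function testGreetBuildsTheExpectedSentence()
    {
        $greeter = new Greeter();

        // One assertion per behavior keeps failures easy to read.
        $this->assertSame('Hello, Alice!', $greeter->greet('Alice'));
    }
}
```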

Let’s run those locally and check it works.

$ vendor/bin/phpunit

 

Are we ok?

Automating unit tests with CI

Good, let’s automate this task with CI now.

 

To enable Gitlab CI, we need to add a .gitlab-ci.yml file at the root of the project.

This file instructs Gitlab about the base Docker image to use and the steps to set-up the image and the tests to run.

 

.gitlab-ci.yml

image: tetraweb/php:7.0

before_script:
- bash ci/docker_install.sh

test:
  script:
  - vendor/bin/phpunit

The first thing you will notice in this YAML file is the « image » property: its value is the name of a Docker image.

When doing continuous integration, we try to use an image as complete as possible. The guys at Tetraweb built a very cool set of PHP images: tetraweb/php. These images are tailored for continuous integration. They are based on the official PHP images, but add almost all known PHP extensions, plus Composer and a set of useful tools. So whatever your project, you know that this image will contain the extensions you are looking for. Pretty cool to save some time.

 

The next line you will notice is before_script.

This line points to a script that will set up the project (i.e. it will run Composer install and download a set of additional tools).

Here is a very simple version of this file:

 

ci/docker_install.sh

#!/bin/bash

# We need to install dependencies only for Docker
[[ ! -e /.dockerenv ]] && exit 0

set -xe

# We can enable extensions with this simple line. We need the ZIP extension to run for Composer
docker-php-ext-enable zip

# If your tests need a special php.ini setting, you can copy your own php.ini file here.
# This is useful, for instance, to increase the memory limit
cp ci/php.ini /usr/local/etc/php/php.ini

# Let's run composer install
composer install

 

Note: don’t forget to make this file executable:

chmod +x ci/docker_install.sh

 

Done? Let’s test this.

We simply need to commit and push our new files in a separate branch. Let’s now create a merge request in Gitlab.

You should see something similar to this:

[Screenshot: merge request with a passing build]

 

So far, so good. We have a basic continuous integration mechanism set up. If the tests fail, Gitlab will tell us, and we may decide not to merge the merge request.

But we can take this one (big) step further.

 

Running PHPStan

PHPStan is a static analysis tool for PHP code. It parses your code and tries to find flaws in the program logic (like a variable being used before being declared, or a call to a function that does not exist…). It is not the ultimate bug-catching tool, but it has one huge advantage: it does not need a lot of work to set up! No need to write unit tests: simply run the tool and it will output a list of potential errors. And it is highly customizable, which is always good.

So let’s set up PHPStan for our project:

$ composer require --dev phpstan/phpstan

Now, let’s create a phpstan.neon configuration file for PHPStan. The NEON format it uses is quite similar to YAML:

phpstan.neon

parameters:
    excludes_analyse:
        - %rootDir%/../../../src/**/Generated/*.php
    ignoreErrors:
        - '#Using $this outside a class#'

In this simple configuration file, we exclude from analysis any file in a « Generated » directory (most likely because these files were generated by a tool and not written by us).
PHPStan tends to find a number of false positives (how many really depends on your project). In the « ignoreErrors » section, we write regular expressions that filter out these false positives.

Finally, let’s configure a script in composer so you can trigger phpstan easily.

composer.json

{
    "scripts": {
        "phpstan": "phpstan analyse src/ -c phpstan.neon --level=4 --no-progress -vvv"
    }
}

This script can now be triggered easily by simply typing:

$ composer phpstan

In this script we are scanning the « src/ » directory. Notice that the level can go from 0 to 4 (0 being the least restrictive and 4 being the most aggressive setting).

If you are starting a new project, we recommend going with level 4. If you are adding this tool to an existing project, start at level 0, fix all the bugs there, and then slowly increase the level.
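As an illustration, here is the kind of flaw PHPStan catches even at level 0: returning a variable that was never defined (the total() helper is a made-up example):

```php
<?php
// A made-up helper illustrating a bug PHPStan flags even at level 0.
function total(array $prices): float
{
    $sum = 0.0;
    foreach ($prices as $price) {
        $sum += $price;
    }
    // return $total;  // typo PHPStan would flag: $total is never defined
    return $sum;       // the corrected line
}

echo total([1.5, 2.5]); // prints 4
```

No test needs to be executed for PHPStan to spot the typo: the analysis is purely static.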

Adding PHPStan to CI

Adding PHPStan to CI is a breeze. We just need to modify our .gitlab-ci.yml file to tell Gitlab CI to run it.

.gitlab-ci.yml

test:
  script:
  - composer phpstan | tee phpstan_results.txt
  artifacts:
    when: always
    expire_in: 1 month
    paths:
    - phpstan_results.txt

Here, you will notice that we are « piping » the PHPStan output into a file called phpstan_results.txt.

Just after, we declare a Gitlab « artifact ». An artifact is a file or a directory that will be zipped at the end of the CI « build ». It can easily be downloaded from Gitlab. So if PHPStan returns errors, you can have a quick look at them easily.
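One caveat worth knowing: in a plain shell, the exit code of a pipeline is that of its last command (here, tee), so a failing PHPStan run could be silently masked. GitLab’s bash-based runners typically enable pipefail for you, but it does not hurt to be explicit. A quick demonstration:

```shell
# With pipefail, a failure anywhere in the pipeline fails the whole pipeline,
# even though tee (the last command) succeeds.
set -o pipefail
( false | tee /tmp/phpstan_results.txt ) && echo "job would pass" || echo "job fails as expected"
```

This prints « job fails as expected »: the failing command’s exit code is propagated, so the CI job fails as it should.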

[Screenshot: downloading the build artifacts from the merge request]

Improving the build speed

Build speed matters a lot. When you push to the Gitlab server, you want results as soon as possible. Anything above 5 minutes should be considered too long: you won’t get feedback in a timely manner and you’ll tend to forget about CI.

If your build tends to become too long, here are 2 tips to speed things up.

  1. create your own Docker image
    If you need to install a set of tools in the base Docker image (tools like html2pdf or non-standard PHP extensions that need to be compiled…), this can quickly take some time. You can then consider building your own custom Docker image for the CI of your project, that already contains all the useful tools.
  2. use hirak/prestissimo. Prestissimo is a « global » Composer plugin that allows Composer to download packages in parallel. This is great to speed up the « composer install » phase, which is usually quite long.

ci/docker_install.sh

composer global require hirak/prestissimo
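Back to the first tip: a custom image can be as simple as a Dockerfile extending the base image. Here is a sketch (the extra tools below — wkhtmltopdf and the pre-enabled extensions — are examples, not requirements):

```dockerfile
# Hypothetical custom CI image for our project
FROM tetraweb/php:7.0

# Bake the slow, per-build setup into the image once and for all
RUN apt-get update && apt-get install -y wkhtmltopdf && \
    docker-php-ext-enable zip pdo_mysql
```

Push this image to a registry your runners can reach and reference it in the « image » key of .gitlab-ci.yml.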

 

Measuring tests code coverage

We will now improve our test suite by adding code coverage to our unit tests. On each build, we will generate a report that tracks, line by line, what is tested and what is not in your application.

We will generate this report in HTML, and also in the « clover » format (for later use).

 

.gitlab-ci.yml

test:
  script:
  - phpdbg -qrr vendor/bin/phpunit --coverage-html coverage/ --coverage-clover clover.xml
  artifacts:
    when: always
    expire_in: 1 month
    paths:
    - coverage
    - clover.xml

Notice how we added 2 options while launching the PHPUnit command and how we are putting the generated files in the artifact (for later download and analysis).

Also, we are now running PHPUnit through phpdbg in order to collect code coverage data. Note that Xdebug should not be enabled when using phpdbg. Xdebug can also provide code coverage, but in our experience it is far less stable than phpdbg for code coverage analysis.

 

Enhancing reporting: tracking code coverage and code quality

The previous step allows us to track code coverage, but in order to check code coverage, we need to download the artifact file, unzip it and check the generated HTML report inside. This is not something that will be done every day, or on every merge request. What would be really great is to have a summary of that report right in the merge request, as a comment. This way, before the merge, a user could easily check that the new code did not cause a drop in code coverage.

This is something that Gitlab cannot do natively but, fortunately, we have developed a tool for that.

We are proud to present thecodingmachine/washingmachine (the tool that will help you to write cleaner code!)

 

Ok, so what is this « washingmachine » exactly?

It’s a simple tool that runs at the end of your build script. It collects the « clover.xml » file generated by PHPUnit, analyses it and pushes a comment in your merge request to inform you about the code coverage percentage.

As a bonus, it will also scan the « clover.xml » file of the branch you are merging into, in order to inform you of the code coverage variation (are your changes improving or decreasing code coverage?)

 

Even better: using « clover.xml », the washingmachine tracks the C.R.A.P. score of your methods and lets you know which methods got better and which got worse. What is this C.R.A.P. score?

The Change Risk Anti-Patterns (CRAP) Index is calculated based on the cyclomatic complexity and code coverage of a unit of code. Code that is not too complex and has an adequate test coverage will have a low CRAP index. The CRAP index can be lowered by writing tests and by refactoring the code to lower its complexity.

PHPUnit documentation

So basically, the more complex your code, the higher the CRAP score; and the better tested your code, the lower the CRAP score. Put another way: it is ok to have complex code (sometimes it cannot be avoided), but you need to test it properly.
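The formula behind that definition (Alberto Savoia’s original one, where comp is the cyclomatic complexity of a method and cov its coverage percentage) can be sketched as a small function:

```php
<?php
// CRAP(m) = comp(m)^2 * (1 - cov(m)/100)^3 + comp(m)
function crap(int $complexity, float $coveragePercent): float
{
    return $complexity ** 2 * (1 - $coveragePercent / 100) ** 3 + $complexity;
}

echo crap(5, 0.0), "\n";   // complexity 5, no tests: 30
echo crap(5, 100.0), "\n"; // complexity 5, fully covered: 5
```

Notice how full coverage collapses the score down to the bare complexity, while an untested complex method gets heavily penalized.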

The washingmachine will let you know, right in the merge request comments, which methods show big variations in their CRAP score.

 

Installing the washingmachine

The washingmachine needs to post comments on the merge request. To do so, it must be authorized: it needs a Gitlab personal access token for the API. Once you have it, let’s add it to your project « secret variables ».

In Gitlab, go to your project page: Settings ➔ Variables ➔ Add variable

  • Key: GITLAB_API_TOKEN
  • Value: the token you obtained in the previous step

Done?

Now, let’s install and trigger the washingmachine as part of the CI build.

ci/docker_install.sh

cd /root
composer create-project thecodingmachine/washingmachine --stability=dev
cd -

And now, we simply need to call the washingmachine at the end of the build process:

.gitlab-ci.yml

test:
  script:
  - phpdbg -qrr vendor/bin/phpunit --coverage-html coverage/ --coverage-clover clover.xml
  after_script:
  - /root/washingmachine/washingmachine run -v

Shazam!

[Screenshot: washingmachine comment posted on the merge request]

Adding security checks

We will now check for known security vulnerabilities using the Sensiolabs’ Security-checker tool.

Security-checker is a tool that analyzes your composer.lock file and lets you know if any of the libraries you are using has a known vulnerability.

Notice that this tool does not analyze your code for security vulnerabilities (unlike tools like Scrutinizer or RATS).

ci/docker_install.sh

# Installs Sensiolabs security checker to check for insecure libraries
php -r "readfile('http://get.sensiolabs.org/security-checker.phar');" > /usr/local/bin/security-checker
chmod +x /usr/local/bin/security-checker

And now, we simply need to call the command as a part of the « script » section:

.gitlab-ci.yml

test:
  script:
  - security-checker security:check

If your composer.lock contains a library with known security issues, the build will fail.

 

Bonus: integration tests with MySQL or any other third-party service

Often enough, your unit tests will actually be integration tests. They will require you to provide a database and perform tests using this database.

Out of the box, your container does not provide a database instance. However, you can easily add third-party containers next to your container.

 

Here is a sample with MySQL.

 

First, declare that you need additional services. This is done using the « services » key in .gitlab-ci.yml.

.gitlab-ci.yml

services:
- mysql:5.7

variables:
  MYSQL_ROOT_PASSWORD: root
  MYSQL_DATABASE: secret

 

In the example above, the « mysql » Docker image is added to our project. Variables passed to the image are provided in the « variables » section. The MySQL Docker image requires the MYSQL_ROOT_PASSWORD and MYSQL_DATABASE variables. See the documentation of the image you are using to learn more about the required variables.

Adding a « mysql » Docker image is not enough. We also need to enable the PHP pdo_mysql extension.

ci/docker_install.sh

docker-php-ext-enable pdo_mysql

Finally, note that in your tests, the database is not in your main container but in a separate container with its own IP address. The host name for this database is « mysql » (the name of the service).
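Concretely, a connection from a test could be sketched like this (the host is the service name, and the credentials and database name mirror the variables declared above):

```php
<?php
// Connect to the MySQL service container by its service name, "mysql".
// Credentials follow the MYSQL_ROOT_PASSWORD / MYSQL_DATABASE variables above.
try {
    $pdo = new \PDO('mysql:host=mysql;dbname=secret', 'root', 'root');
    echo "connected\n";
} catch (\PDOException $e) {
    // Outside the CI build (no MySQL service around), this branch is taken.
    echo "no database available: " . $e->getMessage() . "\n";
}
```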

 

Wrapping up

At this point, you should have a project with continuous integration set up. Your continuous integration will:

  • Run your unit tests
  • Check your dependencies for security issues
  • And best of all, send you feedback about code coverage and code quality, right in your merge request

 

You can view and download the final files from this Gist.
