June 29, 2018

Boost MySQL speed of Magento 2 integration tests

Yireo Blog Post

Magento 2 is built in a modular way and one of the benefits here is that you can easily (?) add your own unit and/or integration tests, to guarantee the quality of your own code. When running integration tests, a test database is set up. However, this process is kind of slow. Here is a little trick to dramatically increase the performance of MySQL, so your integration tests complete within seconds.


Simply run two Docker commands to run MySQL entirely in tmpfs (see the commands below), then modify your Magento configuration to point to this database. Done.

Setting up integration tests

This blog assumes that you have already set up Magento for running integration tests: you have modified the phpunit.xml file, added your database settings to the install-config-mysql.php file, et cetera. There are already excellent blogs out there helping you with this setup.

What these blogs often mention is that you can speed up the tests by disabling the TESTS_CLEANUP flag in the phpunit.xml file. However, this was not good enough for me: running the tests was still taking too much time. So I went ahead and tuned MySQL to the max.
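As a reminder, this flag is a PHP constant in the integration-test phpunit.xml (typically under dev/tests/integration/, though the exact path may differ per Magento version). A minimal sketch of the relevant fragment:

```xml
<php>
    <!-- Set to "disabled" to skip re-installing the test database on every run -->
    <const name="TESTS_CLEANUP" value="disabled"/>
</php>
```

With cleanup disabled, the test database is reused between runs, which is exactly why it pairs well with the throwaway tmpfs database described below.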

Conventional MySQL tuning is not a huge benefit

Tuning MySQL is simple: the more memory you assign to it, the faster it runs. However, tuning InnoDB or the query cache (though deprecated, it is still really powerful in development environments) assumes a fine balance between performance and database integrity.

With integration tests, however, this is a different story: the more performant the database, the more productive you become. And we don't care about database integrity: once the tests have run, the database can be thrown away. It is just a temporary resource, only needed while testing.

This leads to the conclusion that if all of MySQL runs in memory, performance is at its maximum. This can be done by moving the MySQL data folder (in my case, /var/lib/mysql) onto either a RAM disk or tmpfs.

Using a docker image for tmpfs

Moving MySQL to tmpfs is possible. However, on the same system where I run integration tests, I also run my development environments, with databases that should persist. I quickly decided that I wanted to use Docker for this.

As a developer, you always try things out yourself. So I went ahead and created a plain Dockerfile, adding MySQL to it, swapping its data directory out for a tmpfs-based filesystem, and disabling AppArmor because it was preventing things from working.

However, after doing all of this, I found out that the official MySQL images for Docker already support this behaviour out of the box. Bummer. So here are the instructions to get going with the official MySQL Docker image.

Creating a network - for a static IP

While creating the actual container only requires a single command, I quickly found out that the default IP assignment was rather annoying: every time a new container was started, Docker handed out a new IP address, and the Magento configuration needed to be updated.

Instead, I wanted to have a static IP, so that I only needed to modify my Magento configuration once. To do this, you need to define your own custom network:

docker network create --driver bridge --subnet=<your-subnet> magento

With this command, you'll create a magento network that is used in the second command below.
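To double-check what was created, you can inspect the network afterwards. This is a sketch using the `magento` network name from the command above; the `--format` template simply prints the subnet that was assigned:

```shell
# Print the subnet of the custom "magento" network
docker network inspect magento --format '{{range .IPAM.Config}}{{.Subnet}}{{end}}'
```

The static IP you pick for the container in the next step must fall within this subnet.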

Creating the docker container

Now that a custom network is created, you can run the actual Docker image. There is no need to prepare anything, because Docker will download the right image (mysql, version 5.7, in this case) automatically for you:

docker run \
--rm \
-p 3306:3306 \
-e MYSQL_ROOT_PASSWORD=root \
-e MYSQL_DATABASE=magento2 \
-e MYSQL_SQL_TO_RUN='GRANT ALL ON *.* TO "root"@"%";' \
--tmpfs=/var/lib/mysql/:rw,noexec,nosuid,size=600m \
--tmpfs=/tmp/:rw,noexec,nosuid,size=50m \
--net=magento \
--ip=<static-ip> \
mysql:5.7

Port 3306 is exposed to the outside world (your host). The database credentials are something you can customize if you want. The MYSQL_DATABASE environment variable also causes an empty magento2 database to be created when the Docker container comes alive.

The main trick here is the usage of a tmpfs folder for all of the databases that MySQL uses (/var/lib/mysql). In my case, 600MB was large enough for the database, but you might need to increase this.

Once the command runs, it should show the MySQL server starting up and listening on its port.
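Note that MySQL needs a few seconds to initialize before it accepts connections. As a sketch, you could wait for it with mysqladmin; the host here is a placeholder for whatever static IP you assigned to the container:

```shell
# Poll until the MySQL server answers; --silent suppresses connection errors
until mysqladmin ping --host=<static-ip> --user=root --password=root --silent; do
    sleep 1
done
```

This is handy if you start the container from a script and run the tests immediately afterwards.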

Testing if it works

You can now open up a MySQL client that connects to this server:

mysql --host=<static-ip> --port=3306 --user=root --password=root

If this works, you can start playing with it. If you are done and want to shut down the container, first locate its container ID and then use that to stop the container:

docker ps
docker stop <container_id>
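Alternatively, you could add a `--name` flag to the `docker run` command above (the name `mysql-tests` here is just a hypothetical choice), so the container can be stopped by name without looking up its ID:

```shell
# Assuming the container was started with an extra flag: --name mysql-tests
docker stop mysql-tests
```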

Modify the Magento configuration

Last but not least, we can now put the optimized MySQL container to use. Open up the install-config-mysql.php file again and modify the database settings:

return [
    'db-host' => '<static-ip>',
    'db-user' => 'root',
    'db-password' => 'root',
    'db-name' => 'magento2',
];

And it is fast

The end result is cool: in my environment, it takes less than two seconds to run a basic set of integration tests for one of my own Magento extensions. This is such an improvement that I've moved the two Docker commands into a script, which is included in my rc.local and therefore runs at every boot.
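Such a script could look like the following sketch; the network name, subnet, and static IP are assumptions you should replace with your own values:

```shell
#!/bin/sh
# Re-create the custom bridge network (ignore the error if it already exists)
docker network create --driver bridge --subnet=<your-subnet> magento 2>/dev/null

# Start the tmpfs-backed MySQL container detached, so boot can continue
docker run -d --rm \
    -p 3306:3306 \
    -e MYSQL_ROOT_PASSWORD=root \
    -e MYSQL_DATABASE=magento2 \
    --tmpfs=/var/lib/mysql/:rw,noexec,nosuid,size=600m \
    --tmpfs=/tmp/:rw,noexec,nosuid,size=50m \
    --net=magento \
    --ip=<static-ip> \
    mysql:5.7
```

The `-d` flag (added here, compared to the interactive command earlier) runs the container in the background, which is what you want when launching it from rc.local.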

I hope you can benefit from this little trick as well!


About the author

Author Jisse Reitsma

Jisse Reitsma is the founder of Yireo, extension developer, developer trainer and 3x Magento Master. His passion is for technology and open source. And he loves talking as well.
