Docker Game Changer Part 2: Using Docker to help our clients go faster

The team here at IntegrationWorks is all about automation and the deployment of faster, more agile and robust integration platforms. In this three-part series we highlight the importance of Docker and why it’s a game-changer. Our DevOps team has been using Docker for the last 12 months, and in this series we will explain how we use it for automated testing and for deploying multi-container solutions with Docker Swarm, Docker Compose and Kubernetes.

To technology enthusiasts and IT business leaders who may not be familiar with Docker – we advise you to take note. Docker is an open-source project that automates the deployment of applications inside software containers or, as the Docker website puts it:

“Docker containers wrap up a piece of software in a complete file system that contains everything it needs to run: code, runtime, system tools, system libraries – anything you can install on a server. This guarantees that it will always run the same, regardless of the environment it is running in.”

So why is it so important? Because in today’s development teams, collaboration and simplicity are key to faster, nimbler application development. Docker facilitates this by providing a common framework for developers and other IT teams to collaborate on applications, enhance their DevOps capability and innovate faster with greater control.

If you’re still a little lost about Docker, then start with Part 1 of this series.

In this particular blog, we’ll be covering how we use Docker for automated testing and multi-container deployment.

Docker for Automated Testing

There are many forms of testing, ranging from unit tests to user acceptance tests, and at IntegrationWorks we’ve worked with them all over the years. We are passionate about building Continuous Delivery pipelines that include industrial-scale functional and non-functional test capabilities as early as possible in the SDLC. Without Docker, and reliant on inflexible environments, industrial-scale testing is either slow or expensive.

Even for IT organisations halfway through the Continuous Delivery journey, Docker enables developers to run complete suites of functional tests, regression tests, performance tests and even security tests against their code before it ever gets deployed to fixed, inflexible (non-Continuous Delivery) test environments. Building this kind of automated test capability has immense benefits in terms of agility, quality and cost reduction. However, it does require highly dynamic environments that are defined as code, built and managed by developers; hence our use of Docker.

Docker allows us to spin up, scale and tear down environments in seconds, including containers for the system under test, containers for test execution, containers for virtual services and more. But Docker alone is not enough: we need to manage multiple containers to achieve the best automated testing and Continuous Delivery benefits.
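As a sketch of that spin-up, test and tear-down cycle (the image and network names below are purely illustrative, not from any specific project):

```shell
# Create an isolated network so the test containers can find each other by name
docker network create test-net

# Spin up the system under test in the background (image name is illustrative)
docker run -d --network test-net --name app-under-test my-org/my-service:latest

# Run the test suite from its own container against the system under test
docker run --rm --network test-net my-org/test-runner:latest ./run-tests.sh

# Tear the whole environment down again in seconds
docker rm -f app-under-test
docker network rm test-net
```

Because every step is just a Docker command, the same cycle can run on a developer’s laptop or inside a Continuous Delivery pipeline unchanged.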


Docker is a wonderful technology, but a single container on its own rarely fulfils the requirements of a full solution or of an automated test framework.

The first thing we need is some way to scale Docker containers. Containers are awesome at holding components of a solution stack (e.g. a microservice), but running them at scale requires additional tooling. The challenge is to find some way of scaling them horizontally (e.g. adding duplicate containers to handle more load).
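As an illustration (the service name is hypothetical), both Docker Compose and Docker Swarm can scale a service horizontally with a single command:

```shell
# Docker Compose: run three identical copies of the 'web' service
docker-compose up -d --scale web=3

# Docker Swarm: scale a service that is already running on a cluster
docker service scale web=3
```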

The second thing we need is a way to manage the set of different containers that together make up a solution. Multiple containers require different run-time management: a single component (e.g. a microservice) in a single container is straightforward, but once a solution grows to multiple components (i.e. multiple microservices) we run into a fair bit of complexity.

There are a number of solutions to these needs, including:

  • Built-in features of your existing PaaS or SaaS. These can be the quickest to adopt within your current platform, but may be limited in capability.
  • Kubernetes
    Kubernetes was created by Google and is becoming container agnostic, meaning it will support not only Docker containers but other container runtimes such as rkt. Kubernetes models everything as a resource, and these resources are described in YAML files. It is cloud-provider aware, meaning that it can automatically perform tasks such as mounting persistent volumes and provisioning load balancers.
  • Docker Swarm
    Docker Swarm takes a different approach to multi-container management: it exposes the standard Docker API, so any tool you already use to communicate with Docker (e.g. the Docker CLI, Docker Compose or Dokku) can be pointed at a whole cluster. Being able to use familiar tools allows flexibility; however, you are bound by the limitations of the Docker API, and some inventive scripting may be needed.
  • Docker Compose
    Docker Compose is a tool for stitching together multiple Docker containers into full-stack deployments that can include any number of containers. Running a single container is fine, but managing multiple related containers that need to talk to each other can become a nightmare. Docker Compose handles the creation of virtual networks for all your containers, so they can communicate with each other as if they were individual machines. It also provides a consistent way to define the various run parameters that would otherwise be passed to the docker run command, allowing you to version your entire Docker configuration and bring everything up with a single docker-compose up.
  • You can even roll your own: a custom solution engineered for your particular scaling and management needs. This is not usually recommended given the less laborious options above.
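To illustrate the Docker Compose option above, here is a minimal, hypothetical docker-compose.yml for a two-container stack (the service names, images and credentials are illustrative only):

```yaml
# docker-compose.yml -- a hypothetical app plus its database
version: "2"
services:
  app:
    image: my-org/my-service:latest
    ports:
      - "8080:8080"
    depends_on:
      - db
  db:
    image: postgres:9.6
    environment:
      POSTGRES_PASSWORD: example
```

A single docker-compose up -d brings both containers up on a shared virtual network, where app can reach its database simply via the hostname db.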
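Similarly, for the Kubernetes option, everything is described as a YAML resource. A sketch of a Deployment that keeps three replicas of a hypothetical service running:

```yaml
# deployment.yaml -- a hypothetical Kubernetes Deployment resource
apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-service
spec:
  replicas: 3                 # Kubernetes keeps three identical pods running
  selector:
    matchLabels:
      app: my-service
  template:
    metadata:
      labels:
        app: my-service
    spec:
      containers:
        - name: my-service
          image: my-org/my-service:latest
          ports:
            - containerPort: 8080
```

Applying this with kubectl apply -f deployment.yaml creates the resource, and Kubernetes then continually reconciles the cluster towards the declared state.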

These solutions can themselves be augmented with other tools to aid the management of containers alongside your other systems. For instance, the Stackwork framework incorporates Docker Compose into your Gradle build for full-stack integration testing. So if your application requires a database service, an MQ server or a caching layer, you can specify all of these integration-test dependencies within Docker Compose files, and before your integration tests run it will bring up all of the dependent services for your application to connect to. These can be started in a matter of seconds and torn down again after the tests are complete, allowing for a far more efficient means of testing complex integration scenarios.
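That test workflow boils down to three steps (the file name and Gradle task name here are illustrative, not Stackwork’s actual conventions):

```shell
# Bring up the test dependencies (database, MQ server, cache) in the background
docker-compose -f docker-compose.test.yml up -d

# Run the integration test suite against those services
./gradlew integrationTest

# Tear everything down again, including the networks and volumes
docker-compose -f docker-compose.test.yml down -v
```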

We are so passionate about the benefits of Docker that we help our clients to benefit from it too.  When clients use Docker well, it makes our job easier. 

In our final blog in this series, we’ll be discussing:

  • Container management, security, licensing and integration with other tools
  • Using Docker in Production 

If you have questions about how your team can incorporate Docker, or how to maximise its potential on your latest project, the team at IntegrationWorks can help: from executive strategy briefings and defining Docker in the context of your enterprise environment, through to training and weighing the costs and trade-offs for your business.
