Docker: Using Great Technology to help Enterprises go faster

The team here at IntegrationWorks are all about automation and the deployment of faster, more agile and robust integration platforms. In this three-part series we will highlight the importance of Docker and why it’s a game-changer. Our DevOps team have been using Docker for the last 12 months, and in this series we will explain how we use it for our Proofs of Concept, R&D sessions and for building our testing solutions.

To technology enthusiasts and IT business leaders who may not be familiar with Docker – we advise you to take note. Docker is an open-source project that automates the deployment of applications inside software containers, or, according to the Docker website:

“Docker containers wrap up a piece of software in a complete file system that contains everything it needs to run: code, runtime, system tools, system libraries – anything you can install on a server. This guarantees that it will always run the same, regardless of the environment it is running in.” (accessed from Docker.com)

So why is it so important? Because in today’s development teams, collaboration and simplicity are key to faster, more nimble application development. Docker facilitates this by providing a common framework for developers and other IT teams to collaborate on applications, enhance their DevOps capability and innovate faster with greater control.

For non-technical business leaders this might be relevant to you if you have heard the following:

  • “We can’t deploy because the environment isn’t ready”
  • “It worked in the development environment but doesn’t work in test”
  • “It worked in test but it doesn’t work in production”

Or how many times have you heard developers and other IT teams complaining about the deliveries and tools used by other teams? 

Docker tackles these common problems by allowing teams to spin up build environments in seconds, greatly reducing environment-related issues. Docker can be used by a multitude of teams, thus reducing team frictions and tribal warfare in the office.
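To make “environments in seconds” concrete, here is a sketch of what this looks like at the command line. The image, container name and port below are purely illustrative, not taken from any of our projects:

```shell
# Pull and start a ready-made PostgreSQL environment in one command
# (POSTGRES_PASSWORD is required by recent versions of the official image)
docker run -d --name throwaway-db -e POSTGRES_PASSWORD=secret -p 5432:5432 postgres

# Seconds later it is up and running
docker ps

# ...and it can be torn down just as quickly
docker rm -f throwaway-db
```

Compare that with raising a ticket and waiting days for a shared database instance to be provisioned.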

So, evidently, if you can build environments faster and increase quality through each stage of the SDLC then this is going to save you time and cost.   

Now, we’re going to get a little more technical on the benefits of Docker.

Docker gives developers the freedom to define environments and deploy apps faster in a containerized environment that is pushed out to IT ops – who, in turn, have the flexibility to respond quickly to testing and change. Developers own all the code from the infrastructure to the app, and IT ops have the ability to standardize, secure and scale the operating environment. Additionally, Docker allows teams to work in portable environments – so shifting from laptops to team environments, or from private infrastructure to public clouds, couldn’t be easier.
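“Developers own all the code from the infrastructure to the app” can be as simple as a Dockerfile checked into the application’s repository. The base image, file paths and port below are assumptions for the sake of illustration:

```dockerfile
# Dockerfile -- the whole runtime environment, versioned alongside the code
# Base image pins the OS layer and the Java runtime (illustrative choice)
FROM openjdk:8-jre

# The application artifact produced by the build (hypothetical path)
COPY build/libs/app.jar /opt/app.jar

# Port the application listens on
EXPOSE 8080

# Single, well-defined entry point
CMD ["java", "-jar", "/opt/app.jar"]
```

Because this file travels with the source code, the environment a developer runs on a laptop is, byte for byte, the environment IT ops deploys.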

From a technical and business standpoint, Docker solves a myriad of problems frequently encountered in the developer/deployment world. Collaboration across teams means less downtime and fewer meetings, containerized environments mean low-risk test deployments with easy access, application flexibility and portability mean remote workforces can collaborate all over the world – the list goes on.

Additionally, Docker is integrated into major cloud providers like Amazon Web Services, Google Cloud Platform and Microsoft Azure, so containers can easily fit into and interoperate with existing infrastructure and cloud environments.

Sounds like a game changer right?

Based on the awesomeness of this platform, we’ve been using Docker for the last 12 months to develop POCs, automate and enhance our builds, and automate and increase the scale of our testing.

Docker for Proof of Concepts

Our DevOps team use Docker to quickly spin up software and make it available for Proofs of Concept.

For technologists, Docker allows them to pull down and start up an environment in a matter of minutes, without having to work through the process of manually unpacking, configuring and installing it. In the professional opinion of one of our Senior Developers, Rhys: “Using Docker is like a package manager on steroids”.  In a recent integration visualization POC, the team built a container running InfluxDB (https://influxdata.com/time-series-platform/influxdb/) + Grafana (http://grafana.org/), saving them time and effort in manually installing and managing dependencies across the two solution components.
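A minimal sketch of how a similar InfluxDB + Grafana stack might be declared with Docker Compose. The image names are the official Docker Hub images, but the tags, ports and volume names are assumptions, not taken from the actual POC:

```yaml
# docker-compose.yml -- illustrative InfluxDB + Grafana stack
version: "3"
services:
  influxdb:
    image: influxdb:1.8          # official InfluxDB image (tag assumed)
    ports:
      - "8086:8086"              # InfluxDB HTTP API
    volumes:
      - influxdb-data:/var/lib/influxdb   # persist time-series data
  grafana:
    image: grafana/grafana       # official Grafana image
    ports:
      - "3000:3000"              # Grafana web UI
    depends_on:
      - influxdb                 # start the database first
volumes:
  influxdb-data:
```

One `docker-compose up` brings up both components, wired together, with no manual dependency management.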

Docker for Builds

We use a number of build tools at IntegrationWorks, including Maven and Gradle, for which there are several Docker plugins.  These plugins allow build tools to build and run Docker images.  For example, for Gradle we have been using – https://github.com/Transmode/gradle-docker.   Like other plugins, the Gradle plugin allows the developer to define their Dockerfile (or DSL based configuration) within their source code repository and make it part of the build process.  Rhys says “It works really nicely for micro framework (Spring Boot etc.) based development as you can completely self-contain your microservice within a Docker container and quickly distribute/deploy it to your target machines”.
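As a sketch of what the Gradle integration looks like, the snippet below follows the pattern in the Transmode gradle-docker plugin’s documentation. The plugin version, task name and file paths are assumptions for illustration:

```groovy
// build.gradle -- sketch of the Transmode gradle-docker plugin in use
buildscript {
    repositories { mavenCentral() }
    dependencies {
        // plugin coordinates from the project's README; version assumed
        classpath 'se.transmode.gradle:gradle-docker:1.2'
    }
}

apply plugin: 'docker'

// Build a Docker image as part of the normal build pipeline
task buildDocker(type: Docker, dependsOn: build) {
    push = false                                // set true to push to a registry
    applicationName = jar.baseName              // image named after the jar
    dockerfile = file('src/main/docker/Dockerfile')  // Dockerfile kept in the repo
    doFirst {
        // stage the built jar so the Dockerfile can COPY it
        copy {
            from jar
            into stageDir
        }
    }
}
```

Running `gradle buildDocker` then produces a self-contained image of the microservice, ready to distribute to target machines.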

Docker for Test

There are many forms of test, ranging from unit test to user acceptance – at IntegrationWorks we’ve worked with them all over the years.   These days we are passionate about building Continuous Delivery pipelines that include industrial-scale functional and non-functional test capabilities as early as possible in the SDLC.  For the most advanced teams, this allows developers to run complete suites of functional tests, regression tests, performance tests, and even security tests against their code before it ever gets deployed to a formal (and often inflexible!) test environment.  Building this kind of test capability has immense benefits in terms of agility, quality and cost reduction.  But it requires highly dynamic environments that are defined as code and built and managed by developers, hence our use of Docker.  Docker allows us to spin up, scale, and tear down environments in seconds, including containers for the system under test, containers for test execution, containers for virtual services, and more.  Without Docker, and reliant on inflexible environments, industrial-scale test automation is either slow or very expensive.  Docker and testing go together very well and there are many options and tools that can be used – so we’ll cover this in greater detail in a later blog post.
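As a hedged sketch of the “environments defined as code” idea for testing, a Compose file can declare the system under test, a virtual service, and a test-runner container side by side. Every service name, image and path here is hypothetical:

```yaml
# docker-compose.test.yml -- hypothetical test stack, rebuilt for every run
version: "3"
services:
  sut:                           # the system under test, built from this repo
    build: .
  stub-backend:                  # virtual service standing in for a real dependency
    image: mockserver/mockserver
  tests:                         # test-runner container; exits when the suite finishes
    build: ./tests
    depends_on:
      - sut
      - stub-backend
```

A CI job can then run `docker-compose -f docker-compose.test.yml up --abort-on-container-exit` and tear the whole stack down with `down` afterwards, so every test run starts from a clean, identical environment.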

We are so passionate about the benefits of Docker that we help our clients to benefit from it too.  When clients use Docker well, it makes our job easier. 

In our next two blogs, we’ll be discussing:

  • Further details on how we use Docker for testing 
  • Solutions we use for managing multi-container stacks
  • Container management, security, licensing and integration with other tools
  • Using Docker for Production 

If you have questions on how your team can incorporate Docker, or how to maximize its potential for your latest project, the team at IntegrationWorks can help – from executive strategic briefings and defining Docker in the context of your enterprise environment, to training and identifying the costs versus trade-offs for your business.
