Docker was released to the general public in 2013. Since then it has steadily gained popularity among developers, especially those who attend conferences and keep their knowledge up to date. Adopting this tool opens new possibilities and is slowly displacing solutions based on virtualization. Despite its many advantages, implementing Docker is a challenge not only for developers but also for the other people responsible for software and systems in a company. In this article, I will share what Docker is, why using it is beneficial, and finally how to prepare for its implementation in your organization.
Docker is software that allows developers to build and deploy applications using containers. We might say it sits somewhere between a virtual machine and the “bare” host system. It operates on so-called images: packaged, ready-to-use environments for our app. We run a pre-configured image using Docker and immediately have a running system (almost always one of the Linux distributions) that contains all the settings and dependencies necessary for our application. What’s important is that although Docker uses the host’s resources, it isolates our app from the host.
There are a few reasons why Docker is constantly gaining popularity. Using containers brings several benefits.
Firstly, it gives developers complete control over the environment in which the application will run. All that’s needed is to describe the image configuration in a Dockerfile and add it to the repository. If the app requires any additional configuration to run (for example, installing extra system packages), all of it can be defined, version-controlled, and reviewed in the Dockerfile.
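To make this concrete, here is a minimal sketch of such a Dockerfile for a hypothetical Python web app; the base image, package names, and file paths are illustrative assumptions, not taken from any particular project:

```dockerfile
# Illustrative Dockerfile for a hypothetical Python app.
FROM python:3.8-slim

# Extra system packages the app needs, installed and version-controlled here
RUN apt-get update \
    && apt-get install -y --no-install-recommends libpq-dev \
    && rm -rf /var/lib/apt/lists/*

WORKDIR /app

# Install the app's library dependencies from the repository
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code and define how to start it
COPY . .
CMD ["python", "app.py"]
```

Because this file lives in the repository, every change to the environment goes through the same review process as the application code itself.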
Secondly, the isolation provided by containers removes the need to manage conflicts between different versions of the same libraries used in different projects. You don’t have to worry that applications on the same Docker host have mutually exclusive dependencies, because each one runs in an independent environment tailored to its requirements.
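A quick way to see this isolation in practice: two containers built from official Python images of different major versions run side by side on the same host without interfering with each other.

```shell
# Each container carries its own interpreter and libraries,
# so incompatible versions coexist on one host.
docker run --rm python:2.7 python --version
docker run --rm python:3.8 python --version
```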
Having Dockerfiles in the repository also spares new developers from installing the dependencies needed to run the application. All they have to do is set up their development environment and run the app in a container, without worrying whether they have the necessary libraries or the right tool versions installed.
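The onboarding workflow can be sketched in a few commands; the repository URL, image tag, and port below are hypothetical placeholders:

```shell
# A new developer only needs Git and Docker installed locally.
git clone https://example.com/our-app.git   # repository URL is illustrative
cd our-app
docker build -t our-app:dev .               # build from the Dockerfile in the repo
docker run --rm -p 8000:8000 our-app:dev    # run the app in an isolated container
```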
Docker also makes testing new solutions much easier. Since launching a new environment for an app does not require provisioning another virtual machine, a team that wants to try different solutions to a problem can spin up multiple environments based on images from the registry in just a few simple steps. What’s more, Docker containers simplify reacting to problems with an application. If, after deploying a new version, we decide to roll back the changes, all that needs to be done is to point the container at the image with the previous app version. A correctly configured image registry will store former versions for a specified period of time.
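Such a rollback might look like the following sketch; the registry address, image name, and tags are assumptions for illustration:

```shell
# Roll back by replacing the container with one built from the
# previous, known-good image tag still held in the registry.
docker pull registry.example.com/our-app:1.4   # previous version
docker stop our-app && docker rm our-app
docker run -d --name our-app -p 8000:8000 registry.example.com/our-app:1.4
```

In practice an orchestrator or deployment pipeline usually automates this step, but the principle is the same: the old environment is a tagged image, not a server to be rebuilt.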
As with every piece of software, implementing Docker is not only beneficial but also challenging. The first and obvious challenge concerns the developers: the need to learn how to use this technology. That should not be a problem, though; if they selected Docker, they probably already know it or are willing to learn. But the developers alone are not enough. The whole organization has to be prepared to use this tool, at least from the point of view of the broadly understood IT infrastructure of the company.
Since Docker containers run on Linux, developers must either use a *nix system or have a Windows edition that supports Docker Desktop (such as Windows 10 Pro, where containers run inside a lightweight Linux virtual machine). Although this requirement should not shock anyone nowadays, adopting Docker may still make hardware or system updates necessary.
As I mentioned, it’s not just about developers and their machines. If applications are to remain on company servers, admins must also become familiar with the technology: Docker has to be installed on the servers, which in turn need a supported operating system. And although Docker is considered safe, it requires root privileges and is only a thin layer between the base system and the apps, so security and access control can’t be overlooked. People responsible for the infrastructure should support developers and continuously monitor the environments in which Docker operates. Even though it’s said that a Docker image, once prepared, will run anywhere Docker does, at critical points such as production servers, where process permissions are strictly controlled and network traffic is kept to a minimum, it may happen that during the first attempts the app in Docker “does not fit” the established rules. To deal with such issues, the IT department should be aware of what technology is being used.
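One common mitigation for the root-privilege concern is to make sure the application itself does not run as root inside the container. A minimal sketch, with an assumed user name and paths:

```dockerfile
# Illustrative pattern: create a dedicated unprivileged user
# so the app process inside the container does not run as root.
FROM python:3.8-slim
RUN useradd --create-home appuser
USER appuser
WORKDIR /home/appuser/app
COPY --chown=appuser . .
CMD ["python", "app.py"]
```

This does not replace host-level hardening, but it narrows what a compromised application process can do inside the container.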
I also want to mention that libraries and tools that are dependencies of a particular app are downloaded directly into the container, which makes controlling what the app actually uses more demanding. On the one hand, Docker lets developers evaluate alternative tools much more easily; on the other, it increases the risk of using software that is banned, for example due to license issues.
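One hedged way to audit what actually ended up inside an image (assuming a Python app and the hypothetical `our-app:dev` tag from earlier) is to list the installed packages in a throwaway container:

```shell
# Print the Python packages baked into the image, then exit.
docker run --rm our-app:dev pip list
```

Running such checks in the CI pipeline gives the organization a regular view of the dependencies each image contains.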
Using Docker will make developers’ and testers’ jobs easier and enable them to experiment, ultimately making the delivered software more thoroughly tested and simply better. Docker is also one of the pillars of microservices, so if that’s how your project architecture is built, implementing it will let you use available resources more efficiently and reduce maintenance costs. However, remember that using Docker will not only force developers to learn the technology and transfer part of the responsibility for the runtime environment to them; it is also a challenge for the people responsible for the company’s IT infrastructure, who need to adapt it to Docker. Still, it’s worth investing the time necessary to prepare for Docker implementation, as the transition is going to pay off, making subsequent development work smoother and opening new possibilities for development teams.