Containerization has become a widely adopted practice across the industry. Organizations are moving to deploy their applications as containers rather than with the traditional on-premise method. In this blog, we'll look at the benefits containerization provides to developers as well as DevOps teams. Let's begin the journey.
What is Containerization?
Containerization has been one of the most popular concepts in the IT sector for the past few years. Since the introduction of Docker, a containerization platform, containers have evolved into a larger ecosystem. Let's understand the reason behind their popularity.
In general, a container is a collection of software unified by a namespace, with access to an operating system kernel that is shared among multiple containers. Each container runs in an isolated environment, with little or no access to the others. Docker adds to this definition by saying that a container is a runtime instance of a Docker image. A running Docker container includes three things:
- A Docker image
- An execution environment
- A standard set of instructions
If you are from an Object-Oriented world, you can use the analogy of classes and objects to understand this topic. Docker images are like classes, where containers are the running instance of those images, just like objects.
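To make the class/object analogy concrete, here is a minimal sketch (assuming Docker is installed; the public `nginx:alpine` image is used purely as an example) that starts two containers from the same image:

```shell
# One image (the "class")
docker pull nginx:alpine

# Two independent running containers (the "objects") from that one image
docker run -d --name web1 nginx:alpine
docker run -d --name web2 nginx:alpine

# Both containers share the same image but have their own isolated state
docker ps --filter "name=web"

# Clean up
docker rm -f web1 web2
```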
A Docker ecosystem mainly consists of three components:
- Docker Engine:
The Docker Engine provides the runtime environment for containers. It also acts as the packaging tool that helps build Docker images.
It must be installed on the host operating system where the images are actually going to be executed.
- Docker Hub:
An online cloud service provided by Docker that acts as a repository for storing Docker images, which can be shared among multiple users.
- Docker CLI:
This component is responsible for making API calls to the Docker Engine. These calls are used to manage the container lifecycle on the Docker Engine.
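As a sketch of how the CLI drives the engine through a container's lifecycle (the public `hello-world` image is used here just as an example):

```shell
# Each CLI command below becomes an API call to the Docker Engine
docker create --name demo hello-world   # create a container (not yet running)
docker start demo                       # start it
docker logs demo                        # read its output
docker stop demo                        # stop it (a no-op if it already exited)
docker rm demo                          # delete it
```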
Benefits of Containerization
- The main benefit for a developer is a standard for developing and packaging an application.
For example, a JAR (the artifact for a Java application) requires the same version of Java to run as the one it was built with.
- Testing an application and verifying its dependencies on other application modules is as easy as running a simple
`docker run <dependent-container-image>`
- Developers get easy support for the new microservice architecture.
- Another major benefit is that it alleviates the platform compatibility issues.
- Using containers simplifies release management.
- The deployments are much more reliable and speedy, which ultimately helps in frequent releases.
- The application lifecycle is simplified: the application is configured once and run multiple times (by scaling up the replicas of a container).
- Environment consistency (Dev, QA, and Prod) can be easily achieved.
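The environment-consistency point above can be sketched as follows. The image tag `myapp:1.2.3` and the `APP_ENV` variable are hypothetical; the idea is that the same immutable image runs everywhere, with only configuration injected per environment:

```shell
# The same image in every environment; only the injected config differs
docker run -d -e APP_ENV=dev  myapp:1.2.3
docker run -d -e APP_ENV=qa   myapp:1.2.3
docker run -d -e APP_ENV=prod myapp:1.2.3
```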
Containers: A common language between Dev and Ops
In a containerized environment, if any issues come up in any of the environments (especially production), they can be debugged easily at the container level and then communicated to the Dev team for a (quick) fix.
The Dev team can also easily replicate the problem, since the container runs in an isolated environment every time, which eliminates runtime issues and host compatibility problems.
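For instance, if a particular version misbehaves in production, a developer can reproduce it locally by running the exact same image (the registry, image name, and tag below are hypothetical placeholders):

```shell
# Pull and run the exact image that is failing in production
docker pull registry.example.com/myapp:1.2.3
docker run --rm -it registry.example.com/myapp:1.2.3

# Or open a shell inside it to inspect the environment directly
docker run --rm -it registry.example.com/myapp:1.2.3 sh
```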
Container Lifecycle in Docker
Talking about the container lifecycle, we'll see how a container is born and how it eventually reaches the deletion state. The foremost requirement for creating a container is a Docker image.
A Docker image is a template that gets instantiated into a running container. A Docker image can be obtained in two ways: either we pull it from Docker Hub, an image repository, or we build our own image using a Dockerfile.
A Dockerfile is a file where we mention all the packages and dependencies that need to be present to run our application. This is the concept that actually makes the container independent, since everything is made available within the container and it doesn't have to share anything with the host OS. I'll create a separate blog post covering the steps and best practices for writing a Dockerfile.
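As a taste of what a Dockerfile looks like, here is a minimal sketch for a hypothetical Java application (the base image tag and JAR path are assumptions); each instruction bakes something the application needs into the image itself:

```dockerfile
# Base image provides the exact Java runtime the app was built with
FROM eclipse-temurin:17-jre

# Copy the application artifact into the image
COPY target/app.jar /opt/app.jar

# Command executed when a container is started from this image
CMD ["java", "-jar", "/opt/app.jar"]
```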
Now that we have the Docker image, we can simply run it to get a Docker container. That container is the running instance of our application.
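Assuming a Dockerfile in the current directory, the image-to-container step looks like this (the tag `myapp:1.0` is a hypothetical example):

```shell
# Build an image from the Dockerfile in the current directory
docker build -t myapp:1.0 .

# Instantiate the image into a running container
docker run -d --name myapp myapp:1.0
```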
A Quick Hands On
- Docker Installation: If you don't have Docker installed on your system, follow this documentation. It's pretty straightforward.
Note: After completing the installation, follow the post-installation steps, which take effect only after a reboot. This will let you run the Docker CLI without sudo permissions.
- DockerHub Account: Make sure you have an account on DockerHub. If not, sign up here.
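Once both steps are done, a quick sanity check (these are standard Docker commands) confirms that the daemon and your Docker Hub login work:

```shell
docker --version          # CLI is installed
docker run hello-world    # daemon can pull and run a container
docker login              # authenticate against Docker Hub
```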
After reading this blog, you'll be familiar with what containerization is and the benefits teams get from building and deploying applications as containers. Also, with the quick start I've provided, you'll have hands-on experience with most of the Docker commands, and after a little practice, these commands will be very handy for you.
Still, if you have any doubts/suggestions, you can contact me directly at firstname.lastname@example.org
Also, I would like to thank you for sticking around to the end. If you liked this blog, please show your appreciation with a thumbs-up, share it, and suggest how I can improve my future posts to suit your needs. Follow me to get updates on different technologies.