Docker 101
Production team: “Hey, I think there is something wrong with the code”
Development team: “But, it works just fine on my laptop”
This problem may occur because of differences in the computing environment between development and production. Docker is one way to solve this problem. Let’s discuss Docker further.
What is Docker?
Docker is a tool that automates the deployment of an application in a container so that the application can run in other environments. Docker solves the problem of inconsistent computing environments: with Docker, development, staging, and production can all have the same environment.
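For example, here is a minimal Dockerfile sketch for a small Python application (the base image, file names, and start command are just placeholders for illustration):

```Dockerfile
# Start from an official Python base image
FROM python:3.10-slim

# Copy the application code into the image and install its dependencies
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .

# Command that runs when a container starts from this image
CMD ["python", "app.py"]
```

Anyone who builds and runs this image gets the same Python version and the same dependencies, whether on a laptop or on a production server.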
Docker vs Virtual Machine
At first glance, Docker does look similar to a virtual machine. However, the two are different. Here are the main differences:
Virtual Machine:
- Takes up a lot of memory space because you have to pre-allocate a certain amount of RAM
- Long boot-up time
- Compatibility issues when moving to a different platform
- The data volume cannot be shared
Docker:
- Takes up less memory space because you don’t need to pre-allocate RAM
- Boot-up time is faster because it does not need to boot an OS
- Portable
- Volume data can be shared and reused across multiple containers
Docker Components
Docker Registry
A Docker registry is storage for Docker images (you can think of it as a Git repository, but for Docker images); Docker Hub is the best-known public registry. Docker images can be kept in public or private repositories (like Git), and, just like Git, you can pull from and push to Docker Hub.
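Pulling and pushing images looks very much like working with Git (the account and image names below are placeholders):

```sh
# Download an image from Docker Hub
docker pull python:3.10-slim

# Upload your own image to Docker Hub (after logging in)
docker login
docker push myaccount/myapp:1.0
```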
Docker Images
Docker images are used for creating containers. A Docker image contains all the dependencies of an application. We can pull Docker images from Docker Hub or push (upload) them to it.
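As a rough sketch, building an image from a Dockerfile and tagging it for Docker Hub looks like this (the tag is a placeholder):

```sh
# Build an image from the Dockerfile in the current directory
docker build -t myaccount/myapp:1.0 .

# List the images available locally
docker images
```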
Docker Containers
Docker containers are runtime instances of Docker images. A container holds everything our application needs to run. In practice, an application may need more than one container (and more than one image).
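For instance, starting a container from the image built above might look like this (the image name and port are assumptions):

```sh
# Start a container in the background, mapping port 8000 on the host to the container
docker run -d -p 8000:8000 --name myapp-container myaccount/myapp:1.0

# See the running containers
docker ps

# Stop and remove the container when you are done
docker stop myapp-container
docker rm myapp-container
```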
Docker Compose
Docker Compose allows an application made up of multiple containers to be started with just a single command.
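A minimal docker-compose.yml sketch for a web application with a database might look like this (the service names, images, port, and credentials are placeholders):

```yaml
services:
  web:
    build: .
    ports:
      - "8000:8000"
    depends_on:
      - db
  db:
    image: postgres:14
    environment:
      POSTGRES_PASSWORD: example
```

With this file in place, `docker compose up` starts both containers with a single command.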
Docker Orchestration
Docker (container) orchestration is the process of automating the deployment, management, and scaling of containerized services. It helps you manage and coordinate many containers at once.
Docker orchestration can help automate the following:
- Scheduling containers
- Deploying containers
- Configuring applications in the environment where their containers run
- Scaling containers
- Sharing resources between containers
- Monitoring containers
How Docker Orchestration Works
Docker orchestration tooling consists of Docker Machine (a tool that provisions hosts and installs the Docker Engine on them), Docker Swarm (a tool that groups multiple Docker hosts into a single virtual host), and Docker Compose (which defines the required containers and deploys multi-container applications).
First, a configuration file tells the Docker orchestration tool how to connect the containers and where to store logs. Next, the orchestration tool schedules the deployment of the containers and chooses the right host for each one. The tool then manages the container lifecycle based on predefined conditions.
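As a rough sketch with Docker Swarm (the service name and image are placeholders), this boils down to commands like these:

```sh
# Turn the current Docker host into a swarm manager
docker swarm init

# Deploy a service; the swarm schedules its containers on the available hosts
docker service create --name web --replicas 3 -p 8000:8000 myaccount/myapp:1.0

# Scale the service up; the swarm starts the extra containers automatically
docker service scale web=5
```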
Benefits of Implementing Docker Orchestration
- Easier scalability: specific services can be scaled with a single command, without affecting the rest of the application
- Simpler deployments
- Simpler installation with fewer dependency problems
- Increased security: each application process is wrapped in its own container, so only specific resources are shared
Implementation of Docker Orchestration
In my Software Programming Project course, my group’s CI/CD jobs run in Docker containers. The Docker images we use in our backend project are python, sonar, and ruby. We use the python image because our backend is built with the Django framework, the sonar image to connect our application to Sonar, and the ruby image because our deployment to Heroku relies on Ruby-based tooling.
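As a very rough illustration only (our actual configuration differs, and the syntax below assumes GitLab CI; the image tags, job names, and variables are placeholders), a pipeline using those three images could look like this:

```yaml
# Hypothetical CI sketch: each job runs inside a container from a different image
test:
  image: python:3.10
  script:
    - pip install -r requirements.txt
    - python manage.py test

sonar:
  image: sonarsource/sonar-scanner-cli
  script:
    - sonar-scanner

deploy:
  image: ruby:3.1
  script:
    - gem install dpl
    - dpl --provider=heroku --app=$HEROKU_APP --api-key=$HEROKU_API_KEY
```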