Why do we need Docker?
Imagine that I allocate 10 GB of RAM to each of my VMs and I have 6 microservices running on different virtual machines. These VMs then require 60 GB of RAM, so my host machine needs roughly 65 GB of RAM to accommodate them. Obviously, this is not a good idea for such an architecture, as I am wasting a lot of resources. But if we use Docker here, we can containerize the applications and no virtual machines are required. By using Docker, the containers share the host's resources, and hence we use those resources efficiently.
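As a rough sketch of what this looks like in practice (the image names and memory figures below are illustrative, not taken from the example above), each microservice can run as a container with an explicit memory cap, and the host only needs enough RAM for what the services actually use:

    # Run two of the microservices as containers, capping each at 1 GB of RAM
    # (my-org/service-a and my-org/service-b are hypothetical image names)
    docker run -d --name service-a --memory=1g my-org/service-a
    docker run -d --name service-b --memory=1g my-org/service-b

    # Check how much memory the containers are actually using
    docker stats --no-stream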
Why use Docker?
Efficiency is the most important reason to use Docker. Docker containers run at the kernel level of the host rather than on a guest operating system, which means containers can share the host's resources efficiently.
Docker Architecture
Docker uses a client-server architecture. These are the essential components:
- Docker Client
- Docker Host
- Docker Daemon
- Docker Registry
Let us understand these in brief:
- Docker Client is used to manage the components of the Docker Daemon, such as images, containers, and data volumes.
- Docker Host is the machine on which the Docker Engine is installed.
- Docker Daemon acts like the brain of the whole operation. It evaluates the request, talks to the underlying operating system, and provisions the container.
- Docker Registry is a store from which we can pull images. Among the most popular images are nginx and the Apache HTTP Server.
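A quick way to see this client-server split on a machine where Docker is installed is to ask both sides for their version; the output is divided into a Client section and a Server (daemon/engine) section:

    # The Docker client sends this request to the Docker daemon,
    # which reports its own version alongside the client's
    docker version

    # More detail from the daemon: storage driver, number of images and containers, etc.
    docker info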
Apart from these, the following concepts are important:
Docker Image: A Docker image is the template that is used to build a container. To download an image and start a container from it, run: docker run <image>. To remove an image, run: docker rmi <image>.
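For example, assuming the official nginx image from Docker Hub, a typical image workflow looks like this:

    # Download (pull) the nginx image from the registry without starting a container
    docker pull nginx

    # List the images stored locally
    docker images

    # Remove the image again
    docker rmi nginx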
Docker Container: A container is a running instance of a Docker image. The basic purpose of Docker is to run containers, and containers are started and managed with the docker run command.
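Continuing with the nginx image as an example, the basic container lifecycle looks like this:

    # Create and start a container from the nginx image in the background,
    # mapping port 8080 on the host to port 80 inside the container
    docker run -d --name web -p 8080:80 nginx

    # List running containers
    docker ps

    # Stop and remove the container
    docker stop web
    docker rm web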
Advantages of Docker
Build, ship, and run any app, anywhere: By using Docker you can build, ship, and run any app on any platform that runs Docker, such as Windows or Linux (see the sketch after this list).
Greater efficiency: As mentioned earlier, containers run at the kernel level, so Docker provides better efficiency than full virtual machines.
Isolation: By isolation, I mean that dependencies or settings within a container will not affect any installations or configurations previously made on your computer, or any other container that may be running. By using a separate container for each application, we can avoid dependency conflicts.
Environment management: Docker makes it easy to maintain different environments for different purposes. Here, environment means that we can create separate containers for development, testing, and production on the same Linode and easily deploy each one separately.
Continuous integration: Docker works well as part of continuous integration pipelines with tools such as Travis CI and Jenkins.
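As a minimal sketch of the "build, ship, and run" workflow (the file contents, image tag, and registry namespace below are illustrative assumptions, not from this article), an application can be described in a Dockerfile and built into an image:

    # Dockerfile: package a simple static site with nginx
    FROM nginx:alpine
    COPY ./site /usr/share/nginx/html

With that file in place, the same image can be built once and then run and shipped everywhere:

    # Build the image and tag it (myuser is a hypothetical Docker Hub namespace)
    docker build -t myuser/my-site:1.0 .

    # Run one container for development and a separate, isolated one for testing
    docker run -d --name my-site-dev  -p 8080:80 myuser/my-site:1.0
    docker run -d --name my-site-test -p 8081:80 myuser/my-site:1.0

    # Push the same image to a registry so it can be pulled and run anywhere
    docker push myuser/my-site:1.0

Because the identical image is used in every environment, this is what makes the "run anywhere", isolation, and environment-management advantages practical.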
Disadvantages of Docker
Still under development: Docker is not yet a fully mature technology; it is still growing. Some feature requests are still in progress, such as container self-registration, container self-inspection, copying files from the host to the container, and many more.
Future Scope
Many times we ask whether Docker is the future of virtualization, and yes, it is. As the example above shows, a lot of memory is wasted with VMs, and we cannot re-allocate the memory that has been reserved because it is blocked. This is a major issue because RAM is costly. So, how can we avoid this problem? By using Docker: the host allocates exactly the amount of memory that each Docker container requires, so there is no memory wastage. You can download Docker from the official Docker website. Feel free to contact us if you have any queries regarding this topic.