Who are you, Docker?

"Docker" is a big buzzword these days. Everybody wants to use it, everybody wants to learn about it. but who is this mysterious whale?

Let's start by defining what Docker is.

Docker is a tool designed to make it easier for us to perform virtualization tasks and to create, deploy and run applications in a different way than we used to. With Docker, we perform these tasks using Containers (we will understand what those are in a bit).

Virtualization in a nutshell.

Virtualization is the process of creating something virtual rather than actual. This process allows us to create instances of operating systems, applications and environments without having them physically present on our workstation or on a dedicated remote machine.
There are several very popular and well-known solutions for virtualization purposes, such as KVM, VirtualBox, Vagrant, oVirt, etc.

So what is the hype around Docker and how is it different from a standard VM?

Well, Docker uses a different approach to virtualization. Let me try to break it down for you with a fairly primitive example.
Let's say you are developing an application.
You have your application code, which is supposed to run on a Linux server. In order for it to run properly, you have to make sure that the target machine has the right OS, runtime environment, bins and libraries for your project, and that your third-party dependencies are installed. That is what we call "provisioning" your environment.

So let's roughly describe the steps to create an ecosystem for our application to run on, assuming that we skip the virtual machine setup and OS installation, which by itself takes a lot of time (a small shell sketch of these steps follows the list):

  • Install the runtime environment
  • Install the dedicated bins and libraries
  • Define proper environment variables and system path
  • Add our dependencies
  • Import our project code
  • Run our application
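
To give a feel for what that could look like in practice, here is a rough, purely illustrative shell sketch for a small Python application. Every package name, path and URL below is an assumption for the sake of the example, not a real project:

    # Manual provisioning on a fresh Linux VM - illustrative sketch only
    sudo apt-get update
    sudo apt-get install -y python3 python3-pip git    # runtime environment and basic tools
    export APP_HOME=$HOME/myapp                         # environment variables and paths
    git clone https://github.com/example/myapp.git $APP_HOME   # import the project code
    pip3 install -r $APP_HOME/requirements.txt          # third-party dependencies
    python3 $APP_HOME/app.py                            # run the application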

Even assuming this part goes successfully, environmental differences and issues often mean that the application runs fine on the VM but produces unexpected behavior in production, staging or any other environment it is deployed to along the way.

So what was Docker's approach?

Docker's motto is "Build-Ship-Run".

In this approach, we can take everything we need (bins, libraries, project code, provisioning steps, etc.) and describe it in one single file (a Dockerfile). This file is built into a Docker image. An image is basically a "snapshot" if you will, or a template from which we can create multiple Containers. Once the image is ready we can ship it to a central repository (for example, Docker Hub). From there, anyone with access can download the image and use it (run the container).
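
To make the motto concrete, here is a minimal sketch of what such a Dockerfile and the "Build" and "Ship" steps could look like. The base image, file names and repository name are illustrative assumptions, not a real project:

    # Dockerfile - a minimal illustrative sketch
    # Base image providing the OS layer and the runtime environment
    FROM python:3.11-slim
    # Working directory inside the image
    WORKDIR /app
    # Third-party dependencies
    COPY requirements.txt .
    RUN pip install -r requirements.txt
    # Our project code
    COPY . .
    # How to run the application
    CMD ["python", "app.py"]

    # "Build" the image from the Dockerfile, then "Ship" it to a registry such as Docker Hub
    docker build -t myuser/myapp:1.0 .
    docker push myuser/myapp:1.0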

What are Containers?
A Container is a standardized unit of software that includes the application code and all of the necessary dependencies. Docker Containers run via the Docker Engine and share the host operating system's kernel instead of bundling a full guest OS - that means that you just need Docker installed on the target machine to run a Container. Unlike a full virtual machine, a Container uses the host machine to manage its processes and can be set up, started, stopped and removed with a simple, short command line operation.
What we get is a more robust, lightweight, standalone package of software. This containerized approach also means that your application will run exactly the same everywhere you put it. It eliminates a lot of maintenance and troubleshooting because it is self-contained inside the Docker container.
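
For instance, the lifecycle operations mentioned above boil down to short Docker CLI commands like these (the image and container names are made up for illustration):

    docker run -d --name myapp myuser/myapp:1.0   # create a container from the image and start it
    docker stop myapp                              # stop it
    docker start myapp                             # start it again
    docker rm myapp                                # remove it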


In this simple illustration that I found online, we can see how a Docker container is significantly different from a VM.

There is no defined limit to the number of Containers you can run on a single machine. A single VM or server can host as many Containers as its CPU and other resources can handle.

Now let's see what the previous process can look like (again assuming that we skip the machine setup and OS installation):


  • Install Docker
  • Run a single-line terminal command. That command will download the desired image, create a container from it and turn it "On" (see the sketch right after this list).
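
As a rough sketch, that single command could look something like this (again, the image name is only an example):

    docker run -d myuser/myapp:1.0
    # Docker pulls the image from Docker Hub if it is not already present locally,
    # creates a container from it and starts it.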


Granted, like everything in life, Docker is not perfect and has its share of downsides and technical limitations that we will not cover this time.
I hope that this article has helped some of you better understand the mysterious blue whale. 

You can go ahead and download Docker from the official website. Have fun and never stop learning. :) 


