Why Docker is a winner over VMs

Virtualization - Software Defined X

For many years, user software has run either on "bare metal" systems or in a virtual machine (VM). The first variant requires an operating system with full control over the underlying hardware. The second option also requires an operating system to control the hardware, but here several operating systems share the hardware resources.

Software installed on dedicated computers with a fixed operating system has serious disadvantages: a lack of flexibility - it is very difficult to port to other systems - and time-consuming updates of the applications, including their runtime environments. These two constraints make it difficult for IT to react flexibly to changes in business requirements.

Virtualization consolidates IT environments

Virtualization technology breaks up this rigid, monolithic IT structure. Virtualization platforms, also called hypervisors, make it possible for several virtual machines to share a single physical system. Each VM imitates the behavior of an entire system, with its own operating system, memory and I/O, in isolation. IT departments can thus react more effectively to changes in business requirements, as VMs can be cloned, copied, migrated and scaled up or down to cover the performance requirements of an application or business process, or to conserve hardware and software resources.

Another positive aspect: Compared to conventional IT infrastructure, virtual machines noticeably reduce IT costs, as more VMs can be consolidated on fewer physical machines. In addition, legacy systems on which older applications run can be converted into VMs and physically decommissioned in order to save the costs of maintaining this legacy infrastructure.

However, virtual machines also reach their limits. Each VM contains a full operating system and requires gigabytes of system memory, which caps how many virtualized applications can be consolidated on a single system. Provisioning a VM still takes a comparatively long time, although this has improved steadily over the past few years. In addition, the portability of VMs is limited. For these technological reasons, VMs can no longer deliver the speed, agility and savings that fast-moving companies demand.

This is how a Docker container works

Containers work similarly to VMs, but in a much more specific and granular way. A container isolates a single application and its dependencies - any external software libraries the application needs to run - from both the underlying operating system and other containers. All containerized applications share a single, common operating system kernel, either Linux or Windows, yet they are separate from each other and from the overall system.
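As a minimal sketch, the application and its dependencies are typically described in a Dockerfile; the file below is illustrative (the base image, file names and start command are assumptions, not from the article):

```dockerfile
# Illustrative Dockerfile: packages one application plus its dependencies
FROM python:3.12-slim            # shared base image; the host kernel is reused
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt   # the external libraries the app needs live inside the image
COPY . .
CMD ["python", "app.py"]         # the single application this container isolates
```

Building this file produces an image that bundles the app with its libraries, while the operating system kernel itself stays shared with the host.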


The main advantages of Docker and containers

Docker enables efficient use of system resources. Instances of containerized applications consume much less main memory than virtual machines, start and stop faster, and can be packed much more densely on host hardware. This means high, efficient resource utilization and thus lower IT costs.

Although the cost savings depend on the type of apps used and their resource requirements, containers work more efficiently than virtual machines overall. It is also possible to reduce software license costs, since containerized applications usually require far fewer operating system instances to cover the same workload.

Fast software delivery cycles

In order to withstand increasing competition, companies have to react quickly to changing business conditions. This also applies to their IT, and especially to their business software: applications should be scalable as required, and it should be possible to flexibly add new functions at any time.

Docker's container technology can help here. Containers make it easy to implement new software versions with new business functions quickly and bring them into production. They also allow a quick fallback to an earlier version, whether in the development phase or in critical runtime environments. The technology is therefore particularly suitable for strategies such as blue-green deployments.
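A blue-green switch with plain Docker might look roughly like the following sketch; the container names, image tags, ports and registry are hypothetical, and in practice a reverse proxy or orchestrator would handle the traffic switch:

```shell
# Start the new "green" version alongside the running "blue" one
docker run -d --name web-green -p 8081:8080 registry.example.com/shop:v2.0

# Smoke-test green on its own port before switching traffic to it
curl -fs http://localhost:8081/health

# Once traffic points at green, retire the old blue container
docker stop web-blue && docker rm web-blue

# Rollback is equally fast: the previous image tag is still in the registry
docker run -d --name web-blue -p 8080:8080 registry.example.com/shop:v1.9
```

Because images are immutable and tagged, "falling back to an earlier version" is just starting a container from the previous tag rather than rebuilding or reinstalling anything.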

Application portability

Depending on where it runs, two factors are essential for an enterprise application: if it is used on premises, it must be protected behind a firewall; if, on the other hand, it is to be available in a public cloud, it needs easy public access and a high degree of flexibility in terms of resources.

Since Docker containers encapsulate everything a program needs to run, applications can easily be moved between different environments. Any host with the Docker runtime installed, be it a developer laptop or a public cloud instance, can run a Docker container.
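In practice, this portability follows a build-once, run-anywhere pattern; the registry and image name below are hypothetical:

```shell
# Build the image once and publish it to a registry
docker build -t registry.example.com/team/app:1.0 .
docker push registry.example.com/team/app:1.0

# On any other host with a Docker runtime - laptop or cloud instance -
# the identical image is pulled and started unchanged
docker pull registry.example.com/team/app:1.0
docker run --rm registry.example.com/team/app:1.0
```

The image, not the host, carries the dependencies, which is why no per-environment installation step is needed.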

Suitable for microservices architectures

Lightweight, portable and self-contained Docker containers facilitate the development of any kind of software. A decisive advantage of the technology: since there are no legacy dependencies, you don't have to build applications that are supposed to solve tomorrow's problems with yesterday's development tools.

Microservices is a design pattern that suits container-based applications well: an application consists of many loosely coupled components. By breaking down traditional, "monolithic" applications into separate services, microservices make it possible to scale, modify and maintain the various parts of an application independently of one another. Who receives a software update, and when, then depends solely on the company's requirements.

Important: You don't need containers to implement microservices, but containers are perfectly tailored to the microservices approach and agile development processes.
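A common way to express such loosely coupled services with containers is a Compose file; the sketch below is illustrative, and all service and image names are hypothetical:

```yaml
# Illustrative docker-compose.yml: two independent services plus a datastore
services:
  orders:
    image: registry.example.com/shop/orders:1.4   # updated and deployed on its own
    ports: ["8080:8080"]
  billing:
    image: registry.example.com/shop/billing:2.1  # scaled separately from orders
    depends_on: [db]
  db:
    image: postgres:16
    environment:
      POSTGRES_PASSWORD: example                  # placeholder only; use a secret in practice
```

Each service here can be rebuilt, versioned and scaled without touching the others, which is exactly the independence the microservices pattern aims for.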

Docker doesn't solve any security problems

As with any new software technology, the same applies to Docker containers: they are not a silver bullet and cannot solve every problem on their own. By default, software in a container can be more secure than software running on a bare metal system, but this security is deceptive, because it says nothing about the security standards outside the container or its environment. Containers can add an additional layer of security to an application, but only as part of an overall concept for securing that application. Beyond security, container adoption brings operational challenges of its own:

  1. Integration into the data center
    In order to be able to develop their full effect, it must be possible to embed containers in the company's existing IT infrastructure with its services - be it security, authentication or network services, for example.
  2. VM management instead of chaos
    IT managers have to find a way to manage their virtual machines (VMs) in an orderly fashion while still providing customers with the required services in parallel.
  3. Scalability
    Today's highly dynamic corporate IT makes it necessary for companies to be able to programmatically scale their container technology and the capacity for making it available to users.
  4. Orchestration
    Companies need to combine multiple containers, combine containers with other applications, and enable communication between containers and other IT resources. To achieve all of this, the containers must also be developed in an environment that reflects this mix of technologies and computing capacities.
  5. Consider legacy systems
    Containers not only have to work in harmony with the latest applications and systems in the company, they also have to take legacy systems into account.

Docker doesn't turn applications into microservices

Packing an existing application in a container can reduce resource consumption and make deployment easier. But this technology does not change the structure of the application or how it interacts with other apps, as microservices do. A microservices architecture has to be designed into an application from the start; it cannot simply be retrofitted by container technology. If you move a monolithic or SOA-style application into a container, you get an ordinary app with an identical range of functions, just packed in a container.

No substitute for virtual machines

A persistent myth about containers is that they make virtual machines obsolete. Many applications that used to run in a VM can be moved to a container, but that does not mean the move always makes sense. If, for example, a company runs an application with high regulatory security requirements, it cannot simply migrate it from a VM to a container, since VMs offer stronger isolation than containers.

Use cases for Docker containers

Corporate IT is often seen as an obstacle to innovation, as it works in the background and frequently reacts slowly to change. Developers in the business units chafe at the restrictions IT imposes on them. Docker and container technology give developers more freedom, and they also make it possible to create business apps that react quickly to changing business conditions.

This article is based on a contribution from our sister publication InfoWorld.