Container Platforms: 6 Best Practices and 15 Top Solutions

Learn how to effectively use container platforms including container engines, orchestrators, and CaaS providers, and discover 15 popular platforms.

January 24, 2021

What are Container Platforms?

Container platforms, broadly defined, are software solutions that allow you to manage containerized applications. They provide capabilities like automation, orchestration, governance, security, customization, and enterprise support for container architectures.

We can identify several types of container platforms:

  • Container engines, like Docker and Docker Enterprise Edition, which provide a container runtime environment and allow you to create containers, manage container images, and perform basic operations.
  • Container orchestrators, like Kubernetes, which allow you to manage, govern, and automate containers at scale.
  • Managed container platforms, like Google Kubernetes Engine, which augment the container engine and orchestrator with additional services, such as management of the orchestrator and the underlying hardware resources.

How Container Platforms Work

To illustrate how different types of container platforms work, let’s look at a common example of each type of platform.

Container Engines: How Docker Works

Docker is a container engine that lets you create and manage containers. It is based on a client-server architecture, which includes the following core components: 

  • A docker client—triggers docker commands 
  • A docker host—runs docker daemons
  • A docker registry—stores docker images

The docker client uses a REST API to communicate with docker daemons. Docker clients and daemons can operate on the same system. Alternatively, you can connect clients and daemons remotely.

The client can issue a build command to the docker daemon. The daemon then builds a docker image and saves it in the docker registry. You can use either a local repository for the registry or pull public images from Docker Hub. To create a running instance, you issue a run command, which creates a docker container from the image.
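
As a concrete illustration, a minimal command-line workflow might look like the following sketch; the image name myapp and the Docker Hub account myuser are placeholders.

  # Build an image from the Dockerfile in the current directory
  docker build -t myapp:1.0 .

  # Tag and push the image to a registry (Docker Hub in this example)
  docker tag myapp:1.0 myuser/myapp:1.0
  docker push myuser/myapp:1.0

  # Create and start a container from the image
  docker run -d --name myapp-instance myapp:1.0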

Container Orchestrators: How Kubernetes Works

Kubernetes provides robust features you can use to deploy and manage containerized applications in production environments. Here are the core components of the Kubernetes architecture:

  • A Kubernetes deployment—is composed of one or more Kubernetes clusters.
  • A Kubernetes cluster—contains two main components: a control plane and compute machines (also called “nodes”).
  • A node—serves as a unique, Linux-based environment, deployed as a physical or virtual machine (VM). The node runs applications and workloads using pods.
  • A pod—is composed of one or more containers.
  • A container—application code packaged together with its dependencies.
  • A control plane—in charge of maintaining the desired state of the cluster, determining which applications should run and which container images they should use.
  • Compute machines—the nodes that actually run the applications and workloads, as scheduled by the control plane.

Kubernetes is deployed on top of the operating system (OS) and interacts with the pods running on nodes. Administrators and DevOps teams use the Kubernetes control plane to issue commands. The control plane relays these instructions to nodes.
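
For example, once the kubectl command-line client is configured for a cluster, administrators typically issue commands like the following against the control plane's API server (a sketch; the output depends on the cluster):

  # List the nodes (compute machines) registered with the control plane
  kubectl get nodes

  # List the pods running across all namespaces
  kubectl get pods --all-namespaces

  # Inspect the control plane components, which run in the kube-system namespace
  kubectl get pods -n kube-system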

Since a Kubernetes deployment in production can run thousands or even tens of thousands of containers simultaneously, Kubernetes provides capabilities that automate a wide range of processes. For example, it can automatically choose the best node for a specific task, allocate resources, and assign pods to complete that work.
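
As a simple sketch of this automation, the commands below declare a desired number of replicas and let the scheduler decide which nodes the resulting pods land on (the nginx image and the deployment name are illustrative):

  # Declare a deployment; the scheduler assigns its pods to suitable nodes
  kubectl create deployment web --image=nginx --replicas=3

  # Scale out; Kubernetes places the new pods automatically
  kubectl scale deployment web --replicas=10

  # See which nodes the pods were assigned to
  kubectl get pods -o wide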

Container as a Service (CaaS): How Google Kubernetes Engine Works

Google Kubernetes Engine is a CaaS offering that extends the capabilities of Kubernetes. Here are the core components of a GKE environment:

  • A group of Google Compute Engine instances—which run Kubernetes.
  • A master node—which manages a cluster of containers and runs the Kubernetes API.
  • Additional cluster nodes—each running a kubelet agent and a container runtime, both of which are essential for managing containers.

You can organize groups of pods into services. This enables non-container-aware applications to gain access to other containers without any additional code.
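
A minimal GKE workflow using the gcloud CLI might look like the sketch below; the cluster name, zone, and workload are placeholders, and an existing Google Cloud project with the GKE API enabled is assumed:

  # Create a GKE cluster (Google provisions and manages the control plane)
  gcloud container clusters create demo-cluster --zone us-central1-a --num-nodes 3

  # Fetch credentials so kubectl talks to the new cluster
  gcloud container clusters get-credentials demo-cluster --zone us-central1-a

  # Deploy a workload and expose its pods as a service
  kubectl create deployment hello --image=nginx
  kubectl expose deployment hello --port=80 --type=LoadBalancer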

6 Best Practices for Using a Container Platform

When using container platforms, especially in a large-scale deployment, the following best practices will help you operate container infrastructure effectively.

Container Security

Security must be built into the DevOps process and the containerized environment throughout its lifecycle. This includes the development and build process, testing, and deployment to production.

There are a variety of tools for managing container security. Ensure that the tools you use can assess the security posture of container clusters and identify vulnerabilities, control images so that you never deploy containers from an infected or compromised image, and monitor and protect container workloads in real time.
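
For instance, an open-source image scanner such as Trivy (one option among many, not specific to any particular platform) can be wired into the build pipeline to reject vulnerable images before they reach production:

  # Scan a built image and fail the pipeline on high or critical findings
  trivy image --severity HIGH,CRITICAL --exit-code 1 myapp:1.0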

Container Monitoring

Deploying cloud-native applications shifts the focus from host-based monitoring to container, cluster, and orchestrator-level monitoring. You must monitor container infrastructure at all levels to ensure availability, performance, and adherence to service-level agreements.
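
At the cluster level, for example, Kubernetes exposes basic resource metrics that you can query from the command line, assuming the metrics-server add-on is installed (a starting point, not a replacement for a full monitoring stack):

  # Show CPU and memory usage per node
  kubectl top nodes

  # Show CPU and memory usage per pod in a namespace (namespace is illustrative)
  kubectl top pods -n production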

Container Storage

To design true cloud native applications, you will need to adopt container-compatible storage platforms. Container orchestrators can connect to storage providers and dynamically provision storage volumes. Ensure your storage infrastructure integrates with the development life cycle, and can support the required performance and availability of containerized workloads.
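
With Kubernetes, for example, dynamic provisioning is typically requested through a PersistentVolumeClaim; the sketch below assumes the cluster has a default StorageClass configured:

  # Request a dynamically provisioned 1 GiB volume
  kubectl apply -f - <<EOF
  apiVersion: v1
  kind: PersistentVolumeClaim
  metadata:
    name: demo-claim
  spec:
    accessModes:
      - ReadWriteOnce
    resources:
      requests:
        storage: 1Gi
  EOF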

Container Networking

The portability and short life cycle of containers can overwhelm traditional network stacks, which do not provide adequate access and policy management capabilities to support containerized applications. When running containers at scale, eliminate manual network configuration and leverage the network automation capabilities of your container orchestrator.
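
In Kubernetes, for example, access rules can be declared as NetworkPolicy objects rather than hand-configured firewall rules; the sketch below denies all ingress traffic to pods in the current namespace and assumes the cluster's network plugin enforces NetworkPolicy (for example, Calico or Cilium):

  # Deny all ingress traffic to pods in the current namespace by default
  kubectl apply -f - <<EOF
  apiVersion: networking.k8s.io/v1
  kind: NetworkPolicy
  metadata:
    name: default-deny-ingress
  spec:
    podSelector: {}
    policyTypes:
      - Ingress
  EOF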

Container Lifecycle Management

Containers are short-lived, and their lifecycle must be managed carefully to prevent sprawl and waste of resources. Container lifecycle management should be closely integrated with the continuous integration/continuous deployment process. Automate infrastructure deployment and operation tasks using infrastructure as code (IaC) tools, and native capabilities of the container orchestrator or CaaS provider.
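
For example, a CI/CD pipeline might roll out a new image version and let the orchestrator manage the lifecycle of the old containers; the deployment, container, and image names below are placeholders:

  # Roll out a new image version; Kubernetes replaces old pods gradually
  kubectl set image deployment/myapp myapp=myregistry/myapp:1.1

  # Watch the rollout, and roll back if it fails
  kubectl rollout status deployment/myapp
  kubectl rollout undo deployment/myapp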

Container Orchestration

The orchestration and scheduling layer provides the core functionality of container deployments. The orchestration layer interacts with the application to make sure containers run as expected and service-level agreements are maintained. Orchestrators are responsible for placing containers on as many hosts as needed, and for provisioning other necessary resources, to keep applications running.
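
Resource requests are one of the main inputs an orchestrator uses when placing containers on hosts; here is a hedged sketch with kubectl (the deployment name and values are illustrative):

  # Declare how much CPU and memory each container needs;
  # the scheduler uses these requests to choose a suitable node
  kubectl set resources deployment myapp --requests=cpu=250m,memory=256Mi --limits=cpu=500m,memory=512Mi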

Kubernetes has become the de facto standard for container scheduling and orchestration. Organizations should carefully evaluate the pros and cons of running Kubernetes independently versus relying on a container as a service (CaaS) model, which provides a smoother learning curve but sacrifices flexibility and may result in vendor lock-in.

Top Container Platforms

The following list summarizes popular software platforms that can help you deploy and manage containers.

Container Engines and IaaS Solutions

  • Docker Community Edition (CE)—Open source. A free, open-source version of Docker, available on the Docker Store. Docker CE can run on Mac, Windows 10, cloud platforms like AWS and Azure, CentOS, Debian, Fedora, and Ubuntu. Docker CE includes the complete Docker platform and is ideal for those just starting to build container applications.
  • Docker Enterprise Edition (EE)—Commercial license. This Docker version is built for business-critical deployments. It is available in three tiers: the basic tier provides the Docker platform along with support and certification; the standard tier adds advanced features for image and container management, Docker Datacenter for role-based access control, and more; the advanced tier provides all of the above plus continuous vulnerability monitoring and Docker security scanning.

Container Orchestrators and Container-Based Platform as a Service

  • Kubernetes—Open source. Kubernetes, also known as K8s, is a free, open-source platform that provides features for deploying and managing containerized applications.
  • Red Hat OpenShift—Support billed by Red Hat, infrastructure by partners. OpenShift provides a variety of containerization software products based on Red Hat’s open-source software. OpenShift offers built-in monitoring, consistent security, and centralized policy management, and it is compatible with Kubernetes.
  • VMware Tanzu Application Service—Commercial license. VMware Tanzu offers a variety of solutions designed to help you build, run, and manage containerized applications using Kubernetes. Notable Tanzu capabilities include turnkey microservices operations and security, native support for Windows and .NET, and integration with CI/CD tools.
  • VMware Tanzu Kubernetes Grid—Commercial license. VMware Tanzu Kubernetes Grid Integrated Edition is designed specifically to support Kubernetes deployment in multi-cloud environments. The solution leverages Kubernetes without adding abstraction layers or proprietary extensions, ensuring you can use the native Kubernetes CLI.
  • SUSE CaaS Platform—Commercial license. SUSE CaaS Platform is an enterprise-grade container management solution that simplifies the entire pipeline. It comes with a wide range of features, including automated lifecycle management.

Container as a Service

  • Azure Container Instances (ACI)—Pay per use. Azure Container Instances provides a fast and simple option for running containers in Azure, with no need to manage VMs or adopt higher-level services.
  • Amazon Elastic Container Service (ECS)—Pay per use. Amazon ECS is a cloud-based container orchestration service. ECS manages containers and lets you run applications in the AWS cloud without configuring an environment for the deployed code.
  • Amazon Elastic Kubernetes Service (EKS)—Pay per use. Amazon EKS provides cloud-based container management that natively integrates with Kubernetes.
  • Amazon Fargate—Pay per use. Amazon Fargate lets you run containers on Amazon Web Services (AWS) without managing the underlying infrastructure.
  • Azure Kubernetes Service (AKS)—Pay per use. Azure Kubernetes Service (AKS) is a managed container orchestration service based on Kubernetes and deployed in the Azure cloud.
  • Google Kubernetes Engine (GKE)—Pay per use. Google Kubernetes Engine (GKE) is an orchestration and management solution based on the open-source version of Kubernetes.
  • Rancher—Open source. Rancher is a software stack for developing containerized applications, providing tools that help address various Kubernetes challenges.