Modern Data Centre: Phase 4
Organizations are shifting workloads to containers to package and deploy applications, the next step on the journey from physical to virtual to containers and microservices. Supporting these modern workloads requires new tools to manage, orchestrate and support the lifecycle of containers and microservices, delivering portability and scalability on a foundation of automation and orchestration at cloud scale. These tools must deploy, manage and support containers consistently across on-prem, cloud, hybrid and managed platforms.
Platforms for Modern Workloads
Packaging applications as containers allows for portability and consistency of deployment across multiple container platforms while consuming fewer resources than traditional virtual machine hypervisors. This portability and the reduced resource footprint give clients the flexibility to deploy anywhere, reduce costs and deliver a consistent experience for developers and consumers of applications alike. A container platform provides the target infrastructure to run containers and manage their lifecycle through automation, typically using Kubernetes as the orchestration technology. Clients may deploy and manage a container platform on-prem or in the cloud, or leverage a cloud provider's managed Kubernetes offering to further reduce ongoing support costs.
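As a concrete illustration, the minimal sketch below (assuming the official Kubernetes Python client and an existing kubeconfig) queries a cluster's deployments. Because the Kubernetes API is consistent across conformant clusters, the same calls work whether the platform runs on-prem, is self-managed in the cloud, or is a provider's managed offering.

```python
# Minimal sketch: the same client code works against any conformant
# Kubernetes cluster, wherever it is hosted. Assumes the 'kubernetes'
# Python client is installed and a valid kubeconfig is present.
from kubernetes import client, config

def list_workloads(namespace: str = "default") -> None:
    # Load credentials from the local kubeconfig; the current context could
    # point at an on-prem, cloud or managed cluster.
    config.load_kube_config()
    apps = client.AppsV1Api()
    for deployment in apps.list_namespaced_deployment(namespace).items:
        ready = deployment.status.ready_replicas or 0
        print(f"{deployment.metadata.name}: {ready}/{deployment.spec.replicas} replicas ready")

if __name__ == "__main__":
    list_workloads()
```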
Realizing the Software Defined Data Centre
With Kubernetes, a container platform can be deployed in software on your existing physical, virtual or cloud infrastructure, allowing hybrid deployments across multiple environments. You may choose to manage your own control plane or let your hyperscale cloud provider operate it on your behalf as a service. By defining your workload's requirements in software, you tell the platform what services, availability, scalability and protection to provide, while ensuring security standards are applied throughout. These definitions drive automated configuration and orchestration of your workloads on whichever container platform(s) you choose to implement, build or subscribe to.
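A minimal sketch of such a definition follows, again assuming the Kubernetes Python client. The image name, replica count, resource figures and security settings are illustrative values only, showing how availability, scalability and security requirements can be expressed in software rather than configured by hand.

```python
# Minimal sketch of a workload definition expressed in software:
# replicas for availability, resource requests for scheduling, and a
# security context applied to every pod. Values are illustrative only.
from kubernetes import client, config

def build_deployment() -> client.V1Deployment:
    container = client.V1Container(
        name="web",
        image="registry.example.com/web:1.0.0",  # hypothetical image
        resources=client.V1ResourceRequirements(
            requests={"cpu": "250m", "memory": "256Mi"},
            limits={"cpu": "500m", "memory": "512Mi"},
        ),
        security_context=client.V1SecurityContext(
            run_as_non_root=True,
            allow_privilege_escalation=False,
        ),
    )
    template = client.V1PodTemplateSpec(
        metadata=client.V1ObjectMeta(labels={"app": "web"}),
        spec=client.V1PodSpec(containers=[container]),
    )
    spec = client.V1DeploymentSpec(
        replicas=3,  # desired availability expressed declaratively
        selector=client.V1LabelSelector(match_labels={"app": "web"}),
        template=template,
    )
    return client.V1Deployment(
        api_version="apps/v1",
        kind="Deployment",
        metadata=client.V1ObjectMeta(name="web"),
        spec=spec,
    )

if __name__ == "__main__":
    config.load_kube_config()
    client.AppsV1Api().create_namespaced_deployment(
        namespace="default", body=build_deployment()
    )
```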
Automation and CI/CD
Both in-house developed and commercial off-the-shelf applications can follow continuous integration and continuous deployment/delivery (CI/CD) principles throughout their lifecycle. Packaging your application into a container built in layers lets you create immutable images, with custom layers produced by configuration management tools and stored in an image repository. These binary images are then deployed to your container platform using a declarative style of infrastructure management that monitors the deployed state against your target and corrects any drift. Automating your software deployment pipeline ensures consistency across production, quality assurance, staging and development environments, with automated testing and code promotion controlled via feature flags.
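The drift-correction behaviour described above can be sketched as a simple reconcile step: desired state comes from a version-controlled definition, live state is read from the platform, and any difference triggers a corrective apply. The load/read/apply helpers below are hypothetical placeholders for whatever GitOps or pipeline tooling is actually in use.

```python
# Minimal sketch of declarative, drift-correcting deployment.
# The helper functions are hypothetical stand-ins for real tooling.
from dataclasses import dataclass

@dataclass(frozen=True)
class DesiredState:
    image: str       # immutable image reference, e.g. a digest
    replicas: int

def load_desired_state() -> DesiredState:
    # In practice, read from a Git repository of declarative manifests.
    return DesiredState(image="registry.example.com/web@sha256:abc123", replicas=3)

def read_live_state() -> DesiredState:
    # In practice, query the container platform's API.
    return DesiredState(image="registry.example.com/web@sha256:abc123", replicas=2)

def apply(state: DesiredState) -> None:
    # In practice, call the platform API or a CLI such as kubectl.
    print(f"applying {state.replicas} replicas of {state.image}")

def reconcile() -> None:
    desired, live = load_desired_state(), read_live_state()
    if desired != live:
        # Drift detected: converge the platform back to the declared state.
        apply(desired)

if __name__ == "__main__":
    reconcile()
```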
Data for Modern Workloads
Application workloads may be stateful or stateless, and managing data that must persist across container deployments and restarts adds complexity. While file storage may be mapped into a container for persistent data, you can also use object storage or container-native storage solutions designed to deliver the performance, scalability and persistence that high-availability requirements demand. Persisting data in message queues, in-memory caches and databases is another common design pattern, and many of these services can themselves be deployed as containers. When evaluating data persistence solutions, protecting both data and applications is critical to ensuring recovery from failure and disaster.
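For a stateful container on Kubernetes, persistent storage is typically requested through a claim that the platform binds to whatever file, block or container-native storage backs the chosen storage class. The minimal sketch below assumes the Kubernetes Python client; the claim name, storage class and capacity are illustrative only.

```python
# Minimal sketch: request persistent storage for a stateful workload.
# Assumes the 'kubernetes' Python client and a valid kubeconfig.
from kubernetes import client, config

def request_persistent_volume(name: str = "app-data") -> None:
    config.load_kube_config()
    claim = {
        "apiVersion": "v1",
        "kind": "PersistentVolumeClaim",
        "metadata": {"name": name},
        "spec": {
            "accessModes": ["ReadWriteOnce"],
            "storageClassName": "fast-ssd",  # hypothetical storage class
            "resources": {"requests": {"storage": "10Gi"}},
        },
    }
    # The platform binds the claim to a volume provided by the storage class,
    # giving the container data that survives restarts and redeployments.
    client.CoreV1Api().create_namespaced_persistent_volume_claim(
        namespace="default", body=claim
    )

if __name__ == "__main__":
    request_persistent_volume()
```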