Container Based Application Deployments
Container-based application deployments make efficient use of resources, increase consistency within and between environments, and are highly portable between different infrastructure and cloud platforms. In this article, we explore how Oracle technology can be deployed to Docker containers and orchestrated by Kubernetes with a specific focus on containerizing banking workloads.
A Brief Introduction to Containers
Containers provide a way to decouple applications from the environment that runs them in such a way that the application package (or container) is highly portable. The container can be moved between different environments, from the cloud to on-premises, and behave consistently. Although this approach shares some similarities with virtual machines (VMs), unlike VMs, containers virtualize the OS rather than the hardware. They also run directly on the host OS and are typically much more lightweight: they take up less space on disk, they start faster, and they consume less RAM.
Figure 1 – Virtual Machines vs. Containers
A Container Engine is the name given to the application that runs and manages containers. A container engine typically works with images (a binary file much like a VM image) and runs these images to create containers. Docker is by far the most popular container engine; it is widely supported on many different hardware and software platforms. For this article, we will restrict ourselves to Docker, although other container engines have similar features and capabilities.
Beyond running and managing containers, Docker brings the notion of Layers to its container engine. Instead of a container being one big binary image, it is made up of different layers, with each layer adding some new functionality or application. For example, you could start with an Oracle Enterprise Linux layer and add a layer on top of this for WebLogic. To this you might add a third layer containing the WebLogic Domain, then a fourth layer that installs a specific JEE application into the domain, and you might finish with customizations specific to a particular use case.
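The layering described above can be sketched as a Dockerfile. Note that the base image tag, file paths, installer, and script names below are illustrative assumptions, not official Oracle artifacts:

```dockerfile
# Illustrative sketch of the layers described above.
# Paths, installer, and script names are hypothetical.

# Layer 1: the base operating system
FROM oraclelinux:7-slim

# Layer 2: install WebLogic (installer file name is an assumption)
COPY fmw_wls_generic.jar /u01/
RUN java -jar /u01/fmw_wls_generic.jar -silent

# Layer 3: create a WebLogic domain from a hypothetical WLST script
COPY create_domain.py /u01/
RUN /u01/oracle/wlserver/common/bin/wlst.sh /u01/create_domain.py

# Layer 4: add a specific JEE application to the domain
COPY myapp.ear /u01/domains/base_domain/apps/
```

Each Dockerfile instruction produces a new image layer, which is why the order of instructions matters: layers that change rarely (the OS, the middleware install) belong near the top so they can be cached and reused across builds.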
Figure 2 – Docker Layers for a WebLogic Container Image
This layering has operational benefits, as it reduces both the amount of storage required and the amount of data moved around the network: imagine you have five 1GB JEE application images; without layers, these would require 5GB of storage. But if each of the five JEE applications runs on the same underlying OS and middleware, then those layers can be reused and only need to be stored once, and the required storage could be well under 2GB. It is easy to see the benefits when extrapolating this to hundreds or even thousands of container images.
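The arithmetic behind this saving can be sketched with some hypothetical layer sizes (the numbers below are assumptions for illustration, not measurements of any real image):

```python
# Illustrative sketch: storage for five ~1.35 GB application images
# with and without layer sharing. All sizes in MB and hypothetical.

base_os_mb = 400      # shared OS layer
middleware_mb = 800   # shared middleware layer
app_layer_mb = 150    # unique per-application layer

num_apps = 5

# Without sharing, every image carries its own copy of every layer.
without_sharing = num_apps * (base_os_mb + middleware_mb + app_layer_mb)

# With sharing, the OS and middleware layers are stored only once.
with_sharing = base_os_mb + middleware_mb + num_apps * app_layer_mb

print(without_sharing)  # 6750 MB
print(with_sharing)     # 1950 MB
```

Under these assumptions, adding a sixth application costs only another 150 MB of storage rather than another 1350 MB, because the shared layers are already present.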
The layering also has a benefit at design and development time. When people build a new application, they do not need to start from scratch but can leverage the work of those who went before. For example, a secured standard operating system image could form the basis of every image within your organization.
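For example, every application team might build on one hardened, organization-standard base image; the image and application names below are hypothetical:

```dockerfile
# Hypothetical: each application image starts from a secured,
# organization-standard base image maintained by a central team.
FROM myorg/secure-oraclelinux:1.0

# Only the application itself is added on top of the shared base.
COPY app-a.jar /opt/app/
CMD ["java", "-jar", "/opt/app/app-a.jar"]
```

When the central team patches the base image, every application can pick up the fix simply by rebuilding against the new tag.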
A container engine will handle the essential management of containers: processes like creating, starting, and stopping containers. Docker will run a container on the server hosting Docker. However, if you want to create clusters of containers that run multiple applications, provide resilience and high availability, and optimize your infrastructure resources, you need a Container Orchestrator. For this article we will use the popular Kubernetes as our container orchestrator, although, as with Docker and container engines, other options are available.
The primary purpose of Kubernetes is to schedule Pods to do work. A Pod is a collection of one or more containers that work together to provide a service. These services handle the workload of your organization, everything from processing customer transactions to ensuring traffic gets to the correct service in the cluster. In Kubernetes, a Workload is effectively a set of scheduling rules for Pods; these rules may define different scenarios like:

- Deployments and ReplicaSets, which keep a defined number of identical, stateless Pods running
- StatefulSets, for Pods that require stable identities and persistent storage
- DaemonSets, which run a copy of a Pod on every node in the cluster
- Jobs and CronJobs, for tasks that run to completion, either once or on a schedule
By combining these different types of schedules, you can containerize your entire workload.
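As a sketch of one such scheduling rule, a Deployment asks Kubernetes to keep a fixed number of replicas of a Pod running at all times; the image name and registry below are placeholders:

```yaml
# Illustrative Deployment: "keep three replicas of this Pod running".
# The container image reference is a placeholder, not an Oracle image.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: banking-service
spec:
  replicas: 3
  selector:
    matchLabels:
      app: banking-service
  template:
    metadata:
      labels:
        app: banking-service
    spec:
      containers:
      - name: banking-service
        image: registry.example.com/banking-service:1.0
        ports:
        - containerPort: 8080
```

If a node fails, Kubernetes reschedules the lost Pods onto healthy nodes to restore the requested replica count, which is what gives the cluster its resilience.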
You can read more about containers, Docker, and Kubernetes at:

- https://docs.docker.com/
- https://kubernetes.io/docs/
Containerizing Oracle Banking Workloads
Many of Oracle's banking products are available on Docker and Kubernetes, including Oracle Banking Platform (OBP) and Oracle Banking Digital Experience (OBDX). If you are using or considering these products, Oracle can provide Dockerfiles and images as well as advice on best practices for build and deployment.
Figure 3 – Conceptual Docker layers for a banking product
This approach provides a straightforward path to container adoption and can address many issues often associated with complex enterprise platforms. For example, a container-based solution offers:

- consistent behavior within and between environments
- efficient use of infrastructure resources
- portability between on-premises infrastructure and cloud platforms
- a repeatable, layered build process for complex product stacks
Get started with Oracle banking products at https://www.oracle.com/industries/financial-services/banking/products.html
Containerizing other Application Workloads
For workloads not specifically associated with one of Oracle's banking products, there are still a number of options for bringing your workloads to containers. Oracle's docker-images GitHub repository is the best place to start: https://github.com/oracle/docker-images/
This repository includes ready-to-use Dockerfiles and build scripts to enable containerization of a wide array of workloads, including:

- Oracle Database
- Oracle WebLogic Server
- Oracle Java
- Oracle Coherence
- Oracle HTTP Server
The Oracle docker-images repository on GitHub is free to use for evaluation purposes; please refer to the licensing information in the repository for full terms and conditions of use.
Containerizing workloads with Oracle Cloud
Oracle Cloud Infrastructure Container Engine for Kubernetes (OKE) is a fully-managed, scalable, and highly available service that you can use to deploy your containerized applications to the cloud. You specify the compute resources that your applications require, and OKE provisions them on Oracle Cloud Infrastructure in an existing OCI tenancy. OKE is certified as conformant by the Cloud Native Computing Foundation (CNCF), so you can use OKE to run any workload that supports Docker and Kubernetes.
As an alternative to OKE, you can also use other Oracle Cloud services to run your containers:

- Oracle Cloud Infrastructure Compute instances, on which you can install Docker and run containers directly
- Oracle Functions, a serverless service in which each function is packaged as a Docker container
From running your core banking platforms on Kubernetes through to shifting existing workloads to Oracle Cloud, Oracle Financial Services is committed to supporting you in your container journey. To discover more, message me.
Subscribe to our Blogs:
Oracle Financial Services Blogs: Sign up