Create Kubernetes Clusters and Deploy Containers to Oracle Cloud from VS Code

November 29, 2024 | 8 minute read
Olga Gupalo
Member of Technical Staff

The GraalVM Tools for Micronaut extension allows you to deploy Docker images and debug running containers within Kubernetes clusters, making it easy to deploy apps to Oracle Cloud directly from within Visual Studio Code!

With the recent release of the GraalVM Tools for Micronaut extension, it has become more convenient for Micronaut and GraalVM developers to work with Kubernetes clusters directly from within VS Code. The extension, which builds on the Kubernetes Tools extension from Microsoft, lets you work with any Kubernetes provider and any container registry (Minikube, Azure Container Registry, and Oracle Container Registry, among others).

Follow along as I deploy a Micronaut application into a Kubernetes cluster in Oracle Cloud, then interactively run and debug containers directly from VS Code! The Kubernetes support in this extension should work equally well with any other container registry, whether open source or commercial.

Get Everything Ready

You can create, deploy, and manage Kubernetes clusters in Oracle Cloud with Oracle Container Engine for Kubernetes (OKE). OKE is a fully-managed and scalable Kubernetes provider service that you can use to deploy your containerized applications (such as Docker images) into Oracle Cloud. Deploying to Oracle Cloud means storing and sharing these images in Oracle Container Registry (OCIR) — a managed Docker container registry for Kubernetes deployments.

There are a few preparatory steps before you can deploy an application. To get fully equipped, I needed to do the following:

Step 1: Install the necessary VS Code extensions. These are GraalVM Tools for Micronaut and Kubernetes Tools. You might be wondering why the GraalVM Tools for Micronaut extension is required to deploy to Oracle Cloud. Extensions in VS Code can serve different purposes. Firstly, this extension provides support for developing applications with the Micronaut framework (I will be deploying a Micronaut microservice later). Secondly, in combination with GraalVM, it allows you to debug your apps directly from VS Code using different debugging protocols, or to build native executables. Thirdly, this extension provides the integration between Micronaut and Oracle Cloud Infrastructure (OCI).

Open the Extensions view, search for “micronaut” and install GraalVM Tools for Micronaut. Then search for “kubernetes” and install the Kubernetes Tools extension. The latter is necessary to achieve a fully integrated Kubernetes experience (that is, to be able to use kubectl).

GraalVM Tools for Micronaut and Kubernetes extensions
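Once both extensions are installed, a quick sanity check from the VS Code integrated terminal confirms that kubectl is available:

$ kubectl version --client   # prints the client version if kubectl is on the PATH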

Step 2: Get an Oracle Cloud account. This goes without saying. If you do not have an active account, create one now.

Step 3: Create a Kubernetes cluster in OCI. Sign in to your Oracle Cloud account. Using the Oracle Cloud Console, I can create a Kubernetes cluster with default settings using the Quick Create workflow (this guide was very helpful):

  1. In the Oracle Cloud Console, open the navigation menu and click Developer Services.
  2. Under Containers, click Kubernetes Clusters (OKE).
  3. Then click Create Cluster.

A Kubernetes cluster is a group of nodes. The nodes are the machines, physical or virtual, that run your applications. In my case, they are virtual machines.

Step 4: Set up access to Oracle Container Engine for Kubernetes (OKE). I’ve created a Kubernetes cluster and can access it using kubectl from the VS Code terminal. Remember, the kubectl command-line tool became available after I installed the Kubernetes Tools extension, but kubectl must still be configured to communicate with Oracle’s Kubernetes service, which Oracle calls Container Engine for Kubernetes, or OKE for short.

Setting up local access to OKE means uploading my API signing key (this provides an extra level of security), installing and configuring the OCI CLI, and creating a Kubernetes configuration file, kubeconfig. Installing and configuring the OCI CLI is a one-time action that enables you to interact with different cloud services, such as Oracle Functions. For me, as a macOS user, it required running two commands and following the on-screen suggestions:

$ brew update && brew install oci-cli
$ oci setup config

Follow the Setting Up Local Access to Clusters reference guide, which provides clear, step-by-step instructions for this process.

Another cool thing is that the wizard in the OCI Console provides the commands for setting up local access to my cluster from my terminal (make a note of these):
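For reference, those generated commands look roughly like the following (the cluster OCID is elided here, and the region matches my Ashburn example; your values will differ):

$ oci ce cluster create-kubeconfig \
    --cluster-id <cluster-ocid> \
    --file $HOME/.kube/config \
    --region us-ashburn-1 \
    --token-version 2.0.0 \
    --kube-endpoint PUBLIC_ENDPOINT
$ kubectl get nodes   # verify that kubectl can now reach the cluster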

Step 5: Log in with Docker to Oracle Container Registry. Installing Docker is essential if you want to “dockerize” Java apps and push and pull images on your local machine. I need to make sure that Docker is up and running, and then I log in with Docker to the Oracle Container Registry from the VS Code terminal. The login command is the following:

$ docker login -u <tenancy-name>/<oci-user-email> <region-key>.ocir.io

The credentials come from my Oracle Cloud account: <tenancy-name> is my OCI tenancy name, <oci-user-email> is the email address used to create the account, and <region-key> is the cloud region key. For example, my OCI region is US East (Ashburn), so the key is iad. The list of available regions and their keys is in the cloud documentation. I’m then asked to provide a password, which is my OCI user authentication token.
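With the placeholders filled in, the command might look like this (the tenancy name and region key reuse the values from the image name later in this post; the email address is hypothetical):

$ docker login -u cloudnative-devrel/jdoe@example.com iad.ocir.io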

NOTE: If the Docker registry is private, you will need a Docker registry secret (this guide describes how to create a secret and how to specify the image to pull from Oracle Cloud Infrastructure Registry).

Now I’m fully equipped to deploy my container to Oracle Cloud!

Deploy Containers to Oracle Cloud

A Java project has to be “containerized”, in other words, packaged into a Docker image, so that you can push it to a container registry and then run it as a container. There are different ways to dockerize a Java application; the simplest is to write a Dockerfile. With the GraalVM Tools for Micronaut extension, containerizing a Java project happens automatically when you invoke the deployment action in VS Code.
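For context, the manual approach looks roughly like the sketch below (you do not need this when using the extension). It assumes the Gradle build produces a runnable “-all” JAR under build/libs/ and that a Java 17 base image suits the project; both are assumptions made for illustration:

$ cat > Dockerfile <<'EOF'
# Minimal Dockerfile for a runnable Micronaut JAR (illustrative only)
FROM eclipse-temurin:17-jre
WORKDIR /app
COPY build/libs/*-all.jar app.jar
EXPOSE 8080
ENTRYPOINT ["java", "-jar", "app.jar"]
EOF
$ ./gradlew assemble
$ docker build -t <region-key>.ocir.io/<tenancy-name>/<repo-name>/<image-name>:<tag> .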

I have an existing Micronaut Java application I am going to deploy. You can follow along to containerize and deploy your own application to the cloud. Before I can deploy, I want to specify where in Oracle Container Registry my application’s Docker image should be pushed. To do that, I open the project configuration file, build.gradle, and find and update the following lines:

dockerBuild {
    images = ["<region-key>.ocir.io/<tenancy-name>/<repo-name>/<image-name>:<tag>"]
}

In my case, the image coordinate now looks like this: “iad.ocir.io/cloudnative-devrel/vscode-k8s/rest-demo:0.1”.
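As an optional check that these coordinates are correct before involving the IDE, the Micronaut Gradle plugin also exposes tasks to build and push the image straight from the terminal (this is not required for the VS Code deployment flow):

$ ./gradlew dockerBuild   # builds the Docker image using the name configured above
$ ./gradlew dockerPush    # pushes it to the registry (requires the docker login from Step 5)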

Now it’s time to deploy! Go to View > Command Palette, search for “micronaut”. The following quick actions for Micronaut are available:

Quick actions for Kubernetes support

When I invoke the Deploy to Kubernetes action, VS Code opens the Output window and suggests creating a Kubernetes deployment file (which is what the Micronaut: Create Kubernetes Deployment Resource action does). The wizard prompts me to fill in the necessary data:

  1. Pick the Docker repository. Since I’m using Oracle Container Registry and Ashburn is my deployment region, this will be OCIR Ashburn.
  2. Provide my Docker image name and version. I can grab the image location from the build.gradle config file.
Creation of a Kubernetes deployment file

  3. Select the namespace. I choose default.

  4. Lastly, the wizard prompts me to select a secret for my container registry in OCI. This is needed only if the Docker registry is private, which is not the case here, so I just press Enter to continue. A manifest file, Deploy.yaml, is then autogenerated for me. Below is an example of the manifest file:

Kubernetes manifest file
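For readers following along in text only, a manifest of this kind typically looks roughly like the following, written here as a shell heredoc so you could reproduce it by hand; the names, labels, and ports are illustrative rather than the exact output of the wizard:

$ cat > Deploy.yaml <<'EOF'
apiVersion: apps/v1
kind: Deployment
metadata:
  name: rest-demo
spec:
  replicas: 1
  selector:
    matchLabels:
      app: rest-demo
  template:
    metadata:
      labels:
        app: rest-demo
    spec:
      containers:
        - name: rest-demo
          image: iad.ocir.io/cloudnative-devrel/vscode-k8s/rest-demo:0.1
          ports:
            - containerPort: 8080
---
apiVersion: v1
kind: Service
metadata:
  name: rest-demo
spec:
  selector:
    app: rest-demo
  ports:
    - port: 8080
      targetPort: 8080
EOF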

Now that the deployment file exists, the deployment starts, and VS Code opens the Output window, where I can track its progress.

What is happening in the background? The project is packaged with Gradle into a runnable JAR file and then built into a Docker image. Docker pushes this image to the registry (OCIR), and the Deploy.yaml manifest is applied to create the deployment in the OKE cluster. The Kubernetes extension then starts port forwarding to connect to the server running in the Kubernetes cluster (kubectl port-forward forwards a local port to a port on the pod). Finally, a URL for accessing my application from a browser is printed out.
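To make those background steps concrete, here is a rough manual equivalent of what the action automates; the service name and port are taken from the illustrative manifest above, and the Gradle task comes from the Micronaut Gradle plugin:

$ ./gradlew dockerPush                               # package the app and push the image to OCIR
$ kubectl apply -f Deploy.yaml                       # apply the manifest to the OKE cluster
$ kubectl get pods                                   # wait until the pod reports Running
$ kubectl port-forward service/rest-demo 8080:8080   # forward a local port to the pod
$ curl http://localhost:8080/                        # reach the application through the forwarded port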

Debug a Running Container from VS Code

In addition to being able to deploy and run Micronaut applications within a Kubernetes cluster, I can also debug running containers directly from within VS Code, thanks to this extension!

Kubernetes doesn’t run containers directly; it wraps them into pods, the smallest deployable units. With a connection made to a local port (thanks to the Kubernetes port forwarding feature), I can use VS Code on my workstation to debug the application running in the pod. I set some breakpoints in the app and then do the following to start debugging:

  1. Go to View > Command Palette, search for the Micronaut: Debug in Kubernetes action, and invoke it.
  2. The wizard prompts me to confirm port forwarding (only the first time you start debugging the app within Kubernetes).
  3. I have to choose a port forwarding session:
Kubernetes port forwarding session

  4. Lastly, I open the Kubernetes window (click the Kubernetes icon in the left sidebar), select the running node, and right-click it to invoke the Debug (Attach using Java 8+) action. The debugger automatically finds the entry point of the app and starts debugging! Now I can inspect the call stack, evaluate variables and selected expressions in tooltips, inspect object properties, and more!
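Under the hood this relies on the same kubectl port forwarding. If you ever need to set it up by hand, the equivalent looks roughly like the following, assuming the JVM in the container listens for JDWP debugger connections on port 5005 (an assumption made for illustration; the extension handles this for you):

$ kubectl get pods                                # find the name of the running pod
$ kubectl port-forward pod/<pod-name> 5005:5005   # forward the JDWP debug port to localhost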

Summary

With the current trend of breaking up monolithic applications into microservices, Java developers want to build applications that are small, run fast, and are easily deployable to the cloud. Deployment should be smooth and easy. Using GraalVM can help lower resource usage, which is key in any cloud environment.

GraalVM Tools for Micronaut Extension provides all of the necessary tooling for a complete developer experience in VS Code, including Kubernetes support for automated deployment, scaling and management of containers.
