Exposing services through API Gateway

This post walks through the process of creating a simple Python web service and deploying it to an Oracle Cloud Infrastructure (OCI) compute instance. Then, you deploy an API gateway and expose the instance through the gateway while applying a rate limiting rule. The API gateway lets you configure path prefixes and routes, and enable features such as authentication, CORS policies, and rate limiting.

Create a compute instance

First, create the compute instance where the service will run. To create an instance, sign in to the OCI Console, and follow these steps:

  1. Open the navigation menu, select Compute, and then click Instances.
  2. Click Create instance.
  3. For Name, enter my-instance.
  4. Use the default availability domain and image selection.
  5. Click Change Shape.
  6. Click Virtual Machine and then click the AMD Rome option.
  7. Select 1 OCPU and 1 GB of memory.

You could select a VM with more OCPUs and memory. However, because you're deploying a basic service to the instance, 1 OCPU and 1 GB of memory is sufficient.

  8. Click Select Shape.

For networking, create a virtual cloud network (VCN) and a new public subnet. You'll use the same VCN later when you create the API gateway.

  9. Select Create new virtual cloud network.
  10. Select Create new public subnet.

In the Add SSH keys section, you can choose to generate a new SSH key pair, use an existing one, or create the instance without SSH keys. SSH keys are used to authenticate with the compute instance.

  11. If you already have an existing SSH key, select the Choose public key files or Paste public keys option. Otherwise, select the Generate SSH key pair option, and then save the private and public keys to your computer by clicking the Save Private Key button and the Save Public Key link.

  12. Click Create to create the compute instance.

Connect to the instance

When the compute instance is created, the Instance Details page opens. From that page, copy the public IP address. To connect to the virtual machine, open a terminal window on your computer and use SSH to connect to the instance. In the following commands, replace [INSTANCE_IP] with the IP address that you just copied and [PATH_TO_PRIVATE_SSH_KEY] with the path where you stored the private SSH key.

If you chose to generate an SSH key pair and downloaded the keys, be sure to update the permissions on the private key before trying to connect to the instance:

$ chmod 400 [PATH_TO_PRIVATE_SSH_KEY]

You can connect to the instance by using this command:

$ ssh -i [PATH_TO_PRIVATE_SSH_KEY] opc@[INSTANCE_IP]

After you are connected, you should see a prompt that looks like this:

[opc@my-instance ~]$

Next, let's create a simple Python web service.

Create a Python web service

You create a Python web service that runs on the compute instance. Because Python is already installed on the Oracle Linux image, you can get started by writing some code.
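Because the exact Python 3 version depends on the Oracle Linux image that you selected, you can quickly check what's available before you start:

[opc@my-instance ~]$ python3 --version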

Create a virtual environment and install Flask

  1. Create a folder called web-service:
[opc@my-instance ~]$ mkdir web-service && cd web-service
  2. Create a virtual environment:
[opc@my-instance web-service]$ python3 -m venv venv
  3. Activate the virtual environment:
[opc@my-instance web-service]$ . venv/bin/activate
(venv) [opc@my-instance web-service]$
  4. Update pip:
(venv) [opc@my-instance web-service]$ pip install --upgrade pip

Pip is a package manager for Python. It lets you install and manage libraries and dependencies for your Python applications.

  5. Within the virtual environment, install Flask:
(venv) [opc@my-instance web-service]$ pip install Flask
Collecting Flask
...
Successfully installed Flask-1.1.2

Now that you’ve created a virtual environment and installed Flask, create the service.

Create the service

  1. Use vim to create a file called app.py:
(venv) [opc@my-instance web-service]$ vim app.py
  2. In the editor, paste in the following code:
from flask import Flask

app = Flask(__name__)


# Respond to GET requests on the root path
@app.route('/')
def hello():
    return "Hello from the VM!"


if __name__ == '__main__':
    # Listen on all interfaces so the service is reachable from outside the VM
    app.run(host="0.0.0.0", port=5000)
  3. Save and close the file.

If you try to run the service now, you can’t access it from the internet. For that to work, you need to update the security list in the subnet and open the port in the firewall on the instance. Let’s update the security list first.

Update the security list

Open the Console UI and follow these steps:

  1. Open the navigation menu, select Networking, and then click Virtual Cloud Networks.
  2. Click the VCN that you created earlier.
  3. In the Subnets list, click the first (and only) subnet.
  4. In the Security Lists list, click the default security list.
  5. On the Security List Details page, click Add Ingress Rules.
  6. For the source CIDR, enter 0.0.0.0/0 to allow ingress traffic to the instance from any IP address.
  7. For the destination port range, enter 5000, because that’s the port that the service will run on.
  8. Click Add Ingress Rules.
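If you prefer to script this step instead of clicking through the Console, the OCI CLI can update the security list. The following is a rough sketch, assuming that the OCI CLI is installed and configured and that [SECURITY_LIST_OCID] is the OCID of the default security list. Note that --ingress-security-rules replaces the entire list of ingress rules, so the JSON file must also contain any existing rules that you want to keep:

$ cat ingress.json
[
  {
    "protocol": "6",
    "source": "0.0.0.0/0",
    "tcpOptions": { "destinationPortRange": { "min": 5000, "max": 5000 } }
  }
]
$ oci network security-list update --security-list-id [SECURITY_LIST_OCID] --ingress-security-rules file://ingress.json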

Open the firewall

  1. Go back to the terminal that’s connected to your instance and open the port on the firewall:
(venv) [opc@my-instance web-service]$ sudo firewall-cmd --permanent --zone=public --add-port=5000/tcp
success
  2. Reload the firewall settings:
(venv) [opc@my-instance web-service]$ sudo firewall-cmd --reload
success
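To double-check that the port is now open, you can list the ports in the public zone; 5000/tcp should appear in the output:

(venv) [opc@my-instance web-service]$ sudo firewall-cmd --zone=public --list-ports
5000/tcp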

Access the service

With the security list updated and the firewall rules reloaded, you can run the service and then try to access it through the internet.

  1. From the instance, run the Python app:
(venv) [opc@my-instance web-service]$ python app.py
 * Serving Flask app "app" (lazy loading)
 * Environment: production
   WARNING: This is a development server. Do not use it in a production deployment.
   Use a production WSGI server instead.
 * Debug mode: off
 * Running on http://0.0.0.0:5000/ (Press CTRL+C to quit)
  2. Either open the instance's public IP address in your browser (remember to add port 5000) or use curl, like this:
$ curl [INSTANCE_IP]:5000/
Hello from the VM!
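Note that the service stops as soon as you press CTRL+C or close the SSH session. If you want it to keep running while you create the API gateway in the next section, one quick option for this walkthrough (not a substitute for a proper WSGI server or system service) is to start it in the background with nohup:

(venv) [opc@my-instance web-service]$ nohup python app.py > app.log 2>&1 &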

Now you can create an API gateway and then access the service on the instance through the gateway.

Create the API Gateway

Create an API gateway to route requests to the service that’s running on the compute instance. Instead of using the instance’s IP address, you can use the API gateway’s address. Before creating the API gateway, refer to the prerequisites for using API gateways.

To create an API Gateway, sign in to the OCI Console, and follow these steps:

  1. Open the navigation menu, select Developer Services, and then click API Gateway.
  2. Click Create Gateway.
  3. For the Name, enter service-api-gateway.
  4. For the type, select Public.
  5. Select your compartment.
  6. Select the VCN that you created earlier.
  7. Select a public subnet.
  8. Click Create.

After the gateway is created, you can create an API deployment. An API deployment contains the specification that defines your API.

Create an API deployment

  1. Click Create Deployment.
  2. Select From Scratch.
  3. For the name, enter python-service.
  4. For the path prefix, enter /api.
  5. In the Rate Limiting section, click Add.
  • For the number of requests per second, enter 1.
  • For the type of rate limit, select Per client (IP).

This setting lets you rate limit based on the client's IP address. For example, with a limit of 10 requests per second, if you make more than 10 requests per second from IP address 1.0.0.0, you are rate limited, but someone else coming from IP address 2.0.0.0 can still make requests. The second option in the list is Total. With this option, the client IP address doesn't matter; as soon as the gateway receives 10 requests per second in total, you are rate limited. Note that setting the number of requests per second to 1 lets you hit that limit faster so you can try it out. It's not a realistic value for your production services.

  6. Click Apply Changes.
  7. Click Next.
  8. In the Route 1 section, specify the first route in the API deployment that maps a path and one or more methods to a backend service:
  • For the path, enter /hello.
  • For the method, select GET.
  • For the type, select HTTP.
  • For the URL, enter the IP address of the instance, including the port number. For example, http://1.0.0.0:5000.
  9. Click Show Route Response Policies, and then click Add.
  • For action, select Set.
  • For behavior, select Overwrite.
  • For header name, enter my-service.
  • For values, enter hello.
  10. Click Apply Changes.

For this specific route, you can also set request policies such as CORS, header transformations, or query parameter transformations. This ability is powerful because you can create different routes on the gateway with various transformations that route to the same or different backends. You can read more about this in Adding Request Policies and Response Policies to API Deployment Specifications.

  11. Click Next.
  12. Click Create.

When the deployment is completed, you can click the Copy link in the Endpoint column to copy the full URL of the gateway deployment. The URL includes the path prefix (/api).
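Under the hood, the deployment that you built in the wizard corresponds to an API deployment specification. As a rough sketch (the field names follow the API Gateway deployment specification, but treat this as an approximation rather than an exact export), the deployment for this example looks roughly like this in JSON form, with [INSTANCE_IP] standing in for your instance's address:

{
  "requestPolicies": {
    "rateLimiting": {
      "rateKey": "CLIENT_IP",
      "rateInRequestsPerSecond": 1
    }
  },
  "routes": [
    {
      "path": "/hello",
      "methods": ["GET"],
      "backend": {
        "type": "HTTP_BACKEND",
        "url": "http://[INSTANCE_IP]:5000"
      },
      "responsePolicies": {
        "headerTransformations": {
          "setHeaders": {
            "items": [
              { "name": "my-service", "values": ["hello"], "ifExists": "OVERWRITE" }
            ]
          }
        }
      }
    }
  ]
}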

Access the service through the gateway

Now try to make a request to the service through the gateway. You need to add /hello to the URL because that's the path that you used for the route.

$ curl https://bfearcufe6wekphtsx4pmfj6ui.apigateway.us-ashburn-1.oci.customer-oci.com/api/hello
Hello from the VM!

Remember that response header you set? To see the header being returned when you make the request, add the -v flag to the curl command:

$ curl -v https://bfearcufe6wekphtsx4pmfj6ui.apigateway.us-ashburn-1.oci.customer-oci.com/api/hello
...
< Server: Oracle API Gateway
< my-service: hello
...
Hello from the VM!

In addition to the other headers, notice the my-service header that you set.

Finally, try the rate limiting. If you run the same curl command multiple times in a row (at least more than once per second), the API gateway responds with the HTTP 429 code, as follows:

$ curl https://bfearcufe6wekphtsx4pmfj6ui.apigateway.us-ashburn-1.oci.customer-oci.com/api/hello
{"message":"Too Many Requests","code":429}

Another option for exposing a service that runs on an instance is through a load balancer. Creating a public load balancer lets you select compute instances without assigning public IP addresses to them. So, you aren't accessing the instances directly; instead, the load balancer distributes traffic across potentially multiple instances (without public IP addresses). Then, when you create a gateway, you use the load balancer's IP address, instead of the instance's IP address, as the backend.

Conclusion

In this post, you learned how to create an API gateway and configure it to access a Python web service running on a compute instance. You also learned how to configure rate limiting on the API gateway and return an additional header from the route.

Every use case is different. The only way to know if Oracle Cloud Infrastructure is right for you is to try it. You can select either the Oracle Cloud Free Tier or a 30-day free trial, which includes US$300 in credit to get you started with a range of services, including compute, storage, and networking.
