Pushing OCI Audit logs to PostgreSQL

October 4, 2022 | 6 minute read
Vishak Chittuvalapil
Senior Cloud Engineer
Bhushan Arora
Principal Cloud Architect
Lasya Vadavalli
Staff Cloud Engineer

Oracle Cloud Infrastructure (OCI) Audit logs record events or calls made to any supported API endpoint. The log events recorded by the Audit service include API calls made through the Oracle Cloud Console, command line interface (CLI), software development kits (SDKs), your own custom clients, or other OCI services. These log events are available in the Logging service, which also collects logs from other services, such as Events, Functions, and Object Storage, as well as custom logs.

OCI Logging to PostgreSQL

PostgreSQL, also known as Postgres, is a free, open source relational database management system (RDBMS) used as the primary data store or data warehouse for many web, mobile, geospatial, and analytics applications. In this blog post, we document a use case that pushes OCI Audit logs to PostgreSQL. Customers can then use these logs from the PostgreSQL database for their in-house analytics. You can use the same approach to send any other logs in OCI Logging to PostgreSQL.

Figure 1 illustrates the architecture for this use case. OCI Audit logs, which are available in Logging, are transferred through Service Connector Hub to the Functions service. The log content is ultimately pushed to PostgreSQL.



Figure 1: Architecture diagram

The guide for this use case utilizes the following steps:

  1. Create a table in your PostgreSQL database to receive OCI Audit logs.

  2. Create a function for pushing logs to PostgreSQL.

  3. Create a service connector that allows Audit logs to be transferred from Logging to the Functions service.

  4. Verify the OCI Audit logs in PostgreSQL.

Create a table in your PostgreSQL database to receive OCI Audit logs

Connect to the database that you want to house the table for the audit logs.

psql -h <ip_address> -p 5432 -U postgres -W -d postgres

Connect to the PostgreSQL DB

 

Create the required table.

CREATE TABLE logsdata (info json);

Create the required table

 

Describe the table with the following command:

\d+ logsdata

Describe the table
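
The logsdata table stores each audit record as a single json document. As a rough sketch (the field names here are a reduced, hypothetical subset of a real OCI Audit entry), one row's content could be built in Python like this:

```python
import json

# Hypothetical, reduced example of one OCI Audit log entry; a real payload
# contains many more fields than the ones queried later in this post.
sample_entry = {
    "data": {
        "eventName": "ListBuckets",
        "identity": {
            "principalName": "example.user@example.com",
            "ipAddress": "192.0.2.10",
        },
        "request": {"action": "GET"},
    }
}

# Each row in logsdata holds one such document in its single "info" column.
doc = json.dumps(sample_entry)
print(json.loads(doc)["data"]["eventName"])  # → ListBuckets
```

Storing the whole entry in one json column keeps the table schema independent of the log format, at the cost of doing field extraction at query time.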

 

Create a function for pushing logs to PostgreSQL

  1. In the Oracle Cloud Console menu, navigate to Developer Services and select Functions.

  2. Select an existing application or click Create Application. Create a function within your application. If you’re new to Functions, follow sections A, B, and C in the Functions Quick Start on Cloud Shell.

  3. We recommend creating a sample Python function first. The fn init command generates a folder called logging-postgres with three files: func.py, func.yaml, and requirements.txt.
    fn init --runtime python logging-postgres
  4. Replace the autogenerated files func.py, func.yaml, and requirements.txt. Replace func.py and func.yaml with the following, entering the database name, user, password, host, and port as required.

    import io
    import json
    import logging

    from fdk import response
    import psycopg2  # Python library for connecting to a PostgreSQL database
    from psycopg2.extras import Json

    # The handler receives the list of log entries from OCI as a byte stream;
    # it is registered as the entrypoint in func.yaml.
    def handler(ctx, data: io.BytesIO = None):
        logger = logging.getLogger()
        logger.info("function start")
        try:
            # Parse the byte-stream input as a JSON array
            logentries = json.loads(data.getvalue())
            if not isinstance(logentries, list):
                logger.error('Invalid connector payload. No log entries detected')
                raise ValueError('Invalid connector payload. No log entries detected')

            # Optional: log the input for debugging. Not to be used in production.
            logger.info("json input from SCH")
            logger.info(data.getvalue())

            # Create a connection using the PostgreSQL host and credentials
            conn = psycopg2.connect(database="postgres", user='postgres', password='****', host='x.x.x.x', port='5432')
            cur = conn.cursor()

            # Insert each log entry into the existing "info" column of the
            # existing "logsdata" table, then commit the changes
            for logentry in logentries:
                cur.execute("INSERT INTO logsdata (info) VALUES (%s)", [Json(logentry)])
            conn.commit()
            cur.close()
            conn.close()

            logger.info("function end")
            return
        except Exception as e:
            logger.error("Failure in the function: {}".format(str(e)))
            raise

    Replace func.yaml with the following contents:

    schema_version: 20180708
    name: logging-postgres
    version: 0.0.71
    runtime: python
    build_image: fnproject/python:3.9-dev
    run_image: fnproject/python:3.9
    entrypoint: /python/bin/fdk /function/func.py handler
    memory: 1024

    Replace requirements.txt with the following contents:

    fdk>=0.1.46
    
    psycopg2-binary
  5. Deploy your function.

    fn -v deploy --app logging-postgres
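Before deploying, you can sanity-check the payload handling locally without a database. The helper below is a hypothetical sketch that mirrors the handler's input validation; it is not part of the function itself:

```python
import io
import json

def parse_log_entries(data: io.BytesIO) -> list:
    # Service Connector Hub delivers a JSON array of log entries as a byte stream
    entries = json.loads(data.getvalue())
    if not isinstance(entries, list):
        raise ValueError("Invalid connector payload. No log entries detected")
    return entries

# Simulate the byte stream the function would receive
payload = io.BytesIO(json.dumps([{"data": {"eventName": "ListBuckets"}}]).encode())
entries = parse_log_entries(payload)
print(len(entries))  # → 1
```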

Create a service connector that allows Audit logs to be transferred from Logging to Functions

  1. In the Oracle Cloud Console menu, navigate to Observability and Management and select Service Connector Hub.

  2. Create the service connector in a compartment and configure Logging as the source and Functions as the destination.

    • Under Configure Source, choose the compartment and the log group _Audit. If you want to send audit logs from all compartments, select Include _Audit in subcompartments, as shown in the following figure.

    • Under Configure Destination, choose the compartment, function application, and function that you created previously.

    • Create the default policy, which appears on the screen, allowing the service connector to write to OCI Functions. Then, click Create.

    Figure 2: Creating a service connector

Verify the OCI audit logs in PostgreSQL

Verify the OCI Audit logs using the following query. You can adapt the query for any required analysis or visualization.

select info -> 'data' -> 'eventName' as EventName,
       info -> 'data' -> 'identity' ->> 'principalName' as principalName,
       info -> 'data' -> 'identity' ->> 'ipAddress' as ipAddress,
       info -> 'data' -> 'request' ->> 'action' as action
from logsdata
limit 20;

Verify logs in PostgreSQL
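
If you prefer to post-process rows in Python rather than in SQL, the field extraction that the query performs with the -> and ->> operators can be sketched as a small helper (hypothetical, for illustration only):

```python
# Hypothetical helper mirroring the SQL query's -> / ->> JSON navigation
def extract_fields(info: dict) -> dict:
    data = info.get("data", {})
    identity = data.get("identity", {})
    return {
        "EventName": data.get("eventName"),
        "principalName": identity.get("principalName"),
        "ipAddress": identity.get("ipAddress"),
        "action": data.get("request", {}).get("action"),
    }

row = extract_fields({
    "data": {
        "eventName": "ListBuckets",
        "identity": {"principalName": "example.user", "ipAddress": "192.0.2.10"},
        "request": {"action": "GET"},
    }
})
print(row["EventName"], row["action"])  # → ListBuckets GET
```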

 

Conclusion

This blog post showed how you can transfer logs from OCI Audit to PostgreSQL. You can fulfill many requirements and use cases, such as transferring any OCI logs from your virtual cloud network (VCN), load balancer, or WAF to PostgreSQL or any other database, including Autonomous Database. If you're looking at visualization with machine learning, dashboarding, and alerting, you can also use OCI Logging Analytics to receive logs from Oracle Cloud Infrastructure.



