Oracle Cloud Infrastructure (OCI) Audit logs record events or calls made to any supported API endpoints. The log events recorded by the Audit service include API calls made by the Oracle Cloud Console, command line interface (CLI), software development kits (SDKs), your own custom clients, or other OCI services. These log events are available in the Logging service, which also supports log collection from other services, such as Events, Functions, and Object Storage, as well as custom logs.
OCI Logging to PostgreSQL
PostgreSQL, also known as Postgres, is a free, open source relational database management system (RDBMS) and is used as the primary data store or data warehouse for many web, mobile, geospatial, and analytics applications. In this blog post, we document a use case that pushes OCI Audit logs to PostgreSQL. Customers can then use these logs from the PostgreSQL database for their in-house analytics. You can use this same approach for sending any other logs in OCI Logging to PostgreSQL.
Figure 1 illustrates the architecture for this use case. OCI Audit logs, available in Logging, are transferred using Service Connector Hub to the Functions service. The log content is ultimately pushed out to PostgreSQL.
Figure 1: Architecture diagram
The guide for this use case utilizes the following steps:
Create a table in your PostgreSQL database to receive OCI Audit logs.
Create a function for pushing logs to PostgreSQL.
Create a service connector that allows Audit logs to be transferred from Logging to the Functions service.
Verify the OCI Audit logs in PostgreSQL.
Create a table in your PostgreSQL database to receive OCI Audit logs
Connect to the database that you want to house the table for the audit logs.
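The function shown later in this post inserts each log entry into a JSONB column named info in a table called logsdata. As one possible shape for that table, assuming those names, you can create it with a statement like the following (the id primary key is an optional convenience, not required by the function):

```sql
-- Table to receive OCI Audit log entries as JSON documents.
-- The "logsdata" table name and "info" column name match the function code;
-- the serial "id" primary key is an optional convenience.
CREATE TABLE logsdata (
    id   serial PRIMARY KEY,
    info jsonb NOT NULL
);
```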
Create a function for pushing logs to PostgreSQL
In the Oracle Cloud Console menu, navigate to Developer Services and select Functions.
Select an existing application or click Create Application. Create a function within your application. If you’re new to Functions, follow sections A, B, and C in the Functions Quick Start on Cloud Shell.
We recommend creating a sample Python function first. The fn init command generates a folder called logging-postgres with three files: func.py, func.yaml, and requirements.txt.
fn init --runtime python logging-postgres
Replace the auto-generated files func.py, func.yaml, and requirements.txt with the source code for the PostgreSQL function. Replace func.py with the following code. Remember to input your database name, user, password, host, and port as required.
import io
import json
import logging
import os

from fdk import response
import psycopg2  # Python library for PostgreSQL database connections
from psycopg2.extras import Json

# The handler receives the list of log entries from OCI as a bytestream.
# It is registered as the function entry point in func.yaml.
def handler(ctx, data: io.BytesIO = None):
    logger = logging.getLogger()
    logger.info("function start")
    try:
        # Parse the bytestream input as a JSON array
        logentries = json.loads(data.getvalue())
        if not isinstance(logentries, list):
            logger.error('Invalid connector payload. No log queries detected')
            raise ValueError('Invalid connector payload. No log queries detected')
        # Optional: log the function input as JSON. Do not use in production.
        logger.info("json input from SCH")
        logger.info(data.getvalue())
        # Create a connection using the PostgreSQL host IP and credentials
        conn = psycopg2.connect(database="postgres", user='postgres', password='****', host='x.x.x.x', port='5432')
        cur = conn.cursor()
        # Insert each log entry into the INFO column of the existing "logsdata" table
        for logentry in logentries:
            cur.execute("INSERT INTO logsdata(INFO) VALUES (%s)", [Json(dict(logentry))])
        # Commit all inserts at once
        conn.commit()
        cur.close()
        conn.close()
        logger.info("function end")
        return
    except Exception as e:
        logger.error("Failure in the function: {}".format(str(e)))
        raise
Replace requirements.txt with the following contents:
fdk>=0.1.46
psycopg2-binary
Deploy your function.
fn -v deploy --app logging-postgres
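Before wiring up the service connector, you can sanity-check the payload handling locally. Service Connector Hub delivers log entries to the function as a JSON array in a bytestream; the sketch below simulates that input with a simplified, hypothetical audit entry (no database connection involved, and the field values are made up for illustration):

```python
import io
import json

# Simulate the bytestream that Service Connector Hub passes to the handler.
# The entry below is a simplified, hypothetical audit log record.
sample_entries = [{"data": {"eventName": "ListBuckets",
                            "identity": {"principalName": "demo.user"}}}]
data = io.BytesIO(json.dumps(sample_entries).encode("utf-8"))

# Mirror the parsing and validation done at the top of the handler
logentries = json.loads(data.getvalue())
if not isinstance(logentries, list):
    raise ValueError("Invalid connector payload. No log queries detected")

for logentry in logentries:
    print(logentry["data"]["eventName"])  # prints: ListBuckets
```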
Create a service connector that allows Audit logs to be transferred from Logging to Functions
In the Oracle Cloud Console menu, navigate to Observability and Management and select Service Connector.
Create the service connector in a compartment and configure Logging as the source and Functions as the destination.
Under Configure Source, choose the compartment and select _Audit as the log group. If you want to send audit logs from all compartments, choose _Audit Include _Audit in subcompartments instead, as shown in the following figure.
Under Configure Destination, choose the compartment, function application, and function that you created previously.
Create the default policy, which appears on the screen, allowing the service connector to write to OCI Functions. Then, click Create.
Figure 2: Creating a service connector
Verify the OCI audit logs in PostgreSQL
Verify the OCI Audit logs using the following query. You can also try different queries for any required visualizations.
select info -> 'data' -> 'eventName' as EventName,
       info -> 'data' -> 'identity' ->> 'principalName' as principalName,
       info -> 'data' -> 'identity' ->> 'ipAddress' as ipAddress,
       info -> 'data' -> 'request' ->> 'action' as action
from logsdata
limit 20;
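To see what the -> and ->> operators pull out of each row, here is a pure-Python equivalent applied to a simplified, hypothetical audit entry (field values are made up for illustration):

```python
import json

# A simplified, hypothetical audit log entry as stored in the "info" jsonb column
info = {
    "data": {
        "eventName": "ListBuckets",
        "identity": {"principalName": "demo.user", "ipAddress": "203.0.113.10"},
        "request": {"action": "GET"},
    }
}

# Equivalent of: info -> 'data' -> 'eventName'  ("->" keeps the value as JSON)
event_name = json.dumps(info["data"]["eventName"])     # '"ListBuckets"'
# Equivalent of: info -> 'data' -> 'identity' ->> 'principalName'  ("->>" returns text)
principal = info["data"]["identity"]["principalName"]  # 'demo.user'
action = info["data"]["request"]["action"]             # 'GET'
print(event_name, principal, action)
```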
Conclusion
This blog post showed how you can transfer logs from OCI Audit to PostgreSQL. You can use the same approach for many requirements and use cases, such as transferring any OCI logs from your virtual cloud network (VCN), load balancer, or WAF to PostgreSQL or any other database, including Autonomous Database. If you're looking for visualization with machine learning, dashboarding, and alerting, you can also use OCI Logging Analytics to receive logs from Oracle Cloud Infrastructure.
For more information, see the following resources:
Lasya is a Cloud Engineer with extensive experience on OCI. She has expertise in OCI Security, Monitoring and Logging, OCI network architectures, and connectivity to OCI using VPNs and FastConnect, along with the basics of Terraform, the Python SDK, and the OCI CLI.
Vishak Chittuvalapil
Senior Cloud Engineer
Vishak Chittuvalapil is a Senior Cloud Engineer at Oracle, focusing on OCI security, observability, and governance. He has authored several Oracle blogs and technical assets, contributed to field-wide initiatives like the MAP program, and holds CCSP, Cybersecurity, and multiple OCI specialist certifications. With a hands-on, solution-driven approach, Vishak helps customers design and implement secure, scalable architectures on Oracle Cloud.