In this article, we are going to see how we can leverage the new Logging Analytics Service with other OCI Services to get Security and Governance insights from an Identity Cloud Service (IDCS) instance.
Below is one example dashboard that can be imported to quickly surface a wide variety of useful insights from your IDCS Audit logs.
At the end of this blog, you can see an additional dashboard showing insight on administration actions.
To implement this, we are going through several steps:
At the end of this integration, you will be able to see IDCS Audit Logs in a Logging Analytics Dashboard.
We assume that you are familiar with OCI and REST APIs, that the Logging Analytics Service is enabled and ready to be used, and that you have administrator access to both the OCI Tenancy and IDCS.
In this step, we are going to create a client application that uses a token to call the IDCS REST APIs.
This saves you from providing your username/password to authenticate the function we will create later.
In the IDCS Console, go to Applications and click on Add:
Select confidential application:
Give it a name and click on Next.
Select Configure now.
In the configuration, under Allowed Grant Types, check Resource Owner.
Grant the client access to Identity Cloud Service Admin by adding Identity Domain Administrator role to the application.
Click on next.
Check Enforce Grant as Authorization
Click on Next, then Finish.
A popup with the client ID and the Client Secret is shown.
Copy Client ID and Client Secret. They will be needed later.
Activate the application
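Before moving on, you can verify that the client application works by exchanging its credentials for an access token. Below is a minimal Python sketch: the URL, Client ID and Client Secret are placeholders for the values you just saved, and the scope shown is an assumption that may differ in your tenancy.

```python
import base64


# Placeholders -- replace with the values from your own IDCS instance.
IDCS_URL = "https://idcs-xxx.identity.oraclecloud.com"
CLIENT_ID = "your-client-id"
CLIENT_SECRET = "your-client-secret"


def basic_auth_header(client_id: str, client_secret: str) -> str:
    """Build the HTTP Basic authorization header from the client credentials."""
    token = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    return f"Basic {token}"


def get_access_token() -> str:
    """Request an OAuth access token using the client credentials grant."""
    import requests  # third-party; declared in requirements.txt later in this blog

    resp = requests.post(
        f"{IDCS_URL}/oauth2/v1/token",
        headers={
            "Authorization": basic_auth_header(CLIENT_ID, CLIENT_SECRET),
            "Content-Type": "application/x-www-form-urlencoded",
        },
        data={"grant_type": "client_credentials",
              "scope": "urn:opc:idm:__myscopes__"},
    )
    resp.raise_for_status()
    return resp.json()["access_token"]
```

A 200 response with an `access_token` field confirms the application is active and the grant type is configured correctly.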
In this step, we are going to prepare the OCI resources needed.
For isolation purposes, all the resources will be created in a dedicated compartment, IDCS_AUDITLOG.
We are going to create the compartment, a user, a group, a dynamic group, a VCN with all the network configuration, the Object Storage Bucket, and all the needed policies.
Create a compartment IDCS_AUDITLOG
Create an IAM user idcs_auditlog_user
For this user, create an API Key by uploading a .PEM public key, then save the fingerprint. You will need it later.
Generate an Auth Token and save the token. You will need it later.
Create an IAM group idcs_auditlog_grp and add idcs_auditlog_user to the group.
Create a VCN idcs_auditlog_vcn with a private subnet and a public subnet.
Create an internet gateway and a service gateway.
Create 2 route tables: one that uses the Internet Gateway,
another one that uses the Service Gateway:
Edit the Public Subnet and set the Internet Gateway route as the default route:
Add an ingress rule to allow HTTPS traffic.
Edit the Private Subnet and set the Service Gateway Route as the default Route:
Navigate to the Object Storage Service, pick the IDCS_AUDITLOG compartment and create a bucket idcs_auditlog_bucket.
Navigate to the API Gateway Service, pick the IDCS_AUDITLOG compartment and create an API Gateway by selecting the VCN, then the public subnet.
Navigate to the Functions Service, select your compartment and click on Create Application. Give it a name and select your VCN, then the private subnet.
Create a Dynamic Group dyn_grp_idcsauditlog and add the following rule, replacing the resource.compartment.id value with your compartment OCID:
ALL {resource.type = 'fnfunc', resource.compartment.id = 'ocid1.compartment.oc1..axxxx'}
Navigate through IAM Policies and create the following policies for the Function:
allow service FaaS to use virtual-network-family in tenancy
allow service FaaS to read repos in tenancy
allow group idcs_auditlog_grp to manage object-family in compartment IDCS_AUDITLOG
allow group idcs_auditlog_grp to manage repos in tenancy
allow group idcs_auditlog_grp to read metrics in tenancy
allow group idcs_auditlog_grp to read objectstorage-namespaces in compartment IDCS_AUDITLOG
allow group idcs_auditlog_grp to use virtual-network-family in compartment IDCS_AUDITLOG
allow group idcs_auditlog_grp to manage functions-family in compartment IDCS_AUDITLOG
allow group idcs_auditlog_grp to use cloud-shell in compartment IDCS_AUDITLOG
allow group idcs_auditlog_grp to use functions-family in compartment IDCS_AUDITLOG
allow dynamic-group dyn_grp_idcsauditlog to manage objects in compartment IDCS_AUDITLOG
allow any-user to use functions-family in compartment IDCS_AUDITLOG where ALL {request.principal.type= 'ApiGateway', request.resource.compartment.id = 'ocid1.compartment.oc1..axxxx'}
Also add the following policies needed for the Logging Analytics Service:
allow service loganalytics to READ loganalytics-features-family in tenancy
allow group idcs_auditlog_grp to {LOG_ANALYTICS_OBJECT_COLLECTION_RULE_CREATE,LOG_ANALYTICS_LOG_GROUP_UPLOAD_LOGS, LOG_ANALYTICS_ENTITY_UPLOAD_LOGS, LOG_ANALYTICS_SOURCE_READ, BUCKET_UPDATE, LOG_ANALYTICS_OBJECT_COLLECTION_RULE_DELETE} in tenancy
allow service loganalytics to read buckets in tenancy
allow group idcs_auditlog_grp to read compartments in tenancy
allow service loganalytics to read objects in tenancy
allow service loganalytics to manage cloudevents-rules in tenancy
allow service loganalytics to inspect compartments in tenancy
allow service loganalytics to use tag-namespaces in tenancy where all {target.tag-namespace.name = /oracle-tags/}
allow group idcs_auditlog_grp to manage management-dashboard-family in tenancy
In this step, we are going to create a function that collects the audit logs of the last 5 minutes from IDCS through the REST API and sends them to an Object Storage Bucket.
On the OCI Console, go to Developer Services, click on Functions, then click on the Application created previously.
Follow the Getting Started steps on the left side of the page using the Cloud Shell Setup:
1. Launch Cloud Shell
2. Set up fn CLI on Cloud Shell
3. Update the context with the function’s compartment
4. Update the context with the location of the Registry you want to use (replace OCIR-REPO with appidcsloganalytics).
5. Generate an Auth Token. This was already done earlier; reuse the token you saved.
6. Log in to the Registry. Note: use the user created previously and the Auth Token value.
7. Verify your setup
Stop the Getting Started steps here, and continue with the following instructions:
Create a postauditlogs Python function by entering:
fn init --runtime python postauditlogs
A directory called postauditlogs is created with 3 files: func.py, func.yaml, requirements.txt
cd postauditlogs
Edit the file requirements.txt to contain these three lines:
fdk
requests
oci
Edit the file func.py and replace its content with the Python code available in the blog resources.
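As a rough, illustrative sketch of what that code does (the helper names, object naming scheme and OAuth scope below are assumptions, not the blog's exact code), the handler fetches the AuditEvents of the last 5 minutes from IDCS and writes them to the bucket using Resource Principal authentication:

```python
import io
import json
from datetime import datetime, timedelta, timezone


def build_audit_filter(now: datetime, minutes: int = 5) -> str:
    """SCIM filter selecting the AuditEvents of the last `minutes` minutes."""
    since = now - timedelta(minutes=minutes)
    return f'timestamp gt "{since.strftime("%Y-%m-%dT%H:%M:%SZ")}"'


def get_token(cfg: dict) -> str:
    """Client-credentials OAuth call against the IDCS application."""
    import base64
    import requests

    creds = base64.b64encode(f"{cfg['apiuser']}:{cfg['apipwd']}".encode()).decode()
    resp = requests.post(
        f"{cfg['idcsurl'].rstrip('/')}/oauth2/v1/token",
        headers={"Authorization": f"Basic {creds}",
                 "Content-Type": "application/x-www-form-urlencoded"},
        data={"grant_type": "client_credentials",
              "scope": "urn:opc:idm:__myscopes__"},
    )
    resp.raise_for_status()
    return resp.json()["access_token"]


def handler(ctx, data: io.BytesIO = None):
    # Third-party modules come from requirements.txt; imported here so the
    # pure helpers above stay testable outside the Functions runtime.
    import requests
    import oci
    from fdk import response

    cfg = dict(ctx.Config())  # idcsurl, apiuser, apipwd, set in the console
    events = requests.get(
        f"{cfg['idcsurl'].rstrip('/')}/admin/v1/AuditEvents",
        params={"filter": build_audit_filter(datetime.now(timezone.utc))},
        headers={"Authorization": f"Bearer {get_token(cfg)}"},
    ).json()

    # Resource Principal auth: allowed by the dynamic-group policy created above.
    signer = oci.auth.signers.get_resource_principals_signer()
    os_client = oci.object_storage.ObjectStorageClient({}, signer=signer)
    namespace = os_client.get_namespace().data
    object_name = f"idcs_audit_{datetime.now(timezone.utc):%Y%m%d%H%M%S}.json"
    os_client.put_object(namespace, "idcs_auditlog_bucket", object_name,
                         json.dumps(events))
    return response.Response(ctx, response_data=json.dumps({"uploaded": object_name}))
```

The 5-minute window in `build_audit_filter` matches the health-check interval configured later, so consecutive invocations cover the log stream without large gaps.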
Deploy the application by launching the command:
fn -v deploy --app App_idcs_auditlog
You should see your function deployed in the console. Click on it, and in the Configuration menu, add the following parameters: idcsurl, apiuser and apipwd.
idcsurl is the IDCS URL, in the format https://idcs-xxx.identity.oraclecloud.com/
apiuser and apipwd are the Client ID and Client Secret created previously.
To test the function, launch the command:
fn invoke App_idcs_auditlog postauditlogs
You should see a new object in your bucket:
In this step, we are going to schedule the function you just deployed for an execution every 5 minutes. To do so, we will expose the function on an API Gateway, and use a health check with a 5-minute interval to invoke it automatically.
Navigate to the API Gateway Service, click on the API Gateway previously created, and create a new deployment.
Give it a name and set the path prefix to /.
On the next page, set the path to /fn/loadauditlog, select the GET method and the function you created previously.
Click on Next then on Create.
Navigate to the Health Checks Service and create a new HTTPS Health Check.
Select the API Gateway as a target,
and set the path to /fn/loadauditlog. It must be the same as the one defined in the API Gateway deployment.
Select the GET method and set the interval to 5 minutes, since the function polls data for the last 5 minutes.
After a few minutes, you can see the Health Check invoking the function every 5 minutes. Each invocation sends the latest audit logs to the Object Storage Bucket.
In this step, we are going to configure Logging Analytics to ingest Logs from the Bucket.
We assume that Logging Analytics is activated, and the mandatory policies have been added.
allow group Log-Analytics-SuperAdmins to manage loganalytics-features-family in tenancy
allow service loganalytics to {LOG_ANALYTICS_LIFECYCLE_INSPECT, LOG_ANALYTICS_LIFECYCLE_READ} in tenancy
allow service loganalytics to read loganalytics-features-family in tenancy
allow group Log-Analytics-SuperAdmins to read compartments in tenancy
The policies needed for the log ingestion between OCI Logging and OCI Object Storage Bucket have been created previously in this blog.
Here, we are going to perform two steps: Import the Log Source and Parser from the associated blog resources, then create an OCI Logging Analytics Log Group.
To import the Source and Parsers into your OCI Logging Analytics service, navigate to the service, click on Administration, then on Import Configuration Content in the Actions panel:
Go to Logging Analytics, then Administration, click on Log Groups, pick your compartment, then create a new Log Group.
To enable the log collection, an ObjectCollectionRule resource needs to be created.
You can use CLI or oci-curl or your favorite REST API tool to perform this.
In this example, we will use the CLI.
Set up your oci-cli environment, then launch the following command, setting a name for the rule and updating the values of compartmentId, osNamespace, osBucketName, and namespace-name.
oci log-analytics object-collection-rule create --name rule_idcs --compartment-id XXX --os-namespace XXX --os-bucket-name XXX --log-group-id XXX --log-source-name IDCS --namespace-name XXX --collection-type HISTORIC_LIVE --poll-since BEGINNING
The log-source-name value must match the Source imported earlier (IDCS); keep it as is.
The collection-type and poll-since values are set to collect all historical data, then continuously collect all new logs.
By now, if you navigate to your OCI Logging Analytics, you can see the IDCS Logs.
Here, we’re going to import the sample Dashboards into your OCI Logging Analytics Service.
Download the Dashboard json export from the resources.
Edit the file, and replace all EDITCOMPARTMENT fields by your compartmentId.
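If you prefer to script that replacement rather than edit the export by hand, a small sketch (the file path and OCID shown are placeholders):

```python
from pathlib import Path


def set_compartment(dashboard_json: str, compartment_ocid: str) -> str:
    """Replace every EDITCOMPARTMENT placeholder with the real compartment OCID."""
    return dashboard_json.replace("EDITCOMPARTMENT", compartment_ocid)


if __name__ == "__main__":
    # Placeholder path and OCID -- adjust to your environment.
    path = Path("dashboard.json")
    path.write_text(set_compartment(path.read_text(),
                                    "ocid1.compartment.oc1..axxxx"))
```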
Launch the following oci-cli command:
oci management-dashboard dashboard import --dashboards /path/to/dashboard.json
You should see the new dashboard on your service that you can start to use.
I hope you found this blog helpful. Here are a couple of next steps to help you get started: