How to Ingest VCN Flow Logs into Splunk

Each instance in an Oracle Cloud Infrastructure virtual cloud network (VCN) has one or more virtual network interface cards (VNICs) for communication within and outside of the VCN. Oracle Cloud Infrastructure Networking uses security lists and network security groups to determine what traffic is allowed in and out of a given VNIC. A VNIC is subject to all the rules in the security lists associated with its subnet and in any network security groups that it belongs to.

To help you meet audit and compliance requirements and monitor the traffic in and out of your VNICs, you can now set up VCN flow logs to record details about traffic that has been accepted or rejected based on the security list or network security group rules. This post describes multiple ways that you can ingest this data and visualize your VCN traffic by using Splunk.

VCN Flow Log Entry Syntax

A flow log record is a space-separated string that has the following format:

<version> <srcaddr> <dstaddr> <srcport> <dstport> <protocol> <packets> <bytes> <start_time> <end_time> <action> <status>

For example, with placeholder source and destination addresses:

2 10.0.0.5 10.0.0.6 73 89 11 102 349 1557424462 1557424510 ALLOW OK
2 10.0.0.7 10.0.0.8 82 64 13 112 441 1557424462 1557424486 REJECT OK
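Because each record is a flat, space-separated string, it is easy to parse programmatically. The following is a minimal Python sketch, with a hypothetical helper name, that maps a version 2 record to named fields:

    # Hypothetical helper: map a version 2 flow log record to named fields.
    FLOW_LOG_V2_FIELDS = [
        "version", "srcaddr", "dstaddr", "srcport", "dstport", "protocol",
        "packets", "bytes", "start_time", "end_time", "action", "status",
    ]

    def parse_flow_log_record(record):
        """Split a space-separated flow log line into a field dictionary."""
        values = record.split()
        if len(values) != len(FLOW_LOG_V2_FIELDS):
            raise ValueError("unexpected field count: {}".format(len(values)))
        return dict(zip(FLOW_LOG_V2_FIELDS, values))

    # parse_flow_log_record("2 10.0.0.5 10.0.0.6 73 89 11 102 349 1557424462 1557424510 ALLOW OK")
    # returns {'version': '2', 'srcaddr': '10.0.0.5', ..., 'status': 'OK'}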

Ingestion Methods: Pull versus Push

Splunk supports numerous ways to ingest data, from monitoring local files by using an agent or streaming wire data, to pulling data from remote third-party APIs, to receiving data over syslog, TCP, UDP, or HTTP. One example of the push method is using Oracle Functions to stream events over HTTPS to the Splunk HTTP Event Collector (HEC). An example of the pull method is the Oracle Cloud Infrastructure Object Storage plugin for Splunk.

These pull and push models apply to different use cases and have different considerations. This post pertains to the event-driven push model, which provides a more scalable and closer to real-time feed of flow log data. Because there are no dedicated pollers to manage and orchestrate, the push model generally offers the following benefits:

  • Lower operational complexity and cost
  • Easier scaling
  • Lower friction
  • Lower latency

Solution: Instantly Visualize Your VCN Flow Log Data on Splunk with Oracle Functions and Events

This solution uses the newly released Oracle Cloud Infrastructure Events service and Oracle Functions alongside Splunk's HTTP Event Collector to achieve a highly scalable pattern that meets large-scale enterprise ingestion requirements.

By following the steps in this post, you can create super-functional visualizations like the one shown in the following image.

Screenshot that shows a flow log visualization in Splunk.

Example Deployment Diagram

In the following diagram, the Oracle Cloud Infrastructure Logging service writes the flow log data to an Object Storage bucket. Object Storage then emits a Create Object event to the Events service. The Events service in turn triggers a function. This function then splits the data into chunks allowed by your Splunk HTTP Event Collector and sends those chunks over HTTP or HTTPS to your Event Collector endpoint.
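For context, the trigger that the function receives is a standard Events service payload. An abridged sketch of a Create Object event looks something like the following; values are placeholders, and some fields are omitted:

    {
      "eventType": "com.oraclecloud.objectstorage.createobject",
      "data": {
        "resourceName": "<flow-log-object-name>",
        "additionalDetails": {
          "namespace": "<tenancy-namespace>",
          "bucketName": "<flow-log-bucket-name>"
        }
      }
    }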

Diagram that shows how the Logging service writes the flow log data to an Object Storage bucket.

Steps for Streaming VCN Flow Logs

The rest of this post walks you through the following steps to achieve this solution:

  • Step 1: Enable Flow Logs for a Subnet
  • Step 2: Configure the Splunk Source Type
  • Step 3: Create the Field Transform in Splunk
  • Step 4: Create an HEC Token from Splunk Enterprise
  • Step 5: Configure the Oracle Function
  • Step 6: Configure the Function to Trigger an Event
  • Step 7: Grant Access to Object Storage
  • Step 8: Test
  • Step 9: Visualize and Alert


  • We recommend using the Oracle Cloud Infrastructure Cloud Shell because it comes with all the preconfigured tools that you need. To open Cloud Shell, click the icon at the top of the Oracle Cloud Console window.

    Screenshot of the Oracle Cloud Console with the Cloud Shell icon in focus.

    Cloud Shell appears at the bottom of the Console. Keep it open because you’ll use it for several steps.

    Screenshot that shows the Cloud Shell area in the Console.

  • Create a repository in Oracle Cloud Infrastructure Registry to store the Oracle Functions images.

  • Ensure that your user has the correct permissions. Use the following example policy statements, which assume that your user is a member of a group called flow-log-enablers. Replace this group with any existing group that you want to use. You can also narrow the policy to meet your governance requirements.

    • Create an IAM policy in the root compartment. If you’re a network administrator who has permission to manage all the Networking service components, you have permission to enable flow logs for subnets. Otherwise, you need a policy such as this one:

      Allow group flow-log-enablers to use flow-log-configs in tenancy
      Allow group flow-log-enablers to use subnets in tenancy
      Allow group flow-log-enablers to manage functions-family in tenancy
      Allow group flow-log-enablers to use repos in tenancy
    • You also need a policy statement that grants access to the Registry repository:

      Allow group flow-log-enablers to manage repos in tenancy where ANY {request.permission = 'REPOSITORY_CREATE', request.permission = 'REPOSITORY_UPDATE'}
    • Add this policy to give the Oracle Functions service access to Registry and to your network resources:

      Allow service FaaS to read repos in tenancy
      Allow service FaaS to use virtual-network-family in compartment <compartment-name>

Step 1: Enable Flow Logs for a Subnet

You can think of this operation as attaching the flow log configuration to the subnet.

My colleague, Paul Cainkar, wrote a detailed post about enabling flow logs for your subnet. Follow his steps to complete this task.

Step 2: Configure the Splunk Source Type

Now that flow logs are being recorded, start setting up the data pipeline at Splunk.

Follow the steps in the Splunk documentation to create a source type.

On the Advanced tab, set the following parameters:

  • LINE_BREAKER: ([\r\n]+)
  • category: Network & Security
  • disabled: false
  • pulldown_type: true

Screenshot that shows the Advanced tab with the described parameter values.
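If you manage configuration files directly instead of using Splunk Web, the equivalent stanza in props.conf would look roughly like this, assuming you name the source type flowlogs as used later in this post:

    [flowlogs]
    LINE_BREAKER = ([\r\n]+)
    category = Network & Security
    disabled = false
    pulldown_type = true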

Step 3: Create the Field Transform in Splunk

This step of the data pipeline tells Splunk how to transform the raw flow log data into meaningful columns. Follow the instructions in the Splunk documentation to create a field transform.

In this step, you create a delimiter-based transform on the _raw source key. The delimiter in this case is a space, and you specify that as follows: " "

Screenshot that shows the delimiter value on the raw key and the field list values as described in the text.

The full field list for version 2 of VCN flow logs, in order, is: version, srcaddr, dstaddr, srcport, dstport, protocol, packets, bytes, start_time, end_time, action, status.
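If you manage configuration files directly, a delimiter-based transform equivalent to the one described above would look roughly like the following sketch; the stanza name flowlogs-fields is a placeholder:

    transforms.conf:

    [flowlogs-fields]
    SOURCE_KEY = _raw
    DELIMS = " "
    FIELDS = version, srcaddr, dstaddr, srcport, dstport, protocol, packets, bytes, start_time, end_time, action, status

    props.conf:

    [flowlogs]
    REPORT-flowlogs = flowlogs-fields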


Step 4: Create an HEC Token from Splunk Enterprise

Configure your Splunk HTTP Event Collector (HEC), and create an HEC token that you will need in a later step. Follow the instructions in the Splunk documentation.

Note: For Splunk Cloud deployments, HEC must be enabled by Splunk Support.

Screenshot that shows the Splunk UI for creating an HTTP Event Collector.

When you configure the input settings, specify any index that you want HEC to forward your flow log data to, and specify flowlogs as the source type. Also note your new HEC token value; you’ll need it in the following step.

The following screenshot shows an example of the data input settings.

Screenshot that shows data input settings, such as flowlogs for the source type, Search and Reporting for the app context, and the main index selected.
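Before continuing, you can verify that the new token accepts events by sending a test event from any host that can reach the HEC endpoint. This minimal example assumes port 8088 and a self-signed certificate (hence the -k flag):

    curl -k https://<splunk_hec_host>:8088/services/collector/event \
      -H "Authorization: Splunk <hec_token>" \
      -d '{"event": "test event", "sourcetype": "flowlogs"}'

A successful request returns {"text":"Success","code":0}.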

Step 5: Configure the Oracle Function

Create an Oracle function called splunk-flow-log-python. Request access to a prebuilt function sample. The sample function implements the logic needed to process the VCN flow log data: it decodes and decompresses the log objects and parses the events before sending them to Splunk HEC.
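At a high level, the function does something like the following. This is a simplified, hypothetical sketch rather than the actual sample; it assumes the fdk, oci, and requests Python libraries and the func.yaml configuration values described later in this step:

    import gzip
    import io
    import json
    import os

    import oci
    import requests
    from fdk import response

    def handler(ctx, data: io.BytesIO = None):
        # The Events service delivers the Create Object event as the request body.
        event = json.loads(data.getvalue())
        details = event["data"]["additionalDetails"]
        namespace = details["namespace"]
        bucket = details["bucketName"]
        object_name = event["data"]["resourceName"]

        # The function authenticates to Object Storage with a resource principal,
        # which is authorized by the dynamic group and policy in step 7.
        signer = oci.auth.signers.get_resource_principals_signer()
        client = oci.object_storage.ObjectStorageClient(config={}, signer=signer)
        raw = client.get_object(namespace, bucket, object_name).data.content

        # Flow log objects are compressed; decompress and split into records.
        records = gzip.decompress(raw).decode("utf-8").splitlines()

        # Build the HEC URL from the func.yaml configuration values.
        scheme = "https" if os.environ.get("SPLUNK_HEC_SSL", "true") == "true" else "http"
        url = "{}://{}:{}/services/collector/event".format(
            scheme, os.environ["SPLUNK_HEC_HOST"], os.environ.get("SPLUNK_HEC_PORT", "443")
        )
        headers = {"Authorization": "Splunk " + os.environ["SPLUNK_HEC_TOKEN"]}

        # Send the records in chunks so each request stays under the HEC size limit.
        chunk = 100  # arbitrary chunk size for this sketch
        for i in range(0, len(records), chunk):
            payload = "".join(
                json.dumps({"event": r, "sourcetype": "flowlogs"})
                for r in records[i : i + chunk]
            )
            # verify=False is only for self-signed certificates in a lab setup.
            requests.post(url, headers=headers, data=payload, verify=False)

        return response.Response(ctx, response_data="ok")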

  1. After you receive the function sample, upload the zip file to an Object Storage bucket.

  2. On the Bucket Details page, click Edit Visibility and set the bucket's visibility to public so that the file can be downloaded.

  3. Get the URL of the file from the Object Details page. For example, https://objectstorage.us-ashburn-1.oraclecloud.com/n/<tenancyname>/b/<bucketname>/o/function.zip.

  4. In Cloud Shell, download the file:

    wget https://objectstorage.us-ashburn-1.oraclecloud.com/n/<tenancyname>/b/<bucketname>/o/function.zip
  5. In Cloud Shell, extract the function:

    unzip function.zip && rm function.zip
  6. In Cloud Shell, open the func.yaml file in vi and set the following required parameters:

    • SPLUNK_HEC_HOST: The fully qualified domain name or IP address of your Splunk HEC endpoint. For example, in https://<splunk_hec_host>:8088/services/collector, the host is <splunk_hec_host>.
    • SPLUNK_HEC_TOKEN: The token value from the HEC input that you created earlier.
    • SPLUNK_HEC_SSL: A Boolean value that determines whether your HEC endpoint uses SSL. The default value is true.
    • SPLUNK_HEC_PORT: The port used to connect to HEC, commonly 8088 for Splunk Enterprise and 443 for Splunk Cloud. The default value is 443.
  7. Create an Oracle Functions application called FlowLogs by running the following command in Cloud Shell.

    Note: Be sure to attach this application to a VCN and subnet that can communicate with HEC on the required port.

    oci fn application create --display-name FlowLogs --subnet-ids '["ocid1.subnet.oc1.phx.aaaaa...."]' --compartment-id ocid1.compartment.oc1..aaaa.....

    Screenshot that shows the Cloud Shell with the previous command and the output.

  8. Run the following commands in Cloud Shell to configure and deploy the function code:

    1. Create a context for this compartment and select it for use:

      fn create context <context-name> --provider oracle
      fn use context <context-name>
    2. Update the context with the compartment ID and the Oracle Functions API URL:

      fn update context oracle.compartment-id <compartment-id>
      fn update context api-url  https://functions.<region>.oraclecloud.com
    3. Update the context with the location of the Registry repository that you want to use. Replace phx with your three-letter region key; see the documentation for the list of region keys.

      fn update context registry phx.ocir.io/<tenancy_name>/[YOUR-OCIR-REPO]
    4. Log in to the Docker registry, again replacing phx with your three-letter region key:

      docker login phx.ocir.io

      You are prompted for your username and password. For the password, use an Oracle Cloud Infrastructure auth token.

      Note: If you are using Oracle Identity Cloud Service, your username is <tenancyname>/oracleidentitycloudservice/<username>.

    5. Deploy the function to the FlowLogs application that you created in the previous step:

      fn deploy --app FlowLogs

Step 6: Configure the Function to Trigger an Event

After you activate flow logs for a particular subnet, the Logging service creates an Object Storage bucket for the compartment that the subnet is in and writes the first log to that bucket.

  1. Get the name of the bucket. The name follows this format: oci-logs._flowlogs.

  2. In the Console main menu, select Application Integration and then Events Service.

  3. Create a rule with the following values. For more information about creating rules, see the Events documentation.

    • Display Name: Flowlogs
    • Description: Flowlogs
    • Event Type: Object Storage - Create Object
    • Attribute Name: bucketName
    • Attribute Value: The name of the flow log bucket from step 1
    • Action Type: Functions
    • Function Compartment: The compartment where the function resides
    • Function Applications: FlowLogs
    • Function: splunk-flow-log-python

    Screenshot that shows the Add Rule dialog box in the Events service, with the values described in the step.
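For reference, the rule that this dialog creates is equivalent to a JSON condition along the following lines, with the bucket name as a placeholder:

    {
      "eventType": "com.oraclecloud.objectstorage.createobject",
      "data": {
        "additionalDetails": {
          "bucketName": "<flow-log-bucket-name>"
        }
      }
    }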

Step 7: Grant Access to Object Storage

To use and retrieve the flow log objects from Object Storage, you must grant access to your function through a dynamic group.

  1. Create a dynamic group by following the instructions in the documentation.

  2. Add a rule to the group that matches the resource.id field to the OCID of the function that you created. (An example matching rule appears after these steps.)

  3. Create a policy that allows the function to retrieve flow log objects from the flow log bucket:

    Allow dynamic-group flowlogs to read buckets in compartment <compartment-name>
    Allow dynamic-group flowlogs to read objects in compartment <compartment-name>

    After a few minutes, you should start seeing events in Splunk Enterprise.
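For reference, a matching rule that scopes the dynamic group to a single function (step 2 in the preceding list) looks like this, with the function OCID as a placeholder:

    ALL {resource.id = '<function-OCID>'}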

Step 8: Test

Search for events in Splunk by using the following query: source="http:VCNFlowlogs"

Screenshot of the Search interface in Splunk with the results of the preceding search.

Step 9: Visualize and Alert

To create visualizations like the one shown at the beginning of this post, and to alert on certain activities, see the Splunk documentation on dashboards, visualizations, and alerting.
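For example, the following search is a hypothetical starting point for a top-talkers visualization, using the fields defined by the transform in step 3:

    source="http:VCNFlowlogs" sourcetype="flowlogs"
    | stats sum(bytes) AS total_bytes BY srcaddr dstaddr
    | sort - total_bytes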


Troubleshooting

If you don’t see events in Splunk, you can troubleshoot one pipeline stage at a time by following the direction of the data flow.

  • Ensure that VCN flow logs are captured in the VCN and sent to Object Storage. If you still don’t see any logs, the following are some possible causes:

    • When a flow log is created, it can take several minutes to collect and publish the log to Object Storage.
    • The log group in VCN flow logs is created only when traffic is recorded. Ensure that there’s traffic on the network interfaces of the selected subnets.
    • VCN flow logs might not have adequate permissions. Review the IAM policy detailed previously.
  • Ensure that the function is being triggered. If you see request errors, here are some common causes:

    • The Splunk HEC port is behind a firewall.
    • The Splunk HEC token is invalid, which would return an unauthorized status code.
    • The function files don’t all have the same permissions. Ensure that all function files have the same permissions, such as "-rw-r--r--", before deploying the code.
    • The VCN where the function runs doesn’t have a network path to Splunk Cloud or to the local network where the Splunk HEC lives.


This post shows you how to configure a low-overhead, highly scalable data pipeline that streams your valuable VCN flow logs to your existing Splunk Enterprise deployment by using Oracle Functions and the Events service alongside a Splunk HEC. The pipeline enables near real-time processing and analysis of the data in Splunk Enterprise. For an example of the pull methodology, see the Oracle Cloud Infrastructure Object Storage plugin for Splunk on Splunkbase.
