How Our Customers Recovered 50% Of Their Splunk Cost
(Part 2: AWS CloudTrail)


Following our last LinkedIn poll, the most-voted option was AWS CloudTrail. As promised, we continue our blog series to demonstrate how we help our customers reduce their Splunk cost using Cribl LogStream.

AWS CloudTrail

The AWS CloudTrail service enables you to manage the governance, compliance, operational and risk auditing requirements of your AWS accounts: CloudTrail events cover API and non-API calls made through the AWS Management Console, the AWS CLI, AWS SDKs, and other AWS services. These logs offer enhanced visibility into user and resource activity across your AWS infrastructure and serve as a valuable intelligence source for security investigations.

CloudTrail records three types of events:

  • Management events / control-plane operations: Provide details about management activities performed on an AWS account’s resources; logged by CloudTrail by default.
  • Data events / data-plane operations: Log operations conducted on or within a resource.
  • CloudTrail Insights events: Record unusual API call rates or error-rate activity.
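For orientation, a heavily trimmed, hypothetical management-event record looks roughly like this (real CloudTrail records carry many more fields):

```python
# A trimmed, hypothetical CloudTrail management (control-plane) event.
# Field names follow the CloudTrail record format; values are made up.
event = {
    "eventVersion": "1.08",
    "eventTime": "2021-12-07T12:00:00Z",
    "eventSource": "iam.amazonaws.com",
    "eventName": "CreateUser",
    "awsRegion": "us-east-1",
    "userIdentity": {"type": "IAMUser", "userName": "alice"},
    "requestParameters": {"userName": "bob"},
}

# The calling service and operation identify the management activity.
print(event["eventSource"], event["eventName"])
```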

Best practice requires every organisation to enable CloudTrail so that its logs can be effectively queried, either proactively as a countermeasure or in response to an incident. But given the breadth and depth of the logs, inspecting CloudTrail events can become challenging, which is why it is common to integrate SIEM tools for real-time monitoring and analysis to help respond proactively to security incidents. However, as time progresses, CloudTrail logs become laden with noisy, non-security-related events; as a result, the licence charges of these analytical platforms can soar, which dissuades organisations from taking full advantage of CloudTrail’s potential.

This blog will demonstrate how your organisation can use Cribl LogStream to shape your CloudTrail logs for more effective use of Splunk, helping reduce your licence, infrastructure and storage costs whilst maintaining an improved security posture.

Why Cribl

Cribl enables observability by giving you the power to make choices that best serve your business, without the negative tradeoffs. As your goals evolve, you have the flexibility to make new choices including new tools and destinations.

At Leo CybSec, we believe in the value Cribl can bring to our customers, which is why we present here Splunk-based evidence of Cribl’s capability to reduce Splunk licensing while maintaining log fidelity for a variety of noisy log sources that cause headaches and sleepless nights for Splunk admins.

Ingesting AWS CloudTrail Logs

Our lab consists of 3 tiers: AWS, Cribl and Splunk.


  • Configure an SQS-based S3 input for CloudTrail delivery as per the AWS documentation:
    • Set up a dedicated S3 bucket to host the CloudTrail log files
    • Set up an SQS queue with sufficient permissions to be used for S3 notifications
    • Configure S3 event notifications to the SQS queue
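As a sketch of the wiring above, the two configuration documents involved can be built like this with the standard library (account ID, bucket and queue names are placeholders; the dicts would then be passed to boto3's `sqs.set_queue_attributes` and `s3.put_bucket_notification_configuration` respectively):

```python
import json

# Placeholder identifiers -- substitute your own account, bucket and queue.
account_id = "123456789012"
bucket = "my-cloudtrail-bucket"
queue_arn = f"arn:aws:sqs:us-east-1:{account_id}:cloudtrail-notify"

# SQS access policy letting the S3 bucket publish notifications to the queue
# (set as the queue's "Policy" attribute).
queue_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "s3.amazonaws.com"},
        "Action": "sqs:SendMessage",
        "Resource": queue_arn,
        "Condition": {"ArnLike": {"aws:SourceArn": f"arn:aws:s3:::{bucket}"}},
    }],
}

# Bucket notification configuration: notify the queue on every new object,
# i.e. every CloudTrail log file delivered to the bucket.
notification_config = {
    "QueueConfigurations": [{
        "QueueArn": queue_arn,
        "Events": ["s3:ObjectCreated:*"],
    }]
}

print(json.dumps(notification_config, indent=2))
```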

Our lab is hosted on AWS and consists of the below systems:

Server role            IP
Cribl Worker 1         172.31.70.181
Cribl Worker 2         172.31.66.108
Splunk Indexer 1       172.31.66.108
Splunk Indexer 2       172.31.72.157
Splunk Indexer 3       172.31.72.93
Splunk Indexer 4       172.31.73.235
Splunk Cluster Master  17
Heavy Forwarder        172.31.64.212

The Splunk Architecture

A multi-site Splunk deployment with 4 Indexers and Indexer Discovery enabled


Cribl LogStream Architecture


Enabling the S3 source using SQS

One of the many sources that Cribl LogStream supports is S3 over SQS, which takes three steps in the LogStream UI to enable (the account number and API key below have been changed):

  1. Configure the input parameters

2. Enter the credentials (API key and secret key)


3. Enter the event breaker settings (LogStream has predefined rules for reading AWS logs)

With the above in place, CloudTrail logs are arriving at LogStream.


Create a Route in LogStream to deal with CloudTrail Logs

The route consists of a LogStream Pipeline that applies the processing to the CloudTrail logs.


Create a Load-Balanced LogStream Destination to the Splunk Cluster


The Log Sources

  • We sent the same copy of the logs via two routes:


  • Upload the sample to S3 ⇒ SQS ⇒ Splunk Heavy Forwarder ⇒ Splunk indexers 
  • Upload the sample to S3 ⇒ SQS ⇒ LogStream worker ⇒ Splunk indexers

We used 4 samples from the publicly available CloudTrail logs: 

  • flaws_cloudtrail01.json.gz
  • flaws_cloudtrail05.json.gz
  • flaws_cloudtrail19.json.gz
  • flaws_cloudtrail11.json.gz
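These samples are gzipped JSON files with a top-level `Records` array, the same layout CloudTrail itself delivers to S3. A minimal sketch of reading one (here with synthetic data instead of a downloaded sample):

```python
import gzip
import io
import json

# Build a tiny synthetic sample in the CloudTrail delivery layout:
# a gzipped JSON object with a top-level "Records" array.
sample = {"Records": [{"eventName": "ListBuckets"},
                      {"eventName": "CreateUser"}]}
buf = io.BytesIO()
with gzip.open(buf, "wt") as fh:
    json.dump(sample, fh)

def count_events(blob: bytes) -> int:
    """Decompress a CloudTrail log file and count its event records."""
    with gzip.open(io.BytesIO(blob), "rt") as fh:
        return len(json.load(fh).get("Records", []))

n = count_events(buf.getvalue())
print(n)  # 2
```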


Using Cribl LogStream

We used a mix of 9 functions divided into 4 groups, as shown below, to achieve reduction by:

  • Simplifying the logs from nested multilayer JSON into key-value pairs
  • Renaming long field names to short (abbreviated) ones and re-applying the mapping in Splunk
  • Removing duplicates and fields that can be calculated
  • Dropping non-security-related events
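The first two reduction steps can be sketched as follows (a minimal illustration, not the actual pipeline; the abbreviation map and field names are hypothetical examples):

```python
def flatten(obj: dict, prefix: str = "") -> dict:
    """Flatten a nested CloudTrail record into dotted key-value pairs."""
    out = {}
    for k, v in obj.items():
        key = f"{prefix}{k}"
        if isinstance(v, dict):
            out.update(flatten(v, key + "."))
        else:
            out[key] = v
    return out

# Hypothetical abbreviation map; the reverse mapping would be
# re-applied on the Splunk side (e.g. via a lookup) to restore names.
RENAME = {
    "userIdentity.userName": "ui.un",
    "requestParameters.userName": "rp.un",
}

record = {
    "eventName": "CreateUser",
    "userIdentity": {"userName": "alice"},
    "requestParameters": {"userName": "bob"},
}

flat = flatten(record)
slim = {RENAME.get(k, k): v for k, v in flat.items()}
print(slim)
```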

In this pipeline we will demonstrate 2 reduction scenarios:

  • Dropping all non-security-related events and applying event-level reduction
  • Keeping all events and applying event-level reduction


Dropping non-security events

One efficient way of trimming CloudTrail log size is to remove irrelevant events that contribute no value to security assessments. In our scenario, “Describe*” and “List*” events were identified as unnecessary; therefore, we used Cribl to drop all events whose eventName contains Describe or List.
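The drop rule amounts to a simple prefix filter on eventName. A minimal sketch in Python (the Cribl Drop function uses an equivalent filter expression):

```python
# Event-name prefixes considered noisy, non-security-related reads.
NOISY_PREFIXES = ("Describe", "List")

def keep(event: dict) -> bool:
    """Return True for events that should continue down the pipeline."""
    return not event.get("eventName", "").startswith(NOISY_PREFIXES)

events = [
    {"eventName": "DescribeInstances"},
    {"eventName": "ListBuckets"},
    {"eventName": "ConsoleLogin"},
]

kept = [e for e in events if keep(e)]
print([e["eventName"] for e in kept])  # ['ConsoleLogin']
```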

After successfully dropping those events, we can demonstrate the achieved reduction using LogStream’s Basic Statistics for the selected sample files.



We achieved a minimum 36.49% reduction in log volume!

Keeping non-security events

In this case, we keep the Describe and List events by disabling the Drop function.


Using Splunk

To demonstrate the above findings using Splunk, we now push the 4 samples into Splunk through the following stages:

  1. On Splunk HF we enable SQS-based S3 input for CloudTrail logs

2. On LogStream we disable the SQS-based S3 input to ensure logs will only flow to Splunk HF


3. We upload the 4 sample files into the destination S3


4. Verify the logs are arriving at Splunk under the AWS index


5. Disable the Splunk HF SQS-based S3 input and enable the LogStream SQS-based S3 source

Enable the Drop function 

  • At this stage we enable the Drop function to remove all the Describe and List events
  • Delete all the files from the S3 and upload them again 
  • Validate that the logs are getting to LogStream

  • Validate that the logs are heading to the destination (Splunk)

  • Validate that the logs are reaching Splunk and are searchable 

License Reduction

Now let’s compare the license consumed in Splunk by the 4 sample log files.

Disable the Drop function

The same comparison, but this time without the Drop function enabled.


  • Not all CloudTrail events are essential for security; Cribl LogStream can help filter them out, either by dropping them or by directing them to another destination
  • Utilizing LogStream’s inherent capabilities, we can drop redundant fields as well as fields that can be calculated on the Splunk side using lookups. For example, eventSource and eventType are static per CloudTrail eventName, so we only send eventName and use a lookup to calculate the other two.
  • We achieved a license reduction of between 35 and 40% following an effortless but powerful use of LogStream.
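The lookup step can be sketched like this (a hypothetical Splunk-side rehydration; the two sample mappings are standard CloudTrail values, the rest of the lookup table would be built the same way):

```python
# Hypothetical lookup table: eventSource and eventType are static per
# eventName, so only eventName needs to be shipped and the other two
# fields can be rehydrated at search time.
LOOKUP = {
    "CreateUser":   {"eventSource": "iam.amazonaws.com",
                     "eventType": "AwsApiCall"},
    "ConsoleLogin": {"eventSource": "signin.amazonaws.com",
                     "eventType": "AwsConsoleSignIn"},
}

def enrich(event: dict) -> dict:
    """Restore the static fields for a slimmed-down event."""
    return {**event, **LOOKUP.get(event["eventName"], {})}

print(enrich({"eventName": "ConsoleLogin"}))
```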

What other log sources do you want to see reduced with LogStream?

What other destinations do you want to reduce log ingestion volume to?

Please visit our website or drop us an email at  and we will be more than happy to help and answer any questions.
