Capturing Osquery query results with AWS Firehose (Kinesis) and AWS Athena

Why?

Whilst Zercurity captures and presents the data it queries across your many assets, you may want to log what's going on in its entirety: capturing exactly what's happening in Zercurity, as well as your own queries, for processing in external systems such as Elasticsearch or AWS S3.

How?

Getting Osquery to log to an additional external source is super simple. The --logger_plugin flag can be used to specify additional logger plugins, separated by commas.

--logger_plugin "filesystem,tls,aws_kinesis,aws_firehose"
--aws_access_key_id "ACCESS_KEY"
--aws_secret_access_key "SECRET_KEY"
--aws_region "us-east-1"
--aws_kinesis_stream "osquery"
--aws_firehose_stream "osquery"

Configuring AWS Kinesis

However, before all that, we first need to set up and configure our AWS Firehose to receive and process the incoming data.

AWS S3

We’re going to use S3 as the destination for our Firehose as we’ll then be able to use AWS Athena to query our results.

Creating our S3 bucket for our Osquery Firehose logs.
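If you'd rather script this step, the bucket can also be created with the AWS CLI. The bucket name and region below are placeholders; bucket names are globally unique, so substitute your own:

```shell
# Create the destination bucket for the Firehose deliveries.
# "osquery-firehose-logs" is just an example name.
aws s3 mb s3://osquery-firehose-logs --region us-east-1
```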

Kinesis

From the AWS console, navigate to the Kinesis dashboard and click “Create delivery stream”. From here, name your new delivery stream and ensure that the Source is set to “Direct PUT or other sources”, as the Osquery agent will be interfacing directly with the AWS API.

Creating our Osquery Firehose for “Direct PUT or other sources”
Choosing our S3 bucket for our AWS Kinesis Firehose
Creating our IAM role for our Kinesis delivery stream.
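The console wizard above can also be expressed as a single, roughly equivalent AWS CLI call. The stream name, role ARN and bucket ARN here are placeholders for whatever you created in the previous steps:

```shell
# DirectPut matches the "Direct PUT or other sources" option in the console.
aws firehose create-delivery-stream \
  --delivery-stream-name osquery \
  --delivery-stream-type DirectPut \
  --s3-destination-configuration \
    RoleARN=arn:aws:iam::123456789012:role/osquery-firehose-role,BucketARN=arn:aws:s3:::osquery-firehose-logs
```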

Creating our Kinesis IAM policy

We'll need to create a programmatic user account in order for the Osquery agent to send data to our Firehose. The first step is to set up a new IAM policy, to which we're going to apply the required permissions to push data into our AWS Kinesis Firehose.
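As a minimal sketch, the policy only needs the Firehose put actions against your delivery stream. This assumes a stream named "osquery"; the account ID is a placeholder:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "firehose:PutRecord",
        "firehose:PutRecordBatch"
      ],
      "Resource": "arn:aws:firehose:us-east-1:123456789012:deliverystream/osquery"
    }
  ]
}
```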

Creating our programmatic user

Lastly, we need to create our new programmatic user (which can be done via the Add user wizard).

Creating our AWS Kinesis user
Attaching our Osquery Firehose policy
Downloading our API keys for Osquery to access the AWS Firehose API
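These three steps can also be done from the AWS CLI. The user and policy names below are hypothetical; the create-access-key call returns the access key ID and secret that Osquery will use:

```shell
# Create the programmatic user, attach the Firehose policy, and issue API keys.
aws iam create-user --user-name osquery-firehose
aws iam attach-user-policy \
  --user-name osquery-firehose \
  --policy-arn arn:aws:iam::123456789012:policy/osquery-firehose-put
aws iam create-access-key --user-name osquery-firehose
```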

Configuring Osquery to send data to AWS Kinesis Firehose

The Zercurity configuration file currently uses both the filesystem and tls plugins as its logger. We're going to add the aws_firehose plugin to this as well.

One thing to watch out for: Osquery verifies TLS connections against the certificate bundle specified by its --tls_server_certs flag, which for Zercurity installs can be found at:

/usr/local/zercurity/zercurity.pem  # Mac OSX
/opt/zercurity/zercurity.pem # Linux
C:\ProgramData\zercurity\zercurity.pem # Windows

If this bundle doesn't include the certificate authorities used by AWS, the aws_firehose plugin will fail with an error along the lines of:

Exception making HTTP request to URL (https://firehose.eu-central-1.amazonaws.com): certificate verify failed

You can inspect the certificate chain presented by the Firehose endpoint with:

openssl s_client -showcerts -verify 5 -connect firehose.eu-central-1.amazonaws.com:443

On most Linux distributions the system CA bundle (which includes the AWS certificate authorities) lives at /etc/ssl/certs/ca-certificates.crt. Appending it to the Zercurity bundle resolves the error:

cat /etc/ssl/certs/ca-certificates.crt >> /opt/zercurity/zercurity.pem

Updating your Osquery configuration

To test things locally, before deploying the new configuration changes to your wider fleet, you'll need to update your Osquery flags file, which can be found at the following paths:

/Library/LaunchDaemons/com.zercurity.osqueryd.plist  # Mac OSX
/etc/osquery/osquery.flags # Linux
C:\ProgramData\zercurity\osquery\osquery.flags # Windows
--logger_plugin "filesystem,tls,aws_firehose"
--aws_access_key_id "ACCESS_KEY"
--aws_secret_access_key "SECRET_KEY"
--aws_region "us-east-1"
--aws_firehose_stream "osquery"
# For Mac OSX, run the following from the command line to restart the Osquery service
sudo launchctl unload \
  /Library/LaunchDaemons/com.zercurity.osqueryd.plist
sudo launchctl load \
  /Library/LaunchDaemons/com.zercurity.osqueryd.plist

# For Linux, depending on your distribution, you can run one of the following:
sudo systemctl restart osqueryd
sudo /etc/init.d/osqueryd restart

# For Windows: from Start->Run, launch services.msc and restart the Osquery service
AWS Osquery Firehose results in AWS S3
{
  "name": "f64af85f-a05e-4601-98cd-6c9a8f35feec",
  "hostIdentifier": "MacBook-Pro.local",
  "calendarTime": "Sun Feb 28 09:40:01 2021 UTC",
  "unixTime": "1614505201",
  ...
  "columns": {
    "action": "CONNECT",
    "family": "2",
    "local_address": "192.168.15.249",
    "local_port": "62531",
    "path": "\/usr\/bin\/ssh",
    "pid": "78775",
    "protocol": "6",
    "remote_address": "192.168.31.248",
    "remote_port": "22",
    "timestamp": "1614456741"
  },
  "action": "added",
  "log_type": "result"
}
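One gotcha when reading these objects back out of S3: by default, Firehose concatenates records into a single object without any delimiter, so one file may contain several JSON documents back to back. A small sketch, using only the Python standard library, that splits such a blob into individual records (the sample data below is illustrative, not real Firehose output):

```python
import json

def split_firehose_records(blob: str):
    """Split a string of back-to-back JSON documents into a list of dicts."""
    decoder = json.JSONDecoder()
    records, idx = [], 0
    while idx < len(blob):
        # Skip any whitespace that may sit between documents.
        while idx < len(blob) and blob[idx].isspace():
            idx += 1
        if idx >= len(blob):
            break
        # raw_decode returns the parsed object and the index just past it.
        record, end = decoder.raw_decode(blob, idx)
        records.append(record)
        idx = end
    return records

# Two osquery result lines delivered in one S3 object, no delimiter between them.
blob = ('{"name":"sockets","action":"added","columns":{"remote_port":"22"}}'
        '{"name":"sockets","action":"removed","columns":{"remote_port":"22"}}')
for rec in split_firehose_records(blob):
    print(rec["action"])  # prints "added" then "removed"
```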

AWS Athena

Now that we’ve got all our wonderful data being delivered into our S3 bucket, we can configure AWS Athena to let us query it.

Creating our AWS Glue crawler
Configuring our AWS Glue crawler
Giving the AWS Glue crawler access to our S3 bucket containing our Osquery result data
Running our new on-demand AWS Glue crawler
Querying Osquery sockets data with AWS Athena
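Once the crawler has populated the Glue catalog, the results can be queried with standard SQL from the Athena console. The table name and nested column layout below are assumptions based on the JSON example above; your crawler may generate different names depending on your bucket and prefix:

```sql
-- Recent connections captured by the sockets query (assumed table: osquery).
SELECT hostidentifier,
       columns.remote_address,
       columns.remote_port,
       calendartime
FROM osquery
WHERE name LIKE '%sockets%'
  AND columns.remote_port = '22'
ORDER BY unixtime DESC
LIMIT 10;
```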

It's all over!

We hope you found this guide to getting AWS Firehose and AWS Athena working alongside Zercurity helpful. Please feel free to get in touch if you have any questions.
