Tuesday, March 11, 2014

[fireeye] Getting More from Your Event Data, Part I


If you use one or more of the FireEye Threat Prevention Platforms, you're already familiar with the wealth of security event data the appliances provide. This information is useful on its own, and even more so in context with event data generated by other tools. Many security information management tools (SIMs) can help correlate data between FireEye and other appliances.
What you might not know is that you can get even more from your FireEye data, thanks to many of the free tools available for download.
I’m a big fan of ElasticSearch and MongoDB, and they play well together. For ElasticSearch users, Kibana is a well-polished dashboard that can present your data in an easy-to-read format that you can act on.
Part I of this blog series explains how to get each of these tools running and receive data from FireEye. Part II, coming soon, explains how to use these tools to get more out of your FireEye data.

Prerequisites and assumptions

The setup outlined in this blog post assumes the following:
  • You are familiar with Linux
  • You are comfortable configuring software
  • You are using a single server to run Kibana, ElasticSearch, and MongoDB
  • You have a FireEye appliance that can be configured to send JSON-extended messages to the MongoDB server. You can use a single appliance or multiple appliances, and they don't have to be managed by the same management station. The appliances should be registered in DNS so that clicking on alert data opens the FireEye user interface, where you can inspect the event further.
Wherever possible, system package management tools are used. High availability and clustering are easy to set up but beyond the scope of this blog post. This post also does not address how to lock down the data; that's something you need to do, but it's a much bigger topic than we can cover here.

Operating System

These instructions are written for CentOS Linux configured with a base installation and the "Web Server" package group selected. SELinux is enabled by default, and the CGI script requires permission to open local network connections. Issue the following commands as root (the -P flag makes the change persist across reboots):
setsebool -P httpd_can_network_connect 1
setsebool -P httpd_can_network_connect_db 1

Software download and installation


To install the most current version of MongoDB, add the following lines to /etc/yum.repos.d/mongodb.repo (for 32-bit systems, replace "x86_64" with "i686"):
[mongodb]
name=MongoDB Repository
baseurl=http://downloads-distro.mongodb.org/repo/redhat/os/x86_64/
gpgcheck=0
enabled=1

To finish installation, run this command:
yum install mongodb mongodb-server

This command resolves the required dependencies and prompts you to install the mongo packages. Respond with "yes" when prompted.

ElasticSearch and the MongoDB River

ElasticSearch is available at www.elasticsearch.org/download. Install the RPM package with the following command:
rpm -Uvh elasticsearch-0.90.10.noarch.rpm

Note: The file name depends on the version downloaded. Please install the latest version available.


Installing Kibana is straightforward. The download, offered as a tar archive file, is available from www.elasticsearch.org/overview/kibana/installation/. Select the .tar.gz file and unpack it into any directory accessible from your Web server.

Python and PyMongo

Python 2.6 is included in the CentOS base installation. Install the python-setuptools package, pip, and PyMongo as follows:
yum install python-setuptools
easy_install pip
pip install pymongo

Configure MongoDB

By default, MongoDB is set up as a standalone instance. To work with the FireEye appliance, it must be configured as part of a replication set, as follows:
  1. In the /etc/mongod.conf file, add this line:
    replSet = rs0
  2. Issue the following command to restart the MongoDB instance:
    service mongod restart
  3. Initiate the replication set from the mongo command line using the following command:
    rs.initiate()
For more information on configuring MongoDB, refer to http://docs.mongodb.org/manual/.
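After initiating the set, you can confirm that the replica set is up from Python as well. This is a sketch using PyMongo (installed above), assuming mongod is listening on the default port:

```python
# Quick replica-set check via PyMongo (a sketch; assumes mongod is
# running on localhost:27017 with replSet = rs0 configured).
def replset_name(client):
    """Return the replica-set name reported by the server."""
    status = client.admin.command("replSetGetStatus")
    return status.get("set")

if __name__ == "__main__":
    from pymongo import MongoClient
    # Should print "rs0" once the set has been initiated.
    print("replica set:", replset_name(MongoClient("localhost", 27017)))
```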

Configure CGI script to insert event data

The script to insert data into MongoDB is straightforward. First, open a connection to the local database instance. Then take each line received and insert it into the database, as shown in Figure 1.

Figure 1: Configuring the CGI script
Here’s a link to the text in Figure 1: mongo-py
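If you can't open the linked script, a minimal sketch of such a CGI script might look like the following. The fireeye database and events collection names match the river configuration used later in this post; everything else here is an assumption, not FireEye's published script:

```python
#!/usr/bin/env python
# Minimal sketch of the insert script: read JSON-extended alert lines
# from the request body and insert each one into the local MongoDB.
import json
import sys

def parse_alert(line):
    """Parse one JSON line into a document; return None if empty or invalid."""
    line = line.strip()
    if not line:
        return None
    try:
        return json.loads(line)
    except ValueError:
        return None

def main():
    from pymongo import MongoClient   # imported here; parse_alert needs no deps
    events = MongoClient("localhost", 27017).fireeye.events
    for line in sys.stdin:            # the CGI request body arrives on stdin
        doc = parse_alert(line)
        if doc is not None:
            events.insert(doc)        # insert_one() on PyMongo 3.x and later
    # Minimal CGI response so the appliance sees a successful request
    sys.stdout.write("Content-Type: text/plain\r\n\r\nOK\r\n")

if __name__ == "__main__":
    main()
```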
You can do a lot with this script to enrich the FireEye-generated data. One example would be to look up the physical location of the system that’s compromised. You could easily use that data to generate a help desk ticket and route it to the right group. The options are limitless.
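For instance, a location lookup could be sketched like this. The inventory table and the field names are hypothetical, purely for illustration:

```python
# Illustrative enrichment: attach a physical location to each event
# before inserting it. The inventory table and the "src_ip" /
# "asset_location" field names are hypothetical, not FireEye fields.
ASSET_LOCATIONS = {
    "10.1.2.3": "Building A, Floor 2",
    "10.9.8.7": "Data center, rack 14",
}

def enrich(event):
    """Add an asset_location field when the source IP is in inventory."""
    ip = event.get("src_ip")
    if ip in ASSET_LOCATIONS:
        event["asset_location"] = ASSET_LOCATIONS[ip]
    return event
```

The enriched document is then inserted as usual, and the extra field becomes searchable in Kibana like any other.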

Configure ElasticSearch

Before using ElasticSearch with FireEye, you need to configure a few options. First, create a “river” through which events flow from MongoDB to ElasticSearch as follows:
curl -XPUT "localhost:9200/_river/fireeye/_meta" -d '{
    "type": "mongodb",
    "mongodb": {
        "servers": [ { "host": "", "port": 27017 } ],
        "options": { "secondary_read_preference": true },
        "db": "fireeye",
        "collection": "events" } }'
Create the FireEye index as follows:
curl -XPOST "localhost:9200/fireeye/"
The default tokenizer in ElasticSearch uses the "-" character as a delimiter. That's probably not what we want here. We can change it easily with another request to ElasticSearch: first close the index, make the change, and then reopen it.
curl -XPOST "localhost:9200/fireeye/_close"
curl -XPUT "localhost:9200/fireeye/_settings" -d '{
    "analysis": {
        "analyzer": {
            "default": {
                "type": "custom",
                "tokenizer": "whitespace",
                "filter": ["lowercase"] } } } }'
curl -XPOST "localhost:9200/fireeye/_open"

These commands set up the index to handle FireEye events properly. Now you're ready to start sending events.
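To see why the tokenizer choice matters, compare a split on "-" with a whitespace split for a typical alert string. This is a rough Python illustration of the two behaviors, not ElasticSearch's actual analyzer code:

```python
import re

text = "Trojan.Generic-DNS callback to evil-domain.example"

# Roughly what a "-"-splitting tokenizer yields: the malware name and
# the hostname are broken into fragments, so a search for the full
# name no longer matches.
dash_split = [t.lower() for t in re.split(r"[\s\-]+", text) if t]

# The whitespace tokenizer plus the lowercase filter keeps them intact.
whitespace_split = [t.lower() for t in text.split()]

print(dash_split)
print(whitespace_split)
```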

Configure FireEye

With the back-end work complete, you can set up your appliances to send notifications. On the Settings: Notifications page, add an entry for the CGI script in the HTTP settings column. Make sure the message format is one of the JSON formats; I typically use JSON Extended.
Figure 2: FireEye dashboard

Chill out in Kibana

Now that you have data flowing in, explore it with visual tools such as Kibana. You can start out with a blank dashboard and add widgets to it. Make sure to set the correct time field in the dashboard setup. FireEye events store the timestamp in the alert.occurred field, which is not Kibana’s default.
Figure 3: Kibana dashboard
Figure 4 shows a custom query that features various event types and widgets to help visualize the data. One cool feature is that every widget is color coded to match the query where applicable.
This query features the following:
  • A histogram showing the overall event rate over a period of time
  • A map keyed on the alert.location field to show where exfiltrated data is going
  • A terms panel (Targeted Applications) showing what applications are being targeted by the exploitation activity under analysis
  • A hits panel (Event Breakdown) showing a different view of the event-type breakdown
You can also add text blocks for any notes you may want as part of your dashboard.
Figure 4: Kibana search query with custom widgets
Another impressive feature: you can see the actual events. Because these tools all use JSON as their native data format, Kibana does a good job of displaying FireEye events. You can add a new row, drop in a table view, and add the fields you want to see for each column (see Figure 5).
Figure 5: Customizing events
Because Kibana natively understands JSON, you don't need to tell it how the data is laid out. You can just start using it immediately.

Coming up…

Part II of this series shows ways to enrich your data by bringing in other data sources — and some specific challenges about working with proxy environments. It also explains how to pull data directly from MongoDB to assist remediation.
