Extracting fields with fluentd

While playing around with fluentd, the need came up to extract data nested deep in the logging input and add it as a flat field to the output. If you're not familiar with fluentd (I'm not): it's similar to Logstash in that it collects logs from a variety of sources, filters, transforms and categorises them, and either stores them in files or streams them to other sinks. It also supports a variety of plug-ins.

I find the concepts and architecture well suited to the task, but the documentation less so: I feel I spent far too much time figuring out how to do something simple. At least the effort wasn't in vain, since this post came out of it 🙂

Fluentd reads log entries and splits them into records: each entry becomes a record with a tag, a timestamp and several fields. Records are best imagined as JSON objects: fields can be flat (numbers and strings) or structured objects. I needed to extract data deeply nested in a record and move it into a first-level field "at the root" of the record.

The input records look something like this:

  {
    "appdata": {
      "metadata": ["the name of the app", "some", "other", "entries"]
    }
  }

The field I'm looking for is accessible as (in JavaScript notation) this.appdata.metadata[0].
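Outside fluentd, the same lookup can be sketched in plain Python; the record dict below mirrors the input example above:

```python
# A record shaped like the input example above
record = {
    "appdata": {
        "metadata": ["the name of the app", "some", "other", "entries"]
    }
}

# The value to extract: appdata.metadata[0]
app_name = record["appdata"]["metadata"][0]
print(app_name)  # the name of the app
```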

The solution requires a plugin (installed by default) called record_transformer:

  <source>
    @type http   # reads log entries over HTTP
    port 9880    # opens a socket
    <parse>
      @type json # records are expected in JSON format
    </parse>
    tag logs
  </source>

  # extracts the input record's appdata.metadata[0] into a new field app_name
  <filter logs>
    @type record_transformer
    enable_ruby true
    <record>
      app_name ${record["appdata"]["metadata"][0]}
    </record>
  </filter>

  # writes results to the output
  <match logs>
    @type stdout
  </match>
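What the filter does can be illustrated with a rough Python analogue (the function name is mine, not part of fluentd): evaluate the nested lookup from the `${record[...]}` expression and merge the result into the record as a new flat field, which is essentially what record_transformer does with each event.

```python
def add_app_name(record: dict) -> dict:
    """Return a copy of the record with app_name hoisted to the top level.

    Illustrative only: mimics the record_transformer rule
    app_name ${record["appdata"]["metadata"][0]}.
    """
    out = dict(record)  # shallow copy; the original nested fields are kept
    out["app_name"] = record["appdata"]["metadata"][0]
    return out


event = {
    "appdata": {
        "metadata": ["the name of the app", "some", "other", "entries"]
    }
}
result = add_app_name(event)
print(result["app_name"])  # the name of the app
```

Note that record_transformer keeps the existing fields by default, so the nested appdata object is still present alongside the new flat app_name field.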
