Deserializing Data

The plugins described in this section are useful for deserializing data into Logstash events.
- avro codec

  Reads serialized Avro records as Logstash events. This plugin deserializes individual Avro records; it is not for reading Avro files, which use a container format that must be handled on input.

  The following config deserializes input from Kafka:

    input {
      kafka {
        codec => avro {
          schema_uri => "/tmp/schema.avsc"
        }
      }
    }
    ...
- csv filter

  Parses comma-separated value data into individual fields. By default, the filter autogenerates field names (column1, column2, and so on), or you can specify a list of names. You can also change the column separator. See the sketch after the config below for the autogenerated case.

  The following config parses CSV data into the field names specified in the columns option:

    filter {
      csv {
        separator => ","
        columns => [ "Transaction Number", "Date", "Description", "Amount Debit", "Amount Credit", "Balance" ]
      }
    }
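  If the field names are not known in advance, you can let the filter autogenerate them. A minimal sketch that only overrides the separator (the semicolon-separated sample is a hypothetical choice); parsed fields then default to column1, column2, and so on:

    filter {
      csv {
        # No "columns" list is given, so field names are
        # autogenerated as column1, column2, ...
        separator => ";"
      }
    }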
- fluent codec

  Reads the Fluentd msgpack schema.

  The following config decodes logs received from fluent-logger-ruby:

    input {
      tcp {
        codec => fluent
        port => 4000
      }
    }
- json codec

  Decodes (via inputs) and encodes (via outputs) JSON-formatted content, creating one event per element in a JSON array.

  The following config decodes the JSON-formatted content in a file:

    input {
      file {
        path => "/path/to/myfile.json"
        codec => "json"
      }
    }
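  To illustrate the one-event-per-array-element behavior, here is a minimal sketch using the stdin input (the sample input line is hypothetical):

    input {
      stdin {
        codec => "json"
      }
    }
    # A single input line such as:
    #   [{"id":1},{"id":2}]
    # produces two events, one per array element,
    # each with its own "id" field.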
- protobuf codec

  Reads protobuf encoded messages and converts them to Logstash events. Requires the protobuf definitions to be compiled as Ruby files, which you can generate with the ruby-protoc compiler.

  The following config decodes events from a Kafka stream:

    input {
      kafka {
        bootstrap_servers => "127.0.0.1:9092"
        topics => ["your_topic_goes_here"]
        codec => protobuf {
          class_name => "Animal::Unicorn"
          include_path => ['/path/to/protobuf/definitions/UnicornProtobuf.pb.rb']
        }
      }
    }
- xml filter

  Parses XML into fields.

  The following config parses the whole XML document stored in the message field:

    filter {
      xml {
        source => "message"
      }
    }
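  If you would rather not merge the parsed document into the top level of the event, the filter's target option can direct it into a single field. A minimal sketch, where the field name parsed_xml is an arbitrary choice:

    filter {
      xml {
        source => "message"
        # Store the parsed document under one field instead of
        # merging it into the top level of the event.
        target => "parsed_xml"
      }
    }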