Upgrading Logstash and Elasticsearch to 2.0

If you are using Elasticsearch as an output, and wish to upgrade to Elasticsearch 2.0, please be aware of breaking changes before you upgrade. In addition, the following steps need to be performed after upgrading to Elasticsearch 2.0:

Mapping changes: Users may have made custom changes to the index template, so by default a Logstash upgrade leaves the template as is. Even if you don’t have a custom template, Logstash will not overwrite an existing template by default.

There is one known issue (the removal of path) with the GeoIP filter that requires a manual update to the template.

Note: If you have custom template changes, please make sure to save them and merge them back in after updating the template. You can get the existing template by running:

curl -XGET localhost:9200/_template/logstash
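
For example, you can redirect the output to a file so you have a copy of your customizations to merge back in later (the file name below is just an example):

curl -XGET localhost:9200/_template/logstash > my_logstash_template.json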

Add the following option to your Logstash config:

output {
	elasticsearch {
		template_overwrite => true
	}
}

Restart Logstash.
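
After Logstash restarts, you can verify that the updated template was installed by querying it again (assuming Elasticsearch is reachable on localhost:9200, as above):

curl -XGET localhost:9200/_template/logstash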

Dots in fields: Elasticsearch 2.0 does not allow field names to contain the . character. See the Elasticsearch 2.0 breaking changes documentation for further details about this change. Some plugins have already been updated to compensate for this breaking change, including logstash-filter-metrics and logstash-filter-elapsed. These plugin updates are available for Logstash 2.0. To upgrade to the latest version of these plugins, the command is:

bin/plugin update <plugin_name>
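
For example, to update the two plugins mentioned above (assuming they are already installed):

bin/plugin update logstash-filter-metrics
bin/plugin update logstash-filter-elapsed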

If you have a custom plugin that creates field names with dots, we advise you not to use dots. You can safely use underscores or nested fields like [foo][bar]. Unfortunately, many users have no control over the sources of their fields, which has resulted in a poor user experience when dotted fields exist. To address this issue, the de_dot filter has been created; refer to the plugin documentation for details.

You can install the de_dot filter using the plugin command:

bin/plugin install logstash-filter-de_dot
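
Once installed, a minimal filter configuration looks something like the sketch below. With no options set, the filter is expected to replace the dots in field names with another separator (an underscore by default); check the plugin documentation for the exact options:

filter {
	de_dot {
	}
}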

Multiline Filter: If you are using the Multiline Filter in your configuration and upgrade to Logstash 2.0, you will get an error unless you explicitly set the number of filter workers (-w) to 1. You can set the number of workers by passing a command line flag such as:

bin/logstash -w 1
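
For example, when starting Logstash with a configuration file (the file name is only an example):

bin/logstash -f logstash.conf -w 1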