Stream any log file

This guide shows you how to send a log file to Elasticsearch using a standalone Elastic Agent, configure the agent and your data streams using the elastic-agent.yml file, and query your logs using the data streams you’ve set up.

Prerequisites

To follow the steps in this guide, you need an Elastic Stack deployment that includes:

  • Elasticsearch for storing and searching data
  • Kibana for visualizing and managing data
  • A Kibana user with All privileges on Fleet and Integrations. Because many Integrations assets are shared across spaces, users need these Kibana privileges in all spaces.
  • Integrations Server (included by default in every Elasticsearch Service deployment)

To get started quickly, spin up a deployment of our hosted Elasticsearch Service. The Elasticsearch Service is available on AWS, GCP, and Azure. Try it out for free.

Install and configure the standalone Elastic Agent

Complete these steps to install and configure the standalone Elastic Agent and send your log data to Elasticsearch:

Step 1: Download and extract the Elastic Agent installation package

On your host, download and extract the installation package that corresponds with your system:

curl -L -O https://artifacts.elastic.co/downloads/beats/elastic-agent/elastic-agent-8.16.0-darwin-x86_64.tar.gz
tar xzvf elastic-agent-8.16.0-darwin-x86_64.tar.gz

Step 2: Install and start the Elastic Agent

After downloading and extracting the installation package, you’re ready to install the Elastic Agent. From the agent directory, run the install command that corresponds with your system:

On macOS, Linux (tar package), and Windows, run the install command to install and start Elastic Agent as a managed service. The DEB and RPM packages include a service unit for Linux systems with systemd; for these systems, you must enable and start the service.

You must run this command as the root user because some integrations require root privileges to collect sensitive data.

sudo ./elastic-agent install

During installation, you’re prompted with some questions:

  1. When asked if you want to install the agent as a service, enter Y.
  2. When asked if you want to enroll the agent in Fleet, enter n.
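
After the installation completes, you can optionally confirm that the agent service is installed and running by checking its status. The exact output differs between versions, but a healthy agent reports a healthy state:

sudo elastic-agent status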

Step 3: Configure the Elastic Agent

With your agent installed, configure it by updating the elastic-agent.yml file.

Locate your configuration file

After installing the agent, you’ll find the elastic-agent.yml file in one of the following locations, according to your system:

Main Elastic Agent configuration file location:

/Library/Elastic/Agent/elastic-agent.yml

Update your configuration file

The following is an example of a standalone Elastic Agent configuration. To configure your Elastic Agent, replace the contents of the elastic-agent.yml file with this configuration:

outputs:
  default:
    type: elasticsearch
    hosts: '<your-elasticsearch-endpoint>:<port>'
    api_key: 'your-api-key'
inputs:
  - id: your-log-id
    type: filestream
    streams:
      - id: your-log-stream-id
        data_stream:
          dataset: example
        paths:
          - /var/log/your-logs.log

Next, set the values for these fields:

  • hosts – Copy the Elasticsearch endpoint from the Help menu (help icon) → Connection details. For example, https://my-deployment.es.us-central1.gcp.cloud.es.io:443.
  • api_key – Use an API key to grant the agent access to Elasticsearch. To create an API key for your agent, refer to the Create API keys for standalone agents documentation.

    The API key format should be <id>:<key>. Make sure you selected Beats when you created your API key. Base64 encoded API keys are not currently supported in this configuration.

  • inputs.id – A unique identifier for your input.
  • type – The type of input. For collecting logs, set this to filestream.
  • streams.id – A unique identifier for your stream of log data.
  • data_stream.dataset – The dataset for your data stream. Name the dataset anything that signifies the source of the data. In this configuration, the dataset is set to example. The default value is generic.
  • paths – The path to your log files. You can also use a pattern like /var/log/your-logs.log*.
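
For reference, here is what a filled-in configuration might look like. The endpoint matches the example above; the IDs, API key, and log path shown here are placeholders only, so substitute your own values:

# Placeholder values only; replace the endpoint, API key, IDs, and paths with your own.
outputs:
  default:
    type: elasticsearch
    hosts: 'https://my-deployment.es.us-central1.gcp.cloud.es.io:443'
    api_key: 'your-api-key-id:your-api-key-secret'
inputs:
  - id: system-logs
    type: filestream
    streams:
      - id: system-logs-stream
        data_stream:
          dataset: example
        paths:
          - /var/log/your-logs.log*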

Restart the Elastic Agent

After updating your configuration file, you need to restart the Elastic Agent:

First, stop the Elastic Agent and its related executables using the command that works with your system:

sudo launchctl unload /Library/LaunchDaemons/co.elastic.elastic-agent.plist

Elastic Agent will restart automatically if the system is rebooted.

Next, restart the Elastic Agent using the command that works with your system:

sudo launchctl load /Library/LaunchDaemons/co.elastic.elastic-agent.plist
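
Once the agent is running again, you can optionally verify that it picked up your changes. The inspect command prints the configuration the running agent is using, so you can check that your Elasticsearch output and filestream input appear as expected (the output format varies by version):

sudo elastic-agent inspect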

Troubleshoot your Elastic Agent configuration

If you’re not seeing your log files in Kibana, verify the following in the elastic-agent.yml file:

  • The path to your log files under paths is correct.
  • Your API key is in <id>:<key> format. If not, your API key may be in an unsupported format, and you’ll need to create an API key in Beats format.

If you’re still running into issues, see Elastic Agent troubleshooting and Configure standalone Elastic Agents.
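
You can also confirm that documents are reaching Elasticsearch by querying the data stream directly. With the configuration above, the filestream input writes to a data stream named logs-example-default, following the <type>-<dataset>-<namespace> naming scheme with the example dataset and the default namespace. Assuming you kept those values, a query like the following in Kibana’s Dev Tools console returns the most recent documents:

GET logs-example-default/_search
{
  "size": 5,
  "sort": [
    { "@timestamp": "desc" }
  ]
}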

Next steps

After you have your agent configured and are streaming log data to Elasticsearch:

  • Refer to the Parse and organize logs documentation for information on extracting structured fields from your log data, rerouting your logs to different data streams, and filtering and aggregating your log data.
  • Refer to the Filter and aggregate logs documentation for information on filtering and aggregating your log data to find specific information, gain insight, and monitor your systems more efficiently.
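
As a small preview of the kind of aggregation that documentation covers, the following Dev Tools query counts incoming log events in five-minute buckets. It assumes the logs-example-default data stream from the configuration in this guide; adjust the data stream name if you used a different dataset or namespace:

GET logs-example-default/_search
{
  "size": 0,
  "aggs": {
    "logs_over_time": {
      "date_histogram": {
        "field": "@timestamp",
        "fixed_interval": "5m"
      }
    }
  }
}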