ECS formatted application logs
Use an ECS logger or an APM agent to format your logs in ECS format.
Logs formatted in Elastic Common Schema (ECS) don't require manual parsing, and the configuration can be reused across applications. ECS-formatted logs, when paired with an APM agent, allow you to correlate logs to easily view logs that belong to a particular trace.
You can format your logs in ECS format in the following ways:
- ECS loggers: plugins for your logging libraries that reformat your logs into ECS format.
- APM agent ECS reformatting: Java, Ruby, and Python APM agents automatically reformat application logs to ECS format without a logger.
ECS loggers
ECS loggers reformat your application logs into ECS-compatible JSON, removing the need for manual parsing. ECS loggers require Filebeat or Elastic Agent configured to monitor and capture application logs. In addition, pairing ECS loggers with your framework's APM agent allows you to correlate logs to easily view logs that belong to a particular trace.
Get started
For more information on adding an ECS logger to your application, refer to the guide for your framework.
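For illustration only, the sketch below shows one way to wire up the Python ECS logger (the ecs-logging library) with the standard logging module. The logger name, output path, and extra field are placeholders, and your framework's guide may use a different setup.

import logging
import ecs_logging

# Write ECS-formatted JSON lines to a file that Filebeat or Elastic Agent can tail.
# The path and logger name below are placeholders.
logger = logging.getLogger("app")
handler = logging.FileHandler("/path/to/logs.json")
handler.setFormatter(ecs_logging.StdlibFormatter())
logger.addHandler(handler)
logger.setLevel(logging.INFO)

logger.info("user login succeeded", extra={"http.request.method": "POST"})

Each call then produces one ECS JSON object per line, which the ingest configurations later in this page can parse without custom grok patterns.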
APM agent ECS reformatting
Java, Ruby, and Python APM agents can automatically reformat application logs to ECS format without an ECS logger or the need to modify your application. The APM agent also allows for log correlation so you can easily view logs that belong to a particular trace.
To set up log ECS reformatting:
- Enable log ECS reformatting.
- Ingest logs with Filebeat or Elastic Agent.
- View logs in Logs Explorer.
Enable log ECS reformatting
Log ECS reformatting is controlled by the log_ecs_reformatting configuration option, and is disabled by default. Refer to the guide for your framework for information on enabling it.
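As an illustration only, the sketch below shows one way this might look with the Python APM agent. It assumes the agent's log_ecs_reformatting option accepts the value "override" and that configuration can be passed directly to elasticapm.Client; check your framework's guide for the supported values and configuration mechanism.

import logging
import elasticapm

# Assumption: passing log_ecs_reformatting="override" to the client enables
# ECS reformatting of records emitted through the standard logging module.
logging.basicConfig(level=logging.INFO)
client = elasticapm.Client(
    service_name="your_service_name",
    log_ecs_reformatting="override",
)

logging.getLogger("app").info("order processed")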
Ingest logs
After enabling log ECS reformatting, send your application logs to your project using one of the following shipping tools:
- Filebeat: A lightweight data shipper that sends log data to your project.
- Elastic Agent: A single agent for logs, metrics, security data, and threat prevention. With Fleet, you can centrally manage Elastic Agent policies and lifecycles directly from your project.
Ingest logs with Filebeat
Important
Use Filebeat version 8.11+ for the best experience when ingesting logs.
Follow these steps to ingest application logs with Filebeat.
Step 1: Install Filebeat
Install Filebeat on the server you want to monitor by running the commands that align with your system (the example below is for macOS):
curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-8.13.4-darwin-x86_64.tar.gz
tar xzvf filebeat-8.13.4-darwin-x86_64.tar.gz
Step 2: Connect to your project
Connect to your project using an API key to set up Filebeat. Set the following information in the filebeat.yml file:
output.elasticsearch:
  hosts: ["your-projects-elasticsearch-endpoint"]
  api_key: "id:api_key"
- Set hosts to your project's Elasticsearch endpoint. Locate your project's endpoint by clicking the help icon and selecting Endpoints. Add the Elasticsearch endpoint to your configuration.
- From Developer tools, run the following command to create an API key that grants manage permissions for the cluster and the filebeat-* indices:

POST /_security/api_key
{
  "name": "filebeat_host001",
  "role_descriptors": {
    "filebeat_writer": {
      "cluster": ["manage"],
      "index": [
        {
          "names": ["filebeat-*"],
          "privileges": ["manage"]
        }
      ]
    }
  }
}

Refer to Grant access using API keys for more information.
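The create API key call returns the key's id and secret. The response below is illustrative, with placeholder values; combine the id and api_key fields as id:api_key for the api_key setting in filebeat.yml.

{
  "id": "your_api_key_id",
  "name": "filebeat_host001",
  "api_key": "your_api_key_secret",
  "encoded": "..."
}

With this example response, the filebeat.yml value would be api_key: "your_api_key_id:your_api_key_secret".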
Step 3: Configure Filebeat
Add the following configuration to your filebeat.yml file to start collecting log data.
filebeat.inputs:
- type: filestream
  paths:
    - /path/to/logs.json
  parsers:
    - ndjson:
        overwrite_keys: true
        add_error_key: true
        expand_keys: true
  fields:
    service.name: your_service_name
    service.version: your_service_version
    service.environment: your_service_environment
processors:
  - add_host_metadata: ~
  - add_cloud_metadata: ~
  - add_docker_metadata: ~
  - add_kubernetes_metadata: ~
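For reference, the ndjson parser expects each line of /path/to/logs.json to be a self-contained JSON object. An illustrative ECS-formatted line (values are placeholders) looks like this:

{"@timestamp": "2025-01-01T12:00:00.000Z", "log.level": "info", "message": "user login succeeded", "ecs.version": "1.6.0"}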
Step 4: Set up and start Filebeat
From the Filebeat installation directory, set the index template by running the command that aligns with your system:
./filebeat setup --index-management
From the Filebeat installation directory, start Filebeat by running the commands that align with your system:
sudo chown root filebeat.yml
sudo ./filebeat -e
Note
You'll be running Filebeat as root, so you need to change ownership of the configuration file and any configurations enabled in the modules.d directory, or run Filebeat with --strict.perms=false specified. Refer to Config file ownership and permissions.
Ingest logs with Elastic Agent
Add the custom logs integration to ingest and centrally manage your logs using Elastic Agent and Fleet:
Step 1: Add the custom logs integration to your project
To add the custom logs integration to your project:
- In your Observability project, go to Project Settings → Integrations.
- Type custom in the search bar and select Custom Logs.
- Click Install Elastic Agent at the bottom of the page, and follow the instructions for your system to install the Elastic Agent. If you've already installed an Elastic Agent, you'll be taken directly to configuring your integration.
- After installing the Elastic Agent, click Save and continue to configure the integration from the Add Custom Logs integration page.
- Give your integration a meaningful name and description.
- Add the Log file path. For example, /var/log/your-logs.log.
- Under Custom log file, click Advanced options.
- In the Processors text box, add the following YAML configuration to add processors that enhance your data. See processors to learn more.

processors:
  - add_host_metadata: ~
  - add_cloud_metadata: ~
  - add_docker_metadata: ~
  - add_kubernetes_metadata: ~

- Under Custom configurations, add the following YAML configuration to collect data.

json:
  overwrite_keys: true
  add_error_key: true
  expand_keys: true
  keys_under_root: true
fields_under_root: true
fields:
  service.name: your_service_name
  service.version: your_service_version
  service.environment: your_service_environment

- An agent policy is created that defines the data your Elastic Agent collects. If you've previously installed an Elastic Agent on the host you're collecting logs from, you can select the Existing hosts tab and use an existing agent policy.
- Click Save and continue.
View logs
Use Logs Explorer to search, filter, and visualize your logs. Refer to the filter and aggregate logs documentation for more information.