
Logstash Tutorial

Introduction to Logstash

Logstash is a powerful open-source tool for managing events and logs. It provides real-time pipelining capabilities, enabling you to collect, parse, and store logs for future use, usually in conjunction with Elasticsearch and Kibana (the ELK stack).

Installing Logstash

To install Logstash on a Debian-based system (such as Ubuntu), follow these steps:

Step 1: Add the Elastic APT repository

curl -fsSL https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -

sudo apt-get install apt-transport-https

echo "deb https://artifacts.elastic.co/packages/7.x/apt stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-7.x.list

Step 2: Install Logstash

sudo apt-get update && sudo apt-get install logstash
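On systems that use systemd, the package installs a Logstash service unit. A common next step (assuming a systemd-based distribution) is to start the service and enable it at boot:

```
sudo systemctl start logstash
sudo systemctl enable logstash
sudo systemctl status logstash
```

Note that when running Logstash as a service, it reads pipeline configurations from /etc/logstash/conf.d/ rather than a file passed on the command line.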

Configuring Logstash

Logstash uses configuration files to specify its pipeline. A basic configuration file has three sections: input, filter, and output.

Example Configuration

input {
    file {
        path => "/path/to/your/logfile.log"
        start_position => "beginning"
    }
}
filter {
    grok {
        match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
}
output {
    elasticsearch {
        hosts => ["localhost:9200"]
        index => "logstash-%{+YYYY.MM.dd}"
    }
    stdout { codec => rubydebug }
}

Running Logstash

To run Logstash with a specific configuration file, use the following command:

sudo /usr/share/logstash/bin/logstash -f /path/to/your/configfile.conf
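Before starting a pipeline, you can ask Logstash to validate the configuration file and exit, which catches syntax errors without processing any events:

```
sudo /usr/share/logstash/bin/logstash -f /path/to/your/configfile.conf --config.test_and_exit
```

If the configuration is valid, Logstash prints a confirmation message and exits; otherwise it reports the location of the error.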

Verifying the Pipeline

You can verify that Logstash is processing the logs by checking the output in Elasticsearch or by looking at the console output if you have configured stdout.
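To confirm that documents are reaching Elasticsearch, you can query it directly. For example, assuming Elasticsearch is running on localhost:9200 with the index pattern from the configuration above:

```
# List indices created by Logstash
curl 'localhost:9200/_cat/indices/logstash-*?v'

# Fetch one indexed document
curl 'localhost:9200/logstash-*/_search?pretty&size=1'
```

A non-empty hit count in the search response indicates that the pipeline is delivering events.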

Example Console Output

{
    "@timestamp" => 2023-10-05T12:00:00.000Z,
       "message" => "127.0.0.1 - - [01/Jan/2023:12:00:00 +0000] \"GET /index.html HTTP/1.1\" 200 1024",
          "host" => "localhost",
          "path" => "/path/to/your/logfile.log",
      "@version" => "1"
}

Common Filters

Logstash provides many filters to transform and manipulate your data. Commonly used filters include:

  • Grok: Parses unstructured data into structured data.
  • Date: Parses dates from fields and uses them as the event timestamp.
  • Mutate: Performs general transformations, such as renaming and converting fields.
  • GeoIP: Adds geographical location information based on IP addresses.

Integrating with Elasticsearch

Logstash integrates with Elasticsearch through the Elasticsearch output plugin, which sends each event to an Elasticsearch index. The index option below uses the date format %{+YYYY.MM.dd}, so events are written to a new index each day (for example, logstash-2023.10.05).

Example Elasticsearch Output Configuration

output {
    elasticsearch {
        hosts => ["localhost:9200"]
        index => "logstash-%{+YYYY.MM.dd}"
    }
}

Conclusion

Logstash is a versatile and powerful tool for managing log data. This tutorial covered the basics of installing, configuring, and running Logstash, as well as integrating it with Elasticsearch. For more advanced configurations and use cases, refer to the official Logstash documentation.