LogStash in action

ManujAggarwal1 · 1,382 views · 48 slides · Jun 13, 2017

About This Presentation

Installation and configuration of Logstash. https://goo.gl/oZIrwS


Slide Content

LogStash in Action

What’s In It for You?

Key Features of LogStash
• Data ingestion workhorse
• Event enrichment and transformation
• Extensible plugin ecosystem
• Pluggable pipeline architecture
• Horizontally scalable data processing pipeline
• Strong Elasticsearch and Kibana synergy
• Handles data of all shapes and sizes

Installation and Configuration

The Prerequisites
Installation and Configuration

Prerequisites
• Requires Java 7 or higher

Installation steps (a sample install sequence is sketched below)
• Download LogStash from the elastic.co web site
• Use the Linux package manager to install LogStash
• Install LogStash as a service
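As a rough illustration, installing LogStash from the Elastic APT repository on a Debian/Ubuntu host might look like the following; the 5.x repository and the openjdk-8-jre package are assumptions for illustration, so adjust them for your distribution and release:

# Illustrative sketch: install Java, add the Elastic APT repository, install LogStash
sudo apt-get install openjdk-8-jre apt-transport-https
wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -
echo "deb https://artifacts.elastic.co/packages/5.x/apt stable main" | sudo tee /etc/apt/sources.list.d/elastic-5.x.list
sudo apt-get update && sudo apt-get install logstash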

The Service Commands
Installation and Configuration

Video Demonstration
Installing LogStash


The Service Commands
Installation and Configuration
Start or stop service commands:
sudo /etc/init.d/logstash start
sudo /etc/init.d/logstash stop
sudo /etc/init.d/logstash restart
sudo /etc/init.d/logstash status

Simple Pipeline
Installation and Configuration
• Verify the LogStash installation with a simple pipeline
• The pipeline takes input from the command line and echoes it back to the command line
• The pipeline configuration is passed as text on the command line
• Takes input from standard input "stdin"
• Outputs to standard output "stdout" in a structured format

bin/logstash -e 'input { stdin { } } output { stdout { } }'
Simple Pipeline
Installation and Configuration

# Simple LogStash configuration

input {
  stdin { }
}

output {
  stdout { }
}
Simple Pipeline
Installation and Configuration

Video Demonstration
Configuring a Simple Pipeline


Advanced Pipeline
Installation and Configuration
• Real-world pipelines contain one or more inputs, filters and outputs
• The pipeline is generally provided in a configuration file rather than on the command line
• The configuration file is supplied to LogStash with the -f command line argument
• Test the configuration using the --configtest argument (see the sketch below)
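A minimal sketch of the two commands, assuming a hypothetical configuration file named first-pipeline.conf (the file name is an assumption for illustration):

# Check the configuration file for syntax errors, then run the pipeline
bin/logstash -f first-pipeline.conf --configtest
bin/logstash -f first-pipeline.conf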

Skeleton LogStash Configuration
Installation and Configuration
# The # character at the beginning of
# a line indicates a comment
# Use the comments to describe your configuration

input {
  input1 { }
  input2 { }
}

# The filter part of this file is
# commented out to indicate that
# it is optional.

# filter {
# }

output {
  output1 { }
  output2 { }
}

LogStash Plugins
Installation and Configuration
Diagram: Data Source → LogStash Instance (Input Plugin → Filter Plugin → Output Plugin) → ElasticSearch

Input Plugins
elasticsearch
file
imap
jdbc
stdin
s3
syslog
tcp
twitter
udp

Filter Plugins
csv
date
drop
grok
mutate
range
sleep
translate

Output Plugins
csv
elasticsearch
email
file
mongodb
stdout
s3
syslog
tcp
udp

# ElasticSearch input plugin

input {

  # Read all documents from ElasticSearch
  # matching the given query

  elasticsearch {
    hosts => "localhost"
    index => "blogs"
    query => '{ "query": { "match_all": { } } }'
    type => "my-data-elasticsearch"
  }
}
Input-ElasticSearch
Installation and Configuration

# File input

input {

  # Read events from a file or folder

  file {
    path => "/var/log/*"
    exclude => "*.gz"
    sincedb_path => "/dev/null"
    start_position => "beginning"
    type => "my-data-csv"
  }
}
Input-File
Installation and Configuration

# JDBC input

input {

  # Read all records from a MySQL
  # database

  jdbc {
    jdbc_driver_library => "/opt/logstash/lib/mysql-connector-java-5.1.6-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user => "root"
    jdbc_password => "password"
    statement => "SELECT * FROM users"
  }
}

Input-jdbc
Installation and Configuration

# AWS S3 input

input {

  # Read all documents from AWS S3

  s3 {
    bucket => "my-bucket"
    credentials => [ "my-aws-key", "my-aws-token" ]
    region_endpoint => "us-east-1"
    codec => "json"
  }
}
Input-s3
Installation and Configuration

# TCP input

input {

  # Read all events over a TCP socket

  tcp {
    port => 5000
    type => "syslog"
  }
}
Input-tcp
Installation and Configuration

# UDP input

input {

  # Read all events over a UDP port

  udp {
    port => 5001
    type => "netflow"
  }
}
Input-udp
Installation and Configuration

Filter-csv
Installation and Configuration
# CSV filter

filter {

  csv {

    # List of columns as they appear in the csv
    columns => [ "column_1", "column_2" ]

    # Convert column data types
    convert => { "column_3" => "integer", "column_4" => "boolean" }

    type => "syslog"
  }
}

# date filter

filter {

  date {

    match => [ "logdate", "MMM dd HH:mm:ss" ]

    # Default for target is @timestamp
    target => "logdate_modified"

  }
}
Filter-date
Installation and Configuration
• Used for parsing dates and using them as the LogStash event timestamp in ISO8601 format
• For example, "Jan 01 10:40:01" can be parsed using the pattern "MMM dd HH:mm:ss"

Filter-drop
Installation and Configuration
# drop filter

filter {

  # drop the events if their loglevel is debug
  if [loglevel] == "debug" {
    drop { }
  }
}

Filter-range
Installation and Configuration
# range filter

filter {
  range {
    ranges => [ "request_time", 0, 10, "tag:short",
                "request_time", 11, 100, "tag:medium",
                "request_time", 101, 1000, "tag:long",
                "request_time", 1001, 100000, "drop",
                "request_length", 0, 100, "field:size:small",
                "request_length", 101, 200, "field:size:normal",
                "request_length", 201, 1000, "field:size:big",
                "request_length", 1001, 100000, "field:size:huge",
                "number_of_requests", 0, 10, "tag:request_from_%{host}" ]
  }
}

Filter-grok
Installation and Configuration
• Grok is one of the most widely used plugins
• It is instrumental in parsing arbitrary and unstructured text into structured and queryable data fields
• It is widely used to parse syslog, Apache logs, MySQL logs, custom application logs, postfix logs, etc.
• Grok works based on patterns
• The syntax for a grok pattern is %{SYNTAX:SEMANTIC}
• Custom patterns can be added

Filter-grok
Installation and Configuration
# grok filter

input {
  file {
    path => "/var/log/http.log"

    # sample log entry
    # 55.11.55.11 GET /index.html 453 12
  }
}

filter {
  # parse the http log

  grok {

    match => { "message" => "%{IP:client} %{WORD:method} %{URIPATHPARAM:request} %{NUMBER:duration}" }

  }
}

Filter-grok
Installation and Configuration
Grok supports custom patterns:
• Inline custom patterns using Oniguruma syntax
• File-based custom patterns

Filter-grok
Installation and Configuration
# Inline custom pattern (Oniguruma syntax):
# (?<field_name>the pattern here)

(?<message_id>[0-9A-F]{10,11})

# grok filter using a file-based custom pattern

filter {

  grok {

    patterns_dir => [ "~/patterns" ]
    match => { "message" => "%{SYSLOGBASE} %{POSTFIXQUEUEID:queue_id}: %{GREEDYDATA:syslog_message}" }

  }
}
Filter-grok
Installation and Configuration

Filter-mutate
Installation and Configuration
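A minimal mutate filter sketch; the field names here (HOSTORIP, client_ip, bytes, message) are illustrative assumptions, not from the deck:

filter {
  mutate {
    # Rename a field and convert another to an integer
    rename => { "HOSTORIP" => "client_ip" }
    convert => { "bytes" => "integer" }
    # Replace forward slashes with underscores in the message field
    gsub => [ "message", "/", "_" ]
  }
}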

Filter-sleep
Installation and Configuration
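A minimal sleep filter sketch for throttling; the values (1 second for every 10 events) are illustrative assumptions:

filter {
  sleep {
    # Sleep 1 second for every 10 events that pass through
    time => "1"
    every => 10
  }
}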

Filter-translate
Installation and Configuration
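A minimal translate filter sketch; the field names and dictionary entries are illustrative assumptions, and option names such as field/destination vary between plugin versions:

filter {
  translate {
    # Look up the response code and write a readable label to a new field
    field => "response_code"
    destination => "http_status_description"
    dictionary => {
      "200" => "OK"
      "404" => "Not Found"
      "500" => "Server Error"
    }
  }
}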

Output-csv
Installation and Configuration
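A minimal csv output sketch; the field names and path are illustrative assumptions:

output {
  csv {
    # Write selected fields as comma-separated rows
    fields => [ "timestamp", "client_ip", "request" ]
    path => "/var/log/logstash/output.csv"
  }
}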

Output-file
Installation and Configuration
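A minimal file output sketch; the path, with a date in the file name, is an illustrative assumption:

output {
  file {
    # Write events to a daily log file
    path => "/var/log/logstash/output-%{+YYYY-MM-dd}.log"
  }
}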

Output-stdout
Installation and Configuration
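A minimal stdout output sketch using the rubydebug codec to pretty-print events for debugging:

output {
  stdout {
    # Pretty-print each event to the console
    codec => rubydebug
  }
}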

Output-elasticsearch
Installation and Configuration
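A minimal ElasticSearch output sketch; the host and the daily index name pattern are illustrative assumptions:

output {
  elasticsearch {
    # Send events to a local ElasticSearch node, one index per day
    hosts => [ "localhost:9200" ]
    index => "logstash-%{+YYYY.MM.dd}"
  }
}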

Output-email
Installation and Configuration
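A minimal email output sketch, typically wrapped in a conditional so only selected events trigger a mail; the addresses and the "error" tag are illustrative assumptions:

output {
  if "error" in [tags] {
    email {
      # Send an alert mail for tagged error events
      to => "ops@example.com"
      from => "logstash@example.com"
      subject => "LogStash alert: %{host}"
      body => "Event message: %{message}"
    }
  }
}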

Output-s3
Installation and Configuration
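A minimal AWS S3 output sketch; the bucket, region and credential values are illustrative assumptions (credential options vary between plugin versions):

output {
  s3 {
    # Upload batched events to an S3 bucket
    access_key_id => "my-aws-key"
    secret_access_key => "my-aws-secret"
    region => "us-east-1"
    bucket => "my-logstash-bucket"
    codec => "json_lines"
  }
}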

Output-tcp
Installation and Configuration
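A minimal TCP output sketch; the destination host and port are illustrative assumptions:

output {
  tcp {
    # Forward events as JSON lines to a remote TCP listener
    host => "192.168.1.100"
    port => 5000
    codec => json_lines
  }
}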