LinuxQuestions.org
Linux - Server This forum is for the discussion of Linux Software used in a server related context.

Old 06-05-2021, 11:13 AM   #1
Muruganandan
LQ Newbie
 
Registered: Dec 2020
Location: Coimbatore, Tamil Nadu,India
Posts: 5

Rep: Reputation: Disabled
Post configure Elasticsearch, Kibana, Filebeat to collect netflow and analyze


Hi Team,

I have configured Elasticsearch, Kibana and Filebeat on Ubuntu 20.04 to collect NetFlow data from a router. I managed to collect the data successfully and Kibana started showing NetFlow data. But then I added a few lines to filebeat.yml to get a deeper analysis of the NetFlow data, and Filebeat started throwing errors, even after I removed those lines.


Below is the netflow config from filebeat.yml; the lines highlighted in black are the ones I added.
#
filebeat.inputs:
# Each - is an input. Most options can be set at the input level, so
# you can use different inputs for various configurations.
# Below are the input specific configurations.

- type: log
# Change to true to enable this input configuration.
enabled: true
# Paths that should be crawled and fetched. Glob based paths.
paths:
- /var/log/*.log
#- c:\programdata\elasticsearch\logs\*
# Exclude lines. A list of regular expressions to match. It drops the lines that are
# matching any regular expression from the list.
#exclude_lines: ['^DBG']
# Include lines. A list of regular expressions to match. It exports the lines that are
# matching any regular expression from the list.
#include_lines: ['^ERR', '^WARN']

- type: netflow
max_message_size: 10KiB
host: "0.0.0.0:2055"
protocols: [ v5, v9, ipfix ]
expiration_timeout: 30m
queue_size: 8192

# This requires a Kibana endpoint configuration.
setup.kibana:
host: http://localhost:5601

# Kibana Space ID
# ID of the Kibana Space into which the dashboards should be loaded. By default,
# the Default Space will be used.
#space.id:

output.elasticsearch:
# Array of hosts to connect to.
hosts: ["localhost:9200"]

# Protocol - either `http` (default) or `https`.
protocol: "http"
# Authentication credentials - either API key or username/password.
#api_key: "id:api_key"
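For reference, the same sections with explicit YAML indentation (a sketch based on the options above; the indentation here is illustrative, since the forum stripped the original formatting, and is not a verified fix):

```yaml
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/*.log

- type: netflow
  max_message_size: 10KiB
  host: "0.0.0.0:2055"
  protocols: [v5, v9, ipfix]
  expiration_timeout: 30m
  queue_size: 8192

setup.kibana:
  host: "http://localhost:5601"

output.elasticsearch:
  hosts: ["localhost:9200"]
  protocol: "http"
```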
Error:

2021-06-02T12:36:50.155+0530 ERROR instance/beat.go:971 Exiting: No outputs are defined. Please define one under the output section.
Exiting: No outputs are defined. Please define one under the output section.


I have disabled Logstash and am using Elasticsearch for log analysis. The elasticsearch.yml configuration is below:
http.port: 9200
node.name: MIKROTIK
path.data: /var/lib/elasticsearch
# Path to log files:
path.logs: /var/log/elasticsearch
network.host: localhost
http.port: 9200


Please suggest how to get rid of this error.
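Two checks that may help narrow this down (assuming Filebeat and Elasticsearch are installed in the default locations; adjust the config path if yours differs):

```shell
# Validate filebeat.yml syntax and structure
filebeat test config -c /etc/filebeat/filebeat.yml

# Verify Filebeat can reach the configured output
filebeat test output -c /etc/filebeat/filebeat.yml

# Confirm Elasticsearch is listening locally
curl http://localhost:9200
```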
 
Old 06-07-2021, 08:03 AM   #2
TB0ne
LQ Guru
 
Registered: Jul 2003
Location: Birmingham, Alabama
Distribution: SuSE, RedHat, Slack,CentOS
Posts: 26,780

Rep: Reputation: 7998
Quote:
Originally Posted by Muruganandan View Post
Hi Team ,
I have configured Elasticsearch, Kibana and Filebeat on Ubuntu 20.04 to collect NetFlow data from a router. I managed to collect the data successfully and Kibana started showing NetFlow data. But then I added a few lines to filebeat.yml to get a deeper analysis of the NetFlow data, and Filebeat started throwing errors, even after I removed those lines.

[quoted filebeat.yml config trimmed; identical to post #1]
error
Code:
2021-06-02T12:36:50.155+0530    ERROR    instance/beat.go:971    Exiting: No outputs are defined. Please define one under the output section.
Exiting: No outputs are defined. Please define one under the output section.
I have disabled Logstash and am using Elasticsearch for log analysis. [quoted elasticsearch.yml trimmed; identical to post #1] Please suggest how to get rid of this error.
Post your code/configs in CODE tags, to make them readable. And you have not highlighted anything in black, so we have no idea what you added. I have bolded the error you posted for emphasis only, since the error is very clear; you have not defined any outputs. The documentation should help:
https://discuss.elastic.co/t/filebea...section/269478

Since you didn't post in CODE tags, we can't see the formatting...indentation is important in the yaml files. Also please note we are volunteers here...we are not on your 'team'.
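To illustrate why the indentation matters here (a hypothetical fragment, not the poster's actual file): in YAML, a key that loses its indentation stops being a child of the section above it, so `output.elasticsearch` is left empty, which could produce exactly the "No outputs are defined" error posted above.

```yaml
# Wrong: hosts is a top-level key, so output.elasticsearch has no settings
output.elasticsearch:
hosts: ["localhost:9200"]

# Right: hosts is nested under output.elasticsearch
output.elasticsearch:
  hosts: ["localhost:9200"]
```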
 
1 members found this post helpful.
Old 06-12-2021, 08:39 AM   #3
Muruganandan
LQ Newbie
 
Registered: Dec 2020
Location: Coimbatore, Tamil Nadu,India
Posts: 5

Original Poster
Rep: Reputation: Disabled
filebeat.inputs:
# Each - is an input. Most options can be set at the input level, so
# you can use different inputs for various configurations.
# Below are the input specific configurations.

- type: log
# Change to true to enable this input configuration.
enabled: true
# Paths that should be crawled and fetched. Glob based paths.
paths:
- /var/log/*.log
#- c:\programdata\elasticsearch\logs\*
# Exclude lines. A list of regular expressions to match. It drops the lines that are
# matching any regular expression from the list.
#exclude_lines: ['^DBG']
# Include lines. A list of regular expressions to match. It exports the lines that are
# matching any regular expression from the list.
#include_lines: ['^ERR', '^WARN']

- type: netflow
max_message_size: 10KiB
host: "0.0.0.0:2055"
protocols: [ v5, v9, ipfix ]
expiration_timeout: 30m
queue_size: 8192


# This requires a Kibana endpoint configuration.
setup.kibana:
host: http://localhost:5601

# Kibana Space ID
# ID of the Kibana Space into which the dashboards should be loaded. By default,
# the Default Space will be used.
#space.id:

output.elasticsearch:
# Array of hosts to connect to.
hosts: ["localhost:9200"]


# Protocol - either `http` (default) or `https`.
protocol: "http"
# Authentication credentials - either API key or username/password.
#api_key: "id:api_key"



I highlighted the Elasticsearch config in the filebeat.yml config file.
I just enabled output.elasticsearch, set the server host to localhost, and use http for access.
I have not included anything else.

Thanks
Muruganandan.C
 
Old 06-12-2021, 01:30 PM   #4
TB0ne
LQ Guru
 
Registered: Jul 2003
Location: Birmingham, Alabama
Distribution: SuSE, RedHat, Slack,CentOS
Posts: 26,780

Rep: Reputation: 7998
Quote:
Originally Posted by Muruganandan View Post
[quoted filebeat.yml config trimmed; identical to post #3]
I highlighted the Elasticsearch config in the filebeat.yml config file. I just enabled output.elasticsearch, set the server host to localhost, and use http for access. I have not included anything else.
Again: put your code in CODE tags...it is STILL hard to read, and the indentation in the yaml files is still missing.

No idea what you mean by "I have not included anything else". Again, read the documentation; all you've done is re-post things (without CODE tags as requested), and not given us any further information.
 
  


