{"id":3448,"date":"2019-06-29T16:24:08","date_gmt":"2019-06-29T13:24:08","guid":{"rendered":"https:\/\/kifarunix.com\/?p=3448"},"modified":"2024-03-11T22:54:15","modified_gmt":"2024-03-11T19:54:15","slug":"install-logstash-7-on-fedora-30-fedora-29-centos-7","status":"publish","type":"post","link":"https:\/\/kifarunix.com\/install-logstash-7-on-fedora-30-fedora-29-centos-7\/","title":{"rendered":"Install Logstash 7 on Fedora 30\/Fedora 29\/CentOS 7"},"content":{"rendered":"\n
This guide will focus on how to install Logstash 7 on Fedora 30\/Fedora 29\/CentOS 7 as a continuation of our guide on how to setup Elastic Stack 7 on Fedora 30\/Fedora 29\/CentOS 7<\/a>.<\/p>\n\n\n\n The installation of the first two components of Elastic Stack, Elasticsearch and Kibana, has been discussed in our previous guides;<\/p>\n\n\n\n Install Elasticsearch 7 on Fedora 30<\/a><\/p>\n\n\n\n Install Elasticsearch 7.x on CentOS 7\/Fedora 29<\/a><\/p>\n\n\n\n Install Kibana 7 on Fedora 30\/Fedora 29\/CentOS 7<\/a><\/p>\n\n\n\n Once you have Elasticsearch and Kibana installed, proceed to install Logstash.<\/p>\n\n\n\n As a prerequisite, Logstash requires Java 8 or Java 11. You can install Java 8 on Fedora 30\/Fedora 29\/CentOS 7 by running the command below;<\/p>\n\n\n\n Once the installation is done, you can verify the installed version as shown below;<\/p>\n\n\n\n If you need to use Java 11 instead, install it as shown below;<\/p>\n\n\n\n As stated before, this is a continuation of our guide on how to setup Elastic Stack on Fedora 30\/Fedora 29\/CentOS 7. Therefore, we have already created the Elastic Stack repos on our servers. You can, however, create the Elastic 7.x repos by executing the commands below;<\/p>\n\n\n\n And now you can just install Logstash using the YUM\/DNF package manager.<\/p>\n\n\n\n To test your Logstash installation, run the most basic Logstash pipeline, which reads from standard input and writes to standard output.<\/p>\n\n\n\n Once you see the Pipelines running message, type any string and press ENTER.<\/p>\n\n\n\n Logstash adds a timestamp and host information to the message.<\/p>\n\n\n\n Stop Logstash by pressing Ctrl+D.<\/p>\n\n\n\n Once the installation is done, proceed to configure Logstash. The Logstash data processing pipeline has three sections: input, filter and output.<\/p>\n\n\n\n You can read more about the Logstash Pipeline here<\/a>.<\/p>\n\n\n\n While configuring Logstash, you can have separate configuration files, one each for the INPUT, FILTER and OUTPUT sections, or a single configuration file for all the sections. 
This guide uses separate configuration files.<\/p>\n\n\n\n Create a Logstash input configuration file. In this guide, Beats are used as the data shippers. Hence, to configure Logstash to receive data from Beats<\/a> on TCP port 5044, create an input configuration file, say \/etc\/logstash\/conf.d\/beats-input.conf<\/strong>, with the content below;<\/p>\n\n\n\n Configure a filter plugin to process events received from Beats. This guide uses the grok<\/a><\/strong> filter plugin. You can read about other plugins here<\/a>.<\/p>\n\n\n\n For demonstration purposes, we are going to configure Beats to collect SSH authentication events from Ubuntu\/CentOS systems. Hence, we are going to create a filter to process such events as shown below.<\/p>\n\n\n\n The grok pattern used in this example matches the SSH authentication log lines below;<\/p>\n\n\n\n The lines, if [fileset][module] == “system”<\/strong> and if [fileset][name] == “auth”<\/strong>, could be used to ask Logstash to apply the grok filter only to events sent by the Filebeat system module. However, when I used these, my grok pattern failed to extract the data fields. In case you have an idea why, drop it in the comments.<\/p>\n\n\n\n Kibana 7 comes bundled with a Grok Debugger which is similar to the herokuapp grokdebugger<\/a>. You can access the Kibana Grok Debugger under Dev Tools > Grok Debugger<\/strong>. You can utilize this to generate and verify your grok patterns. You can as well check common Logstash grok patterns here<\/a>.<\/p>\n\n\n\n There are different output plugins<\/a> that enable Logstash to send event data to particular destinations. This guide uses the elasticsearch<\/strong> output plugin, which sends event data to Elasticsearch.<\/p>\n\n\n\n Create a Logstash output configuration file with the content below. 
This configuration sends data to Elasticsearch running on the same host.<\/p>\n\n\n\n The index option defines the index to write events to; logstash-%{+YYYY.MM.dd}<\/strong> is the default index.<\/p>\n\n\n\n If Elasticsearch is listening on a non-loopback interface, replace localhost in hosts => [“localhost<\/strong>:9200″]<\/em> with an interface IP, for example; hosts => [“192.168.0.101<\/strong>:9200″]<\/em><\/p>\n\n\n\n So far we have used a separate configuration file for each Logstash section. <\/p>\n\n\n\n If you need to put them in one file, then create a single configuration file as shown below;<\/p>\n\n\n\n If you need to send the event data to standard output as well, for the purposes of debugging plugin configurations, then add the line stdout { codec => rubydebug }<\/strong> to the output configuration section.<\/p>\n\n\n\n You can also check sample Logstash pipelines here<\/a>.<\/p>\n\n\n\n Learn how to debug Logstash grok filters by following the link below;<\/p>\n\n\n\n How to Debug Logstash Grok Filters<\/a><\/p>\n\n\n\n Once you are done with the configurations, run the command below to verify the Logstash configuration before you start it.<\/p>\n\n\n\n Well, if you get Configuration OK<\/strong>, then you are good to go.<\/p>\n\n\n\n To run Logstash and load a specific configuration file for debugging, you can execute the command below;<\/p>\n\n\n\n You can now start and enable Logstash to run on system boot.<\/p>\n\n\n\n On CentOS 7, if you try to start Logstash and get the error, Unit logstash.service could not be found<\/strong>, run the command below to generate the systemd unit file.<\/p>\n\n\n\n Check the \/var\/log\/logstash\/logstash-plain.log<\/strong> log file for any Logstash configuration errors.<\/p>\n\n\n\n If Firewalld is running and you want to receive event data from remote systems, ensure that TCP port 5044 is opened.<\/p>\n\n\n\n Once you are done with the configuration, proceed to install and configure Filebeat data shippers. 
See our next guide on how to install Filebeat on Fedora 30\/Fedora 29\/CentOS 7.<\/p>\n\n\n\n Install Filebeat on Fedora 30\/Fedora 29\/CentOS 7<\/a><\/p>\n\n\n\n That is all on how to install and configure Logstash 7 on Fedora 30\/Fedora 29\/CentOS 7.<\/p>\n\n\n\n Reference:<\/p>\n\n\n\n Getting Started with Logstash<\/a><\/p>\n\n\n\n Install and Configure Logstash 7 on Ubuntu 18\/Debian 9.8<\/a><\/p>\n\n\n\n Install and Configure Filebeat 7 on Ubuntu 18.04\/Debian 9.8<\/a><\/p>\n\n\n\n Install Elastic Stack 7 on Ubuntu 18.04\/Debian 9.8<\/a><\/p>\n\n\n\nInstalling Logstash 7 on Fedora 30\/Fedora 29\/CentOS 7<\/h2>\n\n\n\n
Prerequisites<\/h3>\n\n\n\n
yum install java-1.8.0-openjdk.x86_64<\/code><\/pre>\n\n\n\n
java -version\nopenjdk version \"1.8.0_212\"\nOpenJDK Runtime Environment (build 1.8.0_212-b04)\nOpenJDK 64-Bit Server VM (build 25.212-b04, mixed mode)<\/code><\/pre>\n\n\n\n
dnf install java-11-openjdk.x86_64<\/code><\/pre>\n\n\n\n
Installing Logstash 7 on Fedora 30\/Fedora 29\/CentOS 7<\/h3>\n\n\n\n
\n
rpm --import https:\/\/artifacts.elastic.co\/GPG-KEY-elasticsearch<\/code><\/pre>\n\n\n\n
\n
cat > \/etc\/yum.repos.d\/elastic-7.x.repo << EOF\n[elasticsearch-7.x]\nname=Elasticsearch repository for 7.x packages\nbaseurl=https:\/\/artifacts.elastic.co\/packages\/7.x\/yum\ngpgcheck=1\ngpgkey=https:\/\/artifacts.elastic.co\/GPG-KEY-elasticsearch\nenabled=1\nautorefresh=1\ntype=rpm-md\nEOF<\/code><\/pre>\n\n\n\n
yum install logstash<\/code><\/pre>\n\n\n\n
Testing Logstash<\/h3>\n\n\n\n
cd \/usr\/share\/logstash\/bin\/<\/code><\/pre>\n\n\n\n
.\/logstash -e 'input { stdin { } } output { stdout {} }'<\/code><\/pre>\n\n\n\n
...\n[INFO ] 2019-06-29 15:12:32.023 [Ruby-0-Thread-1: \/usr\/share\/logstash\/lib\/bootstrap\/environment.rb:6] agent - Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}\n[INFO ] 2019-06-29 15:12:32.821 [Api Webserver] agent - Successfully started Logstash API endpoint {:port=>9600}\nHello world\n...<\/code><\/pre>\n\n\n\n
...\n{\n \"host\" => \"elastic.example.com\",\n \"@version\" => \"1\",\n \"message\" => \"Hello world\",\n \"@timestamp\" => 2019-06-29T12:13:06.994Z\n}\n...<\/code><\/pre>\n\n\n\n
Configuring Logstash 7 on Fedora 30\/Fedora 29\/CentOS 7<\/h3>\n\n\n\n
\n
Configure Logstash Input plugin<\/h4>\n\n\n\n
vim \/etc\/logstash\/conf.d\/beats-input.conf<\/code><\/pre>\n\n\n\n
input {\n beats {\n port => 5044\n }\n}<\/code><\/pre>\n\n\n\n
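The beats input above listens in plain TCP. If your shippers sit on untrusted networks, the beats input plugin can also terminate SSL; below is a minimal sketch, assuming you already have a certificate and key in place (the paths shown are hypothetical examples, not files created by this guide):

```
input {
  beats {
    port => 5044
    ssl => true
    # Hypothetical paths; replace with your own certificate and key
    ssl_certificate => "/etc/pki/tls/certs/logstash.crt"
    ssl_key => "/etc/pki/tls/private/logstash.key"
  }
}
```

The Beats clients would then need to be configured to connect over SSL and to trust that certificate.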
Configure Logstash Filters<\/h4>\n\n\n\n
vim \/etc\/logstash\/conf.d\/ssh-auth-filter.conf<\/code><\/pre>\n\n\n\n
Jun 29 13:19:13 fedora29 sshd[2764]: Failed password<\/strong> for root from 192.168.43.17 port 40284 ssh2\nJun 29 13:13:31 fedora29 sshd[2598]: Accepted password<\/strong> for root from 192.168.43.17 port 40182 ssh2<\/code><\/pre>\n\n\n\n
filter {\n grok {\n match => { \"message\" => \"%{SYSLOGTIMESTAMP:timestamp}\\s+%{IPORHOST:dst_host}\\s+%{WORD:syslog_program}\\[\\d+\\]:\\s+(?<status>\\w+\\s+password)\\s+for\\s+%{USER:auth_user}\\s+from\\s+%{SYSLOGHOST:src_host}.*\" }\n add_field => { \"activity\" => \"SSH Logins\" }\n add_tag => \"linux_auth\"\n }\n }\n<\/code><\/pre>\n\n\n\n
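To see roughly which pieces of a log line the capture groups above pull out, here is a quick sed sketch you can run outside Logstash (illustrative only; the regex is a rough stand-in for the grok pattern, and the status, auth_user and src_host names simply mirror its fields):

```shell
# Approximate the grok captures with sed (not Logstash itself).
line='Jun 29 13:19:13 fedora29 sshd[2764]: Failed password for root from 192.168.43.17 port 40284 ssh2'
echo "$line" | sed -E 's/^[A-Z][a-z]{2} +[0-9]+ [0-9:]+ +[^ ]+ +[a-z]+\[[0-9]+\]: +([A-Za-z]+ password) for ([^ ]+) from ([^ ]+).*/status=\1 auth_user=\2 src_host=\3/'
# → status=Failed password auth_user=root src_host=192.168.43.17
```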
Configure Logstash Output<\/h4>\n\n\n\n
vim \/etc\/logstash\/conf.d\/elasticsearch-output.conf<\/code><\/pre>\n\n\n\n
output {
elasticsearch {
    hosts => [\"192.168.0.101:9200\"]
manage_template => false
index => \"ssh_auth-%{+YYYY.MM}\"
}
}<\/code><\/pre>\n\n\n\nAll in one Logstash configuration file<\/a><\/h4>\n\n\n\n
vim \/etc\/logstash\/conf.d\/ssh-authentication.conf<\/code><\/pre>\n\n\n\n
input {\n beats {\n port => 5044\n }\n}\nfilter {\n grok {\n match => { \"message\" => \"%{SYSLOGTIMESTAMP:timestamp}\\s+%{IPORHOST:dst_host}\\s+%{WORD:syslog_program}\\[\\d+\\]:\\s+(?<status>\\w+\\s+password)\\s+for\\s+%{USER:auth_user}\\s+from\\s+%{SYSLOGHOST:src_host}.*\" }\n add_field => { \"activity\" => \"SSH Logins\" }\n add_tag => \"linux_auth\"\n }\n}\noutput {\n elasticsearch {\n hosts => [\"localhost:9200\"]\n manage_template => false\n index => \"ssh_auth-%{+YYYY.MM}\"\n }\n}<\/code><\/pre>\n\n\n\n
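The index name ssh_auth-%{+YYYY.MM} is expanded by Logstash from each event's @timestamp, so events are grouped into monthly indices. As a rough illustration of the resulting names (using GNU date here, not Logstash itself):

```shell
# Monthly index name an event stamped 2019-06-29 would be written to;
# GNU date stands in for Logstash's %{+YYYY.MM} expansion.
date -u -d 2019-06-29 +ssh_auth-%Y.%m
# → ssh_auth-2019.06
```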
output {\n elasticsearch {\n hosts => [\"localhost:9200\"]\n index => \"ssh_auth-%{+YYYY.MM}\"\n }\n stdout { codec => rubydebug }\n}<\/code><\/pre>\n\n\n\n
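Since the grok filter in this guide tags matching events with linux_auth (and grok only adds the tag when the pattern matches), the output section can also be made conditional so that only those events are indexed; a sketch, assuming the add_tag shown earlier:

```
output {
  if "linux_auth" in [tags] {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "ssh_auth-%{+YYYY.MM}"
    }
  }
}
```

Events that fail the grok match would then be silently dropped, which is convenient for a demo but worth keeping in mind in production.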
Test Logstash Configuration<\/h4>\n\n\n\n
sudo -u logstash \/usr\/share\/logstash\/bin\/logstash --path.settings \/etc\/logstash -t<\/code><\/pre>\n\n\n\n
Sending Logstash logs to \/var\/log\/logstash which is now configured via log4j2.properties\nConfiguration OK<\/strong>\n...<\/code><\/pre>\n\n\n\n
sudo -u logstash \/usr\/share\/logstash\/bin\/logstash -f \/etc\/logstash\/conf.d\/configuration-file.conf<\/strong> --path.settings \/etc\/logstash\/<\/code><\/pre>\n\n\n\n
Running Logstash<\/h3>\n\n\n\n
systemctl start logstash
systemctl enable logstash<\/code><\/pre>\n\n\n\n\/usr\/share\/logstash\/bin\/system-install \/etc\/logstash\/startup.options systemd<\/code><\/pre>\n\n\n\n
firewall-cmd --add-port=5044\/tcp --permanent\nfirewall-cmd --reload<\/code><\/pre>\n\n\n\n
Related Tutorials;<\/h3>\n\n\n\n