
Setting up Elasticsearch, Logstash and Kibana for centralized logging

Logs play a very important role in the analysis of a server or a system. By reading logs we can learn about incidents that happened on the system at a particular time. Logs help us detect problems in the system and troubleshoot them. Logs are generally stored in files, and different services generate different log files. If we have many systems whose logs we need to analyze, it becomes difficult to manage all of them, so it is best practice to store the logs on a centralized server.

As logs are saved in plain-text files, the data can be hard to analyze in that form. If we can extract the data from the logs and present it as tables, graphs, or pie charts, analysis becomes much easier. It is simpler to understand the data in a visual form than by reading text-based log files.

In the case of an Apache log, the log file grows every time someone visits the website, so for proper analysis we should be able to read the logs in real time. The centralized server should therefore be able to fetch the data from the other servers in real time.

The above requirements are fulfilled by the ELK (Elasticsearch, Logstash, and Kibana) stack. Elasticsearch is used as the centralized log store, Logstash ships log data to Elasticsearch in real time, and Kibana is used to visualize the complex log data.

In this tutorial I will show you how to set up the ELK stack. I am using Ubuntu 14.04 on all the servers.

Setting up ELK stack

Suppose I have 3 different servers running the Apache service, and I need to collect the log files from each of them. So I need to set up Logstash on each server; Logstash will read the Apache log files and send them to the Elasticsearch server. Elasticsearch will save the logs under a unique index defined by Logstash. I need one more server, which will use Kibana to fetch the data from Elasticsearch and visualize it.

Install Java

The ELK stack needs Java to run, so you need to install Java on all the servers. Before installing any of the applications, make sure Java is installed.
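On Ubuntu 14.04 the stock repositories only ship Java 7, while recent Elasticsearch releases require Java 8. One way to get it is the openjdk-r PPA; this is a sketch, and you should verify that this PPA (or your preferred Java source) fits your environment:

```shell
# Install Java 8 on Ubuntu 14.04 via the openjdk-r PPA
# (assumption: this PPA suits your environment; adjust if you use Oracle Java instead).
sudo add-apt-repository -y ppa:openjdk-r/ppa
sudo apt-get update
sudo apt-get install -y openjdk-8-jdk

# Verify the installation.
java -version
```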

Install Kibana

Log in to the server where you want to set up Kibana.
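The package can be installed from Elastic's APT repository. The commands below assume the 5.x series of the repository; swap the version in the repository path if you need a different release:

```shell
# HTTPS transport for apt, needed on Ubuntu 14.04 for Elastic's repository.
sudo apt-get install -y apt-transport-https

# Add Elastic's signing key and APT repository (assuming the 5.x series).
wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -
echo "deb https://artifacts.elastic.co/packages/5.x/apt stable main" | \
    sudo tee /etc/apt/sources.list.d/elastic-5.x.list

# Install Kibana.
sudo apt-get update
sudo apt-get install -y kibana
```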

Open /etc/kibana/kibana.yml and change the following.
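A minimal set of changes might look like the following; the Elasticsearch address (192.168.1.81 here) is a placeholder for your own Elasticsearch server:

```
# /etc/kibana/kibana.yml
server.port: 5601
server.host: "0.0.0.0"                         # listen on all interfaces, not just localhost
elasticsearch.url: "http://192.168.1.81:9200"  # placeholder: your Elasticsearch server
```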

Now, restart the Kibana service.
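On Ubuntu 14.04 that is:

```shell
sudo service kibana restart
```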

Install Elasticsearch

Log in to the server where you want to set up Elasticsearch.
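Elasticsearch comes from the same Elastic APT repository (assumed here to be the 5.x series); since this is a different server, the repository has to be added here as well:

```shell
# HTTPS transport for apt, needed on Ubuntu 14.04 for Elastic's repository.
sudo apt-get install -y apt-transport-https

# Add Elastic's signing key and APT repository (assuming the 5.x series).
wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -
echo "deb https://artifacts.elastic.co/packages/5.x/apt stable main" | \
    sudo tee /etc/apt/sources.list.d/elastic-5.x.list

# Install Elasticsearch.
sudo apt-get update
sudo apt-get install -y elasticsearch
```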

Open /etc/elasticsearch/elasticsearch.yml and change the following.
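A minimal sketch of the changes; the cluster name is a hypothetical example, and binding to 0.0.0.0 is what lets Logstash and Kibana on other servers reach Elasticsearch:

```
# /etc/elasticsearch/elasticsearch.yml
cluster.name: logging-cluster   # hypothetical name; pick your own
network.host: 0.0.0.0           # accept connections from the Logstash and Kibana servers
http.port: 9200
```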

Now, restart the Elasticsearch service.
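On Ubuntu 14.04:

```shell
sudo service elasticsearch restart

# Confirm it is answering on port 9200.
curl http://localhost:9200
```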

Install Logstash

You need to install Logstash on each server. Installation and configuration are the same for all servers; you just need to define a different Elasticsearch index name for each one.
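Assuming Elastic's APT repository is already configured on each server (Logstash ships from the same repository as Elasticsearch and Kibana), the install itself is one command:

```shell
sudo apt-get update
sudo apt-get install -y logstash
```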

Create a file /etc/logstash/conf.d/apache.conf and copy the following content into it.
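A sketch of such a configuration, assuming the default Apache access-log path on Ubuntu; the Elasticsearch address (192.168.1.81) and index name (apache-server1) are placeholders for your own values:

```
# /etc/logstash/conf.d/apache.conf
input {
  file {
    path => "/var/log/apache2/access.log"   # default Apache access log on Ubuntu
    start_position => "beginning"
  }
}

filter {
  grok {
    # Parse each line into key-value fields using the stock Apache pattern.
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}

output {
  elasticsearch {
    hosts => ["192.168.1.81:9200"]   # placeholder: your Elasticsearch server
    index => "apache-server1"        # use a different index name on each server
  }
}
```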

For each server, change the index name. Make sure the path to the log file is correct in the input section. In the filter section you can define your own patterns to break the logs into key-value fields.

Now, restart the Logstash service so it picks up the configuration file we wrote and starts sending logs to Elasticsearch.
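It is worth checking the configuration for syntax errors before restarting; the binary path below assumes the default package install location:

```shell
# Check the configuration for syntax errors first.
sudo /usr/share/logstash/bin/logstash -t -f /etc/logstash/conf.d/apache.conf

# Then restart the service so it picks up the new pipeline.
sudo service logstash restart
```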

If there is no error in the configuration, Logstash will start cleanly and begin shipping logs to Elasticsearch.

Now, open the Kibana server in a browser; in my case it is http://192.168.1.80:5601.

Go to Management -> Index Patterns -> Add New and add the index name you defined in the Logstash configuration file.

After you create the index you will be able to visualize your logs in the Kibana dashboard. Do this for the other servers as well, and you will be able to view logs by index name. If you have any questions about the ELK stack setup, you can ask me in the comments.

 

 
