Elasticsearch integration with Logstash and Kibana to monitor Apache2 logs

Prerequisite:

Download the official Docker images of Elasticsearch, Logstash and Kibana:

Run the following commands-

sudo docker pull elasticsearch:7.7.1

sudo docker pull logstash:7.7.1

sudo docker pull kibana:7.7.1

Create Docker network:

Create a user-defined network so that the containers can reach the other services attached to the same network (e.g. Kibana, Logstash) by name.

sudo docker network create elknetwork
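
To confirm the network exists (and, once the containers below are running, to see which ones are attached to it), you can optionally inspect it:

sudo docker network inspect elknetwork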

Run Elasticsearch:

sudo docker run -dit --name elasticsearch --net elknetwork -p 9200:9200 -p 9300:9300 -e "discovery.type=single-node" elasticsearch:7.7.1

Remember to keep the container name used in the command (--name elasticsearch); the other services will refer to the container by this name.

Now, go to the browser and open http://localhost:9200. You should see output like this-
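
If you prefer the terminal, an equivalent check is to query the Elasticsearch root endpoint with curl; a healthy single-node instance answers with a small JSON document containing fields such as name, cluster_name and version:

curl http://localhost:9200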

Run Kibana:

sudo docker run -dit --name kibana --net elknetwork -p 5601:5601 kibana:7.7.1

  • Note: In this example, Kibana uses its default configuration and expects to connect to a running Elasticsearch instance at http://elasticsearch:9200 (the container name elasticsearch resolves on the shared network).

Also, Kibana attaches to the user-defined network (elknetwork), which lets it connect to the other services (e.g. Elasticsearch).

Now, go to the browser and open http://localhost:5601. Kibana may take a few minutes to start, so keep refreshing the browser; eventually you should see output like this-
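
If the page does not come up, two optional checks (using the container name kibana from the command above) are to follow the container logs or to query Kibana's status API:

sudo docker logs -f kibana

curl http://localhost:5601/api/status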

Click the Explore on my own option and you will be redirected to the Kibana home page.

Click the Connect to Elasticsearch index option as shown in the image.

Run Logstash:

To allow Logstash to read the Apache2 log files, you must grant read permission on your Apache2 log folder. To do so, run the following command:

sudo chmod -R o+rx /var/log/apache2

Before running Logstash, you must create a custom configuration file for it. So run the following command:

mkdir /var/www/logconf && cd /var/www/logconf

(you can create the directory anywhere)

Here, create the file:

vim logstash.conf

(you can give the conf file any name, but remember to use the .conf extension)
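
The exact contents depend on your setup, but a minimal sketch of logstash.conf could look like the following. It assumes the Apache2 access log will be available inside the container at /log/access.log (that mount is created in the docker run command below) and writes to an index named logdb, the index name used later in Kibana; adjust the path and index name as needed:

input {
  file {
    # Apache2 access log, mounted into the container at /log (see the docker run command below)
    path => "/log/access.log"
    start_position => "beginning"
  }
}

filter {
  grok {
    # Parse the standard Apache combined log format into structured fields
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    # Use the timestamp from the log line as the event @timestamp
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}

output {
  elasticsearch {
    # The container name "elasticsearch" resolves on the shared elknetwork
    hosts => ["elasticsearch:9200"]
    index => "logdb"
  }
}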

Now run the Logstash container using docker command:

sudo docker run -it --name logstash --net elknetwork -v /var/log/apache2:/log -v /var/www/logconf:/conf logstash:7.7.1 -f /conf/logstash.conf

  • Here, we have mounted the /var/log/apache2 directory to the /log folder of our Logstash container, so that the log files are available inside the container.
  • Also, we have mounted the /var/www/logconf directory, which contains our Logstash conf file, to the /conf directory of the container.
  • We have passed -f /conf/logstash.conf so that the Logstash container uses our custom configuration file (logstash.conf).

Make sure the container shows the following output in the terminal after it starts -

Now go to your browser and request one of your web pages (e.g. localhost/my.html).
This will produce output in the Logstash terminal similar to this -
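
To confirm that the events actually reached Elasticsearch, you can generate a few requests and then list the indices; a line for the logdb index should appear (document counts will vary):

curl http://localhost/my.html

curl "http://localhost:9200/_cat/indices?v"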

Now go to the Kibana dashboard at http://localhost:5601.

Here, click the Check for new data option.

In the Index pattern input field, type the name of the index you created (in this case, logdb).

Click the Next step button.

From the Time Filter field name dropdown, select @timestamp.

Click Create index pattern.

Note: When you define an index pattern, the indices that match that pattern must exist in Elasticsearch and they must contain data. To check which indices are available, open the menu, then go to Dev Tools > Console and enter GET _cat/indices. Alternatively, use curl -XGET "http://localhost:9200/_cat/indices".

Now click the Discover option from the left panel again and you will see output like this -

Your ELK stack has been successfully set up. Now you can keep monitoring your Apache2 server.
