In this blog, we will walk through Elasticsearch, Fluentd and Kibana installation and setup.
Have you ever been stuck with your application behaving waywardly and, unfortunately, no meaningful logs to consult? Or faced hundreds of thousands of log streams pouring in continuously, making the problem even harder to comprehend?
These scenarios come up often, and they underline the importance of well-ordered, legible logs in the development of any application. Even the absence of sorting and filtering features can make debugging and issue tracking tedious. Moreover, an application may be a flock of microservices, so chasing their logs across different locations quickly becomes impractical.
Fortunately, there are ways of managing logs in a more organised manner, and the EFK stack is one such logging solution.
Elasticsearch, Fluentd and Kibana
Elasticsearch, Fluentd, and Kibana together form a powerful logging and monitoring stack that makes log management, visualisation, and debugging easier in an interactive and centralised manner. It helps users make sense of heavy, real-time log data by supporting sorting and filtering by time, category, incident, index, etc.
Elasticsearch is an analytical search engine. It supports textual, numerical, geospatial, structured, and unstructured data search. Coupled with Kibana, a powerful visualisation tool, it provides the desired readability of logs: with an interactive web interface, Kibana helps manage, sort, and visualise Elasticsearch data. The third component of this stack, Fluentd, is a data collector and shipper. It collects logs from a source, transforms them, and ships them to a destination, which could be another Fluentd instance, Elasticsearch, or any other target.
Some people prefer ELK (Elasticsearch, Logstash, and Kibana) over EFK. Both Logstash and Fluentd are capable of carrying out the log-shipping job; pick whichever suits your application better.
This article covers only EFK stack installation and setup on the Ubuntu 18.04 operating system; configuring EFK to visualise logs will be discussed in subsequent blogs. Here, we will deploy a two-tier, client-server architecture: Fluentd is installed on the server to be monitored (where the logs are stored), while the EFK stack is installed on a separate server. The client-side Fluentd then ships data to the EFK server, communicating with the server-side Fluentd over TCP port 24224. The server-side Fluentd in turn transfers the data to Elasticsearch over TCP port 9200 on the same server, using the fluent-plugin-elasticsearch plugin for td-agent.
Let’s begin with the Elasticsearch, Fluentd, and Kibana installation on the server side and the Fluentd installation on the client side:
wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -
sudo apt-get install apt-transport-https
echo "deb https://artifacts.elastic.co/packages/7.x/apt stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-7.x.list
sudo apt-get update && sudo apt-get install elasticsearch
sudo update-rc.d elasticsearch defaults 95 10
sudo service elasticsearch restart
service elasticsearch status
tail -f /var/log/elasticsearch/elasticsearch.log
sudo apt-get update && sudo apt-get install kibana
sudo update-rc.d kibana defaults 95 10
sudo service kibana start
service kibana status
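With both services started, you can sanity-check them over HTTP. A minimal sketch, assuming the default ports on localhost (9200 for Elasticsearch, 5601 for Kibana, with its /api/status endpoint); adjust the host and ports if you changed them:

```shell
# Probe the default Elasticsearch and Kibana endpoints on localhost.
# Prints a reachable/not-reachable line per service either way.
for svc in "Elasticsearch=9200" "Kibana=5601/api/status"; do
  name=${svc%%=*}
  url="http://localhost:${svc#*=}"
  if curl -s --max-time 5 "$url" > /dev/null; then
    echo "$name reachable at $url"
  else
    echo "$name NOT reachable at $url"
  fi
done
```

If Elasticsearch is up, `curl http://localhost:9200` also returns a small JSON document with the cluster name and version, which is a quick way to confirm the install.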
Next, we will install td-agent, which is a stable, packaged distribution of Fluentd.
curl -L https://toolbelt.treasuredata.com/sh/install-ubuntu-bionic-td-agent3.sh | sh
td-agent-gem install fluent-plugin-elasticsearch
sudo chown -R td-agent: /var/log/td-agent/
sudo systemctl restart td-agent.service
sudo systemctl status td-agent.service
sudo update-rc.d td-agent defaults 95 10
sudo /bin/systemctl daemon-reload
Repeat the td-agent installation steps above on the client server in the same way.
Now, configure the server-side td-agent to listen on port 24224 for connections from anywhere. First, back up the default configuration file:
sudo mv /etc/td-agent/td-agent.conf /etc/td-agent/td-agent.conf-default
Add the following lines to a new /etc/td-agent/td-agent.conf, save it, and restart the td-agent service.
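A minimal example of what the server-side configuration could look like, based on the architecture described above (a forward input on port 24224, shipping to a same-host Elasticsearch on port 9200). The catch-all match pattern and the logstash_format setting are illustrative assumptions you may tune:

```
# Accept logs forwarded from client-side td-agent instances
<source>
  @type forward
  port 24224
  bind 0.0.0.0
</source>

# Ship everything received to Elasticsearch on the same server
<match *.**>
  @type elasticsearch
  host localhost
  port 9200
  logstash_format true
</match>
```

After saving the file, restart td-agent with `sudo systemctl restart td-agent.service` so the new configuration takes effect.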
The EFK stack is now installed on the server and Fluentd (td-agent) on the client. Make sure all services are running seamlessly and that the client- and server-side td-agents are able to communicate with each other over port 24224.
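To verify that communication, you can probe the forward port from the client machine. A quick sketch, assuming bash and the coreutils `timeout` command are available; SERVER_IP is a placeholder you should replace with your EFK server's address:

```shell
# Check whether the EFK server's td-agent forward port (24224) is reachable.
# SERVER_IP is a placeholder - replace it with your server's address.
SERVER_IP=127.0.0.1
if timeout 3 bash -c "cat < /dev/null > /dev/tcp/$SERVER_IP/24224" 2>/dev/null; then
  echo "td-agent forward port 24224 is open on $SERVER_IP"
else
  echo "cannot reach $SERVER_IP:24224 - check td-agent status and firewall rules"
fi
```

If the port is not reachable, confirm td-agent is running on the server and that your firewall (e.g. ufw) allows inbound TCP on 24224.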
The next step will be to configure the client-side Fluentd to capture logs and ship them from client to server. We will also configure Elasticsearch and Kibana with indexes and password protection, and set up Nginx as a reverse proxy server to serve Kibana over HTTP/HTTPS. We will cover all these points in our next blog.