Way To Ship Kafka Logs To Kibana Dashboard

What are Logs?

Logs are detailed records of events concerning an application, its performance, and user activity. These events can include deleting or modifying a file, as well as changes to the system configuration.

Log files, in turn, store the records of these events. Because they track information about computing events, they come in handy when it comes to identifying problems and correcting them.

How to Ship Kafka Logs to the Kibana Dashboard?

To ship the Kafka logs, we will use the Filebeat agent. Filebeat is a lightweight shipper whose purpose is to forward and centralize log data.

For Filebeat to work, you need to install it as an agent on the desired servers. Filebeat then monitors the log files, collects the log events, and forwards them to Elasticsearch or Logstash for indexing.

To install the Filebeat agent on the servers, we will use Ansible, since it makes deploying the agent effortless.

How the Filebeat Agent Works

When you start the Filebeat agent, it starts its inputs and looks into the log locations you have specified. For every log file Filebeat locates, it starts a harvester. Each harvester reads a single log file for new content and sends the new log data to libbeat, a Go framework used for data forwarding. Libbeat aggregates the events and sends the collected data to the output you have configured for Filebeat. In our case that output is Logstash, which indexes the data into Elasticsearch so it can be viewed on the Kibana dashboard.
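
For illustration, a minimal filebeat.yml along these lines would declare a log input, enable the module configs, and point Filebeat at a Logstash output. The paths and the Logstash host below are assumptions for the sketch, not the exact configuration used here; in this setup the actual Logstash endpoint is set later by the set-logstash-endpoint.sh script described below.

# Sketch of a minimal filebeat.yml for Filebeat 6.2.x; paths and hosts are placeholders
filebeat.prospectors:
  - type: log
    enabled: true
    paths:
      - /opt/kafka/logs/*.log              # every matching file gets its own harvester

filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml     # enabled module configs (e.g. kafka.yml) are read from here
  reload.enabled: false

output.logstash:
  hosts: ["logstash.example.internal:5044"]   # placeholder; 5044 is the default Beats port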

Ansible playbook for deploying the Filebeat agent to Kafka servers

- name: Run kafka logging role to deploy filebeat to kafka servers 
  hosts: <kafka_server_hosts>
  gather_facts: False 
  roles:
    - deploy-filebeat
  become: True
  become_method: sudo 
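
Here <kafka_server_hosts> is a placeholder for the inventory group that contains your Kafka brokers. As a rough sketch, an inventory defining such a group could look like this (the group and host names are assumptions for illustration):

# Hypothetical inventory snippet; group and host names are placeholders
all:
  children:
    kafka_server_hosts:
      hosts:
        kafka-broker-1.example.internal:
        kafka-broker-2.example.internal:

With an inventory like this in place, the play above can be run with ansible-playbook in the usual way.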

Role to run the necessary tasks for deploying Filebeat

Before you run the role, make sure you have the filebeat.yml and kafka.yml files, which contain the configuration changes needed for the role, as well as the set-logstash-endpoint.sh script, which sets the Logstash endpoint. You can keep all three files in the files directory of the role deploy-filebeat.
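
For reference, a kafka.yml for the Filebeat Kafka module typically enables the log fileset and points it at the Kafka installation. The kafka_home and paths values below are assumptions for the sketch and should match your own brokers:

# Sketch of kafka.yml for the Filebeat Kafka module; values are placeholders
- module: kafka
  log:
    enabled: true
    var.kafka_home: /opt/kafka             # assumed Kafka installation directory
    var.paths:
      - /opt/kafka/logs/server.log*
      - /opt/kafka/logs/controller.log*
      - /opt/kafka/logs/state-change.log*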

- name:  Install filebeat 6.2.4
  shell: |
    curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-6.2.4-amd64.deb
    sudo dpkg -i filebeat-6.2.4-amd64.deb


- name: Upload filebeat.yml
  copy:
    src: filebeat.yml
    dest: /etc/filebeat
    owner: root
    group: root
    mode: '0644'
  
- name: Enable the filebeat Kafka module
  shell: |
    filebeat modules enable kafka

- name: Upload set-logstash-endpoint.sh script
  copy:
    src: set-logstash-endpoint.sh
    dest: /usr/local/bin/
    owner: root
    group: root
    mode: '0755'
  
- name: Upload filebeat Kafka module configuration
  copy:
    src: kafka.yml
    dest: /etc/filebeat/modules.d/
    owner: root
    group: root
    mode: '0644'
  notify:
    - restart filebeat

- name:  Set the logstash endpoint for this environment
  shell: |
    /usr/local/bin/set-logstash-endpoint.sh

- name:  Start filebeat
  shell: |
    sudo /etc/init.d/filebeat start
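
Note that the kafka.yml task above notifies a restart filebeat handler, which is not part of the task list shown. A handler roughly like the one below, kept in handlers/main.yml of the deploy-filebeat role, would satisfy that notification (a sketch, not necessarily the exact handler used here):

# handlers/main.yml of the deploy-filebeat role (sketch; the name must match the notify above)
- name: restart filebeat
  service:
    name: filebeat
    state: restarted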

After running this playbook, you can check your Kibana dashboard endpoint, as the Kafka logs from the Kafka servers will start showing up there.

If you are running locally, you can go to http://localhost:5601/app/logs/ and check the Kafka logs that have been shipped there.

Written by 

Shivani Sarthi is a Software Consultant at Knoldus Software. She completed her MCA from BCIIT and her Bachelor's in Electronic Science from Delhi University. She has a keen interest in learning new technologies, and her practice area is DevOps. When not working, you will find her watching anime or with a book.
