Introduction to Kibana: Explore, Visualize and Analyze Elasticsearch Data πŸ“Š

 

In my previous post Introduction to Elasticsearch: Create, Update, Delete and Search Documents πŸ” I showed you how to set up an Elasticsearch index and manage its data. But as you could see in that post, the raw output in the command window was not exactly pleasant to read. That's fine, because presenting data is not Elasticsearch's job.

Representing and visualizing data is key to understanding it and even predicting its trends. And as I said before, much of Elasticsearch's power lies in its integration with other powerful tools, such as our guest of honor: Kibana.

Kibana, developed by Elastic, is a powerful open-source data visualization and exploration platform. It seamlessly integrates with Elasticsearch, making it an essential component of the Elastic Stack. Whether you’re a data analyst, developer, or business user, Kibana empowers you to unlock valuable insights from your data. With its intuitive interface, you can create interactive dashboards, explore logs, analyze metrics, and visualize trends—all while harnessing the full potential of Elasticsearch indices. 

In this tutorial, I will show you how to set up Kibana, view your indices, and even add new data to them. That is really the bare minimum you need to know before you can explore, on your own, the countless features and extensions that this tool offers.

Let's go ahead and set up Kibana!

Running Kibana πŸƒ

Just like in most of my previous posts, I will be running this tool using a YAML file, which is something you should get familiar with because it really simplifies running multiple containers with a single command.

So, let's prepare a docker-compose.yml file that can run Elasticsearch and Kibana.

version: '3.7'

services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:8.13.0
    environment:
      - node.name=elasticsearch
      - cluster.name=es-docker-cluster
      - discovery.type=single-node
      - bootstrap.memory_lock=true
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
      - "xpack.security.enabled=false"
    ulimits:
      memlock:
        soft: -1
        hard: -1
    volumes:
      - esdata1:/usr/share/elasticsearch/data
    ports:
      - 9200:9200

  kibana:
    image: docker.elastic.co/kibana/kibana:8.13.0
    ports:
      - 5601:5601
    environment:
      ELASTICSEARCH_HOSTS: http://elasticsearch:9200
    depends_on:
      - elasticsearch

volumes:
  esdata1:
    driver: local

Now, there are a few things you need to know. First, the versions of Elasticsearch and Kibana must match. Second, you have to specify that Kibana depends on Elasticsearch.

The depends_on option in Docker Compose is used to control the startup order of services. This means that Docker Compose will start the Elasticsearch service before it starts the Kibana service when you run docker-compose up.

However, please note that depends_on only controls the startup order; it does not wait for a service to be "ready" before starting the services that depend on it. In other words, even though Elasticsearch is started before Kibana, it might not be fully initialized and ready to accept connections when Kibana starts. To handle this, Kibana has built-in retry logic that keeps trying to connect to Elasticsearch if the first attempt fails.
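If you want Compose itself to wait until Elasticsearch is actually answering before it starts Kibana, newer versions of Docker Compose let you pair a healthcheck with a depends_on condition. Here is a minimal sketch, assuming curl is available inside the Elasticsearch container (it is in recent official images); it's optional, since Kibana's retry logic already handles the common case:

  elasticsearch:
    # ... same configuration as above ...
    healthcheck:
      # consider Elasticsearch healthy once it responds on port 9200
      test: ["CMD-SHELL", "curl -s http://localhost:9200 >/dev/null || exit 1"]
      interval: 10s
      timeout: 5s
      retries: 12

  kibana:
    # ... same configuration as above ...
    depends_on:
      elasticsearch:
        condition: service_healthy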

Okay, so where were we? Let's now run docker-compose up after navigating to the directory where the docker-compose.yml file lives and see if Kibana starts.

Usually, you have to wait around two minutes for the services to come up, or even longer if this is the first time you're pulling the images.

After that, just open your browser and go to http://localhost:5601/, as this is the port we specified in our YAML file.


This is what you should see if everything went right. Now let's see how to view our indices and maybe create a new one.

Index Management πŸ“

To add indices, delete them, or manage their documents, head to Stack Management under Management in the left menu.


After that, click on Index Management and you will see a list of all your current indices.


We will then proceed to create a new index by clicking on the Create Index button as you can see above.

If you click on the index after you create it, you should be able to see this page.


Now, you will remember from my previous post that we had to set the mapping, that is, the structure of the documents that the index will store. This is, by the way, the recommended way to declare an index, because it lets you control the schema based on a model you have in your domain layer.
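For comparison, declaring an index with an explicit mapping is just one request in Kibana's console (or any HTTP client). As a sketch, using the same customer fields we will insert later in this post:

PUT /customer
{
  "mappings": {
    "properties": {
      "name": { "type": "text" },
      "isLoyal": { "type": "boolean" }
    }
  }
}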

Creating an index from Kibana, on the other hand, leaves your mappings empty, and you can see this yourself if you open the Mappings tab of your index.


The mappings will be filled in automatically based on the documents you start inserting. As I said, I personally don't prefer this approach, but I'm just showing you that it's possible.

So, let's add a document and see if the mappings really get inferred from the inserted document or not.

Create Document ➕πŸ“„

Creating a document is very simple: just open the console located below and post a document like this.


Note that I did not specify an id field in my document, so Elasticsearch generated a random id and will use it as the identifier for this document in this index.
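For reference, the request looks roughly like this; the field values are just placeholders for illustration:

POST /customer/_doc
{
  "name": "John Doe",
  "isLoyal": true
}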

Okay, now that we have posted a document with a name field and an isLoyal field, we're expecting that Elasticsearch inferred string and boolean fields for our customer documents. So, let's head back to the Mappings tab in our customer index to see if this is true.
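By the way, if you prefer the console over the UI, the same information can be fetched with a single request:

GET /customer/_mapping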



Perfect! It seems like Elasticsearch did infer the fields and their types from the posted document. Now, let's see if we can view the document I just posted.

All you need to do is just click on Discover Index.

Analyze Documents πŸ”¬

After clicking on Discover Index, you will be directed to the Discover page in the Analytics section of Kibana, where you can view, filter, and analyze your data, and much more.
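And if you ever need the same data outside the UI, a plain search from the console returns every document in the index:

GET /customer/_search
{
  "query": {
    "match_all": {}
  }
}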


Great! Seems like our document did get posted and we can view it. I think this is enough for an introduction to Kibana.

It's really important to note that I have barely scratched the surface here. There are tons of features that I haven't even mentioned, such as security and dashboards. But these are building blocks on top of the foundation we just laid in this tutorial, so I strongly urge you to keep exploring this fun tool and discover all of its perks.

Now that we have looked at Elasticsearch and Kibana, it seems that we're missing something to complete our ELK stack: Logstash. Let's discuss it in a later post. But until then, enjoy playing around with Kibana!
