Introduction to Kibana: Explore, Visualize and Analyze Elasticsearch Data
In my previous post Introduction to Elasticsearch: Create, Update, Delete and Search Documents I showed you how to set up an Elasticsearch index and manage its data. But as you could see in that post, the raw JSON output in the command window was not exactly pretty to look at. That's because presenting data is not Elasticsearch's job.
Data representation and visualization are key to understanding your data and even predicting its trends. And as I said before, the power of Elasticsearch lies in its integration with other powerful tools, such as our guest of honor: Kibana.
Kibana, developed by Elastic, is a powerful open-source data visualization and exploration platform. It seamlessly integrates with Elasticsearch, making it an essential component of the Elastic Stack. Whether you’re a data analyst, developer, or business user, Kibana empowers you to unlock valuable insights from your data. With its intuitive interface, you can create interactive dashboards, explore logs, analyze metrics, and visualize trends—all while harnessing the full potential of Elasticsearch indices.
In this tutorial, I will show you how to set up Kibana, view your indices, and even add new data to them. That is really the bare minimum you need to know before you can explore on your own the countless features and extensions that this tool offers.
Let's go ahead and set up Kibana!
Running Kibana
Just like in most of my previous posts, I will be running this tool with Docker, using a YAML file. This is something you should also get familiar with, because it really simplifies running multiple containers with just one simple command.
So, let's prepare a docker-compose.yml file that can run Elasticsearch and Kibana.
version: '3.7'
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:8.13.0
    environment:
      - node.name=elasticsearch
      - cluster.name=es-docker-cluster
      - discovery.type=single-node
      - bootstrap.memory_lock=true
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
      - "xpack.security.enabled=false"
    ulimits:
      memlock:
        soft: -1
        hard: -1
    volumes:
      - esdata1:/usr/share/elasticsearch/data
    ports:
      - 9200:9200
  kibana:
    image: docker.elastic.co/kibana/kibana:8.13.0
    ports:
      - 5601:5601
    environment:
      ELASTICSEARCH_HOSTS: http://elasticsearch:9200
    depends_on:
      - elasticsearch
volumes:
  esdata1:
    driver: local
Now, there are a few things you need to know. First, the versions of Elasticsearch and Kibana must match. Second, you have to specify that Kibana depends on Elasticsearch. Also note that since Kibana 7, the setting for the Elasticsearch address is elasticsearch.hosts, so the environment variable to use is ELASTICSEARCH_HOSTS rather than the older ELASTICSEARCH_URL.
The depends_on option in Docker Compose is used to control the startup order of services. This means that Docker Compose will start the Elasticsearch service before it starts the Kibana service when you run docker-compose up.
However, please note that depends_on only controls the startup order, but it does not wait for a service to be "ready" before starting the dependent services. In other words, even though Elasticsearch is started before Kibana, it might not be fully initialized and ready to accept connections when Kibana starts. To handle this, Kibana has built-in retry logic to keep trying to connect to Elasticsearch if the first connection attempt fails.
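If one of your own scripts needs to wait for Elasticsearch as well (for example, a script that seeds test data), you can apply the same idea yourself. Here is a minimal polling sketch in Python using the requests library; the endpoint and the two-minute timeout are just illustrative choices.

import time
import requests  # assumed to be installed: pip install requests

ES_URL = "http://localhost:9200"  # the port we mapped in docker-compose.yml

def wait_for_elasticsearch(timeout_seconds=120):
    """Poll the cluster health endpoint until Elasticsearch answers or we give up."""
    deadline = time.time() + timeout_seconds
    while time.time() < deadline:
        try:
            response = requests.get(f"{ES_URL}/_cluster/health", timeout=5)
            if response.ok:
                print("Elasticsearch is ready:", response.json().get("status"))
                return
        except requests.RequestException:
            pass  # the container is still starting up
        time.sleep(5)
    raise TimeoutError("Elasticsearch did not become ready in time")

wait_for_elasticsearch()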
Okay, so where were we? Let's now run docker-compose up after navigating to the directory where the docker-compose.yml file lives, and see if Kibana has started.
Usually, you have to wait around two minutes for the services to come up, and even longer if this is the first time you pull the images.
After that, just open your browser and go to http://localhost:5601/, as this is the port we specified in our YAML file.
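If the page does not load right away, you can also poll Kibana's status endpoint from code instead of refreshing the browser. A tiny sketch, again with the requests library; the endpoint returns 503 while Kibana is still starting up.

import requests

# Returns 200 once Kibana is fully up; earlier it may refuse the connection or return 503.
response = requests.get("http://localhost:5601/api/status", timeout=5)
print(response.status_code)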
If everything went right, you should be greeted by Kibana's home page. Now let's see how to view our indices and maybe create a new one.
Index Management
To add indices, delete them, or manage their documents, head to Stack Management under Management in the left menu.
After that, click on Index Management and you will see a list of all your current indices.
From here you can also create a new index, although if you care about the schema it is better to create the index with explicit mappings, because this will allow you to control the schema based on a model you have in the domain layer. Creating an index from Kibana will leave your mappings empty, and this is something you can see for yourself if you open the Mappings tab of your index.
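If you do want explicit mappings, one option is to create the index through Elasticsearch's REST API instead of the Kibana UI. Below is a minimal sketch with Python and the requests library; the products index and its two fields are purely hypothetical examples.

import requests

ES_URL = "http://localhost:9200"
INDEX = "products"  # a hypothetical index name, just for illustration

# Create the index with explicit mappings instead of leaving them empty.
body = {
    "mappings": {
        "properties": {
            "name": {"type": "text"},
            "price": {"type": "float"},
        }
    }
}
response = requests.put(f"{ES_URL}/{INDEX}", json=body)
print(response.json())  # should contain "acknowledged": true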
So, let's add a document and see if the mappings really get inferred from the inserted document or not.
Create Document ➕
All you need to do is just click on Discover Index.
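If you prefer to do the same thing from code, indexing one document through the REST API is enough: Elasticsearch adds a dynamic mapping for every field it has not seen before. A sketch reusing the hypothetical products index from above:

import requests

ES_URL = "http://localhost:9200"
INDEX = "products"  # the same hypothetical index as before

# Index a document; the in_stock field is not in our explicit mappings,
# so Elasticsearch will infer a mapping for it dynamically.
doc = {"name": "Mechanical keyboard", "price": 79.99, "in_stock": True}
response = requests.post(f"{ES_URL}/{INDEX}/_doc", json=doc)
print(response.json()["result"])  # "created"

# Fetch the mappings to confirm that the new field was picked up.
mappings = requests.get(f"{ES_URL}/{INDEX}/_mapping").json()
print(mappings[INDEX]["mappings"]["properties"])

If you open the Mappings tab of the index again afterwards, you should see the inferred field show up there as well.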
Analyze Documents
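Discover is the most convenient place to browse and filter the documents of an index interactively, but you can reproduce a quick check from code as well. One last sketch, running a simple match query against the hypothetical products index:

import requests

ES_URL = "http://localhost:9200"

# A basic full-text query; adjust the field and term to match your own data.
# Freshly indexed documents become searchable after the next refresh (about one second).
query = {"query": {"match": {"name": "keyboard"}}}
response = requests.post(f"{ES_URL}/products/_search", json=query)

for hit in response.json()["hits"]["hits"]:
    print(hit["_score"], hit["_source"])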
Now that we have looked at Elasticsearch and Kibana, it seems that we're missing something to complete our ELK stack: Logstash. Let's discuss it in a later post. But until then, enjoy playing around with Kibana!