ELK Dashboard for Pihole Logs

Published: 2019-12-29
Last Updated: 2019-12-29 19:48:45 UTC
by Guy Bruneau (Version: 1)
4 comment(s)

In my last Pihole diary, I shared a Pihole parser to collect its logs and store them in Elasticsearch. In this diary, I'm sharing a dashboard to visualize the Pihole DNS data. Here is some of the output from the dashboard.

Pihole Overall

Pihole Dashboard

Pihole Regex List Match

This is the output from the blocklist for regex and wildcard blocking.

Pihole Regex

Pihole Gravity List Match

This is the output from the blocklists generated by Pi-hole Gravity.

Pihole Gravity

The JSON dashboard file can be downloaded here.

[1] https://isc.sans.edu/diary/25582
[2] https://handlers.sans.edu/gbruneau/elk/pihole.conf
[3] https://handlers.sans.edu/gbruneau/elk/pihole_graphs.ndjson
[4] https://www.elastic.co/

Guy Bruneau IPSS Inc.
My Handler Page
Twitter: GuyBruneau
gbruneau at isc dot sans dot edu



I have a similar setup which is partially working, I have a few questions.

1. What do you use to forward the syslog data from the Pihole?
I have remote_syslog2 (https://github.com/papertrail/remote_syslog2) sending data to my home SOF-ELK instance. It works, but not all of my log data is parsed.
2. Can I use remote_syslog2 in combination with this conf file? I'm still new to the ELK stack and to setting up these grok parsers, and I haven't figured out where to put this conf file. Or maybe I have it in the right spot, but it's not sending the data in the right format.
3. Is it possible to run filebeat on a Raspberry Pi? I've tried but failed with some online guides.
I have installed filebeat on CentOS, and the relevant parts of the filebeat.yml configuration are:

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - "/var/log/pihole.log"

output.logstash:
  hosts: [""]

Refer to this diary: https://isc.sans.edu/diary/25582. I have published a logstash parser (to add under conf.d/pihole.conf) to send the data to ELK, which can be downloaded here: https://handlers.sans.edu/gbruneau/elk/pihole.conf

I haven't tried to set up filebeat on a Raspberry Pi, but maybe someone else can answer that.
Thank you! I hadn't considered setting up my Pihole on something other than a Raspberry Pi.
To create an ELK dashboard on localhost for Pihole logs, you can follow these general steps:

Install Elasticsearch, Logstash, and Kibana on your system.

Configure Logstash to read in Pihole logs. You can use the file input plugin to read in log files, and the grok filter plugin to parse the log lines and extract relevant information.
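As a minimal sketch of that step, the pipeline below reads /var/log/pihole.log with the file input and parses dnsmasq-style query lines with grok. The field names (query_type, domain, client) and the index name are illustrative assumptions, not the ones used in the author's published pihole.conf:

```conf
# Hypothetical Logstash pipeline sketch for Pi-hole's dnsmasq-format log.
input {
  file {
    path => "/var/log/pihole.log"
    start_position => "beginning"
  }
}
filter {
  grok {
    # Matches lines like:
    # Dec 29 19:48:45 dnsmasq[1234]: query[A] example.com from 192.168.1.10
    match => {
      "message" => "%{SYSLOGTIMESTAMP:timestamp} dnsmasq\[%{POSINT:pid}\]: query\[%{WORD:query_type}\] %{HOSTNAME:domain} from %{IP:client}"
    }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "pihole-%{+YYYY.MM.dd}"
  }
}
```

Non-query lines (replies, cached answers, blocks) would need additional grok patterns; this only covers the basic query format.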

Define an Elasticsearch index template for the Pihole logs. This will ensure that the logs are indexed properly and that the correct field types are assigned.
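A minimal sketch of such a template, in the legacy format (applied with PUT _template/pihole) and assuming the hypothetical field names above, could look like this:

```json
{
  "index_patterns": ["pihole-*"],
  "mappings": {
    "properties": {
      "timestamp":  { "type": "date" },
      "query_type": { "type": "keyword" },
      "domain":     { "type": "keyword" },
      "client":     { "type": "ip" }
    }
  }
}
```

Mapping domain and query_type as keyword (rather than analyzed text) is what makes them usable in terms aggregations for the dashboard visualizations.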

Create visualizations in Kibana to display the data from the Pihole logs. For example, you might create a bar chart that shows the top blocked domains or a line chart that shows the number of queries over time.
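Under the hood, a "top blocked domains" bar chart runs a terms aggregation against Elasticsearch. A sketch of the equivalent query, assuming the hypothetical domain field above, would be:

```json
{
  "size": 0,
  "aggs": {
    "top_domains": {
      "terms": { "field": "domain", "size": 10 }
    }
  }
}
```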

Organize the visualizations into a dashboard that provides an overview of the Pihole logs. You might include visualizations for top clients, top domains, and top queries, for example.
