This evening I wanted to see what is written in the firewall log of my PC, which is inside the family LAN. As I had selected “Log only critical” events in the firewall, I expected an empty or short list … and was shocked to see how long this list was.
But I was even more shocked by the content of the list … whatever the problem was that the firewall had detected, I was not able to read or understand this log at all.
I have the feeling that this log is not meant to be read by humans - it is something for a number cruncher.
Without a tool that can translate it into a form understandable to a human being, this log is a waste of processor time and storage capacity - completely useless!
And it would mean that I simply have to “believe” that the firewall is well configured and doing its job - no feedback, no way to verify - chaos!
How do you guys analyse your firewall logs?
Nobody? Don’t you guys check your logs from time to time?
From what I understand, these logs contain a lot of information, some of it redundant, but they were not made for humans. Which software do you use to read these logs?
I guess those of us who have no trouble understanding them aren’t humans.
I grep the data I find especially interesting out of them.
I used to grep my logs, too.
If there was a specific block of entries I wanted to study more closely, I sometimes imported them into a spreadsheet.
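For what it’s worth, a minimal sketch of that grep-to-spreadsheet workflow, assuming the key=value style the netfilter LOG target writes into the kernel log (the sample line, its SFW2 prefix, and the addresses are made-up assumptions - your actual log will differ):

```shell
# Hypothetical sample log line in netfilter LOG style (an assumption):
line='Jun 10 20:33:12 host kernel: SFW2-INEXT-DROP-DEFLT IN=eth0 OUT= SRC=192.168.1.50 DST=192.168.1.10 LEN=60 PROTO=TCP SPT=51515 DPT=22'

# Pull out the fields of interest and emit one CSV row per log line,
# ready for import into a spreadsheet.
csv=$(echo "$line" | awk '{
  for (i = 1; i <= NF; i++) {
    split($i, kv, "=")
    if (kv[1] == "SRC")   src = kv[2]
    if (kv[1] == "DST")   dst = kv[2]
    if (kv[1] == "PROTO") proto = kv[2]
    if (kv[1] == "DPT")   dpt = kv[2]
  }
  print src "," dst "," proto "," dpt
}')
echo "$csv"
# prints: 192.168.1.50,192.168.1.10,TCP,22
```

Run the same awk over the whole log file instead of a single line and the resulting CSV imports cleanly, with one column per field.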
Depending on the log format, sometimes they can be read in a viewer of some sort.
I’m using a leading-edge solution which is probably beyond the reach of the average user (unless that user wanted to commit at least a week or so to this <very> useful solution), which is to import the data into a Hadoop-style cluster.
I don’t run the standard Hadoop stack, which is fairly complex and requires a lot of things to be set up correctly before you can even start to play with the data; instead, I deploy Elasticsearch with various optional components.
The features of this type of approach:
- The data is completely flat and unstructured (vs. a database like MySQL), so it’s like a gigantic spreadsheet. No structures get in your way. Everything is indexed and searchable, limited only by your imagination.
- Although the data is completely flat, you can prepare the data to respond more quickly to various types of queries if you already know ahead of time what you will be querying.
- There is no limit to data size. You don’t have to deploy on a single machine; you deploy across multiple nodes in a cluster. This gives you flexibility in redundancy, reliability and availability by distributing copies of each data point to different nodes. And this is mostly automated. Unlike traditional cluster building, Elasticsearch manages and tracks its data based on a single config file and doesn’t require intimate configuration of every node. So, for example, if you want to add a node to enlarge your cluster, you typically just boot up the node running ES and everything that follows happens automatically (scanning/recognizing/joining the cluster, followed by re-distribution of data to the new node).
- Queries are typically done through a web form, although of course they can also be done from the command line. You need to learn the query syntax, which is not too difficult. Nothing pre-built is likely to work.
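To illustrate the command-line route, here is a sketch of such a query via curl. It obviously needs a running cluster, and the index name and field names (“firewall-logs”, “action”, “src_ip”) are assumptions about your setup, not anything standard:

```shell
# Hedged sketch: count DROP entries and aggregate the top source IPs.
# Index and field names are assumptions - adapt them to your own mapping.
curl -s -X POST 'http://localhost:9200/firewall-logs/_search' \
  -H 'Content-Type: application/json' \
  -d '{
        "size": 0,
        "query": { "match": { "action": "DROP" } },
        "aggs": { "top_sources": { "terms": { "field": "src_ip" } } }
      }'
```

The same query can be pasted into Kibana’s query console instead of curl.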
This is the real future for all types of log searching and, if you want to take it a step further, analysis.
There are also a number of frontends which can be used to display standard collections of data; I use Graphite as well as the recommended frontend, Kibana.
If you want to read a bit more, I describe how to install Elasticsearch on openSUSE, plus a few more things, at the following link
Many of the articles are getting a bit old (written >6 months ago, and the technology is changing <very> quickly)
Thanks for your answers.
Grep! Hmmm? I always thought I was working on the user/admin layer of the openSUSE Linux “OSI layer” model. But now I have the feeling that logs in openSUSE are just a tiny step away from the hex & binary wizards’ layer … I am wondering whether this is also the case with the commercial SUSE servers?
I tried to import that log into a LibreOffice spreadsheet - since the firewall developers avoided using clear single-character separators or padding to ensure equal-length entries, it led to a useless result. For a second or two it looked a bit better than the raw log, though.
Guess I have to write my own program to make this log usable. But where can I find the legend that describes the content of the fields in the log? Some are obvious, but what about those that are not?
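One quick way to at least enumerate which fields my log actually contains, as a starting point for my own legend - a sketch assuming the key=value style of the kernel log (the sample line is an assumption):

```shell
# Extract every KEY= token from a log line and list the unique field
# names (the sample line itself is made up for illustration).
line='Jun 10 20:33:12 host kernel: SFW2-INEXT-DROP-DEFLT IN=eth0 OUT= SRC=192.168.1.50 DST=192.168.1.10 PROTO=TCP SPT=51515 DPT=22'
fields=$(echo "$line" | grep -oE '[A-Z]+=' | tr -d '=' | sort -u)
echo "$fields"
```

Run over the whole log instead of one line, this yields the complete set of field names the firewall ever emits.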
I tried to find some information on the internet, searching for <“Suse Firewall” log format>, <“Suse Firewall” log legend> and <“Suse Firewall” log explained>, but I got nothing of value. If you have some information about it, please feel free to share it.
But while searching I found the picoFirewall … any other firewall suggestions that would solve my log problem?
TIA , Joe
Information for package ufw:
Status: not installed
Installed Size: 671.0 KiB
Summary: Uncomplicated Firewall
The Uncomplicated Firewall(ufw) is a front-end for netfilter, which
aims to make it easier for people unfamiliar with firewall concepts.
Ufw provides a framework for managing netfilter as well as
manipulating the firewall.
and its GUI tool, gufw.
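If you try ufw, the basics would look something like this - a sketch only (it needs root, and enabling a firewall over a remote session without an allow rule first can lock you out); its log lines carry a readable [UFW BLOCK] prefix:

```shell
# Sketch of basic ufw usage (requires root; allow SSH before enabling
# so you don't lock yourself out of a remote machine).
sudo ufw allow ssh        # permit incoming SSH first
sudo ufw enable           # turn the firewall on
sudo ufw logging medium   # raise the log level (off/low/medium/high/full)
sudo ufw status verbose   # verify the active rules and logging level
```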