{"id":3118,"date":"2019-05-28T22:36:37","date_gmt":"2019-05-28T19:36:37","guid":{"rendered":"https:\/\/kifarunix.com\/?p=3118"},"modified":"2019-05-28T22:36:38","modified_gmt":"2019-05-28T19:36:38","slug":"monitor-squid-logs-with-grafana-and-graylog","status":"publish","type":"post","link":"https:\/\/kifarunix.com\/monitor-squid-logs-with-grafana-and-graylog\/","title":{"rendered":"Monitor Squid logs with Grafana and Graylog"},"content":{"rendered":"\n
In this guide, we are going to learn how to monitor Squid logs with Grafana and Graylog. You can check our other guides on installing Graylog, forwarding Squid logs to Graylog and creating Graylog Squid log field extractors by following the links below;<\/p>\n\n\n\n
Install Graylog 3.0 on CentOS 7<\/a><\/p>\n\n\n\n Monitor Squid Access Logs with Graylog Server<\/a><\/p>\n\n\n\n Create Squid Logs Extractors on Graylog Server<\/a><\/p>\n\n\n\n Grafana is an open-source tool for visualizing data collected from different types of data stores such as Prometheus, InfluxDB, Elasticsearch, Graphite, MySQL and several other databases. To integrate it with Graylog, we will use Elasticsearch as our Grafana datasource.<\/p>\n\n\n\n To learn how to install Grafana on Ubuntu, Debian or Fedora, see the links below;<\/p>\n\n\n\nMonitor Squid logs with Grafana and Graylog<\/h2>\n\n\n\n
Configure Elasticsearch Remote Connection<\/h3>\n\n\n\n On the Graylog server, Elasticsearch listens on localhost by default. To allow remote connections, you need to define an interface IP address for the network.host<\/strong> parameter.<\/p>\n\n\n\n
vim \/etc\/elasticsearch\/elasticsearch.yml<\/code><\/pre>\n\n\n\n
...\nnetwork.host: GRAYLOG_SERVER_IP\n...<\/code><\/pre>\n\n\n\n If a firewall is running, open the Elasticsearch port, 9200\/tcp. On CentOS\/RHEL with firewalld;<\/p>\n\n\n\n
firewall-cmd --add-port=9200\/tcp --permanent\nfirewall-cmd --reload<\/code><\/pre>\n\n\n\n
On Ubuntu or Debian with UFW, allow the Grafana server to connect;<\/p>\n\n\n\n ufw allow from Grafana_Server_IP<\/strong> to any port 9200 proto tcp<\/code><\/pre>\n\n\n\n Restart Elasticsearch for the changes to take effect;<\/p>\n\n\n\n
systemctl restart elasticsearch<\/code><\/pre>\n\n\n\n
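Once restarted, you can confirm that Elasticsearch is now bound to the interface IP rather than localhost. A quick sanity check with the ss<\/strong> utility;<\/p>\n\n\n\n
ss -altnp | grep 9200<\/code><\/pre>\n\n\n\n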
Verify Elasticsearch Connection<\/h3>\n\n\n\n From the Grafana server, verify the connection to Elasticsearch by running the command below;<\/p>\n\n\n\n
curl http:\/\/Graylog_IP_OR_HOSTNAME:9200\n{\n  \"name\" : \"x55YNL_\",\n  \"cluster_name\" : \"graylog\",\n  \"cluster_uuid\" : \"CQBqPDoCRKW7tt955kq5Uw\",\n  \"version\" : {\n    \"number\" : \"6.8.0\",\n    \"build_flavor\" : \"default\",\n    \"build_type\" : \"rpm\",\n    \"build_hash\" : \"65b6179\",\n    \"build_date\" : \"2019-05-15T20:06:13.172855Z\",\n    \"build_snapshot\" : false,\n    \"lucene_version\" : \"7.7.0\",\n    \"minimum_wire_compatibility_version\" : \"5.6.0\",\n    \"minimum_index_compatibility_version\" : \"5.0.0\"\n  },\n  \"tagline\" : \"You Know, for Search\"\n}<\/code><\/pre>\n\n\n\n
Create Graylog Squid Logs Elasticsearch Index Set<\/h3>\n\n\n\n
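As a side note, you can also list the index sets that already exist on your Graylog via its REST API. This is a minimal sketch; the hostname and admin credentials below are placeholders for your own;<\/p>\n\n\n\n
curl -u admin:PASSWORD -H 'Accept: application\/json' http:\/\/graylog.example.com:9000\/api\/system\/indices\/index_sets<\/code><\/pre>\n\n\n\n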
To create an index set, navigate to System > Indices<\/strong> and hit Create index set<\/strong>. On the index set configuration page, set the index name, description, a unique index prefix to use in Elasticsearch, the number of Elasticsearch shards and the index rotation strategy.<\/p>\n\n\n\n
Once you are done, click Save<\/strong> to save the index set.<\/p>\n\n\n\n To verify the index name to use for your Elasticsearch datasource;<\/p>\n\n\n\n
curl -XGET graylog.example.com:9200\/_cat\/indices?v\nhealth status index uuid pri rep docs.count docs.deleted store.size pri.store.size\ngreen open squidaccess_0<\/strong> EiMgXL2UQqWym-5VZ-atDg 1 0 8859 0 1.9mb 1.9mb<\/code><\/pre>\n\n\n\n Our index in this case is squidaccess_0<\/strong>. Note that using the Graylog Elasticsearch indices directly may cause issues due to their constant rotation. We will look at a possible workaround in our next guide.<\/p>\n\n\n\n If the connection to Elasticsearch from the Grafana server is okay, proceed to create the Grafana Elasticsearch datasource. To add a Grafana datasource, navigate to Configuration > Data Sources<\/strong>.<\/p>\n\n\n\n
Add Grafana Datasource<\/h3>\n\n\n\n
Click Add data source<\/strong> and choose Elasticsearch. Under the Elasticsearch datasource settings, set the name of the datasource, the URL of the Graylog Elasticsearch server, the Elasticsearch index name as defined in the Graylog index set above (squidaccess_0<\/strong> in our case) and the time field name (timestamp<\/strong>).<\/p>\n\n\n\n
Next, click Save & Test<\/strong> to test the connection to the Elasticsearch datasource. If everything is fine, you should get an Index Ok<\/strong> message.<\/p>\n\n\n\n
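If you prefer to manage Grafana configuration as code, the same datasource can also be defined in a provisioning file instead of the web UI. Below is a minimal sketch, assuming Grafana 6.x and the example values used in this guide; adjust the URL and index name to match your setup;<\/p>\n\n\n\n
# \/etc\/grafana\/provisioning\/datasources\/graylog-es.yml\napiVersion: 1\n\ndatasources:\n  # Elasticsearch datasource pointing at the Graylog-managed index\n  - name: Graylog-Squid\n    type: elasticsearch\n    access: proxy\n    url: http:\/\/graylog.example.com:9200\n    database: squidaccess_0\n    jsonData:\n      timeField: timestamp\n      esVersion: 60<\/code><\/pre>\n\n\n\n
Grafana reads provisioning files on startup, so restart the grafana-server service after adding one.<\/p>\n\n\n\n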
Create Grafana Dashboard for Squid Logs<\/h3>\n\n\n\n Once you have your Graylog Elasticsearch datasource added to Grafana, you need to create the dashboards for visualizing the data. This involves creating various queries for the different panels you may want to have. You can also import a ready-made dashboard.<\/p>\n\n\n\n To create a new Grafana dashboard or import one, click on the HOME<\/strong> dropdown on the top left corner and choose whether to import a dashboard JSON file or create a new one.<\/p>\n\n\n\n
For example, based on our Graylog Squid log extractors, below is a simple dashboard that we have created.<\/p>\n\n\n\n
Below are the panels that make up this dashboard.<\/p>\n\n\n\n Total Traffic:<\/p>\n\n\n\n
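This panel sums the Squid response size over time. Here is a sketch of the panel query settings; note that response_size<\/strong> is an assumed field name, substitute whatever your Graylog extractor actually stores;<\/p>\n\n\n\n
Query:     _exists_:response_size\nMetric:    Sum (response_size)\nGroup by:  Date Histogram (timestamp)<\/code><\/pre>\n\n\n\n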
Top Sites:<\/p>\n\n\n\n
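For the top sites, a Terms aggregation over the requested domain does the job; again, request_url<\/strong> is an assumed field name from the extractors;<\/p>\n\n\n\n
Query:     _exists_:request_url\nMetric:    Count\nGroup by:  Terms (request_url), Order: Top, Size: 10<\/code><\/pre>\n\n\n\n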
Top 10 Denied Sites:<\/p>\n\n\n\n
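Denied requests can be filtered on the Squid result status before aggregating. Squid marks denials with result codes such as TCP_DENIED; squid_request_status<\/strong> is an assumed field name here, substitute your own;<\/p>\n\n\n\n
Query:     squid_request_status:*DENIED*\nMetric:    Count\nGroup by:  Terms (request_url), Order: Top, Size: 10<\/code><\/pre>\n\n\n\n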