I did something very similar with OpenSearch rather than Grafana, but it’s definitely possible. My setup:
- fluent-bit installed on the webserver to scrape and parse the nginx logs, then forward them over TLS to the monitoring server (rough config sketch after this list)
- on the monitoring server, a second fluent-bit service collects the forwarded logs and inserts them into the correct index pattern; a filter also adds GeoIP lookups to the records (second sketch below)
- OpenSearch & Dashboards set up to exclude known “bot” user-agents from the analytics, plus some other basic data cleanup to make the dashboards pretty
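
For reference, the webserver side looks roughly like this. It’s a minimal sketch, not my exact config: the hostname and port are placeholders, and it assumes the stock `nginx` parser that ships in fluent-bit’s `parsers.conf`:

```ini
# fluent-bit.conf on the webserver (sketch; hostname/port are placeholders)
[SERVICE]
    Parsers_File parsers.conf

[INPUT]
    Name    tail
    Path    /var/log/nginx/access.log
    Tag     nginx.access
    Parser  nginx

[OUTPUT]
    Name        forward
    Match       nginx.*
    Host        monitor.example.com
    Port        24224
    tls         on
    tls.verify  on
```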
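
And the receiving side, again as a sketch rather than my exact config: the `geoip2` filter and `opensearch` output are standard fluent-bit plugins, but the cert paths, GeoLite2 database location, and index name are placeholders, and the field names (`remote`, `agent`) assume the stock nginx parser from above. I did the bot filtering in Dashboards, but a `grep` filter like the one below would drop them at ingest instead:

```ini
# fluent-bit.conf on the monitoring server (sketch; paths/names are placeholders)
[INPUT]
    Name          forward
    Listen        0.0.0.0
    Port          24224
    tls           on
    tls.crt_file  /etc/fluent-bit/tls/server.crt
    tls.key_file  /etc/fluent-bit/tls/server.key

# enrich each record with GeoIP data from the client address
# ("remote" is where the stock nginx parser puts the client IP)
[FILTER]
    Name        geoip2
    Match       nginx.*
    Database    /etc/fluent-bit/GeoLite2-City.mmdb
    Lookup_key  remote
    Record      country remote %{country.names.en}
    Record      city    remote %{city.names.en}

# optional: drop obvious bots at ingest instead of filtering in Dashboards
[FILTER]
    Name     grep
    Match    nginx.*
    Exclude  agent (?i)(bot|crawler|spider)

[OUTPUT]
    Name        opensearch
    Match       nginx.*
    Host        127.0.0.1
    Port        9200
    Index       nginx-access
    Suppress_Type_Name On
```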
It works well, though admittedly it could be a bit simpler. You could use Loki instead of OpenSearch/Elasticsearch, and there are plenty of other log-parsing tools out there.
Another, much simpler option is to just run GoAccess on your log files, either periodically to generate reports or as a daemon to serve a live dashboard.
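
For example (assuming the standard combined log format):

```sh
# one-off HTML report
goaccess /var/log/nginx/access.log --log-format=COMBINED -o report.html

# or a self-updating live dashboard (static HTML kept fresh over a websocket)
goaccess /var/log/nginx/access.log --log-format=COMBINED \
    -o /var/www/html/report.html --real-time-html --daemonize
```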