Using Loki in Grafana
Add it as a data source and you are ready to build dashboards or query your log data in Explore. Refer to Add a data source for instructions on how to add a data source to Grafana. Only users with the organization admin role can add data sources.
You can run Loki on your own hardware or use Grafana Cloud. The free forever plan includes Grafana, 50 GB of Loki logs, 10K Prometheus series, and more. Create a free account to get started.
Loki settings
To access Loki settings, click the Configuration (gear) icon, then click Data Sources, and then click the Loki data source.
The Derived Fields configuration allows you to:
- Add fields parsed from the log message.
- Add a link that uses the value of the field.
For example, you can use this functionality to link to your tracing backend directly from your logs, or link to a user profile page if a userId is present in the log line. These links appear in the log details.
Each derived field consists of:
- Name - Shown in the log details as a label.
- Regex - A Regex pattern that runs on the log message and captures part of it as the value of the new field. Can only contain a single capture group.
- URL/query - If the link is external, enter the full link URL. If the link is internal, this input serves as the query for the target data source. In both cases, you can interpolate the value from the field with the ${__value.raw} macro.
- URL Label - (Optional) Set a custom display label for the link. The link label defaults to the full external URL or the name of the linked internal data source and is overridden by this setting.
- Internal link - Select whether the link is internal or external. For an internal link, a data source selector lets you choose the target data source. Only tracing data sources are supported.
You can use a debug section to see what your fields extract and how the URL is interpolated. Click Show example log message to show the text area where you can enter a log message.
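For example, assuming your application writes log lines that contain a trace identifier in the form traceID=abc123 (a hypothetical format), a derived field linking to a tracing UI could be configured roughly as follows; the URL and field name are illustrative only:

  Name:       TraceID
  Regex:      traceID=(\w+)
  URL/query:  http://localhost:16686/trace/${__value.raw}

For an internal link to a tracing data source, the URL/query input would instead contain just ${__value.raw}.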
The new field with the link is then shown in the log details.
Loki query editor
Log browser
With the Loki log browser you can easily navigate through your list of labels and values and construct a query of your choice. The log browser uses multi-step selection:
- Choose the labels you would like to consider for your search.
- Pick the values for the selected labels. The log browser supports faceting, so it shows only possible label combinations.
- Choose the type of query: a logs query or a rate metrics query. You can also validate the selector.
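For instance, selecting the labels app and env and picking a value for each produces a selector like the first line below, and choosing the rate metrics query type wraps it in a rate function roughly like the second line (the label names are hypothetical, and the exact query the log browser builds may differ by Grafana version):

  {app="frontend", env="prod"}
  rate({app="frontend", env="prod"}[$__interval])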
There are two types of LogQL queries:
- Log queries - Return the contents of log lines.
- Metric queries - Extend log queries and calculate sample values based on the content of logs from a log query.
Querying and displaying log data from Loki is available via Explore, and with the logs panel in dashboards. Select the Loki data source, and then enter a query to display your logs.
A log query consists of two parts: a log stream selector and a log pipeline. For performance reasons, begin by choosing a log stream by selecting a log label.
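As an illustration, the following LogQL query combines a stream selector with a pipeline that filters and parses the matching lines; the label names and filter values are hypothetical:

  {job="mysql", env="prod"} |= "error" | logfmt | duration > 10s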
Log context
When using a search expression as detailed above, you can retrieve the context surrounding your filtered results. By clicking the Show Context link on the filtered rows, you’ll be able to investigate the log messages that came before and after the log message you’re interested in.
Loki supports live tailing, which displays logs in real time. This feature is supported in Explore.
Note that live tailing relies on two WebSocket connections: one between the browser and the Grafana server, and another between the Grafana server and the Loki server. If you run any reverse proxies, please configure them accordingly. The following example for Apache2 can be used for proxying between the browser and the Grafana server:
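A minimal sketch of such an Apache2 configuration, assuming Grafana listens on 127.0.0.1:3000 and that mod_proxy, mod_proxy_http, and mod_proxy_wstunnel are enabled; the tail endpoint path is an assumption and can vary between Grafana and Loki versions:

  # Proxy the live tailing WebSocket endpoint to Grafana over ws://.
  ProxyPassMatch "^/(api/datasources/proxy/\d+/loki/api/v1/tail)" "ws://127.0.0.1:3000/$1"
  # Proxy all other requests to Grafana over plain HTTP.
  ProxyPass / http://127.0.0.1:3000/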
The following example shows a basic NGINX proxy configuration. It assumes that the Grafana server is available at http://localhost:3000/, the Loki server is running locally without a proxy, and your external site uses HTTPS. If you also host Loki behind an NGINX proxy, you might want to repeat the following configuration for Loki as well.
In the http section of the NGINX configuration, add the following map definition:
map $http_upgrade $connection_upgrade {
  default upgrade;
  '' close;
}
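The map definition covers only half of the setup; a minimal sketch of a corresponding server section, assuming the hostname and certificate paths below are placeholders you replace with your own, could look roughly like this:

server {
  listen 443 ssl;
  server_name grafana.example.com;

  ssl_certificate     /etc/nginx/ssl/grafana.example.com.crt;
  ssl_certificate_key /etc/nginx/ssl/grafana.example.com.key;

  location / {
    proxy_pass http://localhost:3000/;

    # Required for the live tailing WebSocket connection.
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection $connection_upgrade;
  }
}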
Note: This feature is only available in Grafana v6.3+.
Metric queries
LogQL supports wrapping a log query with functions that allow for creating metrics out of the logs. See the LogQL documentation on how to create and use metric queries.
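For instance, a log query can be wrapped in a range aggregation such as rate, and optionally aggregated further; the label names and filter values below are hypothetical:

  rate({job="mysql"} |= "error" [5m])
  sum by (host) (rate({job="mysql"} |= "error" [5m]))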
Templating
Instead of hard-coding things like server, application and sensor name in your metric queries, you can use variables in their place. Variables are shown as drop-down select boxes at the top of the dashboard. These drop-down boxes make it easy to change the data being displayed in your dashboard.
Check out the Templating documentation for an introduction to the templating feature and the different types of template variables.
A variable of the type Query allows you to query Loki for a list of labels or label values. The Loki data source plugin provides the following functions you can use in the Query input field.
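These typically include functions such as the following; verify the exact list against the plugin documentation for your Grafana version:

  label_names()
  label_values(label)
  label_values(log stream selector, label)

For example, label_values({job="mysql"}, instance) would return the values of the instance label for streams matching the selector; the label names here are hypothetical.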
Ad hoc filters variable
Loki supports the special ad hoc filters variable type. It allows you to specify any number of label/value filters on the fly. These filters are automatically applied to all your Loki queries.
You can use some global built-in variables in query variables: $__interval, $__interval_ms, $__range, $__range_s, and $__range_ms. For more information, refer to Global built-in variables.
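As an illustration of how these variables interpolate, a metric query in a dashboard panel might use $__interval as the range selector; the label name and filter value below are hypothetical:

  sum(rate({app="frontend"} |= "error" [$__interval]))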
Annotations
You can use any non-metric Loki query as a source for annotations. Log content will be used as annotation text and your log stream labels as tags, so there is no need for additional mapping.
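For example, a simple log query such as the following (the label and filter values are hypothetical) could serve as an annotation source, marking each matching log line on the dashboard with its content as the annotation text:

  {app="frontend"} |= "deploy"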
Configure the data source with provisioning
You can set up the data source via config files with Grafana’s provisioning system. You can read more about how it works and all the settings you can set for data sources on the provisioning docs page.
Here is an example:
apiVersion: 1

datasources:
  - name: Loki
    type: loki
    access: proxy
    url: http://localhost:3100
    jsonData:
      maxLines: 1000
Here’s another example with basic auth and a derived field. Keep in mind that the $ character needs to be escaped in YAML values because it is used to interpolate environment variables:
datasources:
  - name: Jaeger
    type: jaeger
    url: http://jaeger-tracing-query:16686/
    access: proxy
    # UID should match the datasourceUid in derivedFields.
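The Jaeger entry above is only the link target. A sketch of a matching Loki entry with basic auth and a derived field, sitting alongside the Jaeger entry under datasources, might look roughly like this; the uid, credentials, and traceID log format are placeholder assumptions, and note the doubled $$ used to escape the interpolation character in YAML:

  - name: Loki
    type: loki
    access: proxy
    url: http://localhost:3100
    basicAuth: true
    basicAuthUser: my_user
    secureJsonData:
      basicAuthPassword: my_password
    jsonData:
      maxLines: 1000
      derivedFields:
        # datasourceUid must match the uid set on the Jaeger data source above,
        # for example uid: my_jaeger_uid added to its entry.
        - datasourceUid: my_jaeger_uid
          matcherRegex: "traceID=(\\w+)"
          name: TraceID
          url: "$${__value.raw}"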