Working with the Log Explorer

Your organization likely generates large amounts of log data and events through applications, networks, systems, and users, and needs a systematic process to manage and monitor that data across log files. Log management is a continuous process of centrally collecting, parsing, storing, analyzing, and disposing of log data to provide actionable insights for troubleshooting, performance enhancement, and security monitoring.

Log files store computer-generated data that serves as a primary source of information about activities, usage patterns, and operations on your applications, servers, and other devices. Use log files to identify and analyze situations where applications, networks, and systems experience bottlenecks or performance issues. Because log files record every action in detail, they provide insights for identifying the root causes of anomalies and problems.

Managing log files requires collecting data from multiple log sources. These are the most common types of log files:

  • System logs - logs that record events generated within an OS, such as driver errors or CPU usage.

  • Application logs - logs generated when an event occurs inside an application. Use application logs to measure and understand how your application functions after releasing it or during the development cycle.

  • Security logs - logs generated when security events such as unsuccessful login attempts, failed authentication requests, or password changes occur in your organization.

In the Getting Started with Observe tutorial, you investigated log events for a Kubernetes container Dataset using a Worksheet and OPAL. Log Explorer lets you locate the logs in a Dataset without first creating a Worksheet and filtering for them.

To start using the Log Explorer, log into your Observe instance and locate the Logs icon on the left navigation bar. Click the icon to display the following interface:

Log Explorer Interface

Figure 1 - Log Explorer Interface

Quick Start Using Kubernetes Container Logs

This quick tutorial builds on the Getting Started with Observe material, but instead of looking for errors in the logs, you want to know about the error responses that occur in the logs.

Log into your Observe instance and use the following steps to search for errors in your Kubernetes Container Logs:

  1. From the left navigation bar, under Investigate, click Logs.

  2. In the Search log datasets field, enter Container Logs, and select it from the search results.

  3. From the Filters list for the Dataset, select frontend to view only errors from the front end.

  4. In the Filter field, enter error. Corresponding entries in the Container Logs appear highlighted in the log column.

Container Logs with frontend and error Filters

Figure 2 - Container Logs with frontend and error Filters

  5. Filter out Unimplemented, as you don’t need this data for analysis.

  6. Filter for log=customerID and review errors associated with customers.
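The filters built in the steps above can also be expressed directly as an OPAL pipeline. The following is an illustrative sketch only; the exact field names (such as container and log) depend on your Dataset’s schema:

```
// Show only frontend logs that mention "error",
// exclude "Unimplemented" entries, then focus on customer IDs.
filter container = "frontend"
filter log ~ error
filter log !~ Unimplemented
filter log ~ customerID
```

Building the same filters through the UI and through OPAL produces equivalent queries, so you can start in the Filters list and switch to OPAL when you need more precision.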

Live Mode


Note: Only customers with usage-based pricing can access this feature.

Click Live Mode when viewing logs to see your logs stream into Observe as they arrive. You can filter the logs and generate visualizations that continuously update with new data.

Live Mode

Figure 5 - Enabling Live Mode

Since Live Mode increases your credit usage, you may want to disable it unless you are actively troubleshooting an ongoing issue. Live Mode is automatically disabled after 15 minutes, and using the Time Scrubber feature also automatically disables it.

For Log Explorer, you can select a Live Mode duration of 5 minutes, 10 minutes, or 15 minutes.

When you enable Live Mode and click the Query icon, you see information about the query similar to the following image.

Live Mode Query Details

Figure 6 - Live Mode Query Details

  • Latest data received - the time that data required for the query most recently arrived on the Observe instance but has not yet been processed.

  • Latest data available to query - the latest system time at which new data was processed and became available for Observe to query. Live Mode users can typically expect between 30 and 90 seconds of latency from source to screen, depending on data rate and agent configuration.

These two timestamps may differ slightly because data arrives on the Observe instance before it is processed and becomes available to query.

Log Explorer Overview

The left menu displays a Search field that allows you to search for specific Log Datasets.

Log Explorer displays a list of available Log Datasets. If you don’t see the desired Dataset, use the Search function to locate it.

The left menu also displays a list of Filters to use with the currently selected Logs dataset. When you select a Filter, the Filter appears in the Query Builder.

In the center panel, you can build filters using the Query Builder and select from a dropdown list of parameters, or you can select OPAL and use OPAL to build your filter list.
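For example, a filter you assemble in the Query Builder has a direct OPAL equivalent that you can enter on the OPAL tab instead. A minimal sketch, with the field name assumed for illustration:

```
// Equivalent of selecting "frontend" in the Query Builder.
filter container = "frontend"
```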

Click Logs to display the datasets as log events. Selecting a row from the list of Logs displays the Log details in the right panel.

While in Logs mode, you can use column formatting tools to filter, sort, and visualize the data. Right-click a column header to display options for working with data in that column. The available options depend on the data type in the column.

  • Filter - filter data by a single parameter.

  • Remove empty cells - remove empty cells from the column.

  • Create as visualization - create a visual representation of data in a column.

  • Create summary - add a summary of the data in a column.

  • Extract from string - extract parameters depending on the type of string.

  • Sort A -> Z (ascending order)

  • Sort Z -> A (descending order)

  • Conditional Formatting - apply color and style to string or numeric column types based on conditional formatting rules.

  • Hide column - hide the column from view.

  • Convert - convert the column data to one of the following types:

    • int - integer

    • float - floating-point type

    • time - timestamp, date, time, interval

    • JSON - JavaScript Object Notation type

  • Add parameters

    • Existing - add existing parameters to the column data.

    • Create new - create a new parameter.

  • Add to resource

    • Datasets - add the column data to an existing Dataset.

    • Add to new - add the column data to a new Dataset.

  • Link to other dataset

    • Datasets - link the column data to a listed Dataset.

    • View more - list more Datasets.

Click Table to display the Log Events in a read-only table format.

Click Visualize to display the Log Events as a visualization. You can then build an expression from the displayed parameters.

From the Type menu, select from the following (see the Visualization Types Reference for more detail):

  • Line Chart

  • Bar Chart

  • Stacked Area

  • Single Stat

  • Pie Chart

  • List

  • Value Over Time

  • Geographic

  • Choropleth

From the Plot menu, select from the following:

  • over time

  • summary

  • as is

From the using function menu, select from the following:

  • Any

  • Any Not Null

  • Array Aggregation

  • Array Aggregation with duplicates remembered

  • Average

  • Count Distinct Exact

  • Count Distinct Fast

  • Count Values

  • Deriv

  • First

  • First Not Null

  • Last

  • Last Not Null

  • Maximum

  • Median

  • Median Exact

  • Minimum

  • Percentile

  • Rate

  • Standard Deviation

  • Sum

From the of menu, select from the displayed parameters. This list varies from Dataset to Dataset.

From the by menu, select from the displayed parameters. This list varies from Dataset to Dataset.

The top right contains an Actions button that allows you to perform the following tasks:

  • Create monitor - create a Monitor to watch your Log Dataset for desired events.

  • Add to dashboard - use the Log Dataset to create a new Dashboard.

  • Open data in worksheet - open the data in a Worksheet to further model and refine the data.

Actions Menu Options

Figure 7 - Select From a list of Actions

You can also select a time range for your data using the menu to the left of Actions.

Select a time range for your data

Figure 8 - Select a Time Range for your data

When you right-click a line in a Time Series visualization, a menu displays the following options:

  • Show this data only - display only that graph line in the visualization.

  • Exclude this data - remove the line’s data from the visualization.

  • Copy - Copy the graph line.

  • Inspect - Inspect the data for the graph line.

  • For selected resource - Displays the related resource which you can open in a new window.

  • View related - View the following related data in new windows:

    • Dashboard

    • Metrics

    • Logs

Options for a single graph line

Figure 9 - Select From a List of Options for a Single Graph Line

Exporting Data

To download the data displayed in Log Explorer, click the Export button. You can select CSV or JSON format and a maximum size limit of 1,000, 10,000, or 100,000 rows. Note that hidden fields are included in the export. Use the pick_col OPAL verb to reduce the width of the downloaded data.
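As a minimal sketch of narrowing an export with pick_col, assuming the Dataset has timestamp, container, and log columns (substitute your own column names):

```
// Keep only the columns you need before exporting.
pick_col timestamp, container, log
```

Running this in the OPAL panel before clicking Export keeps the downloaded CSV or JSON to just the selected columns.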