Data Ingestion

This section describes how to ingest data into Observe. Observe accepts data in any format. More sources and forwarders may become available in future releases.


Sources

Sources may send data to Observe directly through an outgoing webhook, a forwarder, or other types of agents. The documentation for each source describes the recommended method and any additional installations required.

Examples: AWS CloudWatch logs, Jenkins build logs


Forwarders

Forwarders collect data from a source and send it to Observe. They offer additional features, such as the ability to aggregate data from multiple sources or perform lightweight transformations. Forwarders are useful when the original source has no way to send data to Observe, such as a process that only writes a local log file.

Examples: FluentBit, Prometheus Server
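As an illustration, a forwarder such as FluentBit can tail a local log file and ship it over HTTPS. This is a sketch only: the host, URI, and token below are placeholders, not real Observe values; substitute the endpoint details from your own configuration.

```ini
[INPUT]
    Name  tail
    Path  /var/log/myapp/*.log
    Tag   myapp

[OUTPUT]
    Name    http
    Match   myapp
    Host    collect.example.com
    Port    443
    URI     /v1/http
    Format  json
    tls     On
    Header  Authorization Bearer YOUR_DATASTREAM_TOKEN
```

The `tail` input watches the log file, and the `http` output forwards each record as JSON with the datastream token in the `Authorization` header.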


Datastreams

Observe uses datastreams as a flexible way to manage data ingestion. Each datastream feeds a dataset, and access is managed with unique, revocable tokens.


Endpoints

Endpoints accept the various wire protocols that Observe can ingest. All of the source and forwarder instructions ultimately send data to an endpoint. If you have a custom or heavily customized source, you can configure it to send to the appropriate endpoint directly.

Example: JSON through an HTTP POST
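A minimal sketch of that pattern, using only the Python standard library: it builds (but does not send) an HTTP POST request carrying JSON records. The URL and token are hypothetical placeholders, not real Observe values.

```python
import json
import urllib.request


def build_ingest_request(url, token, records):
    """Build an HTTP POST request carrying a JSON payload.

    The url and token are placeholders; substitute the endpoint and
    datastream token from your own configuration.
    """
    body = json.dumps(records).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
        method="POST",
    )


req = build_ingest_request(
    "https://collect.example.com/v1/http",
    "YOUR_DATASTREAM_TOKEN",
    [{"level": "info", "message": "build finished"}],
)
# Calling urllib.request.urlopen(req) would transmit the payload.
```

Separating request construction from transmission, as above, makes the payload and headers easy to inspect before any data leaves the machine.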

Troubleshooting Data Ingestion

You may encounter issues when ingesting data from different sources into Observe. Refer to Troubleshooting Data Ingestion for possible causes and solutions.

Configurable Data Retention

Observe retains your data for thirteen (13) months by default. To configure a retention period shorter or longer than 13 months, contact Observe Support. Data older than the retention period is automatically deleted from the dataset.

Downstream datasets inherit their retention period from their inputs: the effective retention is the minimum retention period across all input source datastreams. Observe enforces data retention once a day.
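The inheritance rule amounts to taking the minimum retention across a downstream dataset's inputs. A minimal sketch, with hypothetical retention values in months:

```python
def downstream_retention_months(input_retentions):
    """Return the effective retention period for a downstream dataset:
    the minimum retention among all of its input datastreams."""
    if not input_retentions:
        raise ValueError("a downstream dataset needs at least one input")
    return min(input_retentions)


# Hypothetical inputs: one datastream kept 13 months, another only 6.
effective = downstream_retention_months([13, 6])
# → 6
```

In other words, a downstream dataset can never retain data longer than its shortest-lived input.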

As an additional benefit, shortening the data retention period can reduce your storage costs.