Cisco and Splunk have introduced the Cisco Time Series Model, a univariate, zero-shot time series foundation model designed for observability and security metrics. It is released as an open-weight checkpoint on Hugging Face and targets settings where:
* **Multiresolution data is common:** The model handles data where fine-grained (e.g., 1-minute) and coarse-grained (e.g., hourly) data coexist, a typical pattern in observability platforms where older data is often aggregated.
* **Long context windows are needed:** It is built to leverage longer histories (up to 16,384 points) than many existing time series models can, improving forecasting accuracy.
* **Zero-shot forecasting is desired:** The model aims to provide accurate forecasts *without* requiring task-specific fine-tuning, making it readily applicable to a variety of time series datasets.
* **Quantile forecasting is important:** It predicts not just the mean forecast but also a range of quantiles (0.1 to 0.9), providing a measure of uncertainty; a usage sketch follows this list.
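The announcement does not spell out the inference API, so the sketch below mocks the model call and focuses on how a zero-shot quantile forecast (0.1/0.5/0.9) over an observability metric would be consumed; the function name, horizon, and placeholder forecast logic are assumptions, not the documented Cisco interface.

```python
# Sketch: consuming zero-shot quantile forecasts for an observability metric.
# The model call is mocked below; the real Cisco Time Series Model loader and
# forecast signature are not part of this summary and would come from the
# Hugging Face model card.
import numpy as np

def mock_quantile_forecast(history: np.ndarray, horizon: int, quantiles):
    """Stand-in for a foundation-model forecast: persists the recent mean and
    widens the band using a rough estimate of recent noise."""
    level = history[-64:].mean()
    scale = history[-64:].std()
    z = {0.1: -1.28, 0.5: 0.0, 0.9: 1.28}
    return {q: np.full(horizon, level + z[q] * scale) for q in quantiles}

# One univariate series, e.g. per-minute request latency (synthetic here).
history = 100 + 5 * np.sin(np.linspace(0, 60, 2048)) + np.random.randn(2048)

forecast = mock_quantile_forecast(history, horizon=96, quantiles=(0.1, 0.5, 0.9))
lower, median, upper = forecast[0.1], forecast[0.5], forecast[0.9]

# The 0.1-0.9 band doubles as an alerting threshold: observations that land
# outside it are candidates for anomaly review.
print("median forecast:", median[:3])
print("band width:", (upper - lower)[:3])
```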
This article details the steps to move a Large Language Model (LLM) from a prototype to a production-ready system, covering aspects like observability, evaluation, cost management, and scalability.
This article explores how prompt engineering can be used to improve time-series analysis with Large Language Models (LLMs), covering core strategies, preprocessing, anomaly detection, and feature engineering. It provides practical prompts and examples for various tasks.
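As an illustration of the serialization step such prompting strategies depend on, the snippet below builds a plain-text anomaly-detection prompt from a numeric series; the wording, rounding, and example values are illustrative and not quoted from the article.

```python
# Sketch: turning a numeric series into an LLM prompt for anomaly detection.
# The prompt wording and example values are illustrative choices, not prompts
# taken from the article.
values = [12.1, 11.8, 12.4, 12.0, 37.5, 12.2, 11.9]  # one spike at index 4

series_text = ", ".join(f"{v:.1f}" for v in values)
prompt = (
    "You are analyzing a univariate metric sampled once per minute.\n"
    f"Values: {series_text}\n"
    "List the indices of any anomalous points and briefly explain why."
)

print(prompt)
# The prompt would then be sent to an LLM chat/completions endpoint; the
# client call is omitted because it depends on the provider.
```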
This article demonstrates how to use the attention mechanism in a time series classification framework, specifically for classifying normal sine waves versus 'modified' (flattened) sine waves. It details the data generation, model implementation (using a bidirectional LSTM with attention), and results, achieving high accuracy.
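The article's exact architecture and training setup are not reproduced here; the following is a minimal PyTorch sketch of the same idea, a bidirectional LSTM with learned attention pooling over time steps, trained on synthetic normal vs. flattened sine waves, with layer sizes and the clipping threshold chosen purely for illustration.

```python
# Minimal sketch: bidirectional LSTM whose hidden states are pooled with a
# learned attention layer, then classified. Sizes and thresholds are
# illustrative, not the article's settings.
import torch
import torch.nn as nn

def make_batch(n: int = 64, length: int = 100):
    """Normal sine waves (label 0) vs. 'modified' sines clipped flat (label 1)."""
    t = torch.linspace(0, 4 * torch.pi, length)
    phase = torch.rand(n, 1) * 2 * torch.pi
    x = torch.sin(t + phase)
    labels = torch.randint(0, 2, (n,))
    x[labels == 1] = x[labels == 1].clamp(-0.5, 0.5)  # flatten the peaks
    return x.unsqueeze(-1), labels                     # (n, length, 1)

class AttentionBiLSTM(nn.Module):
    def __init__(self, hidden: int = 32):
        super().__init__()
        self.lstm = nn.LSTM(1, hidden, batch_first=True, bidirectional=True)
        self.score = nn.Linear(2 * hidden, 1)  # per-time-step attention score
        self.head = nn.Linear(2 * hidden, 2)

    def forward(self, x):
        out, _ = self.lstm(x)                            # (batch, time, 2*hidden)
        weights = torch.softmax(self.score(out), dim=1)  # attention over time
        context = (weights * out).sum(dim=1)             # weighted sum of states
        return self.head(context)

model = AttentionBiLSTM()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for step in range(200):  # tiny training loop on synthetic data
    x, y = make_batch()
    loss = loss_fn(model(x), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

x, y = make_batch(256)
accuracy = (model(x).argmax(dim=1) == y).float().mean()
print(f"held-out accuracy: {accuracy:.2f}")
```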
This paper introduces Toto, a time series forecasting foundation model with 151 million parameters, and BOOM, a large-scale benchmark for observability time series data. Toto uses a decoder-only architecture and is trained on a large corpus of observability data, open datasets, and synthetic data. Both Toto and BOOM are open-sourced under the Apache 2.0 License.
Datadog announces the release of Toto, a state-of-the-art open-weights time series foundation model, and BOOM, a new observability benchmark. Toto achieves SOTA performance on observability metrics, and BOOM provides a challenging dataset for evaluating time series models in the observability domain.
Sawmills AI has introduced a smart telemetry data management platform aimed at reducing costs and improving data quality for enterprise observability. By acting as a middleware layer that uses AI and ML to optimize telemetry data before it reaches vendors like Datadog and Splunk, Sawmills helps companies manage data efficiently, retain data sovereignty, and reduce unnecessary data processing costs.
This article provides a hands-on guide to classifying human activity using sensor data and machine learning. It covers preparing data, creating a feature extraction pipeline using TSFresh, training a machine learning classifier with scikit-learn, and validating the model using the Data Studio.
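A compressed sketch of that pipeline on a toy accelerometer-style dataset might look like the following; the column names, window length, and choice of RandomForestClassifier are assumptions for illustration rather than the guide's exact setup.

```python
# Sketch of the TSFresh + scikit-learn pipeline described above, on a toy
# accelerometer-style dataset. Column names, window length, and classifier
# are illustrative; the guide's own dataset and settings will differ.
import numpy as np
import pandas as pd
from tsfresh import extract_features
from tsfresh.utilities.dataframe_functions import impute
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Long-format sensor data: one row per sample, grouped by recording id.
rng = np.random.default_rng(0)
rows, labels = [], {}
for rec_id in range(40):
    active = rec_id % 2  # 0 = resting, 1 = walking
    signal = rng.normal(0, 0.2, 128) + (np.sin(np.linspace(0, 20, 128)) if active else 0)
    rows.append(pd.DataFrame({"id": rec_id, "time": np.arange(128), "acc": signal}))
    labels[rec_id] = active
df = pd.concat(rows, ignore_index=True)
y = pd.Series(labels)

# TSFresh computes a wide table of per-recording features (statistics,
# autocorrelations, spectral features, ...); impute() cleans NaN/inf values.
X = extract_features(df, column_id="id", column_sort="time")
X = impute(X)

X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```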
SHREC is a physics-based unsupervised learning framework that reconstructs unobserved causal drivers from complex time series data. This new approach addresses the limitations of contemporary techniques, such as noise susceptibility and high computational cost, by using recurrence structures and topological embeddings. The successful application of SHREC on diverse datasets highlights its wide applicability and reliability in fields like biology, physics, and engineering, improving the accuracy of causal driver reconstruction.
Outlier treatment is a necessary step in data analysis. This article, part 3 of a four-part series, walks through the process and offers practical guidance on effective methods and tools for outlier detection.
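As one example of the kind of method such an outlier-treatment step typically covers, the snippet below flags points with the common interquartile-range rule; the 1.5 multiplier is the conventional default, not necessarily the article's recommendation.

```python
# Sketch: flagging outliers with the interquartile-range (IQR) rule, one of
# the standard methods an outlier-treatment step often starts with.
# The 1.5 multiplier is the conventional default, not the article's own choice.
import pandas as pd

values = pd.Series([10.2, 9.8, 10.5, 10.1, 9.9, 42.0, 10.3, 10.0, -15.0, 10.4])

q1, q3 = values.quantile(0.25), values.quantile(0.75)
iqr = q3 - q1
lower, upper = q1 - 1.5 * iqr, q3 + 1.5 * iqr

outliers = values[(values < lower) | (values > upper)]
print(outliers)  # 42.0 and -15.0 fall outside the IQR fences
```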