TSLN: Time-Series Lean Notation
A Novel Data Serialization Format for Token-Efficient Analysis with Large Language Models
DOI:
https://doi.org/10.31224/6375

Keywords:
Large Language Models, Data Serialization, Time-Series Compression, Token Optimization, AI Economics

Abstract
The integration of Large Language Models (LLMs) with real-time time-series data presents a significant economic challenge: token-based pricing models make data transmission costs prohibitive at scale. We introduce TSLN (Time-Series Lean Notation), a specialized data serialization format achieving 68-73% token reduction compared to JSON through innovative encoding strategies. TSLN employs a schema-first architecture, relative timestamps, differential encoding, and adaptive repeat markers optimized for temporal patterns. Comprehensive benchmarks across cryptocurrency, stock market, and IoT datasets demonstrate consistent compression ratios of 70%+ while maintaining lossless fidelity. For applications processing 1 million records daily, TSLN enables annual savings exceeding $18,000 in LLM API costs. This work contributes a production-ready format, an empirical validation methodology, and an economic impact analysis, establishing a foundation for token-efficient AI-powered time-series analytics.
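To make the abstract's encoding ideas concrete, the sketch below illustrates how a schema-first header, relative timestamps, and differential value encoding shrink a serialized payload relative to JSON. The field names, delimiters, and `lean_encode` function are illustrative assumptions, not TSLN's actual grammar, which is defined in the paper itself.

```python
import json

# Illustrative only: schema-first header + relative timestamps + value deltas.
# The real TSLN syntax is specified in the paper; this is a hypothetical analogue.
records = [
    {"ts": 1700000000, "price": 42000.0},
    {"ts": 1700000060, "price": 42001.5},
    {"ts": 1700000120, "price": 42001.5},
    {"ts": 1700000180, "price": 41999.0},
]

def lean_encode(rows):
    # Schema-first: field names and the base timestamp appear once, not per record.
    header = "ts,price|base=" + str(rows[0]["ts"])
    parts = []
    prev_ts, prev_price = rows[0]["ts"], None
    for r in rows:
        dt = r["ts"] - prev_ts                      # relative timestamp
        dv = r["price"] if prev_price is None else r["price"] - prev_price  # delta
        parts.append(f"{dt}:{dv:g}")
        prev_ts, prev_price = r["ts"], r["price"]
    return header + "\n" + ";".join(parts)

json_len = len(json.dumps(records))
lean_len = len(lean_encode(records))
print(json_len, lean_len)  # the lean form is markedly shorter than the JSON form
```

Because repeated keys and absolute timestamps dominate JSON's size for regular time series, most of the savings come from stating the schema once and transmitting small deltas, which also tokenize into fewer LLM tokens.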
Versions
- 2026-02-23 (5)
- 2026-02-12 (4)
- 2026-02-09 (3)
- 2026-01-29 (1)
License
Copyright (c) 2026 Manas Mudbari, Chandan Bhagat

This work is licensed under a Creative Commons Attribution 4.0 International License.