Getting Started

Install and start using TONL in 30 seconds

Installation

npm install tonl

Quick Example

import { TONLDocument } from 'tonl';

// Create from JSON
const doc = TONLDocument.fromJSON({
  users: [{ name: 'Alice', age: 30 }]
});

// Query
const result = doc.query('users[*].name');

// Modify
doc.set('users[0].age', 31);

// Save
await doc.save('data.tonl');

Encoding Options

Compact Mode (Default)

encodeTONL(data)

Maximum token savings (38-50%), no type hints

With Type Hints

encodeTONL(data, { includeTypes: true })

Schema validation enabled (~32% savings)

Custom Delimiter

encodeTONL(data, { delimiter: '|' })

Use pipe, tab, or semicolon delimiters

Smart Encoding

encodeSmart(data)

Auto-selects best delimiter and options
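The delimiter-selection idea can be sketched in plain JavaScript. This is an illustration only, not tonl's actual algorithm; `pickDelimiter` is a hypothetical helper that prefers the first candidate delimiter never appearing inside the data's string values:

```javascript
// Hypothetical sketch of delimiter auto-selection (not the tonl
// implementation): prefer the first candidate delimiter that never
// appears inside the data's string values.
function pickDelimiter(rows) {
  const candidates = [',', '|', '\t', ';'];
  const text = rows
    .flatMap(row => Object.values(row))
    .filter(v => typeof v === 'string')
    .join('');
  return candidates.find(d => !text.includes(d)) ?? ',';
}

const rows = [{ name: 'Smith, Alice' }, { name: 'Bob' }];
console.log(pickDelimiter(rows)); // '|' — a value contains a comma
```

A pipe or tab delimiter avoids quoting overhead when values themselves contain commas, which is where the token savings come from.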

Need more?

Check out the full getting started guide on GitHub.

API Reference

Complete TONLDocument API and core functions

📦 Creation Methods

TONLDocument.fromJSON(data)

Create TONL document from JavaScript object

const doc = TONLDocument.fromJSON({ users: [...] })

TONLDocument.fromTONL(text)

Parse TONL text into document

const doc = TONLDocument.fromTONL(tonlText)

TONLDocument.load(path)

Load TONL file from disk

const doc = await TONLDocument.load('data.tonl')

šŸ” Query Methods

query(path)

JSONPath-like queries with filters

doc.query('users[?(@.role == "admin")]')

get(path)

Get single value at path

doc.get('users[0].name') // "Alice"

has(path)

Check if path exists

doc.has('users[0].email') // true/false

āœļø Modification Methods

set(path, value)

Update value at path

doc.set('users[0].age', 31)

delete(path)

Remove field or array element

doc.delete('user.tempField')

push(path, value)

Append to array

doc.push('users', newUser)

merge(path, object)

Deep merge objects

doc.merge('config', updates)

💾 Save & Export

toTONL()

Export as TONL string

const tonlText = doc.toTONL()

toJSON()

Export as JavaScript object

const obj = doc.toJSON()

save(path)

Atomic file save with backup

await doc.save('output.tonl')
View complete API reference (50+ methods)

Query Syntax

JSONPath-like query expressions

user.name
Property access
users[0]
Array indexing
users[*].name
Wildcard (all names)
$..email
Recursive descent (all emails at any depth)
users[?(@.age > 18)]
Filter expression

Filter Operators

Comparison
==, !=, >, <, >=, <=
Logical
&&, ||, !
String
contains, startsWith, endsWith

Advanced Examples

users[?(@.age > 25 && @.active)]
Multiple conditions with AND
users[0:3]
Array slicing (first 3 items)
products[?(@.price < 100)]
Filter by numeric value
users[?(@.name contains "Smith")]
String contains filter
$.store.products[*].price
All product prices
View complete query documentation
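To make the filter semantics concrete, the expressions above can be restated as plain JavaScript over an in-memory array ("@" refers to each element). This is an illustration of what each query selects, not how tonl evaluates them:

```javascript
// Illustrative only: the filter expressions above, written as plain
// JavaScript predicates over a sample array.
const users = [
  { name: 'Alice Smith', age: 30, active: true },
  { name: 'Bob', age: 22, active: true },
  { name: 'Carol Smith', age: 40, active: false },
];

// users[?(@.age > 25 && @.active)] — multiple conditions with AND
const adults = users.filter(u => u.age > 25 && u.active);

// users[0:3] — array slicing (first 3 items)
const firstThree = users.slice(0, 3);

// users[?(@.name contains "Smith")] — string contains filter
const smiths = users.filter(u => u.name.includes('Smith'));

console.log(adults.map(u => u.name)); // ['Alice Smith']
console.log(smiths.map(u => u.name)); // ['Alice Smith', 'Carol Smith']
```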

CLI Tools

Command-line interface for TONL

ENCODE
tonl encode data.json --smart
DECODE
tonl decode data.tonl
QUERY
tonl query users.tonl 'users[*]'
GET
tonl get data.tonl "user.name"
VALIDATE
tonl validate --schema schema.tonl
STATS
tonl stats data.json
View full CLI documentation

LLM Integration Prompt

System prompt for teaching LLMs to read TONL data

📋 Ready-to-Use System Prompt

Copy this into your LLM system prompt when sending TONL formatted data:

The following data is in TONL format. Parse it as follows:

• Lines with [count]{fields}: are array headers, data rows follow
• Lines with {fields}: are object headers, field: value pairs follow
• Indentation (2 spaces) indicates nesting levels
• Default delimiter is comma unless #delimiter header specifies otherwise
• Type hints may appear: field:type (e.g., id:u32, name:str)
  → Ignore the :type part, just parse the values
• Value types: unquoted numbers/booleans, quoted strings, null

Examples:
Without types: users[2]{id,name,role}:
With types: users[2]{id:u32,name:str,role:str}:
Both parse the same way; just read the data values.

This represents: {"users": [{"id":1,"name":"Alice","role":"admin"}, {"id":2,"name":"Bob","role":"user"}]}
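The parsing rules in the prompt are simple enough to sketch directly. The following is a minimal illustrative parser for a single array block (header plus comma-delimited rows, type hints ignored), written under the assumptions stated in the prompt; `parseArrayBlock` is a hypothetical name, and real TONL parsing (nesting, custom delimiters, escaping) is more involved:

```javascript
// Minimal sketch of the parsing rules above: read an array header like
// "users[2]{id,name,role}:" (type hints such as id:u32 are dropped) and
// the comma-delimited data rows that follow.
function parseArrayBlock(lines) {
  const header = lines[0].match(/^(\w+)\[(\d+)\]\{([^}]*)\}:$/);
  if (!header) throw new Error('not an array header');
  const [, key, count, fieldSpec] = header;
  const fields = fieldSpec.split(',').map(f => f.split(':')[0]); // drop :type
  const rows = lines.slice(1, 1 + Number(count)).map(line => {
    const values = line.trim().split(',').map(v => {
      const t = v.trim();
      if (t === 'null') return null;
      if (t === 'true' || t === 'false') return t === 'true';
      if (t !== '' && !isNaN(Number(t))) return Number(t);
      return t.replace(/^"|"$/g, ''); // strip surrounding quotes
    });
    return Object.fromEntries(fields.map((f, i) => [f, values[i]]));
  });
  return { [key]: rows };
}

const block = [
  'users[2]{id:u32,name:str,role:str}:',
  '  1,"Alice",admin',
  '  2,"Bob",user',
];
console.log(JSON.stringify(parseArrayBlock(block)));
```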

💡 Why This Works

  • Prompt is only ~150 tokens
  • Negligible cost vs 32-50% data savings
  • Works with GPT, Claude, Gemini, Llama, etc.
  • LLMs naturally understand structured text
View full LLM integration guide

Schema Validation

Define and validate data structures with TONL schemas

✅ What is Schema Validation?

Schemas let you define data structure, types, and constraints. TONL validates data against schemas and can auto-generate TypeScript types.

Basic Schema Example

@schema v1
@strict true

User: obj
  id: u32 required
  username: str required min:3 max:20
  email: str required pattern:email
  age: u32? min:13 max:150
  roles: list<str> required

users: list<User> required min:1

Validation

import { parseSchema, validateTONL } from 'tonl/schema';

// Load and parse schema
const schema = parseSchema(schemaContent);

// Validate data
const result = validateTONL(data, schema);

if (!result.valid) {
  result.errors.forEach(err => {
    console.error(`${err.field}: ${err.message}`);
  });
}

Type System

Primitive Types

str, u32, i32, u64, i64, f32, f64, bool

Complex Types

obj, list, list<T>, optional (field?)

String Constraints

min, max, pattern, email, url

Number Constraints

min, max, positive, negative, integer
View full schema specification
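The constraint semantics from the example schema can be illustrated with hand-rolled checks. These are hypothetical helpers showing what `min:3 max:20` on a `str` and `min:13 max:150` on an optional `u32?` mean, not tonl's validator:

```javascript
// Illustration of the constraint semantics above (not the tonl validator).
// username: str required min:3 max:20 — min/max apply to string length.
function validateUsername(v) {
  const errors = [];
  if (typeof v !== 'string') errors.push('username: expected str');
  else {
    if (v.length < 3) errors.push('username: min:3 violated');
    if (v.length > 20) errors.push('username: max:20 violated');
  }
  return errors;
}

// age: u32? min:13 max:150 — optional, so null/undefined passes.
function validateAge(v) {
  if (v == null) return [];
  const errors = [];
  if (!Number.isInteger(v) || v < 0) errors.push('age: expected u32');
  if (v < 13) errors.push('age: min:13 violated');
  if (v > 150) errors.push('age: max:150 violated');
  return errors;
}

console.log(validateUsername('al')); // ['username: min:3 violated']
console.log(validateAge(31));        // []
```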

Streaming API

Process multi-GB files with constant memory usage

🌊 Why Streaming?

Stream processing lets you work with files larger than available RAM. TONL's line-based format is well suited to streaming: records are processed one at a time while memory usage stays under 100 MB.

Stream Encoding

import { createEncodeStream } from 'tonl/stream';
import { createReadStream, createWriteStream } from 'fs';

// Stream encode large JSON files
createReadStream('huge.json')
  .pipe(createEncodeStream({ smart: true }))
  .pipe(createWriteStream('huge.tonl'));

Stream Query

import { streamQuery } from 'tonl/stream';

// Query huge files efficiently
await streamQuery(
  'large-dataset.tonl',
  'users[?(@.active)]',
  (chunk) => {
    // Process each matching chunk
    console.log(chunk);
  }
);

// Memory stays constant ~10MB

Performance Metrics

~50 MB/s
Processing Speed
<100 MB
Memory Usage
10+ GB
File Size Support
O(1)
Memory Complexity

Implementation Guide

Build TONL libraries in any language

Complete Guides Available

95KB of implementation documentation with algorithms, pseudo-code, and test requirements.

Supported Languages

🐍
Python
🔵
Go
🦀
Rust
☕
Java
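As a taste of what a port involves, here is a minimal encoder for the simplest case: a flat array of uniform objects, using the header-plus-rows layout described in the LLM prompt section. It is a sketch under simplified assumptions (all strings quoted, no escaping, no nesting, comma delimiter), not the reference implementation:

```javascript
// First step toward a port: encode a flat array of uniform objects as
// users[2]{id,name,role}: followed by comma-delimited data rows.
// Simplified: all strings quoted, no escaping, no nesting.
function encodeArray(key, rows, indent = '  ') {
  const fields = Object.keys(rows[0]);
  const header = `${key}[${rows.length}]{${fields.join(',')}}:`;
  const body = rows.map(row =>
    indent + fields.map(f => {
      const v = row[f];
      return typeof v === 'string' ? `"${v}"` : String(v);
    }).join(',')
  );
  return [header, ...body].join('\n');
}

console.log(encodeArray('users', [
  { id: 1, name: 'Alice', role: 'admin' },
  { id: 2, name: 'Bob', role: 'user' },
]));
```

The implementation guides cover the rest: delimiter selection, quoting and escaping rules, nested objects, type hints, and the required test suites.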