docs: add recommended media type & file extension

Johann Schopplich
2025-11-23 18:51:50 +01:00
parent 7a05d03e73
commit a200a9fa54
8 changed files with 54 additions and 12 deletions


@@ -234,8 +234,20 @@ console.log(JSON.stringify(data, null, 2))
Round-tripping is lossless: `decode(encode(x))` always equals `x` (after normalization of non-JSON types like `Date`, `NaN`, etc.).
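A minimal sketch of that guarantee for JSON-compatible data, assuming `encode` and `decode` are named exports and that the package name below (`@toon-format/toon`) matches your install:

```ts
// Round-trip sketch: decode(encode(x)) yields x for JSON-compatible data.
// The import path is an assumption; check your install for the actual package name.
import { encode, decode } from '@toon-format/toon'

const original = {
  id: 1,
  name: 'Ada',
  tags: ['alpha', 'beta'],
  scores: [9.5, 8.25],
}

const toon = encode(original)
const roundTripped = decode(toon)

// Deep-equal for plain JSON values; non-JSON types (Date, NaN, ...) are
// normalized during encode, so compare against their normalized forms instead.
console.log(JSON.stringify(roundTripped) === JSON.stringify(original)) // true
```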
## Media Type & File Extension
When using TOON over HTTP or in content-type-aware systems:
- Use `text/toon` as the media type (provisional; see [spec §18.2](https://github.com/toon-format/spec/blob/main/SPEC.md#182-provisional-media-type))
- Use `.toon` as the standard file extension
- TOON is always UTF-8 encoded; `charset=utf-8` is optional
Example:
```http
Content-Type: text/toon; charset=utf-8
```
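As a hedged sketch, sending TOON over HTTP is then an ordinary `fetch` call with that header; the endpoint URL and import path below are placeholders:

```ts
// Sending TOON over HTTP with the recommended media type.
// The endpoint and import path are placeholders for illustration.
import { encode } from '@toon-format/toon'

const payload = { users: [{ id: 1, name: 'Ada' }, { id: 2, name: 'Grace' }] }

const response = await fetch('https://api.example.com/ingest', {
  method: 'POST',
  headers: {
    // `charset=utf-8` is optional since TOON is always UTF-8.
    'Content-Type': 'text/toon',
    Accept: 'text/toon',
  },
  body: encode(payload),
})
```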
## Where to Go Next
Now that you've seen your first TOON document, read the [Format Overview](/guide/format-overview) for complete syntax details (objects, arrays, quoting rules, key folding), then explore [Using TOON with LLMs](/guide/llm-prompts) to see how to use it effectively in prompts. For implementation details, check the [API reference](/reference/api) (TypeScript) or the [specification](/reference/spec) (language-agnostic normative rules).
For large datasets or streaming use-cases, see `encodeLines`, `decodeFromLines`, and `decodeStream` in the [API reference](/reference/api).
Now that you've seen your first TOON document, read the [Format Overview](/guide/format-overview) for complete syntax details (objects, arrays, quoting rules, key folding), then explore [Using TOON with LLMs](/guide/llm-prompts) to see how to use it effectively in prompts. For implementation details, check the [API Reference](/reference/api) (TypeScript) or the [Specification](/reference/spec) (language-agnostic normative rules).


@@ -83,7 +83,7 @@ catch (error) {
}
```
Strict mode checks counts, indentation, and escaping so you can detect truncation or malformed TOON. For complete details, see the [API reference](/reference/api#decode).
Strict mode checks counts, indentation, and escaping so you can detect truncation or malformed TOON. For complete details, see the [API Reference](/reference/api#decode).
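A hedged sketch of that pattern; the `{ strict: true }` option name is an assumption (strict checking may already be the default), so confirm it against the API reference:

```ts
// Detecting truncated or malformed TOON, e.g. from an LLM response.
// `{ strict: true }` is an assumption; consult the API reference for the
// exact option name and default behavior.
import { decode } from '@toon-format/toon'

function parseLlmOutput(text: string): unknown {
  try {
    return decode(text, { strict: true })
  }
  catch (error) {
    // Count, indentation, or escaping mismatches surface here, for example
    // when the model stopped mid-array and the declared length no longer matches.
    console.warn('TOON validation failed:', (error as Error).message)
    return null
  }
}
```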
## Delimiter Choices for Token Efficiency
@@ -116,7 +116,7 @@ The CLI also supports streaming for memory-efficient JSON-to-TOON conversion:
toon large-dataset.json --output output.toon
```
This streaming approach prevents out-of-memory errors when preparing large context windows for LLMs. For complete details on `encodeLines()`, see the [API reference](/reference/api#encodelines).
This streaming approach prevents out-of-memory errors when preparing large context windows for LLMs. For complete details on `encodeLines()`, see the [API Reference](/reference/api#encodelines).
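For the programmatic route, here is a sketch under the assumption that `encodeLines()` returns an iterable of output lines (the exact signature is documented in the API reference):

```ts
// Streaming JSON-to-TOON conversion without buffering the full TOON string.
// Assumes `encodeLines` returns an iterable of output lines without trailing
// newlines; verify the exact signature in the API reference.
import { createWriteStream, readFileSync } from 'node:fs'
import { encodeLines } from '@toon-format/toon'

const data = JSON.parse(readFileSync('large-dataset.json', 'utf8'))

const out = createWriteStream('output.toon')
for (const line of encodeLines(data)) {
  out.write(line + '\n')
}
out.end()
```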
**Consuming streaming LLM outputs:** If your LLM client exposes streaming text and you buffer by lines, you can decode TOON incrementally: