feat: streaming decode functionality with event-based parsing (closes #131)

This commit is contained in:
Johann Schopplich
2025-11-21 22:29:57 +01:00
parent 9ebad53ea3
commit 6c57a14009
19 changed files with 2220 additions and 431 deletions


@@ -108,19 +108,25 @@ cat data.toon | toon --decode
Both encoding and decoding operations use streaming output, writing incrementally without building the full output string in memory. This makes the CLI efficient for large datasets without requiring additional configuration.
**JSON → TOON (Encode)**:
- Streams TOON lines to output.
- No full TOON string in memory.

**TOON → JSON (Decode)**:
- Uses the same event-based streaming decoder as the `decodeStream` API in `@toon-format/toon`.
- Streams JSON tokens to output.
- No full JSON string in memory.
- When `--expand-paths safe` is enabled, falls back to non-streaming decode internally to apply deep-merge expansion before writing JSON.
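The event-based idea behind streaming decode can be sketched as a generator that consumes input lines one at a time and yields JSON text fragments as soon as they are known, so no full JSON string is ever held in memory. The helper below is a hypothetical illustration of that pattern for simple `key: value` lines, not the actual `decodeStream` API from `@toon-format/toon`:

```typescript
// Hypothetical sketch of event-based streaming decode: turn "key: value"
// lines into a stream of JSON tokens. Each token can be written to the
// output sink immediately; the full JSON string is never materialized here.
function* jsonTokens(lines: Iterable<string>): Generator<string> {
  let first = true;
  yield "{";
  for (const raw of lines) {
    const idx = raw.indexOf(":");
    if (idx < 0) continue; // skip lines that are not key/value pairs
    const key = raw.slice(0, idx).trim();
    const value = raw.slice(idx + 1).trim();
    if (!first) yield ",";
    first = false;
    // Emit numbers unquoted, everything else as a JSON string.
    const parsed = Number(value);
    yield JSON.stringify(key) + ":" +
      JSON.stringify(Number.isNaN(parsed) ? value : parsed);
  }
  yield "}";
}

// Usage: forward each token to the output as it is produced.
let out = "";
for (const tok of jsonTokens(["id: 1", "name: Ada"])) out += tok;
```

A real decoder additionally tracks nesting and tabular array headers, but the shape is the same: events in, tokens out, constant memory per line.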
Process large files with minimal memory usage:
```bash
# Encode large JSON file
toon huge-dataset.json -o output.toon
# Decode large TOON file
toon huge-dataset.toon -o output.json
# Process millions of records efficiently via stdin