Getting Started with jsonreader

jsonreader is a specialized utility for processing JSON data from streams in real time, with a particular focus on handling responses from AI tool calls efficiently.

Installation

Add jsonreader to your project using your preferred package manager:

BASH
# Using npm
npm install @formkit/jsonreader

# Using yarn
yarn add @formkit/jsonreader

# Using pnpm
pnpm add @formkit/jsonreader

Why Use jsonreader?

When working with AI tool calls or other large JSON responses, jsonreader provides several key advantages:

  1. Process data as it arrives: Don't wait for the entire API response to complete.
  2. Early UI updates: Show users results immediately as they become available.
  3. Path-specific extraction: Access specific parts of the JSON as soon as they appear in the stream.
  4. Optimized for AI tools: Perfect for processing responses from AI function calls that can be large and slow to complete.

Basic Usage

Streaming JSON Processing

The simplest way to use jsonreader is to process JSON data as it streams in:

JAVASCRIPT
import { jsonReader } from '@formkit/jsonreader';

async function processStreamingData() {
  // Get a stream from somewhere (e.g., fetch API)
  const response = await fetch('https://api.ai-service.com/v1/generate');
  const reader = response.body.getReader();
  
  // Process JSON as it comes in
  for await (const partialData of jsonReader(reader)) {
    console.log('Received partial data:', partialData);
    
    // Update your UI with partial data
    updateUI(partialData);
  }
}

In this example, partialData will contain progressively more complete JSON objects as the data streams in. Each iteration yields the most complete JSON structure possible with the data received so far.
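
To see this progression locally, you can feed jsonReader a hand-built stream. The sketch below simulates chunked delivery with the standard ReadableStream API; the intermediate snapshots shown in the comments are illustrative only, since the exact shapes depend on chunk boundaries and on how the library buffers partial values.

JAVASCRIPT
import { jsonReader } from '@formkit/jsonreader';

// Simulate a streamed response split across arbitrary chunk boundaries
const chunks = ['{"status": "run', 'ning", "results": [1, 2', ', 3]}'];
const encoder = new TextEncoder();

const stream = new ReadableStream({
  start(controller) {
    for (const chunk of chunks) {
      controller.enqueue(encoder.encode(chunk));
    }
    controller.close();
  },
});

for await (const partialData of jsonReader(stream.getReader())) {
  // Yields increasingly complete objects, for example something like:
  //   { status: 'running' }
  //   { status: 'running', results: [1, 2] }
  //   { status: 'running', results: [1, 2, 3] }
  console.log(partialData);
}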

Extracting Specific Paths

You can also extract specific paths from the JSON as they become available:

JAVASCRIPT
import { jsonReader } from '@formkit/jsonreader';

async function processToolResults() {
  const response = await fetch('https://api.ai-service.com/v1/generate');
  const reader = response.body.getReader();
  
  // Extract the 'results' array as soon as elements appear
  for await (const { results } of jsonReader(reader, {
    required: ['results']
  })) {
    if (results?.length) {
      // Process each batch of results as they arrive
      for (const item of results) {
        renderResultItem(item);
      }
    }
  }
}

This approach is particularly useful for AI tool calls that return large arrays of results you want to display incrementally as they arrive.
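
Because each yield repeats the items already received, re-rendering the whole array on every iteration can be wasteful. Here is a minimal sketch of one way to render only the new items, assuming each yielded results array is a superset of the previous one (reader and renderResultItem are the same as in the example above):

JAVASCRIPT
// Render only the items that were not present in a previous yield
let renderedCount = 0;

for await (const { results } of jsonReader(reader, { required: ['results'] })) {
  if (!results) continue;
  for (const item of results.slice(renderedCount)) {
    renderResultItem(item); // your own UI code, as above
  }
  renderedCount = results.length;
}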

Using jsonPathReader

For more targeted extraction of specific JSON paths, you can use the jsonPathReader function:

JAVASCRIPT
import { jsonPathReader } from '@formkit/jsonreader';

// Obtain a reader from a streaming response, as in the earlier examples
const response = await fetch('https://api.ai-service.com/v1/generate');
const reader = response.body.getReader();

// Define paths to extract
const paths = [
  'results',
  'progress',
  'metadata.timing',
  'items.*.id' // Use wildcard to match all item IDs
];

for await (const [value, path] of jsonPathReader(reader, paths)) {
  if (path === 'results') {
    updateResultsList(value);
  } else if (path === 'progress') {
    updateProgressBar(value);
  } else if (path.startsWith('items.') && path.endsWith('.id')) {
    // Handle individual item IDs matched by the wildcard
    trackItem(value);
  }
}

The jsonPathReader approach is ideal when you need to handle updates to multiple specific paths independently. It yields each value and path as a tuple ([value, path]) as soon as they change in the stream. You can use the '*' wildcard character in path segments to match any property name or array index at that position.
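
For example, here is a minimal sketch that gathers every ID matched by the wildcard into a Map keyed by array index. It assumes, as the startsWith/endsWith check above implies, that yielded paths carry the concrete index (e.g. 'items.2.id') rather than the wildcard pattern itself.

JAVASCRIPT
import { jsonPathReader } from '@formkit/jsonreader';

const response = await fetch('https://api.ai-service.com/v1/generate');
const reader = response.body.getReader();

// Collect every ID matched by the 'items.*.id' wildcard
const itemIds = new Map();

for await (const [value, path] of jsonPathReader(reader, ['items.*.id'])) {
  const index = Number(path.split('.')[1]); // 'items.2.id' -> 2
  itemIds.set(index, value);
}

console.log('Collected IDs:', [...itemIds.values()]);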

Error Handling

jsonReader will throw an error when parsing fails:

JAVASCRIPT
import { jsonReader } from '@formkit/jsonreader';

// `reader` is a ReadableStreamDefaultReader obtained as in the earlier examples
try {
  for await (const data of jsonReader(reader)) {
    // Process JSON data
    updateUI(data);
  }
} catch (error) {
  // Handle any errors
  console.error('Error processing JSON:', error);
}
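
If you stop reading after an error (or otherwise finish early), you may also want to release the underlying stream. The sketch below adds a finally clause using reader.cancel(), which is part of the standard Streams API rather than jsonreader itself:

JAVASCRIPT
try {
  for await (const data of jsonReader(reader)) {
    updateUI(data);
  }
} catch (error) {
  console.error('Error processing JSON:', error);
} finally {
  // Optional cleanup: cancel the underlying stream if you will not read further
  await reader.cancel().catch(() => {});
}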