Examples
Practical examples showing how to use jsonreader in real-world scenarios.
Processing AI Tool Calls
One of the most powerful use cases for jsonreader is processing JSON responses from AI services. These responses can be large and take time to generate, but with jsonreader, you can start processing the data as it arrives.
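Before the full example, here is a minimal sketch of the core loop (assuming a hypothetical /api/report endpoint that streams back a single large JSON document):

import { jsonReader } from '@formkit/jsonreader';

// Minimal sketch: each iteration yields the most complete object
// parsed so far. `/api/report` is a hypothetical endpoint.
async function logPartialSnapshots() {
  const response = await fetch('/api/report');
  const reader = response.body.getReader();
  for await (const partial of jsonReader(reader)) {
    console.log(partial);
  }
}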
Streaming Function Call Results
import { jsonReader } from '@formkit/jsonreader';

async function processAIFunctionCall() {
  // Call an AI model with function calling enabled
  const response = await fetch('https://api.openai.com/v1/chat/completions', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Authorization': `Bearer ${API_KEY}`
    },
    body: JSON.stringify({
      model: 'gpt-4-turbo',
      messages: [
        { role: 'user', content: 'Generate a list of 50 product ideas' }
      ],
      tools: [{
        type: 'function',
        function: {
          name: 'generate_startup_ideas'
        }
      }],
      stream: true
    })
  });

  const reader = response.body.getReader();

  // Track ideas we've seen
  const processedIdeas = new Set();
  const startTime = Date.now();

  for await (const partialData of jsonReader(reader)) {
    try {
      // Look for tool calls in the response
      const toolCall = partialData?.choices?.[0]?.delta?.tool_calls?.[0];
      if (toolCall?.function?.arguments) {
        // Parse the function arguments
        const args = JSON.parse(toolCall.function.arguments);
        if (args.ideas && Array.isArray(args.ideas)) {
          // Process each new idea as it arrives
          for (const idea of args.ideas) {
            if (!processedIdeas.has(idea.id)) {
              processedIdeas.add(idea.id);
              console.log(`Idea #${idea.id} received after ${Date.now() - startTime}ms`);
              // Update the UI with the new idea
              addIdeaToInterface(idea);
            }
          }
        }
      }
    } catch (error) {
      // Handle parsing errors but continue processing
      console.error('Error processing partial data:', error);
    }
  }
}
This example shows how to process streaming AI function call results, updating the UI as new ideas arrive without waiting for the entire response.
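The addIdeaToInterface call above is left to your application. A minimal DOM-based sketch (assuming the page contains a <ul id="ideas"> element and ideas shaped like { id, title }) might look like this:

// Hypothetical UI helper: append each idea to a list as it arrives.
// Assumes <ul id="ideas"> exists and each idea has { id, title }.
function addIdeaToInterface(idea) {
  const list = document.getElementById('ideas');
  const item = document.createElement('li');
  item.id = `idea-${idea.id}`;
  item.textContent = idea.title;
  list.appendChild(item);
}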
Path-Based Extraction Examples
The path-based reader is perfect for progressively updating UIs as different parts of the JSON become available:
import { jsonPathReader } from '@formkit/jsonreader';

async function progressiveProfileLoader() {
  const response = await fetch('/api/user/profile');
  const reader = response.body.getReader();

  // Define critical paths to extract as they become available
  const paths = [
    'user.name',              // Critical information
    'user.avatar_url',
    'user.email',             // Important but not critical
    'user.preferences.theme', // Less important details
    'user.recent_activity.*.id',
    'user.recent_activity.*.title'
  ];

  // Show loading UI
  showLoadingState();

  // Process each path as it arrives
  for await (const [value, path] of jsonPathReader(reader, paths)) {
    console.log(`Received ${path}`);

    // Update UI based on what data we received
    if (path === 'user.name') {
      updateHeader(value);
      document.title = `${value}'s Profile`;
    } else if (path === 'user.avatar_url') {
      updateAvatar(value);
    } else if (path === 'user.email') {
      updateContactInfo(value);
    } else if (path === 'user.preferences.theme') {
      applyTheme(value);
    } else if (path.match(/user\.recent_activity\.\d+\.title/)) {
      const index = parseInt(path.match(/user\.recent_activity\.(\d+)\.title/)[1]);
      updateActivityTitle(index, value);
    }
  }

  // Complete loading
  hideLoadingState();
}
This pattern allows you to update different parts of your UI as soon as the corresponding data becomes available, creating a more responsive experience.
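Because wildcard paths like user.recent_activity.*.id match each array index separately, fields for the same record can arrive across several iterations. One way to reassemble them (a sketch, assuming emitted paths of the form user.recent_activity.<index>.<field>) is to collect by index:

import { jsonPathReader } from '@formkit/jsonreader';

// Sketch: merge wildcard results back into per-record objects.
async function collectActivities(reader) {
  const paths = ['user.recent_activity.*.id', 'user.recent_activity.*.title'];
  const activities = new Map();
  for await (const [value, path] of jsonPathReader(reader, paths)) {
    const match = path.match(/^user\.recent_activity\.(\d+)\.(id|title)$/);
    if (match) {
      const [, index, field] = match;
      const record = activities.get(index) ?? {};
      record[field] = value;
      activities.set(index, record);
    }
  }
  return [...activities.values()];
}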
Processing Large Files
jsonreader is great for processing large JSON files without loading them entirely into memory:
import { createReadStream } from 'fs';
import { jsonReader } from '@formkit/jsonreader';

async function processLargeJsonFile(filePath) {
  // Create a file stream
  const fileStream = createReadStream(filePath);

  // Convert to a web-standard ReadableStream
  const webStream = new ReadableStream({
    start(controller) {
      fileStream.on('data', chunk => {
        controller.enqueue(new Uint8Array(chunk));
      });
      fileStream.on('end', () => controller.close());
      fileStream.on('error', error => controller.error(error));
    }
  });

  const reader = webStream.getReader();
  let recordCount = 0;
  const startTime = Date.now();

  // Use JSON reader to process the file
  for await (const partialData of jsonReader(reader)) {
    if (partialData.records && Array.isArray(partialData.records)) {
      const newCount = partialData.records.length;
      if (newCount > recordCount) {
        console.log(`Processed ${newCount - recordCount} new records`);
        recordCount = newCount;
      }
    }
  }

  console.log(`Completed in ${(Date.now() - startTime) / 1000}s`);
}
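On newer Node versions (17+), the manual adapter above can usually be replaced with the built-in Readable.toWeb conversion. A sketch:

import { createReadStream } from 'node:fs';
import { Readable } from 'node:stream';
import { jsonReader } from '@formkit/jsonreader';

// Alternative adapter: let Node convert the file stream to a web
// ReadableStream instead of wiring up a controller by hand.
async function processWithToWeb(filePath) {
  const webStream = Readable.toWeb(createReadStream(filePath));
  const reader = webStream.getReader();
  for await (const partialData of jsonReader(reader)) {
    // Process partial snapshots exactly as in the example above
    console.log(Object.keys(partialData));
  }
}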