Schema-driven LLM interaction via XML Agents. Structure emerges from chaos.
Main entry point for the AI SDK. Provides core functionality for interacting with AI models, generating text and structured XML, handling documents, and creating embeddings.
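Basic Text Generation

Before the streaming example, here is a minimal non-streaming sketch. It follows the AIClient usage shown in the streaming and safeAsync examples elsewhere on this page (a generateText call returning an object with a .text field); the exact response shape beyond .text is an assumption.

import { AIClient } from "@palladio/ai-sdk";

const client = new AIClient({ model: "google/gemini-flash-1.5" });

// Generate a complete response in one call (no stream option)
const response = await client.generateText({
  prompt: "Explain the difference between Deno and Node.js in two sentences.",
});

console.log(response.text);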
Streaming Text Generation
import { AIClient } from "@palladio/ai-sdk";

const client = new AIClient({ model: "google/gemini-flash-1.5" });

console.log("Streaming response for 'Write a short poem about Deno':");

try {
  await client.generateText({
    prompt: "Write a short poem about Deno, the secure runtime.",
    stream: true,
    onData: ({ text, done }) => {
      if (!done) {
        // Process each chunk as it arrives
        Deno.stdout.write(new TextEncoder().encode(text));
      } else {
        console.log("\n\n--- Stream finished ---");
      }
    },
  });
} catch (err) {
  console.error("\nError during streaming:", err);
}
Creating Documents (Various Sources & Formats)
import { createDocument } from "@palladio/ai-sdk";

// 1. From string (inferred as 'text')
const textDoc = createDocument({ name: "Note", source: "Meeting summary: AI SDK progress looks good." });

// 2. From object (inferred as 'json')
const jsonDoc = createDocument({ name: "Config", source: { theme: "dark", features: ["A", "B"] } });

// 3. From string, explicitly JSON
const jsonStringDoc = createDocument({ name: "ApiResp", source: '{"userId": 123, "data": null}', sourceFormat: 'json' });

// 4. From string, explicitly YAML
const yamlDoc = createDocument({ name: "YamlData", source: "user:\n name: Alice\n role: admin", sourceFormat: 'yaml' });

// 5. From string, explicitly CSV
const csvDoc = createDocument({ name: "CsvSource", source: "id,value\n1,apple\n2,banana", sourceFormat: 'csv' });

// 6. From URL (format inferred from the extension - requires network access)
// Using a raw GitHub content URL as an example
const remoteUrl = new URL("https://raw.githubusercontent.com/denoland/deno/main/README.md");
const urlDoc = createDocument({ name: "DenoReadme", source: remoteUrl });

// You can await the content later:
// const readmeContent = await urlDoc.getContent();
// console.log(`Fetched README (first 100 chars): ${readmeContent.substring(0, 100)}...`);

console.log(`Created ${textDoc.name} (Format: ${textDoc.getFormat()})`);
console.log(`Created ${jsonDoc.name} (Format: ${jsonDoc.getFormat()})`);
console.log(`Created ${jsonStringDoc.name} (Format: ${jsonStringDoc.getFormat()})`);
console.log(`Created ${yamlDoc.name} (Format: ${yamlDoc.getFormat()})`);
console.log(`Created ${csvDoc.name} (Format: ${csvDoc.getFormat()})`);
console.log(`Created ${urlDoc.name} (Format: ${urlDoc.getFormat()}) - Content fetched on demand`);
Creating Document with Preprocessing
import { createDocument } from "@palladio/ai-sdk";

// The source is a CSV string; we want the processed content as an array of { id: number, product: string }
const csvData = "ID,Product Name\n101,Widget A\n102,Gadget B";

// Define the expected processed type
type ProcessedCsvRow = { id: number; product: string };

const processedCsvDoc = createDocument<ProcessedCsvRow[], Record<string, string>[]>({
  name: "Processed Inventory",
  source: csvData,
  sourceFormat: 'csv', // Ensure it's parsed as objects first
  preprocess: (parsedCsv: Record<string, string>[]) => {
    // Parsed CSV usually gives Record<string, string>[]
    // Transform into the desired ProcessedCsvRow[] type
    return parsedCsv.map(row => ({
      id: parseInt(row['ID'], 10),    // Assuming the CSV parser creates an 'ID' key
      product: row['Product Name'],   // Assuming the CSV parser creates a 'Product Name' key
    }));
  },
});

// Get the *processed* content
async function displayProcessedContent() {
  const content = await processedCsvDoc.getContent();
  console.log("Preprocessed CSV Content:", content);
  // Example output: [{ id: 101, product: "Widget A" }, { id: 102, product: "Gadget B" }]
}

displayProcessedContent();
Structured XML Generation (Agent) - Basic
import { createAgent, createOutputDocument, createDocument } from "@palladio/ai-sdk";
import { z } from "zod";

// Create the OutputDocument
const summaryOutputDoc = createOutputDocument({
  name: "summary", // Element name
  schema: z.object({
    topic: z.string().describe("The main subject of the text"),
    keyPoints: z.array(z.string()).describe("A list of 3-5 main takeaways"),
    sentiment: z.enum(["positive", "negative", "neutral"]).describe("Overall sentiment"),
  }),
  examples: [{ // Example data
    topic: "Example Topic",
    keyPoints: ["Point 1", "Point 2", "Point 3"],
    sentiment: "neutral",
  }],
});

// Create the Agent using the OutputDocument
const summarizerAgent = createAgent({
  outputDocument: summaryOutputDoc,
  purpose: "Summarize the key points and sentiment of the provided text.",
  instructions: [
    "Read the document carefully.",
    "Identify the main topic.",
    "Extract 3 to 5 key points.",
    "Determine the overall sentiment (positive, negative, or neutral).",
  ],
  // model: "google/gemini-flash-1.5" // Optionally specify a model
});

const textToSummarize = createDocument({
  name: "Review",
  source: "The new AI SDK is incredibly powerful and flexible, though the documentation examples could be expanded further. Overall, a very positive development.",
});

async function runSummarizer() {
  const response = await summarizerAgent([textToSummarize]);
  if (response.success) {
    console.log("Generated Summary:", response.object);
  } else {
    console.error("Summarizer Agent Error:", response.error);
  }
}

runSummarizer();
XML Agent with Complex Schema (Nested Objects/Arrays)
import { createAgent, createOutputDocument, createDocument } from "@palladio/ai-sdk";
import { z } from "zod";

// Create the OutputDocument
const projectOutputDoc = createOutputDocument({
  name: "projectDetails", // Element name
  schema: z.object({
    projectId: z.string().uuid().describe("Unique project identifier"),
    projectName: z.string().describe("Name of the project"),
    members: z.array(z.object({
      name: z.string(),
      role: z.enum(["Lead", "Developer", "Tester"]),
    })).min(1).describe("List of team members (at least one)"),
    billingAddress: z.object({
      street: z.string().optional(),
      city: z.string(),
      zip: z.string().regex(/^\d{5}$/, "Must be 5 digits"),
    }).optional().describe("Optional billing address"),
  }),
  examples: [{ // Example data
    projectId: "a1b2c3d4-e5f6-7890-1234-567890abcdef",
    projectName: "AI SDK Refactor",
    members: [{ name: "Alice", role: "Lead" }, { name: "Bob", role: "Developer" }],
    billingAddress: { city: "Anytown", zip: "98765" },
  }],
});

// Create the Agent using the OutputDocument
const projectAgent = createAgent({
  outputDocument: projectOutputDoc,
  purpose: "Extract detailed project information from meeting notes.",
  instructions: [
    "Identify the project name and generate a UUID for projectId.",
    "List all mentioned team members and their roles (infer if not explicit).",
    "Extract the billing address if provided.",
    "Ensure the zip code is exactly 5 digits.",
  ],
});

const meetingNotes = createDocument({
  name: "Meeting Notes",
  source: "Project: AI SDK Refactor. Team: Alice (Lead), Bob. We need to finalize billing address later. Zip should be 12345 for the main office.",
});

async function runProjectAgent() {
  const response = await projectAgent([meetingNotes]);
  if (response.success) {
    console.log("Extracted Project Details:", response.object);
  } else {
    console.error("Project Agent Error:", response.error.message);
    // Log metadata, which may contain the raw XML attempt
    console.error("Metadata:", response._meta);
  }
}

runProjectAgent();
XML Agent with Multiple Documents (Context Aggregation)
import { createAgent, createOutputDocument, createDocument } from "@palladio/ai-sdk";
import { z } from "zod";

// Create the OutputDocument
const recipeOutputDoc = createOutputDocument({
  name: "recipe", // Element name
  schema: z.object({
    dishName: z.string(),
    ingredients: z.array(z.object({ item: z.string(), quantity: z.string().optional() })),
    prepTime: z.string().optional().describe("Estimated preparation time"),
    source: z.string().optional().describe("Where the recipe is from"),
  }),
  examples: [{ // Example data
    dishName: "Example Bake",
    ingredients: [{ item: "Flour", quantity: "2 cups" }, { item: "Sugar" }],
    prepTime: "20 minutes",
    source: "Grandma's Cookbook",
  }],
});

// Create the Agent using the OutputDocument
const recipeAgent = createAgent({
  outputDocument: recipeOutputDoc,
  purpose: "Combine information from multiple documents to create a recipe.",
  instructions: [
    "Determine the dish name.",
    "List all ingredients mentioned across documents, including quantities if available.",
    "Find the preparation time if stated.",
    "Note the source if mentioned.",
  ],
});

const doc1 = createDocument({ name: "Ingredients List", source: "Pasta Bake: Need pasta, tomato sauce (1 can), cheese." });
const doc2 = createDocument({ name: "Instructions", source: "For the Pasta Bake, prep time is 15 mins. From website RecipeMaster." });
const doc3 = createDocument({ name: "Shopping List", source: "Get ground beef (1lb) for the bake." });

async function runRecipeAgent() {
  const response = await recipeAgent([doc1, doc2, doc3]); // Pass all documents
  if (response.success) {
    console.log("Combined Recipe:", response.object);
    // Expected output might combine ingredients, prep time, and source
  } else {
    console.error("Recipe Agent Error:", response.error);
  }
}

runRecipeAgent();
Handling XML Agent Errors
import { createAgent, createOutputDocument, createDocument, AIClientGenerateStructuredXMLError } from "@palladio/ai-sdk";
import { z } from "zod";

// Create the OutputDocument
const errorOutputDoc = createOutputDocument({
  name: "data", // Element name
  schema: z.object({ value: z.number() }),
  examples: [{ value: 123 }], // Example data
});

// Create the Agent using the OutputDocument
const errorAgent = createAgent({
  outputDocument: errorOutputDoc,
  purpose: "Intentionally try to cause a validation error.",
  instructions: ["Return XML where 'value' is a string, not a number."],
});

const dummyDoc = createDocument({ name: "Input", source: "Provide invalid data." });

async function runErrorAgent() {
  const response = await errorAgent([dummyDoc]);
  if (!response.success) {
    console.error("--------------------");
    console.error("Agent failed as expected!");
    console.error("Error Type:", response.error.name); // e.g., AIClientGenerateStructuredXMLError
    console.error("Error Message:", response.error.message); // Often includes Zod issues

    // Detailed metadata for debugging
    console.error("\nMetadata:");
    console.error(`- Prompt: ${response._meta.prompt ? response._meta.prompt.substring(0, 100) + '...' : 'N/A'}`);
    console.error(`- Raw Response: ${response._meta.rawResponse ?? 'N/A'}`);
    console.error(`- Fixed XML Attempt: ${response._meta.xmlResponseFixed ?? 'N/A'}`);
    console.error(`- Time Taken: ${response._meta.timeTaken ?? 'N/A'}ms`);
    console.error(`- Model: ${response._meta.model ?? 'N/A'}`);
    console.error("--------------------");

    // You can check for specific error types or codes if needed
    if (response.error instanceof AIClientGenerateStructuredXMLError && response.error.message.includes("Validation failed")) {
      console.log("Detected Zod validation failure within the error.");
    }
  } else {
    console.warn("Agent succeeded unexpectedly:", response.object);
  }
}

runErrorAgent();
Manually Rendering Documents for XML Embedding (Less Common)
import { createDocument, documentsToXmlEmbed } from "@palladio/ai-sdk";

const docA = createDocument({ name: "DocA", source: "Content A" });
const docB = createDocument({ name: "DocB", source: JSON.stringify({ key: "Value B" }), sourceFormat: 'json' });

async function generateManualEmbed() {
  // This renders each document individually using its renderAsXmlEmbed method
  // and wraps them in a <documents> tag.
  const combinedXml = await documentsToXmlEmbed([docA, docB]);
  console.log("Manually Embedded XML:\n", combinedXml);
}

generateManualEmbed();

// Output typically looks like:
// <documents>
//   <document name="DocA" format="text">Content A</document>
//   <document name="DocB" format="json">{
//     "key": "Value B"
//   }</document>
// </documents>
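If you only need the embed for a single document, the renderAsXmlEmbed method mentioned in the comment above can be called directly. A minimal sketch, assuming the method is async and takes no arguments (it is referenced but not shown in the example above):

// Sketch: render one document's XML embed directly.
// Assumes renderAsXmlEmbed() is async and takes no arguments.
async function renderSingleEmbed() {
  const singleXml = await docA.renderAsXmlEmbed();
  console.log("Single document embed:\n", singleXml);
}

renderSingleEmbed();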
XML Agent with Image Documents
import { createAgent, createOutputDocument, createImageDocument, createImageDocuments } from "@palladio/ai-sdk";
import { z } from "zod";

// Define a schema for describing images and create the OutputDocument
const imageOutputDoc = createOutputDocument({
  name: "imageAnalysis", // Element name
  schema: z.object({
    descriptions: z.array(z.object({
      imageName: z.string().describe("The name of the image file or source."),
      summary: z.string().describe("A brief description of the image content."),
      objectsDetected: z.array(z.string()).optional().describe("List of objects identified in the image."),
    })).describe("An array containing descriptions for each image provided."),
  }),
  examples: [{ // Example data
    descriptions: [{
      imageName: "example.png",
      summary: "A sample image description.",
      objectsDetected: ["object1", "object2"],
    }],
  }],
});

// Create an agent to describe images
const imageDescriberAgent = createAgent({
  outputDocument: imageOutputDoc,
  purpose: "Analyze the provided image(s) and describe their content.",
  instructions: [
    "For each image document provided, generate a description.",
    "Include the image name as specified in the document.",
    "Provide a concise summary of what the image depicts.",
    "If possible, list key objects visible in the image.",
  ],
  // Note: Requires a model that supports vision/image input, like Gemini Flash 1.5
  // model: "google/gemini-flash-1.5" // Specify an appropriate model if needed
});

async function runImageAgent() {
  try {
    // --- Creating Image Documents from Various Sources ---

    // 1. From URL (file or HTTP/S) - ensure it is accessible.
    // Replace with a valid file:// or https:// URL.
    const imageUrl = new URL("https://via.placeholder.com/10"); // Example URL
    const urlImageDoc = await createImageDocument("UrlImage", {
      image: imageUrl,
      description: "An image loaded from a URL.",
    });

    // 2. From Base64 string (data URI)
    // (Using a 1x1 pixel transparent PNG data URI)
    const base64DataUri = "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAQAAAC1HAwCAAAAC0lEQVR42mNkYAAAAAYAAjCB0C8AAAAASUVORK5CYII=";
    const base64ImageDoc = await createImageDocument("Base64Image", { image: base64DataUri });

    // 2b. From raw Base64 string (requires mimeType, or defaults to png)
    const rawBase64 = "iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAQAAAC1HAwCAAAAC0lEQVR42mNkYAAAAAYAAjCB0C8AAAAASUVORK5CYII=";
    const rawBase64ImageDoc = await createImageDocument("RawBase64Image", {
      image: rawBase64,
      mimeType: "image/png", // Good practice to provide a mimeType for raw base64
      description: "Image from raw base64 string.",
    });

    // 3. From Uint8Array (e.g., read from a file or received from an API)
    // (Using a 1x1 pixel black PNG byte array)
    const uint8ArrayData = new Uint8Array([137, 80, 78, 71, 13, 10, 26, 10, 0, 0, 0, 13, 73, 72, 68, 82, 0, 0, 0, 1, 0, 0, 0, 1, 8, 4, 0, 0, 0, 185, 111, 189, 4, 0, 0, 0, 10, 73, 68, 65, 84, 120, 109, 99, 100, 0, 0, 0, 6, 0, 1, 140, 32, 209, 47, 0, 0, 0, 0, 73, 69, 78, 68, 174, 66, 96, 130]);
    const uint8ArrayImageDoc = await createImageDocument("Uint8ArrayImage", {
      image: uint8ArrayData,
      mimeType: "image/png", // Provide a mimeType when using Uint8Array
      description: "A tiny black square from bytes.",
    });

    // 4. From file path object
    // Replace with an actual relative path to an image file in your project.
    const filePathImageDoc = await createImageDocument("FilePathImage", {
      image: { filePath: "./path/to/your/image.jpg" }, // Example path
      description: "Image loaded via file path object.",
    });

    // 5. Using createImageDocuments for multiple sources at once
    const multiSourceDocs = await createImageDocuments([
      { name: "WebImage", options: { image: new URL("https://via.placeholder.com/50") } },
      { name: "BytesImage", options: { image: uint8ArrayData, mimeType: "image/png", description: "Multi-bytes" } },
      { name: "DataUriImage", options: { image: base64DataUri, metadata: { source: "multi" } } },
      { name: "PathImage", options: { image: { filePath: "./path/to/another/image.png" } } }, // Example path
    ]);

    // --- Combine all created documents for the agent ---
    const allImageDocs = [
      urlImageDoc,
      base64ImageDoc,
      rawBase64ImageDoc,
      uint8ArrayImageDoc,
      filePathImageDoc,
      ...multiSourceDocs,
    ];

    console.log(`Running agent with ${allImageDocs.length} image documents...`);

    // Run the agent with the image documents
    const response = await imageDescriberAgent(allImageDocs);

    if (response.success) {
      console.log("Image Analysis Result:", JSON.stringify(response.object, null, 2));
    } else {
      console.error("Image Agent Error:", response.error.message);
      console.error("Metadata:", response._meta);
    }
  } catch (error) {
    console.error("Error creating image documents or running agent:", error);
    console.log("Ensure image paths/URLs are correct and accessible.");
  }
}

// Note: Running this example requires valid image sources and potentially network access.
// runImageAgent(); // Uncomment to run, replacing placeholders first.
Document Transformers and Chunking
import { createTransformableDocument, DocumentTransformer } from "@palladio/ai-sdk";

// Create a transformable document
const doc = createTransformableDocument({
  name: "Large Article",
  source: "This is a very long article about AI. " + "Lorem ipsum...".repeat(500),
});

// Chain transformations
const processed = await doc
  .transform(DocumentTransformer.extractFirst(1000)) // Take the first 1000 chars
  .transform(DocumentTransformer.summarize())        // Add summary metadata
  .transform(DocumentTransformer.normalize())        // Clean whitespace
  .getValue();

// Smart chunking for large documents
const chunks = await createTransformableDocument({
  name: "Technical Manual",
  source: "Chapter 1: Introduction\n\nThis chapter covers...\n\nChapter 2: Setup\n\nTo begin setup...",
})
  .transform(DocumentTransformer.smartChunk({
    maxChunkSize: 1000,
    overlapSize: 100,
    preserveBoundaries: true, // Respects paragraph/chapter boundaries
  }))
  .getValue();

console.log(`Document split into ${chunks.length} chunks`);
Schema Inference from Examples
import { inferSchemaFromExamples, createSchemaWithDescriptions, createSimpleAgent } from "@palladio/ai-sdk";

// Automatically infer a schema from example data
const examples = [
  { name: "iPhone 14", price: 999, inStock: true, color: "blue" },
  { name: "Samsung S23", price: 899, inStock: false },
];

const productSchema = inferSchemaFromExamples(examples);
// Inferred: z.object({
//   name: z.string(),
//   price: z.number(),
//   inStock: z.boolean(),
//   color: z.string().optional()
// })

// Add descriptions for better AI understanding
const enhancedSchema = createSchemaWithDescriptions(productSchema, {
  name: "Product name or model",
  price: "Price in USD",
  inStock: "Whether the product is currently available",
  color: "Product color (if applicable)",
});

// Use with a simple agent
const agent = createSimpleAgent({
  schema: enhancedSchema,
  purpose: "Extract product information from descriptions",
});
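Like the other agents on this page, the agent built from the inferred schema is invoked with an array of documents. A short usage sketch, following the success/object response shape used in the surrounding examples; the input document and its contents are illustrative only:

import { createDocument } from "@palladio/ai-sdk";

// Hypothetical input for the product-extraction agent defined above
const productDoc = createDocument({
  name: "Listing",
  source: "The iPhone 14 in blue is $999 and currently in stock.",
});

const extraction = await agent([productDoc]);
if (extraction.success) {
  console.log("Extracted product:", extraction.object);
} else {
  console.error("Extraction failed:", extraction.error);
}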
Simple Agent Creation
import { createSimpleAgent, createDocument } from "@palladio/ai-sdk";
import { z } from "zod";

// Quick agent creation with minimal config
const sentimentAgent = createSimpleAgent({
  schema: z.object({
    sentiment: z.enum(["positive", "negative", "neutral"]),
    confidence: z.number().min(0).max(1),
    keywords: z.array(z.string()),
  }),
  purpose: "Analyze sentiment and extract key phrases from text",
});

const doc = createDocument({
  source: "This product exceeded my expectations! Great value for money.",
});

const result = await sentimentAgent([doc]);

if (result.success) {
  console.log("Sentiment:", result.object.sentiment);
  console.log("Confidence:", result.object.confidence);
  console.log("Keywords:", result.object.keywords);
}
Error Handling with safeAsync
import { safeAsync, AIClient, type Document } from "@palladio/ai-sdk";

const client = new AIClient({ model: "google/gemini-2.0-flash-001" });

// Wrap any async operation for safe error handling
const [error, response] = await safeAsync(
  client.generateText({ prompt: "Explain quantum computing" }),
);

if (error) {
  console.error("Failed to generate text:", error.message);
  // Handle the error gracefully - maybe retry or use a fallback
} else {
  console.log("Generated text:", response.text);
}

// Chain operations safely
// (assumes `agent` was created elsewhere, e.g. with createSimpleAgent)
async function processDocument(doc: Document<any>) {
  const [parseError, content] = await safeAsync(doc.getContent());
  if (parseError) return null;

  const [agentError, result] = await safeAsync(agent([doc]));
  if (agentError) return null;

  return result.object;
}
Add Package

deno add jsr:@palladio/ai-sdk
pnpm i jsr:@palladio/ai-sdk (or: pnpm dlx jsr add @palladio/ai-sdk)
yarn add jsr:@palladio/ai-sdk (or: yarn dlx jsr add @palladio/ai-sdk)
vlt install jsr:@palladio/ai-sdk
npx jsr add @palladio/ai-sdk
bunx jsr add @palladio/ai-sdk

Import symbol

import * as ai_sdk from "@palladio/ai-sdk";

Import directly with a jsr specifier (Deno)

import * as ai_sdk from "jsr:@palladio/ai-sdk";