LlamaIndex
Use @codespar/llama-index to give LlamaIndex.TS agents commerce capabilities in Latin America.
LlamaIndex Adapter
The @codespar/llama-index adapter converts CodeSpar session tools into LlamaIndex.TS's FunctionTool format. Each tool has a call method that routes execution through the CodeSpar session for billing and audit. Use it with LlamaIndex agents, query engines, and RAG pipelines that need commerce capabilities.
Installation
npm install @codespar/sdk @codespar/llama-index
pnpm add @codespar/sdk @codespar/llama-index
yarn add @codespar/sdk @codespar/llama-index

@codespar/llama-index has a peer dependency on @codespar/sdk@^0.2.0. You also need llamaindex for the LlamaIndex runtime.
API Reference
getTools(session): Promise<LlamaIndexTool[]>
Fetches all tools from the session and converts them to LlamaIndex tool format. Each tool has name, description, parameters (JSON Schema), and a call method.
import { CodeSpar } from "@codespar/sdk";
import { getTools } from "@codespar/llama-index";
const codespar = new CodeSpar({ apiKey: process.env.CODESPAR_API_KEY });
const session = await codespar.sessions.create({
servers: ["stripe", "mercadopago"],
});
const tools = await getTools(session);
console.log(tools[0].name); // "codespar_checkout"
console.log(tools[0].parameters); // { type: "object", properties: { ... } }

toLlamaIndexTool(tool, session): LlamaIndexTool
Converts a single CodeSpar tool to LlamaIndex format with a bound call method.
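A minimal sketch of what the binding looks like. The stubbed session and raw tool below are assumptions for illustration; the real conversion is done for you by toLlamaIndexTool from @codespar/llama-index.

```typescript
// Shape of the tool the adapter produces (assumed, for illustration).
type LlamaIndexTool = {
  name: string;
  description: string;
  parameters: Record<string, unknown>;
  call: (args: Record<string, unknown>) => Promise<unknown>;
};

// Stub standing in for a CodeSpar session (assumption).
const session = {
  callTool: async (name: string, args: Record<string, unknown>) =>
    ({ ok: true, tool: name, args }),
};

// Stub CodeSpar tool definition (assumption).
const rawTool = {
  name: "codespar_checkout",
  description: "Create a checkout",
  parameters: { type: "object", properties: {} },
};

// The adapter binds `call` so every invocation routes through the
// session for billing and audit.
const tool: LlamaIndexTool = {
  ...rawTool,
  call: (args) => session.callTool(rawTool.name, args),
};

const result = await tool.call({ amount: 250 });
console.log(tool.name, result);
```

The key point is that `call` closes over the session, so LlamaIndex can invoke the tool without knowing anything about CodeSpar.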
handleToolCall(session, toolName, args): Promise<ToolResult>
Convenience executor that routes a tool call through the CodeSpar session.
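A sketch of the routing contract, useful when you dispatch model-requested tool calls yourself instead of letting an agent do it. The local stand-in below is an assumption; in real code you would import handleToolCall from @codespar/llama-index and pass a real session.

```typescript
// Assumed result shape (illustration only).
type ToolResult = { content: unknown };

// Stub session standing in for a CodeSpar session (assumption).
const session = {
  callTool: async (name: string, args: Record<string, unknown>): Promise<ToolResult> =>
    ({ content: `${name} executed` }),
};

// Local stand-in mirroring the documented signature:
// handleToolCall(session, toolName, args) -> Promise<ToolResult>.
async function handleToolCall(
  s: typeof session,
  toolName: string,
  args: Record<string, unknown>
): Promise<ToolResult> {
  // Route execution through the session so the call is billed and audited.
  return s.callTool(toolName, args);
}

// Dispatch a tool call the LLM requested.
const result = await handleToolCall(session, "codespar_checkout", { amount: 150 });
console.log(result.content);
```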
Full agent loop
This is a complete example of a LlamaIndex agent with CodeSpar tools:
import { OpenAIAgent } from "llamaindex";
import { CodeSpar } from "@codespar/sdk";
import { getTools } from "@codespar/llama-index";
const codespar = new CodeSpar({ apiKey: process.env.CODESPAR_API_KEY });
async function run(userMessage: string) {
// 1. Create a session
const session = await codespar.sessions.create({
servers: ["stripe", "asaas", "correios"],
});
// 2. Get tools in LlamaIndex format
const tools = await getTools(session);
// 3. Create the agent
const agent = new OpenAIAgent({
tools,
systemPrompt:
"You are a commerce assistant for a Brazilian e-commerce store. " +
"Handle payments, invoicing, and shipping. " +
"Respond in the same language the user writes in.",
});
// 4. Chat with the agent
const response = await agent.chat({ message: userMessage });
// 5. Clean up
await session.close();
return response.message.content;
}
const reply = await run("Generate a boleto for R$250 due in 7 days");
console.log(reply);

Handling parallel tool calls
When the model requests several tool calls in one turn, execute them in parallel:
const results = await Promise.all(
toolCalls.map(async (tc) => {
const tool = tools.find((t) => t.name === tc.name);
if (!tool) throw new Error(`Unknown tool: ${tc.name}`);
return tool.call(tc.args);
})
);

Streaming
LlamaIndex supports streaming: pass stream: true to the agent's chat method:
import { OpenAIAgent } from "llamaindex";
import { CodeSpar } from "@codespar/sdk";
import { getTools } from "@codespar/llama-index";
const codespar = new CodeSpar({ apiKey: process.env.CODESPAR_API_KEY });
async function runStreaming(userMessage: string) {
const session = await codespar.sessions.create({
servers: ["stripe", "mercadopago"],
});
const tools = await getTools(session);
const agent = new OpenAIAgent({
tools,
systemPrompt: "You are a commerce assistant for a Brazilian store.",
});
const stream = await agent.chat({
message: userMessage,
stream: true,
});
for await (const chunk of stream) {
process.stdout.write(chunk.message.content);
}
await session.close();
}
await runStreaming("Create a Pix payment for R$150");

Error handling
When invoking tools manually, wrap tool.call() in try/catch:
for (const tool of tools) {
try {
const result = await tool.call(args);
console.log(`${tool.name}:`, result);
} catch (error) {
console.error(`${tool.name} failed:`, error instanceof Error ? error.message : error);
}
}

LlamaIndex agents handle tool errors internally and feed them back to the LLM for reasoning.
Best practices
- Always close sessions. Use try/finally to ensure session.close() runs.
- Scope servers narrowly. Only connect the MCP servers your agent needs.
- Combine with RAG. Use LlamaIndex's retrieval capabilities alongside CodeSpar tools for context-aware commerce operations.
- Use OpenAIAgent for tool calling. It has the best tool-calling support in LlamaIndex.TS.
- Return errors as strings. Let the agent reason about failures.
- Filter tools when possible. Use session.findTools() to get only relevant tools.
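The cleanup practice above can be captured in one small pattern. The stub session and withSession helper below are illustrative assumptions; with the real SDK you would create the session via codespar.sessions.create and release it with session.close.

```typescript
// Stub session with a close() we can observe (assumption; stands in
// for a real CodeSpar session).
let closed = false;
const session = {
  close: async () => { closed = true; },
};

// Hypothetical helper: run work against a session and guarantee cleanup.
async function withSession<T>(fn: (s: typeof session) => Promise<T>): Promise<T> {
  try {
    return await fn(session);
  } finally {
    await session.close(); // runs even if fn throws
  }
}

// Normal path: the session is closed after the work finishes.
const out = await withSession(async () => "agent reply");
console.log(out, closed);

// Error path: the session is still closed.
closed = false;
let caught = false;
try {
  await withSession(async () => { throw new Error("boom"); });
} catch {
  caught = true;
}
console.log(caught, closed);
```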
Next steps
- Sessions -- Session lifecycle and configuration
- Tools and Meta-Tools -- Understand the 6 meta-tools and routing
- OpenAI Adapter -- Direct OpenAI SDK integration
- LangChain Adapter -- Alternative agent framework
- Quickstart -- End-to-end setup in under 5 minutes