Wiki · agentic engineering

Anthropic SDK — Cheatsheet

Quick reference for the Anthropic TypeScript SDK: message creation, tool use, streaming, prompt caching, and common patterns.

anthropic-sdk · typescript · reference · tool-use · Updated 2026-04-22

Installation

bash
npm install @anthropic-ai/sdk

Basic client

typescript
import Anthropic from "@anthropic-ai/sdk";
 
const client = new Anthropic({
  apiKey: process.env.ANTHROPIC_API_KEY, // the default; can be omitted if set in the env
});

Simple message

typescript
const response = await client.messages.create({
  model: "claude-opus-4-5",
  max_tokens: 1024,
  messages: [{ role: "user", content: "Explain agentic systems in 2 sentences." }],
});
 
const first = response.content[0];
if (first.type === "text") console.log(first.text); // narrow before accessing .text

With a system prompt

typescript
const response = await client.messages.create({
  model: "claude-opus-4-5",
  max_tokens: 2048,
  system: "You are a TypeScript expert focused on type safety.",
  messages: [{ role: "user", content: "..." }],
});

Tool use

typescript
const tools: Anthropic.Tool[] = [
  {
    name: "read_file",
    description: "Reads the content of a file at the given path",
    input_schema: {
      type: "object",
      properties: {
        path: { type: "string", description: "Relative file path" },
      },
      required: ["path"],
    },
  },
];
 
const response = await client.messages.create({
  model: "claude-opus-4-5",
  max_tokens: 4096,
  tools,
  messages: [{ role: "user", content: "Read src/index.ts" }],
});
 
// Handle tool_use blocks
for (const block of response.content) {
  if (block.type === "tool_use") {
    console.log(block.name, block.input);
  }
}
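The SDK types a tool_use block's `input` as `unknown`, so it helps to validate it before use. A minimal sketch with a hand-rolled type guard (the `ReadFileInput` type and `isReadFileInput` helper are local illustrations, not part of the SDK):

```typescript
// Local type matching the input_schema of the read_file tool above
interface ReadFileInput {
  path: string;
}

// Runtime guard: narrows `unknown` to ReadFileInput
function isReadFileInput(input: unknown): input is ReadFileInput {
  return (
    typeof input === "object" &&
    input !== null &&
    typeof (input as Record<string, unknown>).path === "string"
  );
}

// Usage: validate before trusting the model's tool input
const input: unknown = { path: "src/index.ts" };
if (isReadFileInput(input)) {
  console.log(input.path); // safely typed as string here
}
```

A schema-validation library works just as well; the point is to not trust `input` blindly, since the model can emit malformed arguments.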

Full agentic loop

typescript
async function runAgent(task: string): Promise<string> {
  const messages: Anthropic.MessageParam[] = [
    { role: "user", content: task },
  ];
 
  while (true) {
    const response = await client.messages.create({
      model: "claude-opus-4-5",
      max_tokens: 4096,
      tools,
      messages,
    });
 
    messages.push({ role: "assistant", content: response.content });
 
    if (response.stop_reason === "end_turn") {
      const lastText = response.content.find(b => b.type === "text");
      return lastText?.text ?? "";
    }
 
    // Execute tool calls
    const toolResults: Anthropic.ToolResultBlockParam[] = [];
    for (const block of response.content) {
      if (block.type !== "tool_use") continue;
      const result = await executeTool(block.name, block.input);
      toolResults.push({
        type: "tool_result",
        tool_use_id: block.id,
        content: result,
      });
    }
 
    messages.push({ role: "user", content: toolResults });
  }
}
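The loop above assumes an `executeTool` helper, which is not part of the SDK; you implement the dispatch yourself. A minimal sketch (the error-string behavior is one reasonable choice, not the only one):

```typescript
import { readFile } from "node:fs/promises";

// Hypothetical dispatcher for the loop above. Returns a string suitable
// for the content field of a tool_result block.
async function executeTool(name: string, input: unknown): Promise<string> {
  const args = input as { path?: string };
  switch (name) {
    case "read_file":
      try {
        return await readFile(args.path ?? "", "utf-8");
      } catch (err) {
        // Surface the failure as text so the model can react to it
        return `Error reading file: ${String(err)}`;
      }
    default:
      // Returning an error message (instead of throwing) lets the
      // model recover on the next turn
      return `Unknown tool: ${name}`;
  }
}
```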

Prompt caching

typescript
const response = await client.messages.create({
  model: "claude-opus-4-5",
  max_tokens: 1024,
  system: [
    {
      type: "text",
      text: "Base instructions...",
    },
    {
      type: "text",
      text: longDocumentContext, // this block gets cached
      cache_control: { type: "ephemeral" },
    },
  ],
  messages,
});
 
// Check cache hits
console.log(response.usage.cache_read_input_tokens);
console.log(response.usage.cache_creation_input_tokens);

Rule: blocks marked with cache_control are cached for 5 minutes. This saves ~90% of the cost on repeated tokens.
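The breakpoint goes on the last block of the stable prefix, so everything up to and including that block is cached. A small sketch of a helper that does this (the `SystemBlock` type and `withCacheBreakpoint` name are local illustrations, not SDK exports):

```typescript
// Local type mirroring the shape of a system content block
type SystemBlock = {
  type: "text";
  text: string;
  cache_control?: { type: "ephemeral" };
};

// Marks the last system block as the cache breakpoint, leaving
// earlier blocks untouched
function withCacheBreakpoint(blocks: SystemBlock[]): SystemBlock[] {
  if (blocks.length === 0) return blocks;
  return blocks.map((block, i) =>
    i === blocks.length - 1
      ? { ...block, cache_control: { type: "ephemeral" } }
      : block,
  );
}
```

This keeps the caching concern out of the prompt-building code: build the system blocks normally, then wrap them once before the request.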

Streaming

typescript
const stream = client.messages.stream({
  model: "claude-opus-4-5",
  max_tokens: 1024,
  messages: [{ role: "user", content: "..." }],
});
 
for await (const chunk of stream) {
  if (
    chunk.type === "content_block_delta" &&
    chunk.delta.type === "text_delta"
  ) {
    process.stdout.write(chunk.delta.text);
  }
}
 
const finalMessage = await stream.finalMessage();
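The delta-filtering in the loop above can be factored into a pure helper, which makes stream handling easy to unit-test without a live connection. A sketch (the `StreamChunk` type is a local simplification of the SDK's event union, and `collectText` is a hypothetical name):

```typescript
// Simplified local shape of the streaming events handled above
type StreamChunk = {
  type: string;
  delta?: { type: string; text?: string };
};

// Concatenates the text from text_delta events, ignoring everything else
function collectText(chunks: StreamChunk[]): string {
  let out = "";
  for (const chunk of chunks) {
    if (chunk.type === "content_block_delta" && chunk.delta?.type === "text_delta") {
      out += chunk.delta.text ?? "";
    }
  }
  return out;
}
```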

Available models (2026)

| Model             | Use                                       |
| ----------------- | ----------------------------------------- |
| claude-opus-4-5   | Complex tasks, production agents          |
| claude-sonnet-4-5 | Intelligence/speed/cost balance           |
| claude-haiku-3-5  | Simple tasks, high speed, low cost        |

Token usage

typescript
console.log(response.usage.input_tokens);
console.log(response.usage.output_tokens);
// Con caching:
console.log(response.usage.cache_creation_input_tokens);
console.log(response.usage.cache_read_input_tokens);
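These usage fields are what you would feed into a cost estimate. A sketch of such a helper, where the `Usage` type is a local mirror of the response field and all rates are caller-supplied placeholders (check current pricing; none of the numbers here are authoritative):

```typescript
// Local mirror of the relevant fields of response.usage
type Usage = {
  input_tokens: number;
  output_tokens: number;
  cache_read_input_tokens?: number;
  cache_creation_input_tokens?: number;
};

// Estimates request cost in USD given rates per million tokens.
// Rates are illustrative inputs, not built-in constants.
function estimateCostUSD(
  usage: Usage,
  ratesPerMTok: { input: number; output: number; cacheRead: number; cacheWrite: number },
): number {
  return (
    (usage.input_tokens * ratesPerMTok.input +
      usage.output_tokens * ratesPerMTok.output +
      (usage.cache_read_input_tokens ?? 0) * ratesPerMTok.cacheRead +
      (usage.cache_creation_input_tokens ?? 0) * ratesPerMTok.cacheWrite) /
    1_000_000
  );
}
```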