Getting started
Vertana is an LLM-powered agentic translation library for TypeScript. It uses autonomous agent workflows to gather contextual information for high-quality translations that preserve meaning, tone, and formatting.
Installation
Vertana consists of two packages:
- @vertana/facade: High-level API for translation tasks
- @vertana/core: Core translation logic (used internally by facade)
For most use cases, you only need to install the @vertana/facade package:
deno add jsr:@vertana/facade
npm add @vertana/facade
pnpm add @vertana/facade
yarn add @vertana/facade
bun add @vertana/facade

Vercel AI SDK
Vertana uses the Vercel AI SDK for LLM interactions, so you'll also need to install a model provider. For example, to use OpenAI:
deno add npm:@ai-sdk/openai
npm add @ai-sdk/openai
pnpm add @ai-sdk/openai
yarn add @ai-sdk/openai
bun add @ai-sdk/openai

Switching providers
Vertana works with any provider supported by the Vercel AI SDK. Simply install the provider package and import it:
// OpenAI
import { openai } from "@ai-sdk/openai";
const model = openai("gpt-5.1");
// Anthropic
import { anthropic } from "@ai-sdk/anthropic";
const model = anthropic("claude-opus-4-5-20251101");
// Google
import { google } from "@ai-sdk/google";
const model = google("gemini-3-flash-preview");Each provider has its own package following the @ai-sdk/<provider> naming convention:
| Provider | Package | Example models |
|---|---|---|
| OpenAI | @ai-sdk/openai | gpt-5.1, gpt-4o |
| Anthropic | @ai-sdk/anthropic | claude-opus-4-5-20251101, claude-sonnet-4-5-20241022 |
| Google | @ai-sdk/google | gemini-3-flash-preview, gemini-3-pro |
| Mistral | @ai-sdk/mistral | mistral-large-2512 |
| Amazon | @ai-sdk/amazon-bedrock | anthropic.claude-opus-4-5 |
TIP
Model names are passed directly to the provider's API, so you need to use exact model IDs. Check each provider's official documentation for the full list of available models.
See also the Vercel AI SDK providers documentation for integration details.
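Because translate() only takes a model instance (see Basic usage below), switching providers is a one-line change. As a rough sketch using the Anthropic model from the table above:

```typescript
import { translate } from "@vertana/facade";
import { anthropic } from "@ai-sdk/anthropic";

// Only the model instance changes; the rest of the call stays the same.
const model = anthropic("claude-opus-4-5-20251101");
const result = await translate(model, "de", "Hello, world!");

console.log(result.text);
```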
Basic usage
The main entry point is the translate() function. Here's a minimal example:
import { translate } from "@vertana/facade";
import { openai } from "@ai-sdk/openai";
const result = await translate(
openai("gpt-4o"),
"ko", // Target language (BCP 47 tag)
"Hello, world! How are you today?"
);
console.log(result.text);
// => "안녕하세요! 오늘 기분이 어떠세요?"The translate() function takes three required arguments:
- A language model from the Vercel AI SDK
- The target language (as a BCP 47 language tag or Intl.Locale; see the example below)
- The text to translate
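For instance, here is a minimal sketch (assuming the same translate() behavior as the example above) that passes an Intl.Locale instead of a plain string tag:

```typescript
import { translate } from "@vertana/facade";
import { openai } from "@ai-sdk/openai";

// Target the language via an Intl.Locale object rather than a "pt-BR" string.
const result = await translate(
  openai("gpt-4o"),
  new Intl.Locale("pt-BR"),
  "Release notes are published every Friday."
);

console.log(result.text);
```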
With options
You can customize the translation with various options:
import { translate } from "@vertana/facade";
import { openai } from "@ai-sdk/openai";
const result = await translate(
openai("gpt-4o"),
"ja",
"The patient presented with acute myocardial infarction.",
{
sourceLanguage: "en",
domain: "medical",
tone: "formal",
context: "This is from a medical case study."
}
);
console.log(result.text);
// => "患者は急性心筋梗塞を呈した。"Tracking progress
For long documents, you can track translation progress:
import { translate } from "@vertana/facade";
import { openai } from "@ai-sdk/openai";
const result = await translate(
openai("gpt-4o"),
"es",
longDocument,
{
onProgress: (progress) => {
console.log(`Stage: ${progress.stage}, Progress: ${progress.progress}`);
}
}
);

Progress stages include chunking, gatheringContext, translating, refining, and selecting.
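As a hedged sketch (assuming progress.progress is a fraction between 0 and 1; check the library's documentation if it reports percentages instead), you could map those stages to readable log lines:

```typescript
// Hypothetical helper for nicer progress logs; the stage names come from
// the list above, but the label text is purely illustrative.
const stageLabels: Record<string, string> = {
  chunking: "Splitting the document",
  gatheringContext: "Gathering context",
  translating: "Translating",
  refining: "Refining",
  selecting: "Selecting the best candidate",
};

function formatProgress(progress: { stage: string; progress: number }): string {
  const label = stageLabels[progress.stage] ?? progress.stage;
  // Assumes progress.progress is a fraction in [0, 1].
  return `${label}: ${Math.round(progress.progress * 100)}%`;
}

// Usage in the options object from the example above:
// onProgress: (progress) => console.log(formatProgress(progress)),
```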
Command-line interface
Vertana also provides a CLI for quick translations. Install it with:
deno install -g --name vertana --allow-all jsr:@vertana/cli
npm install -g @vertana/cli
pnpm add -g @vertana/cli
bun add -g @vertana/cli

Then translate files directly from the terminal:
vertana translate -l ko document.md