Getting started

Vertana is an LLM-powered agentic translation library for TypeScript. It uses autonomous agent workflows to gather contextual information for high-quality translations that preserve meaning, tone, and formatting.

Installation

Vertana consists of two packages:

  • @vertana/facade: High-level API for translation tasks
  • @vertana/core: Core translation logic (used internally by facade)

For most use cases, you only need to install the @vertana/facade package:

deno add jsr:@vertana/facade
npm add @vertana/facade
pnpm add @vertana/facade
yarn add @vertana/facade
bun add @vertana/facade

Vercel AI SDK

Vertana uses the Vercel AI SDK for LLM interactions, so you'll also need to install a model provider. For example, to use OpenAI:

deno add npm:@ai-sdk/openai
npm add @ai-sdk/openai
pnpm add @ai-sdk/openai
yarn add @ai-sdk/openai
bun add @ai-sdk/openai

Switching providers

Vertana works with any provider supported by the Vercel AI SDK. Simply install the provider package and import it:

// OpenAI
import { openai } from "@ai-sdk/openai";
const model = openai("gpt-5.1");

// Anthropic
import { anthropic } from "@ai-sdk/anthropic";
const model = anthropic("claude-opus-4-5-20251101");

// Google
import { google } from "@ai-sdk/google";
const model = google("gemini-3-flash-preview");

Each provider has its own package following the @ai-sdk/<provider> naming convention:

Provider    Package                  Example models
OpenAI      @ai-sdk/openai           gpt-5.1, gpt-4o
Anthropic   @ai-sdk/anthropic        claude-opus-4-5-20251101, claude-sonnet-4-5-20241022
Google      @ai-sdk/google           gemini-3-flash-preview, gemini-3-pro
Mistral     @ai-sdk/mistral          mistral-large-2512
Amazon      @ai-sdk/amazon-bedrock   anthropic.claude-opus-4-5

TIP

Model names are passed directly to the provider's API, so you must use the exact model IDs from that provider's documentation. Check each provider's official docs for the full list of available models.

See also the Vercel AI SDK providers documentation for integration details.

Basic usage

The main entry point is the translate() function. Here's a minimal example:

import { translate } from "@vertana/facade";
import { openai } from "@ai-sdk/openai";

const result = await translate(
  openai("gpt-4o"),
  "ko", // Target language (BCP 47 tag)
  "Hello, world! How are you today?",
);

console.log(result.text);
// => "안녕하세요! 오늘 기분이 어떠세요?"

The translate() function takes three required arguments:

  1. A language model from the Vercel AI SDK
  2. The target language (as a BCP 47 language tag or Intl.Locale)
  3. The text to translate
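
Because the target language accepts an Intl.Locale as well as a plain string, you can build the locale programmatically before calling translate(). A minimal sketch (the translate() call itself is commented out, since it needs a configured provider and API key):

```typescript
// The second argument to translate() can be an Intl.Locale instead of a
// plain BCP 47 string, which is handy when the locale is built at runtime.
const target = new Intl.Locale("pt-BR");

console.log(target.language); // "pt"
console.log(target.region);   // "BR"

// With a model in scope, this would be passed straight to translate():
// const result = await translate(openai("gpt-4o"), target, "Good morning!");
```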

With options

You can customize the translation with various options:

import { translate } from "@vertana/facade";
import { openai } from "@ai-sdk/openai";

const result = await translate(
  openai("gpt-4o"),
  "ja",
  "The patient presented with acute myocardial infarction.",
  {
    sourceLanguage: "en",
    domain: "medical",
    tone: "formal",
    context: "This is from a medical case study.",
  },
);

console.log(result.text);
// => "患者は急性心筋梗塞を呈した。"

Tracking progress

For long documents, you can track translation progress:

import { translate } from "@vertana/facade";
import { openai } from "@ai-sdk/openai";

const result = await translate(
  openai("gpt-4o"),
  "es",
  longDocument,
  {
    onProgress: (progress) => {
      console.log(`Stage: ${progress.stage}, Progress: ${progress.progress}`);
    },
  },
);

Progress stages include chunking, gatheringContext, translating, refining, and selecting.
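
One way to consume these updates is a small formatting helper. This sketch assumes progress.progress is a fraction between 0 and 1 (an assumption; check the library's type definitions in case it reports 0 to 100 instead):

```typescript
// Hypothetical helper: turn a progress update into a one-line status string.
// Assumes `progress` is a fraction in [0, 1]; scale differently if the
// library reports percentages directly.
function formatProgress(stage: string, progress: number): string {
  const pct = Math.round(progress * 100);
  return `[${stage}] ${pct}%`;
}

console.log(formatProgress("translating", 0.42)); // "[translating] 42%"
```

You could then pass (p) => console.log(formatProgress(p.stage, p.progress)) as the onProgress option.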

Command-line interface

Vertana also provides a CLI for quick translations. Install it with:

deno install -g --name vertana --allow-all jsr:@vertana/cli
npm install -g @vertana/cli
pnpm add -g @vertana/cli
bun add -g @vertana/cli

Then translate files directly from the terminal:

vertana translate -l ko document.md