The language to build incredible AI agents
Write AI agents in minutes, not hours. Vibe makes prompts first-class citizens and handles the complexity so you can focus on what your agent does.
model translator = {
  name: "claude-haiku-4.5",
  provider: "anthropic",
  apiKey: env("ANTHROPIC_API_KEY")
}

const languages: text[] = do "List the major human languages"

for language in languages {
  const translated = do "Translate 'Hello World' into {language}"
  print(translated)
}

npm install -g @vibe-lang/vibe

AI-Native Syntax
Prompts are first-class language primitives. No wrapping strings in function calls, no async/await boilerplate. Just write what you want the AI to do.
// Traditional approach - verbose and clunky
import OpenAI from "openai";
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

const response = await openai.chat.completions.create({
  model: "gpt-5.2",
  messages: [{ role: "user", content: prompt }]
});
const answer = response.choices[0].message.content;
// Vibe - clean and expressive
const answer = do "Explain quantum computing"

Strong Typing
AI calls return typed values. Request a number, get a number. Request a boolean, get a boolean. No more JSON.parse() and hoping for the best.
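For contrast, here is what the manual route looks like in plain TypeScript: every response arrives as a string that you coerce and validate by hand before using it. The `coerceNumber` and `coerceBoolean` helpers below are illustrative only, not part of any SDK — they show the busywork that typed `do` calls remove.

```typescript
// Without typed AI calls, raw model output is always text that
// must be coerced and validated before it can be used safely.
function coerceNumber(raw: string): number {
  const n = Number(raw.trim());
  if (Number.isNaN(n)) throw new Error(`Expected a number, got: ${raw}`);
  return n;
}

function coerceBoolean(raw: string): boolean {
  const s = raw.trim().toLowerCase();
  if (s === "true" || s === "yes") return true;
  if (s === "false" || s === "no") return false;
  throw new Error(`Expected a boolean, got: ${raw}`);
}

const count = coerceNumber("8");      // the model answered "8"
const isPrime = coerceBoolean("Yes"); // the model answered "Yes"
console.log(count + 1);               // 9
console.log(!isPrime);                // false
```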
// Vibe automatically returns the right type
const count: number = do "How many planets?"
const isPrime: boolean = do "Is 17 prime?"
const tags: text[] = do "List 3 languages"
// Use them directly - no parsing needed
print(count + 1) // 9
print(!isPrime) // false
print(tags[0]) // "Python"

Seamless TypeScript Interop
Drop into TypeScript whenever you need it. Access the full npm ecosystem, use complex data structures, or implement custom logic.
// Embed TypeScript for complex operations
const result = ts(data) {
  const parsed = JSON.parse(data);
  return parsed.items
    .filter(item => item.score > 0.8)
    .map(item => item.name)
    .join(", ");
}

// Import from TypeScript files
import { processData } from "./utils.ts"

// Use npm packages directly
const html = ts(markdown) {
  return require('marked').parse(markdown);
}

The 'vibe' Keyword
The vibe keyword is the core of agent orchestration. It handles AI calls, tool loops, and structured output automatically. Just describe what you want.
import { writeFile } from "system/tools"

model poet = {
  name: "claude-haiku-4.5",
  provider: "anthropic",
  apiKey: env("ANTHROPIC_API_KEY"),
  tools: [writeFile] // define which tools are available to the model
}

const topics = ["sunset", "coffee", "mountains", "rain", "stars"]

// One call: generates a poem for each topic, writes each to a file
vibe "Write a poem for each topic in {topics} and save to separate files"

Smart Context
Vibe automatically manages AI context windows. Function-local variables are cleared on return, while results persist. Control loop context with modifiers.
function analyze(url: text): text {
  const html = fetch(url)
  const content = do "Extract article text: {html}"
  return do "Summarize the content of the article" // model is aware of the content variable in context
} // html and content cleared from context after return

const articles: text[] = ["https://example.com/1", "https://example.com/2"]
const summaries: text[] = []

for article in articles {
  const summary = analyze(article)
  summaries.push(summary)
} forget // discard all loop context after completion
// } compress // summarize loop context to save tokens
// } verbose // keep full loop context (default behavior)
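The scoping rule above can be modeled as a context stack: a function call opens a frame, and returning pops every entry in that frame except the result. The TypeScript below is a toy model of that behavior for illustration — it is not Vibe's actual runtime.

```typescript
// Toy model of Vibe's context window: function-local entries are
// dropped on return, while the returned result stays visible.
type Entry = { name: string; value: string };

class Context {
  private entries: Entry[] = [];
  add(name: string, value: string) { this.entries.push({ name, value }); }
  names(): string[] { return this.entries.map(e => e.name); }

  // Run fn in a fresh frame; afterwards keep only its result.
  call(resultName: string, fn: (ctx: Context) => string): string {
    const frameStart = this.entries.length;
    const result = fn(this);
    this.entries.length = frameStart; // clear locals (html, content, ...)
    this.add(resultName, result);     // the result persists
    return result;
  }
}

const ctx = new Context();
ctx.call("summary", (c) => {
  c.add("html", "<article>...</article>"); // local: cleared on return
  c.add("content", "article text");        // local: cleared on return
  return "a short summary";
});
console.log(ctx.names()); // ["summary"] - the locals are gone
```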
Custom Tools
Define tools that AI can invoke with full type safety. Implement inline with TypeScript or import from existing modules.
// Imported: Complex integration lives in separate file
import { createIncident } from "./pagerduty.ts"

tool alertOnCall(severity: text, title: text, details: text): json
@description "Create an incident and page the on-call engineer"
{
  ts(severity, title, details) {
    return createIncident({ severity, title, details })
  }
}

// Inline: Fetch data for AI to analyze
tool getMetrics(service: text, hours: number): json
@description "Get performance metrics for a service"
{
  ts(service, hours) {
    const res = await fetch(env("METRICS_API") + "/v1/query?service=" + service + "&hours=" + hours)
    return res.json()
  }
}
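Before the model can invoke a tool, its typed signature has to reach the provider as a tool schema. As a sketch of what that mapping might look like (hand-written here for illustration — Vibe's actual codegen may differ), `getMetrics` would correspond to something like this in Anthropic's Messages API tool format:

```typescript
// How a typed tool signature could map onto a provider tool schema.
// Hand-written for illustration, not generated Vibe output.
const getMetricsSchema = {
  name: "getMetrics",
  description: "Get performance metrics for a service",
  input_schema: {
    type: "object",
    properties: {
      service: { type: "string" }, // Vibe `text`   -> JSON Schema string
      hours: { type: "number" },   // Vibe `number` -> JSON Schema number
    },
    required: ["service", "hours"],
  },
} as const;

console.log(getMetricsSchema.input_schema.required); // ["service", "hours"]
```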
model monitor = {
  name: "claude-opus-4.5",
  provider: "anthropic",
  apiKey: env("ANTHROPIC_API_KEY"),
  tools: [getMetrics, alertOnCall]
}

// AI analyzes metrics and decides whether to alert
vibe "Check api-gateway metrics. Alert if critical."

Multi-Provider Support
Switch between OpenAI, Anthropic, and Google AI with a single line change. Same syntax, same behavior, different provider.
// Define models from different providers
model gpt = {
  name: "gpt-5.2",
  provider: "openai",
  apiKey: env("OPENAI_API_KEY")
}

model haiku = {
  name: "claude-haiku-4.5",
  provider: "anthropic",
  apiKey: env("ANTHROPIC_API_KEY")
}

model gemini = {
  name: "gemini-3-pro",
  provider: "google",
  apiKey: env("GOOGLE_API_KEY")
}

// Same code works with any model
const answer = do "Explain recursion"

Get Started in Seconds
Up and running with just a few commands
Install Vibe
# npm
npm install -g @vibe-lang/vibe
# bun
bun install -g @vibe-lang/vibe

Create hello.vibe
model ai = {
  name: "claude-haiku-4.5",
  provider: "anthropic",
  apiKey: "your-api-key"
}
const movies: text[] = do "What are your top 5 movies?"
print(movies)

Run it
vibe hello.vibe