LLM Infrastructure · Open Source · ✦ Free Tier

Vercel AI SDK

TypeScript SDK for streaming AI UIs

12,000 stars · Health 75 · Active · Dev Productivity & App Infrastructure

About

Open-source SDK for building AI-powered streaming UIs in Next.js and Node.js. Unified provider API supports OpenAI, Anthropic, and 20+ others.

Choose Vercel AI SDK when…

  • You're building a Next.js or React AI app
  • Streaming UI and server actions are central to your stack
  • You want multi-provider AI with one TypeScript SDK

Builder Slot

Which models does your stack route through?

Optional for most stacks

A gateway that normalizes calls across providers — one API for all models, with fallbacks

Dev Tools: Not applicable
App Infra: Optional
Hybrid: Optional


Stack Genome Detection

AIchitect's Genome scanner detects Vercel AI SDK in your project via these signals:

npm packages
ai, @ai-sdk/openai, @ai-sdk/anthropic, @ai-sdk/google

Integrates with (8)

Cursor · Coding Assistants

Cursor writes and edits the Next.js and Node.js code that uses the Vercel AI SDK — its agent mode understands the SDK's streaming and tool-calling patterns.

Faster AI-powered app development with Cursor's context awareness covering the SDK's provider switching and streaming UI patterns.

Mastra · Agent Frameworks

Mastra uses the Vercel AI SDK's model interface as its underlying LLM abstraction layer.

All Vercel AI SDK-compatible providers are natively available to Mastra agents through one consistent interface.

OpenAI API · LLM Infrastructure

The Vercel AI SDK wraps OpenAI's API in its unified provider interface, handling streaming, tool calling, and structured output natively.

Streaming AI UIs backed by OpenAI with one import — useChat, useCompletion, and tool calling work out of the box.

Anthropic API · LLM Infrastructure

The Vercel AI SDK wraps Anthropic's API in its provider interface, enabling Claude with the same streaming and tool-calling API as other providers.

Claude-powered streaming UIs in Next.js or Node.js with the same code as any other Vercel AI SDK provider.

LangChain · Pipelines & RAG

LangChain can be used as an orchestration layer that Vercel AI SDK calls feed into, or as a tool within SDK-powered streaming endpoints.

LangChain's retrieval and agent logic surfaced through Vercel AI SDK's streaming UI primitives in Next.js apps.

Langfuse · LLM Infrastructure

Langfuse's SDK wraps the Vercel AI SDK's model calls, capturing every streaming generation with token counts and latency.

Per-request observability on all AI calls made through the Vercel AI SDK — cost and quality metrics without changing streaming code.

LiteLLM · LLM Infrastructure

The Vercel AI SDK can point to LiteLLM's OpenAI-compatible endpoint as a custom provider, routing all SDK calls through LiteLLM.

Provider-agnostic Vercel AI SDK apps — swap between Claude, GPT-4o, and open models at the LiteLLM layer without changing SDK code.

Qdrant · LLM Infrastructure

Apps built with the Vercel AI SDK call Qdrant directly for retrieval in RAG endpoints, fetching context before passing it to the SDK's generate function.

Semantic retrieval in Vercel AI SDK streaming endpoints — context from Qdrant enriches every generation without breaking streaming.


Pricing

✦ Free tier available

In 4 stacks

Ruled out by 9 stacks

MCP Power User Stack
You're not building a product — you're augmenting your own workflow
No-Code AI Automation Stack
Code-first SDK; the target user doesn't write TypeScript
Agentic Coding Stack
SDK for building AI products, not for running autonomous coding agents
Browser AI / Web Agent Stack
Frontend SDK — irrelevant when the browser is the tool, not the UI layer
Data + AI Pipeline
Frontend streaming SDK — the pipeline runs server-side on scheduled triggers, not user requests
LLM Cost Reduction Stack
Frontend SDK — cost optimization happens at the routing and model-selection layer, not the UI
Legacy App + AI Stack
TypeScript/Next.js SDK — useless if your app is in Python, Ruby, or Java
Enterprise RAG Stack
Frontend streaming SDK — enterprise RAG serves multiple internal clients, not one web app
Research & Synthesis Stack
Frontend streaming SDK — synthesis pipelines run as background jobs, not interactive web requests

Badge

Add to your GitHub README

Vercel AI SDK on AIchitect:

```markdown
[![Vercel AI SDK](https://aichitect.dev/badge/tool/vercel-ai-sdk)](https://aichitect.dev/tool/vercel-ai-sdk)
```

Explore the full AI landscape

See how Vercel AI SDK fits into the bigger picture — browse all 207 tools and their relationships.
