TypeScript SDK for streaming AI UIs
Open-source SDK for building AI-powered streaming UIs in Next.js and Node.js. Unified provider API supports OpenAI, Anthropic, and 20+ others.
A gateway that normalizes calls across providers — one API for all models, with fallbacks
AIchitect's Genome scanner detects Vercel AI SDK in your project via these signals:

- `ai`
- `@ai-sdk/openai`
- `@ai-sdk/anthropic`
- `@ai-sdk/google`

Cursor authors Next.js and Node.js apps that use the Vercel AI SDK; its agent mode understands the SDK's streaming and tool-calling patterns.
→ Faster AI-powered app development with Cursor's context awareness covering the SDK's provider switching and streaming UI patterns.
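The detection signals above typically appear as dependencies in `package.json`. A minimal example (package versions are illustrative, not prescribed):

```json
{
  "dependencies": {
    "ai": "^4.0.0",
    "@ai-sdk/openai": "^1.0.0",
    "@ai-sdk/anthropic": "^1.0.0",
    "@ai-sdk/google": "^1.0.0"
  }
}
```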
Mastra uses the Vercel AI SDK's model interface as its underlying LLM abstraction layer.
→ All Vercel AI SDK-compatible providers are natively available to Mastra agents through one consistent interface.
The Vercel AI SDK wraps OpenAI's API in its unified provider interface, handling streaming, tool calling, and structured output natively.
→ Streaming AI UIs backed by OpenAI with one import — useChat, useCompletion, and tool calling work out of the box.
The Vercel AI SDK wraps Anthropic's API in its provider interface, enabling Claude with the same streaming and tool-calling API as other providers.
→ Claude-powered streaming UIs in Next.js or Node.js with the same code as any other Vercel AI SDK provider.
LangChain can be used as an orchestration layer that Vercel AI SDK calls feed into, or as a tool within SDK-powered streaming endpoints.
→ LangChain's retrieval and agent logic surfaced through Vercel AI SDK's streaming UI primitives in Next.js apps.
Langfuse's SDK wraps the Vercel AI SDK's model calls, capturing every streaming generation with token counts and latency.
→ Per-request observability on all AI calls made through the Vercel AI SDK — cost and quality metrics without changing streaming code.
The Vercel AI SDK can point to LiteLLM's OpenAI-compatible endpoint as a custom provider, routing all SDK calls through LiteLLM.
→ Provider-agnostic Vercel AI SDK apps — swap between Claude, GPT-4o, and open models at the LiteLLM layer without changing SDK code.
Apps built with the Vercel AI SDK call Qdrant directly for retrieval in RAG endpoints, fetching context before passing it to the SDK's generate function.
→ Semantic retrieval in Vercel AI SDK streaming endpoints — context from Qdrant enriches every generation without breaking streaming.
[Explore the full AI landscape](https://aichitect.dev/tool/vercel-ai-sdk)
See how Vercel AI SDK fits into the bigger picture — browse all 207 tools and their relationships.