LLM tracing, evaluation, and dataset management
LangChain's observability platform. Trace every LLM call, run evals on datasets, and manage prompts in production. It records every call, eval run, and cost, so you know exactly what your stack is doing.
AIchitect's Genome scanner detects LangSmith in your project via these signals:
the `langsmith` package, and the `LANGSMITH_API_KEY`, `LANGCHAIN_API_KEY`, and `LANGCHAIN_TRACING_V2` environment variables.
LangGraph sends traces to LangSmith automatically when `LANGCHAIN_TRACING_V2` is set: every node execution becomes a separate trace span.
→ Step-by-step graph execution visibility: see which nodes ran, in what order, with what inputs, outputs, and token cost.
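The detection signals above can be checked mechanically. A sketch, where the signal set comes from the text but the scanning logic (function name, requirements parsing) is a hypothetical illustration, not AIchitect's actual scanner:

```python
import os
import re

# Signals listed above: the package name and the three env vars.
PACKAGE_SIGNALS = {"langsmith"}
ENV_SIGNALS = {"LANGSMITH_API_KEY", "LANGCHAIN_API_KEY", "LANGCHAIN_TRACING_V2"}

def detect_langsmith(requirements_text: str, env: dict) -> list[str]:
    """Return the LangSmith signals found in a requirements file and an env mapping."""
    # Strip version pins/extras to get bare package names.
    pkgs = {
        re.split(r"[<>=\[ ]", line.strip(), maxsplit=1)[0].lower()
        for line in requirements_text.splitlines()
        if line.strip() and not line.lstrip().startswith("#")
    }
    hits = [f"package:{p}" for p in sorted(PACKAGE_SIGNALS & pkgs)]
    hits += [f"env:{v}" for v in sorted(ENV_SIGNALS & env.keys())]
    return hits
```

For example, a project with `langsmith==0.1.0` in its requirements and `LANGCHAIN_TRACING_V2` set would report both signals.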
LangSmith is LangChain's native tracing platform — one env var enables automatic tracing of every chain, LLM call, and tool invocation.
→ Zero-friction observability for any LangChain app — complete execution traces without adding a single line of instrumentation.
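A minimal sketch of the one-env-var setup described above; the key value is a placeholder, and the commented-out chain call is illustrative:

```python
import os

# The documented switch for LangSmith tracing: enable it and supply a key.
os.environ["LANGCHAIN_TRACING_V2"] = "true"
os.environ["LANGCHAIN_API_KEY"] = "<your-langsmith-api-key>"  # placeholder
os.environ["LANGCHAIN_PROJECT"] = "my-app"  # optional: groups traces by project

# With tracing enabled, ordinary LangChain code is traced with no code changes:
# from langchain_openai import ChatOpenAI
# ChatOpenAI(model="gpt-4o-mini").invoke("hello")  # appears in LangSmith
```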
PydanticAI sends traces to LangSmith via its OpenTelemetry exporter.
→ Structured observability for PydanticAI agents — typed inputs and outputs visible alongside raw traces in LangSmith.
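A hedged sketch of pointing an OpenTelemetry exporter at LangSmith: the endpoint path, header name, and `Agent.instrument_all()` call are assumptions to verify against the current LangSmith and PydanticAI docs before use:

```python
import os

# Assumed OTLP wiring (check LangSmith's OpenTelemetry docs):
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "https://api.smith.langchain.com/otel"
os.environ["OTEL_EXPORTER_OTLP_HEADERS"] = "x-api-key=<your-langsmith-api-key>"

# With the exporter configured, turn on PydanticAI's instrumentation:
# from pydantic_ai import Agent
# Agent.instrument_all()  # emits OTel spans for every agent run
```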
[Explore the full AI landscape](https://aichitect.dev/tool/langsmith)
See how LangSmith fits into the bigger picture — browse all 207 tools and their relationships.