Announcing langchain-vercel-ai-sdk-adapter — Connect Your Python LangChain Backend to Vercel AI SDK Frontend #14421
This discussion was automatically closed because the community moved to community.vercel.com/ai-sdk
Hey everyone! I've been working on a small but useful adapter that solves a real interoperability problem: connecting Python LangChain backends with Vercel AI SDK frontend
clients.
The Problem
If you've tried pairing LangChain (Python) on the backend with Vercel AI SDK's useChat hook on the frontend, you've hit a wall: LangChain's streaming output format doesn't match what Vercel AI SDK expects. The SSE format, message structure, and ID generation all differ.
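To make the request-side mismatch concrete, here's a rough sketch of the translation `to_base_messages` has to perform. The AI SDK v5 useChat hook posts UI messages whose content lives in a list of typed "parts"; the sketch below converts those into plain `(role, text)` tuples rather than real LangChain `BaseMessage` objects, and the exact field names may vary by SDK version:

```python
def ui_messages_to_simple(payload: dict) -> list[tuple[str, str]]:
    """Convert an AI SDK-style useChat request body into (role, text) pairs.

    Illustrative only: the real adapter produces LangChain BaseMessage
    objects, and the UI message shape here is an assumption based on the
    AI SDK v5 protocol.
    """
    role_map = {"user": "human", "assistant": "ai"}
    converted = []
    for msg in payload.get("messages", []):
        # UI messages carry content as a list of typed "parts";
        # concatenate just the text parts.
        text = "".join(
            part.get("text", "")
            for part in msg.get("parts", [])
            if part.get("type") == "text"
        )
        converted.append((role_map.get(msg["role"], msg["role"]), text))
    return converted

body = {"messages": [{"role": "user",
                      "parts": [{"type": "text", "text": "Hi!"}]}]}
print(ui_messages_to_simple(body))  # [('human', 'Hi!')]
```

LangChain, by contrast, wants something like `HumanMessage(content="Hi!")`, so neither side can talk to the other without a translation layer.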
The Solution
langchain-vercel-ai-sdk-adapter is a lightweight bridge that handles the translation:
```python
# FastAPI shown here; any ASGI framework with a similar Request object works
from fastapi import FastAPI, Request
from langchain_anthropic import ChatAnthropic
from langchain_ai_sdk_adapter import (
    to_base_messages,
    to_ui_message_stream,
    create_ui_message_stream_response,
)

app = FastAPI()

@app.post("/chat")
async def chat(request: Request):
    model = ChatAnthropic(model="claude-3-5-sonnet-20241022")
    # Translate the useChat request body into LangChain messages
    langchain_messages = await to_base_messages(await request.json())
    # astream returns an async iterator directly, so it isn't awaited
    stream = model.astream(langchain_messages)
    return create_ui_message_stream_response(to_ui_message_stream(stream))
```
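Under the hood, the response side of such a bridge boils down to an async generator that re-emits each LangChain chunk as an SSE event the frontend understands. This is a conceptual sketch, not the adapter's actual implementation (the real UI message stream protocol has more event types, IDs, and start/finish framing), demonstrated with a stand-in chunk class instead of a live model stream:

```python
import asyncio
import json
from typing import AsyncIterator

async def to_sse_events(chunks) -> AsyncIterator[str]:
    """Re-emit LangChain-style chunks as AI SDK-flavoured SSE lines.

    Conceptual sketch only; event names are assumptions based on the
    AI SDK's UI message stream protocol.
    """
    async for chunk in chunks:
        event = {"type": "text-delta", "delta": chunk.content}
        yield f"data: {json.dumps(event)}\n\n"
    # Terminator so the client knows the stream is complete
    yield "data: [DONE]\n\n"

class FakeChunk:
    """Stand-in for a LangChain AIMessageChunk (only .content is used)."""
    def __init__(self, content):
        self.content = content

async def fake_stream():
    for piece in ["Hel", "lo"]:
        yield FakeChunk(piece)

async def main():
    return [line async for line in to_sse_events(fake_stream())]

lines = asyncio.run(main())
print(lines[0])  # data: {"type": "text-delta", "delta": "Hel"}
```

In the real adapter this generator would be wrapped in a streaming HTTP response with `Content-Type: text/event-stream`, which is roughly what `create_ui_message_stream_response` takes care of.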
Why This Exists
LangChain is maturing fast on the Python side, and Vercel AI SDK is dominating the React/Next.js frontend space. But integrating them has been painful. This adapter is the minimal, focused solution I wished existed.
Check It Out
Would love to hear if this solves a problem for anyone else, or if there are additional features you'd want to see added.