LangChain in React Applications
Integrating LangChain with React requires a client-server architecture because LangChain uses API keys that must be kept secret. The frontend sends requests to your backend API, which runs LangChain and streams responses back to the client.
⚛️ Architecture Pattern
- React Frontend: UI components for chat/input
- API Route: Server endpoint that runs LangChain
- Streaming: Real-time token delivery to the client
- State Management: Handle messages, loading, errors
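The frontend examples below all assume such a backend exists. As a rough sketch only, a minimal Express server endpoint running LangChain might look like this (the `/api/chat` path, the `gpt-4o-mini` model name, and the port are assumptions; adapt to your framework and environment, with `OPENAI_API_KEY` set on the server):

```typescript
// server/chat.ts — hypothetical server-side sketch, assuming express
// and @langchain/openai are installed and OPENAI_API_KEY is set.
import express from 'express';
import { ChatOpenAI } from '@langchain/openai';

const app = express();
app.use(express.json());

const model = new ChatOpenAI({ model: 'gpt-4o-mini' });

app.post('/api/chat', async (req, res) => {
  const { messages } = req.body as {
    messages: { role: 'user' | 'assistant'; content: string }[];
  };
  // Map frontend roles onto LangChain message tuples
  const lcMessages = messages.map(
    (m) => [m.role === 'user' ? 'human' : 'ai', m.content] as [string, string]
  );
  const result = await model.invoke(lcMessages);
  res.json({ content: result.content });
});

app.listen(3001);
```

The key point is that the API key never leaves this process; the browser only ever sees `/api/chat`.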
Project Setup
Set up a React + TypeScript project and install LangChain plus the Vercel AI SDK for streaming.
# Create React app with Vite
npm create vite@latest my-ai-app -- --template react-ts
cd my-ai-app
# Install dependencies
npm install langchain @langchain/openai @langchain/core
npm install ai # Vercel AI SDK for streaming
Custom Chat Hook
Create a reusable hook for chat functionality:
This hook manages messages, loading state, and errors while posting to your API.
// src/hooks/useChat.ts
import { useState, useCallback } from 'react';
interface Message {
id: string;
role: 'user' | 'assistant';
content: string;
}
export function useChat() {
const [messages, setMessages] = useState<Message[]>([]);
const [isLoading, setIsLoading] = useState(false);
const [error, setError] = useState<string | null>(null);
const sendMessage = useCallback(async (content: string) => {
const userMessage: Message = {
id: Date.now().toString(),
role: 'user',
content,
};
setMessages(prev => [...prev, userMessage]);
setIsLoading(true);
setError(null);
try {
const response = await fetch('/api/chat', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({
messages: [...messages, userMessage],
}),
});
if (!response.ok) throw new Error('Failed to send message');
const data = await response.json();
const assistantMessage: Message = {
id: (Date.now() + 1).toString(),
role: 'assistant',
content: data.content,
};
setMessages(prev => [...prev, assistantMessage]);
} catch (err) {
setError(err instanceof Error ? err.message : 'Unknown error');
} finally {
setIsLoading(false);
}
}, [messages]);
const clearMessages = () => setMessages([]);
return { messages, isLoading, error, sendMessage, clearMessages };
}
Streaming Chat Hook
Handle streaming responses for real-time token display:
Read streamed chunks from the response body and append them to the assistant message.
// src/hooks/useStreamingChat.ts
import { useState, useCallback } from 'react';
interface Message {
id: string;
role: 'user' | 'assistant';
content: string;
}
export function useStreamingChat() {
const [messages, setMessages] = useState<Message[]>([]);
const [isLoading, setIsLoading] = useState(false);
const sendMessage = useCallback(async (content: string) => {
const userMessage: Message = {
id: Date.now().toString(),
role: 'user',
content,
};
// Add user message immediately
setMessages(prev => [...prev, userMessage]);
setIsLoading(true);
// Create placeholder for assistant response
const assistantId = (Date.now() + 1).toString();
setMessages(prev => [...prev, {
id: assistantId,
role: 'assistant',
content: '',
}]);
try {
const response = await fetch('/api/chat/stream', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ messages: [...messages, userMessage] }),
});
if (!response.ok || !response.body) {
throw new Error('Failed to stream response');
}
const reader = response.body.getReader();
const decoder = new TextDecoder();
while (true) {
const { done, value } = await reader.read();
if (done) break;
// { stream: true } buffers multi-byte characters split across chunks
const chunk = decoder.decode(value, { stream: true });
// Update assistant message with new content
setMessages(prev => prev.map(msg =>
msg.id === assistantId
? { ...msg, content: msg.content + chunk }
: msg
));
}
} catch (error) {
console.error('Streaming error:', error);
} finally {
setIsLoading(false);
}
}, [messages]);
return { messages, isLoading, sendMessage };
}
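One subtlety in the read loop above: a multi-byte UTF-8 character can be split across two network chunks. Calling `decoder.decode(value, { stream: true })` makes the decoder buffer the partial bytes instead of emitting a replacement character. A small self-contained sketch:

```typescript
// 'é' is two bytes in UTF-8 (0xC3 0xA9); simulate it arriving in two chunks.
const streamingDecoder = new TextDecoder();

// Without { stream: true }, an incomplete sequence decodes to U+FFFD (�).
const broken = new TextDecoder().decode(new Uint8Array([0xc3]));

// With { stream: true }, the decoder waits until the character completes.
const part1 = streamingDecoder.decode(new Uint8Array([0xc3]), { stream: true }); // ''
const part2 = streamingDecoder.decode(new Uint8Array([0xa9]), { stream: true }); // 'é'

console.log(broken);        // '�'
console.log(part1 + part2); // 'é'
```

This matters most for non-English model output, where multi-byte characters are common.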
Chat Component
A basic UI that renders messages and sends input through the streaming hook.
// src/components/Chat.tsx
import { useState, useRef, useEffect } from 'react';
import { useStreamingChat } from '../hooks/useStreamingChat';
export function Chat() {
const [input, setInput] = useState('');
const { messages, isLoading, sendMessage } = useStreamingChat();
const messagesEndRef = useRef<HTMLDivElement>(null);
// Auto-scroll to bottom
useEffect(() => {
messagesEndRef.current?.scrollIntoView({ behavior: 'smooth' });
}, [messages]);
const handleSubmit = (e: React.FormEvent) => {
e.preventDefault();
if (!input.trim() || isLoading) return;
sendMessage(input);
setInput('');
};
return (
<div className="flex flex-col h-screen max-w-2xl mx-auto">
{/* Messages */}
<div className="flex-1 overflow-y-auto p-4 space-y-4">
{messages.map(message => (
<div
key={message.id}
className={`p-4 rounded-lg ${
message.role === 'user'
? 'bg-blue-100 ml-auto max-w-[80%]'
: 'bg-gray-100 max-w-[80%]'
}`}
>
<p className="text-sm font-medium mb-1">
{message.role === 'user' ? 'You' : 'AI'}
</p>
<p className="whitespace-pre-wrap">{message.content}</p>
</div>
))}
<div ref={messagesEndRef} />
</div>
{/* Input */}
<form onSubmit={handleSubmit} className="p-4 border-t">
<div className="flex gap-2">
<input
type="text"
value={input}
onChange={e => setInput(e.target.value)}
placeholder="Type a message..."
className="flex-1 p-2 border rounded-lg"
disabled={isLoading}
/>
<button
type="submit"
disabled={isLoading || !input.trim()}
className="px-4 py-2 bg-blue-500 text-white rounded-lg disabled:opacity-50"
>
{isLoading ? 'Sending...' : 'Send'}
</button>
</div>
</form>
</div>
);
}
Using Vercel AI SDK
The Vercel AI SDK simplifies streaming integration:
The useChat hook handles input state, message streaming, and API wiring for you.
// Using Vercel AI SDK's useChat hook
// (AI SDK v3 import path; newer major versions move it to '@ai-sdk/react')
import { useChat } from 'ai/react';
export function ChatWithAISDK() {
const { messages, input, handleInputChange, handleSubmit, isLoading } = useChat({
api: '/api/chat',
});
return (
<div className="flex flex-col h-screen">
<div className="flex-1 overflow-y-auto p-4">
{messages.map(m => (
<div key={m.id} className="mb-4">
<strong>{m.role === 'user' ? 'You' : 'AI'}:</strong>
<p>{m.content}</p>
</div>
))}
</div>
<form onSubmit={handleSubmit} className="p-4 border-t">
<input
value={input}
onChange={handleInputChange}
placeholder="Say something..."
className="w-full p-2 border rounded"
/>
</form>
</div>
);
}
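The `useChat` hook expects a streaming response from `/api/chat`. If your backend is a Next.js App Router route handler, the AI SDK's `LangChainAdapter` can bridge a LangChain stream into the format `useChat` consumes. A hedged sketch (the file path and model name are assumptions, and these adapter APIs have shifted between AI SDK major versions, so check your installed version's docs):

```typescript
// app/api/chat/route.ts — hypothetical Next.js App Router sketch
import { LangChainAdapter } from 'ai';
import { ChatOpenAI } from '@langchain/openai';

export async function POST(req: Request) {
  const { messages } = await req.json();

  const model = new ChatOpenAI({ model: 'gpt-4o-mini' });

  // model.stream() yields message chunks; the adapter converts them
  // into the wire format useChat understands.
  const stream = await model.stream(
    messages.map(
      (m: { role: string; content: string }) =>
        [m.role === 'user' ? 'human' : 'ai', m.content] as [string, string]
    )
  );

  return LangChainAdapter.toDataStreamResponse(stream);
}
```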
State Management with Zustand
Use Zustand when you need shared chat state across multiple components.
// src/stores/chatStore.ts
import { create } from 'zustand';
interface Message {
id: string;
role: 'user' | 'assistant';
content: string;
}
interface ChatStore {
messages: Message[];
isLoading: boolean;
addMessage: (message: Message) => void;
updateMessage: (id: string, content: string) => void;
setLoading: (loading: boolean) => void;
clearMessages: () => void;
}
export const useChatStore = create<ChatStore>((set) => ({
messages: [],
isLoading: false,
addMessage: (message) =>
set((state) => ({ messages: [...state.messages, message] })),
updateMessage: (id, content) =>
set((state) => ({
messages: state.messages.map((m) =>
m.id === id ? { ...m, content } : m
),
})),
setLoading: (loading) => set({ isLoading: loading }),
clearMessages: () => set({ messages: [] }),
}));
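Components subscribe to this store with selectors, and because Zustand stores work outside React via `getState()`, the streaming logic can live in a plain module. A sketch of wiring the store to a streaming fetch (the `/api/chat/stream` endpoint is an assumption carried over from the earlier hook):

```typescript
// src/services/chat.ts — hypothetical sketch driving the store from outside React
import { useChatStore } from '../stores/chatStore';

export async function sendMessage(content: string) {
  const { addMessage, updateMessage, setLoading } = useChatStore.getState();

  addMessage({ id: Date.now().toString(), role: 'user', content });
  const assistantId = (Date.now() + 1).toString();
  addMessage({ id: assistantId, role: 'assistant', content: '' });
  setLoading(true);

  try {
    const response = await fetch('/api/chat/stream', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ messages: useChatStore.getState().messages }),
    });
    if (!response.ok || !response.body) throw new Error('Request failed');

    const reader = response.body.getReader();
    const decoder = new TextDecoder();
    let text = '';
    while (true) {
      const { done, value } = await reader.read();
      if (done) break;
      text += decoder.decode(value, { stream: true });
      updateMessage(assistantId, text);
    }
  } finally {
    setLoading(false);
  }
}
```

Any component can then call `sendMessage` directly while others re-render from the shared store.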
⚠️ Security Reminder
Never import LangChain directly in client-side React code. Always use API routes to keep your API keys secure. The examples above assume you have a backend API.
💡 Key Takeaways
- LangChain runs on the server; React handles the UI
- Use streaming for a better user experience
- Custom hooks encapsulate chat logic cleanly
- The Vercel AI SDK simplifies streaming integration
- Consider Zustand for global state management
📚 Learn More
- Vercel AI SDK Documentation → Simplified AI streaming for React applications.
- LangChain Deployment Guide → Best practices for deploying LangChain apps.