What is LangChain?
LangChain is a powerful framework for building applications powered by Large Language Models (LLMs). It provides tools and abstractions to connect LLMs with external data sources, enable memory in conversations, create agents that can use tools, and build complex AI-powered workflows.
Why LangChain?
- Model Agnostic: Works with OpenAI, Anthropic, Google, local models, and more
- Composable: Build complex chains from simple, reusable components
- Memory: Add conversation history and context to your apps
- RAG: Connect LLMs to your own data sources
- Agents: Create autonomous AI that can use tools and make decisions
Core Concepts
Models
LLMs and Chat Models from various providers
Prompts
Templates for structuring inputs to models
Chains
Sequences of calls to models and other components
Memory
Persist state between chain runs
Retrieval
Load and query external data sources
Agents
Autonomous systems that use tools
LangChain.js vs LangChain Python
LangChain is available in both Python and JavaScript/TypeScript. For React and Next.js applications, use LangChain.js, which provides the same core features, adapted to the JavaScript ecosystem.
| Feature | LangChain.js | LangChain Python |
|---|---|---|
| Runtime | Node.js, Edge, Browser | Python |
| TypeScript | Native support | Type hints |
| Streaming | Excellent | Good |
| Integrations | Growing | Extensive |
| Best for | Web apps, APIs | Data science, scripts |
Setting Up Your First Project
Step 1: Install LangChain
Install the core LangChain packages first, then add the provider SDK you plan to use.
# Install core LangChain.js
npm install langchain @langchain/core
# Install provider-specific packages
npm install @langchain/openai # For OpenAI
npm install @langchain/anthropic # For Claude
npm install @langchain/google-genai # For Google AI
Step 2: Set Up Environment Variables
Store API keys in .env.local so they stay server-side and out of source control.
# .env.local
OPENAI_API_KEY=sk-your-openai-key
# or
ANTHROPIC_API_KEY=sk-ant-your-anthropic-key
Step 3: Your First LangChain Call
Initialize a chat model and make a simple invocation to verify your setup.
import { ChatOpenAI } from "@langchain/openai";
// Initialize the model
const model = new ChatOpenAI({
modelName: "gpt-4",
temperature: 0.7,
});
// Make a simple call
const response = await model.invoke("What is LangChain?");
console.log(response.content);
Project Structure
This structure keeps LangChain setup in a shared library and exposes API routes for server-side calls.
my-langchain-app/
├── src/
│   ├── lib/
│   │   ├── langchain.ts   # LangChain setup
│   │   └── chains/        # Custom chains
│   ├── app/
│   │   └── api/
│   │       └── chat/      # API routes
│   └── components/
├── .env.local             # API keys
└── package.json
Using Different Providers
Swap providers by importing the corresponding SDK and choosing the model name you need.
// OpenAI
import { ChatOpenAI } from "@langchain/openai";
const openai = new ChatOpenAI({ modelName: "gpt-4" });
// Anthropic (Claude)
import { ChatAnthropic } from "@langchain/anthropic";
const claude = new ChatAnthropic({ modelName: "claude-3-opus-20240229" });
// Google AI
import { ChatGoogleGenerativeAI } from "@langchain/google-genai";
const gemini = new ChatGoogleGenerativeAI({ modelName: "gemini-pro" });
Security Note
Never expose API keys in client-side code. Always use environment variables and make LangChain calls from server-side code (API routes in Next.js or server components).
Key Takeaways
- LangChain is a framework for building LLM-powered applications
- LangChain.js is optimized for JavaScript/TypeScript and web apps
- It supports multiple LLM providers (OpenAI, Anthropic, Google, etc.)
- Core concepts: Models, Prompts, Chains, Memory, Retrieval, Agents
Learn More
- LangChain.js Documentation: official documentation for LangChain.js.
- Quick Start Guide: get started with your first LangChain project.
- LangChain.js GitHub: source code and examples.