TechLead
Lesson 1 of 18
5 min read
LangChain

Introduction to LangChain

Learn what LangChain is, its core concepts, and how to set up your first project

What is LangChain?

LangChain is a powerful framework for building applications powered by Large Language Models (LLMs). It provides tools and abstractions to connect LLMs with external data sources, enable memory in conversations, create agents that can use tools, and build complex AI-powered workflows.

πŸ”— Why LangChain?

  • Model Agnostic: Works with OpenAI, Anthropic, Google, local models, and more
  • Composable: Build complex chains from simple, reusable components
  • Memory: Add conversation history and context to your apps
  • RAG: Connect LLMs to your own data sources
  • Agents: Create autonomous AI that can use tools and make decisions

Core Concepts

πŸ€– Models

LLMs and Chat Models from various providers

πŸ“ Prompts

Templates for structuring inputs to models

⛓️ Chains

Sequences of calls to models and other components

🧠 Memory

Persist state between chain runs

πŸ“š Retrieval

Load and query external data sources

πŸ› οΈ Agents

Autonomous systems that use tools

LangChain.js vs LangChain Python

LangChain is available in both Python and JavaScript/TypeScript. For React and Next.js applications, we use LangChain.js, which provides the same powerful features optimized for the JavaScript ecosystem.

Feature        LangChain.js             LangChain Python
Runtime        Node.js, Edge, Browser   Python
TypeScript     Native support           Type hints
Streaming      Excellent                Good
Integrations   Growing                  Extensive
Best for       Web apps, APIs           Data science, scripts

Setting Up Your First Project

Step 1: Install LangChain

Install the core LangChain packages first, then add the provider SDK you plan to use.

# Install core LangChain.js
npm install langchain @langchain/core

# Install provider-specific packages
npm install @langchain/openai      # For OpenAI
npm install @langchain/anthropic   # For Claude
npm install @langchain/google-genai # For Google AI

Step 2: Set Up Environment Variables

Store API keys in .env.local so they stay server-side and out of source control.

# .env.local
OPENAI_API_KEY=sk-your-openai-key
# or
ANTHROPIC_API_KEY=sk-ant-your-anthropic-key
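
Server code can fail fast when a key is missing. A small sketch; `requireEnv` is a hypothetical helper, not part of LangChain, and the key value below is set inline only to demonstrate it (Next.js loads .env.local into process.env for you):

```typescript
// Hypothetical helper: read a required key from the environment,
// throwing a clear error instead of failing later inside an API call
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing environment variable: ${name}`);
  }
  return value;
}

// Simulate a loaded .env.local for this demo
process.env.OPENAI_API_KEY = "sk-demo-placeholder";
console.log(requireEnv("OPENAI_API_KEY")); // → "sk-demo-placeholder"
```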

Step 3: Your First LangChain Call

Initialize a chat model and make a simple invocation to verify your setup.

import { ChatOpenAI } from "@langchain/openai";

// Initialize the model
const model = new ChatOpenAI({
  model: "gpt-4",
  temperature: 0.7,
});

// Make a simple call
const response = await model.invoke("What is LangChain?");
console.log(response.content);

Project Structure

This structure keeps LangChain setup in a shared library and exposes API routes for server-side calls.

my-langchain-app/
β”œβ”€β”€ src/
β”‚   β”œβ”€β”€ lib/
β”‚   β”‚   β”œβ”€β”€ langchain.ts     # LangChain setup
β”‚   β”‚   └── chains/          # Custom chains
β”‚   β”œβ”€β”€ app/
β”‚   β”‚   └── api/
β”‚   β”‚       └── chat/        # API routes
β”‚   └── components/
β”œβ”€β”€ .env.local               # API keys
└── package.json

Using Different Providers

Swap providers by importing the corresponding SDK and choosing the model name you need.

// OpenAI
import { ChatOpenAI } from "@langchain/openai";
const openai = new ChatOpenAI({ model: "gpt-4" });

// Anthropic (Claude)
import { ChatAnthropic } from "@langchain/anthropic";
const claude = new ChatAnthropic({ model: "claude-3-opus-20240229" });

// Google AI
import { ChatGoogleGenerativeAI } from "@langchain/google-genai";
const gemini = new ChatGoogleGenerativeAI({ model: "gemini-pro" });

⚠️ Security Note

Never expose API keys in client-side code. Always use environment variables and make LangChain calls from server-side code (API routes in Next.js or server components).
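
For example, a minimal Next.js App Router handler can keep the key on the server; the file path and request shape here are assumptions for illustration:

```typescript
// src/app/api/chat/route.ts — runs on the server, so the API key
// read from .env.local never reaches the browser
import { ChatOpenAI } from "@langchain/openai";
import { NextResponse } from "next/server";

const model = new ChatOpenAI({ model: "gpt-4", temperature: 0.7 });

export async function POST(req: Request) {
  const { message } = await req.json();
  const response = await model.invoke(message);
  return NextResponse.json({ reply: response.content });
}
```

The client then calls `/api/chat` with a JSON body and only ever sees the model's reply, never the credentials.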

πŸ’‘ Key Takeaways

  • LangChain is a framework for building LLM-powered applications
  • LangChain.js is optimized for JavaScript/TypeScript and web apps
  • It supports multiple LLM providers (OpenAI, Anthropic, Google, etc.)
  • Core concepts: Models, Prompts, Chains, Memory, Retrieval, Agents
