## What Is Tool Calling?
Tool calling (also known as function calling) lets LLMs decide when to call external functions to gather information or take actions. Instead of hallucinating answers, the model can call a weather API, query a database, do math, or interact with any system you define.
## 🛠️ How It Works

1. **Define tools** — Describe functions with names, descriptions, and parameter schemas
2. **Bind to model** — Attach tools to the LLM so it knows what's available
3. **LLM decides** — Based on the user's message, the model decides whether to call a tool
4. **Execute** — Your code runs the function and returns the result to the LLM
5. **Final answer** — The LLM uses the tool result to generate the final response
## Creating Tools with the tool() Function
```typescript
import { tool } from "@langchain/core/tools";
import { z } from "zod";

// Define a weather tool
const getWeather = tool(
  async ({ city }) => {
    // In production, call a real weather API
    const weatherData: Record<string, string> = {
      "new york": "72°F, Sunny",
      "london": "58°F, Cloudy",
      "tokyo": "80°F, Humid",
    };
    return weatherData[city.toLowerCase()] || "Weather data not available";
  },
  {
    name: "get_weather",
    description: "Get the current weather for a city",
    schema: z.object({
      city: z.string().describe("The city name to get weather for"),
    }),
  }
);

// Define a calculator tool
const calculator = tool(
  async ({ expression }) => {
    try {
      // Simple eval for demo — use a proper math library in production
      const result = Function(`"use strict"; return (${expression})`)();
      return String(result);
    } catch {
      return "Invalid expression";
    }
  },
  {
    name: "calculator",
    description: "Evaluate a mathematical expression",
    schema: z.object({
      expression: z.string().describe("The math expression to evaluate, e.g. '2 + 2'"),
    }),
  }
);
```
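Because the calculator's handler is ordinary TypeScript, its evaluation strategy can be checked in isolation. The sketch below extracts the same logic into a plain function (no LangChain involved) to show how the strict-mode `Function` constructor behaves on valid and invalid expressions:

```typescript
// Same evaluation strategy as the calculator tool's handler, as a plain
// function: the Function constructor compiles the expression in strict
// mode, without access to local variables.
function evaluate(expression: string): string {
  try {
    const result = Function(`"use strict"; return (${expression})`)();
    return String(result);
  } catch {
    return "Invalid expression";
  }
}

console.log(evaluate("25 * 4")); // 100
console.log(evaluate("2 +* 2")); // Invalid expression
```

A syntax error in the expression throws when the `Function` constructor compiles it, which is why the `catch` block covers malformed input as well as runtime errors.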
## Binding Tools to a Model

```typescript
import { ChatOpenAI } from "@langchain/openai";

const model = new ChatOpenAI({ modelName: "gpt-4" });

// Bind tools to the model
const modelWithTools = model.bindTools([getWeather, calculator]);

// The model now decides when to call tools
const response = await modelWithTools.invoke(
  "What's the weather in Tokyo, and what's 25 * 4?"
);

// Check if the model wants to call tools
console.log(response.tool_calls);
// [
//   { name: "get_weather", args: { city: "Tokyo" } },
//   { name: "calculator", args: { expression: "25 * 4" } }
// ]
```
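Before reaching for an agent, it helps to see what the execute step involves. The sketch below simulates it with a plain dispatch map and hard-coded handlers (hypothetical stand-ins, not the LangChain API); in a real app each result would be sent back to the model as a tool message so it can compose the final answer:

```typescript
// Simulated execute step: route each tool call to a local handler.
// `handlers` is a hypothetical stand-in for real tool implementations.
type ToolCall = { name: string; args: Record<string, string> };

const handlers: Record<string, (args: Record<string, string>) => string> = {
  get_weather: (args) =>
    args.city.toLowerCase() === "tokyo" ? "80°F, Humid" : "Weather data not available",
  calculator: (args) =>
    String(Function(`"use strict"; return (${args.expression})`)()),
};

function runToolCalls(toolCalls: ToolCall[]): string[] {
  // One result per call, in the order the model requested them
  return toolCalls.map((tc) => handlers[tc.name](tc.args));
}

console.log(
  runToolCalls([
    { name: "get_weather", args: { city: "Tokyo" } },
    { name: "calculator", args: { expression: "25 * 4" } },
  ])
);
```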
## Automatic Tool Execution

Use an agent to automatically execute tools and feed results back to the model.

```typescript
import { ChatOpenAI } from "@langchain/openai";
import { createToolCallingAgent, AgentExecutor } from "langchain/agents";
import { ChatPromptTemplate } from "@langchain/core/prompts";

const tools = [getWeather, calculator];

const prompt = ChatPromptTemplate.fromMessages([
  ["system", "You are a helpful assistant with access to tools."],
  ["human", "{input}"],
  ["placeholder", "{agent_scratchpad}"],
]);

const agent = createToolCallingAgent({
  llm: new ChatOpenAI({ modelName: "gpt-4" }),
  tools,
  prompt,
});

const executor = new AgentExecutor({ agent, tools });

const result = await executor.invoke({
  input: "What's the weather in New York? Also, if the temperature is 72°F, what's that in Celsius? Use the formula (F - 32) * 5/9",
});

console.log(result.output);
// "The weather in New York is 72°F and Sunny.
//  72°F in Celsius is approximately 22.2°C."
```
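The conversion formula the agent is asked to apply can be verified directly:

```typescript
// The formula from the prompt above: Celsius = (Fahrenheit - 32) * 5/9
function fahrenheitToCelsius(f: number): number {
  return ((f - 32) * 5) / 9;
}

console.log(fahrenheitToCelsius(72).toFixed(1)); // 22.2
```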
## Building a Database Query Tool

```typescript
import { tool } from "@langchain/core/tools";
import { z } from "zod";
import { createClient } from "@supabase/supabase-js";

// Assumes SUPABASE_URL and SUPABASE_ANON_KEY are set in the environment
const supabase = createClient(
  process.env.SUPABASE_URL!,
  process.env.SUPABASE_ANON_KEY!
);

const queryDatabase = tool(
  async ({ query, table }) => {
    // Example: query your Supabase database with full-text search
    const { data, error } = await supabase
      .from(table)
      .select("*")
      .textSearch("content", query)
      .limit(5);
    if (error) return `Error: ${error.message}`;
    return JSON.stringify(data);
  },
  {
    name: "query_database",
    description: "Search the database for information",
    schema: z.object({
      query: z.string().describe("The search query"),
      table: z
        .enum(["articles", "products", "users"])
        .describe("Which table to search"),
    }),
  }
);
```
A web search tool follows the same pattern:

```typescript
const searchWeb = tool(
  async ({ query }) => {
    // Replace with your search provider's real endpoint and API key
    const response = await fetch(
      `https://api.search.com?q=${encodeURIComponent(query)}`
    );
    const data = await response.json();
    return JSON.stringify(data.results.slice(0, 3));
  },
  {
    name: "search_web",
    description: "Search the web for current information",
    schema: z.object({
      query: z.string().describe("The search query"),
    }),
  }
);
```
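Tool arguments come from the model, so treat them as untrusted input before they reach a real database. A minimal validation sketch (hypothetical helper; the allow-list matches the `z.enum` in `query_database`, and the 200-character cap is an arbitrary example):

```typescript
// Defensive validation for model-generated arguments: allow-list the
// table name and cap the query length before touching the database.
const ALLOWED_TABLES = new Set(["articles", "products", "users"]);

function sanitizeQueryInput(
  table: string,
  query: string
): { table: string; query: string } {
  if (!ALLOWED_TABLES.has(table)) {
    throw new Error(`Table not allowed: ${table}`);
  }
  return { table, query: query.slice(0, 200).trim() };
}
```

The Zod schema already enforces the enum at parse time; doing it again at the boundary guards against tools being called outside the schema-validated path.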
## Tool Choice Control

```typescript
// Force the model to use a specific tool
const result = await modelWithTools.invoke("Hello!", {
  tool_choice: { type: "function", function: { name: "get_weather" } },
});

// Let the model decide (default)
const result2 = await modelWithTools.invoke("What's 2 + 2?", {
  tool_choice: "auto",
});

// Prevent tool use for this call
const result3 = await modelWithTools.invoke("Just say hello", {
  tool_choice: "none",
});
```
## 💡 Key Takeaways

- Tool calling lets LLMs interact with external APIs, databases, and functions
- Define tools with a name, description, and Zod schema for parameters
- Use `bindTools()` to attach tools, or `AgentExecutor` for automatic execution
- The LLM decides when and which tools to call based on the user's input
- Always validate and sanitize tool inputs — never trust raw LLM output