Creating custom tools in LangGraph
How to create custom tools for a chatbot so it can perform specific actions such as accessing a database or calling an API.
The Problem
Implementing a conversational agent that can call domain‑specific APIs (e.g., adding items to a packlist or querying gym data) requires a clean way to expose those actions to an LLM. LangGraph’s tool‑binding and routing make it possible to interleave LLM text generation with concrete tool executions inside a single graph.
Services and Libraries
Tool – Declares a callable function with name, description, and implementation.
ToolNode – Executes any tool calls found in the LLM output.
init_chat_model – Instantiates the chat model (a Gemini model in this example).
StateGraph – Builds a state machine where nodes can be LLM or tool invocations.
CompiledStateGraph – The runnable, compiled version of the graph.
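For reference, these components typically come from the modules shown below; the State shared by the later snippets is also sketched here, assuming the standard LangGraph messages pattern (the original State may define extra fields).
# Example code (sketch of assumed imports and shared state)
from typing import Annotated

from typing_extensions import TypedDict

from langchain.chat_models import init_chat_model
from langchain_core.tools import Tool
from langgraph.graph import END, START, StateGraph
from langgraph.graph.message import add_messages
from langgraph.graph.state import CompiledStateGraph
from langgraph.prebuilt import ToolNode


class State(TypedDict):
    """Conversation state: a list of messages accumulated via the add_messages reducer."""

    messages: Annotated[list, add_messages]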
See the full code on GitHub: https://github.com/bernatsampera/langgraph-playground/tree/main/examples/introduction/custom_tools_chatbot
Implementation Details
1. Define domain tools
Create plain Python functions that interact with the external API, then wrap them in Tool objects so the LLM can discover them by name and description.
# Example code
def show_gyms_city(city: str) -> list:
    """Retrieve a list of gyms in the specified city."""
    print(f"Showing gyms in {city}")
    return ["Gym 1", "Gym 2", "Gym 3"]


def show_gym_details(gym_name: str) -> str:
    """Retrieve details for a specific gym."""
    print(f"Showing details for {gym_name}")
    return f"{gym_name} is a great gym"


tools = [
    Tool(
        name="show_gyms_city",
        description="List available gyms in a specified city",
        func=show_gyms_city,
    ),
    Tool(
        name="show_gym_details",
        description="Show detailed information about a specific gym",
        func=show_gym_details,
    ),
]
2. Bind tools to the LLM
Attach the tool list to the chat model so it can emit structured tool calls in its output.
# Example code
llm = init_chat_model("google_genai:gemini-2.5-flash-lite")
llm_with_tools = llm.bind_tools(tools)
3. Create the ToolNode
Instantiate a node that will run any tool calls the LLM produces.
# Example code
tool_node = ToolNode(tools)
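Under the hood, ToolNode reads the tool calls attached to the last AI message in state["messages"], executes the matching tools, and appends the results as tool messages so the LLM can use them on its next pass through the chatbot node.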
4. Chatbot node
The node that sends the current conversation state to the LLM and returns the LLM’s response.
# Example code
def chatbot(state: State) -> dict:
    """Process the state and generate a response using the LLM."""
    return {"messages": [llm_with_tools.invoke(state["messages"])]}
5. Routing logic
Decide whether to send the output to the tool node or terminate the dialogue.
# Example code
def should_continue(state: State) -> str:
    """Determine whether to route to the tools node or end the conversation."""
    last_message = state["messages"][-1]
    if last_message.tool_calls:
        return "tools"
    return END
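If you prefer not to hand-write this router, LangGraph also ships a prebuilt tools_condition helper in langgraph.prebuilt that performs the same check.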
6. Build and compile the graph
Add the nodes and conditional edges, then compile for execution.
# Example code
def build_graph() -> CompiledStateGraph:
    """Create and configure the state graph for the chatbot."""
    graph_builder = StateGraph(State)
    graph_builder.add_node("chatbot", chatbot)
    graph_builder.add_node("tools", tool_node)
    graph_builder.add_edge(START, "chatbot")
    graph_builder.add_conditional_edges("chatbot", should_continue)
    graph_builder.add_edge("tools", "chatbot")
    compiled_graph = graph_builder.compile()
    return compiled_graph
By following these steps, a LangGraph‑based chatbot can seamlessly integrate custom domain tools, enabling the LLM to delegate actionable tasks while maintaining conversational flow.
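To close the loop, here is a minimal usage sketch; the question and the final print are illustrative and not part of the original example.
# Example code (usage sketch; the prompt is illustrative)
from langchain_core.messages import HumanMessage

graph = build_graph()

# A question that should trigger the show_gyms_city tool.
result = graph.invoke(
    {"messages": [HumanMessage(content="Which gyms are there in Barcelona?")]}
)

# The last message is the LLM's final answer, produced after any tool calls ran.
print(result["messages"][-1].content)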
Related Posts
Create a chatbot: Part 3 - Adding the frontend
This post explains how to use assistant-ui to render a chatbot and handle the calls to the backend.
How to handle Human in the loop with Langgraph and FastAPI
A practical guide to building a chatbot with human review capabilities. Learn how to combine AI chat with human oversight using React, FastAPI, and Gemini LLM. Perfect for developers looking to create more reliable AI chat applications that benefit from human expertise.
Create a chatbot: Part 4 - Adding memory
In this part we'll see how to add memory to our LangGraph chatbot so it can remember previous parts of a conversation.
Let's connect!
Get in touch if you want updates, examples, and insights on how AI agents, LangChain, and more are evolving and where they're going next.