By Bernat Sampera · 2 min read

Create a chatbot: Part 1 - Creating the graph with langgraph

How to use langgraph to create a chatbot that is wrapped in a FastAPI instance and displayed in a React frontend. This first part explains how to set up the project.

Clone the demo template

Clone the template from https://github.com/bernatsampera/reactfastapitemplate and set it up by following the README.

(If you are using VS Code, I'd recommend disabling Pylance, Pyright or any other Python linting and using Ruff instead; the configuration is already in the pyproject.toml.)

Requirements

Get a GOOGLE_API_KEY

In this project I am using Gemini (google_genai:gemini-2.5-flash-lite). To set it up, go to https://aistudio.google.com/apikey and get your own API key, then add it to a .env file inside backend/:

GOOGLE_API_KEY=<your_api_key>
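If you want to confirm the key is picked up before wiring it into the graph, a quick throwaway check like the one below works once the dependencies from the next section are installed (the file name check_env.py is just an example, it is not part of the template):

# backend/check_env.py (temporary check, delete afterwards)
import os

from dotenv import load_dotenv

load_dotenv()  # reads backend/.env

# Fail loudly now instead of getting a vaguer error from the model later
assert os.getenv("GOOGLE_API_KEY"), "GOOGLE_API_KEY not found in .env"
print("GOOGLE_API_KEY loaded")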

Create the basic chatbot in langgraph

Add these dependencies to pyproject.toml:

dependencies = [
    "dotenv>=0.9.9",
    "langchain>=0.3.27",
    "langchain-google-genai>=2.1.9",
    "langgraph>=0.6.6",
    "langgraph-cli"
]

Then install them with:

uv sync

Then create a chatbot folder inside backend/ and add the basic chatbot there:

# backend/chatbot/graph.py
from typing import Annotated

from dotenv import load_dotenv
from langchain.chat_models import init_chat_model
from langgraph.graph import END, START, StateGraph
from langgraph.graph.message import add_messages
from typing_extensions import TypedDict

load_dotenv()

class State(TypedDict):
    messages: Annotated[list, add_messages]

# Initialize the LLM
llm = init_chat_model("google_genai:gemini-2.5-flash-lite")

# Define the chatbot node
def chatbot(state: State):
    return {"messages": [llm.invoke(state["messages"])]}

# Build the graph
graph_builder = StateGraph(State)
graph_builder.add_node("chatbot", chatbot)
graph_builder.add_edge(START, "chatbot")
graph_builder.add_edge("chatbot", END)

graph = graph_builder.compile()
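Before opening Studio you can sanity-check the graph directly from Python. The snippet below is only a sketch (the file name and prompt are arbitrary and not part of the template); it assumes you run it from backend/ with the .env in place:

# backend/try_graph.py (hypothetical quick test)
from chatbot.graph import graph

if __name__ == "__main__":
    # add_messages accepts plain role/content dicts and converts them to message objects
    result = graph.invoke({"messages": [{"role": "user", "content": "Hello, who are you?"}]})
    # The last message in the returned state is the model's reply
    print(result["messages"][-1].content)

Run it with uv run python try_graph.py from the backend/ folder.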

Start LangGraph Studio

To visualise and interact with the graph, create the file backend/langgraph.json:

{
  "dockerfile_lines": [],
  "graphs": {
    "chatbot": "./chatbot/graph.py:graph"
  },
  "python_version": "3.12",
  "env": ".env",
  "dependencies": ["."],
  "auth": {}
}

Add the following to pyproject.toml so the chatbot package is picked up when the project is installed:

[tool.setuptools]
packages = ["chatbot"]

Start LangGraph Studio with:

 uvx --refresh --from "langgraph-cli[inmem]" --with-editable . --python 3.12 langgraph dev --allow-blocking

The LangGraph Studio URL should open automatically in your browser.
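Once the dev server is running you can also call the graph programmatically instead of through the Studio UI. The following is only a sketch: it assumes the server is on its default port 2024 and that langgraph-sdk is installed; "chatbot" is the graph name declared in langgraph.json.

# hypothetical snippet: stream a threadless run against the local dev server
import asyncio

from langgraph_sdk import get_client

async def main():
    client = get_client(url="http://localhost:2024")
    async for chunk in client.runs.stream(
        None,  # threadless run
        "chatbot",  # graph name from langgraph.json
        input={"messages": [{"role": "user", "content": "Hi there!"}]},
        stream_mode="updates",
    ):
        print(chunk.event, chunk.data)

asyncio.run(main())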

Let's connect!

Get in touch if you want updates, examples, and insights on how AI agents, Langchain and more are evolving and where they’re going next.