By Bernat Sampera · 2 min read

Create a chatbot: Part 2 - Creating the FastAPI endpoints

How to use LangGraph to create a chatbot that is wrapped in a FastAPI instance and displayed in a React frontend. This second part explains how to use FastAPI to create the endpoints that the frontend will call.

This post starts from the end of part 1; you can find the code here: https://github.com/bernatsampera/chatbot/tree/part1

Setting up FastAPI

Install the dependencies required to get the FastAPI backend up and running:

uv add fastapi uvicorn pydantic

Creating the endpoints

Replace the current content of the main.py file with the following:

from chatbot.graph import graph
import uvicorn
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware
from pydantic import BaseModel


app = FastAPI(
    title="Template Project REACT + Fastapi",
    description="Template Project REACT + Fastapi",
    version="1.0.0",
)

# Add CORS middleware before defining routes
app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)

class ChatRequest(BaseModel):
    message: str

@app.post("/chat")
def chat(request: ChatRequest):
    """Chat endpoint to start the graph with a user message."""
    # Create initial state with user message
    initial_state = {"messages": [{"role": "user", "content": request.message}]}

    # Run the graph
    result = graph.invoke(initial_state)

    # Extract the assistant's response
    assistant_message = result["messages"][-1].content

    return {"response": assistant_message}


if __name__ == "__main__":
    uvicorn.run("main:app", host="127.0.0.1", port=8008, reload=True)
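A quick note on the CORS middleware above: allow_origins=["*"] is convenient in development, but once you know the frontend's URL you can lock it down. A minimal sketch of the same add_middleware call with a restricted origin, assuming the React app runs on Vite's default dev URL (an assumption, not from the original project):

```python
# Restrict CORS to the known frontend origin instead of "*".
# http://localhost:5173 is Vite's default dev server URL; adjust to your setup.
app.add_middleware(
    CORSMiddleware,
    allow_origins=["http://localhost:5173"],
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)
```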

Here, the most interesting part is what's being passed into the graph:

    initial_state = {"messages": [{"role": "user", "content": request.message}]}

We pass the messages because they are the state of the agent, which is what's used as the input. The State definition from part 1 looks like this:

class State(TypedDict):
    messages: Annotated[list, add_messages]
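The Annotated[list, add_messages] part tells LangGraph to merge new messages into the existing list instead of overwriting the state key. A rough, plain-Python sketch of that reducer behavior (illustrative only; LangGraph's real add_messages also handles message IDs, deduplication, and message types):

```python
from typing import Annotated, TypedDict, get_type_hints


def add_messages(existing: list, new: list) -> list:
    """Reducer: append new messages instead of replacing the list."""
    return existing + new


class State(TypedDict):
    messages: Annotated[list, add_messages]


def apply_update(state: State, update: dict) -> State:
    """Merge an update into the state, using each field's reducer if present."""
    hints = get_type_hints(State, include_extras=True)
    merged = dict(state)
    for key, value in update.items():
        metadata = getattr(hints[key], "__metadata__", ())
        reducer = next((m for m in metadata if callable(m)), None)
        merged[key] = reducer(state[key], value) if reducer else value
    return merged  # type: ignore[return-value]


state: State = {"messages": [{"role": "user", "content": "Hello"}]}
state = apply_update(state, {"messages": [{"role": "assistant", "content": "Hi!"}]})
print(len(state["messages"]))  # both messages are kept
```

This is why each graph run can be fed just the new user message: the reducer accumulates it with whatever is already in the state.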

Test the backend

curl -X POST http://127.0.0.1:8008/chat \
  -H "Content-Type: application/json" \
  -d '{"message": "Hello, how are you?"}'

Example output

{"response":"Hello! I am doing well, thank you for asking. As a large language model, I don't experience emotions or physical states like humans do, but I am functioning optimally and ready to assist you.\n\nHow are you today?"}   
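The same request can also be made from Python using only the standard library. A small sketch, assuming the server is running with the host and port from the uvicorn settings above:

```python
import json
from urllib import request

API_URL = "http://127.0.0.1:8008/chat"


def build_request(message: str) -> request.Request:
    """Build the POST request that the /chat endpoint expects."""
    payload = json.dumps({"message": message}).encode("utf-8")
    return request.Request(
        API_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


def ask(message: str) -> str:
    """Send a message to the chatbot and return the assistant's reply."""
    with request.urlopen(build_request(message)) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    # Requires the FastAPI server from above to be running.
    print(ask("Hello, how are you?"))
```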

Let's connect!

Get in touch if you want updates, examples, and insights on how AI agents, LangChain, and more are evolving and where they're going next.