Create a chatbot: Part 4 - Adding memory
In this part we'll see how to add memory to our LangGraph chatbot, so it can remember earlier messages in a conversation.
This post starts from the end of part 3, which you can find here: https://github.com/bernatsampera/chatbot/tree/part3
By default, LangGraph Studio already handles remembering the current conversation and thread, but if we want to build our own application with FastAPI we need to implement this logic ourselves.
Add the memory store to the graph
For this example we'll just use the MemorySaver, but for production it's better to use one of the other built-in LangGraph checkpointers that persist to a database. Take a look at https://samperalabs.com/posts/basic-langgraph-chatbot-with-memory to see how it's done.
# chatbot/chat.py
from langgraph.checkpoint.memory import MemorySaver

memory = MemorySaver()
graph = graph_builder.compile(checkpointer=memory)
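To build intuition for what the checkpointer does, here is a toy sketch in plain Python (not the real MemorySaver API, just the idea behind it): saved state is keyed by thread_id, so every call on the same thread sees the messages from previous turns, while other threads stay isolated.

```python
# Toy illustration of a checkpointer: state is stored per thread_id,
# so each conversation accumulates its own history.
class ToyCheckpointer:
    def __init__(self):
        self.store = {}  # thread_id -> list of (role, content) tuples

    def load(self, thread_id):
        return self.store.get(thread_id, [])

    def save(self, thread_id, messages):
        self.store[thread_id] = messages


def invoke(checkpointer, thread_id, user_message):
    # Prior messages for this thread are loaded, extended, and saved back,
    # which is roughly what LangGraph does around each graph run.
    messages = checkpointer.load(thread_id) + [("user", user_message)]
    messages.append(("assistant", f"echo: {user_message}"))
    checkpointer.save(thread_id, messages)
    return messages


cp = ToyCheckpointer()
invoke(cp, "thread-1", "hello")
history = invoke(cp, "thread-1", "are you there?")
print(len(history))  # 4: both turns of thread-1 were remembered
```

The real MemorySaver keeps this state in process memory, which is why it disappears on restart and why a database-backed checkpointer is preferable in production.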
Handle the thread_id from the backend endpoint
import uuid

from fastapi import FastAPI
from pydantic import BaseModel

from chatbot.chat import graph

app = FastAPI()


class ChatRequest(BaseModel):
    message: str
    conversation_id: str | None = None


@app.post("/chat")
def chat(request: ChatRequest):
    """Chat endpoint to start the graph with a user message."""
    # Get or create conversation ID (thread_id in LangGraph)
    thread_id = request.conversation_id
    if not thread_id:
        thread_id = str(uuid.uuid4())

    # Create the config with thread_id for LangGraph's checkpointer
    config = {"configurable": {"thread_id": thread_id}}

    # Create input state with the new user message
    input_state = {"messages": [{"role": "user", "content": request.message}]}

    # Run the graph with the config - LangGraph will handle memory automatically
    result = graph.invoke(input_state, config)

    # Extract the assistant's response
    assistant_message = result["messages"][-1]
    return {"response": assistant_message.content, "conversation_id": thread_id}
Update the frontend so it remembers the current conversation id
Let's create a React ref to keep the conversation ID across the session.
const createChatModelAdapter = (
  conversationIdRef: React.RefObject<string | null>
): ChatModelAdapter => ({
  ...
      body: JSON.stringify({
        message: userMessage,
        conversation_id: conversationIdRef.current
      }),
  ...
      const data = await response.json();
      const responseText = data.response || ERROR_MESSAGES.NO_RESPONSE;

      // Store the conversation ID for future requests
      if (data.conversation_id) {
        conversationIdRef.current = data.conversation_id;
      }

      return {
        content: [
          {
            type: "text",
            text: responseText
          }
        ]
      };
And in the React component:
const conversationIdRef = useRef<string | null>(null);

const chatRuntime = useLocalRuntime(
  createChatModelAdapter(conversationIdRef)
);
Related Posts
Create a chatbot: Part 2 - Creating the FastApi endpoints
How to use LangGraph to create a chatbot that is wrapped in a FastAPI instance and displayed in the frontend with React. This second part explains how to use FastAPI to create the endpoints that will be accessed from the frontend.
Create a chatbot: Part 1 - Creating the graph with langgraph
How to use LangGraph to create a chatbot that is wrapped in a FastAPI instance and displayed in the frontend with React. This first part explains how to set up the project.
Basic Langgraph chatbot with memory
How can we give a chatbot memory? LangGraph has the tools for it. In this quick guide we'll see how to add memory to a chatbot and how to persist that memory in a database.
Let's connect!
Get in touch if you want updates, examples, and insights on how AI agents, Langchain and more are evolving and where they’re going next.