Basic LangGraph chatbot with memory
How can we give a chatbot memory? LangGraph has the tools for it. In this quick guide we'll see how to add memory to a chatbot and how to persist that memory in a database.
The Problem
We want a chatbot that can retain conversation context across multiple turns without relying on a persistent database, yet also provide an optional durable storage fallback. Implementing memory in LangGraph requires integrating a checkpointing mechanism that captures and restores the graph’s state.
Services and Libraries
- langgraph.graph – builds the conversational graph and manages state transitions.
- langgraph.checkpoint.memory.InMemorySaver – lightweight in-memory state persistence.
- langgraph.checkpoint.sqlite.SqliteSaver – durable SQLite-based checkpointing.
- langchain.chat_models.init_chat_model – initializes the language model backend.
- dotenv.load_dotenv – loads environment variables for secure credentials.
- sqlite3 – Python's SQLite driver for on-disk storage.
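The snippets below reference a graph_builder, so here is a minimal sketch of how the chat graph itself might be wired up with these pieces. The model name is a placeholder, not the one used in the example repository.

```python
from dotenv import load_dotenv
from langchain.chat_models import init_chat_model
from langgraph.graph import START, MessagesState, StateGraph

load_dotenv()  # loads the model provider's API key from a .env file

# Placeholder model; use whichever provider/model you have credentials for.
llm = init_chat_model("openai:gpt-4o-mini")


def chatbot(state: MessagesState):
    # Call the LLM with the accumulated message history and append its reply.
    return {"messages": [llm.invoke(state["messages"])]}


graph_builder = StateGraph(MessagesState)
graph_builder.add_node("chatbot", chatbot)
graph_builder.add_edge(START, "chatbot")
```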
Implementation Details
Choose a checkpointing strategy
- In-memory: ideal for short-lived sessions or testing. Example: https://github.com/bernatsampera/langgraph-playground/blob/main/examples/introduction/memory_chatbot/main.py
- SQLite: persists state to disk, enabling recovery after restarts. Example: https://github.com/bernatsampera/langgraph-playground/blob/main/examples/introduction/sqlite_memory_chatbot/main.py
Instantiate the checkpoint saver
```python
# In-memory
from langgraph.checkpoint.memory import InMemorySaver

checkpointer = InMemorySaver()

# SQLite
import sqlite3

from langgraph.checkpoint.sqlite import SqliteSaver

conn = sqlite3.connect("checkpoints.sqlite", check_same_thread=False)
checkpointer = SqliteSaver(conn)
```
Compile the graph with the chosen saver
```python
graph = graph_builder.compile(checkpointer=checkpointer)
```
Configure thread‑specific context
```python
from langchain_core.runnables import RunnableConfig

config = RunnableConfig({"configurable": {"thread_id": "1"}})  # or a custom ID
```
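Each thread_id maps to its own checkpointed history, so two different IDs behave like two independent conversations. A quick sketch to illustrate (the thread IDs here are arbitrary):

```python
# Two configs, two separate memories.
config_alice = RunnableConfig({"configurable": {"thread_id": "alice"}})
config_bob = RunnableConfig({"configurable": {"thread_id": "bob"}})

graph.invoke({"messages": [{"role": "user", "content": "My name is Alice."}]}, config_alice)

# The "bob" thread never saw Alice's message, so the model can't know her name.
result = graph.invoke({"messages": [{"role": "user", "content": "What's my name?"}]}, config_bob)
print(result["messages"][-1].content)
```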
Run and stream updates
```python
def stream_graph_updates(user_input: str):
    events = graph.stream(
        {"messages": [{"role": "user", "content": user_input}]},
        config,
        stream_mode="values",
    )
    for event in events:
        event["messages"][-1].pretty_print()
```
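To try it out interactively, a simple loop like the one in the LangGraph quickstart works well (a sketch; the exit keywords are just a convention):

```python
# Minimal chat loop; type "quit" to stop.
while True:
    user_input = input("User: ")
    if user_input.lower() in {"quit", "exit", "q"}:
        print("Goodbye!")
        break
    stream_graph_updates(user_input)
```

Between turns you can also call graph.get_state(config) to inspect what the checkpointer has stored for that thread.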
Graceful shutdown (SQLite only)
```python
conn.close()
```
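If you'd rather not manage the connection yourself, recent versions of langgraph-checkpoint-sqlite also let you use SqliteSaver.from_conn_string as a context manager that opens and closes the connection for you. A sketch, assuming a current version of the package:

```python
from langgraph.checkpoint.sqlite import SqliteSaver

# The context manager opens the SQLite connection and closes it on exit.
with SqliteSaver.from_conn_string("checkpoints.sqlite") as checkpointer:
    graph = graph_builder.compile(checkpointer=checkpointer)
    stream_graph_updates("Hello!")
```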
By swapping the checkpointer implementation, the same LangGraph workflow can switch seamlessly between transient in-memory state and durable SQLite storage, providing flexible memory management for chatbot sessions.
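One way to make that swap explicit is to pick the checkpointer from an environment variable loaded with dotenv. This is just a sketch; the CHECKPOINTER variable name is an assumption, not something from the example repositories.

```python
import os
import sqlite3

from dotenv import load_dotenv
from langgraph.checkpoint.memory import InMemorySaver
from langgraph.checkpoint.sqlite import SqliteSaver

load_dotenv()

# Hypothetical env var: "sqlite" for durable storage, anything else for RAM.
if os.getenv("CHECKPOINTER") == "sqlite":
    conn = sqlite3.connect("checkpoints.sqlite", check_same_thread=False)
    checkpointer = SqliteSaver(conn)
else:
    checkpointer = InMemorySaver()

graph = graph_builder.compile(checkpointer=checkpointer)
```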