By Bernat Sampera

Basic LangGraph chatbot with memory

How can we give a chatbot memory? LangGraph has the tools for it. In this quick guide we'll see how to add memory to a chatbot and how to persist that memory in a database.

The Problem

We want a chatbot that can retain conversation context across multiple turns without relying on a persistent database, yet also provide an optional durable storage fallback. Implementing memory in LangGraph requires integrating a checkpointing mechanism that captures and restores the graph’s state.

Services and Libraries

  • langgraph.graph – builds the conversational graph and manages state transitions.

  • langgraph.checkpoint.memory.InMemorySaver – lightweight in‑memory state persistence.

  • langgraph.checkpoint.sqlite.SqliteSaver – durable SQLite‑based checkpointing.

  • langchain.chat_models.init_chat_model – initializes the language model backend.

  • dotenv.load_dotenv – loads environment variables for secure credentials.

  • sqlite3 – Python’s SQLite driver for on‑disk storage.
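
Before wiring up the graph, the examples load credentials with load_dotenv and initialize the model with init_chat_model. Here is a minimal sketch; the provider/model string "openai:gpt-4o-mini" is only an assumption, any model supported by init_chat_model works:

    from dotenv import load_dotenv
    from langchain.chat_models import init_chat_model

    # Load OPENAI_API_KEY (or another provider's key) from a local .env file
    load_dotenv()

    # "openai:gpt-4o-mini" is a placeholder; pass any "provider:model" string
    llm = init_chat_model("openai:gpt-4o-mini")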

Implementation Details

Choose a checkpointing strategy

  • In‑memory: ideal for short‑lived sessions or testing.

In-memory example: https://github.com/bernatsampera/langgraph-playground/blob/main/examples/introduction/memory_chatbot/main.py

  • SQLite: persists state to disk, enabling recovery after restarts.

SQLite example: https://github.com/bernatsampera/langgraph-playground/blob/main/examples/introduction/sqlite_memory_chatbot/main.py

  1. Instantiate the checkpoint saver

    # In‑memory
    from langgraph.checkpoint.memory import InMemorySaver
    checkpointer = InMemorySaver()
    
    # SQLite
    import sqlite3
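    # SqliteSaver ships in the separate langgraph-checkpoint-sqlite package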
    from langgraph.checkpoint.sqlite import SqliteSaver
    conn = sqlite3.connect("checkpoints.sqlite", check_same_thread=False)
    checkpointer = SqliteSaver(conn)
    
  2. Compile the graph with the chosen saver (graph_builder is a standard StateGraph; a complete runnable sketch follows after these steps)

    graph = graph_builder.compile(checkpointer=checkpointer)
    
  3. Configure thread‑specific context

    from langchain_core.runnables import RunnableConfig
    config = RunnableConfig(configurable={"thread_id": "1"})  # or a custom ID
    
  4. Run and stream updates

    def stream_graph_updates(user_input: str):
        events = graph.stream(
            {"messages": [{"role": "user", "content": user_input}]},
            config,
            stream_mode="values",
        )
        for event in events:
            event["messages"][-1].pretty_print()
    
  5. Graceful shutdown (SQLite only)

    conn.close()
    
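Putting it together, here is a minimal end-to-end sketch of the in-memory variant. The State schema, the chatbot node, the model string, and the quit/exit loop are illustrative assumptions; the linked example may differ in the details:

    from typing import Annotated
    from typing_extensions import TypedDict

    from dotenv import load_dotenv
    from langchain.chat_models import init_chat_model
    from langgraph.checkpoint.memory import InMemorySaver
    from langgraph.graph import StateGraph, START, END
    from langgraph.graph.message import add_messages

    load_dotenv()

    # Conversation state: add_messages appends new messages instead of overwriting them
    class State(TypedDict):
        messages: Annotated[list, add_messages]

    llm = init_chat_model("openai:gpt-4o-mini")  # model choice is an assumption

    def chatbot(state: State):
        # The checkpointer restores this thread's earlier messages before each turn
        return {"messages": [llm.invoke(state["messages"])]}

    graph_builder = StateGraph(State)
    graph_builder.add_node("chatbot", chatbot)
    graph_builder.add_edge(START, "chatbot")
    graph_builder.add_edge("chatbot", END)

    # Swap InMemorySaver for SqliteSaver(conn) to persist across restarts
    checkpointer = InMemorySaver()
    graph = graph_builder.compile(checkpointer=checkpointer)

    config = {"configurable": {"thread_id": "1"}}

    def stream_graph_updates(user_input: str):
        events = graph.stream(
            {"messages": [{"role": "user", "content": user_input}]},
            config,
            stream_mode="values",
        )
        for event in events:
            event["messages"][-1].pretty_print()

    if __name__ == "__main__":
        while True:
            user_input = input("You: ")
            if user_input.lower() in {"quit", "exit"}:
                break
            stream_graph_updates(user_input)

Calls that share a thread_id share the same conversation history. With the SQLite saver that history survives process restarts; with the in-memory saver it lasts only for the current process.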

By swapping the checkpointer implementation, the same LangGraph workflow can switch seamlessly between transient in‑memory state and durable SQLite storage, providing flexible memory management for chatbot sessions.