
Persistent Memory for Chatbots using PostgreSQL and LangChain

In today’s rapidly evolving AI landscape, chatbots have become essential tools for organizations looking to simplify customer support and enhance employee experiences. As chatbots are deployed across more sectors, advanced capabilities such as contextual understanding and memory retention are more critical than ever. To give chatbots persistent memory with PostgreSQL and LangChain, we can leverage the PostgresChatMessageHistory component within the langchain-postgres package. In this article, we dive deep into how PostgreSQL can serve as persistent memory for building natural and engaging conversational chatbots using LangChain with the PostgresChatMessageHistory component.

To facilitate the development of conversational AI, LangChain is a cutting-edge framework that can be integrated with various memory and retrieval mechanisms. PostgreSQL, a robust and scalable relational database, serves as an ideal backend for storing complex datasets and ensuring data persistence.

Why PostgreSQL for Integrations?

PostgreSQL consistently ranks high in the DB-Engines Ranking, a popular indicator of database management system popularity. PostgreSQL was announced as the DBMS of the Year for 2023, and its consistently growing ranking reflects its increasing adoption and recognition within the industry. Read our summary of PostgreSQL achievements in 2023.

PostgreSQL is not only a relational database; it also supports NoSQL (JSON) and vector database capabilities. Leveraging PostgreSQL lets users maintain a single database engine that supports multiple requirements. The PostgresChatMessageHistory component from langchain-postgres enables you to seamlessly store your chatbot’s conversation history in the same database that holds other relevant user information. This integration goes beyond chat history: we can store structured user data (e.g., user profiles, transaction records) and unstructured data (e.g., chat messages, document embeddings) in PostgreSQL. We can also integrate with support ticketing and token tracking systems by joining on session data.


Why Memory is Crucial in Conversations?

To begin with, imagine having a conversation where you constantly repeat yourself because the other person forgets what you just said. Well, that’s the limitation of chatbots without memory. A chatbot needs to access and process information from previous exchanges in order to have a meaningful conversation.

This implies:

  • Remembering past user questions and responses
  • Maintaining a context of the conversation topic
  • Recognizing and keeping track of entities mentioned earlier (e.g., names, locations)
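As a minimal illustration of the points above (plain Python with hypothetical names, not a LangChain API), memory boils down to carrying prior turns into the next prompt so references to earlier entities can be resolved:

```python
# A toy conversation memory: without it, each prompt loses all context.
history = []  # list of (role, text) pairs from earlier turns

def remember(role, text):
    history.append((role, text))

def build_prompt(user_input):
    # Prepend every past exchange so the model can resolve references
    # like "there" or "it" against entities mentioned earlier.
    context = "\n".join(f"{role}: {text}" for role, text in history)
    return f"{context}\nhuman: {user_input}"

remember("human", "I'm planning a trip to Lisbon.")
remember("ai", "Great choice! When are you going?")
prompt = build_prompt("What's the weather like there in May?")
```

With memory, the final prompt still contains "Lisbon", so the model can understand what "there" refers to.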

LangChain’s Memory System

LangChain offers a toolbox for incorporating memory into your conversational AI systems. These tools are currently under development (beta) and primarily work with the older chain syntax (Legacy chains). However, the production-ready ChatMessageHistory functionality integrates seamlessly with the newer LCEL syntax.

Designing a Memory System

Building an effective memory system involves two key decisions:

Storage: How will you store information about past interactions?
LangChain offers options ranging from simple in-memory lists of chat messages to integrations with persistent databases for long-term storage.

Querying: How will you retrieve specific information from the stored data?
A basic system might return the most recent messages. More sophisticated systems might provide summaries of past conversations, extract entities, or use past information to tailor responses to the current context.
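These two decisions can be sketched in a few lines of plain Python (illustrative names, not a LangChain class): storage as an append-only list, and querying as "return the most recent k messages":

```python
class SimpleMemory:
    """Toy memory store: append-only storage plus a window query."""

    def __init__(self):
        self._messages = []  # storage decision: an in-memory list

    def add(self, role, text):
        self._messages.append({"role": role, "text": text})

    def last_k(self, k):
        # Querying decision: a basic system returns the most recent messages.
        return self._messages[-k:]

mem = SimpleMemory()
for i in range(5):
    mem.add("human", f"message {i}")
recent = mem.last_k(2)  # only the two most recent messages
```

A persistent backend like PostgreSQL replaces the in-memory list, while the querying side can grow into summarization or entity extraction.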



ChatMessageHistory

ChatMessageHistory is a super lightweight wrapper that provides convenient methods for saving HumanMessages and AIMessages, and then fetching them all.

You may want to use this class directly if you are managing memory outside of a chain.

pip install langchain langchain-community

from langchain.memory import ChatMessageHistory

history = ChatMessageHistory()

history.add_user_message("hi!")
history.add_ai_message("whats up?")

Here, ChatMessageHistory() initializes a memory object, and then, using methods like add_user_message and add_ai_message, we can add messages to the ChatMessageHistory object history.

To retrieve the stored messages, we utilize the messages property of this class.

history.messages

It returns a list of BaseMessage objects.

[HumanMessage(content='hi!', additional_kwargs={}),
AIMessage(content='whats up?', additional_kwargs={})]

Storing History in PostgreSQL

PostgresChatMessageHistory is a powerful component within LangChain’s memory module. It bridges the gap between LangChain’s in-memory conversation history buffers and persistent storage solutions by enabling you to store and retrieve chat message history in a PostgreSQL database.

This integration offers several advantages over short-lived conversation buffers, such as:

  • Persistence: Unlike in-memory buffers that vanish after a session, PostgresChatMessageHistory leverages PostgreSQL, a robust and scalable relational database management system, to ensure that your chat history is permanently stored.

  • Scalability: As your chatbot interacts with more users and the conversation history grows, PostgreSQL can efficiently handle the increasing data volume. This scalability is essential for real-world applications where chatbots continuously engage in numerous conversations.

Steps to incorporate PostgreSQL Chat Message History

Incorporating PostgresChatMessageHistory into your LangChain application involves a few straightforward steps:

Install the package using the following command.

pip install langchain-postgres psycopg[binary,pool]

Import the required libraries using the following commands.

import uuid

from langchain_core.messages import AIMessage, HumanMessage
from langchain_postgres import PostgresChatMessageHistory
import psycopg

Establish a synchronous connection to the database (or use psycopg.AsyncConnection for async).

conn_info = "postgresql://username:password@host/db_name"
sync_connection = psycopg.connect(conn_info)

Replace username, password, host, and db_name with your database info.
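Before handing the DSN to psycopg, you can sanity-check its pieces with Python's standard library. This is purely an illustrative check, not something psycopg requires:

```python
from urllib.parse import urlsplit

conn_info = "postgresql://username:password@host/db_name"

# urlsplit understands any URL-shaped DSN and exposes its components.
parts = urlsplit(conn_info)
assert parts.scheme == "postgresql"

user = parts.username              # "username"
database = parts.path.lstrip("/")  # "db_name"
```

Catching a malformed DSN here gives a clearer error than a failed connection attempt later.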

Create the table schema (only needs to be done once).

table_name = "message_store"
PostgresChatMessageHistory.create_tables(sync_connection, table_name)

Initialize, add, and retrieve messages to and from History.

session_id = str(uuid.uuid4())

# Initialize the chat history manager
chat_history = PostgresChatMessageHistory(
    table_name,
    session_id,
    sync_connection=sync_connection
)

# Add messages to the chat history
chat_history.add_messages([
    HumanMessage(content="Hi Ora2pg bot"),
    AIMessage(content="Hello, How can I assist you today?"),
])
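PostgresChatMessageHistory follows the same add/retrieve contract as the other BaseChatMessageHistory implementations: add_messages() to append, and the messages property to read them back in insertion order. The shape of that contract, sketched with a stand-in class so it runs without a database (illustrative only, not the library's code):

```python
class FakeChatMessageHistory:
    """Stand-in mirroring the BaseChatMessageHistory contract:
    add_messages() appends, .messages reads back in order."""

    def __init__(self):
        self.messages = []

    def add_messages(self, messages):
        self.messages.extend(messages)

hist = FakeChatMessageHistory()
hist.add_messages([
    ("human", "Hi Ora2pg bot"),
    ("ai", "Hello, How can I assist you today?"),
])
```

With the real class, reading chat_history.messages after a restart returns the same messages, because they live in the PostgreSQL table rather than in process memory.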


Integrating ChatHistory with a chatbot

Here, we will use RunnableWithMessageHistory, which automates adding messages to and retrieving messages from the message history whenever the chain is invoked. You can also customize your chain without using RunnableWithMessageHistory.

pip install langchain-openai

Import the following; here, we will use OpenAI for demonstration.

from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_openai.chat_models import ChatOpenAI
from langchain_core.runnables.history import RunnableWithMessageHistory

Set your OpenAI API key (for example, via the openai_api_key parameter or the OPENAI_API_KEY environment variable) and define a prompt for your chatbot.

model = ChatOpenAI(model_name='gpt-3.5-turbo-0125', openai_api_key=your_key)
prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You're an Ora2pg assistant who's good at solving user queries related to the ora2pg tool (Oracle to Postgres migration tool)."),
        MessagesPlaceholder(variable_name="history"),
        ("human", "{input}"),
    ]
)

Initialize RunnableWithMessageHistory with the variable names you used in your prompt: input for the user query and history for the chat message history.

chain = prompt | model
with_message_history = RunnableWithMessageHistory(
    chain,
    get_session_history,
    input_messages_key="input",
    history_messages_key="history",
)

get_session_history is a function that returns a BaseChatMessageHistory object; PostgresChatMessageHistory is a subclass of BaseChatMessageHistory.

from langchain_core.chat_history import BaseChatMessageHistory

def get_session_history(session_id: str) -> BaseChatMessageHistory:
    return PostgresChatMessageHistory(
        table_name,
        session_id,
        sync_connection=sync_connection
    )

RunnableWithMessageHistory internally uses the messages property to fetch the stored messages into the prompt, then passes the prompt to the model as defined in the chain.

Finally, invoke your model,

response = with_message_history.invoke(
    {"input": "How to install Ora2pg?"},
    config={"configurable": {"session_id": session_id}},
)
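To see what RunnableWithMessageHistory does with session_id (the same id accumulates history across invocations, while a new id starts fresh), here is the bookkeeping sketched in plain Python with a dict standing in for the PostgreSQL table. This is an illustrative model of the behavior, not the library's internals:

```python
import uuid

# Dict of session_id -> message list, standing in for the Postgres table.
store = {}

def get_session_history(session_id):
    return store.setdefault(session_id, [])

def invoke(user_input, session_id):
    # Fetch this session's history, record the user turn, answer, record the reply.
    history = get_session_history(session_id)
    history.append(("human", user_input))
    reply = f"(answer considering {len(history) - 1} earlier messages)"
    history.append(("ai", reply))
    return reply

sid = str(uuid.uuid4())
invoke("How to install Ora2pg?", sid)
invoke("Does it support PL/SQL?", sid)       # same session: history grows
other = invoke("Hello", str(uuid.uuid4()))   # new session: starts empty
```

Because the real store is a PostgreSQL table keyed by session_id, this accumulation survives process restarts.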


In conclusion, LangChain’s memory system, particularly the PostgresChatMessageHistory integration, equips us with the tools to build chatbots that surpass the limitations of forgetfulness. This persistent memory allows them to maintain context, reference past interactions, and ultimately deliver more natural and engaging user experiences.

If you need help building AI Chatbots with Advanced RAG for your business use case, Contact Us today. In addition to AI use cases, we are experts in PostgreSQL Consulting and Database Migrations.



  • Surya Sree Bathini

    Surya Sree works as a Machine Learning Engineer and Backend Developer at HexaCluster. She holds dual degrees from IIT and IIIT and has been passionate about Machine Learning and open-source databases since the start of her academic career. She has helped multiple HexaCluster customers build AI Chatbots using Advanced RAG techniques and has supported them with various other AI use cases.
