When you talk to an AI system, it does not literally "remember" you; it temporarily records and processes the conversation through its underlying algorithms. Every time we ask a system like GPT-4 a question, it analyzes our input, weighs the surrounding context, and generates a response. For a sense of scale, GPT-3 was reportedly trained on more than 570 gigabytes of filtered text to learn the patterns and meaning of language. Note that AI does not store conversations forever, but it can track information within a single session so as to produce coherent replies.
Models like GPT-4 use what are called "context windows" to keep track of the ongoing conversation. The context window acts as a short-term memory, typically holding roughly 1,000 to 4,000 tokens (the sub-word units models process, each roughly word-sized) of the conversation, so the model can reference your past questions and its previous answers. Imagine asking a chain of connected questions with gaps between them: the AI looks back at your earlier references to continue the thread. This mechanism is what lets AI respond continuously and in context, much as an attentive human would. OpenAI reports that its newer models support context windows of around 8,000 tokens or more, enabling longer and richer conversations.
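The sliding-window behavior described above can be sketched in a few lines. This is a simplified illustration, not OpenAI's actual implementation: here a "token" is approximated by a whitespace-separated word, whereas real systems use subword tokenizers, and `build_context` is a hypothetical helper name.

```python
def build_context(messages, max_tokens=4000):
    """Return the most recent messages whose combined size fits the token budget.

    Older messages that no longer fit simply fall out of the window --
    this is why a model can 'forget' the start of a very long chat.
    """
    context = []
    used = 0
    # Walk backwards from the newest message, keeping whatever still fits.
    for msg in reversed(messages):
        cost = len(msg.split())  # crude stand-in for a real tokenizer
        if used + cost > max_tokens:
            break
        context.append(msg)
        used += cost
    context.reverse()  # restore chronological order for the model
    return context

# 200 messages of ~52 "tokens" each; only the newest ones fit a 1,000-token budget.
history = [f"message {i} " + "word " * 50 for i in range(200)]
window = build_context(history, max_tokens=1000)
print(len(window), window[-1].split()[1])
```

On every turn, the full (windowed) history is re-sent to the model, which is why a stateless model appears to "remember" earlier turns within a session.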
But AI does not keep these details across sessions! Once the conversation ends, the context is cleared, and the next conversation starts with no memory of previous chats. This means you can rest easy knowing that AI systems have no built-in memory of the personal information they process: a model will not store and later relay information about an individual unless a custom system has been explicitly built to do so. On the other hand, many customer-service applications do retain information across sessions by linking conversation histories together, providing a stronger and more individualized experience. According to Gartner, as of 2022 roughly 60% of customer-service AI systems were leveraging this kind of long-term tracking to improve user experience.
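Linking conversation histories across sessions, as those customer-service systems do, usually means persisting turns in a store keyed by a user or account id and re-sending that history as context in later sessions. A minimal sketch, with an illustrative `SessionStore` class (the name and interface are assumptions, not a real product API):

```python
from collections import defaultdict

class SessionStore:
    """In-memory store mapping a user id to their accumulated conversation.

    A production system would back this with a database and a retention policy;
    the model itself stays stateless -- persistence lives in the application.
    """
    def __init__(self):
        self._history = defaultdict(list)

    def append(self, user_id, role, text):
        """Record one conversation turn (role is 'user' or 'assistant')."""
        self._history[user_id].append((role, text))

    def context_for(self, user_id):
        """Past turns to prepend when this user starts a new session."""
        return list(self._history[user_id])

store = SessionStore()
store.append("user-42", "user", "My order hasn't arrived.")
store.append("user-42", "assistant", "I see that order is delayed in transit.")
# Days later, a new session for the same user is primed with the old context:
print(len(store.context_for("user-42")))
```

The key design point: "long-term memory" here is an application-layer feature, not something the underlying model does on its own.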
There are also AI systems that, much like humans, improve over time based on feedback and interaction, using techniques such as reinforcement learning. To illustrate, DeepMind's AlphaGo trained on hundreds of thousands of prior human Go games, refined itself through self-play, and ultimately beat the world champion. This form of machine learning is rarer, usually employed in more specialized tasks like gaming or data analysis, but it exemplifies how AI can evolve as more data is accrued.
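To make "learning from feedback" concrete, here is a toy tabular Q-learning loop on a five-state chain. This is emphatically not AlphaGo's method (which combined deep neural networks with tree search and self-play); it is the simplest standard form of reinforcement learning, shown only to illustrate how repeated reward signals gradually shape behavior. All names and parameters are illustrative.

```python
import random

N_STATES = 5          # states 0..4; reaching state 4 yields reward 1
ACTIONS = [-1, +1]    # step left or right along the chain
alpha, gamma, eps = 0.5, 0.9, 0.1  # learning rate, discount, exploration

# Q-table: estimated long-run value of taking each action in each state.
Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

random.seed(0)
for episode in range(200):
    s = 0
    while s != N_STATES - 1:
        # Epsilon-greedy: mostly exploit the best-known action, sometimes explore.
        if random.random() < eps:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda act: Q[(s, act)])
        s_next = min(max(s + a, 0), N_STATES - 1)
        r = 1.0 if s_next == N_STATES - 1 else 0.0
        # Standard Q-learning update: nudge the estimate toward reward + future value.
        best_next = max(Q[(s_next, b)] for b in ACTIONS)
        Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
        s = s_next

# After training, the learned policy prefers moving right (toward the reward)
# from every non-terminal state.
print(all(Q[(s, 1)] > Q[(s, -1)] for s in range(N_STATES - 1)))
```

The point of the example is the update rule: no conversation is stored, yet the system's behavior changes over many interactions because feedback is folded into its parameters.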
Even with these improvements, the only "tracking" an AI does of your conversation comes from this structure. Unlike a human participant, who remembers personal conversations and follows up on past exchanges, AI cannot remember anything beyond the flow of the current session unless it is explicitly programmed to do so in a specific application. In the words of OpenAI co-founder Sam Altman, "AI doesn't have a memory like humans do; it's just identifying patterns in data."
As AI technology becomes even more sophisticated, systems will be able to follow individual conversations in real time, helping users enjoy much more relevant experiences. But by design, an AI's memory of what you talked about is limited to your current session, making sure privacy remains an essential element of using the tool.