Memory
Memory plays a crucial role in building effective AI agents, allowing them to maintain context, persist information across interactions, and retrieve relevant knowledge when needed.
Memory System Components
LocalMemory
Enables a single agent to use local memory to retain and utilize knowledge.
SharedMemory
Enables agents in a team session to retain and utilize shared knowledge.
PersistentMemory
Enables agents to build and refine knowledge over time.
Implementing Memory in AgentOpera
LocalMemory
LocalMemory allows agents to temporarily store user preferences or recent context during a single conversation or task.
In this example, the assistant remembers that the user prefers metric units for temperature and uses this information when generating responses, even if the user doesn't repeat it.
LocalMemory (short-term memory) is:
Session-based.
Useful for holding temporary context.
Cleared after the conversation ends.
from agentopera.chatflow.agents import AssistantAgent
from agentopera.memory import LocalMemory, MemoryContent, MemoryMimeType
from agentopera.models.openai import OpenAIChatCompletionClient
# Initialize user memory
user_memory = LocalMemory()
# Add user preferences
await user_memory.add(MemoryContent(
    content="The user prefers temperatures in metric units",
    mime_type=MemoryMimeType.TEXT
))

# Create assistant with memory
assistant = AssistantAgent(
    name="assistant",
    model_client=OpenAIChatCompletionClient(model="gpt-4"),
    memory=[user_memory],  # You can provide multiple memory instances
)
# The memory will automatically be used to provide context
response = await assistant.run(task="What's the weather in New York?")
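Conceptually, memory instances passed to the agent contribute context to the model prompt, and session-scoped memory is discarded when the conversation ends. A minimal plain-Python sketch of that pattern (the `SessionMemory` class and `build_prompt` helper are illustrative names, not AgentOpera's actual implementation):

```python
class SessionMemory:
    """Session-scoped store: holds context for one conversation only."""

    def __init__(self) -> None:
        self.items: list[str] = []

    def add(self, content: str) -> None:
        self.items.append(content)

    def clear(self) -> None:
        # Called when the conversation ends; short-term context is discarded.
        self.items.clear()


def build_prompt(memory: SessionMemory, task: str) -> str:
    """Prepend stored context to the user's task before calling the model."""
    context = "\n".join(memory.items)
    return f"Context:\n{context}\n\nTask: {task}" if context else f"Task: {task}"


memory = SessionMemory()
memory.add("The user prefers temperatures in metric units")
print(build_prompt(memory, "What's the weather in New York?"))
memory.clear()  # session over: the preference is forgotten
```

The point of the sketch is the lifecycle: stored preferences shape every prompt within the session, and nothing survives `clear()`.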
SharedMemory (Shared within Teams)
SharedMemory allows multiple agents in a team to share real-time context during collaboration.
In this example, agents store user actions and suggestions into a shared memory space — making it easy for any agent to access the latest team insights when handling a task.
Key points:
Shared across agents.
Context expires after a set time (expiration_time).
Perfect for team collaboration and live coordination.
from agentopera.chatflow.agents import AssistantAgent
from agentopera.memory import SharedMemory, MemoryContent, MemoryMimeType
from agentopera.models.openai import OpenAIChatCompletionClient
# Initialize shared short-term memory for the team
team_short_term_memory = SharedMemory(
    expiration_time=3600,  # Memory expires in 1 hour
    engine="mem0",
    token="your_mem0_token",
)

# Store real-time collaboration info in shared memory
await team_short_term_memory.add(MemoryContent(
    content="User_A is analyzing New York's weather data.",
    mime_type=MemoryMimeType.TEXT
))
await team_short_term_memory.add(MemoryContent(
    content="User_B suggested converting temperature to metric units.",
    mime_type=MemoryMimeType.TEXT
))

# Create assistant with access to shared short-term team memory
assistant = AssistantAgent(
    name="team_assistant",
    model_client=OpenAIChatCompletionClient(model="gpt-4"),
    memory=[team_short_term_memory],  # Shared memory for all agents in the team
)
# Agent uses the team's shared memory as part of its context
response = await assistant.run(task="Summarize the team's weather analysis for New York.")
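The `expiration_time` behavior can be pictured as a store that timestamps each entry and filters out anything older than the cutoff on read. A plain-Python sketch of that mechanism (illustrative only; `ExpiringStore` is not AgentOpera's implementation, which delegates to the configured engine):

```python
import time


class ExpiringStore:
    """Shared store whose entries expire after `expiration_time` seconds."""

    def __init__(self, expiration_time: float):
        self.expiration_time = expiration_time
        self._items: list[tuple[float, str]] = []  # (timestamp, content)

    def add(self, content: str) -> None:
        self._items.append((time.monotonic(), content))

    def get_active(self) -> list[str]:
        """Return only entries younger than expiration_time; prune the rest."""
        cutoff = time.monotonic() - self.expiration_time
        self._items = [(t, c) for t, c in self._items if t >= cutoff]
        return [c for _, c in self._items]


store = ExpiringStore(expiration_time=3600)  # entries live for 1 hour
store.add("User_A is analyzing New York's weather data.")
store.add("User_B suggested converting temperature to metric units.")
print(store.get_active())  # both entries are still fresh
```

Pruning on read keeps the store simple: no background timer is needed, and any agent reading the shared context sees only entries that have not yet expired.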
PersistentMemory
PersistentMemory allows agents to store and recall long-term knowledge across sessions, making it ideal for building personalized user experiences.
In this example, the assistant stores user preferences (like writing style and study interests) with unique keys and IDs. This allows the assistant to recall and use this information in future interactions, even after the session ends.
Key points:
Persistent: Memory remains across sessions.
Personalized: Enables tailored responses based on stored user knowledge.
Efficient: Uses unique key and id for easy memory retrieval.
from agentopera.chatflow.agents import AssistantAgent
from agentopera.memory import PersistentMemory, MemoryContent, MemoryMimeType
from agentopera.models.openai import OpenAIChatCompletionClient
# Initialize persistent long-term memory (backed by mem0 engine)
user_long_term_memory = PersistentMemory(
    engine="mem0",
    token="your_mem0_token"
)

# Add long-term knowledge with key and id
await user_long_term_memory.add(
    MemoryContent(
        content="The user prefers formal writing style for professional documents.",
        mime_type=MemoryMimeType.TEXT
    ),
    key="user_preferences",
    id="writing_style_001"
)
await user_long_term_memory.add(
    MemoryContent(
        content="The user is interested in exploring international study opportunities.",
        mime_type=MemoryMimeType.TEXT
    ),
    key="user_goals",
    id="study_plan_2025"
)

# Create assistant with long-term memory
assistant = AssistantAgent(
    name="knowledgeable_assistant",
    model_client=OpenAIChatCompletionClient(model="gpt-4"),
    memory=[user_long_term_memory],
)

# The assistant uses long-term memory to provide personalized responses
response = await assistant.run(
    task="Could you recommend a study plan based on my past interests?"
)