AgentOpera offers a powerful GroupChat system that enables multiple agents to collaborate effectively to solve complex tasks. This document outlines the key GroupChat components and their usage patterns.
Introduction to GroupChat
A GroupChat in AgentOpera is a team of multiple agents that work together on tasks by broadcasting messages to all participants. The GroupChat system is built on a foundation of shared message context, allowing agents to engage in a collective conversation while solving problems.
Each GroupChat includes:
Multiple participant agents (each with their own capabilities)
A GroupChat manager that handles message routing and speaker selection
Termination conditions to determine when the chat should end
GroupChats are especially useful for complex problem-solving tasks that benefit from multiple specialized perspectives.
Core GroupChat Types
AgentOpera provides several built-in GroupChat implementations:
RoundRobinGroupChat
The RoundRobinGroupChat implements a team in which participants take turns in a fixed order, each publishing its messages to the whole group. This ensures that every agent contributes to the task in sequence.
from agentopera.models.openai import OpenAIChatCompletionClient
from agentopera.chatflow.agents import AssistantAgent
from agentopera.chatflow.team import RoundRobinGroupChat
from agentopera.chatflow.conditions import TextMentionTermination
from agentopera.chatflow.ui import Console

async def main() -> None:
    model_client = OpenAIChatCompletionClient(model="gpt-4o")

    # Create assistant agents
    agent1 = AssistantAgent("Assistant1", model_client=model_client)
    agent2 = AssistantAgent("Assistant2", model_client=model_client)

    # Define termination condition
    termination = TextMentionTermination("TERMINATE")

    # Create the team with round-robin messaging
    chatgroup = RoundRobinGroupChat([agent1, agent2], termination_condition=termination)

    # Run the team with a task
    await Console(chatgroup.run_stream(task="Discuss the pros and cons of electric vehicles."))
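The examples in this document define an async main() but do not show an entry point. Assuming a plain Python script, the coroutine can be run with the standard asyncio runner:

import asyncio

asyncio.run(main())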
The RoundRobinGroupChat is useful when:
A predictable speaking order is desired
Equal participation from all agents is important
The workflow follows a relatively fixed pattern (see the sketch after this list)
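As an illustration of that fixed-pattern case, the sketch below alternates a drafter and a reviewer in a two-agent round-robin team. The agent names, descriptions, and task are hypothetical and only reuse constructs already shown above.

# Hypothetical fixed-order workflow: a drafter and a reviewer alternate turns.
# Run inside an async function, as in the main() examples in this document;
# model_client is created as in the earlier examples.
drafter = AssistantAgent(
    "Drafter",
    model_client=model_client,
    description="Writes a first draft for each request.",
)
reviewer = AssistantAgent(
    "Reviewer",
    model_client=model_client,
    description="Critiques drafts and says TERMINATE when a draft is acceptable.",
)
review_team = RoundRobinGroupChat(
    [drafter, reviewer],
    termination_condition=TextMentionTermination("TERMINATE"),
)
await Console(review_team.run_stream(task="Draft a short release note for version 2.0."))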
SelectorGroupChat
The SelectorGroupChat implements a more dynamic team in which a model chooses the next speaker after each message, enabling context-aware collaboration.
from agentopera.models.openai import OpenAIChatCompletionClient
from agentopera.chatflow.agents import AssistantAgent
from agentopera.chatflow.team import SelectorGroupChat
from agentopera.chatflow.conditions import TextMentionTermination
from agentopera.chatflow.ui import Console

async def main() -> None:
    model_client = OpenAIChatCompletionClient(model="gpt-4o")

    # Create specialized assistant agents
    research_agent = AssistantAgent(
        "Research_Agent",
        model_client=model_client,
        description="Expert at finding and sharing information.",
    )
    analysis_agent = AssistantAgent(
        "Analysis_Agent",
        model_client=model_client,
        description="Expert at analyzing data and information.",
    )
    summary_agent = AssistantAgent(
        "Summary_Agent",
        model_client=model_client,
        description="Expert at summarizing complex topics clearly.",
    )

    # Define termination condition
    termination = TextMentionTermination("TERMINATE")

    # Create the team with dynamic, model-based speaker selection
    chatgroup = SelectorGroupChat(
        [research_agent, analysis_agent, summary_agent],
        model_client=model_client,
        termination_condition=termination,
    )

    # Run the team with a task
    await Console(chatgroup.run_stream(task="Analyze the impact of AI on healthcare."))
Combining Termination Conditions
Termination conditions control when a GroupChat ends. Multiple conditions can be combined so that the chat stops as soon as any one of them is met:
from agentopera.chatflow.conditions import TextMentionTermination, MaxMessageTermination

# End the chat when specific text is mentioned
text_termination = TextMentionTermination("COMPLETE")

# End the chat after a maximum number of messages
max_messages = MaxMessageTermination(max_messages=20)

# Combine conditions with logical OR: stop when either condition is met
termination = text_termination | max_messages

# Create a group chat with the combined termination condition
chatgroup = RoundRobinGroupChat(
    [agent1, agent2],
    termination_condition=termination,
)
Using Tools in a GroupChat
Agents that expose Python functions as tools can take part in a GroupChat alongside other agents:
async def get_weather(location: str) -> str:
    """Get weather information for a location."""
    return f"The weather in {location} is currently sunny and 75°F."

# Create an agent with tool access
assistant_with_tools = AssistantAgent(
    "Weather_Assistant",
    model_client=model_client,
    tools=[get_weather],
    description="Expert at providing weather information.",
)

# Include the tool-using agent in a group chat
# (other_agent is any additional agent created as shown earlier)
chatgroup = SelectorGroupChat(
    [assistant_with_tools, other_agent],
    model_client=model_client,
    termination_condition=termination,
)
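A task that exercises the tool can then be run the same way as the earlier examples (the task string is illustrative):

# Run inside an async function, as in the main() examples in this document.
await Console(chatgroup.run_stream(task="What is the weather like in Seattle right now?"))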