Stream Protocol with WebUI

AgentOpera offers a comprehensive streaming protocol that enables seamless interaction between backend AI agents and frontend web interfaces. Developers can use this protocol to quickly build and deploy advanced AI applications with immersive, interactive user experiences. By delivering instant feedback, this approach enhances user engagement, making it ideal for applications such as customer support bots, research assistants, and creative writing tools.

1. Key Features

  • Vercel AI SDK Compatibility: Seamlessly integrates with any frontend built using the Vercel AI SDK

  • Real-time Streaming: Delivers AI responses in chunks as they're generated, providing immediate feedback

  • Tool Calls Support: Streams both text content and structured data such as tool calls and tool results

  • Reasoning Visibility: Exposes reasoning process during AI agent execution

  • Flexible Integration: Works with any React, Next.js, Vue, or custom frontend implementation

  • Event-Driven Architecture: Based on a reactive system that propagates events end to end

  • Session Management: Maintains conversation context across multiple interactions

2. Technical Implementation

AgentOpera implements the Vercel AI SDK streaming protocol through a specialized UI adapter system that transforms internal message formats into the standardized streaming format. This enables any frontend application built with the Vercel AI SDK to communicate with AgentOpera's powerful agent infrastructure.
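
To make the adapter's role concrete, the sketch below serializes a few hypothetical internal message parts into the stream-part lines defined in Section 2.1. The InternalPart type and its field names are illustrative stand-ins, not AgentOpera's actual internal schema:

// Hypothetical internal representation, illustrative only (not AgentOpera's real schema).
type InternalPart =
  | { kind: 'text'; text: string }
  | { kind: 'tool_call'; id: string; name: string; args: Record<string, unknown> }
  | { kind: 'tool_result'; id: string; result: unknown }

// Serialize one internal part into a Vercel AI data-stream line (TYPE_ID:CONTENT_JSON\n).
function toStreamLine(part: InternalPart): string {
  switch (part.kind) {
    case 'text':
      return `0:${JSON.stringify(part.text)}\n`
    case 'tool_call':
      return `9:${JSON.stringify({ toolCallId: part.id, toolName: part.name, args: part.args })}\n`
    case 'tool_result':
      return `a:${JSON.stringify({ toolCallId: part.id, result: part.result })}\n`
  }
}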

The flow of communication in AgentOpera's streaming protocol is as follows:

  1. Client applications (built with React/Next.js, Vue.js, or custom interfaces) connect to AgentOpera using the Vercel AI SDK.

  2. Communication happens via HTTP or WebSocket protocols.

  3. The AgentOpera backend consists of multiple components:

  • Router System: Directs requests to appropriate agents

  • Session Manager: Maintains conversation state

  • Agent Registry: Tracks available agents and their capabilities

  4. Messages undergo transformation to convert between internal formats and the Vercel AI streaming protocol format.

  5. Streaming responses include various part types, such as text, tool calls, tool results, and reasoning.

  6. The system integrates with language models, specialized agents, and external tool providers.

The architecture is designed to be extensible and compatible with the Vercel AI SDK streaming protocol, enabling real-time, interactive experiences for users while leveraging powerful AI agent capabilities.

2.1 Streaming Protocol Format

The Vercel AI Streaming Protocol uses a line-based format in which each stream part is encoded as:

TYPE_ID:CONTENT_JSON\n

For example, text content is streamed as:

0:"This is a text chunk"\n

AgentOpera supports the following stream part types:

| Type ID | Description | Format |
|---------|-------------|--------|
| 0 | Text Part | `0:string\n` |
| g | Reasoning Part | `g:string\n` |
| 9 | Tool Call Part | `9:{toolCallId:string; toolName:string; args:object}\n` |
| a | Tool Result Part | `a:{toolCallId:string; result:object}\n` |
| f | Start Step Part | `f:{messageId:string}\n` |
| e | Finish Step Part | `e:{finishReason:string; usage:{promptTokens:number; completionTokens:number}; isContinued:boolean}\n` |
| d | Finish Message Part | `d:{finishReason:string; usage:{promptTokens:number; completionTokens:number}}\n` |
| 8 | Message Annotation Part | `8:Array<JSONValue>\n` |
| 3 | Error Part | `3:string\n` |
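
On the receiving side, each line can be handled by splitting on the first colon and parsing the remainder as JSON. The helper below is a minimal sketch (the function name is ours, not part of the SDK):

// Parse one stream line of the form TYPE_ID:CONTENT_JSON into its type and JSON value.
function parseStreamLine(line: string): { type: string; value: unknown } {
  const separator = line.indexOf(':')
  if (separator === -1) throw new Error(`Malformed stream line: ${line}`)
  return {
    type: line.slice(0, separator),
    value: JSON.parse(line.slice(separator + 1)),
  }
}

// parseStreamLine('0:"This is a text chunk"')
//   -> { type: '0', value: 'This is a text chunk' }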

2.2 Protocol Demonstration: Complete Request/Response Example

The following example demonstrates the raw protocol exchange between a client and the AgentOpera backend API, showing the exact message formats and streaming behavior:

HTTP Request:

# The URL points to your AgentOpera endpoint
curl 'https://test.tensoropera.ai/api/chat' \
  -H 'content-type: application/json' \
  --data-raw '{
    "id":"CrNsQw8Tm95f7T2c",
    "messages":[{
      "role":"user",
      "content":"Search for the latest 5 AI news articles in the US."
    }],
    "user_id":"user-2ilgvqz53qn"
  }'

Response Headers:

X-Vercel-Ai-Data-Stream: v1

Response Body (Streamed):

f:{"messageId":"msg-EgEv03CL7LOmXT5EKPn9B2bI"}
9:{"toolCallId":"call_33498098","toolName":"web_search","args":{"queries":["latest AI news USA 2025","recent artificial intelligence developments US 2025","new AI technology updates United States 2025","AI industry news USA latest"],"maxResults":[10,10,10,10],"topics":["news","news","news","news"],"searchDepth":["advanced","advanced","advanced","advanced"],"exclude_domains":[]}}
8:[{"type":"query_completion","data":{"query":"latest AI news USA 2025","index":0,"total":4,"status":"completed","resultsCount":10,"imagesCount":5}}]
a:{"toolCallId":"call_33498098","result":{"searches":[...]}}
e:{"finishReason":"tool-calls","usage":{"promptTokens":null,"completionTokens":null},"isContinued":false}
f:{"messageId":"msg-cDFc17o3FucbVPoERjwrva5K"}
0:"Here "
0:"are "
0:"the "
0:"latest "
...
0:"in "
0:"2025\n"
e:{"finishReason":"stop","usage":{"promptTokens":null,"completionTokens":null},"isContinued":false}
d:{"finishReason":"stop","usage":{"promptTokens":null,"completionTokens":null}}

This example demonstrates the full protocol lifecycle:

  1. The client sends a user query about AI news

  2. AgentOpera begins the response with a message ID

  3. The system makes a tool call to search the web

  4. Message annotations show query completion information

  5. Tool results are returned

  6. The first message step completes

  7. A new message step begins with the final response

  8. Text chunks are streamed as they're generated

  9. The message concludes with finish notifications
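
The sketch below shows how a client could fold these parts into a single assistant message. The Vercel AI SDK performs this accumulation automatically; the code is illustrative only and reuses the { type, value } shape produced by the parser sketch in Section 2.1:

// Accumulate parsed stream parts into a simple message shape (illustrative only).
interface AccumulatedMessage {
  text: string
  toolCalls: Array<{ toolCallId: string; toolName: string; args: unknown; result?: unknown }>
}

function accumulateMessage(parts: Array<{ type: string; value: any }>): AccumulatedMessage {
  const message: AccumulatedMessage = { text: '', toolCalls: [] }
  for (const part of parts) {
    if (part.type === '0') {
      message.text += part.value                      // text chunk
    } else if (part.type === '9') {
      message.toolCalls.push({ ...part.value })       // tool call (toolCallId, toolName, args)
    } else if (part.type === 'a') {
      const call = message.toolCalls.find(c => c.toolCallId === part.value.toolCallId)
      if (call) call.result = part.value.result       // attach the matching tool result
    }
    // 'f', 'e', 'd', and '8' parts carry step, finish, and annotation metadata, ignored here.
  }
  return message
}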

2.3 Integration with the Router System

AgentOpera's router system integrates with the streaming protocol by:

  1. Receiving requests from the client via HTTP or WebSocket

  2. Converting user messages to internal formats

  3. Routing messages to appropriate agents based on intent

  4. Streaming responses back to clients in real-time

  5. Managing session state throughout the conversation
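
A backend that honors this contract only needs to return the X-Vercel-Ai-Data-Stream: v1 header and write stream-part lines as agents produce them. The sketch below uses Node's built-in http module with a hypothetical routeToAgent stand-in; it illustrates the wire contract rather than AgentOpera's actual router implementation:

import { createServer } from 'node:http'

// Hypothetical stand-in for the router: in AgentOpera this would resolve the target
// agent and relay its output; here it just yields a fixed greeting for illustration.
async function* routeToAgent(_body: unknown): AsyncGenerator<{ typeId: string; payload: unknown }> {
  yield { typeId: 'f', payload: { messageId: 'msg-demo' } }
  yield { typeId: '0', payload: 'Hello from the agent.\n' }
  yield { typeId: 'd', payload: { finishReason: 'stop', usage: { promptTokens: null, completionTokens: null } } }
}

createServer(async (req, res) => {
  // Read and parse the JSON request body (same shape as the curl example above).
  const chunks: Buffer[] = []
  for await (const chunk of req) chunks.push(chunk as Buffer)
  const body = JSON.parse(Buffer.concat(chunks).toString('utf8'))

  res.writeHead(200, {
    'Content-Type': 'text/plain; charset=utf-8',
    'X-Vercel-Ai-Data-Stream': 'v1',  // signals the data-stream protocol to the Vercel AI SDK
  })

  // Write each part as a TYPE_ID:CONTENT_JSON line as soon as it is produced.
  for await (const part of routeToAgent(body)) {
    res.write(`${part.typeId}:${JSON.stringify(part.payload)}\n`)
  }
  res.end()
}).listen(3000)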

3. Frontend Integration Example

The following example demonstrates how to integrate AgentOpera's streaming protocol with a frontend application using the Vercel AI SDK:

// Client-side code using Vercel AI SDK
import { useChat } from 'ai/react'

export default function ChatComponent() {
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: 'https://test.tensoropera.ai/api/chat',  // Points to your AgentOpera endpoint
  })

  return (
    <div>
      <div className="messages">
        {messages.map(m => (
          <div key={m.id} className={m.role}>
            {m.content}
          </div>
        ))}
      </div>
      <form onSubmit={handleSubmit}>
        <input
          value={input}
          onChange={handleInputChange}
          placeholder="Say something..."
        />
        <button type="submit">Send</button>
      </form>
    </div>
  )
}

This frontend integration example shows how to:

  1. Use the Vercel AI SDK's useChat hook to connect to AgentOpera

  2. Display the streamed messages in a chat interface

  3. Handle user input and form submission

  4. Automatically manage the streaming connection and state

4. Advanced Features

4.1 Tool Use Visualization

AgentOpera's streaming implementation allows frontend applications to visualize tool usage by agents in real-time. As agents make tool calls and receive results, these are streamed to the client with appropriate type markers, enabling rich UI representations of the agent's thought process.
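
For example, with the Vercel AI SDK's React bindings, streamed tool calls and results typically surface on each message via the toolInvocations field (newer SDK releases expose equivalent data through message.parts). The component from Section 3 could be extended roughly as follows; the rendering is a sketch, not a prescribed UI:

import { useChat } from 'ai/react'

export default function ChatWithTools() {
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: 'https://test.tensoropera.ai/api/chat',  // same AgentOpera endpoint as Section 3
  })

  return (
    <div>
      {messages.map(m => (
        <div key={m.id} className={m.role}>
          {/* Tool calls and their results stream in while the agent is still working */}
          {m.toolInvocations?.map(tool => (
            <div key={tool.toolCallId} className="tool-call">
              <strong>{tool.toolName}</strong> {JSON.stringify(tool.args)}
              {'result' in tool && <pre>{JSON.stringify(tool.result, null, 2)}</pre>}
            </div>
          ))}
          {m.content}
        </div>
      ))}
      <form onSubmit={handleSubmit}>
        <input value={input} onChange={handleInputChange} placeholder="Ask something..." />
        <button type="submit">Send</button>
      </form>
    </div>
  )
}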

4.2 Multi-Agent Support

The streaming protocol supports complex multi-agent scenarios, where several AI agents might collaborate on solving a user request. The protocol maintains session consistency and properly attributes responses to the correct agent.

4.3 Custom Extensions

While maintaining compatibility with the Vercel AI SDK protocol, AgentOpera extends the protocol with custom annotations that can provide additional context about agent activities, enabling richer frontend experiences.
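
In the Vercel AI SDK these type 8 annotation parts typically surface on each message via an annotations field. The helper below is an illustrative sketch that extracts the query_completion annotations shown in the Section 2.2 example:

// Sketch: extract the query_completion annotations (type 8 parts) from a message.
// The annotation shape matches the Section 2.2 example; the helper itself is illustrative.
function queryProgress(message: { annotations?: any[] }): string[] {
  return (message.annotations ?? [])
    .filter(a => a?.type === 'query_completion')
    .map(a => `${a.data.query}: ${a.data.status} (${a.data.resultsCount} results)`)
}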

5. Integration with Third-Party Frameworks

AgentOpera's streaming protocol is designed to integrate with various frontend frameworks:

  • React/Next.js: Native support through the Vercel AI SDK

  • Vue.js: Support through the Vercel AI SDK Vue adapter

  • Custom interfaces: Any system capable of handling server-sent events or WebSocket connections
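
For custom interfaces that do not use the SDK at all, the endpoint can be consumed with a plain fetch and a stream reader, reusing the line format from Section 2.1. A minimal sketch (buffering simplified, error handling omitted; the request body mirrors the curl example in Section 2.2):

// Minimal custom client: POST a chat request and iterate the streamed TYPE_ID:CONTENT_JSON lines.
async function streamChat(prompt: string): Promise<void> {
  const response = await fetch('https://test.tensoropera.ai/api/chat', {
    method: 'POST',
    headers: { 'content-type': 'application/json' },
    body: JSON.stringify({
      id: crypto.randomUUID(),
      messages: [{ role: 'user', content: prompt }],
      user_id: 'user-demo',  // illustrative user id
    }),
  })

  const reader = response.body!.getReader()
  const decoder = new TextDecoder()
  let buffered = ''

  while (true) {
    const { done, value } = await reader.read()
    if (done) break
    buffered += decoder.decode(value, { stream: true })

    // Emit every complete line; keep any trailing partial line in the buffer.
    const lines = buffered.split('\n')
    buffered = lines.pop() ?? ''
    for (const line of lines) {
      if (line.length === 0) continue
      const separator = line.indexOf(':')
      console.log('part', line.slice(0, separator), JSON.parse(line.slice(separator + 1)))
    }
  }
}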
