The orbiter-server package provides an HTTP server for exposing agents as web services. Built on FastAPI, it offers REST endpoints for chat, agent management, and session persistence, plus real-time streaming over Server-Sent Events (SSE) and WebSocket.

Basic Usage

python
from orbiter_server import create_app, register_agent
from orbiter.agent import Agent

# Create an agent
agent = Agent(
    name="assistant",
    model="openai:gpt-4o",
    instructions="You are a helpful assistant.",
)

# Register it with the server
register_agent("assistant", agent)

# Create and run the FastAPI app
app = create_app()

# Run with uvicorn
import uvicorn
uvicorn.run(app, host="0.0.0.0", port=8000)

Creating the App

The create_app() factory builds a FastAPI application with all routes:

python
from orbiter_server import create_app

app = create_app()
# Includes:
#   POST /chat          - Chat endpoint
#   GET/POST /agents/*  - Agent management
#   GET/POST /sessions/* - Session management
#   GET /stream         - SSE streaming
#   WS /ws/chat         - WebSocket chat

Registering Agents

Register agents before creating the app:

python
from orbiter_server import register_agent
from orbiter.agent import Agent

# Register multiple agents
register_agent("assistant", Agent(name="assistant", model="openai:gpt-4o"))
register_agent("coder", Agent(name="coder", model="openai:gpt-4o", instructions="You write code."))
register_agent("researcher", Agent(name="researcher", model="anthropic:claude-3-5-sonnet"))

Chat Endpoint

The main interaction endpoint:

code
POST /chat

Request

python
from orbiter_server.app import ChatRequest

# Request body
request = ChatRequest(
    agent_name="assistant",
    input="What is Python?",
    session_id="session-123",  # optional: for conversation continuity
    stream=False,              # optional: enable SSE streaming
)

The equivalent JSON request body:

json
{
    "agent_name": "assistant",
    "input": "What is Python?",
    "session_id": "session-123",
    "stream": false
}

Response

json
{
    "output": "Python is a high-level programming language...",
    "agent_name": "assistant",
    "session_id": "session-123"
}
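
For example, a minimal client call with httpx (assuming the server from Basic Usage is running on localhost:8000); the fields read from the response are the ones shown above:

python
import httpx

response = httpx.post("http://localhost:8000/chat", json={
    "agent_name": "assistant",
    "input": "What is Python?",
    "session_id": "session-123",
})
data = response.json()
print(data["output"])      # the agent's reply
print(data["session_id"])  # the session the reply belongs to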

Streaming Response

When stream=true, the endpoint returns Server-Sent Events:

python
import httpx

async with httpx.AsyncClient() as client:
    async with client.stream("POST", "http://localhost:8000/chat", json={
        "agent_name": "assistant",
        "input": "Explain decorators",
        "stream": True,
    }) as response:
        async for line in response.aiter_lines():
            if line.startswith("data: "):
                print(line[6:], end="", flush=True)

Agent Management

List Agents

code
GET /agents

Returns a list of registered agents with metadata:

json
[
    {
        "name": "assistant",
        "model": "openai:gpt-4o",
        "description": "A helpful assistant"
    }
]

Get Agent Details

code
GET /agents/{name}

Returns detailed information about a specific agent:

json
{
    "name": "assistant",
    "model": "openai:gpt-4o",
    "description": "A helpful assistant",
    "tools": ["search", "calculator"],
    "max_steps": 10
}
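
As a sketch, both endpoints can be queried with a few lines of httpx, relying only on the response fields shown above:

python
import httpx

# List all registered agents
for agent in httpx.get("http://localhost:8000/agents").json():
    print(agent["name"], agent["model"])

# Fetch details for a single agent
details = httpx.get("http://localhost:8000/agents/assistant").json()
print(details.get("tools"), details.get("max_steps"))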

Workspace Files

code
GET /agents/{name}/workspace
GET /agents/{name}/workspace/{path}

Access workspace files for an agent.
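
A hedged sketch of reading workspace files follows; the response format is not documented here, and the notes.txt path is purely illustrative:

python
import httpx

# List the agent's workspace (response shape assumed to be JSON)
listing = httpx.get("http://localhost:8000/agents/assistant/workspace")
print(listing.json())

# Fetch a single file by path; notes.txt is a hypothetical example
content = httpx.get("http://localhost:8000/agents/assistant/workspace/notes.txt")
print(content.text)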

Session Management

Sessions maintain conversation state across multiple requests.
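
For example, reusing the same session_id across /chat requests lets the agent refer back to earlier turns (a sketch that assumes the session is created beforehand via POST /sessions below, or on first use):

python
import httpx

# First turn
httpx.post("http://localhost:8000/chat", json={
    "agent_name": "assistant",
    "input": "My name is Ada.",
    "session_id": "session-123",
})

# Second turn: the agent can see the earlier message in the same session
reply = httpx.post("http://localhost:8000/chat", json={
    "agent_name": "assistant",
    "input": "What is my name?",
    "session_id": "session-123",
})
print(reply.json()["output"])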

Create Session

code
POST /sessions

Request body:

json
{
    "agent_name": "assistant",
    "metadata": {"user_id": "u-123"}
}
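
A client-side sketch of creating a session; the id field in the response is assumed to match the GET /sessions/{session_id} payload shown below:

python
import httpx

resp = httpx.post("http://localhost:8000/sessions", json={
    "agent_name": "assistant",
    "metadata": {"user_id": "u-123"},
})
session_id = resp.json().get("id")  # field name assumed from the response below
print(session_id)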

Get Session

code
GET /sessions/{session_id}

Returns session details including message history:

json
{
    "id": "session-123",
    "agent_name": "assistant",
    "messages": [
        {"role": "user", "content": "Hello"},
        {"role": "assistant", "content": "Hi there!"}
    ],
    "created_at": "2025-01-15T10:30:00Z"
}

List Sessions

code
GET /sessions

Delete Session

code
DELETE /sessions/{session_id}
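
Together these endpoints allow simple housekeeping from a client. The sketch below assumes GET /sessions returns a JSON array of objects shaped like the single-session response above:

python
import httpx

# Delete every session that belongs to a given agent
for session in httpx.get("http://localhost:8000/sessions").json():
    if session.get("agent_name") == "assistant":
        httpx.delete(f"http://localhost:8000/sessions/{session['id']}")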

WebSocket Streaming

For real-time bidirectional communication:

code
WS /ws/chat

python
import websockets
import json

async with websockets.connect("ws://localhost:8000/ws/chat") as ws:
    # Send a message
    await ws.send(json.dumps({
        "agent_name": "assistant",
        "input": "Hello!",
        "session_id": "session-123",
    }))

    # Receive streaming response
    async for message in ws:
        data = json.loads(message)
        if data.get("type") == "text":
            print(data["content"], end="")
        elif data.get("type") == "done":
            break

SSE Streaming

The dedicated SSE endpoint:

code
GET /stream?agent_name=assistant&input=Hello&session_id=s-1

python
import httpx

async with httpx.AsyncClient() as client:
    async with client.stream("GET", "http://localhost:8000/stream", params={
        "agent_name": "assistant",
        "input": "Tell me a story",
    }) as response:
        async for line in response.aiter_lines():
            if line.startswith("data: "):
                print(line[6:], end="")

Advanced Patterns

Custom Middleware

Add authentication or logging middleware:

python
from fastapi import Request
from fastapi.responses import JSONResponse
from orbiter_server import create_app

app = create_app()

@app.middleware("http")
async def auth_middleware(request: Request, call_next):
    api_key = request.headers.get("X-API-Key")
    if not api_key:
        return JSONResponse(status_code=401, content={"error": "Missing API key"})
    response = await call_next(request)
    return response
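
Clients then send the header with every request, for example:

python
import httpx

response = httpx.post(
    "http://localhost:8000/chat",
    headers={"X-API-Key": "your-api-key"},  # any non-empty value passes the check above
    json={"agent_name": "assistant", "input": "Hello"},
)
print(response.status_code)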

Multiple Agent Deployment

Deploy multiple agents on a single server:

python
from orbiter.agent import Agent
from orbiter_server import create_app, register_agent

# Register specialized agents
register_agent("general", Agent(name="general", model="openai:gpt-4o"))
register_agent("coder", Agent(name="coder", model="openai:gpt-4o", instructions="You write code."))
register_agent("analyst", Agent(name="analyst", model="openai:gpt-4o", instructions="You analyze data."))

app = create_app()
# All agents are accessible via /chat with different agent_name values
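
Each request picks its agent by name; a short sketch using the agents registered above:

python
import httpx

# The same /chat endpoint routes to whichever agent the request names
for agent_name, prompt in [
    ("general", "Summarize the plot of Hamlet."),
    ("coder", "Write a Python function that reverses a string."),
    ("analyst", "What does a p-value of 0.03 mean?"),
]:
    resp = httpx.post("http://localhost:8000/chat", json={
        "agent_name": agent_name,
        "input": prompt,
    })
    print(agent_name, "->", resp.json()["output"][:80])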

Health Check

The app can be extended with health check endpoints:

python
from orbiter_server import create_app

app = create_app()

# Keep your own list of registered agent names to report in the health payload
registered_agents = ["assistant", "coder"]

@app.get("/health")
async def health():
    return {"status": "ok", "agents": len(registered_agents)}

API Summary

Symbol           Module                     Description
create_app       orbiter_server             FastAPI app factory
register_agent   orbiter_server             Register an agent with the server
ChatRequest      orbiter_server.app         Chat request model: agent_name, input, session_id, stream
ChatResponse     orbiter_server.app         Chat response model: output, agent_name, session_id
agent_router     orbiter_server.agents      Router for /agents endpoints
session_router   orbiter_server.sessions    Router for /sessions endpoints
stream_router    orbiter_server.streaming   Router for /stream and /ws/chat endpoints