
Provider-agnostic LLM abstraction layer for the Orbiter framework.

Module Path

code
orbiter.models

Installation

bash
pip install "orbiter-models @ git+https://github.com/Midsphere-AI/orbiter-ai.git#subdirectory=packages/orbiter-models"

Overview

The orbiter-models package provides a unified interface for calling LLM providers. It defines a ModelProvider abstract base class with complete() and stream() methods, normalized response types (ModelResponse, StreamChunk), and a registry-based factory for building providers from "provider:model_name" strings.

Four built-in providers are included: OpenAIProvider, AnthropicProvider, GeminiProvider, and VertexProvider. All auto-register on import.
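The model-string convention can be sketched as a single split on the first colon, so model names that themselves contain a ":" stay intact. This is an illustrative stand-in, not the library's actual parsing code; `parse_model_string` is a hypothetical name.

```python
# Illustrative sketch of the "provider:model_name" convention used by
# get_provider; not orbiter's actual implementation.
def parse_model_string(spec: str) -> tuple[str, str]:
    # partition splits on the FIRST ":" only, so a model name like
    # "org:custom-model" would survive as the tail.
    provider, sep, model = spec.partition(":")
    if not sep or not provider or not model:
        raise ValueError(f"expected 'provider:model_name', got {spec!r}")
    return provider, model

print(parse_model_string("openai:gpt-4o"))  # ('openai', 'gpt-4o')
```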

Exports

Export             Type                           Description
ModelProvider      ABC                            Abstract base for LLM provider implementations
OpenAIProvider     Class                          OpenAI chat completions provider
AnthropicProvider  Class                          Anthropic messages API provider
GeminiProvider     Class                          Google Gemini API provider (API key auth)
VertexProvider     Class                          Google Vertex AI provider (GCP ADC auth)
ModelResponse      Pydantic model                 Non-streaming LLM response
StreamChunk        Pydantic model                 Single streaming chunk
ToolCallDelta      Pydantic model                 Incremental tool call fragment
FinishReason       Literal type                   Why the model stopped generating
ModelError         Exception                      LLM provider call failure
model_registry     Registry[type[ModelProvider]]  Global provider class registry
get_provider       Function                       Factory: build a provider from a model string

Import Patterns

python
# Import everything from the package
from orbiter.models import (
    ModelProvider,
    OpenAIProvider,
    AnthropicProvider,
    GeminiProvider,
    VertexProvider,
    ModelResponse,
    StreamChunk,
    ToolCallDelta,
    FinishReason,
    ModelError,
    model_registry,
    get_provider,
)

# Common usage: build a provider via factory
from orbiter.models import get_provider

provider = get_provider("openai:gpt-4o", api_key="sk-...")
provider = get_provider("gemini:gemini-2.0-flash", api_key="AIza...")
provider = get_provider("vertex:gemini-2.0-flash")  # uses GCP ADC

Quick Example

python
import asyncio
from orbiter.models import get_provider
from orbiter.types import UserMessage

async def main():
    provider = get_provider("openai:gpt-4o", api_key="sk-...")

    # Non-streaming completion
    response = await provider.complete(
        [UserMessage(content="What is 2+2?")],
        temperature=0.0,
    )
    print(response.content)       # "4"
    print(response.usage)         # Usage(input_tokens=12, output_tokens=1, ...)
    print(response.finish_reason) # "stop"

    # Streaming completion
    async for chunk in await provider.stream(
        [UserMessage(content="Tell me a story")],
    ):
        if chunk.delta:
            print(chunk.delta, end="")
        if chunk.finish_reason:
            print(f"\n[Done: {chunk.finish_reason}]")

asyncio.run(main())

Architecture

code
get_provider("openai:gpt-4o")
    |
    v
model_registry.get("openai")  -->  OpenAIProvider class
    |
    v
OpenAIProvider(ModelConfig(...))  -->  provider instance
    |
    v
provider.complete(messages)  -->  ModelResponse
provider.stream(messages)    -->  AsyncIterator[StreamChunk]
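The flow above can be sketched end to end with a minimal registry and factory. All names here (the Registry class, its register/get methods, the bare OpenAIProvider stub) are illustrative and may not match orbiter's actual Registry API; the point is the shape of the lookup, not the real implementation.

```python
# Minimal sketch of the registry-based factory pattern shown in the
# diagram above. Illustrative only; not orbiter's actual code.
class Registry:
    def __init__(self):
        self._entries: dict[str, type] = {}

    def register(self, name: str):
        # Decorator form so provider classes can auto-register on import.
        def decorator(cls):
            self._entries[name] = cls
            return cls
        return decorator

    def get(self, name: str) -> type:
        try:
            return self._entries[name]
        except KeyError:
            raise KeyError(f"no provider registered under {name!r}") from None


model_registry = Registry()


@model_registry.register("openai")
class OpenAIProvider:  # stub standing in for the real provider class
    def __init__(self, model: str):
        self.model = model


def get_provider(spec: str):
    # "openai:gpt-4o" -> look up "openai", instantiate with "gpt-4o".
    provider_name, _, model_name = spec.partition(":")
    cls = model_registry.get(provider_name)
    return cls(model_name)


p = get_provider("openai:gpt-4o")
```

Registering at import time is what lets `get_provider` resolve any provider name without a hard-coded if/elif chain.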

See Also

  • types — Response and error types
  • provider — ModelProvider ABC and factory
  • openai — OpenAI provider details
  • anthropic — Anthropic provider details
  • gemini — Google Gemini provider details
  • vertex — Google Vertex AI provider details