Starting an OGX Server
- uv (recommended)
- Container
- As a Library
- Kubernetes
The fastest way to get started. No global install needed:
```bash
uvx --from 'ogx[starter]' ogx stack run starter
```
Or if you have a project with ogx as a dependency:
```bash
uv run ogx stack run starter
```
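In a uv-managed project, ogx would appear as an ordinary dependency in `pyproject.toml`. A minimal sketch (the project name is a placeholder; the `starter` extra mirrors the `uvx` command above):

```toml
[project]
name = "my-app"            # placeholder project name
version = "0.1.0"
dependencies = ["ogx[starter]"]
```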
Run a pre-built container image:
```bash
docker run -it \
  -p 8321:8321 \
  -v ~/.ogx:/root/.ogx \
  -e OLLAMA_URL=http://host.docker.internal:11434 \
  ogx/distribution-starter
```
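For repeatable setups, the same flags can be expressed as a Compose service. This is a sketch that mirrors the `docker run` invocation above, not an official Compose file; adjust `OLLAMA_URL` for your environment:

```yaml
services:
  ogx:
    image: ogx/distribution-starter
    ports:
      - "8321:8321"            # server port (default)
    volumes:
      - ~/.ogx:/root/.ogx     # persist OGX state on the host
    environment:
      OLLAMA_URL: http://host.docker.internal:11434
```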
Other pre-built images are available on Docker Hub:
| Image | Description |
|---|---|
| ogx/distribution-starter | General purpose (recommended) |
| ogx/distribution-postgres-demo | Starter with PostgreSQL storage |
See Building Custom Distributions to create your own image.
Use OGX directly in your Python process without running a server:
```python
from ogx.core.library_client import OGXAsLibraryClient

# Create a client backed by the "starter" distribution and
# initialize it in-process -- no separate server required.
client = OGXAsLibraryClient("starter")
client.initialize()
```
See Using OGX as a Library for details.
Deploy the container image to a Kubernetes cluster. See the Kubernetes Deployment Guide.
The server runs at `http://localhost:8321` by default. Use `--port` to change it.
Logging
Control log output via environment variables:
```bash
# Per-component levels
OGX_LOGGING=server=debug,core=info ogx stack run starter

# Global level
OGX_LOGGING=all=debug ogx stack run starter

# Log to file
OGX_LOG_FILE=/tmp/ogx.log ogx stack run starter
```
Categories: `all`, `core`, `server`, `router`, `inference`, `tools`, `client`.
Levels: `debug`, `info`, `warning`, `error`, `critical`.
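As the examples above show, the `OGX_LOGGING` value is a comma-separated list of `category=level` pairs, with `all` acting as a global default for categories not listed explicitly. A minimal sketch of how such a spec could be parsed (illustrative only; `parse_logging_spec` is not part of OGX):

```python
# Illustrative parser for an OGX_LOGGING-style spec,
# e.g. "server=debug,core=info" or "all=debug".
VALID_CATEGORIES = {"all", "core", "server", "router", "inference", "tools", "client"}
VALID_LEVELS = {"debug", "info", "warning", "error", "critical"}

def parse_logging_spec(spec: str, default_level: str = "info") -> dict[str, str]:
    """Return a log level for every category, honoring an 'all=' global default."""
    levels = {}
    for pair in spec.split(","):
        category, _, level = pair.strip().partition("=")
        if category not in VALID_CATEGORIES or level not in VALID_LEVELS:
            raise ValueError(f"invalid pair: {pair!r}")
        levels[category] = level
    # 'all' sets the baseline; explicit per-category pairs override it.
    baseline = levels.pop("all", default_level)
    return {cat: levels.get(cat, baseline) for cat in VALID_CATEGORIES - {"all"}}

print(parse_logging_spec("server=debug,core=info")["server"])  # debug
```

Categories not named in the spec fall back to the `all` level, or to `info` when no global level is given.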