Starting a Llama Stack Server

The fastest way to get started is with uvx, which requires no global install:

uvx --from 'llama-stack[starter]' llama stack run starter

Or if you have a project with llama-stack as a dependency:

uv run llama stack run starter

The server runs at http://localhost:8321 by default. Use --port to change it.
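Once the server is up, you can check that it responds. This is a sketch assuming the `/v1/health` endpoint; adjust the path if your version exposes health elsewhere:

```shell
# Query the health endpoint on the default port
# (assumes a server started with `llama stack run starter` is already running)
curl http://localhost:8321/v1/health
```

If you changed the port with --port, use that port in the URL instead.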

Logging

Control log output via environment variables:

# Per-component levels
LLAMA_STACK_LOGGING=server=debug,core=info llama stack run starter

# Global level
LLAMA_STACK_LOGGING=all=debug llama stack run starter

# Log to file
LLAMA_STACK_LOG_FILE=/tmp/llama-stack.log llama stack run starter

Categories: all, core, server, router, inference, safety, tools, client. Levels: debug, info, warning, error, critical.
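The logging variables can be combined in a single invocation. For example, to get verbose server logs while keeping other components quieter and writing everything to a file (the levels and path here are illustrative):

```shell
# Debug logs for the server component, warnings elsewhere, written to a file
LLAMA_STACK_LOGGING=server=debug,all=warning \
LLAMA_STACK_LOG_FILE=/tmp/llama-stack.log \
llama stack run starter
```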