2 posts tagged with "multi-sdk"

Every Protocol. Every Framework. Zero Code Changes.

· 4 min read
Sébastien Han
OGX Core Team

Agents shouldn't change a line of code to run on your infrastructure.

That sentence sounds simple, but it represents a fundamental shift in how enterprises can adopt AI agents. Today, every agentic framework speaks a different protocol. Teams using Claude Agents talk Anthropic Messages. Teams using ADK talk Google Interactions. Most agents still call OpenAI Chat Completions or the newer Responses API. Each choice creates a hard dependency on a vendor's infrastructure, SDK, and API contract.

OGX exists to break that coupling. It's a server that speaks every major agentic protocol natively, translating them to any model running on any infrastructure. No vendor lock-in. No SDK rewrites.
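The "zero code changes" claim can be sketched concretely: an OpenAI-style chat request is just a JSON body POSTed to an endpoint, so pointing the same client code at a different base URL is all that changes. This is a minimal stdlib-only sketch; the OGX address and port below are hypothetical, assuming OGX exposes an OpenAI-compatible route.

```python
# Sketch: the same OpenAI-style chat request works against the vendor's
# API or an OGX deployment -- only the base URL differs. The OGX URL
# below is a hypothetical example, not a documented default.
import json
import urllib.request


def chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat completion request for any base URL."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# Identical application code; only the infrastructure behind the URL changes.
vendor = chat_request("https://api.openai.com/v1", "gpt-4o-mini", "Hello")
ogx = chat_request("http://localhost:8321/v1", "gpt-4o-mini", "Hello")
print(vendor.data == ogx.data)  # → True: same payload, different endpoint
```

The same swap works in most official SDKs, which accept a base-URL override at client construction time, so no request-building code has to change at all.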

From Llama Stack to OGX: A New Name, A Sharper Mission

· 5 min read
OGX Team
Core Team

Llama Stack is now OGX. The name changed, but more importantly, so did the mission.

When this project started, it was an API standardization effort — a set of specs for building AI applications, anchored to the Llama model family. That framing attracted contributors and integrations, but it also created confusion about what the project actually is. People thought it was a spec. Or a Llama-only thing. Or another framework.

It's none of those. OGX is a server. Specifically, it's a server-side agentic loop that speaks the native API of every major frontier lab — OpenAI, Anthropic, and Google — so your application code doesn't have to care which one you're using.
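A server-side agentic loop, in the abstract, is a cycle the server runs on the client's behalf: call the model, execute any tool call it requests, feed the result back, and repeat until the model produces a final answer. The toy model and tool below are stand-ins to show the shape of the loop, not OGX's implementation.

```python
# Conceptual sketch of a server-side agentic loop. The server -- not the
# client -- runs tools and re-prompts the model until it stops asking.
from typing import Callable


def agentic_loop(model: Callable, tools: dict, messages: list,
                 max_turns: int = 5) -> str:
    for _ in range(max_turns):
        reply = model(messages)
        if reply.get("tool_call") is None:
            return reply["content"]        # final answer: the loop ends
        name, args = reply["tool_call"]
        result = tools[name](**args)       # the server executes the tool
        messages.append({"role": "tool", "name": name, "content": str(result)})
    raise RuntimeError("agent did not converge")


# Toy model: first asks for a tool, then answers using the tool's result.
def toy_model(messages):
    if not any(m["role"] == "tool" for m in messages):
        return {"tool_call": ("add", {"a": 2, "b": 3})}
    return {"content": f"The sum is {messages[-1]['content']}", "tool_call": None}


answer = agentic_loop(toy_model, {"add": lambda a, b: a + b},
                      [{"role": "user", "content": "What is 2 + 3?"}])
print(answer)  # → The sum is 5
```

Because the loop lives on the server, the client only sees the native protocol it already speaks; which model ran, and where, is the server's concern.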

This post explains why we renamed, what changed in the project's direction, and what that means for you.