Rust SDK

gproxy-sdk is the entry crate for the GPROXY Rust SDK. It exposes the protocol types, routing helpers, and the provider engine through one surface — suitable for Rust developers who want to assemble their own LLM agent, gateway, forwarding layer, or multi-upstream aggregation service without running the full GPROXY server.

sdk/gproxy-sdk/src/lib.rs re-exports the three layers:

  • pub use gproxy_protocol as protocol;
  • pub use gproxy_channel as channel;
  • pub use gproxy_engine as engine;
| Crate | Re-exported as | Layer | Responsibility |
| --- | --- | --- | --- |
| gproxy-protocol | gproxy_sdk::protocol | L0 | Wire-format types for Claude, OpenAI, and Gemini, plus cross-protocol transform conversions. Light dependencies, no HTTP. |
| gproxy-channel | gproxy_sdk::channel | L1 | The Channel trait, concrete channel implementations (OpenAI, Anthropic, Gemini, …), credential types, request/response types, billing, health tracking, and token counting. Use this layer when you want a strongly typed single-provider client. |
| gproxy-engine | gproxy_sdk::engine | L2 | The full multi-channel GproxyEngine, ProviderStore, retry/credential affinity, backend traits for rate-limit/quota/affinity state, and routing helpers. Use this layer to build your own LLM gateway. |

None of the three has a dependency on the database, the HTTP server, or Axum. You can build an entirely different service on top of them.
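To make the layering concrete, here is a self-contained sketch of the shape this architecture takes: provider clients behind a common trait (the L1 role) and an engine that owns several of them and routes between them (the L2 role). The names `Channel`, `EchoChannel`, and `Engine` below are simplified stand-ins for illustration, not the SDK's actual types.

```rust
/// L1 role: a provider-specific client behind a common trait.
trait Channel {
    fn name(&self) -> &'static str;
    fn complete(&self, prompt: &str) -> String;
}

/// A toy channel that just echoes the prompt back.
struct EchoChannel;

impl Channel for EchoChannel {
    fn name(&self) -> &'static str {
        "echo"
    }
    fn complete(&self, prompt: &str) -> String {
        format!("echo: {prompt}")
    }
}

/// L2 role: an engine that owns several channels and routes by name.
struct Engine {
    channels: Vec<Box<dyn Channel>>,
}

impl Engine {
    fn route(&self, name: &str, prompt: &str) -> Option<String> {
        self.channels
            .iter()
            .find(|c| c.name() == name)
            .map(|c| c.complete(prompt))
    }
}

fn main() {
    let engine = Engine {
        channels: vec![Box::new(EchoChannel)],
    };
    assert_eq!(engine.route("echo", "hi"), Some("echo: hi".to_string()));
    assert_eq!(engine.route("missing", "hi"), None);
}
```

Because the trait objects carry no HTTP or storage dependencies, a design like this can sit inside any larger service, which is the same property the real three-layer split gives you.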

Add the SDK to your project. If you only need one channel, disable the default features and opt into the one you want:

```sh
cargo add gproxy-sdk --no-default-features --features openai
```
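The same dependency can also be declared by hand in your Cargo.toml. The version below is a placeholder, not a real release number:

```toml
[dependencies]
gproxy-sdk = { version = "*", default-features = false, features = ["openai"] }
```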

Then build a minimal engine:

```rust
use gproxy_sdk::channel::{
    channels::openai::{OpenAiChannel, OpenAiCredential, OpenAiSettings},
    health::ModelCooldownHealth,
};
use gproxy_sdk::engine::GproxyEngine;

let engine = GproxyEngine::builder()
    .add_provider(
        "openai-main",
        OpenAiChannel,
        OpenAiSettings::default(),
        vec![(
            OpenAiCredential {
                api_key: std::env::var("OPENAI_API_KEY").expect("OPENAI_API_KEY"),
            },
            ModelCooldownHealth::default(),
        )],
    )
    .enable_usage(true)
    .enable_upstream_log(true)
    .enable_upstream_log_body(false)
    .build();

let providers = engine.store().list_providers().unwrap();
assert_eq!(providers.len(), 1);
```

This is the minimal viable setup: one provider, one credential, health tracked by ModelCooldownHealth, usage and upstream logging on (body capture off).
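ModelCooldownHealth's actual API lives in the source under sdk/; to illustrate what per-model cooldown health tracking means in practice, here is a minimal standalone sketch (not the SDK's implementation): after a failure, a model is put on cooldown and reported unhealthy until the cooldown expires.

```rust
use std::collections::HashMap;
use std::time::{Duration, Instant};

/// Illustrative stand-in for the idea behind `ModelCooldownHealth`.
struct CooldownHealth {
    cooldown: Duration,
    /// Per-model instant at which the cooldown ends.
    until: HashMap<String, Instant>,
}

impl CooldownHealth {
    fn new(cooldown: Duration) -> Self {
        Self {
            cooldown,
            until: HashMap::new(),
        }
    }

    /// Record a failure: the model becomes unhealthy for `cooldown`.
    fn record_failure(&mut self, model: &str) {
        self.until
            .insert(model.to_string(), Instant::now() + self.cooldown);
    }

    /// A model is healthy if it has no active cooldown.
    fn is_healthy(&self, model: &str) -> bool {
        match self.until.get(model) {
            Some(t) => Instant::now() >= *t,
            None => true,
        }
    }
}

fn main() {
    let mut health = CooldownHealth::new(Duration::from_secs(30));
    assert!(health.is_healthy("gpt-4o"));
    health.record_failure("gpt-4o");
    assert!(!health.is_healthy("gpt-4o"));
}
```

A router can consult `is_healthy` before dispatching a request and fall back to another credential or provider while a model is cooling down.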

The SDK's feature flags are declared in sdk/gproxy-sdk/Cargo.toml:

| Feature | Forwards to | Notes |
| --- | --- | --- |
| default | all-channels | Enables every channel. |
| all-channels | gproxy-channel/all-channels + gproxy-engine/all-channels | Umbrella for all channel features. |
| openai | gproxy-channel/openai + gproxy-engine/openai | OpenAI channel. |
| anthropic | gproxy-channel/anthropic + gproxy-engine/anthropic | Anthropic channel. |
| aistudio | gproxy-channel/aistudio + gproxy-engine/aistudio | Google AI Studio channel. |
| vertex | gproxy-channel/vertex + gproxy-engine/vertex | Vertex AI channel. |
| vertexexpress | gproxy-channel/vertexexpress + gproxy-engine/vertexexpress | Vertex AI Express channel. |
| geminicli | gproxy-channel/geminicli + gproxy-engine/geminicli | Gemini CLI channel. |
| claudecode | gproxy-channel/claudecode + gproxy-engine/claudecode | Claude Code channel. |
| codex | gproxy-channel/codex + gproxy-engine/codex | Codex channel. |
| antigravity | gproxy-channel/antigravity + gproxy-engine/antigravity | Antigravity channel. |
| nvidia | gproxy-channel/nvidia + gproxy-engine/nvidia | NVIDIA channel. |
| deepseek | gproxy-channel/deepseek + gproxy-engine/deepseek | DeepSeek channel. |
| groq | gproxy-channel/groq + gproxy-engine/groq | Groq channel. |
| openrouter | gproxy-channel/openrouter + gproxy-engine/openrouter | OpenRouter channel. |
| custom | gproxy-channel/custom + gproxy-engine/custom | Custom OpenAI-compatible channel. |
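Read directly off the table, this is the standard Cargo umbrella-feature pattern: each SDK feature simply forwards to the matching feature in the two underlying crates. An abridged sketch of what such a `[features]` section looks like (not the actual file, which lists every channel):

```toml
[features]
default = ["all-channels"]
all-channels = ["gproxy-channel/all-channels", "gproxy-engine/all-channels"]
openai = ["gproxy-channel/openai", "gproxy-engine/openai"]
anthropic = ["gproxy-channel/anthropic", "gproxy-engine/anthropic"]
```

Because every per-channel feature is additive, you can combine several (for example `--features openai,anthropic`) without pulling in the rest.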

The SDK layer does not expose a redis feature; the workspace uses Redis only from the full server binary.

  • Use the binary when you want a working multi-tenant LLM proxy with a console, storage, and background workers out of the box.
  • Use the SDK when you need the routing / protocol-transform / provider-engine pieces inside a larger Rust service — for example, an agent runtime that occasionally needs to fan out to several upstreams, or a custom gateway with its own auth and storage model.

Most of the interesting types — GproxyEngine, ProviderStore, the Channel trait, ModelCooldownHealth, transform::* — have doc-comments in their source files under sdk/.