Agents rely on provider-specific API keys to call large language model (LLM) endpoints and other foundation models. StreamNative Cloud Secrets let you manage those credentials centrally and expose them to running agents through the --secrets flag. This guide explains how to map secrets to environment variables for supported agent frameworks and how to retrieve those values at runtime.

Before you start

  1. Create the required secrets in your organization. Each secret maps a path (the secret name) to one or more key entries.
  2. Ensure snctl is configured for the target tenant and namespace.
  3. Decide which environment variable names your agent framework expects (see the sections below).

Secret mapping syntax

When you submit or update an agent, pass provider credentials with the --secrets flag. The JSON payload follows this structure:
snctl agents create \
  --name <agent-name> \
  --directory <package-root> \
  --agent-framework <framework> \
  --secrets '{
    "ENV_NAME": {"path": "secret-name", "key": "entry"}
  }'
  • ENV_NAME becomes an environment variable inside the agent runtime. You can also access the same value through AgentContext.current().get_secret("ENV_NAME"), as shown in the sketch after this list.
  • path references the StreamNative secret name, and key selects the field within that secret.
  • Provide multiple entries in the JSON object to surface several credentials at once.
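For example, with a mapping like the one above, an agent can read the credential either from the environment or through the agent context. The sketch below assumes a Python agent; the AgentContext import path depends on your framework's runtime package and is shown only as a commented placeholder:
import os

# Read the mapped credential as a regular environment variable.
api_key = os.environ.get("ENV_NAME")

# Or fetch the same value through the agent context.
# The import path is a placeholder; use the AgentContext provided by your runtime package.
# from your_runtime_package import AgentContext
# api_key = AgentContext.current().get_secret("ENV_NAME")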

Google Agent Development Kit secrets

Google’s Agent Development Kit (ADK) uses the google-genai SDK, which respects the following environment variables:
  • GOOGLE_API_KEY—API key for the Gemini Developer API.
  • GEMINI_API_KEY—legacy alias; the runtime prefers GOOGLE_API_KEY when both are present.
  • GOOGLE_GENAI_USE_VERTEXAI—set to true when you want to call Vertex AI endpoints instead of the public Gemini API.
  • GOOGLE_CLOUD_PROJECT—required for Vertex AI requests.
  • GOOGLE_CLOUD_LOCATION—the Vertex AI region (for example, us-central1).
Example deployment snippet:
snctl agents create \
  --tenant <tenant> \
  --namespace <namespace> \
  --name gemini-agent \
  --directory multi_tool_agent \
  --agent-framework adk \
  --session-mode SHARED \
  --inputs <input-topic> \
  --output <output-topic> \
  --agent-file multi_tool_agent.zip \
  --secrets '{
    "GOOGLE_API_KEY": {"path": "gemini-secret", "key": "api_key"},
    "GOOGLE_GENAI_USE_VERTEXAI": {"path": "gemini-secret", "key": "use_vertex"},
    "GOOGLE_CLOUD_PROJECT": {"path": "gemini-secret", "key": "project"},
    "GOOGLE_CLOUD_LOCATION": {"path": "gemini-secret", "key": "location"}
  }'
Inside your agent module, read the values with the ADK SDK or directly from the environment:
import os
from google import genai

# genai.Client() picks up GOOGLE_API_KEY or the Vertex AI variables from the environment.
client = genai.Client()
vertex_enabled = os.environ.get("GOOGLE_GENAI_USE_VERTEXAI", "false").lower() in ("true", "1")
Because client = genai.Client() automatically checks these environment variables, no additional configuration is required once the secrets are mapped.
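If you prefer not to rely on environment detection, the google-genai client also accepts the Vertex AI settings as constructor arguments. A minimal sketch, reusing the same secret entries mapped above:
import os
from google import genai

# Explicitly target Vertex AI with the values surfaced by --secrets.
client = genai.Client(
    vertexai=True,
    project=os.environ["GOOGLE_CLOUD_PROJECT"],
    location=os.environ["GOOGLE_CLOUD_LOCATION"],
)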

Manage secrets for OpenAI agents

The OpenAI Agents SDK expects the standard OpenAI environment variables:
  • OPENAI_API_KEY—required for all requests.
  • OPENAI_PROJECT—optional project scoping, used when you organize keys by project.
  • OPENAI_ORG_ID—optional organization identifier.
  • OPENAI_BASE_URL—override for custom endpoints such as Azure OpenAI or on-prem gateways.
Submit an agent with the necessary secrets:
snctl agents create \
  --tenant <tenant> \
  --namespace <namespace> \
  --name openai-agent \
  --directory openai_multi_tool \
  --agent-framework openai \
  --session-mode SHARED \
  --inputs <input-topic> \
  --output <output-topic> \
  --agent-file openai_multi_tool.zip \
  --secrets '{
    "OPENAI_API_KEY": {"path": "openai-secret", "key": "api_key"},
    "OPENAI_PROJECT": {"path": "openai-secret", "key": "project"},
    "OPENAI_BASE_URL": {"path": "openai-secret", "key": "base_url"}
  }'
In your agent code, rely on the SDK’s default environment handling or fetch values manually:
import os
from agents import Agent

# The Agents SDK reads OPENAI_API_KEY (and related variables) from the environment on its own;
# fetch them explicitly only if you need them for custom clients or logging.
api_key = os.environ.get("OPENAI_API_KEY")
base_url = os.environ.get("OPENAI_BASE_URL")

root_agent = Agent(
    name="Assistant",
    instructions="Use the configured OpenAI model to respond.",
)
If you prefer not to expose certain variables broadly, call AgentContext.current().get_secret("OPENAI_API_KEY") inside request handlers instead of reading from os.environ.
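A minimal sketch of that pattern, assuming a Python handler; the handler signature, the model name, and the AgentContext import path are placeholders for whatever your framework's runtime package provides:
from openai import OpenAI

def handle_request(prompt: str) -> str:
    # Fetch the key on demand instead of reading it from the process environment.
    # AgentContext is provided by the agent runtime; the import path depends on your package.
    api_key = AgentContext.current().get_secret("OPENAI_API_KEY")

    # Pass the key explicitly to the OpenAI client rather than relying on OPENAI_API_KEY.
    client = OpenAI(api_key=api_key)
    completion = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    return completion.choices[0].message.content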

Operational tips

  • Store non-string data (for example, JSON configs) as base64-encoded strings inside the secret value; see the decoding sketch after this list.
  • Rotate provider keys by updating the StreamNative secret; re-run snctl agents update with the same --secrets payload to refresh running agents.
  • Share secrets across multiple agents by reusing the same path while pointing each --secrets entry to the appropriate key.
  • Document required environment variables alongside your agent code so collaborators know which secret entries to maintain.
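For the first tip, a minimal sketch of decoding a base64-encoded JSON config inside the agent; the MODEL_CONFIG variable name is illustrative and would be mapped through --secrets like any other entry:
import base64
import json
import os

# The secret entry holds a base64-encoded JSON document; decode it once at startup.
raw = os.environ["MODEL_CONFIG"]
model_config = json.loads(base64.b64decode(raw).decode("utf-8"))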
With this setup, StreamNative Cloud Secrets act as the single source of truth for model credentials while Orca Engine handles secure distribution to ADK and OpenAI agents.