Configuration

Set up openstackai for your environment.


Environment Variables

OpenAI

export OPENAI_API_KEY=sk-your-api-key

Azure OpenAI

# With API Key
export AZURE_OPENAI_API_KEY=your-api-key
export AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com/
export AZURE_OPENAI_DEPLOYMENT=gpt-4o-mini

# With Azure AD (recommended - no API key needed!)
export AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com/
export AZURE_OPENAI_DEPLOYMENT=gpt-4o-mini
# Uses your az login credentials automatically

Anthropic

export ANTHROPIC_API_KEY=sk-ant-your-api-key

Ollama (Local)

# No configuration needed for default
# Or specify custom endpoint
export OLLAMA_HOST=http://localhost:11434
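Since Ollama runs locally, it can help to confirm the endpoint is reachable before pointing openstackai at it. A minimal sketch, assuming Ollama's behavior of answering a plain GET on its root URL; the `is_ollama_running` helper is hypothetical and not part of openstackai:

```python
import urllib.request
import urllib.error

def is_ollama_running(host: str = "http://localhost:11434", timeout: float = 2.0) -> bool:
    """Return True if an Ollama server responds at the given host."""
    try:
        # A running Ollama server answers GET / with "Ollama is running"
        with urllib.request.urlopen(host, timeout=timeout):
            return True
    except (urllib.error.URLError, OSError):
        return False
```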

Programmatic Configuration

import openstackai

openstackai.configure(
    api_key="sk-...",
    model="gpt-4o",
    temperature=0.7,
)

Provider-Specific Configuration

OpenAI

from openstackai.core import OpenAIProvider, LLMConfig

provider = OpenAIProvider(LLMConfig(
    api_key="sk-...",
    model="gpt-4o-mini",
    temperature=0.7,
    max_tokens=4096,
))

Azure OpenAI

from openstackai.core import AzureOpenAIProvider, LLMConfig

# API Key auth
# API Key auth
provider = AzureOpenAIProvider(LLMConfig(
    api_key="your-key",
    api_base="https://your-resource.openai.azure.com/",
    model="gpt-4o-mini",
    api_version="2024-02-15-preview",
))

# Azure AD auth (recommended for enterprise)
provider = AzureOpenAIProvider(LLMConfig(
    api_base="https://your-resource.openai.azure.com/",
    model="gpt-4o-mini",
    # No api_key = uses DefaultAzureCredential
))

Anthropic

from openstackai.core import AnthropicProvider, LLMConfig

provider = AnthropicProvider(LLMConfig(
    api_key="sk-ant-...",
    model="claude-3-sonnet-20240229",
    max_tokens=4096,
))

Model Selection

openstackai automatically selects a provider based on your environment, checking in this order:

  1. If AZURE_OPENAI_ENDPOINT is set → Azure OpenAI
  2. If OPENAI_API_KEY is set → OpenAI
  3. If ANTHROPIC_API_KEY is set → Anthropic
  4. If Ollama is running → Ollama
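The priority order above can be sketched as a simple environment check. This is an illustrative sketch of the selection rules, not the library's actual detection code:

```python
import os

def detect_provider(env=None) -> str:
    """Mirror openstackai's provider-selection priority (illustrative only)."""
    env = os.environ if env is None else env
    if env.get("AZURE_OPENAI_ENDPOINT"):
        return "azure"
    if env.get("OPENAI_API_KEY"):
        return "openai"
    if env.get("ANTHROPIC_API_KEY"):
        return "anthropic"
    # Falls through to a local Ollama server if one is running
    return "ollama"
```

Note that Azure wins even when both `AZURE_OPENAI_ENDPOINT` and `OPENAI_API_KEY` are set, so unset the Azure variables if you want plain OpenAI.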

Override with explicit provider:

from openstackai import Agent
from openstackai.core import AzureOpenAIProvider, LLMConfig

agent = Agent(
    name="Bot",
    instructions="...",
    llm=AzureOpenAIProvider(LLMConfig(...)),
)

YAML Configuration

Define agents in YAML files:

# agents/research_assistant.yaml
name: ResearchAssistant
instructions: |
  You are a research assistant that helps users find information.
  Be thorough and cite your sources.
model: gpt-4o-mini
temperature: 0.7
tools:
  - web_search
  - summarize
memory:
  type: conversation
  max_messages: 50

Load and use:

from openstackai.config import load_agent, AgentBuilder

config = load_agent("agents/research_assistant.yaml")
agent = AgentBuilder.from_config(config).build()

Default Settings

| Setting | Default | Description |
| --- | --- | --- |
| model | gpt-4o-mini | Default model |
| temperature | 0.7 | Creativity level |
| max_tokens | 4096 | Max output tokens |
| timeout | 60 | Request timeout (seconds) |
| retry_count | 3 | Retry attempts |
| retry_delay | 1.0 | Delay between retries (seconds) |
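The retry defaults (`retry_count=3`, `retry_delay=1.0`) suggest behavior along these lines; `call_with_retries` is a hypothetical illustration of that policy, not openstackai's actual implementation:

```python
import time

def call_with_retries(fn, retry_count: int = 3, retry_delay: float = 1.0):
    """Call fn(); on failure, retry up to retry_count more times,
    sleeping retry_delay seconds between attempts."""
    last_error = None
    for attempt in range(retry_count + 1):  # initial call + retries
        try:
            return fn()
        except Exception as exc:
            last_error = exc
            if attempt < retry_count:
                time.sleep(retry_delay)
    raise last_error
```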

Next Steps

  • [[Quick Start]] - Run your first program
  • [[Azure AD Auth]] - Enterprise authentication
  • [[Agent]] - Create agents