The LLMProfileStore class provides a centralized mechanism for managing LLM configurations.
Define a profile once, reuse it everywhere — across scripts, sessions, and even machines.
The store manages a directory of JSON profile files. By default it uses ~/.openhands/profiles,
but you can point it anywhere.
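Each profile is a single JSON file named after the profile. The exact on-disk schema is not shown here, but a hypothetical `fast.json` (field names mirror the `LLM` constructor arguments used in the examples on this page, with the secret masked as saved by default) might look like:

```json
{
  "usage_id": "fast",
  "model": "anthropic/claude-sonnet-4-5-20250929",
  "temperature": 0.2,
  "api_key": "**********"
}
```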
```python
from openhands.sdk import LLMProfileStore

# Default location: ~/.openhands/profiles
store = LLMProfileStore()

# Or bring your own directory
store = LLMProfileStore(base_dir="./my-profiles")
```
Secret fields are masked by default for security, so the saved JSON keeps the field shape without exposing the
real value. Pass include_secrets=True to persist the actual secret values.
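Conceptually, a masked save keeps the secret field in place but replaces its value. A stdlib-only sketch of that behavior (the real store's serialization may differ; `dump_profile` is a hypothetical helper, not part of the SDK):

```python
import json


def dump_profile(profile: dict, include_secrets: bool = False) -> str:
    """Serialize a profile dict, masking the api_key unless asked not to."""
    data = dict(profile)
    if not include_secrets and data.get("api_key"):
        # Keep the field shape so loaders still see the key, but hide the value.
        data["api_key"] = "**********"
    return json.dumps(data, indent=2)


profile = {"model": "anthropic/claude-sonnet-4-5-20250929", "api_key": "sk-live-123"}
print(dump_profile(profile))                        # api_key masked
print(dump_profile(profile, include_secrets=True))  # real value persisted
```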
This directory-based example ships with a pre-generated profiles/fast.json file (created from a normal save) and then creates a second profile at runtime in a temporary store.
"""Example: Using LLMProfileStore to save and reuse LLM configurations.This example ships with one pre-generated profile JSON file and creates anotherprofile at runtime. The checked-in profile comes from a normal save, so secretsare masked instead of exposed and non-secret fields like `base_url` are keptwhen present."""import osimport shutilimport tempfilefrom pathlib import Pathfrom pydantic import SecretStrfrom openhands.sdk import LLM, LLMProfileStoreSCRIPT_DIR = Path(__file__).parentEXAMPLE_PROFILES_DIR = SCRIPT_DIR / "profiles"DEFAULT_MODEL = "anthropic/claude-sonnet-4-5-20250929"profile_store_dir = Path(tempfile.mkdtemp()) / "profiles"shutil.copytree(EXAMPLE_PROFILES_DIR, profile_store_dir)store = LLMProfileStore(base_dir=profile_store_dir)print(f"Seeded profiles: {store.list()}")api_key = os.getenv("LLM_API_KEY")creative_llm = LLM( usage_id="creative", model=os.getenv("LLM_MODEL", DEFAULT_MODEL), api_key=SecretStr(api_key) if api_key else None, base_url=os.getenv("LLM_BASE_URL"), temperature=0.9,)# The checked-in fast.json was generated with a normal save, so its api_key is# masked and any configured base_url would be preserved. This runtime profile# also avoids persisting the real API key because secrets are masked by default.store.save("creative", creative_llm)creative_profile_json = (profile_store_dir / "creative.json").read_text()if api_key is not None: assert api_key not in creative_profile_jsonprint(f"Stored profiles: {store.list()}")fast_profile = store.load("fast")creative_profile = store.load("creative")print( "Loaded fast profile. " f"usage: {fast_profile.usage_id}, " f"model: {fast_profile.model}, " f"temperature: {fast_profile.temperature}.")print( "Loaded creative profile. " f"usage: {creative_profile.usage_id}, " f"model: {creative_profile.model}, " f"temperature: {creative_profile.temperature}.")store.delete("creative")print(f"After deletion: {store.list()}")print("EXAMPLE_COST: 0")
You can run the example code as-is.
The model name should follow the LiteLLM convention: provider/model_name (e.g., anthropic/claude-sonnet-4-5-20250929, openai/gpt-4o).
The LLM_API_KEY should be the API key for your chosen provider.
ChatGPT Plus/Pro subscribers: You can use LLM.subscription_login() to authenticate with your ChatGPT account and access Codex models without consuming API credits. See the LLM Subscriptions guide for details.
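The provider/model_name convention is just a slash-separated string, so the provider prefix can be peeled off with a plain split. A quick sketch:

```python
# LiteLLM-style model strings take the form "<provider>/<model_name>".
# Split on the first slash to recover the two parts.
model = "anthropic/claude-sonnet-4-5-20250929"
provider, model_name = model.split("/", 1)
print(provider)    # anthropic
print(model_name)  # claude-sonnet-4-5-20250929
```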
You can use a saved profile to switch the active model on a running conversation between turns. This is useful when you want to start with one model, then switch to another for later user messages while keeping the same conversation history and combined usage metrics.
"""Mid-conversation model switching.Usage: uv run examples/01_standalone_sdk/44_model_switching_in_convo.py"""import osfrom openhands.sdk import LLM, Agent, LocalConversation, Toolfrom openhands.sdk.llm.llm_profile_store import LLMProfileStorefrom openhands.tools.terminal import TerminalToolLLM_API_KEY = os.getenv("LLM_API_KEY")store = LLMProfileStore()store.save( "gpt", LLM(model="openhands/gpt-5.2", api_key=LLM_API_KEY), include_secrets=True,)agent = Agent( llm=LLM( model=os.getenv("LLM_MODEL", "openhands/claude-sonnet-4-5-20250929"), api_key=LLM_API_KEY, ), tools=[Tool(name=TerminalTool.name)],)conversation = LocalConversation(agent=agent, workspace=os.getcwd())# Send a message with the default modelconversation.send_message("Say hello in one sentence.")conversation.run()# Switch to a different model and send another messageconversation.switch_profile("gpt")print(f"Switched to: {conversation.agent.llm.model}")conversation.send_message("Say goodbye in one sentence.")conversation.run()# Print metrics per modelfor usage_id, metrics in conversation.state.stats.usage_to_metrics.items(): print(f" [{usage_id}] cost=${metrics.accumulated_cost:.6f}")combined = conversation.state.stats.get_combined_metrics()print(f"Total cost: ${combined.accumulated_cost:.6f}")print(f"EXAMPLE_COST: {combined.accumulated_cost}")store.delete("gpt")
You can run the example code as-is.
The model name should follow the LiteLLM convention: provider/model_name (e.g., anthropic/claude-sonnet-4-5-20250929, openai/gpt-4o).
The LLM_API_KEY should be the API key for your chosen provider.
ChatGPT Plus/Pro subscribers: You can use LLM.subscription_login() to authenticate with your ChatGPT account and access Codex models without consuming API credits. See the LLM Subscriptions guide for details.