Fix Ollama: set_active after configure + fix default URL

The configure action registered the provider but never called
set_active(), so the sidecar kept using the old/default provider.
Also changed the local provider's default base URL from
localhost:8080 to localhost:11434/v1 (Ollama's OpenAI-compatible
endpoint) and its default model to llama3.2. Added debug logging
to the configure path.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
This commit is contained in:
Claude
2026-03-22 19:55:06 -07:00
parent da49c04119
commit ca5dc98d24


@@ -254,15 +254,15 @@ def make_ai_chat_handler() -> HandlerFunc:
         )
         if action == "configure":
-            # Re-create a provider with custom settings
+            # Re-create a provider with custom settings and set it active
             provider_name = payload.get("provider", "")
             config = payload.get("config", {})
             if provider_name == "local":
                 from voice_to_notes.providers.local_provider import LocalProvider
                 service.register_provider("local", LocalProvider(
-                    base_url=config.get("base_url", "http://localhost:8080"),
-                    model=config.get("model", "local"),
+                    base_url=config.get("base_url", "http://localhost:11434/v1"),
+                    model=config.get("model", "llama3.2"),
                 ))
             elif provider_name == "openai":
                 from voice_to_notes.providers.openai_provider import OpenAIProvider
@@ -286,6 +286,10 @@ def make_ai_chat_handler() -> HandlerFunc:
                     api_key=config.get("api_key"),
                     api_base=config.get("api_base"),
                 ))
+            # Set the configured provider as active
+            print(f"[sidecar] Configured AI provider: {provider_name} with config: {config}", file=sys.stderr, flush=True)
+            if provider_name in ("local", "openai", "anthropic", "litellm"):
+                service.set_active(provider_name)
             return IPCMessage(
                 id=msg.id,
                 type="ai.configured",
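The new default base URL points at Ollama's OpenAI-compatible API, where chat requests go to `chat/completions` under the `/v1` prefix. A minimal sketch (no Ollama server required, and assuming the standard OpenAI chat payload shape) of how a request against that default would be assembled:

```python
import json
from urllib.parse import urljoin

base_url = "http://localhost:11434/v1"  # the new default from this commit

# urljoin needs a trailing slash on the base, or it would drop the
# /v1 path segment when joining.
endpoint = urljoin(base_url + "/", "chat/completions")

body = json.dumps({
    "model": "llama3.2",  # the new default model from this commit
    "messages": [{"role": "user", "content": "Hello"}],
})

assert endpoint == "http://localhost:11434/v1/chat/completions"
```

The trailing-slash detail matters: `urljoin("http://localhost:11434/v1", "chat/completions")` would resolve to `http://localhost:11434/chat/completions`, silently losing the versioned prefix.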