voice-to-notes/python/voice_to_notes/providers/openai_provider.py

54 lines
1.5 KiB
Python
Raw Normal View History

"""OpenAI provider — direct OpenAI SDK integration."""
from __future__ import annotations
import os
from typing import Any
from voice_to_notes.providers.base import AIProvider
class OpenAIProvider(AIProvider):
"""Connects to the OpenAI API."""
def __init__(
self,
api_key: str | None = None,
model: str = "gpt-4o-mini",
) -> None:
self._api_key = api_key or os.environ.get("OPENAI_API_KEY", "")
self._model = model
self._client: Any = None
def _ensure_client(self) -> Any:
if self._client is not None:
return self._client
if not self._api_key:
raise RuntimeError("OpenAI API key not configured. Set OPENAI_API_KEY or provide it in settings.")
try:
from openai import OpenAI
self._client = OpenAI(api_key=self._api_key)
except ImportError:
raise RuntimeError("openai package is required. Install with: pip install openai")
return self._client
def chat(self, messages: list[dict[str, str]], **kwargs: Any) -> str:
client = self._ensure_client()
response = client.chat.completions.create(
model=kwargs.get("model", self._model),
messages=messages,
temperature=kwargs.get("temperature", 0.7),
max_tokens=kwargs.get("max_tokens", 2048),
)
return response.choices[0].message.content or ""
def is_available(self) -> bool:
return bool(self._api_key)
@property
def name(self) -> str:
return "OpenAI"