Tauri v2 + Svelte + TypeScript frontend:
- App shell with workspace layout (waveform, transcript, speakers, AI chat)
- Placeholder components for all major UI areas
- Typed stores (project, transcript, playback, AI)
- TypeScript interfaces matching the database schema
- Tauri bridge service with typed invoke wrappers
- svelte-check passes with 0 errors
Rust backend:
- Tauri v2 app entry point with command registration
- SQLite database layer (rusqlite with bundled SQLite)
- Full schema: projects, media_files, speakers, segments, words,
ai_outputs, annotations (with indexes)
- Model structs with serde serialization
- CRUD queries for projects, speakers, segments, words
- Segment text editing preserves original text
- Schema versioning for future migrations
- 6 tests passing
- Command stubs for project, transcribe, export, AI, settings, system
- App state management
Python sidecar:
- JSON-line IPC protocol (stdin/stdout)
- Message types: IPCMessage, progress, error, ready
- Handler registry with routing and error handling
- Ping/pong handler for connectivity testing
- Service stubs: transcribe, diarize, pipeline, AI, export
- Provider stubs: local (llama-server), OpenAI, Anthropic, LiteLLM
- Hardware detection stubs
- 14 tests passing, ruff clean
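The JSON-line IPC shape above can be sketched as follows. This is an illustrative sketch, not the sidecar's actual code: the field names (`type`, `payload`) and the `dispatch`/`handler` helpers are assumptions chosen for the example.

```python
"""Sketch of a JSON-line IPC handler registry (illustrative field names)."""

import json
from typing import Any, Callable

# Handler registry: message type -> handler function (hypothetical layout)
HANDLERS: dict[str, Callable[[dict[str, Any]], dict[str, Any]]] = {}


def handler(msg_type: str):
    """Register a handler for a given message type."""
    def register(fn):
        HANDLERS[msg_type] = fn
        return fn
    return register


@handler("ping")
def handle_ping(payload: dict[str, Any]) -> dict[str, Any]:
    # Connectivity check: reply with a pong
    return {"type": "pong"}


def dispatch(line: str) -> str:
    """Route one JSON line to its handler; failures become error messages."""
    try:
        msg = json.loads(line)
        fn = HANDLERS[msg["type"]]
        return json.dumps(fn(msg.get("payload", {})))
    except Exception as exc:  # malformed JSON, unknown type, handler error
        return json.dumps({"type": "error", "message": str(exc)})
```

In this sketch, each stdin line is parsed, routed by its `type` field, and the handler's reply is serialized back to a single stdout line, so the Rust side can treat the sidecar as a line-oriented request/response channel.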
Also adds:
- Testing strategy document (docs/TESTING.md)
- Validation script (scripts/validate.sh)
- Updated .gitignore for Svelte, Rust, Python artifacts
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
24 lines · 676 B · Python
"""Abstract base class for AI providers."""
|
|
|
|
from __future__ import annotations
|
|
|
|
from abc import ABC, abstractmethod
|
|
from collections.abc import AsyncIterator
|
|
from typing import Any
|
|
|
|
|
|
class AIProvider(ABC):
|
|
"""Base interface for all AI providers."""
|
|
|
|
@abstractmethod
|
|
async def chat(self, messages: list[dict[str, Any]], config: dict[str, Any]) -> str:
|
|
"""Send a chat completion request and return the response."""
|
|
...
|
|
|
|
@abstractmethod
|
|
async def stream(
|
|
self, messages: list[dict[str, Any]], config: dict[str, Any]
|
|
) -> AsyncIterator[str]:
|
|
"""Send a streaming chat request, yielding tokens as they arrive."""
|
|
...
|
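For illustration, a concrete provider implementing this interface might look like the sketch below. `EchoProvider` and its echo behavior are invented for the example (none of the real provider stubs work this way); the base class is repeated so the snippet runs standalone.

```python
"""Hypothetical AIProvider subclass showing how the interface is consumed."""

import asyncio
from abc import ABC, abstractmethod
from collections.abc import AsyncIterator
from typing import Any


class AIProvider(ABC):
    # Repeated here (same signatures as above) so the example is self-contained.
    @abstractmethod
    async def chat(self, messages: list[dict[str, Any]], config: dict[str, Any]) -> str:
        ...

    @abstractmethod
    async def stream(
        self, messages: list[dict[str, Any]], config: dict[str, Any]
    ) -> AsyncIterator[str]:
        ...


class EchoProvider(AIProvider):
    """Toy provider that echoes the last user message (useful in tests)."""

    async def chat(self, messages: list[dict[str, Any]], config: dict[str, Any]) -> str:
        # Return the content of the final message verbatim
        return messages[-1]["content"]

    async def stream(
        self, messages: list[dict[str, Any]], config: dict[str, Any]
    ) -> AsyncIterator[str]:
        # Yield the reply word by word to mimic token streaming
        for token in messages[-1]["content"].split():
            yield token


if __name__ == "__main__":
    reply = asyncio.run(EchoProvider().chat([{"role": "user", "content": "hi"}], {}))
    print(reply)
```

Callers use `await provider.chat(...)` for one-shot completions and `async for token in provider.stream(...)` for incremental output; swapping in the local, OpenAI, Anthropic, or LiteLLM provider changes only which subclass is constructed.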