Phase 1 foundation: Tauri shell, Python sidecar, SQLite database
Tauri v2 + Svelte + TypeScript frontend:
- App shell with workspace layout (waveform, transcript, speakers, AI chat)
- Placeholder components for all major UI areas
- Typed stores (project, transcript, playback, AI)
- TypeScript interfaces matching the database schema
- Tauri bridge service with typed invoke wrappers
- svelte-check passes with 0 errors
Rust backend:
- Tauri v2 app entry point with command registration
- SQLite database layer (rusqlite with bundled SQLite)
- Full schema: projects, media_files, speakers, segments, words,
ai_outputs, annotations (with indexes)
- Model structs with serde serialization
- CRUD queries for projects, speakers, segments, words
- Segment text editing preserves original text
- Schema versioning for future migrations
- 6 tests passing
- Command stubs for project, transcribe, export, AI, settings, system
- App state management
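The schema and versioning described above live in the Rust layer (rusqlite). As an illustrative sketch only, not the actual Rust code, the same shape can be shown with Python's sqlite3: two of the seven tables, one index, and the `user_version` pragma used for migration tracking (column names here are assumptions, not the real schema):

```python
import sqlite3

# Sketch of the schema shape; the real layer is Rust (rusqlite), and
# these column names are illustrative placeholders.
conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE projects (
        id INTEGER PRIMARY KEY,
        name TEXT NOT NULL,
        created_at TEXT NOT NULL DEFAULT (datetime('now'))
    );
    CREATE TABLE segments (
        id INTEGER PRIMARY KEY,
        project_id INTEGER NOT NULL REFERENCES projects(id),
        speaker_id INTEGER,
        start_ms INTEGER NOT NULL,
        end_ms INTEGER NOT NULL,
        text TEXT NOT NULL,
        original_text TEXT  -- kept when the user edits `text`
    );
    CREATE INDEX idx_segments_project ON segments(project_id);
    """
)
# Schema versioning for future migrations via the user_version pragma.
conn.execute("PRAGMA user_version = 1")
version = conn.execute("PRAGMA user_version").fetchone()[0]
```

The remaining tables (media_files, speakers, words, ai_outputs, annotations) would follow the same pattern; `original_text` illustrates how segment edits can preserve the original transcription.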
Python sidecar:
- JSON-line IPC protocol (stdin/stdout)
- Message types: IPCMessage, progress, error, ready
- Handler registry with routing and error handling
- Ping/pong handler for connectivity testing
- Service stubs: transcribe, diarize, pipeline, AI, export
- Provider stubs: local (llama-server), OpenAI, Anthropic, LiteLLM
- Hardware detection stubs
- 14 tests passing, ruff clean
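The IPC pieces above can be sketched in a few lines. This is a minimal illustration of the JSON-line protocol with a handler registry and ping/pong, not the sidecar's actual classes (names like `dispatch` are placeholders): one JSON object per line, unknown types and bad JSON routed to an error message.

```python
import json
from typing import Any, Callable

# Illustrative handler registry: message type -> handler function.
HANDLERS: dict[str, Callable[[dict[str, Any]], dict[str, Any]]] = {}

def handler(msg_type: str):
    """Decorator registering a handler for one message type."""
    def register(fn):
        HANDLERS[msg_type] = fn
        return fn
    return register

@handler("ping")
def ping(payload: dict[str, Any]) -> dict[str, Any]:
    # Connectivity check: reply with a pong message.
    return {"type": "pong"}

def dispatch(line: str) -> str:
    """Route one JSON line to its handler; failures become error messages."""
    try:
        msg = json.loads(line)
        result = HANDLERS[msg["type"]](msg.get("payload", {}))
    except Exception as exc:  # bad JSON, unknown type, handler failure
        result = {"type": "error", "message": str(exc)}
    return json.dumps(result)
```

In the real sidecar the loop would read lines from stdin and write responses to stdout; the registry pattern keeps each service (transcribe, diarize, export, ...) as an independently registered handler.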
Also adds:
- Testing strategy document (docs/TESTING.md)
- Validation script (scripts/validate.sh)
- Updated .gitignore for Svelte, Rust, Python artifacts
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

python/voice_to_notes/providers/__init__.py (new file, +1)
@@ -0,0 +1 @@
+"""AI provider adapters — local (llama-server), LiteLLM, OpenAI, Anthropic."""
python/voice_to_notes/providers/anthropic_provider.py (new file, +5)
@@ -0,0 +1,5 @@
+"""Anthropic provider — direct Anthropic SDK integration."""
+
+from __future__ import annotations
+
+# TODO: Implement Anthropic provider
python/voice_to_notes/providers/base.py (new file, +23)
@@ -0,0 +1,23 @@
+"""Abstract base class for AI providers."""
+
+from __future__ import annotations
+
+from abc import ABC, abstractmethod
+from collections.abc import AsyncIterator
+from typing import Any
+
+
+class AIProvider(ABC):
+    """Base interface for all AI providers."""
+
+    @abstractmethod
+    async def chat(self, messages: list[dict[str, Any]], config: dict[str, Any]) -> str:
+        """Send a chat completion request and return the response."""
+        ...
+
+    @abstractmethod
+    async def stream(
+        self, messages: list[dict[str, Any]], config: dict[str, Any]
+    ) -> AsyncIterator[str]:
+        """Send a streaming chat request, yielding tokens as they arrive."""
+        ...
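Since the concrete providers are still TODO stubs, here is a self-contained usage sketch of the interface: a toy provider (the ABC is repeated inline so the example runs on its own, and `EchoProvider` is purely illustrative) showing how `chat` returns a full response while `stream` yields tokens as an async generator.

```python
import asyncio
from abc import ABC, abstractmethod
from collections.abc import AsyncIterator
from typing import Any

class AIProvider(ABC):
    # Same interface as base.py, repeated so this sketch is self-contained.
    @abstractmethod
    async def chat(self, messages: list[dict[str, Any]], config: dict[str, Any]) -> str: ...

    @abstractmethod
    async def stream(
        self, messages: list[dict[str, Any]], config: dict[str, Any]
    ) -> AsyncIterator[str]: ...

class EchoProvider(AIProvider):
    """Toy provider: echoes the last user message, token by token."""

    async def chat(self, messages, config):
        return messages[-1]["content"]

    async def stream(self, messages, config):
        # `async def` with `yield` makes this an async generator,
        # which satisfies the AsyncIterator[str] contract.
        for token in messages[-1]["content"].split():
            yield token

async def demo() -> tuple[str, list[str]]:
    p = EchoProvider()
    msgs = [{"role": "user", "content": "hello world"}]
    reply = await p.chat(msgs, {})
    tokens = [t async for t in p.stream(msgs, {})]
    return reply, tokens

reply, tokens = asyncio.run(demo())
```

A real provider would replace the echo bodies with SDK calls, but the call sites (await `chat`, `async for` over `stream`) stay the same.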
python/voice_to_notes/providers/litellm_provider.py (new file, +5)
@@ -0,0 +1,5 @@
+"""LiteLLM provider — multi-provider gateway."""
+
+from __future__ import annotations
+
+# TODO: Implement LiteLLM provider
python/voice_to_notes/providers/local_provider.py (new file, +9)
@@ -0,0 +1,9 @@
+"""Local AI provider — bundled llama-server (OpenAI-compatible API)."""
+
+from __future__ import annotations
+
+
+# TODO: Implement local provider
+# - Connect to llama-server on localhost:{port}
+# - Use openai SDK with custom base_url
+# - Support chat and streaming
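The TODO notes above outline the plan: talk to the bundled llama-server through its OpenAI-compatible endpoint. As a minimal sketch of the request shape only (the port and model name are placeholders, and the real implementation would use the openai SDK with a custom `base_url` rather than building requests by hand):

```python
import json

def build_chat_request(
    port: int, messages: list[dict], stream: bool = False
) -> tuple[str, bytes]:
    """Build the URL and JSON body for an OpenAI-compatible chat call.

    llama-server serves /v1/chat/completions, so the openai SDK can
    target it via base_url=f"http://localhost:{port}/v1". The model
    name "local" below is a placeholder, not a real configuration.
    """
    url = f"http://localhost:{port}/v1/chat/completions"
    body = json.dumps(
        {"model": "local", "messages": messages, "stream": stream}
    ).encode()
    return url, body

url, body = build_chat_request(8080, [{"role": "user", "content": "hi"}])
```

Setting `stream: true` in the same payload switches the server to token-by-token server-sent events, which is how the `stream` method of the provider interface would be backed.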
python/voice_to_notes/providers/openai_provider.py (new file, +5)
@@ -0,0 +1,5 @@
+"""OpenAI provider — direct OpenAI SDK integration."""
+
+from __future__ import annotations
+
+# TODO: Implement OpenAI provider