Phase 1 foundation: Tauri shell, Python sidecar, SQLite database

Tauri v2 + Svelte + TypeScript frontend:
- App shell with workspace layout (waveform, transcript, speakers, AI chat)
- Placeholder components for all major UI areas
- Typed stores (project, transcript, playback, AI)
- TypeScript interfaces matching the database schema
- Tauri bridge service with typed invoke wrappers
- svelte-check passes with 0 errors

Rust backend:
- Tauri v2 app entry point with command registration
- SQLite database layer (rusqlite with bundled SQLite)
  - Full schema: projects, media_files, speakers, segments, words,
    ai_outputs, annotations (with indexes)
  - Model structs with serde serialization
  - CRUD queries for projects, speakers, segments, words
  - Segment text editing preserves original text
  - Schema versioning for future migrations
  - 6 tests passing
- Command stubs for project, transcribe, export, AI, settings, system
- App state management
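
The schema-versioning approach can be illustrated with a short sketch. The real layer is Rust (rusqlite), but the same pattern works in Python's stdlib `sqlite3`; the migration SQL and the `migrate` helper below are illustrative, not the actual schema:

```python
import sqlite3

# Migrations keyed by the version they upgrade *to*. Contents are
# illustrative placeholders, not the app's real schema.
MIGRATIONS = {
    1: "CREATE TABLE projects (id TEXT PRIMARY KEY, name TEXT NOT NULL)",
    2: "ALTER TABLE projects ADD COLUMN status TEXT DEFAULT 'active'",
}

def migrate(conn: sqlite3.Connection) -> int:
    """Apply pending migrations, tracking progress in PRAGMA user_version."""
    current = conn.execute("PRAGMA user_version").fetchone()[0]
    for version in sorted(MIGRATIONS):
        if version > current:
            conn.execute(MIGRATIONS[version])
            # PRAGMA statements cannot take bound parameters, so the
            # version number is interpolated directly.
            conn.execute(f"PRAGMA user_version = {version}")
            current = version
    conn.commit()
    return current
```

Because the applied version lives in `PRAGMA user_version`, running `migrate` a second time is a no-op, which is what makes future migrations safe to ship incrementally.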

Python sidecar:
- JSON-line IPC protocol (stdin/stdout)
- Message types: IPCMessage, progress, error, ready
- Handler registry with routing and error handling
- Ping/pong handler for connectivity testing
- Service stubs: transcribe, diarize, pipeline, AI, export
- Provider stubs: local (llama-server), OpenAI, Anthropic, LiteLLM
- Hardware detection stubs
- 14 tests passing, ruff clean
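
The JSON-line protocol and handler registry can be sketched roughly as follows. Message shapes, field names, and `handle_ping` are illustrative assumptions, not the sidecar's actual API:

```python
import json
import sys

# Hypothetical handler registry mirroring the sidecar's routing layer.
HANDLERS = {}

def handler(msg_type):
    """Register a function to handle one IPC message type."""
    def register(fn):
        HANDLERS[msg_type] = fn
        return fn
    return register

@handler("ping")
def handle_ping(payload):
    return {"type": "pong", "payload": payload}

def dispatch(line: str) -> dict:
    """Route one JSON line to its handler; failures become error messages."""
    try:
        msg = json.loads(line)
    except json.JSONDecodeError as exc:
        return {"type": "error", "error": str(exc)}
    fn = HANDLERS.get(msg.get("type"))
    if fn is None:
        return {"type": "error", "error": f"unknown type: {msg.get('type')}"}
    return fn(msg.get("payload"))

def main():
    # Announce readiness, then process newline-delimited JSON from stdin,
    # writing one JSON response per line to stdout.
    sys.stdout.write(json.dumps({"type": "ready"}) + "\n")
    sys.stdout.flush()
    for line in sys.stdin:
        sys.stdout.write(json.dumps(dispatch(line)) + "\n")
        sys.stdout.flush()
```

Keeping the protocol to one JSON object per line means the Rust side can frame messages with a plain buffered line reader, with no length prefixes or binary framing.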

Also adds:
- Testing strategy document (docs/TESTING.md)
- Validation script (scripts/validate.sh)
- Updated .gitignore for Svelte, Rust, Python artifacts

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-26 15:16:06 -08:00
parent c450ef3c0c
commit 503cc6c0cf
95 changed files with 9607 additions and 0 deletions

@@ -0,0 +1,2 @@
// AI provider commands — chat, summarize via Python sidecar
// TODO: Implement when AI provider service is built

@@ -0,0 +1,2 @@
// Export commands — trigger caption/text export via Python sidecar
// TODO: Implement when export service is built

@@ -0,0 +1,6 @@
pub mod ai;
pub mod export;
pub mod project;
pub mod settings;
pub mod system;
pub mod transcribe;

@@ -0,0 +1,27 @@
use crate::db::models::Project;

#[tauri::command]
pub fn create_project(name: String) -> Result<Project, String> {
    // TODO: Use actual database connection from app state
    Ok(Project {
        id: uuid::Uuid::new_v4().to_string(),
        name,
        created_at: chrono::Utc::now().to_rfc3339(),
        updated_at: chrono::Utc::now().to_rfc3339(),
        settings: None,
        status: "active".to_string(),
    })
}

#[tauri::command]
pub fn get_project(id: String) -> Result<Option<Project>, String> {
    // TODO: Use actual database connection from app state
    let _ = id;
    Ok(None)
}

#[tauri::command]
pub fn list_projects() -> Result<Vec<Project>, String> {
    // TODO: Use actual database connection from app state
    Ok(vec![])
}

@@ -0,0 +1,2 @@
// Settings commands — app preferences, model selection, AI provider config
// TODO: Implement when settings UI is built

@@ -0,0 +1,2 @@
// System commands — hardware detection, llama-server lifecycle
// TODO: Implement hardware detection and llama-server management

@@ -0,0 +1,2 @@
// Transcription commands — start/stop/monitor transcription via Python sidecar
// TODO: Implement when sidecar IPC is connected