Phase 5: AI provider system with local and cloud support
- Implement AIProvider base interface with chat() and is_available()
- Add LocalProvider connecting to the bundled llama-server via the OpenAI SDK
- Add OpenAIProvider for direct OpenAI API access
- Add AnthropicProvider for the Anthropic Claude API
- Add LiteLLMProvider for multi-provider gateway support
- Build AIProviderService with provider routing, auto-selection, and transcript context injection
- Add ai.chat IPC handler supporting chat, list_providers, set_provider, and configure actions
- Add ai_chat, ai_list_providers, and ai_configure Tauri commands
- Build interactive AIChatPanel with message history, quick actions (Summarize, Action Items), and transcript context awareness
- Tests: 30 Python, 6 Rust, 0 Svelte errors

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@@ -3,6 +3,7 @@ pub mod db;
 pub mod sidecar;
 pub mod state;
 
+use commands::ai::{ai_chat, ai_configure, ai_list_providers};
 use commands::export::export_transcript;
 use commands::project::{create_project, get_project, list_projects};
 use commands::transcribe::{run_pipeline, transcribe_file};
@@ -19,6 +20,9 @@ pub fn run() {
             transcribe_file,
             run_pipeline,
             export_transcript,
+            ai_chat,
+            ai_list_providers,
+            ai_configure,
         ])
         .run(tauri::generate_context!())
         .expect("error while running tauri application");
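The diff only registers the new commands with Tauri's invoke handler; their bodies live in commands::ai and are not shown here. As a rough sketch (the AppState type and the sidecar.request helper below are assumptions for illustration, not code from this commit), such a command might simply package the arguments and forward them to the Python sidecar's ai.chat IPC handler, which does the provider routing described in the commit message:

use serde_json::{json, Value};
use tauri::State;

// Hypothetical sketch: `AppState` and `sidecar.request` stand in for whatever
// managed state handle and sidecar IPC helper the real commands::ai module uses.
#[tauri::command]
async fn ai_chat(
    state: State<'_, crate::state::AppState>,
    message: String,
    provider: Option<String>,
) -> Result<Value, String> {
    // Forward the chat request to the sidecar's `ai.chat` handler; the commit
    // message says that handler also supports list_providers, set_provider,
    // and configure actions.
    let request = json!({
        "action": "chat",
        "message": message,
        "provider": provider,
    });
    state
        .sidecar
        .request("ai.chat", request)
        .await
        .map_err(|e| e.to_string())
}

The AIChatPanel on the Svelte side would presumably call this command through Tauri's invoke mechanism, passing the user's message along with the transcript context the commit message describes.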