Phase 6: Llama-server manager, settings UI, packaging, and polish
- Implement LlamaManager in Rust for llama-server lifecycle: spawn with
port allocation, health check, clean shutdown on Drop, model listing
- Add llama_start/stop/status/list_models Tauri commands
- Add load_settings/save_settings commands with JSON persistence
- Build SettingsModal with tabs for Transcription, AI Provider, Local AI
settings (model size, device, language, API keys, provider selection)
- Wire settings into pipeline calls (model, device, language, skip diarization)
- Configure Tauri packaging: asset protocol for local audio files,
CSP policy, bundle metadata, Linux .deb/.AppImage and Windows .msi config
- Add keyboard shortcuts: Space (play/pause), Ctrl+O (import),
Ctrl+, (settings), Escape (close menus/modals)
- Close export dropdown on outside click
- Tests: 30 Python, 6 Rust, 0 Svelte errors
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-26 16:38:23 -08:00
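The keyboard shortcuts listed above map key events to app actions. A minimal sketch of that dispatch, assuming a plain key/ctrl check; the `Action` names and `KeyInfo` shape are illustrative, not taken from the actual code:

```typescript
// Hypothetical dispatch for the shortcuts in the commit message above;
// action names and the KeyInfo shape are illustrative, not the app's real API.
type Action = 'play_pause' | 'import' | 'open_settings' | 'close_overlay' | null;

interface KeyInfo {
  key: string;
  ctrlKey: boolean;
}

function matchShortcut(e: KeyInfo): Action {
  if (e.key === 'Escape') return 'close_overlay';                // Escape: close menus/modals
  if (e.ctrlKey && e.key.toLowerCase() === 'o') return 'import'; // Ctrl+O: import audio
  if (e.ctrlKey && e.key === ',') return 'open_settings';        // Ctrl+,: open settings
  if (e.key === ' ' && !e.ctrlKey) return 'play_pause';          // Space: play/pause
  return null;                                                   // anything else: no-op
}
```

A real handler would also need to ignore key events while a text input has focus, so typing a space does not toggle playback.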
import { writable } from 'svelte/store';
import { invoke } from '@tauri-apps/api/core';
export interface AppSettings {
  ai_provider: string;
  openai_api_key: string;
  anthropic_api_key: string;
  openai_model: string;
  anthropic_model: string;
  litellm_model: string;
  litellm_api_key: string;
  litellm_api_base: string;
  local_model_path: string;
  local_binary_path: string;
  transcription_model: string;
  transcription_device: string;
  transcription_language: string;
  skip_diarization: boolean;
  hf_token: string;
  num_speakers: number | null;
}
const defaults: AppSettings = {
  ai_provider: 'local',
  openai_api_key: '',
  anthropic_api_key: '',
  openai_model: 'gpt-4o-mini',
  anthropic_model: 'claude-sonnet-4-6',
  litellm_model: 'gpt-4o-mini',
  litellm_api_key: '',
  litellm_api_base: '',
  local_model_path: '',
  local_binary_path: 'llama-server',
  transcription_model: 'base',
  transcription_device: 'cpu',
  transcription_language: '',
  skip_diarization: false,
  hf_token: '',
  num_speakers: null,
};
export const settings = writable<AppSettings>({ ...defaults });

export async function loadSettings(): Promise<void> {
  try {
    const saved = await invoke<Record<string, unknown>>('load_settings');
    settings.update(s => ({ ...s, ...saved } as AppSettings));
  } catch {
    // Use defaults if settings can't be loaded
  }
}
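`loadSettings` spreads the saved object over the in-memory defaults, so any keys missing from the file on disk keep their default values. The merge pattern in isolation, with placeholder values that are illustrative only:

```typescript
// Illustrative only: shows the { ...defaults, ...saved } merge used by loadSettings.
const defaultValues = { ai_provider: 'local', transcription_model: 'base' };
const savedOnDisk = { ai_provider: 'openai' }; // partial: transcription_model absent
const merged = { ...defaultValues, ...savedOnDisk };
// merged.ai_provider is 'openai'; merged.transcription_model keeps its default 'base'
```

Later spreads win, so a saved key overrides its default while absent keys fall through.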
export async function saveSettings(s: AppSettings): Promise<void> {
  settings.set(s);
  await invoke('save_settings', { settings: s });

  // Configure the AI provider in the Python sidecar
  const configMap: Record<string, Record<string, string>> = {
    openai: { api_key: s.openai_api_key, model: s.openai_model },
    anthropic: { api_key: s.anthropic_api_key, model: s.anthropic_model },
    litellm: { api_key: s.litellm_api_key, api_base: s.litellm_api_base, model: s.litellm_model },
    local: { model: s.local_model_path, base_url: 'http://localhost:8080' },
  };

  const config = configMap[s.ai_provider];
  if (config) {
    try {
      await invoke('ai_configure', { provider: s.ai_provider, config });
    } catch {
      // Sidecar may not be running yet; provider will be configured on first use
    }
  }
}
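The provider-to-config map in `saveSettings` degrades quietly for an unrecognized provider: indexing a `Record` with an unknown key yields `undefined`, so the `if (config)` guard skips the sidecar call. Sketched standalone with placeholder data (the model values here are examples, not the app's defaults):

```typescript
// Illustrative: unknown providers fall through the `if (config)` guard in saveSettings.
const providerConfigs: Record<string, { model: string }> = {
  openai: { model: 'gpt-4o-mini' },
  local: { model: '/path/to/model.gguf' }, // placeholder path
};

function selectConfig(provider: string): { model: string } | undefined {
  return providerConfigs[provider]; // undefined when the provider key is absent
}
```

This keeps `saveSettings` total over arbitrary provider strings without throwing.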