Phase 6: Llama-server manager, settings UI, packaging, and polish
- Implement LlamaManager in Rust for llama-server lifecycle: spawn with
port allocation, health check, clean shutdown on Drop, model listing
- Add llama_start/stop/status/list_models Tauri commands
- Add load_settings/save_settings commands with JSON persistence
- Build SettingsModal with tabs for Transcription, AI Provider, Local AI
settings (model size, device, language, API keys, provider selection)
- Wire settings into pipeline calls (model, device, language, skip diarization)
- Configure Tauri packaging: asset protocol for local audio files,
CSP policy, bundle metadata, Linux .deb/.AppImage and Windows .msi config
- Add keyboard shortcuts: Space (play/pause), Ctrl+O (import),
Ctrl+, (settings), Escape (close menus/modals)
- Close export dropdown on outside click
- Tests: 30 Python, 6 Rust, 0 Svelte errors
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-26 16:38:23 -08:00
<script lang="ts">
import { invoke } from '@tauri-apps/api/core';
import { openUrl } from '@tauri-apps/plugin-opener';
import { settings, saveSettings, type AppSettings } from '$lib/stores/settings';
interface Props {
visible: boolean;
onClose: () => void;
}
let { visible, onClose }: Props = $props();
let localSettings = $state<AppSettings>({ ...$settings });
let activeTab = $state<'transcription' | 'speakers' | 'ai' | 'debug'>('transcription');
let modelStatus = $state<'idle' | 'downloading' | 'success' | 'error'>('idle');
let modelError = $state('');
let revealedFields = $state<Set<string>>(new Set());
async function testAndDownloadModel() {
if (!localSettings.hf_token) {
modelStatus = 'error';
modelError = 'Please enter a HuggingFace token first.';
return;
}
modelStatus = 'downloading';
modelError = '';
try {
const result = await invoke<{ ok: boolean; error?: string }>('download_diarize_model', {
hfToken: localSettings.hf_token,
});
if (result.ok) {
modelStatus = 'success';
} else {
modelStatus = 'error';
modelError = result.error || 'Unknown error';
}
} catch (err) {
modelStatus = 'error';
modelError = String(err);
}
}
// Sync when settings store changes
$effect(() => {
localSettings = { ...$settings };
});
async function handleSave() {
await saveSettings(localSettings);
onClose();
}
function handleCancel() {
localSettings = { ...$settings };
onClose();
}
function handleOverlayClick(e: MouseEvent) {
if ((e.target as HTMLElement).classList.contains('modal-overlay')) {
handleCancel();
}
}
</script>

{#if visible}
<!-- svelte-ignore a11y_no_static_element_interactions -->
<div class="modal-overlay" onclick={handleOverlayClick} onkeydown={(e) => { if (e.key === 'Escape') handleCancel(); }}>
<div class="modal">
<div class="modal-header">
<h2>Settings</h2>
<button class="close-btn" onclick={handleCancel}>x</button>
</div>
<div class="tabs">
<button class="tab" class:active={activeTab === 'transcription'} onclick={() => activeTab = 'transcription'}>
Transcription
</button>
<button class="tab" class:active={activeTab === 'speakers'} onclick={() => activeTab = 'speakers'}>
Speakers
</button>
<button class="tab" class:active={activeTab === 'ai'} onclick={() => activeTab = 'ai'}>
AI Provider
</button>
<button class="tab" class:active={activeTab === 'debug'} onclick={() => activeTab = 'debug'}>
Debug
</button>
</div>
<div class="modal-body">
{#if activeTab === 'transcription'}
<div class="field">
<label for="stt-model">Whisper Model</label>
<select id="stt-model" bind:value={localSettings.transcription_model}>
<option value="tiny">Tiny (fastest, least accurate)</option>
<option value="base">Base (fast, good accuracy)</option>
<option value="small">Small (balanced)</option>
<option value="medium">Medium (slower, better accuracy)</option>
<option value="large-v3">Large v3 (slowest, best accuracy)</option>
</select>
</div>
<div class="field">
<label for="stt-device">Device</label>
<select id="stt-device" bind:value={localSettings.transcription_device}>
<option value="cpu">CPU</option>
<option value="cuda">CUDA (NVIDIA GPU)</option>
</select>
</div>
<div class="field">
<label for="stt-lang">Language (blank = auto-detect)</label>
<input id="stt-lang" type="text" bind:value={localSettings.transcription_language} placeholder="e.g., en, es, fr" />
</div>
{:else if activeTab === 'speakers'}
<div class="field">
<label for="hf-token">HuggingFace Token</label>
<div class="input-reveal">
<input id="hf-token" type={revealedFields.has('hf-token') ? 'text' : 'password'} bind:value={localSettings.hf_token} placeholder="hf_..." />
<button type="button" class="reveal-btn" onclick={() => { const s = new Set(revealedFields); s.has('hf-token') ? s.delete('hf-token') : s.add('hf-token'); revealedFields = s; }}>{revealedFields.has('hf-token') ? 'Hide' : 'Show'}</button>
</div>
</div>
<div class="info-box">
<p class="info-title">Setup (one-time)</p>
<p>Speaker detection uses <strong>pyannote.audio</strong> models hosted on HuggingFace. You must accept the license for each model:</p>
<ol>
<li>Create a free account at <!-- svelte-ignore a11y_no_static_element_interactions --> <a class="ext-link" onclick={() => openUrl('https://huggingface.co/join')}>huggingface.co</a></li>
<li>Accept the license on <strong>all three</strong> of these pages:
<ul>
<!-- svelte-ignore a11y_no_static_element_interactions -->
<li><a class="ext-link" onclick={() => openUrl('https://huggingface.co/pyannote/speaker-diarization-3.1')}>pyannote/speaker-diarization-3.1</a></li>
<!-- svelte-ignore a11y_no_static_element_interactions -->
<li><a class="ext-link" onclick={() => openUrl('https://huggingface.co/pyannote/segmentation-3.0')}>pyannote/segmentation-3.0</a></li>
<!-- svelte-ignore a11y_no_static_element_interactions -->
<li><a class="ext-link" onclick={() => openUrl('https://huggingface.co/pyannote/speaker-diarization-community-1')}>pyannote/speaker-diarization-community-1</a></li>
</ul>
</li>
<!-- svelte-ignore a11y_no_static_element_interactions -->
<li>Create a token at <a class="ext-link" onclick={() => openUrl('https://huggingface.co/settings/tokens')}>huggingface.co/settings/tokens</a> (read access)</li>
<li>Paste the token above and click <strong>Test & Download</strong></li>
</ol>
</div>
<button
class="btn-download"
onclick={testAndDownloadModel}
disabled={modelStatus === 'downloading'}
>
{#if modelStatus === 'downloading'}
Downloading model...
{:else}
Test & Download Model
{/if}
</button>
{#if modelStatus === 'success'}
<p class="status-success">Model downloaded successfully. Speaker detection is ready.</p>
{/if}
{#if modelStatus === 'error'}
<p class="status-error">{modelError}</p>
{/if}
<div class="field" style="margin-top: 1rem;">
<label for="num-speakers">Number of speakers</label>
<select
id="num-speakers"
value={localSettings.num_speakers === null || localSettings.num_speakers === 0 ? '0' : String(localSettings.num_speakers)}
onchange={(e) => {
const v = parseInt((e.target as HTMLSelectElement).value, 10);
localSettings.num_speakers = v === 0 ? null : v;
}}
>
<option value="0">Auto-detect</option>
{#each Array.from({ length: 20 }, (_, i) => i + 1) as n}
<option value={String(n)}>{n}</option>
{/each}
</select>
<p class="hint">Hint the expected number of speakers to speed up diarization clustering.</p>
</div>
<div class="field checkbox" style="margin-top: 1rem;">
<label>
<input type="checkbox" bind:checked={localSettings.skip_diarization} />
Skip speaker detection (faster, no speaker labels)
</label>
</div>
{:else if activeTab === 'ai'}
<div class="field">
<label for="ai-provider">AI Provider</label>
<select id="ai-provider" bind:value={localSettings.ai_provider}>
<option value="local">Ollama</option>
<option value="openai">OpenAI</option>
<option value="anthropic">Anthropic</option>
<option value="litellm">OpenAI Compatible</option>
</select>
</div>
{#if localSettings.ai_provider === 'local'}
<div class="field">
<label for="ollama-url">Ollama URL</label>
<input id="ollama-url" type="text" bind:value={localSettings.ollama_url} placeholder="http://localhost:11434" />
</div>
<div class="field">
<label for="ollama-model">Model</label>
<input id="ollama-model" type="text" bind:value={localSettings.ollama_model} placeholder="llama3.2" />
</div>
<p class="hint">
Install Ollama from ollama.com, then pull a model with <code>ollama pull llama3.2</code>.
The app connects via Ollama's OpenAI-compatible API.
</p>
{:else if localSettings.ai_provider === 'openai'}
<div class="field">
<label for="openai-key">OpenAI API Key</label>
<div class="input-reveal">
<input id="openai-key" type={revealedFields.has('openai-key') ? 'text' : 'password'} bind:value={localSettings.openai_api_key} placeholder="sk-..." />
<button type="button" class="reveal-btn" onclick={() => { const s = new Set(revealedFields); s.has('openai-key') ? s.delete('openai-key') : s.add('openai-key'); revealedFields = s; }}>{revealedFields.has('openai-key') ? 'Hide' : 'Show'}</button>
</div>
</div>
<div class="field">
<label for="openai-model">Model</label>
<input id="openai-model" type="text" bind:value={localSettings.openai_model} />
</div>
{:else if localSettings.ai_provider === 'anthropic'}
<div class="field">
<label for="anthropic-key">Anthropic API Key</label>
<div class="input-reveal">
<input id="anthropic-key" type={revealedFields.has('anthropic-key') ? 'text' : 'password'} bind:value={localSettings.anthropic_api_key} placeholder="sk-ant-..." />
<button type="button" class="reveal-btn" onclick={() => { const s = new Set(revealedFields); s.has('anthropic-key') ? s.delete('anthropic-key') : s.add('anthropic-key'); revealedFields = s; }}>{revealedFields.has('anthropic-key') ? 'Hide' : 'Show'}</button>
</div>
</div>
<div class="field">
<label for="anthropic-model">Model</label>
<input id="anthropic-model" type="text" bind:value={localSettings.anthropic_model} />
</div>
{:else if localSettings.ai_provider === 'litellm'}
<div class="field">
<label for="litellm-base">API Base URL</label>
<input id="litellm-base" type="text" bind:value={localSettings.litellm_api_base} placeholder="https://your-litellm-proxy.example.com" />
</div>
<div class="field">
<label for="litellm-key">API Key</label>
<div class="input-reveal">
<input id="litellm-key" type={revealedFields.has('litellm-key') ? 'text' : 'password'} bind:value={localSettings.litellm_api_key} placeholder="sk-..." />
<button type="button" class="reveal-btn" onclick={() => { const s = new Set(revealedFields); s.has('litellm-key') ? s.delete('litellm-key') : s.add('litellm-key'); revealedFields = s; }}>{revealedFields.has('litellm-key') ? 'Hide' : 'Show'}</button>
</div>
</div>
<div class="field">
<label for="litellm-model">Model</label>
<input id="litellm-model" type="text" bind:value={localSettings.litellm_model} placeholder="provider/model-name" />
</div>
{/if}
{:else if activeTab === 'debug'}
<div class="field checkbox">
<label>
<input
type="checkbox"
checked={localSettings.devtools_enabled}
onchange={async (e) => {
localSettings.devtools_enabled = (e.target as HTMLInputElement).checked;
await invoke('toggle_devtools', { open: localSettings.devtools_enabled });
}}
/>
Enable Developer Tools
</label>
<p class="hint">Opens the browser inspector for debugging. Changes take effect immediately.</p>
</div>
{/if}
</div>
<div class="modal-footer">
<button class="btn-secondary" onclick={handleCancel}>Cancel</button>
<button class="btn-primary" onclick={handleSave}>Save</button>
</div>
</div>
</div>
{/if}
<style>
.modal-overlay {
position: fixed;
inset: 0;
background: rgba(0, 0, 0, 0.6);
display: flex;
align-items: center;
justify-content: center;
z-index: 100;
}
.modal {
background: #16213e;
border-radius: 12px;
width: 500px;
max-width: 90vw;
max-height: 80vh;
display: flex;
flex-direction: column;
color: #e0e0e0;
box-shadow: 0 8px 32px rgba(0, 0, 0, 0.5);
}
.modal-header {
display: flex;
align-items: center;
justify-content: space-between;
padding: 1rem 1.25rem;
border-bottom: 1px solid #2a3a5e;
}
.modal-header h2 {
margin: 0;
font-size: 1.1rem;
}
.close-btn {
background: none;
border: none;
color: #999;
font-size: 1.2rem;
cursor: pointer;
padding: 0.25rem;
}
.close-btn:hover {
color: #e0e0e0;
}
.tabs {
display: flex;
border-bottom: 1px solid #2a3a5e;
padding: 0 1.25rem;
}
.tab {
background: none;
border: none;
color: #888;
padding: 0.6rem 1rem;
cursor: pointer;
font-size: 0.85rem;
border-bottom: 2px solid transparent;
}
.tab:hover {
color: #e0e0e0;
}
.tab.active {
color: #e94560;
border-bottom-color: #e94560;
}
.modal-body {
padding: 1.25rem;
overflow-y: auto;
flex: 1;
}
.field {
margin-bottom: 1rem;
}
.field label {
display: block;
font-size: 0.8rem;
color: #aaa;
margin-bottom: 0.3rem;
}
.input-reveal {
display: flex;
gap: 0;
}
.input-reveal input {
flex: 1;
border-top-right-radius: 0;
border-bottom-right-radius: 0;
}
.reveal-btn {
background: #0f3460;
border: 1px solid #4a5568;
border-left: none;
color: #aaa;
padding: 0.5rem 0.6rem;
border-radius: 0 4px 4px 0;
cursor: pointer;
font-size: 0.75rem;
white-space: nowrap;
}
.reveal-btn:hover {
color: #e0e0e0;
background: #1a4a7a;
}
.field input,
.field select {
width: 100%;
background: #1a1a2e;
color: #e0e0e0;
color-scheme: dark;
border: 1px solid #4a5568;
border-radius: 4px;
padding: 0.5rem;
font-size: 0.85rem;
font-family: inherit;
box-sizing: border-box;
}
.field input:focus,
.field select:focus {
outline: none;
border-color: #e94560;
}
.field.checkbox label {
display: flex;
align-items: center;
gap: 0.5rem;
cursor: pointer;
color: #e0e0e0;
}
.field.checkbox input {
width: auto;
}
.hint {
font-size: 0.75rem;
color: #666;
line-height: 1.4;
}
.info-box {
background: rgba(233, 69, 96, 0.05);
border: 1px solid #2a3a5e;
border-radius: 6px;
padding: 0.75rem 1rem;
margin-bottom: 1rem;
font-size: 0.8rem;
color: #b0b0b0;
line-height: 1.5;
}
.info-box p {
margin: 0 0 0.5rem;
}
.info-box p:last-child {
margin-bottom: 0;
}
.info-box .info-title {
color: #e0e0e0;
font-weight: 600;
font-size: 0.8rem;
}
.info-box ol {
margin: 0.25rem 0 0.5rem;
padding-left: 1.25rem;
}
.info-box li {
margin-bottom: 0.25rem;
}
.info-box strong {
color: #e0e0e0;
}
.ext-link {
color: #e94560;
cursor: pointer;
text-decoration: underline;
}
.ext-link:hover {
color: #ff6b81;
}
.info-box ul {
margin: 0.25rem 0;
padding-left: 1.25rem;
}
.btn-download {
background: #0f3460;
border: 1px solid #4a5568;
color: #e0e0e0;
padding: 0.5rem 1rem;
border-radius: 6px;
cursor: pointer;
font-size: 0.85rem;
width: 100%;
margin-bottom: 0.5rem;
}
.btn-download:hover:not(:disabled) {
background: #1a4a7a;
border-color: #e94560;
}
.btn-download:disabled {
opacity: 0.6;
cursor: not-allowed;
}
.status-success {
color: #4ecdc4;
font-size: 0.8rem;
margin: 0.25rem 0;
}
.status-error {
color: #e94560;
font-size: 0.8rem;
margin: 0.25rem 0;
word-break: break-word;
}
.modal-footer {
display: flex;
justify-content: flex-end;
gap: 0.5rem;
padding: 1rem 1.25rem;
border-top: 1px solid #2a3a5e;
}
.btn-secondary {
background: none;
border: 1px solid #4a5568;
color: #e0e0e0;
padding: 0.5rem 1rem;
border-radius: 6px;
cursor: pointer;
font-size: 0.85rem;
}
.btn-secondary:hover {
background: rgba(255,255,255,0.05);
}
.btn-primary {
background: #e94560;
border: none;
color: white;
padding: 0.5rem 1rem;
border-radius: 6px;
cursor: pointer;
font-size: 0.85rem;
font-weight: 500;
}
.btn-primary:hover {
background: #d63851;
}
</style>