Phase 6: Llama-server manager, settings UI, packaging, and polish
- Implement LlamaManager in Rust for llama-server lifecycle: spawn with
port allocation, health check, clean shutdown on Drop, model listing
- Add llama_start/stop/status/list_models Tauri commands
- Add load_settings/save_settings commands with JSON persistence
- Build SettingsModal with tabs for Transcription, AI Provider, Local AI
settings (model size, device, language, API keys, provider selection)
- Wire settings into pipeline calls (model, device, language, skip diarization)
- Configure Tauri packaging: asset protocol for local audio files,
CSP policy, bundle metadata, Linux .deb/.AppImage and Windows .msi config
- Add keyboard shortcuts: Space (play/pause), Ctrl+O (import),
Ctrl+, (settings), Escape (close menus/modals)
- Close export dropdown on outside click
- Tests: 30 Python, 6 Rust, 0 Svelte errors
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-26 16:38:23 -08:00
<script lang="ts">
import { invoke } from '@tauri-apps/api/core';
import { openUrl } from '@tauri-apps/plugin-opener';
import { settings, saveSettings, type AppSettings } from '$lib/stores/settings';
interface Props {
visible: boolean;
onClose: () => void;
}
let { visible, onClose }: Props = $props();
let localSettings = $state<AppSettings>({ ...$settings });
let activeTab = $state<'transcription' | 'speakers' | 'ai' | 'local'>('transcription');
let modelStatus = $state<'idle' | 'downloading' | 'success' | 'error'>('idle');
let modelError = $state('');
async function testAndDownloadModel() {
if (!localSettings.hf_token) {
modelStatus = 'error';
modelError = 'Please enter a HuggingFace token first.';
return;
}
modelStatus = 'downloading';
modelError = '';
try {
const result = await invoke<{ ok: boolean; error?: string }>('download_diarize_model', {
hfToken: localSettings.hf_token,
});
if (result.ok) {
modelStatus = 'success';
} else {
modelStatus = 'error';
modelError = result.error || 'Unknown error';
}
} catch (err) {
modelStatus = 'error';
modelError = String(err);
}
}
// Sync when settings store changes
$effect(() => {
localSettings = { ...$settings };
});
async function handleSave() {
await saveSettings(localSettings);
onClose();
}
function handleCancel() {
localSettings = { ...$settings };
onClose();
}
function handleOverlayClick(e: MouseEvent) {
if ((e.target as HTMLElement).classList.contains('modal-overlay')) {
handleCancel();
}
}
</script>
{#if visible}
  <!-- svelte-ignore a11y_no_static_element_interactions -->
  <div class="modal-overlay" onclick={handleOverlayClick} onkeydown={(e) => { if (e.key === 'Escape') handleCancel(); }}>
    <div class="modal">
      <div class="modal-header">
        <h2>Settings</h2>
        <button class="close-btn" onclick={handleCancel}>x</button>
      </div>
      <div class="tabs">
        <button class="tab" class:active={activeTab === 'transcription'} onclick={() => activeTab = 'transcription'}>
          Transcription
        </button>
        <button class="tab" class:active={activeTab === 'speakers'} onclick={() => activeTab = 'speakers'}>
          Speakers
        </button>
        <button class="tab" class:active={activeTab === 'ai'} onclick={() => activeTab = 'ai'}>
          AI Provider
        </button>
        <button class="tab" class:active={activeTab === 'local'} onclick={() => activeTab = 'local'}>
          Local AI
        </button>
      </div>
      <div class="modal-body">
        {#if activeTab === 'transcription'}
          <div class="field">
            <label for="stt-model">Whisper Model</label>
            <select id="stt-model" bind:value={localSettings.transcription_model}>
              <option value="tiny">Tiny (fastest, least accurate)</option>
              <option value="base">Base (fast, good accuracy)</option>
              <option value="small">Small (balanced)</option>
              <option value="medium">Medium (slower, better accuracy)</option>
              <option value="large-v3">Large v3 (slowest, best accuracy)</option>
            </select>
          </div>
          <div class="field">
            <label for="stt-device">Device</label>
            <select id="stt-device" bind:value={localSettings.transcription_device}>
              <option value="cpu">CPU</option>
              <option value="cuda">CUDA (NVIDIA GPU)</option>
            </select>
          </div>
          <div class="field">
            <label for="stt-lang">Language (blank = auto-detect)</label>
            <input id="stt-lang" type="text" bind:value={localSettings.transcription_language} placeholder="e.g., en, es, fr" />
          </div>
        {:else if activeTab === 'speakers'}
          <div class="field">
            <label for="hf-token">HuggingFace Token</label>
            <input id="hf-token" type="password" bind:value={localSettings.hf_token} placeholder="hf_..." />
          </div>
          <div class="info-box">
            <p class="info-title">Setup (one-time)</p>
            <p>Speaker detection uses <strong>pyannote.audio</strong> models hosted on HuggingFace. You must accept the license for each model:</p>
            <ol>
              <li>Create a free account at <!-- svelte-ignore a11y_no_static_element_interactions --> <a class="ext-link" onclick={() => openUrl('https://huggingface.co/join')}>huggingface.co</a></li>
              <li>Accept the license on <strong>all three</strong> of these pages:
                <ul>
                  <!-- svelte-ignore a11y_no_static_element_interactions -->
                  <li><a class="ext-link" onclick={() => openUrl('https://huggingface.co/pyannote/speaker-diarization-3.1')}>pyannote/speaker-diarization-3.1</a></li>
                  <!-- svelte-ignore a11y_no_static_element_interactions -->
                  <li><a class="ext-link" onclick={() => openUrl('https://huggingface.co/pyannote/segmentation-3.0')}>pyannote/segmentation-3.0</a></li>
                  <!-- svelte-ignore a11y_no_static_element_interactions -->
                  <li><a class="ext-link" onclick={() => openUrl('https://huggingface.co/pyannote/speaker-diarization-community-1')}>pyannote/speaker-diarization-community-1</a></li>
                </ul>
              </li>
              <!-- svelte-ignore a11y_no_static_element_interactions -->
              <li>Create a token at <a class="ext-link" onclick={() => openUrl('https://huggingface.co/settings/tokens')}>huggingface.co/settings/tokens</a> (read access)</li>
              <li>Paste the token above and click <strong>Test & Download</strong></li>
            </ol>
          </div>
          <button
            class="btn-download"
            onclick={testAndDownloadModel}
            disabled={modelStatus === 'downloading'}
          >
            {#if modelStatus === 'downloading'}
              Downloading model...
            {:else}
              Test & Download Model
            {/if}
          </button>
          {#if modelStatus === 'success'}
            <p class="status-success">Model downloaded successfully. Speaker detection is ready.</p>
          {/if}
          {#if modelStatus === 'error'}
            <p class="status-error">{modelError}</p>
          {/if}
          <div class="field checkbox" style="margin-top: 1rem;">
            <label>
              <input type="checkbox" bind:checked={localSettings.skip_diarization} />
              Skip speaker detection (faster, no speaker labels)
            </label>
          </div>
        {:else if activeTab === 'ai'}
          <div class="field">
            <label for="ai-provider">AI Provider</label>
            <select id="ai-provider" bind:value={localSettings.ai_provider}>
              <option value="local">Local (llama-server)</option>
              <option value="openai">OpenAI</option>
              <option value="anthropic">Anthropic</option>
              <option value="litellm">LiteLLM</option>
            </select>
          </div>
          {#if localSettings.ai_provider === 'openai'}
            <div class="field">
              <label for="openai-key">OpenAI API Key</label>
              <input id="openai-key" type="password" bind:value={localSettings.openai_api_key} placeholder="sk-..." />
            </div>
            <div class="field">
              <label for="openai-model">Model</label>
              <input id="openai-model" type="text" bind:value={localSettings.openai_model} />
            </div>
          {:else if localSettings.ai_provider === 'anthropic'}
            <div class="field">
              <label for="anthropic-key">Anthropic API Key</label>
              <input id="anthropic-key" type="password" bind:value={localSettings.anthropic_api_key} placeholder="sk-ant-..." />
            </div>
            <div class="field">
              <label for="anthropic-model">Model</label>
              <input id="anthropic-model" type="text" bind:value={localSettings.anthropic_model} />
            </div>
          {:else if localSettings.ai_provider === 'litellm'}
            <div class="field">
              <label for="litellm-model">Model</label>
              <input id="litellm-model" type="text" bind:value={localSettings.litellm_model} placeholder="provider/model-name" />
            </div>
          {/if}
        {:else}
          <div class="field">
            <label for="llama-binary">llama-server Binary Path</label>
            <input id="llama-binary" type="text" bind:value={localSettings.local_binary_path} placeholder="llama-server" />
          </div>
          <div class="field">
            <label for="llama-model">GGUF Model Path</label>
            <input id="llama-model" type="text" bind:value={localSettings.local_model_path} placeholder="~/.voicetonotes/models/model.gguf" />
          </div>
          <p class="hint">
            Place GGUF model files in ~/.voicetonotes/models/ for auto-detection.
            The local AI server uses the OpenAI-compatible API from llama.cpp.
          </p>
        {/if}
      </div>
      <div class="modal-footer">
        <button class="btn-secondary" onclick={handleCancel}>Cancel</button>
        <button class="btn-primary" onclick={handleSave}>Save</button>
      </div>
    </div>
  </div>
{/if}
<style>
.modal-overlay {
position: fixed;
inset: 0;
background: rgba(0, 0, 0, 0.6);
display: flex;
align-items: center;
justify-content: center;
z-index: 100;
}
.modal {
background: #16213e;
border-radius: 12px;
width: 500px;
max-width: 90vw;
max-height: 80vh;
display: flex;
flex-direction: column;
color: #e0e0e0;
box-shadow: 0 8px 32px rgba(0, 0, 0, 0.5);
}
.modal-header {
display: flex;
align-items: center;
justify-content: space-between;
padding: 1rem 1.25rem;
border-bottom: 1px solid #2a3a5e;
}
.modal-header h2 {
margin: 0;
font-size: 1.1rem;
}
.close-btn {
background: none;
border: none;
color: #999;
font-size: 1.2rem;
cursor: pointer;
padding: 0.25rem;
}
.close-btn:hover {
color: #e0e0e0;
}
.tabs {
display: flex;
border-bottom: 1px solid #2a3a5e;
padding: 0 1.25rem;
}
.tab {
background: none;
border: none;
color: #888;
padding: 0.6rem 1rem;
cursor: pointer;
font-size: 0.85rem;
border-bottom: 2px solid transparent;
}
.tab:hover {
color: #e0e0e0;
}
.tab.active {
color: #e94560;
border-bottom-color: #e94560;
}
.modal-body {
padding: 1.25rem;
overflow-y: auto;
flex: 1;
}
.field {
margin-bottom: 1rem;
}
.field label {
display: block;
font-size: 0.8rem;
color: #aaa;
margin-bottom: 0.3rem;
}
.field input,
.field select {
width: 100%;
background: #1a1a2e;
color: #e0e0e0;
border: 1px solid #4a5568;
border-radius: 4px;
padding: 0.5rem;
font-size: 0.85rem;
font-family: inherit;
box-sizing: border-box;
}
.field input:focus,
.field select:focus {
outline: none;
border-color: #e94560;
}
.field.checkbox label {
display: flex;
align-items: center;
gap: 0.5rem;
cursor: pointer;
color: #e0e0e0;
}
.field.checkbox input {
width: auto;
}
.hint {
font-size: 0.75rem;
color: #666;
line-height: 1.4;
}
.info-box {
background: rgba(233, 69, 96, 0.05);
border: 1px solid #2a3a5e;
border-radius: 6px;
padding: 0.75rem 1rem;
margin-bottom: 1rem;
font-size: 0.8rem;
color: #b0b0b0;
line-height: 1.5;
}
.info-box p {
margin: 0 0 0.5rem;
}
.info-box p:last-child {
margin-bottom: 0;
}
.info-box .info-title {
color: #e0e0e0;
font-weight: 600;
font-size: 0.8rem;
}
.info-box ol {
margin: 0.25rem 0 0.5rem;
padding-left: 1.25rem;
}
.info-box li {
margin-bottom: 0.25rem;
}
.info-box strong {
color: #e0e0e0;
}
.ext-link {
color: #e94560;
cursor: pointer;
text-decoration: underline;
}
.ext-link:hover {
color: #ff6b81;
}
.info-box ul {
margin: 0.25rem 0;
padding-left: 1.25rem;
}
.btn-download {
background: #0f3460;
border: 1px solid #4a5568;
color: #e0e0e0;
padding: 0.5rem 1rem;
border-radius: 6px;
cursor: pointer;
font-size: 0.85rem;
width: 100%;
margin-bottom: 0.5rem;
}
.btn-download:hover:not(:disabled) {
background: #1a4a7a;
border-color: #e94560;
}
.btn-download:disabled {
opacity: 0.6;
cursor: not-allowed;
}
.status-success {
color: #4ecdc4;
font-size: 0.8rem;
margin: 0.25rem 0;
}
.status-error {
color: #e94560;
font-size: 0.8rem;
margin: 0.25rem 0;
word-break: break-word;
}
.modal-footer {
display: flex;
justify-content: flex-end;
gap: 0.5rem;
padding: 1rem 1.25rem;
border-top: 1px solid #2a3a5e;
}
.btn-secondary {
background: none;
border: 1px solid #4a5568;
color: #e0e0e0;
padding: 0.5rem 1rem;
border-radius: 6px;
cursor: pointer;
font-size: 0.85rem;
}
.btn-secondary:hover {
background: rgba(255,255,255,0.05);
}
.btn-primary {
background: #e94560;
border: none;
color: white;
padding: 0.5rem 1rem;
border-radius: 6px;
cursor: pointer;
font-size: 0.85rem;
font-weight: 500;
}
.btn-primary:hover {
background: #d63851;
}
</style>