Compare commits

...

12 Commits

Author SHA1 Message Date
b952b8e8de Add per-project full permissions toggle for --dangerously-skip-permissions
All checks were successful
Build App / compute-version (push) Successful in 4s
Build App / build-macos (push) Successful in 2m19s
Build App / build-windows (push) Successful in 2m35s
Build App / build-linux (push) Successful in 4m43s
Build App / create-tag (push) Successful in 4s
Build App / sync-to-github (push) Successful in 11s
New projects default to standard permission mode (Claude asks before acting).
Existing projects default to full permissions ON, preserving current behavior.
UI toggle uses red/caution styling to highlight the security implications.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-13 08:58:13 -07:00
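The toggle described above reduces to conditionally appending one flag to the launch command. A minimal std-only sketch (the function name here is illustrative, not the app's actual API):

```rust
/// Build the terminal launch command for a project.
/// `full_permissions` mirrors the per-project toggle: when off,
/// `claude` starts in standard permission mode and prompts before acting.
fn build_claude_cmd(full_permissions: bool) -> Vec<String> {
    let mut cmd = vec!["claude".to_string()];
    if full_permissions {
        cmd.push("--dangerously-skip-permissions".to_string());
    }
    cmd
}

fn main() {
    // New projects default to standard mode: no flag.
    assert_eq!(build_claude_cmd(false), vec!["claude"]);
    // Existing projects keep the old behavior: flag appended.
    assert_eq!(
        build_claude_cmd(true),
        vec!["claude", "--dangerously-skip-permissions"]
    );
}
```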
d7d7a83aec Rename LiteLLM backend to OpenAI Compatible
All checks were successful
Build App / compute-version (push) Successful in 8s
Build App / build-macos (push) Successful in 2m25s
Build App / build-windows (push) Successful in 4m0s
Build App / build-linux (push) Successful in 4m47s
Build App / create-tag (push) Successful in 3s
Build App / sync-to-github (push) Successful in 12s
Reflects that this backend works with any OpenAI API-compatible endpoint
(LiteLLM, OpenRouter, vLLM, text-generation-inference, LocalAI, etc.),
not just LiteLLM. Includes serde aliases for backward compatibility with
existing projects.json files.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-13 06:16:05 -07:00
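The serde aliases mentioned above can be illustrated with a std-only sketch of the tag mapping. The real code uses `#[serde(alias = "...")]` attributes, and the exact tag strings below are assumptions based on the commit messages:

```rust
#[derive(Debug, PartialEq)]
enum Backend {
    Anthropic,
    Bedrock,
    Ollama,
    OpenAiCompatible,
}

/// Std-only sketch of the tag parsing. The actual code uses
/// `#[serde(alias = ...)]`, but the effect is the same: legacy
/// `lite_llm` tags in existing projects.json files still map to
/// the renamed variant.
fn parse_backend(tag: &str) -> Option<Backend> {
    match tag {
        "anthropic" => Some(Backend::Anthropic),
        "bedrock" => Some(Backend::Bedrock),
        "ollama" => Some(Backend::Ollama),
        // New canonical tag plus legacy aliases for backward compatibility.
        "open_ai_compatible" | "lite_llm" => Some(Backend::OpenAiCompatible),
        _ => None,
    }
}

fn main() {
    assert_eq!(parse_backend("lite_llm"), Some(Backend::OpenAiCompatible));
    assert_eq!(parse_backend("open_ai_compatible"), Some(Backend::OpenAiCompatible));
    assert_eq!(parse_backend("unknown"), None);
}
```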
879322bc9a Add copy/paste keyboard shortcut docs to How-To guide
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-12 13:31:41 -07:00
ecaa42fa77 Remove unused useShallow import to fix tsc build
All checks were successful
Build App / compute-version (push) Successful in 3s
Build App / build-macos (push) Successful in 2m27s
Build App / build-windows (push) Successful in 3m33s
Build App / build-linux (push) Successful in 4m49s
Build App / create-tag (push) Successful in 2s
Build App / sync-to-github (push) Successful in 10s
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-12 13:20:08 -07:00
280358166a Show copy hint in status bar when terminal text is selected
Some checks failed
Build App / compute-version (push) Successful in 3s
Build App / build-macos (push) Failing after 7s
Build App / build-windows (push) Failing after 19s
Build App / build-linux (push) Failing after 1m57s
Build App / create-tag (push) Has been skipped
Build App / sync-to-github (push) Has been skipped
When the user highlights text in the terminal, a "Ctrl+Shift+C to copy"
hint appears in the status bar next to the project/terminal counts.
The hint disappears when the selection is cleared.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-12 13:14:08 -07:00
4732feb33e Add Ctrl+Shift+C keyboard shortcut for copying terminal text
All checks were successful
Build App / compute-version (push) Successful in 5s
Build App / build-macos (push) Successful in 2m22s
Build App / build-windows (push) Successful in 3m25s
Build App / build-linux (push) Successful in 5m33s
Build App / create-tag (push) Successful in 3s
Build App / sync-to-github (push) Successful in 10s
Ctrl+C in the terminal sends SIGINT which cancels running Claude work.
This adds a custom key handler so Ctrl+Shift+C copies selected text to
the clipboard without interrupting the container.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-12 13:05:10 -07:00
5977024953 Update Ollama docs and UI to mark model as required
All checks were successful
Build App / compute-version (push) Successful in 4s
Build App / build-macos (push) Successful in 2m22s
Build App / build-windows (push) Successful in 3m25s
Build App / build-linux (push) Successful in 4m48s
Build App / create-tag (push) Successful in 9s
Build App / sync-to-github (push) Successful in 14s
The model field must be set and the model must be pre-pulled in Ollama
before the container will work. Updated README, HOW-TO-USE, and the
ProjectCard UI label/tooltip to reflect this.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-12 12:53:24 -07:00
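The requirement above could be enforced as a small pre-start validation step. A hypothetical std-only sketch (field and function names are illustrative, not the app's real structs):

```rust
/// Hypothetical validation for an Ollama backend config: both the base
/// URL and the model ID must be set before the container can start,
/// and the model must already be pulled in Ollama.
fn validate_ollama(base_url: &str, model: &str) -> Result<(), String> {
    if base_url.is_empty() {
        return Err("Ollama base URL is required.".to_string());
    }
    if model.is_empty() {
        return Err(
            "Ollama model ID is required. Pull it first with `ollama pull <model>`."
                .to_string(),
        );
    }
    Ok(())
}

fn main() {
    assert!(validate_ollama("http://host.docker.internal:11434", "qwen3.5:27b").is_ok());
    assert!(validate_ollama("http://host.docker.internal:11434", "").is_err());
    assert!(validate_ollama("", "qwen3.5:27b").is_err());
}
```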
27007b90e3 Fetch help content from repo, add TOC and marketplace troubleshooting
All checks were successful
Build App / compute-version (push) Successful in 6s
Build App / build-macos (push) Successful in 2m21s
Build App / build-windows (push) Successful in 3m57s
Build App / build-linux (push) Successful in 5m2s
Build App / create-tag (push) Successful in 5s
Build App / sync-to-github (push) Successful in 10s
Help dialog now fetches HOW-TO-USE.md live from the gitea repo on open,
falling back to the compile-time embedded copy when offline. Content is
cached for the session. Removes the ~600-line hardcoded markdown constant
from HelpDialog.tsx in favor of a single source of truth.

Adds a Table of Contents with anchor links for quick navigation and a new
troubleshooting entry for the "Failed to install Anthropic marketplace"
error with the jq fix. Markdown renderer updated to support anchor links
and header id attributes.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-12 11:00:59 -07:00
38e65619e9 Fix tooltips clipped by overflow containers, improve Backend tooltip text
All checks were successful
Build App / compute-version (push) Successful in 4s
Build App / build-macos (push) Successful in 2m21s
Build App / build-windows (push) Successful in 3m28s
Build App / build-linux (push) Successful in 5m44s
Build App / create-tag (push) Successful in 6s
Build App / sync-to-github (push) Successful in 12s
Rewrite Tooltip to use React portal (createPortal to document.body) so
tooltips render above all UI elements regardless of ancestor overflow:hidden.
Also increased max-width from 220px to 280px for longer descriptions.

Expanded Backend tooltip to explain each option (Anthropic, Bedrock,
Ollama, LiteLLM) with practical context for new users.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-12 09:56:50 -07:00
d2c1c2108a Fix update checker to use full semver comparison and correct platform filtering
Some checks failed
Build App / compute-version (push) Successful in 5s
Build App / build-macos (push) Successful in 2m20s
Build App / build-windows (push) Successful in 3m28s
Build App / build-linux (push) Successful in 5m21s
Build App / create-tag (push) Successful in 3s
Build App / sync-to-github (push) Has been cancelled
The version comparison was only comparing the patch number, ignoring major
and minor versions. This meant 0.1.75 (patch=75) appeared "newer" than
0.2.1 (patch=1), and updates within 0.2.x were missed entirely.

Also fixed platform filtering to handle -mac suffix (previously only
filtered -win, so Linux users would see macOS releases too).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-12 09:45:43 -07:00
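The fix described above amounts to comparing full (major, minor, patch) tuples instead of the patch number alone. A minimal sketch (handles an optional `v` prefix only; platform tag suffixes are ignored here):

```rust
/// Parse "0.2.1" (optionally "v"-prefixed) into (major, minor, patch).
fn parse_semver(v: &str) -> Option<(u64, u64, u64)> {
    let v = v.trim_start_matches('v');
    let mut parts = v.splitn(3, '.');
    let major = parts.next()?.parse().ok()?;
    let minor = parts.next()?.parse().ok()?;
    let patch = parts.next()?.parse().ok()?;
    Some((major, minor, patch))
}

fn main() {
    // Tuples compare lexicographically, which matches semver ordering.
    // The old patch-only check would have said 0.1.75 > 0.2.1.
    assert!(parse_semver("0.2.1") > parse_semver("0.1.75"));
    assert!(parse_semver("v0.2.0") < parse_semver("0.2.1"));
    assert_eq!(parse_semver("v1.2.3"), Some((1, 2, 3)));
}
```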
cc163e6650 Add help dialog and tooltip indicators throughout the UI
All checks were successful
Build App / compute-version (push) Successful in 3s
Build App / build-macos (push) Successful in 2m21s
Build App / build-windows (push) Successful in 3m29s
Build App / build-linux (push) Successful in 5m43s
Build App / create-tag (push) Successful in 7s
Build App / sync-to-github (push) Successful in 13s
- Add circled ? help button in TopBar that opens a dialog with HOW-TO-USE.md content
- Create reusable Tooltip component with viewport-aware positioning
- Add 32 tooltip indicators across project config and settings panels
- Covers backend selection, Bedrock/Ollama/LiteLLM fields, Docker, AWS, MCP, and more

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-12 09:35:04 -07:00
38082059a5 Rename AuthMode to Backend, fix LiteLLM variant typo, add image update alerts, clean up Settings
All checks were successful
Build App / compute-version (push) Successful in 6s
Build App / build-macos (push) Successful in 2m21s
Build App / build-windows (push) Successful in 3m28s
Build App / build-linux (push) Successful in 5m14s
Build App / create-tag (push) Successful in 2s
Build App / sync-to-github (push) Successful in 10s
- Fix serde deserialization error: TypeScript sent "lit_llm" but Rust expected "lite_llm"
- Rename AuthMode enum to Backend across Rust and TypeScript (with serde alias for backward compat)
- Add container image update checking via registry digest comparison
- Improve Settings page: fix image address display spacing, remove per-project auth section
- Update UI labels from "Auth" to "Backend" throughout

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-12 09:26:58 -07:00
33 changed files with 1205 additions and 235 deletions

View File

@@ -72,7 +72,7 @@ docker exec stdout → tokio task → emit("terminal-output-{sessionId}") → li
- `container.rs` — Container lifecycle (create, start, stop, remove, inspect)
- `exec.rs` — PTY exec sessions with bidirectional stdin/stdout streaming
- `image.rs` — Image build/pull with progress streaming
- **`models/`** — Serde structs (`Project`, `AuthMode`, `BedrockConfig`, `OllamaConfig`, `LiteLlmConfig`, `ContainerInfo`, `AppSettings`). These define the IPC contract with the frontend.
- **`models/`** — Serde structs (`Project`, `Backend`, `BedrockConfig`, `OllamaConfig`, `OpenAiCompatibleConfig`, `ContainerInfo`, `AppSettings`). These define the IPC contract with the frontend.
- **`storage/`** — Persistence: `projects_store.rs` (JSON file with atomic writes), `secure.rs` (OS keychain via `keyring` crate), `settings_store.rs`
### Container (`container/`)
@@ -91,7 +91,7 @@ Per-project, independently configured:
- **Anthropic (OAuth)** — `claude login` in terminal, token persists in config volume
- **AWS Bedrock** — Static keys, profile, or bearer token injected as env vars
- **Ollama** — Connect to a local or remote Ollama server via `ANTHROPIC_BASE_URL` (e.g., `http://host.docker.internal:11434`)
- **LiteLLM** — Connect through a LiteLLM proxy gateway via `ANTHROPIC_BASE_URL` + `ANTHROPIC_AUTH_TOKEN` to access 100+ model providers
- **OpenAI Compatible** — Connect through any OpenAI API-compatible endpoint (LiteLLM, OpenRouter, vLLM, etc.) via `ANTHROPIC_BASE_URL` + `ANTHROPIC_AUTH_TOKEN`
## Styling

View File

@@ -4,6 +4,25 @@ Triple-C (Claude-Code-Container) is a desktop application that runs Claude Code
---
## Table of Contents
- [Prerequisites](#prerequisites)
- [First Launch](#first-launch)
- [The Interface](#the-interface)
- [Project Management](#project-management)
- [Project Configuration](#project-configuration)
- [MCP Servers (Beta)](#mcp-servers-beta)
- [AWS Bedrock Configuration](#aws-bedrock-configuration)
- [Ollama Configuration](#ollama-configuration)
- [OpenAI Compatible Configuration](#openai-compatible-configuration)
- [Settings](#settings)
- [Terminal Features](#terminal-features)
- [Scheduled Tasks (Inside the Container)](#scheduled-tasks-inside-the-container)
- [What's Inside the Container](#whats-inside-the-container)
- [Troubleshooting](#troubleshooting)
---
## Prerequisites
### Docker
@@ -34,7 +53,7 @@ You need access to Claude Code through one of:
- **Anthropic account** — Sign up at https://claude.ai and use `claude login` (OAuth) inside the terminal
- **AWS Bedrock** — An AWS account with Bedrock access and Claude models enabled
- **Ollama** — A local or remote Ollama server running an Anthropic-compatible model (best-effort support)
- **LiteLLM** — A LiteLLM proxy gateway providing access to 100+ model providers (best-effort support)
- **OpenAI Compatible** — Any OpenAI API-compatible endpoint (LiteLLM, OpenRouter, vLLM, text-generation-inference, LocalAI, etc.) (best-effort support)
---
@@ -73,7 +92,7 @@ Select your project in the sidebar and click **Start**. A progress modal appears
Click the **Terminal** button to open an interactive terminal session. A new tab appears in the top bar and an xterm.js terminal loads in the main area.
Claude Code launches automatically with `--dangerously-skip-permissions` inside the sandboxed container.
Claude Code launches automatically. By default, it runs in standard permission mode and will ask for your approval before executing commands or editing files. To auto-approve all actions within the sandbox, turn on **Full Permissions** in the project configuration.
### 5. Authenticate
@@ -86,22 +105,23 @@ Claude Code launches automatically with `--dangerously-skip-permissions` inside
**AWS Bedrock:**
1. Stop the container first (settings can only be changed while stopped).
2. In the project card, switch the auth mode to **Bedrock**.
2. In the project card, switch the backend to **Bedrock**.
3. Expand the **Config** panel and fill in your AWS credentials (see [AWS Bedrock Configuration](#aws-bedrock-configuration) below).
4. Start the container again.
**Ollama:**
1. Stop the container first (settings can only be changed while stopped).
2. In the project card, switch the auth mode to **Ollama**.
3. Expand the **Config** panel and set the base URL of your Ollama server (defaults to `http://host.docker.internal:11434` for a local instance). Optionally set a model ID.
4. Start the container again.
2. In the project card, switch the backend to **Ollama**.
3. Expand the **Config** panel and set the base URL of your Ollama server (defaults to `http://host.docker.internal:11434` for a local instance). Set the **Model ID** to the model you want to use (required).
4. Make sure the model has been pulled in Ollama (e.g., `ollama pull qwen3.5:27b`) or used via Ollama cloud before starting.
5. Start the container again.
**LiteLLM:**
**OpenAI Compatible:**
1. Stop the container first (settings can only be changed while stopped).
2. In the project card, switch the auth mode to **LiteLLM**.
3. Expand the **Config** panel and set the base URL of your LiteLLM proxy (defaults to `http://host.docker.internal:4000`). Optionally set an API key and model ID.
2. In the project card, switch the backend to **OpenAI Compatible**.
3. Expand the **Config** panel and set the base URL of your OpenAI-compatible endpoint (defaults to `http://host.docker.internal:4000` as an example). Optionally set an API key and model ID.
4. Start the container again.
---
@@ -216,6 +236,18 @@ Available skills include `/mission`, `/flight`, `/leg`, `/agentic-workflow`, `/f
> This setting can only be changed when the container is stopped. Toggling it triggers a container recreation on the next start.
### Full Permissions
Toggle **Full Permissions** to allow Claude Code to run with `--dangerously-skip-permissions` inside the container. This is **off by default**.
When **enabled**, Claude auto-approves all tool calls (file edits, shell commands, etc.) without prompting you. This is the fastest workflow since you won't be interrupted for approvals, while the Docker container still provides isolation.
When **disabled** (default), Claude prompts you for approval before executing each action, giving you fine-grained control over what it does.
> **CAUTION:** Enabling full permissions means Claude can execute any command inside the container without asking. While the container sandbox limits the blast radius, make sure you understand the implications — especially if the container has Docker socket access or network connectivity.
> This setting can only be changed when the container is stopped. It takes effect the next time you open a terminal session.
### Environment Variables
Click **Edit** to open the environment variables modal. Add key-value pairs that will be injected into the container. Per-project variables override global variables with the same key.
@@ -361,7 +393,7 @@ MCP server configuration is tracked via SHA-256 fingerprints stored as Docker la
## AWS Bedrock Configuration
To use Claude via AWS Bedrock instead of Anthropic's API, switch the auth mode to **Bedrock** on the project card.
To use Claude via AWS Bedrock instead of Anthropic's API, switch the backend to **Bedrock** on the project card.
### Authentication Methods
@@ -390,12 +422,12 @@ Per-project settings always override these global defaults.
## Ollama Configuration
To use Claude Code with a local or remote Ollama server, switch the auth mode to **Ollama** on the project card.
To use Claude Code with a local or remote Ollama server, switch the backend to **Ollama** on the project card.
### Settings
- **Base URL** — The URL of your Ollama server. Defaults to `http://host.docker.internal:11434`, which reaches a locally running Ollama instance from inside the container. For a remote server, use its IP or hostname (e.g., `http://192.168.1.100:11434`).
- **Model ID** — Optional. Override the model to use (e.g., `qwen3.5:27b`).
- **Model ID** — **Required.** The model to use (e.g., `qwen3.5:27b`). The model must be pulled in Ollama before use — run `ollama pull <model>` or use it via Ollama cloud so it is available when the container starts.
### How It Works
@@ -403,23 +435,25 @@ Triple-C sets `ANTHROPIC_BASE_URL` to point Claude Code at your Ollama server in
> **Note:** Ollama support is best-effort. Claude Code is designed for Anthropic models, so some features (tool use, extended thinking, prompt caching, etc.) may not work as expected with non-Anthropic models.
> **Important:** The model must already be available in Ollama before starting the container. If using a local Ollama instance, pull the model first with `ollama pull <model-name>`. If using Ollama's cloud service, ensure the model has been used at least once so it is cached.
---
## LiteLLM Configuration
## OpenAI Compatible Configuration
To use Claude Code through a [LiteLLM](https://docs.litellm.ai/) proxy gateway, switch the auth mode to **LiteLLM** on the project card. LiteLLM supports 100+ model providers (OpenAI, Gemini, Anthropic, and more) through a single proxy.
To use Claude Code through any OpenAI API-compatible endpoint, switch the backend to **OpenAI Compatible** on the project card. This works with any server that exposes an OpenAI-compatible API, including LiteLLM, OpenRouter, vLLM, text-generation-inference, LocalAI, and others.
### Settings
- **Base URL** — The URL of your LiteLLM proxy. Defaults to `http://host.docker.internal:4000` for a locally running proxy.
- **API Key** — Optional. The API key for your LiteLLM proxy, if authentication is required. Stored securely in your OS keychain.
- **Base URL** — The URL of your OpenAI-compatible endpoint. Defaults to `http://host.docker.internal:4000` as an example (adjust to match your server's address and port).
- **API Key** — Optional. The API key for your endpoint, if authentication is required. Stored securely in your OS keychain.
- **Model ID** — Optional. Override the model to use.
### How It Works
Triple-C sets `ANTHROPIC_BASE_URL` to point Claude Code at your LiteLLM proxy. If an API key is provided, it is set as `ANTHROPIC_AUTH_TOKEN`.
Triple-C sets `ANTHROPIC_BASE_URL` to point Claude Code at your OpenAI-compatible endpoint. If an API key is provided, it is set as `ANTHROPIC_AUTH_TOKEN`.
> **Note:** LiteLLM support is best-effort. Claude Code is designed for Anthropic models, so some features (tool use, extended thinking, prompt caching, etc.) may not work as expected when routing to non-Anthropic models through the proxy.
> **Note:** OpenAI Compatible support is best-effort. Claude Code is designed for Anthropic models, so some features (tool use, extended thinking, prompt caching, etc.) may not work as expected when routing to non-Anthropic models through the endpoint.
---
@@ -472,6 +506,10 @@ When Claude Code prints a long URL (e.g., during `claude login`), Triple-C detec
Shorter URLs in terminal output are also clickable directly.
### Copying and Pasting
Use **Ctrl+Shift+C** (or **Cmd+C** on macOS) to copy selected text from the terminal, and **Ctrl+Shift+V** (or **Cmd+V** on macOS) to paste. This follows standard terminal emulator conventions since Ctrl+C is reserved for sending SIGINT.
### Clipboard Support (OSC 52)
Programs inside the container can copy text to your host clipboard. When a container program uses `xclip`, `xsel`, or `pbcopy`, the text is transparently forwarded to your host clipboard via OSC 52 escape sequences. No additional configuration is required — this works out of the box.
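The forwarding described above relies on the OSC 52 escape sequence, which carries the clipboard payload as base64. A std-only sketch of how such a sequence is constructed (this is the wire format, not the app's actual implementation):

```rust
/// Minimal base64 encoder (standard alphabet, with padding), std only.
fn b64(data: &[u8]) -> String {
    const T: &[u8] = b"ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";
    let mut out = String::new();
    for chunk in data.chunks(3) {
        let b = [chunk[0], *chunk.get(1).unwrap_or(&0), *chunk.get(2).unwrap_or(&0)];
        let n = ((b[0] as u32) << 16) | ((b[1] as u32) << 8) | b[2] as u32;
        let idx = [(n >> 18) & 63, (n >> 12) & 63, (n >> 6) & 63, n & 63];
        for (i, &x) in idx.iter().enumerate() {
            if i <= chunk.len() {
                out.push(T[x as usize] as char);
            } else {
                out.push('='); // pad short trailing chunks
            }
        }
    }
    out
}

/// OSC 52 sequence asking the host terminal to place `text` on the
/// clipboard ("c" selects the system clipboard).
fn osc52_copy(text: &str) -> String {
    format!("\x1b]52;c;{}\x07", b64(text.as_bytes()))
}

fn main() {
    assert_eq!(b64(b"hi"), "aGk=");
    assert_eq!(osc52_copy("hi"), "\x1b]52;c;aGk=\x07");
}
```

When the terminal emulator (xterm.js here) sees this sequence on the output stream, it writes the decoded payload to the host clipboard, which is what lets `xclip`/`xsel`/`pbcopy` inside the container reach the host.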
@@ -622,3 +660,13 @@ You can install additional tools at runtime with `sudo apt install`, `pip instal
- Ensure the Docker image for the MCP server exists (pull it first if needed).
- Check that Docker socket access is available (stdio + Docker MCP servers auto-enable this).
- Try resetting the project container to force a clean recreation.
### "Failed to install Anthropic marketplace" Error
If Claude Code shows **"Failed to install Anthropic marketplace - Will retry on next startup"** repeatedly, the marketplace metadata in `~/.claude.json` may be corrupted. To fix this, open a **Shell** session in the project and run:
```bash
cp ~/.claude.json ~/.claude.json.bak && jq 'with_entries(select(.key | startswith("officialMarketplace") | not))' ~/.claude.json.bak > ~/.claude.json
```
This backs up your config and removes the corrupted marketplace entries. Claude Code will re-download them cleanly on the next startup.

View File

@@ -1,6 +1,6 @@
# Triple-C (Claude-Code-Container)
Triple-C is a cross-platform desktop application that sandboxes Claude Code inside Docker containers. When running with `--dangerously-skip-permissions`, Claude only has access to the files and projects you explicitly provide to it.
Triple-C is a cross-platform desktop application that sandboxes Claude Code inside Docker containers. Each project can optionally enable full permissions mode (`--dangerously-skip-permissions`), giving Claude unrestricted access within the sandbox.
## Architecture
@@ -49,10 +49,10 @@ Each project can independently use one of:
- **Anthropic** (OAuth): User runs `claude login` inside the terminal on first use. Token persisted in the config volume across restarts and resets.
- **AWS Bedrock**: Per-project AWS credentials (static keys, profile, or bearer token). SSO sessions are validated before launching Claude for Profile auth.
- **Ollama**: Connect to a local or remote Ollama server via `ANTHROPIC_BASE_URL` (e.g., `http://host.docker.internal:11434`). Optional model override.
- **LiteLLM**: Connect through a LiteLLM proxy gateway via `ANTHROPIC_BASE_URL` + `ANTHROPIC_AUTH_TOKEN` to access 100+ model providers. API key stored securely in OS keychain.
- **Ollama**: Connect to a local or remote Ollama server via `ANTHROPIC_BASE_URL` (e.g., `http://host.docker.internal:11434`). Requires a model ID, and the model must be pulled (or used via Ollama cloud) before starting the container.
- **OpenAI Compatible**: Connect through any OpenAI API-compatible endpoint (LiteLLM, OpenRouter, vLLM, text-generation-inference, LocalAI, etc.) via `ANTHROPIC_BASE_URL` + `ANTHROPIC_AUTH_TOKEN`. API key stored securely in OS keychain.
> **Note:** Ollama and LiteLLM support is best-effort. Claude Code is designed for Anthropic models, so some features (tool use, extended thinking, prompt caching, etc.) may not work as expected with non-Anthropic models behind these backends.
> **Note:** Ollama and OpenAI Compatible support is best-effort. Claude Code is designed for Anthropic models, so some features (tool use, extended thinking, prompt caching, etc.) may not work as expected with non-Anthropic models behind these backends.
### Container Spawning (Sibling Containers)
@@ -102,7 +102,7 @@ Users can override this in Settings via the global `docker_socket_path` option.
| `app/src/components/layout/TopBar.tsx` | Terminal tabs + Docker/Image status indicators |
| `app/src/components/layout/Sidebar.tsx` | Responsive sidebar (25% width, min 224px, max 320px) |
| `app/src/components/layout/StatusBar.tsx` | Running project/terminal counts |
| `app/src/components/projects/ProjectCard.tsx` | Project config, auth mode, action buttons |
| `app/src/components/projects/ProjectCard.tsx` | Project config, backend selector, action buttons |
| `app/src/components/projects/ProjectList.tsx` | Project list in sidebar |
| `app/src/components/projects/FileManagerModal.tsx` | File browser modal (browse, download, upload) |
| `app/src/components/projects/ContainerProgressModal.tsx` | Real-time container operation progress |
@@ -122,7 +122,7 @@ Users can override this in Settings via the global `docker_socket_path` option.
| `app/src-tauri/src/commands/project_commands.rs` | Start/stop/rebuild Tauri command handlers |
| `app/src-tauri/src/commands/file_commands.rs` | File manager Tauri commands (list, download, upload) |
| `app/src-tauri/src/commands/mcp_commands.rs` | MCP server CRUD Tauri commands |
| `app/src-tauri/src/models/project.rs` | Project struct (auth mode, Docker access, MCP servers, Mission Control) |
| `app/src-tauri/src/models/project.rs` | Project struct (backend, Docker access, MCP servers, Mission Control) |
| `app/src-tauri/src/models/mcp_server.rs` | MCP server struct (transport, Docker image, env vars) |
| `app/src-tauri/src/models/app_settings.rs` | Global settings (image source, Docker socket, AWS, microphone) |
| `app/src-tauri/src/storage/mcp_store.rs` | MCP server persistence (JSON with atomic writes) |

View File

@@ -290,7 +290,7 @@ triple-c/
│ ├── image.rs # Build from Dockerfile, pull from registry
│ └── network.rs # Per-project bridge networks for MCP
├── models/ # Data structures
│ ├── project.rs # Project, AuthMode, BedrockConfig
│ ├── project.rs # Project, Backend, BedrockConfig
│ ├── mcp_server.rs # MCP server configuration
│ ├── app_settings.rs # Global settings (image source, AWS, etc.)
│ ├── container_config.rs # Image name resolution

View File

@@ -0,0 +1,60 @@
use std::sync::OnceLock;
use tokio::sync::Mutex;
const HELP_URL: &str =
"https://repo.anhonesthost.net/cybercovellc/triple-c/raw/branch/main/HOW-TO-USE.md";
const EMBEDDED_HELP: &str = include_str!("../../../../HOW-TO-USE.md");
/// Cached help content fetched from the remote repo (or `None` if not yet fetched).
static CACHED_HELP: OnceLock<Mutex<Option<String>>> = OnceLock::new();
/// Return the help markdown content.
///
/// On the first call, tries to fetch the latest version from the gitea repo.
/// If that fails (network error, timeout, etc.), falls back to the version
/// embedded at compile time. The result is cached for the rest of the session.
#[tauri::command]
pub async fn get_help_content() -> Result<String, String> {
let mutex = CACHED_HELP.get_or_init(|| Mutex::new(None));
let mut guard = mutex.lock().await;
if let Some(ref cached) = *guard {
return Ok(cached.clone());
}
let content = match fetch_remote_help().await {
Ok(md) => {
log::info!("Loaded help content from remote repo");
md
}
Err(e) => {
log::info!("Using embedded help content (remote fetch failed: {})", e);
EMBEDDED_HELP.to_string()
}
};
*guard = Some(content.clone());
Ok(content)
}
async fn fetch_remote_help() -> Result<String, String> {
let client = reqwest::Client::builder()
.timeout(std::time::Duration::from_secs(10))
.build()
.map_err(|e| format!("Failed to create HTTP client: {}", e))?;
let resp = client
.get(HELP_URL)
.send()
.await
.map_err(|e| format!("Failed to fetch help content: {}", e))?;
if !resp.status().is_success() {
return Err(format!("Remote returned status {}", resp.status()));
}
resp.text()
.await
.map_err(|e| format!("Failed to read response body: {}", e))
}

View File

@@ -1,6 +1,7 @@
pub mod aws_commands;
pub mod docker_commands;
pub mod file_commands;
pub mod help_commands;
pub mod mcp_commands;
pub mod project_commands;
pub mod settings_commands;

View File

@@ -1,7 +1,7 @@
use tauri::{Emitter, State};
use crate::docker;
use crate::models::{container_config, AuthMode, McpServer, Project, ProjectPath, ProjectStatus};
use crate::models::{container_config, Backend, McpServer, Project, ProjectPath, ProjectStatus};
use crate::storage::secure;
use crate::AppState;
@@ -34,9 +34,9 @@ fn store_secrets_for_project(project: &Project) -> Result<(), String> {
secure::store_project_secret(&project.id, "aws-bearer-token", v)?;
}
}
if let Some(ref litellm) = project.litellm_config {
if let Some(ref v) = litellm.api_key {
secure::store_project_secret(&project.id, "litellm-api-key", v)?;
if let Some(ref oai_config) = project.openai_compatible_config {
if let Some(ref v) = oai_config.api_key {
secure::store_project_secret(&project.id, "openai-compatible-api-key", v)?;
}
}
Ok(())
@@ -56,8 +56,8 @@ fn load_secrets_for_project(project: &mut Project) {
bedrock.aws_bearer_token = secure::get_project_secret(&project.id, "aws-bearer-token")
.unwrap_or(None);
}
if let Some(ref mut litellm) = project.litellm_config {
litellm.api_key = secure::get_project_secret(&project.id, "litellm-api-key")
if let Some(ref mut oai_config) = project.openai_compatible_config {
oai_config.api_key = secure::get_project_secret(&project.id, "openai-compatible-api-key")
.unwrap_or(None);
}
}
@@ -179,29 +179,29 @@ pub async fn start_project_container(
// Resolve enabled MCP servers for this project
let (enabled_mcp, docker_mcp) = resolve_mcp_servers(&project, &state);
// Validate auth mode requirements
if project.auth_mode == AuthMode::Bedrock {
// Validate backend requirements
if project.backend == Backend::Bedrock {
let bedrock = project.bedrock_config.as_ref()
.ok_or_else(|| "Bedrock auth mode selected but no Bedrock configuration found.".to_string())?;
.ok_or_else(|| "Bedrock backend selected but no Bedrock configuration found.".to_string())?;
// Region can come from per-project or global
if bedrock.aws_region.is_empty() && settings.global_aws.aws_region.is_none() {
return Err("AWS region is required for Bedrock auth mode. Set it per-project or in global AWS settings.".to_string());
return Err("AWS region is required for Bedrock backend. Set it per-project or in global AWS settings.".to_string());
}
}
if project.auth_mode == AuthMode::Ollama {
if project.backend == Backend::Ollama {
let ollama = project.ollama_config.as_ref()
.ok_or_else(|| "Ollama auth mode selected but no Ollama configuration found.".to_string())?;
.ok_or_else(|| "Ollama backend selected but no Ollama configuration found.".to_string())?;
if ollama.base_url.is_empty() {
return Err("Ollama base URL is required.".to_string());
}
}
if project.auth_mode == AuthMode::LiteLlm {
let litellm = project.litellm_config.as_ref()
.ok_or_else(|| "LiteLLM auth mode selected but no LiteLLM configuration found.".to_string())?;
if litellm.base_url.is_empty() {
return Err("LiteLLM base URL is required.".to_string());
if project.backend == Backend::OpenAiCompatible {
let oai_config = project.openai_compatible_config.as_ref()
.ok_or_else(|| "OpenAI Compatible backend selected but no configuration found.".to_string())?;
if oai_config.base_url.is_empty() {
return Err("OpenAI Compatible base URL is required.".to_string());
}
}

View File

@@ -1,6 +1,6 @@
use tauri::{AppHandle, Emitter, State};
use crate::models::{AuthMode, BedrockAuthMethod, Project};
use crate::models::{Backend, BedrockAuthMethod, Project};
use crate::AppState;
/// Build the command to run in the container terminal.
@@ -9,7 +9,7 @@ use crate::AppState;
/// the AWS session first. If the SSO session is expired, runs `aws sso login`
/// so the user can re-authenticate (the URL is clickable via xterm.js WebLinksAddon).
fn build_terminal_cmd(project: &Project, state: &AppState) -> Vec<String> {
let is_bedrock_profile = project.auth_mode == AuthMode::Bedrock
let is_bedrock_profile = project.backend == Backend::Bedrock
&& project
.bedrock_config
.as_ref()
@@ -17,10 +17,11 @@ fn build_terminal_cmd(project: &Project, state: &AppState) -> Vec<String> {
.unwrap_or(false);
if !is_bedrock_profile {
return vec![
"claude".to_string(),
"--dangerously-skip-permissions".to_string(),
];
let mut cmd = vec!["claude".to_string()];
if project.full_permissions {
cmd.push("--dangerously-skip-permissions".to_string());
}
return cmd;
}
// Resolve AWS profile: project-level → global settings → "default"
@@ -33,6 +34,12 @@ fn build_terminal_cmd(project: &Project, state: &AppState) -> Vec<String> {
// Build a bash wrapper that validates credentials, re-auths if needed,
// then exec's into claude.
let claude_cmd = if project.full_permissions {
"exec claude --dangerously-skip-permissions"
} else {
"exec claude"
};
let script = format!(
r#"
echo "Validating AWS session for profile '{profile}'..."
@@ -58,9 +65,10 @@ else
echo ""
fi
fi
exec claude --dangerously-skip-permissions
{claude_cmd}
"#,
profile = profile
profile = profile,
claude_cmd = claude_cmd
);
vec![

View File

@@ -1,8 +1,16 @@
use crate::models::{GiteaRelease, ReleaseAsset, UpdateInfo};
use tauri::State;
use crate::docker;
use crate::models::{container_config, GiteaRelease, ImageUpdateInfo, ReleaseAsset, UpdateInfo};
use crate::AppState;
const RELEASES_URL: &str =
"https://repo.anhonesthost.net/api/v1/repos/cybercovellc/triple-c/releases";
/// Base URL of the Gitea container registry (Docker Registry HTTP API v2).
const REGISTRY_API_BASE: &str =
"https://repo.anhonesthost.net/v2/cybercovellc/triple-c/triple-c-sandbox";
#[tauri::command]
pub fn get_app_version() -> String {
env!("CARGO_PKG_VERSION").to_string()
@@ -26,30 +34,37 @@ pub async fn check_for_updates() -> Result<Option<UpdateInfo>, String> {
.map_err(|e| format!("Failed to parse releases: {}", e))?;
let current_version = env!("CARGO_PKG_VERSION");
let is_windows = cfg!(target_os = "windows");
let current_semver = parse_semver(current_version).unwrap_or((0, 0, 0));
// Determine platform suffix for tag filtering
let platform_suffix: &str = if cfg!(target_os = "windows") {
"-win"
} else if cfg!(target_os = "macos") {
"-mac"
} else {
"" // Linux uses bare tags (no suffix)
};
// Filter releases by platform tag suffix
let platform_releases: Vec<&GiteaRelease> = releases
.iter()
.filter(|r| {
if is_windows {
r.tag_name.ends_with("-win")
if platform_suffix.is_empty() {
// Linux: bare tag only (no -win, no -mac)
!r.tag_name.ends_with("-win") && !r.tag_name.ends_with("-mac")
} else {
!r.tag_name.ends_with("-win")
r.tag_name.ends_with(platform_suffix)
}
})
.collect();
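The filter predicate above encodes the platform convention: Windows and macOS releases carry a tag suffix, while Linux uses bare tags. A standalone sketch of that predicate (helper name hypothetical):

```rust
/// Does a release tag belong to the current platform?
/// An empty suffix means Linux, which uses bare tags, so any
/// tag carrying a -win or -mac suffix must be rejected.
fn matches_platform(tag: &str, suffix: &str) -> bool {
    if suffix.is_empty() {
        !tag.ends_with("-win") && !tag.ends_with("-mac")
    } else {
        tag.ends_with(suffix)
    }
}

fn main() {
    assert!(matches_platform("v0.2.5-win", "-win"));
    assert!(matches_platform("v0.2.5", "")); // Linux: bare tag
    assert!(!matches_platform("v0.2.5-mac", "")); // Linux rejects -mac
    assert!(!matches_platform("v0.2.5", "-mac")); // macOS needs the suffix
}
```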
// Find the latest release with a higher patch version
// Version format: 0.1.X or v0.1.X (tag may have prefix/suffix)
let current_patch = parse_patch_version(current_version).unwrap_or(0);
let mut best: Option<(&GiteaRelease, u32)> = None;
// Find the latest release with a higher semver version
let mut best: Option<(&GiteaRelease, (u32, u32, u32))> = None;
for release in &platform_releases {
if let Some(patch) = parse_patch_from_tag(&release.tag_name) {
if patch > current_patch {
if best.is_none() || patch > best.unwrap().1 {
best = Some((release, patch));
if let Some(ver) = parse_semver_from_tag(&release.tag_name) {
if ver > current_semver {
if best.is_none() || ver > best.unwrap().1 {
best = Some((release, ver));
}
}
}
@@ -84,34 +99,125 @@ pub async fn check_for_updates() -> Result<Option<UpdateInfo>, String> {
}
}
/// Parse patch version from a semver string like "0.1.5" -> 5
fn parse_patch_version(version: &str) -> Option<u32> {
/// Parse a semver string like "0.2.5" -> (0, 2, 5)
fn parse_semver(version: &str) -> Option<(u32, u32, u32)> {
let clean = version.trim_start_matches('v');
let parts: Vec<&str> = clean.split('.').collect();
if parts.len() >= 3 {
parts[2].parse().ok()
let major = parts[0].parse().ok()?;
let minor = parts[1].parse().ok()?;
let patch = parts[2].parse().ok()?;
Some((major, minor, patch))
} else {
None
}
}
/// Parse patch version from a tag like "v0.1.5", "v0.1.5-win", "0.1.5" -> 5
fn parse_patch_from_tag(tag: &str) -> Option<u32> {
/// Parse semver from a tag like "v0.2.5", "v0.2.5-win", "v0.2.5-mac" -> (0, 2, 5)
fn parse_semver_from_tag(tag: &str) -> Option<(u32, u32, u32)> {
let clean = tag.trim_start_matches('v');
// Remove platform suffix
let clean = clean.strip_suffix("-win").unwrap_or(clean);
parse_patch_version(clean)
let clean = clean.strip_suffix("-win")
.or_else(|| clean.strip_suffix("-mac"))
.unwrap_or(clean);
parse_semver(clean)
}
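The switch from a bare patch number to a `(u32, u32, u32)` triple works because Rust tuples order lexicographically, which matches semver precedence for plain `x.y.z` versions: a minor or major bump now outranks any patch number. A quick standalone check (version values illustrative):

```rust
fn main() {
    // Tuples compare field by field, left to right, so
    // (major, minor, patch) ordering matches semver precedence.
    let current = (0u32, 1u32, 9u32);
    let candidate = (0u32, 2u32, 0u32);

    // 0.2.0 beats 0.1.9 even though its patch component is lower —
    // the old parse_patch_version comparison would have missed this.
    assert!(candidate > current);
    assert!((1u32, 0u32, 0u32) > (0u32, 99u32, 99u32));
}
```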
/// Extract a clean version string from a tag like "v0.1.5-win" -> "0.1.5"
/// Extract a clean version string from a tag like "v0.2.5-win" -> "0.2.5"
fn extract_version_from_tag(tag: &str) -> Option<String> {
let clean = tag.trim_start_matches('v');
let clean = clean.strip_suffix("-win").unwrap_or(clean);
// Validate it looks like a version
let parts: Vec<&str> = clean.split('.').collect();
if parts.len() >= 3 && parts.iter().all(|p| p.parse::<u32>().is_ok()) {
Some(clean.to_string())
} else {
None
let (major, minor, patch) = parse_semver_from_tag(tag)?;
Some(format!("{}.{}.{}", major, minor, patch))
}
/// Check whether a newer container image is available in the registry.
///
/// Compares the local image digest with the remote registry digest using the
/// Docker Registry HTTP API v2. Only applies when the image source is
/// "registry" (the default); for local builds or custom images we cannot
/// meaningfully check for remote updates.
#[tauri::command]
pub async fn check_image_update(
state: State<'_, AppState>,
) -> Result<Option<ImageUpdateInfo>, String> {
let settings = state.settings_store.get();
// Only check for registry images
if settings.image_source != crate::models::app_settings::ImageSource::Registry {
return Ok(None);
}
let image_name =
container_config::resolve_image_name(&settings.image_source, &settings.custom_image_name);
// 1. Get local image digest via Docker
let local_digest = docker::get_local_image_digest(&image_name).await.ok().flatten();
// 2. Get remote digest from the Gitea container registry (OCI distribution spec)
let remote_digest = fetch_remote_digest("latest").await?;
// No remote digest available — nothing to compare
let remote_digest = match remote_digest {
Some(d) => d,
None => return Ok(None),
};
// If local digest matches remote, no update
if let Some(ref local) = local_digest {
if *local == remote_digest {
return Ok(None);
}
}
// There's a difference (or no local image at all)
Ok(Some(ImageUpdateInfo {
remote_digest,
local_digest,
remote_updated_at: None,
}))
}
/// Fetch the digest of a tag from the Gitea container registry using the
/// OCI / Docker Registry HTTP API v2.
///
/// We issue a HEAD request to /v2/<repo>/manifests/<tag> and read the
/// `Docker-Content-Digest` header that the registry returns.
async fn fetch_remote_digest(tag: &str) -> Result<Option<String>, String> {
let url = format!("{}/manifests/{}", REGISTRY_API_BASE, tag);
let client = reqwest::Client::builder()
.timeout(std::time::Duration::from_secs(15))
.build()
.map_err(|e| format!("Failed to create HTTP client: {}", e))?;
let response = client
.head(&url)
.header(
"Accept",
"application/vnd.docker.distribution.manifest.v2+json, application/vnd.oci.image.index.v1+json",
)
.send()
.await;
match response {
Ok(resp) => {
if !resp.status().is_success() {
log::warn!(
"Registry returned status {} when checking image digest",
resp.status()
);
return Ok(None);
}
// The digest is returned in the Docker-Content-Digest header
if let Some(digest) = resp.headers().get("docker-content-digest") {
if let Ok(val) = digest.to_str() {
return Ok(Some(val.to_string()));
}
}
Ok(None)
}
Err(e) => {
log::warn!("Failed to check registry for image update: {}", e);
Ok(None)
}
}
}

View File

@@ -8,7 +8,7 @@ use std::collections::HashMap;
use sha2::{Sha256, Digest};
use super::client::get_docker;
use crate::models::{AuthMode, BedrockAuthMethod, ContainerInfo, EnvVar, GlobalAwsSettings, McpServer, McpTransportType, PortMapping, Project, ProjectPath};
use crate::models::{Backend, BedrockAuthMethod, ContainerInfo, EnvVar, GlobalAwsSettings, McpServer, McpTransportType, PortMapping, Project, ProjectPath};
const SCHEDULER_INSTRUCTIONS: &str = r#"## Scheduled Tasks
@@ -244,13 +244,13 @@ fn compute_ollama_fingerprint(project: &Project) -> String {
}
}
/// Compute a fingerprint for the LiteLLM configuration so we can detect changes.
fn compute_litellm_fingerprint(project: &Project) -> String {
if let Some(ref litellm) = project.litellm_config {
/// Compute a fingerprint for the OpenAI Compatible configuration so we can detect changes.
fn compute_openai_compatible_fingerprint(project: &Project) -> String {
if let Some(ref config) = project.openai_compatible_config {
let parts = vec![
litellm.base_url.clone(),
litellm.api_key.as_deref().unwrap_or("").to_string(),
litellm.model_id.as_deref().unwrap_or("").to_string(),
config.base_url.clone(),
config.api_key.as_deref().unwrap_or("").to_string(),
config.model_id.as_deref().unwrap_or("").to_string(),
];
sha256_hex(&parts.join("|"))
} else {
@@ -453,7 +453,7 @@ pub async fn create_container(
}
// Bedrock configuration
if project.auth_mode == AuthMode::Bedrock {
if project.backend == Backend::Bedrock {
if let Some(ref bedrock) = project.bedrock_config {
env_vars.push("CLAUDE_CODE_USE_BEDROCK=1".to_string());
@@ -506,7 +506,7 @@ pub async fn create_container(
}
// Ollama configuration
if project.auth_mode == AuthMode::Ollama {
if project.backend == Backend::Ollama {
if let Some(ref ollama) = project.ollama_config {
env_vars.push(format!("ANTHROPIC_BASE_URL={}", ollama.base_url));
env_vars.push("ANTHROPIC_AUTH_TOKEN=ollama".to_string());
@@ -516,14 +516,14 @@ pub async fn create_container(
}
}
// LiteLLM configuration
if project.auth_mode == AuthMode::LiteLlm {
if let Some(ref litellm) = project.litellm_config {
env_vars.push(format!("ANTHROPIC_BASE_URL={}", litellm.base_url));
if let Some(ref key) = litellm.api_key {
// OpenAI Compatible configuration
if project.backend == Backend::OpenAiCompatible {
if let Some(ref config) = project.openai_compatible_config {
env_vars.push(format!("ANTHROPIC_BASE_URL={}", config.base_url));
if let Some(ref key) = config.api_key {
env_vars.push(format!("ANTHROPIC_AUTH_TOKEN={}", key));
}
if let Some(ref model) = litellm.model_id {
if let Some(ref model) = config.model_id {
env_vars.push(format!("ANTHROPIC_MODEL={}", model));
}
}
@@ -624,7 +624,7 @@ pub async fn create_container(
// AWS config mount (read-only)
// Mount if: Bedrock profile auth needs it, OR a global aws_config_path is set
let should_mount_aws = if project.auth_mode == AuthMode::Bedrock {
let should_mount_aws = if project.backend == Backend::Bedrock {
if let Some(ref bedrock) = project.bedrock_config {
bedrock.auth_method == BedrockAuthMethod::Profile
} else {
@@ -694,11 +694,11 @@ pub async fn create_container(
labels.insert("triple-c.managed".to_string(), "true".to_string());
labels.insert("triple-c.project-id".to_string(), project.id.clone());
labels.insert("triple-c.project-name".to_string(), project.name.clone());
labels.insert("triple-c.auth-mode".to_string(), format!("{:?}", project.auth_mode));
labels.insert("triple-c.backend".to_string(), format!("{:?}", project.backend));
labels.insert("triple-c.paths-fingerprint".to_string(), compute_paths_fingerprint(&project.paths));
labels.insert("triple-c.bedrock-fingerprint".to_string(), compute_bedrock_fingerprint(project));
labels.insert("triple-c.ollama-fingerprint".to_string(), compute_ollama_fingerprint(project));
labels.insert("triple-c.litellm-fingerprint".to_string(), compute_litellm_fingerprint(project));
labels.insert("triple-c.openai-compatible-fingerprint".to_string(), compute_openai_compatible_fingerprint(project));
labels.insert("triple-c.ports-fingerprint".to_string(), compute_ports_fingerprint(&project.port_mappings));
labels.insert("triple-c.image".to_string(), image_name.to_string());
labels.insert("triple-c.timezone".to_string(), timezone.unwrap_or("").to_string());
@@ -897,11 +897,13 @@ pub async fn container_needs_recreation(
// Code settings stored in the named volume). The change takes effect
// on the next explicit rebuild instead.
// ── Auth mode ────────────────────────────────────────────────────────
let current_auth_mode = format!("{:?}", project.auth_mode);
if let Some(container_auth_mode) = get_label("triple-c.auth-mode") {
if container_auth_mode != current_auth_mode {
log::info!("Auth mode mismatch (container={:?}, project={:?})", container_auth_mode, current_auth_mode);
// ── Backend ──────────────────────────────────────────────────────────
let current_backend = format!("{:?}", project.backend);
// Check new label name, falling back to old "triple-c.auth-mode" for pre-rename containers
let container_backend = get_label("triple-c.backend").or_else(|| get_label("triple-c.auth-mode"));
if let Some(container_backend) = container_backend {
if container_backend != current_backend {
log::info!("Backend mismatch (container={:?}, project={:?})", container_backend, current_backend);
return Ok(true);
}
}
@@ -946,11 +948,11 @@ pub async fn container_needs_recreation(
return Ok(true);
}
// ── LiteLLM config fingerprint ───────────────────────────────────────
let expected_litellm_fp = compute_litellm_fingerprint(project);
let container_litellm_fp = get_label("triple-c.litellm-fingerprint").unwrap_or_default();
if container_litellm_fp != expected_litellm_fp {
log::info!("LiteLLM config mismatch");
// ── OpenAI Compatible config fingerprint ────────────────────────────
let expected_oai_fp = compute_openai_compatible_fingerprint(project);
let container_oai_fp = get_label("triple-c.openai-compatible-fingerprint").unwrap_or_default();
if container_oai_fp != expected_oai_fp {
log::info!("OpenAI Compatible config mismatch");
return Ok(true);
}

View File

@@ -31,6 +31,38 @@ pub async fn image_exists(image_name: &str) -> Result<bool, String> {
Ok(!images.is_empty())
}
/// Returns the first repo digest (e.g. "sha256:abc...") for the given image,
/// or None if the image doesn't exist locally or has no repo digests.
pub async fn get_local_image_digest(image_name: &str) -> Result<Option<String>, String> {
let docker = get_docker()?;
let filters: HashMap<String, Vec<String>> = HashMap::from([(
"reference".to_string(),
vec![image_name.to_string()],
)]);
let images: Vec<ImageSummary> = docker
.list_images(Some(ListImagesOptions {
filters,
..Default::default()
}))
.await
.map_err(|e| format!("Failed to list images: {}", e))?;
if let Some(img) = images.first() {
// RepoDigests contains entries like "registry/repo@sha256:abc..."
if let Some(digest_str) = img.repo_digests.first() {
// Extract the sha256:... part after '@'
if let Some(pos) = digest_str.find('@') {
return Ok(Some(digest_str[pos + 1..].to_string()));
}
return Ok(Some(digest_str.clone()));
}
}
Ok(None)
}
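`get_local_image_digest` splits a `RepoDigests` entry on `@` to isolate the content digest. As a standalone sketch of that parsing (helper name hypothetical):

```rust
/// Extract the "sha256:..." part of a Docker RepoDigests entry such as
/// "registry.example.com/org/image@sha256:abc123".
/// Entries without an '@' are returned unchanged, matching the
/// fallback in get_local_image_digest.
fn digest_from_repo_entry(entry: &str) -> String {
    match entry.find('@') {
        Some(pos) => entry[pos + 1..].to_string(),
        None => entry.to_string(),
    }
}

fn main() {
    assert_eq!(
        digest_from_repo_entry("example.com/org/image@sha256:abc123"),
        "sha256:abc123"
    );
    assert_eq!(digest_from_repo_entry("sha256:abc123"), "sha256:abc123");
}
```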
pub async fn pull_image<F>(image_name: &str, on_progress: F) -> Result<(), String>
where
F: Fn(String) + Send + 'static,

View File

@@ -119,6 +119,9 @@ pub fn run() {
// Updates
commands::update_commands::get_app_version,
commands::update_commands::check_for_updates,
commands::update_commands::check_image_update,
// Help
commands::help_commands::get_help_content,
])
.run(tauri::generate_context!())
.expect("error while running tauri application");

View File

@@ -72,6 +72,8 @@ pub struct AppSettings {
pub timezone: Option<String>,
#[serde(default)]
pub default_microphone: Option<String>,
#[serde(default)]
pub dismissed_image_digest: Option<String>,
}
impl Default for AppSettings {
@@ -90,6 +92,7 @@ impl Default for AppSettings {
dismissed_update_version: None,
timezone: None,
default_microphone: None,
dismissed_image_digest: None,
}
}
}

View File

@@ -24,6 +24,10 @@ fn default_protocol() -> String {
"tcp".to_string()
}
fn default_full_permissions() -> bool {
true
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct Project {
pub id: String,
@@ -31,13 +35,17 @@ pub struct Project {
pub paths: Vec<ProjectPath>,
pub container_id: Option<String>,
pub status: ProjectStatus,
pub auth_mode: AuthMode,
#[serde(alias = "auth_mode")]
pub backend: Backend,
pub bedrock_config: Option<BedrockConfig>,
pub ollama_config: Option<OllamaConfig>,
pub litellm_config: Option<LiteLlmConfig>,
#[serde(alias = "litellm_config")]
pub openai_compatible_config: Option<OpenAiCompatibleConfig>,
pub allow_docker_access: bool,
#[serde(default)]
pub mission_control_enabled: bool,
#[serde(default = "default_full_permissions")]
pub full_permissions: bool,
pub ssh_key_path: Option<String>,
#[serde(skip_serializing, default)]
pub git_token: Option<String>,
@@ -65,23 +73,24 @@ pub enum ProjectStatus {
Error,
}
/// How the project authenticates with Claude.
/// - `Anthropic`: User runs `claude login` inside the container (OAuth via Anthropic Console,
/// persisted in the config volume)
/// - `Bedrock`: Uses AWS Bedrock with per-project AWS credentials
/// Which AI model backend/provider the project uses.
/// - `Anthropic`: Direct Anthropic API (user runs `claude login` inside the container)
/// - `Bedrock`: AWS Bedrock with per-project AWS credentials
/// - `Ollama`: Local or remote Ollama server
/// - `OpenAiCompatible`: Any OpenAI API-compatible endpoint (e.g., LiteLLM, vLLM, etc.)
#[derive(Debug, Clone, Serialize, Deserialize, PartialEq)]
#[serde(rename_all = "snake_case")]
pub enum AuthMode {
pub enum Backend {
/// Backward compat: old projects stored as "login" or "api_key" map to Anthropic.
#[serde(alias = "login", alias = "api_key")]
Anthropic,
Bedrock,
Ollama,
#[serde(alias = "litellm")]
LiteLlm,
#[serde(alias = "lite_llm", alias = "litellm")]
OpenAiCompatible,
}
impl Default for AuthMode {
impl Default for Backend {
fn default() -> Self {
Self::Anthropic
}
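The serde aliases above are what keep existing `projects.json` files loading unchanged: the old field name `auth_mode`, the old variant `litellm`, and the old `litellm_config` key all map onto the renamed types. An illustrative fragment of a pre-rename entry (field values hypothetical) that now deserializes to `backend: OpenAiCompatible` with `openai_compatible_config` populated:

```json
{
  "auth_mode": "litellm",
  "litellm_config": {
    "base_url": "http://host.docker.internal:4000",
    "model_id": null
  }
}
```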
@@ -130,13 +139,14 @@ pub struct OllamaConfig {
pub model_id: Option<String>,
}
/// LiteLLM gateway configuration for a project.
/// LiteLLM translates Anthropic API calls to 100+ model providers.
/// OpenAI Compatible endpoint configuration for a project.
/// Routes Anthropic API calls through any OpenAI API-compatible endpoint
/// (e.g., LiteLLM, vLLM, or other compatible gateways).
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct LiteLlmConfig {
/// The base URL of the LiteLLM proxy (e.g., "http://host.docker.internal:4000" or "https://litellm.example.com")
pub struct OpenAiCompatibleConfig {
/// The base URL of the OpenAI-compatible endpoint (e.g., "http://host.docker.internal:4000" or "https://api.example.com")
pub base_url: String,
/// API key for the LiteLLM proxy
/// API key for the OpenAI-compatible endpoint
#[serde(skip_serializing, default)]
pub api_key: Option<String>,
/// Optional model override
@@ -152,12 +162,13 @@ impl Project {
paths,
container_id: None,
status: ProjectStatus::Stopped,
auth_mode: AuthMode::default(),
backend: Backend::default(),
bedrock_config: None,
ollama_config: None,
litellm_config: None,
openai_compatible_config: None,
allow_docker_access: false,
mission_control_enabled: false,
full_permissions: false,
ssh_key_path: None,
git_token: None,
git_user_name: None,

View File

@@ -35,3 +35,14 @@ pub struct GiteaAsset {
pub browser_download_url: String,
pub size: u64,
}
/// Info returned to the frontend about an available container image update.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct ImageUpdateInfo {
/// The remote digest (e.g. sha256:abc...)
pub remote_digest: String,
/// The local digest, if available
pub local_digest: Option<String>,
/// When the remote image was last updated (if known)
pub remote_updated_at: Option<String>,
}

View File

@@ -17,7 +17,7 @@ export default function App() {
const { loadSettings } = useSettings();
const { refresh } = useProjects();
const { refresh: refreshMcp } = useMcpServers();
const { loadVersion, checkForUpdates, startPeriodicCheck } = useUpdates();
const { loadVersion, checkForUpdates, checkImageUpdate, startPeriodicCheck } = useUpdates();
const { sessions, activeSessionId, setProjects } = useAppState(
useShallow(s => ({ sessions: s.sessions, activeSessionId: s.activeSessionId, setProjects: s.setProjects }))
);
@@ -46,7 +46,10 @@ export default function App() {
// Update detection
loadVersion();
const updateTimer = setTimeout(() => checkForUpdates(), 3000);
const updateTimer = setTimeout(() => {
checkForUpdates();
checkImageUpdate();
}, 3000);
const cleanup = startPeriodicCheck();
return () => {
clearTimeout(updateTimer);

View File

@@ -0,0 +1,218 @@
import { useEffect, useRef, useCallback, useState } from "react";
import { getHelpContent } from "../../lib/tauri-commands";
interface Props {
onClose: () => void;
}
/** Convert header text to a URL-friendly slug for anchor links. */
function slugify(text: string): string {
return text
.toLowerCase()
.replace(/<[^>]+>/g, "") // strip HTML tags (e.g. from inline code)
.replace(/[^\w\s-]/g, "") // remove non-word chars except spaces/dashes
.replace(/\s+/g, "-") // spaces to dashes
.replace(/-+/g, "-") // collapse consecutive dashes
.replace(/^-|-$/g, ""); // trim leading/trailing dashes
}
/** Simple markdown-to-HTML converter for the help content. */
function renderMarkdown(md: string): string {
let html = md;
// Normalize line endings
html = html.replace(/\r\n/g, "\n");
// Escape HTML special characters (our own tags are re-introduced below)
html = html.replace(/&/g, "&amp;").replace(/</g, "&lt;").replace(/>/g, "&gt;");
// Fenced code blocks (```...```)
html = html.replace(/```(\w*)\n([\s\S]*?)```/g, (_m, _lang, code) => {
return `<pre class="help-code-block"><code>${code.trimEnd()}</code></pre>`;
});
// Inline code (`...`)
html = html.replace(/`([^`]+)`/g, '<code class="help-inline-code">$1</code>');
// Tables
html = html.replace(
/(?:^|\n)(\|.+\|)\n(\|[\s:|-]+\|)\n((?:\|.+\|\n?)+)/g,
(_m, headerRow: string, _sep: string, bodyRows: string) => {
const headers = headerRow
.split("|")
.slice(1, -1)
.map((c: string) => `<th>${c.trim()}</th>`)
.join("");
const rows = bodyRows
.trim()
.split("\n")
.map((row: string) => {
const cells = row
.split("|")
.slice(1, -1)
.map((c: string) => `<td>${c.trim()}</td>`)
.join("");
return `<tr>${cells}</tr>`;
})
.join("");
return `<table class="help-table"><thead><tr>${headers}</tr></thead><tbody>${rows}</tbody></table>`;
},
);
// Blockquotes (> ...)
html = html.replace(/(?:^|\n)&gt; (.+)/g, '<blockquote class="help-blockquote">$1</blockquote>');
// Merge adjacent blockquotes
html = html.replace(/<\/blockquote>\s*<blockquote class="help-blockquote">/g, "<br/>");
// Horizontal rules
html = html.replace(/\n---\n/g, '<hr class="help-hr"/>');
// Headers with id attributes for anchor navigation (process from h4 down to h1)
html = html.replace(/^#### (.+)$/gm, (_m, title) => `<h4 class="help-h4" id="${slugify(title)}">${title}</h4>`);
html = html.replace(/^### (.+)$/gm, (_m, title) => `<h3 class="help-h3" id="${slugify(title)}">${title}</h3>`);
html = html.replace(/^## (.+)$/gm, (_m, title) => `<h2 class="help-h2" id="${slugify(title)}">${title}</h2>`);
html = html.replace(/^# (.+)$/gm, (_m, title) => `<h1 class="help-h1" id="${slugify(title)}">${title}</h1>`);
// Bold (**...**)
html = html.replace(/\*\*([^*]+)\*\*/g, "<strong>$1</strong>");
// Italic (*...*)
html = html.replace(/\*([^*]+)\*/g, "<em>$1</em>");
// Markdown-style anchor links [text](#anchor)
html = html.replace(
/\[([^\]]+)\]\(#([^)]+)\)/g,
'<a class="help-link" href="#$2">$1</a>',
);
// Markdown-style external links [text](url)
html = html.replace(
/\[([^\]]+)\]\((https?:\/\/[^)]+)\)/g,
'<a class="help-link" href="$2" target="_blank" rel="noopener noreferrer">$1</a>',
);
// Unordered list items (- ...)
// Group consecutive list items
html = html.replace(/((?:^|\n)- .+(?:\n- .+)*)/g, (block) => {
const items = block
.trim()
.split("\n")
.map((line) => `<li>${line.replace(/^- /, "")}</li>`)
.join("");
return `<ul class="help-ul">${items}</ul>`;
});
// Ordered list items (1. ...)
html = html.replace(/((?:^|\n)\d+\. .+(?:\n\d+\. .+)*)/g, (block) => {
const items = block
.trim()
.split("\n")
.map((line) => `<li>${line.replace(/^\d+\. /, "")}</li>`)
.join("");
return `<ol class="help-ol">${items}</ol>`;
});
// Links - convert bare URLs to clickable links (skip already-wrapped URLs)
html = html.replace(
/(?<!="|'>)(https?:\/\/[^\s<)]+)/g,
'<a class="help-link" href="$1" target="_blank" rel="noopener noreferrer">$1</a>',
);
// Wrap remaining loose text lines in paragraphs
// Split by double newlines for paragraph breaks
const blocks = html.split(/\n\n+/);
html = blocks
.map((block) => {
const trimmed = block.trim();
if (!trimmed) return "";
// Don't wrap blocks that are already HTML elements
if (
/^<(h[1-4]|ul|ol|pre|table|blockquote|hr)/.test(trimmed)
) {
return trimmed;
}
// Wrap in paragraph, replacing single newlines with <br/>
return `<p class="help-p">${trimmed.replace(/\n/g, "<br/>")}</p>`;
})
.join("\n");
return html;
}
export default function HelpDialog({ onClose }: Props) {
const overlayRef = useRef<HTMLDivElement>(null);
const contentRef = useRef<HTMLDivElement>(null);
const [markdown, setMarkdown] = useState<string | null>(null);
const [error, setError] = useState<string | null>(null);
useEffect(() => {
const handleKeyDown = (e: KeyboardEvent) => {
if (e.key === "Escape") onClose();
};
document.addEventListener("keydown", handleKeyDown);
return () => document.removeEventListener("keydown", handleKeyDown);
}, [onClose]);
useEffect(() => {
getHelpContent()
.then(setMarkdown)
.catch((e) => setError(String(e)));
}, []);
const handleOverlayClick = useCallback(
(e: React.MouseEvent<HTMLDivElement>) => {
if (e.target === overlayRef.current) onClose();
},
[onClose],
);
// Handle anchor link clicks to scroll within the dialog
const handleContentClick = useCallback((e: React.MouseEvent<HTMLDivElement>) => {
const target = e.target as HTMLElement;
const anchor = target.closest("a");
if (!anchor) return;
const href = anchor.getAttribute("href");
if (!href || !href.startsWith("#")) return;
e.preventDefault();
const el = contentRef.current?.querySelector(href);
if (el) el.scrollIntoView({ behavior: "smooth" });
}, []);
return (
<div
ref={overlayRef}
onClick={handleOverlayClick}
className="fixed inset-0 bg-black/50 flex items-center justify-center z-50"
>
<div className="bg-[var(--bg-secondary)] border border-[var(--border-color)] rounded-lg shadow-xl w-[48rem] max-w-[90vw] max-h-[85vh] flex flex-col">
{/* Header */}
<div className="flex items-center justify-between px-6 py-4 border-b border-[var(--border-color)] flex-shrink-0">
<h2 className="text-lg font-semibold">How to Use Triple-C</h2>
<button
onClick={onClose}
className="px-3 py-1.5 text-xs bg-[var(--bg-tertiary)] border border-[var(--border-color)] rounded hover:bg-[var(--border-color)] transition-colors"
>
Close
</button>
</div>
{/* Scrollable content */}
<div
ref={contentRef}
onClick={handleContentClick}
className="flex-1 overflow-y-auto px-6 py-4 help-content"
>
{error && (
<p className="text-[var(--error)] text-sm">Failed to load help content: {error}</p>
)}
{!markdown && !error && (
<p className="text-[var(--text-secondary)] text-sm">Loading...</p>
)}
{markdown && (
<div dangerouslySetInnerHTML={{ __html: renderMarkdown(markdown) }} />
)}
</div>
</div>
</div>
);
}

View File

@@ -2,8 +2,8 @@ import { useShallow } from "zustand/react/shallow";
import { useAppState } from "../../store/appState";
export default function StatusBar() {
const { projects, sessions } = useAppState(
useShallow(s => ({ projects: s.projects, sessions: s.sessions }))
const { projects, sessions, terminalHasSelection } = useAppState(
useShallow(s => ({ projects: s.projects, sessions: s.sessions, terminalHasSelection: s.terminalHasSelection }))
);
const running = projects.filter((p) => p.status === "running").length;
@@ -20,6 +20,12 @@ export default function StatusBar() {
<span>
{sessions.length} terminal{sessions.length !== 1 ? "s" : ""}
</span>
{terminalHasSelection && (
<>
<span className="mx-2">|</span>
<span className="text-[var(--accent)]">Ctrl+Shift+C to copy</span>
</>
)}
</div>
);
}

View File

@@ -4,19 +4,25 @@ import TerminalTabs from "../terminal/TerminalTabs";
import { useAppState } from "../../store/appState";
import { useSettings } from "../../hooks/useSettings";
import UpdateDialog from "../settings/UpdateDialog";
import ImageUpdateDialog from "../settings/ImageUpdateDialog";
import HelpDialog from "./HelpDialog";
export default function TopBar() {
const { dockerAvailable, imageExists, updateInfo, appVersion, setUpdateInfo } = useAppState(
const { dockerAvailable, imageExists, updateInfo, imageUpdateInfo, appVersion, setUpdateInfo, setImageUpdateInfo } = useAppState(
useShallow(s => ({
dockerAvailable: s.dockerAvailable,
imageExists: s.imageExists,
updateInfo: s.updateInfo,
imageUpdateInfo: s.imageUpdateInfo,
appVersion: s.appVersion,
setUpdateInfo: s.setUpdateInfo,
setImageUpdateInfo: s.setImageUpdateInfo,
}))
);
const { appSettings, saveSettings } = useSettings();
const [showUpdateDialog, setShowUpdateDialog] = useState(false);
const [showImageUpdateDialog, setShowImageUpdateDialog] = useState(false);
const [showHelpDialog, setShowHelpDialog] = useState(false);
const handleDismiss = async () => {
if (appSettings && updateInfo) {
@@ -29,6 +35,17 @@ export default function TopBar() {
setShowUpdateDialog(false);
};
const handleImageUpdateDismiss = async () => {
if (appSettings && imageUpdateInfo) {
await saveSettings({
...appSettings,
dismissed_image_digest: imageUpdateInfo.remote_digest,
});
}
setImageUpdateInfo(null);
setShowImageUpdateDialog(false);
};
return (
<>
<div className="flex items-center h-10 bg-[var(--bg-secondary)] border border-[var(--border-color)] rounded-lg overflow-hidden">
@@ -44,8 +61,24 @@ export default function TopBar() {
Update
</button>
)}
{imageUpdateInfo && (
<button
onClick={() => setShowImageUpdateDialog(true)}
className="px-2 py-0.5 rounded text-xs font-medium bg-[var(--warning,#f59e0b)] text-white hover:opacity-80 transition-colors"
title="A newer container image is available"
>
Image Update
</button>
)}
<StatusDot ok={dockerAvailable === true} label="Docker" />
<StatusDot ok={imageExists === true} label="Image" />
<button
onClick={() => setShowHelpDialog(true)}
title="Help"
className="ml-1 w-5 h-5 flex items-center justify-center rounded-full border border-[var(--border-color)] text-[var(--text-secondary)] hover:text-[var(--text-primary)] hover:border-[var(--text-secondary)] transition-colors text-xs font-semibold leading-none"
>
?
</button>
</div>
</div>
{showUpdateDialog && updateInfo && (
@@ -56,6 +89,16 @@ export default function TopBar() {
onClose={() => setShowUpdateDialog(false)}
/>
)}
{showImageUpdateDialog && imageUpdateInfo && (
<ImageUpdateDialog
imageUpdateInfo={imageUpdateInfo}
onDismiss={handleImageUpdateDismiss}
onClose={() => setShowImageUpdateDialog(false)}
/>
)}
{showHelpDialog && (
<HelpDialog onClose={() => setShowHelpDialog(false)} />
)}
</>
);
}

View File

@@ -57,7 +57,7 @@ const mockProject: Project = {
paths: [{ host_path: "/home/user/project", mount_name: "project" }],
container_id: null,
status: "stopped",
auth_mode: "anthropic",
backend: "anthropic",
bedrock_config: null,
allow_docker_access: false,
ssh_key_path: null,

View File

@@ -1,7 +1,7 @@
import { useState, useEffect } from "react";
import { open } from "@tauri-apps/plugin-dialog";
import { listen } from "@tauri-apps/api/event";
import type { Project, ProjectPath, AuthMode, BedrockConfig, BedrockAuthMethod, OllamaConfig, LiteLlmConfig } from "../../lib/types";
import type { Project, ProjectPath, Backend, BedrockConfig, BedrockAuthMethod, OllamaConfig, OpenAiCompatibleConfig } from "../../lib/types";
import { useProjects } from "../../hooks/useProjects";
import { useMcpServers } from "../../hooks/useMcpServers";
import { useTerminal } from "../../hooks/useTerminal";
@@ -12,6 +12,7 @@ import ClaudeInstructionsModal from "./ClaudeInstructionsModal";
import ContainerProgressModal from "./ContainerProgressModal";
import FileManagerModal from "./FileManagerModal";
import ConfirmRemoveModal from "./ConfirmRemoveModal";
import Tooltip from "../ui/Tooltip";
interface Props {
project: Project;
@@ -62,10 +63,10 @@ export default function ProjectCard({ project }: Props) {
const [ollamaBaseUrl, setOllamaBaseUrl] = useState(project.ollama_config?.base_url ?? "http://host.docker.internal:11434");
const [ollamaModelId, setOllamaModelId] = useState(project.ollama_config?.model_id ?? "");
// LiteLLM local state
const [litellmBaseUrl, setLitellmBaseUrl] = useState(project.litellm_config?.base_url ?? "http://host.docker.internal:4000");
const [litellmApiKey, setLitellmApiKey] = useState(project.litellm_config?.api_key ?? "");
const [litellmModelId, setLitellmModelId] = useState(project.litellm_config?.model_id ?? "");
// OpenAI Compatible local state
const [openaiCompatibleBaseUrl, setOpenaiCompatibleBaseUrl] = useState(project.openai_compatible_config?.base_url ?? "http://host.docker.internal:4000");
const [openaiCompatibleApiKey, setOpenaiCompatibleApiKey] = useState(project.openai_compatible_config?.api_key ?? "");
const [openaiCompatibleModelId, setOpenaiCompatibleModelId] = useState(project.openai_compatible_config?.model_id ?? "");
// Sync local state when project prop changes (e.g., after save or external update)
useEffect(() => {
@@ -87,9 +88,9 @@ export default function ProjectCard({ project }: Props) {
setBedrockModelId(project.bedrock_config?.model_id ?? "");
setOllamaBaseUrl(project.ollama_config?.base_url ?? "http://host.docker.internal:11434");
setOllamaModelId(project.ollama_config?.model_id ?? "");
setLitellmBaseUrl(project.litellm_config?.base_url ?? "http://host.docker.internal:4000");
setLitellmApiKey(project.litellm_config?.api_key ?? "");
setLitellmModelId(project.litellm_config?.model_id ?? "");
setOpenaiCompatibleBaseUrl(project.openai_compatible_config?.base_url ?? "http://host.docker.internal:4000");
setOpenaiCompatibleApiKey(project.openai_compatible_config?.api_key ?? "");
setOpenaiCompatibleModelId(project.openai_compatible_config?.model_id ?? "");
}, [project]);
// Listen for container progress events
@@ -196,23 +197,23 @@ export default function ProjectCard({ project }: Props) {
model_id: null,
};
const defaultLiteLlmConfig: LiteLlmConfig = {
const defaultOpenAiCompatibleConfig: OpenAiCompatibleConfig = {
base_url: "http://host.docker.internal:4000",
api_key: null,
model_id: null,
};
const handleAuthModeChange = async (mode: AuthMode) => {
const handleBackendChange = async (mode: Backend) => {
try {
const updates: Partial<Project> = { auth_mode: mode };
const updates: Partial<Project> = { backend: mode };
if (mode === "bedrock" && !project.bedrock_config) {
updates.bedrock_config = defaultBedrockConfig;
}
if (mode === "ollama" && !project.ollama_config) {
updates.ollama_config = defaultOllamaConfig;
}
if (mode === "lit_llm" && !project.litellm_config) {
updates.litellm_config = defaultLiteLlmConfig;
if (mode === "open_ai_compatible" && !project.openai_compatible_config) {
updates.openai_compatible_config = defaultOpenAiCompatibleConfig;
}
await update({ ...project, ...updates });
} catch (e) {
@@ -354,30 +355,30 @@ export default function ProjectCard({ project }: Props) {
}
};
const handleLitellmBaseUrlBlur = async () => {
const handleOpenaiCompatibleBaseUrlBlur = async () => {
try {
const current = project.litellm_config ?? defaultLiteLlmConfig;
await update({ ...project, litellm_config: { ...current, base_url: litellmBaseUrl } });
const current = project.openai_compatible_config ?? defaultOpenAiCompatibleConfig;
await update({ ...project, openai_compatible_config: { ...current, base_url: openaiCompatibleBaseUrl } });
} catch (err) {
console.error("Failed to update LiteLLM base URL:", err);
console.error("Failed to update OpenAI Compatible base URL:", err);
}
};
const handleLitellmApiKeyBlur = async () => {
const handleOpenaiCompatibleApiKeyBlur = async () => {
try {
const current = project.litellm_config ?? defaultLiteLlmConfig;
await update({ ...project, litellm_config: { ...current, api_key: litellmApiKey || null } });
const current = project.openai_compatible_config ?? defaultOpenAiCompatibleConfig;
await update({ ...project, openai_compatible_config: { ...current, api_key: openaiCompatibleApiKey || null } });
} catch (err) {
console.error("Failed to update LiteLLM API key:", err);
console.error("Failed to update OpenAI Compatible API key:", err);
}
};
const handleLitellmModelIdBlur = async () => {
const handleOpenaiCompatibleModelIdBlur = async () => {
try {
const current = project.litellm_config ?? defaultLiteLlmConfig;
await update({ ...project, litellm_config: { ...current, model_id: litellmModelId || null } });
const current = project.openai_compatible_config ?? defaultOpenAiCompatibleConfig;
await update({ ...project, openai_compatible_config: { ...current, model_id: openaiCompatibleModelId || null } });
} catch (err) {
console.error("Failed to update LiteLLM model ID:", err);
console.error("Failed to update OpenAI Compatible model ID:", err);
}
};
@@ -446,12 +447,12 @@ export default function ProjectCard({ project }: Props) {
{isSelected && (
<div className="mt-2 ml-4 space-y-2 min-w-0 overflow-hidden">
{/* Auth mode selector */}
{/* Backend selector */}
<div className="flex items-center gap-1 text-xs">
<span className="text-[var(--text-secondary)] mr-1">Auth:</span>
<span className="text-[var(--text-secondary)] mr-1">Backend:<Tooltip text="Choose the AI model provider for this project. Anthropic: Connect directly to Claude via OAuth login (run 'claude login' in terminal). Bedrock: Route through AWS Bedrock using your AWS credentials. Ollama: Use locally-hosted open-source models (Llama, Mistral, etc.) via an Ollama server. OpenAI Compatible: Connect through any OpenAI API-compatible endpoint (LiteLLM, OpenRouter, vLLM, etc.) to access 100+ model providers." /></span>
<select
value={project.auth_mode}
onChange={(e) => { e.stopPropagation(); handleAuthModeChange(e.target.value as AuthMode); }}
value={project.backend}
onChange={(e) => { e.stopPropagation(); handleBackendChange(e.target.value as Backend); }}
onClick={(e) => e.stopPropagation()}
disabled={!isStopped}
className="px-2 py-0.5 rounded bg-[var(--bg-primary)] border border-[var(--border-color)] text-xs text-[var(--text-primary)] focus:outline-none focus:border-[var(--accent)] disabled:opacity-50"
@@ -459,7 +460,7 @@ export default function ProjectCard({ project }: Props) {
<option value="anthropic">Anthropic</option>
<option value="bedrock">Bedrock</option>
<option value="ollama">Ollama</option>
<option value="lit_llm">LiteLLM</option>
<option value="open_ai_compatible">OpenAI Compatible</option>
</select>
</div>
@@ -609,7 +610,7 @@ export default function ProjectCard({ project }: Props) {
{/* SSH Key */}
<div>
<label className="block text-xs text-[var(--text-secondary)] mb-0.5">SSH Key Directory</label>
<label className="block text-xs text-[var(--text-secondary)] mb-0.5">SSH Key Directory<Tooltip text="Path to your .ssh directory. Mounted into the container so Claude can authenticate with Git remotes over SSH." /></label>
<div className="flex gap-1">
<input
value={sshKeyPath}
@@ -631,7 +632,7 @@ export default function ProjectCard({ project }: Props) {
{/* Git Name */}
<div>
<label className="block text-xs text-[var(--text-secondary)] mb-0.5">Git Name</label>
<label className="block text-xs text-[var(--text-secondary)] mb-0.5">Git Name<Tooltip text="Sets git user.name inside the container for commit authorship." /></label>
<input
value={gitName}
onChange={(e) => setGitName(e.target.value)}
@@ -644,7 +645,7 @@ export default function ProjectCard({ project }: Props) {
{/* Git Email */}
<div>
<label className="block text-xs text-[var(--text-secondary)] mb-0.5">Git Email</label>
<label className="block text-xs text-[var(--text-secondary)] mb-0.5">Git Email<Tooltip text="Sets git user.email inside the container for commit authorship." /></label>
<input
value={gitEmail}
onChange={(e) => setGitEmail(e.target.value)}
@@ -657,7 +658,7 @@ export default function ProjectCard({ project }: Props) {
{/* Git Token (HTTPS) */}
<div>
<label className="block text-xs text-[var(--text-secondary)] mb-0.5">Git HTTPS Token</label>
<label className="block text-xs text-[var(--text-secondary)] mb-0.5">Git HTTPS Token<Tooltip text="A personal access token (e.g. GitHub PAT) for HTTPS git operations inside the container." /></label>
<input
type="password"
value={gitToken}
@@ -671,7 +672,7 @@ export default function ProjectCard({ project }: Props) {
{/* Docker access toggle */}
<div className="flex items-center gap-2">
<label className="text-xs text-[var(--text-secondary)]">Allow container spawning</label>
<label className="text-xs text-[var(--text-secondary)]">Allow container spawning<Tooltip text="Mounts the Docker socket so Claude can build and run Docker containers from inside the sandbox." /></label>
<button
onClick={async () => {
try { await update({ ...project, allow_docker_access: !project.allow_docker_access }); } catch (err) {
@@ -691,7 +692,7 @@ export default function ProjectCard({ project }: Props) {
{/* Mission Control toggle */}
<div className="flex items-center gap-2">
<label className="text-xs text-[var(--text-secondary)]">Mission Control</label>
<label className="text-xs text-[var(--text-secondary)]">Mission Control<Tooltip text="Enables a web dashboard for monitoring and managing Claude sessions remotely." /></label>
<button
onClick={async () => {
try {
@@ -711,10 +712,36 @@ export default function ProjectCard({ project }: Props) {
</button>
</div>
{/* Full Permissions toggle */}
<div className="flex items-center gap-2">
<label className="text-xs text-[var(--text-secondary)]">
Full Permissions
<span className="text-[var(--error)] font-semibold ml-1">(CAUTION)</span>
<Tooltip text="When enabled, Claude runs with --dangerously-skip-permissions and auto-approves all tool calls without prompting. Only enable this if you trust the sandboxed environment to contain all actions. When disabled, Claude will ask for your approval before running commands, editing files, etc." />
</label>
<button
onClick={async () => {
try {
await update({ ...project, full_permissions: !project.full_permissions });
} catch (err) {
console.error("Failed to update full permissions setting:", err);
}
}}
disabled={!isStopped}
className={`px-2 py-0.5 text-xs rounded transition-colors disabled:opacity-50 ${
project.full_permissions
? "bg-[var(--error)] text-white"
: "bg-[var(--bg-primary)] border border-[var(--border-color)] text-[var(--text-secondary)]"
}`}
>
{project.full_permissions ? "ON" : "OFF"}
</button>
</div>
{/* Environment Variables */}
<div className="flex items-center justify-between">
<label className="text-xs text-[var(--text-secondary)]">
Environment Variables{envVars.length > 0 && ` (${envVars.length})`}
Environment Variables{envVars.length > 0 && ` (${envVars.length})`}<Tooltip text="Custom env vars injected into this project's container. Useful for API keys or tool configuration." />
</label>
<button
onClick={() => setShowEnvVarsModal(true)}
@@ -727,7 +754,7 @@ export default function ProjectCard({ project }: Props) {
{/* Port Mappings */}
<div className="flex items-center justify-between">
<label className="text-xs text-[var(--text-secondary)]">
Port Mappings{portMappings.length > 0 && ` (${portMappings.length})`}
Port Mappings{portMappings.length > 0 && ` (${portMappings.length})`}<Tooltip text="Map container ports to host ports so you can access dev servers running inside the container." />
</label>
<button
onClick={() => setShowPortMappingsModal(true)}
@@ -740,7 +767,7 @@ export default function ProjectCard({ project }: Props) {
{/* Claude Instructions */}
<div className="flex items-center justify-between">
<label className="text-xs text-[var(--text-secondary)]">
Claude Instructions{claudeInstructions ? " (set)" : ""}
Claude Instructions{claudeInstructions ? " (set)" : ""}<Tooltip text="Project-specific instructions written to CLAUDE.md. Guides Claude's behavior for this project." />
</label>
<button
onClick={() => setShowClaudeInstructionsModal(true)}
@@ -753,7 +780,7 @@ export default function ProjectCard({ project }: Props) {
{/* MCP Servers */}
{mcpServers.length > 0 && (
<div>
<label className="block text-xs text-[var(--text-secondary)] mb-1">MCP Servers</label>
<label className="block text-xs text-[var(--text-secondary)] mb-1">MCP Servers<Tooltip text="Model Context Protocol servers give Claude access to external tools and data sources." /></label>
<div className="space-y-1">
{mcpServers.map((server) => {
const enabled = project.enabled_mcp_servers.includes(server.id);
@@ -794,7 +821,7 @@ export default function ProjectCard({ project }: Props) {
)}
{/* Bedrock config */}
{project.auth_mode === "bedrock" && (() => {
{project.backend === "bedrock" && (() => {
const bc = project.bedrock_config ?? defaultBedrockConfig;
const inputCls = "w-full px-2 py-1 bg-[var(--bg-primary)] border border-[var(--border-color)] rounded text-xs text-[var(--text-primary)] focus:outline-none focus:border-[var(--accent)] disabled:opacity-50";
return (
@@ -819,7 +846,7 @@ export default function ProjectCard({ project }: Props) {
{/* AWS Region (always shown) */}
<div>
<label className="block text-xs text-[var(--text-secondary)] mb-0.5">AWS Region</label>
<label className="block text-xs text-[var(--text-secondary)] mb-0.5">AWS Region<Tooltip text="The AWS region where your Bedrock endpoint is available (e.g. us-east-1)." /></label>
<input
value={bedrockRegion}
onChange={(e) => setBedrockRegion(e.target.value)}
@@ -834,7 +861,7 @@ export default function ProjectCard({ project }: Props) {
{bc.auth_method === "static_credentials" && (
<>
<div>
<label className="block text-xs text-[var(--text-secondary)] mb-0.5">Access Key ID</label>
<label className="block text-xs text-[var(--text-secondary)] mb-0.5">Access Key ID<Tooltip text="Your AWS IAM access key ID for Bedrock API authentication." /></label>
<input
value={bedrockAccessKeyId}
onChange={(e) => setBedrockAccessKeyId(e.target.value)}
@@ -845,7 +872,7 @@ export default function ProjectCard({ project }: Props) {
/>
</div>
<div>
<label className="block text-xs text-[var(--text-secondary)] mb-0.5">Secret Access Key</label>
<label className="block text-xs text-[var(--text-secondary)] mb-0.5">Secret Access Key<Tooltip text="Your AWS IAM secret key. Stored locally and injected as an env var into the container." /></label>
<input
type="password"
value={bedrockSecretKey}
@@ -856,7 +883,7 @@ export default function ProjectCard({ project }: Props) {
/>
</div>
<div>
<label className="block text-xs text-[var(--text-secondary)] mb-0.5">Session Token (optional)</label>
<label className="block text-xs text-[var(--text-secondary)] mb-0.5">Session Token (optional)<Tooltip text="Temporary session token for assumed-role or MFA-based AWS credentials." /></label>
<input
type="password"
value={bedrockSessionToken}
@@ -872,7 +899,7 @@ export default function ProjectCard({ project }: Props) {
{/* Profile field */}
{bc.auth_method === "profile" && (
<div>
<label className="block text-xs text-[var(--text-secondary)] mb-0.5">AWS Profile</label>
<label className="block text-xs text-[var(--text-secondary)] mb-0.5">AWS Profile<Tooltip text="Named profile from your AWS config/credentials files (e.g. 'default' or 'prod')." /></label>
<input
value={bedrockProfile}
onChange={(e) => setBedrockProfile(e.target.value)}
@@ -887,7 +914,7 @@ export default function ProjectCard({ project }: Props) {
{/* Bearer token field */}
{bc.auth_method === "bearer_token" && (
<div>
<label className="block text-xs text-[var(--text-secondary)] mb-0.5">Bearer Token</label>
<label className="block text-xs text-[var(--text-secondary)] mb-0.5">Bearer Token<Tooltip text="An SSO or identity-center bearer token for Bedrock authentication." /></label>
<input
type="password"
value={bedrockBearerToken}
@@ -901,7 +928,7 @@ export default function ProjectCard({ project }: Props) {
{/* Model override */}
<div>
<label className="block text-xs text-[var(--text-secondary)] mb-0.5">Model ID (optional)</label>
<label className="block text-xs text-[var(--text-secondary)] mb-0.5">Model ID (optional)<Tooltip text="Override the default Bedrock model. Leave blank to use Claude's default." /></label>
<input
value={bedrockModelId}
onChange={(e) => setBedrockModelId(e.target.value)}
@@ -916,7 +943,7 @@ export default function ProjectCard({ project }: Props) {
})()}
{/* Ollama config */}
{project.auth_mode === "ollama" && (() => {
{project.backend === "ollama" && (() => {
const inputCls = "w-full px-2 py-1 bg-[var(--bg-primary)] border border-[var(--border-color)] rounded text-xs text-[var(--text-primary)] focus:outline-none focus:border-[var(--accent)] disabled:opacity-50";
return (
<div className="space-y-2 pt-1 border-t border-[var(--border-color)]">
@@ -926,7 +953,7 @@ export default function ProjectCard({ project }: Props) {
</p>
<div>
<label className="block text-xs text-[var(--text-secondary)] mb-0.5">Base URL</label>
<label className="block text-xs text-[var(--text-secondary)] mb-0.5">Base URL<Tooltip text="URL of your Ollama server. Use host.docker.internal to reach the host machine from inside the container." /></label>
<input
value={ollamaBaseUrl}
onChange={(e) => setOllamaBaseUrl(e.target.value)}
@@ -941,7 +968,7 @@ export default function ProjectCard({ project }: Props) {
</div>
<div>
<label className="block text-xs text-[var(--text-secondary)] mb-0.5">Model (optional)</label>
<label className="block text-xs text-[var(--text-secondary)] mb-0.5">Model (required)<Tooltip text="Ollama model name to use (e.g. qwen3.5:27b). The model must be pulled in Ollama before starting the container." /></label>
<input
value={ollamaModelId}
onChange={(e) => setOllamaModelId(e.target.value)}
@@ -955,38 +982,38 @@ export default function ProjectCard({ project }: Props) {
);
})()}
{/* LiteLLM config */}
{project.auth_mode === "lit_llm" && (() => {
{/* OpenAI Compatible config */}
{project.backend === "open_ai_compatible" && (() => {
const inputCls = "w-full px-2 py-1 bg-[var(--bg-primary)] border border-[var(--border-color)] rounded text-xs text-[var(--text-primary)] focus:outline-none focus:border-[var(--accent)] disabled:opacity-50";
return (
<div className="space-y-2 pt-1 border-t border-[var(--border-color)]">
<label className="block text-xs font-medium text-[var(--text-primary)]">LiteLLM Gateway</label>
<label className="block text-xs font-medium text-[var(--text-primary)]">OpenAI Compatible Endpoint</label>
<p className="text-xs text-[var(--text-secondary)]">
Connect through a LiteLLM proxy to use 100+ model providers.
Connect through any OpenAI API-compatible endpoint (LiteLLM, OpenRouter, vLLM, etc.).
</p>
<div>
<label className="block text-xs text-[var(--text-secondary)] mb-0.5">Base URL</label>
<label className="block text-xs text-[var(--text-secondary)] mb-0.5">Base URL<Tooltip text="URL of your OpenAI API-compatible server. Use host.docker.internal for a locally running service." /></label>
<input
value={litellmBaseUrl}
onChange={(e) => setLitellmBaseUrl(e.target.value)}
onBlur={handleLitellmBaseUrlBlur}
value={openaiCompatibleBaseUrl}
onChange={(e) => setOpenaiCompatibleBaseUrl(e.target.value)}
onBlur={handleOpenaiCompatibleBaseUrlBlur}
placeholder="http://host.docker.internal:4000"
disabled={!isStopped}
className={inputCls}
/>
<p className="text-xs text-[var(--text-secondary)] mt-0.5 opacity-70">
Use host.docker.internal for local, or a URL for remote/containerized LiteLLM.
Use host.docker.internal for local, or a URL for a remote OpenAI-compatible service.
</p>
</div>
<div>
<label className="block text-xs text-[var(--text-secondary)] mb-0.5">API Key</label>
<label className="block text-xs text-[var(--text-secondary)] mb-0.5">API Key<Tooltip text="Authentication key for your OpenAI-compatible endpoint, if required." /></label>
<input
type="password"
value={litellmApiKey}
onChange={(e) => setLitellmApiKey(e.target.value)}
onBlur={handleLitellmApiKeyBlur}
value={openaiCompatibleApiKey}
onChange={(e) => setOpenaiCompatibleApiKey(e.target.value)}
onBlur={handleOpenaiCompatibleApiKeyBlur}
placeholder="sk-..."
disabled={!isStopped}
className={inputCls}
@@ -994,11 +1021,11 @@ export default function ProjectCard({ project }: Props) {
</div>
<div>
<label className="block text-xs text-[var(--text-secondary)] mb-0.5">Model (optional)</label>
<label className="block text-xs text-[var(--text-secondary)] mb-0.5">Model (optional)<Tooltip text="Model identifier as configured in your provider (e.g. gpt-4o, gemini-pro)." /></label>
<input
value={litellmModelId}
onChange={(e) => setLitellmModelId(e.target.value)}
onBlur={handleLitellmModelIdBlur}
value={openaiCompatibleModelId}
onChange={(e) => setOpenaiCompatibleModelId(e.target.value)}
onBlur={handleOpenaiCompatibleModelIdBlur}
placeholder="gpt-4o / gemini-pro / etc."
disabled={!isStopped}
className={inputCls}

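The rename from `auth_mode`/`lit_llm` to `backend`/`open_ai_compatible` above means older projects.json files can still carry the legacy field names; the commit message notes this is handled with serde aliases on the Rust side. As a sketch only, a TypeScript-side normalizer (hypothetical helper, not part of this diff) would look like:

```typescript
// Hypothetical normalizer for legacy projects.json entries.
// Field and value names mirror the rename in this diff; the helper
// itself is illustrative and not part of the actual codebase.
type LegacyProject = {
  auth_mode?: string;                 // old field name
  backend?: string;                   // new field name
  litellm_config?: unknown;           // old config key
  openai_compatible_config?: unknown; // new config key
};

function normalizeBackend(p: LegacyProject): LegacyProject {
  const backend = p.backend ?? p.auth_mode;
  return {
    ...p,
    // "lit_llm" was the old selector value for this backend
    backend: backend === "lit_llm" ? "open_ai_compatible" : backend,
    openai_compatible_config:
      p.openai_compatible_config ?? p.litellm_config,
  };
}
```

New-style entries pass through untouched, so the helper is safe to run unconditionally on load.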
View File

@@ -1,9 +1,9 @@
export default function ApiKeyInput() {
return (
<div>
<label className="block text-sm font-medium mb-1">Authentication</label>
<label className="block text-sm font-medium mb-1">Backend</label>
<p className="text-xs text-[var(--text-secondary)] mb-3">
Each project can use <strong>claude login</strong> (OAuth, run inside the terminal) or <strong>AWS Bedrock</strong>. Set auth mode per-project.
Each project can use <strong>claude login</strong> (OAuth, run inside the terminal), <strong>AWS Bedrock</strong>, <strong>Ollama</strong>, or an <strong>OpenAI-compatible</strong> endpoint. Set backend per-project.
</p>
</div>
);

View File

@@ -1,6 +1,7 @@
import { useState, useEffect } from "react";
import { useSettings } from "../../hooks/useSettings";
import * as commands from "../../lib/tauri-commands";
import Tooltip from "../ui/Tooltip";
export default function AwsSettings() {
const { appSettings, saveSettings } = useSettings();
@@ -56,7 +57,7 @@ export default function AwsSettings() {
{/* AWS Config Path */}
<div>
<span className="text-[var(--text-secondary)] text-xs block mb-1">AWS Config Path</span>
<span className="text-[var(--text-secondary)] text-xs block mb-1">AWS Config Path<Tooltip text="Path to your AWS config/credentials directory. Mounted into containers for Bedrock auth." /></span>
<div className="flex gap-2">
<input
type="text"
@@ -80,7 +81,7 @@ export default function AwsSettings() {
{/* AWS Profile */}
<div>
<span className="text-[var(--text-secondary)] text-xs block mb-1">Default Profile</span>
<span className="text-[var(--text-secondary)] text-xs block mb-1">Default Profile<Tooltip text="AWS named profile to use by default. Per-project settings can override this." /></span>
<select
value={globalAws.aws_profile ?? ""}
onChange={(e) => handleChange("aws_profile", e.target.value)}
@@ -95,7 +96,7 @@ export default function AwsSettings() {
{/* AWS Region */}
<div>
<span className="text-[var(--text-secondary)] text-xs block mb-1">Default Region</span>
<span className="text-[var(--text-secondary)] text-xs block mb-1">Default Region<Tooltip text="Default AWS region for Bedrock API calls (e.g. us-east-1). Can be overridden per project." /></span>
<input
type="text"
value={globalAws.aws_region ?? ""}

View File

@@ -2,6 +2,7 @@ import { useState } from "react";
import { useDocker } from "../../hooks/useDocker";
import { useSettings } from "../../hooks/useSettings";
import type { ImageSource } from "../../lib/types";
import Tooltip from "../ui/Tooltip";
const REGISTRY_IMAGE = "repo.anhonesthost.net/cybercovellc/triple-c/triple-c-sandbox:latest";
@@ -87,7 +88,7 @@ export default function DockerSettings() {
{/* Image Source Selector */}
<div>
<span className="text-[var(--text-secondary)] text-xs block mb-1.5">Image Source</span>
<span className="text-[var(--text-secondary)] text-xs block mb-1.5">Image Source<Tooltip text="Registry pulls the pre-built image. Local Build compiles from the bundled Dockerfile. Custom lets you specify any image." /></span>
<div className="flex gap-1">
{IMAGE_SOURCE_OPTIONS.map((opt) => (
<button
@@ -109,7 +110,7 @@ export default function DockerSettings() {
{/* Custom image input */}
{imageSource === "custom" && (
<div>
<span className="text-[var(--text-secondary)] text-xs block mb-1">Custom Image</span>
<span className="text-[var(--text-secondary)] text-xs block mb-1">Custom Image<Tooltip text="Full image name including registry and tag (e.g. myregistry.com/image:tag)." /></span>
<input
type="text"
value={customInput}
@@ -121,9 +122,9 @@ export default function DockerSettings() {
)}
{/* Resolved image display */}
<div className="flex items-center justify-between">
<div>
<span className="text-[var(--text-secondary)]">Image</span>
<span className="text-xs text-[var(--text-secondary)] truncate max-w-[200px]" title={resolvedImageName}>
<span className="block text-xs text-[var(--text-secondary)] font-mono mt-0.5 truncate" title={resolvedImageName}>
{resolvedImageName}
</span>
</div>

View File

@@ -0,0 +1,91 @@
import { useEffect, useRef, useCallback } from "react";
import type { ImageUpdateInfo } from "../../lib/types";
interface Props {
imageUpdateInfo: ImageUpdateInfo;
onDismiss: () => void;
onClose: () => void;
}
export default function ImageUpdateDialog({
imageUpdateInfo,
onDismiss,
onClose,
}: Props) {
const overlayRef = useRef<HTMLDivElement>(null);
useEffect(() => {
const handleKeyDown = (e: KeyboardEvent) => {
if (e.key === "Escape") onClose();
};
document.addEventListener("keydown", handleKeyDown);
return () => document.removeEventListener("keydown", handleKeyDown);
}, [onClose]);
const handleOverlayClick = useCallback(
(e: React.MouseEvent<HTMLDivElement>) => {
if (e.target === overlayRef.current) onClose();
},
[onClose],
);
const shortDigest = (digest: string) => {
// Show first 16 chars of the hash part (after "sha256:")
const hash = digest.startsWith("sha256:") ? digest.slice(7) : digest;
return hash.slice(0, 16);
};
return (
<div
ref={overlayRef}
onClick={handleOverlayClick}
className="fixed inset-0 bg-black/50 flex items-center justify-center z-50"
>
<div className="bg-[var(--bg-secondary)] border border-[var(--border-color)] rounded-lg p-6 w-[28rem] max-h-[80vh] overflow-y-auto shadow-xl">
<h2 className="text-lg font-semibold mb-3">Container Image Update</h2>
<p className="text-sm text-[var(--text-secondary)] mb-4">
A newer version of the container image is available in the registry.
Re-pull the image in Docker settings to get the latest tools and fixes.
</p>
<div className="space-y-2 mb-4 text-xs bg-[var(--bg-primary)] rounded p-3 border border-[var(--border-color)]">
{imageUpdateInfo.local_digest && (
<div className="flex justify-between">
<span className="text-[var(--text-secondary)]">Local digest</span>
<span className="font-mono text-[var(--text-primary)]">
{shortDigest(imageUpdateInfo.local_digest)}...
</span>
</div>
)}
<div className="flex justify-between">
<span className="text-[var(--text-secondary)]">Remote digest</span>
<span className="font-mono text-[var(--accent)]">
{shortDigest(imageUpdateInfo.remote_digest)}...
</span>
</div>
</div>
<p className="text-xs text-[var(--text-secondary)] mb-4">
Go to Settings &gt; Docker and click &quot;Re-pull Image&quot; to update.
Running containers will not be affected until restarted.
</p>
<div className="flex items-center justify-end gap-2">
<button
onClick={onDismiss}
className="px-3 py-1.5 text-xs text-[var(--text-secondary)] hover:text-[var(--text-primary)] transition-colors"
>
Dismiss
</button>
<button
onClick={onClose}
className="px-3 py-1.5 text-xs bg-[var(--bg-tertiary)] border border-[var(--border-color)] rounded hover:bg-[var(--border-color)] transition-colors"
>
Close
</button>
</div>
</div>
</div>
);
}
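The `shortDigest` helper above trims a registry digest for display. Lifted out of the component, the same logic can be checked in isolation:

```typescript
// Same truncation logic as ImageUpdateDialog's shortDigest helper:
// strip the "sha256:" prefix if present, then keep the first 16 chars.
function shortDigest(digest: string): string {
  const hash = digest.startsWith("sha256:") ? digest.slice(7) : digest;
  return hash.slice(0, 16);
}

shortDigest("sha256:0123456789abcdef0123456789abcdef"); // → "0123456789abcdef"
```

Digests without the `sha256:` prefix are truncated as-is, so the dialog renders sensibly even for non-standard digest strings.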

View File

@@ -1,5 +1,4 @@
import { useState, useEffect } from "react";
import ApiKeyInput from "./ApiKeyInput";
import DockerSettings from "./DockerSettings";
import AwsSettings from "./AwsSettings";
import { useSettings } from "../../hooks/useSettings";
@@ -8,10 +7,11 @@ import ClaudeInstructionsModal from "../projects/ClaudeInstructionsModal";
import EnvVarsModal from "../projects/EnvVarsModal";
import { detectHostTimezone } from "../../lib/tauri-commands";
import type { EnvVar } from "../../lib/types";
import Tooltip from "../ui/Tooltip";
export default function SettingsPanel() {
const { appSettings, saveSettings } = useSettings();
const { appVersion, checkForUpdates } = useUpdates();
const { appVersion, imageUpdateInfo, checkForUpdates, checkImageUpdate } = useUpdates();
const [globalInstructions, setGlobalInstructions] = useState(appSettings?.global_claude_instructions ?? "");
const [globalEnvVars, setGlobalEnvVars] = useState<EnvVar[]>(appSettings?.global_custom_env_vars ?? []);
const [checkingUpdates, setCheckingUpdates] = useState(false);
@@ -39,7 +39,7 @@ export default function SettingsPanel() {
const handleCheckNow = async () => {
setCheckingUpdates(true);
try {
await checkForUpdates();
await Promise.all([checkForUpdates(), checkImageUpdate()]);
} finally {
setCheckingUpdates(false);
}
@@ -55,13 +55,12 @@ export default function SettingsPanel() {
<h2 className="text-xs font-semibold uppercase text-[var(--text-secondary)]">
Settings
</h2>
<ApiKeyInput />
<DockerSettings />
<AwsSettings />
{/* Container Timezone */}
<div>
<label className="block text-sm font-medium mb-1">Container Timezone</label>
<label className="block text-sm font-medium mb-1">Container Timezone<Tooltip text="Sets the timezone inside containers. Affects scheduled task timing and log timestamps." /></label>
<p className="text-xs text-[var(--text-secondary)] mb-1.5">
Timezone for containers affects scheduled task timing (IANA format, e.g. America/New_York)
</p>
@@ -81,7 +80,7 @@ export default function SettingsPanel() {
{/* Global Claude Instructions */}
<div>
<label className="block text-sm font-medium mb-1">Claude Instructions</label>
<label className="block text-sm font-medium mb-1">Claude Instructions<Tooltip text="Global instructions applied to all projects. Written to ~/.claude/CLAUDE.md in every container." /></label>
<p className="text-xs text-[var(--text-secondary)] mb-1.5">
Global instructions applied to all projects (written to ~/.claude/CLAUDE.md in containers)
</p>
@@ -100,7 +99,7 @@ export default function SettingsPanel() {
{/* Global Environment Variables */}
<div>
<label className="block text-sm font-medium mb-1">Global Environment Variables</label>
<label className="block text-sm font-medium mb-1">Global Environment Variables<Tooltip text="Env vars injected into all containers. Per-project vars with the same key take precedence." /></label>
<p className="text-xs text-[var(--text-secondary)] mb-1.5">
Applied to all project containers. Per-project variables override global ones with the same key.
</p>
@@ -119,7 +118,7 @@ export default function SettingsPanel() {
{/* Updates section */}
<div>
<label className="block text-sm font-medium mb-2">Updates</label>
<label className="block text-sm font-medium mb-2">Updates<Tooltip text="Check for new versions of the Triple-C app and container image." /></label>
<div className="space-y-2">
{appVersion && (
<p className="text-xs text-[var(--text-secondary)]">
@@ -146,6 +145,12 @@ export default function SettingsPanel() {
>
{checkingUpdates ? "Checking..." : "Check now"}
</button>
{imageUpdateInfo && (
<div className="flex items-center gap-2 px-3 py-2 text-xs bg-[var(--bg-primary)] border border-[var(--warning,#f59e0b)] rounded">
<span className="inline-block w-2 h-2 rounded-full bg-[var(--warning,#f59e0b)]" />
<span>A newer container image is available. Re-pull the image in Docker settings above to update.</span>
</div>
)}
</div>
</div>

View File

@@ -24,6 +24,7 @@ export default function TerminalView({ sessionId, active }: Props) {
const webglRef = useRef<WebglAddon | null>(null);
const detectorRef = useRef<UrlDetector | null>(null);
const { sendInput, pasteImage, resize, onOutput, onExit } = useTerminal();
const setTerminalHasSelection = useAppState(s => s.setTerminalHasSelection);
const ssoBufferRef = useRef("");
const ssoTriggeredRef = useRef(false);
@@ -80,6 +81,22 @@ export default function TerminalView({ sessionId, active }: Props) {
term.open(containerRef.current);
// Ctrl+Shift+C copies selected terminal text to clipboard.
// This prevents the keystroke from reaching the container (where
// Ctrl+C would send SIGINT and cancel running work).
term.attachCustomKeyEventHandler((event) => {
if (event.type === "keydown" && event.ctrlKey && event.shiftKey && event.key === "C") {
const sel = term.getSelection();
if (sel) {
navigator.clipboard.writeText(sel).catch((e) =>
console.error("Ctrl+Shift+C clipboard write failed:", e),
);
}
return false; // prevent xterm from processing this key
}
return true;
});
// WebGL addon is loaded/disposed dynamically in the active effect
// to avoid exhausting the browser's limited WebGL context pool.
@@ -120,6 +137,11 @@ export default function TerminalView({ sessionId, active }: Props) {
setIsAtBottom(buf.viewportY >= buf.baseY);
});
// Track text selection to show copy hint in status bar
const selectionDisposable = term.onSelectionChange(() => {
setTerminalHasSelection(term.hasSelection());
});
// Handle image paste: intercept paste events with image data,
// upload to the container, and inject the file path into terminal input.
const handlePaste = (e: ClipboardEvent) => {
@@ -222,6 +244,8 @@ export default function TerminalView({ sessionId, active }: Props) {
osc52Disposable.dispose();
inputDisposable.dispose();
scrollDisposable.dispose();
selectionDisposable.dispose();
setTerminalHasSelection(false);
containerRef.current?.removeEventListener("paste", handlePaste, { capture: true });
outputPromise.then((fn) => fn?.());
exitPromise.then((fn) => fn?.());
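The key-interception rule above reduces to a pure predicate, sketched here standalone (assumption: the event shape mirrors the DOM `KeyboardEvent` fields xterm passes through; `isCopyShortcut` is a hypothetical helper, not part of the diff):

```typescript
// Minimal shape of the fields the handler inspects.
interface KeyLike {
  type: string;
  ctrlKey: boolean;
  shiftKey: boolean;
  key: string;
}

// True when the keystroke is Ctrl+Shift+C on keydown — the case where the
// handler copies the selection and returns false so xterm never forwards
// the key to the container (where Ctrl+C would send SIGINT).
function isCopyShortcut(event: KeyLike): boolean {
  return (
    event.type === "keydown" &&
    event.ctrlKey &&
    event.shiftKey &&
    event.key === "C"
  );
}
```

Note that `event.key` is `"C"` (uppercase) when Shift is held, so no extra case normalization is needed.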


@@ -0,0 +1,78 @@
import { useState, useRef, useLayoutEffect, type ReactNode } from "react";
import { createPortal } from "react-dom";
interface TooltipProps {
text: string;
children?: ReactNode;
}
/**
* A small circled question-mark icon that shows a tooltip on hover.
* Uses a portal to render at `document.body` so the tooltip is never
* clipped by ancestor `overflow: hidden` containers.
*/
export default function Tooltip({ text, children }: TooltipProps) {
const [visible, setVisible] = useState(false);
const [coords, setCoords] = useState({ top: 0, left: 0 });
const [, setPlacement] = useState<"top" | "bottom">("top");
const triggerRef = useRef<HTMLSpanElement>(null);
const tooltipRef = useRef<HTMLDivElement>(null);
useLayoutEffect(() => {
if (!visible || !triggerRef.current || !tooltipRef.current) return;
const trigger = triggerRef.current.getBoundingClientRect();
const tooltip = tooltipRef.current.getBoundingClientRect();
const gap = 6;
// Vertical: prefer above, fall back to below
const above = trigger.top - tooltip.height - gap >= 4;
const pos = above ? "top" : "bottom";
setPlacement(pos);
const top =
pos === "top"
? trigger.top - tooltip.height - gap
: trigger.bottom + gap;
// Horizontal: center on trigger, clamp to viewport
let left = trigger.left + trigger.width / 2 - tooltip.width / 2;
left = Math.max(4, Math.min(left, window.innerWidth - tooltip.width - 4));
setCoords({ top, left });
}, [visible]);
return (
<span
ref={triggerRef}
className="inline-flex items-center ml-1"
onMouseEnter={() => setVisible(true)}
onMouseLeave={() => setVisible(false)}
>
{children ?? (
<span
className="inline-flex items-center justify-center w-3.5 h-3.5 rounded-full border border-[var(--text-secondary)] text-[var(--text-secondary)] text-[9px] leading-none cursor-help select-none hover:border-[var(--accent)] hover:text-[var(--accent)] transition-colors"
aria-label="Help"
>
?
</span>
)}
{visible &&
createPortal(
<div
ref={tooltipRef}
style={{
position: "fixed",
top: coords.top,
left: coords.left,
zIndex: 9999,
}}
className={`px-2.5 py-1.5 text-[11px] leading-snug text-[var(--text-primary)] bg-[var(--bg-tertiary)] border border-[var(--border-color)] rounded shadow-lg whitespace-normal max-w-[280px] w-max pointer-events-none`}
>
{text}
</div>,
document.body
)}
</span>
);
}
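The placement logic in the `useLayoutEffect` above can be isolated as a pure function, sketched here for clarity (assumption: `place` is a hypothetical helper operating on plain rects rather than `DOMRect`s; the component itself keeps the math inline):

```typescript
// Plain-rect shape of the trigger's bounding box.
interface Rect {
  top: number;
  bottom: number;
  left: number;
  width: number;
  height: number;
}

// Prefer rendering above the trigger; fall back to below when there is
// less than tooltip.height + gap + 4px of headroom. Horizontally center
// on the trigger, then clamp so the tooltip stays 4px inside the viewport.
function place(
  trigger: Rect,
  tooltip: { width: number; height: number },
  viewportWidth: number,
  gap = 6,
) {
  const above = trigger.top - tooltip.height - gap >= 4;
  const top = above ? trigger.top - tooltip.height - gap : trigger.bottom + gap;
  let left = trigger.left + trigger.width / 2 - tooltip.width / 2;
  left = Math.max(4, Math.min(left, viewportWidth - tooltip.width - 4));
  return { top, left, placement: above ? "top" : "bottom" };
}
```

Because the tooltip renders through a portal with `position: fixed`, the viewport-relative values from `getBoundingClientRect()` can be used directly without adding scroll offsets.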


@@ -6,16 +6,25 @@ import * as commands from "../lib/tauri-commands";
const CHECK_INTERVAL_MS = 24 * 60 * 60 * 1000; // 24 hours
export function useUpdates() {
-const { updateInfo, setUpdateInfo, appVersion, setAppVersion, appSettings } =
-useAppState(
-useShallow((s) => ({
-updateInfo: s.updateInfo,
-setUpdateInfo: s.setUpdateInfo,
-appVersion: s.appVersion,
-setAppVersion: s.setAppVersion,
-appSettings: s.appSettings,
-})),
-);
+const {
+updateInfo,
+setUpdateInfo,
+imageUpdateInfo,
+setImageUpdateInfo,
+appVersion,
+setAppVersion,
+appSettings,
+} = useAppState(
+useShallow((s) => ({
+updateInfo: s.updateInfo,
+setUpdateInfo: s.setUpdateInfo,
+imageUpdateInfo: s.imageUpdateInfo,
+setImageUpdateInfo: s.setImageUpdateInfo,
+appVersion: s.appVersion,
+setAppVersion: s.setAppVersion,
+appSettings: s.appSettings,
+})),
+);
const intervalRef = useRef<ReturnType<typeof setInterval> | null>(null);
@@ -47,11 +56,31 @@ export function useUpdates() {
}
}, [setUpdateInfo, appSettings?.dismissed_update_version]);
const checkImageUpdate = useCallback(async () => {
try {
const info = await commands.checkImageUpdate();
if (info) {
// Respect dismissed image digest
const dismissed = appSettings?.dismissed_image_digest;
if (dismissed && dismissed === info.remote_digest) {
setImageUpdateInfo(null);
return null;
}
}
setImageUpdateInfo(info);
return info;
} catch (e) {
console.error("Failed to check for image updates:", e);
return null;
}
}, [setImageUpdateInfo, appSettings?.dismissed_image_digest]);
const startPeriodicCheck = useCallback(() => {
if (intervalRef.current) return;
intervalRef.current = setInterval(() => {
if (appSettings?.auto_check_updates !== false) {
checkForUpdates();
checkImageUpdate();
}
}, CHECK_INTERVAL_MS);
return () => {
@@ -60,13 +89,15 @@ export function useUpdates() {
intervalRef.current = null;
}
};
-}, [checkForUpdates, appSettings?.auto_check_updates]);
+}, [checkForUpdates, checkImageUpdate, appSettings?.auto_check_updates]);
return {
updateInfo,
imageUpdateInfo,
appVersion,
loadVersion,
checkForUpdates,
checkImageUpdate,
startPeriodicCheck,
};
}
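The dismissal check inside `checkImageUpdate` above is worth seeing in isolation — sketched here as a hypothetical pure helper (not part of the hook): a remote digest the user has already dismissed is treated as "no update".

```typescript
// Only the field the dismissal rule reads.
interface ImageInfoLike {
  remote_digest: string;
}

// Returns null (no banner) when the fetched remote digest matches the
// digest the user previously dismissed; otherwise passes the info through.
function filterDismissed<T extends ImageInfoLike>(
  info: T | null,
  dismissedDigest: string | null | undefined,
): T | null {
  if (info && dismissedDigest && dismissedDigest === info.remote_digest) {
    return null;
  }
  return info;
}
```

Comparing digests rather than tags means the banner reappears only when the registry actually publishes new image content, even though the tag name stays the same.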


@@ -53,3 +53,135 @@ body {
to { opacity: 1; transform: translate(-50%, 0); }
}
.animate-slide-down { animation: slide-down 0.2s ease-out; }
/* Help dialog content styles */
.help-content {
font-size: 0.8125rem;
line-height: 1.6;
color: var(--text-primary);
}
.help-content .help-h1 {
font-size: 1.5rem;
font-weight: 700;
margin: 0 0 1rem 0;
color: var(--text-primary);
}
.help-content .help-h2 {
font-size: 1.15rem;
font-weight: 600;
margin: 1.5rem 0 0.75rem 0;
padding-bottom: 0.375rem;
border-bottom: 1px solid var(--border-color);
color: var(--text-primary);
}
.help-content .help-h3 {
font-size: 0.95rem;
font-weight: 600;
margin: 1.25rem 0 0.5rem 0;
color: var(--text-primary);
}
.help-content .help-h4 {
font-size: 0.875rem;
font-weight: 600;
margin: 1rem 0 0.375rem 0;
color: var(--text-secondary);
}
.help-content .help-p {
margin: 0.5rem 0;
}
.help-content .help-ul,
.help-content .help-ol {
margin: 0.5rem 0;
padding-left: 1.5rem;
}
.help-content .help-ul {
list-style-type: disc;
}
.help-content .help-ol {
list-style-type: decimal;
}
.help-content .help-ul li,
.help-content .help-ol li {
margin: 0.25rem 0;
}
.help-content .help-code-block {
display: block;
background: var(--bg-primary);
border: 1px solid var(--border-color);
border-radius: 6px;
padding: 0.75rem 1rem;
margin: 0.5rem 0;
overflow-x: auto;
font-family: "SFMono-Regular", Consolas, "Liberation Mono", Menlo, monospace;
font-size: 0.75rem;
line-height: 1.5;
white-space: pre;
}
.help-content .help-inline-code {
background: var(--bg-tertiary);
border: 1px solid var(--border-color);
border-radius: 3px;
padding: 0.125rem 0.375rem;
font-family: "SFMono-Regular", Consolas, "Liberation Mono", Menlo, monospace;
font-size: 0.75rem;
}
.help-content .help-table {
width: 100%;
border-collapse: collapse;
margin: 0.5rem 0;
font-size: 0.75rem;
}
.help-content .help-table th,
.help-content .help-table td {
border: 1px solid var(--border-color);
padding: 0.375rem 0.625rem;
text-align: left;
}
.help-content .help-table th {
background: var(--bg-tertiary);
font-weight: 600;
}
.help-content .help-table td {
background: var(--bg-primary);
}
.help-content .help-blockquote {
border-left: 3px solid var(--accent);
background: var(--bg-primary);
margin: 0.5rem 0;
padding: 0.5rem 0.75rem;
border-radius: 0 4px 4px 0;
color: var(--text-secondary);
font-size: 0.75rem;
}
.help-content .help-hr {
border: none;
border-top: 1px solid var(--border-color);
margin: 1.5rem 0;
}
.help-content .help-link {
color: var(--accent);
text-decoration: none;
}
.help-content .help-link:hover {
color: var(--accent-hover);
text-decoration: underline;
}


@@ -1,5 +1,5 @@
import { invoke } from "@tauri-apps/api/core";
-import type { Project, ProjectPath, ContainerInfo, SiblingContainer, AppSettings, UpdateInfo, McpServer, FileEntry } from "./types";
+import type { Project, ProjectPath, ContainerInfo, SiblingContainer, AppSettings, UpdateInfo, ImageUpdateInfo, McpServer, FileEntry } from "./types";
// Docker
export const checkDocker = () => invoke<boolean>("check_docker");
@@ -83,3 +83,8 @@ export const uploadFileToContainer = (projectId: string, hostPath: string, conta
export const getAppVersion = () => invoke<string>("get_app_version");
export const checkForUpdates = () =>
invoke<UpdateInfo | null>("check_for_updates");
export const checkImageUpdate = () =>
invoke<ImageUpdateInfo | null>("check_image_update");
// Help
export const getHelpContent = () => invoke<string>("get_help_content");


@@ -20,12 +20,13 @@ export interface Project {
paths: ProjectPath[];
container_id: string | null;
status: ProjectStatus;
-auth_mode: AuthMode;
+backend: Backend;
bedrock_config: BedrockConfig | null;
ollama_config: OllamaConfig | null;
-litellm_config: LiteLlmConfig | null;
+openai_compatible_config: OpenAiCompatibleConfig | null;
allow_docker_access: boolean;
mission_control_enabled: boolean;
full_permissions: boolean;
ssh_key_path: string | null;
git_token: string | null;
git_user_name: string | null;
@@ -45,7 +46,7 @@ export type ProjectStatus =
| "stopping"
| "error";
-export type AuthMode = "anthropic" | "bedrock" | "ollama" | "lit_llm";
+export type Backend = "anthropic" | "bedrock" | "ollama" | "open_ai_compatible";
export type BedrockAuthMethod = "static_credentials" | "profile" | "bearer_token";
@@ -66,7 +67,7 @@ export interface OllamaConfig {
model_id: string | null;
}
-export interface LiteLlmConfig {
+export interface OpenAiCompatibleConfig {
base_url: string;
api_key: string | null;
model_id: string | null;
@@ -116,6 +117,7 @@ export interface AppSettings {
dismissed_update_version: string | null;
timezone: string | null;
default_microphone: string | null;
dismissed_image_digest: string | null;
}
export interface UpdateInfo {
@@ -133,6 +135,12 @@ export interface ReleaseAsset {
size: number;
}
export interface ImageUpdateInfo {
remote_digest: string;
local_digest: string | null;
remote_updated_at: string | null;
}
export type McpTransportType = "stdio" | "http";
export interface McpServer {
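The `lit_llm` → `open_ai_compatible` rename above is covered on the Rust side by serde aliases (per the commit message), so older `projects.json` files still load. A hedged sketch of the equivalent normalization if the frontend ever reads raw legacy values itself (`normalizeBackend` is a hypothetical helper, not in the diff):

```typescript
type Backend = "anthropic" | "bedrock" | "ollama" | "open_ai_compatible";

// Map the legacy serialized value onto the renamed variant; all current
// values pass through unchanged.
function normalizeBackend(raw: string): Backend {
  if (raw === "lit_llm") return "open_ai_compatible"; // pre-rename alias
  return raw as Backend;
}
```

The alias approach keeps old config files readable without a migration step, at the cost of the legacy string lingering in the accepted input set indefinitely.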


@@ -1,5 +1,5 @@
import { create } from "zustand";
-import type { Project, TerminalSession, AppSettings, UpdateInfo, McpServer } from "../lib/types";
+import type { Project, TerminalSession, AppSettings, UpdateInfo, ImageUpdateInfo, McpServer } from "../lib/types";
interface AppState {
// Projects
@@ -24,6 +24,8 @@ interface AppState {
removeMcpServerFromList: (id: string) => void;
// UI state
terminalHasSelection: boolean;
setTerminalHasSelection: (has: boolean) => void;
sidebarView: "projects" | "mcp" | "settings";
setSidebarView: (view: "projects" | "mcp" | "settings") => void;
dockerAvailable: boolean | null;
@@ -39,6 +41,10 @@ interface AppState {
setUpdateInfo: (info: UpdateInfo | null) => void;
appVersion: string;
setAppVersion: (version: string) => void;
// Image update info
imageUpdateInfo: ImageUpdateInfo | null;
setImageUpdateInfo: (info: ImageUpdateInfo | null) => void;
}
export const useAppState = create<AppState>((set) => ({
@@ -96,6 +102,8 @@ export const useAppState = create<AppState>((set) => ({
})),
// UI state
terminalHasSelection: false,
setTerminalHasSelection: (has) => set({ terminalHasSelection: has }),
sidebarView: "projects",
setSidebarView: (view) => set({ sidebarView: view }),
dockerAvailable: null,
@@ -111,4 +119,8 @@ export const useAppState = create<AppState>((set) => ({
setUpdateInfo: (info) => set({ updateInfo: info }),
appVersion: "",
setAppVersion: (version) => set({ appVersion: version }),
// Image update info
imageUpdateInfo: null,
setImageUpdateInfo: (info) => set({ imageUpdateInfo: info }),
}));
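The two slices added to the store reduce to shallow merges — exactly what zustand's `set` performs. A zustand-free sketch of the same transitions (hypothetical plain-object form, just to show the shape):

```typescript
interface ImageUpdateInfo {
  remote_digest: string;
  local_digest: string | null;
  remote_updated_at: string | null;
}

interface UiSlice {
  terminalHasSelection: boolean;
  imageUpdateInfo: ImageUpdateInfo | null;
}

// Each setter returns a new object with one field replaced; the previous
// state is never mutated, matching zustand's immutable-update contract.
function setTerminalHasSelection(state: UiSlice, has: boolean): UiSlice {
  return { ...state, terminalHasSelection: has };
}

function setImageUpdateInfo(state: UiSlice, info: ImageUpdateInfo | null): UiSlice {
  return { ...state, imageUpdateInfo: info };
}
```

Keeping `terminalHasSelection` in the global store (rather than local `TerminalView` state) is what lets the status bar, a sibling component, show the copy hint.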