Compare commits

1 commit: `v0.1.97-wi ... v0.1.98-ma` (commit `b6fd8a557e`)
@@ -33,6 +33,8 @@ You need access to Claude Code through one of:

- **Anthropic account** — Sign up at https://claude.ai and use `claude login` (OAuth) inside the terminal
- **AWS Bedrock** — An AWS account with Bedrock access and Claude models enabled
- **Ollama** — A local or remote Ollama server running an Anthropic-compatible model (best-effort support)
- **LiteLLM** — A LiteLLM proxy gateway providing access to 100+ model providers (best-effort support)

---
@@ -88,6 +90,20 @@ Claude Code launches automatically with `--dangerously-skip-permissions` inside

3. Expand the **Config** panel and fill in your AWS credentials (see [AWS Bedrock Configuration](#aws-bedrock-configuration) below).
4. Start the container again.

**Ollama:**

1. Stop the container first (settings can only be changed while stopped).
2. In the project card, switch the auth mode to **Ollama**.
3. Expand the **Config** panel and set the base URL of your Ollama server (defaults to `http://host.docker.internal:11434` for a local instance). Optionally set a model ID.
4. Start the container again.

**LiteLLM:**

1. Stop the container first (settings can only be changed while stopped).
2. In the project card, switch the auth mode to **LiteLLM**.
3. Expand the **Config** panel and set the base URL of your LiteLLM proxy (defaults to `http://host.docker.internal:4000`). Optionally set an API key and model ID.
4. Start the container again.
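The "stop the container first" precondition in the steps above can be sketched as a small state guard. This is a hypothetical illustration (the types `Project` and `ContainerState` are stand-ins, not Triple-C's actual code): settings edits are rejected unless the project's container is stopped, so the container always launches with a consistent set of credentials.

```rust
// Hypothetical sketch: reject settings changes while the container runs,
// mirroring step 1 ("Stop the container first") in the lists above.

#[derive(PartialEq)]
enum ContainerState {
    Stopped,
    Running,
}

struct Project {
    state: ContainerState,
}

fn update_auth_config(project: &Project) -> Result<(), String> {
    // Only a stopped container can safely pick up new credentials/env vars.
    if project.state != ContainerState::Stopped {
        return Err("stop the container before changing settings".into());
    }
    Ok(())
}

fn main() {
    let running = Project { state: ContainerState::Running };
    assert!(update_auth_config(&running).is_err());

    let stopped = Project { state: ContainerState::Stopped };
    assert!(update_auth_config(&stopped).is_ok());
}
```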
---

## The Interface
@@ -372,6 +388,41 @@ Per-project settings always override these global defaults.

---

## Ollama Configuration

To use Claude Code with a local or remote Ollama server, switch the auth mode to **Ollama** on the project card.

### Settings

- **Base URL** — The URL of your Ollama server. Defaults to `http://host.docker.internal:11434`, which reaches a locally running Ollama instance from inside the container. For a remote server, use its IP or hostname (e.g., `http://192.168.1.100:11434`).
- **Model ID** — Optional. Override the model to use (e.g., `qwen3.5:27b`).

### How It Works

Triple-C sets `ANTHROPIC_BASE_URL` to point Claude Code at your Ollama server instead of Anthropic's API. The `ANTHROPIC_AUTH_TOKEN` is set to `ollama` (required by Claude Code but not used for actual authentication).

> **Note:** Ollama support is best-effort. Claude Code is designed for Anthropic models, so some features (tool use, extended thinking, prompt caching, etc.) may not work as expected with non-Anthropic models.
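The environment described here can be sketched as follows. This is a minimal illustration with a hypothetical helper (`ollama_env` is not Triple-C's actual API): the base URL redirects Claude Code, and the token is a fixed placeholder because Ollama ignores it.

```rust
// Hypothetical helper assembling the Ollama-mode environment described above.

fn ollama_env(base_url: &str, model: Option<&str>) -> Vec<(String, String)> {
    let mut env = vec![
        ("ANTHROPIC_BASE_URL".to_string(), base_url.to_string()),
        // Claude Code requires a token to be present, but Ollama never checks it.
        ("ANTHROPIC_AUTH_TOKEN".to_string(), "ollama".to_string()),
    ];
    if let Some(m) = model {
        // Optional model override, e.g. "qwen3.5:27b" (assumed env var name).
        env.push(("ANTHROPIC_MODEL".to_string(), m.to_string()));
    }
    env
}

fn main() {
    let env = ollama_env("http://host.docker.internal:11434", None);
    assert_eq!(env[0].1, "http://host.docker.internal:11434");
    assert_eq!(env[1].1, "ollama");
    assert_eq!(env.len(), 2);
}
```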
---

## LiteLLM Configuration

To use Claude Code through a [LiteLLM](https://docs.litellm.ai/) proxy gateway, switch the auth mode to **LiteLLM** on the project card. LiteLLM supports 100+ model providers (OpenAI, Gemini, Anthropic, and more) through a single proxy.

### Settings

- **Base URL** — The URL of your LiteLLM proxy. Defaults to `http://host.docker.internal:4000` for a locally running proxy.
- **API Key** — Optional. The API key for your LiteLLM proxy, if authentication is required. Stored securely in your OS keychain.
- **Model ID** — Optional. Override the model to use.

### How It Works

Triple-C sets `ANTHROPIC_BASE_URL` to point Claude Code at your LiteLLM proxy. If an API key is provided, it is set as `ANTHROPIC_AUTH_TOKEN`.

> **Note:** LiteLLM support is best-effort. Claude Code is designed for Anthropic models, so some features (tool use, extended thinking, prompt caching, etc.) may not work as expected when routing to non-Anthropic models through the proxy.
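Unlike the Ollama case, the auth token here is conditional. A sketch with a hypothetical helper (`litellm_env` is illustrative, not Triple-C's actual API): `ANTHROPIC_BASE_URL` is always set, while `ANTHROPIC_AUTH_TOKEN` appears only when an API key was provided.

```rust
// Hypothetical helper for the LiteLLM wiring described above: the token
// variable is emitted only when the proxy requires authentication.

fn litellm_env(base_url: &str, api_key: Option<&str>) -> Vec<(String, String)> {
    let mut env = vec![("ANTHROPIC_BASE_URL".to_string(), base_url.to_string())];
    if let Some(key) = api_key {
        env.push(("ANTHROPIC_AUTH_TOKEN".to_string(), key.to_string()));
    }
    env
}

fn main() {
    // Without a key, only the base URL is exported.
    assert_eq!(litellm_env("http://host.docker.internal:4000", None).len(), 1);
    // With a key, the token is added as well.
    assert_eq!(
        litellm_env("http://host.docker.internal:4000", Some("sk-example")).len(),
        2
    );
}
```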
---
## Settings

Access global settings via the **Settings** tab in the sidebar.
@@ -49,6 +49,10 @@ Each project can independently use one of:

- **Anthropic** (OAuth): User runs `claude login` inside the terminal on first use. Token persisted in the config volume across restarts and resets.
- **AWS Bedrock**: Per-project AWS credentials (static keys, profile, or bearer token). For profile auth, SSO sessions are validated before Claude launches.
- **Ollama**: Connect to a local or remote Ollama server via `ANTHROPIC_BASE_URL` (e.g., `http://host.docker.internal:11434`). Optional model override.
- **LiteLLM**: Connect through a LiteLLM proxy gateway via `ANTHROPIC_BASE_URL` + `ANTHROPIC_AUTH_TOKEN` to access 100+ model providers. API key stored securely in the OS keychain.

> **Note:** Ollama and LiteLLM support is best-effort. Claude Code is designed for Anthropic models, so some features (tool use, extended thinking, prompt caching, etc.) may not work as expected with non-Anthropic models behind these backends.
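The four per-project modes above map naturally onto an enum. This is a hypothetical sketch (names and fields are illustrative, not Triple-C's actual types), showing how the "best-effort" distinction applies only to the two proxy-style backends:

```rust
// Illustrative enum for the four auth modes listed above.
#[allow(dead_code)]
enum AuthMode {
    Anthropic,                                              // OAuth via `claude login`
    Bedrock { region: String },                             // per-project AWS credentials
    Ollama { base_url: String },                            // ANTHROPIC_BASE_URL override
    LiteLlm { base_url: String, api_key: Option<String> },  // proxy gateway
}

// Only the proxy-style backends carry the best-effort caveat.
fn is_best_effort(mode: &AuthMode) -> bool {
    matches!(mode, AuthMode::Ollama { .. } | AuthMode::LiteLlm { .. })
}

fn main() {
    assert!(!is_best_effort(&AuthMode::Anthropic));
    assert!(is_best_effort(&AuthMode::Ollama {
        base_url: "http://host.docker.internal:11434".into(),
    }));
}
```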
### Container Spawning (Sibling Containers)
```diff
@@ -1143,11 +1143,6 @@ pub fn any_stdio_docker_mcp(servers: &[McpServer]) -> bool {
     servers.iter().any(|s| s.is_docker() && s.transport_type == McpTransportType::Stdio)
 }
 
-/// Returns true if any MCP server uses Docker.
-pub fn any_docker_mcp(servers: &[McpServer]) -> bool {
-    servers.iter().any(|s| s.is_docker())
-}
-
 /// Find an existing MCP container by its expected name.
 pub async fn find_mcp_container(server: &McpServer) -> Result<Option<String>, String> {
     let docker = get_docker()?;
```
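The surviving predicate can be exercised in isolation. Below is a self-contained sketch with minimal stand-in types (`McpServer` and `McpTransportType` here are simplified stubs; the real definitions live elsewhere in the crate):

```rust
// Stand-in types, reduced to the two fields the predicate inspects.
#[derive(PartialEq)]
enum McpTransportType {
    Stdio,
    Http,
}

struct McpServer {
    docker: bool,
    transport_type: McpTransportType,
}

impl McpServer {
    fn is_docker(&self) -> bool {
        self.docker
    }
}

// Same shape as the kept helper: true if any server is a Docker-backed
// stdio MCP server.
fn any_stdio_docker_mcp(servers: &[McpServer]) -> bool {
    servers
        .iter()
        .any(|s| s.is_docker() && s.transport_type == McpTransportType::Stdio)
}

fn main() {
    let servers = vec![
        McpServer { docker: true, transport_type: McpTransportType::Http },
        McpServer { docker: true, transport_type: McpTransportType::Stdio },
    ];
    assert!(any_stdio_docker_mcp(&servers));
    assert!(!any_stdio_docker_mcp(&[]));
}
```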
```diff
@@ -22,6 +22,7 @@ impl ExecSession {
         .map_err(|e| format!("Failed to send input: {}", e))
     }
 
+    #[allow(dead_code)]
     pub async fn resize(&self, cols: u16, rows: u16) -> Result<(), String> {
         let docker = get_docker()?;
         docker
```
```diff
@@ -4,8 +4,13 @@ pub mod image;
 pub mod exec;
 pub mod network;
 
+#[allow(unused_imports)]
 pub use client::*;
+#[allow(unused_imports)]
 pub use container::*;
+#[allow(unused_imports)]
 pub use image::*;
+#[allow(unused_imports)]
 pub use exec::*;
+#[allow(unused_imports)]
 pub use network::*;
```
```diff
@@ -48,6 +48,7 @@ pub async fn ensure_project_network(project_id: &str) -> Result<String, String>
 }
 
 /// Connect a container to the project network.
+#[allow(dead_code)]
 pub async fn connect_container_to_network(
     container_id: &str,
     network_name: &str,
```
```diff
@@ -3,7 +3,11 @@ pub mod secure;
 pub mod settings_store;
 pub mod mcp_store;
 
+#[allow(unused_imports)]
 pub use projects_store::*;
+#[allow(unused_imports)]
 pub use secure::*;
+#[allow(unused_imports)]
 pub use settings_store::*;
+#[allow(unused_imports)]
 pub use mcp_store::*;
```
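For context on the pattern these hunks apply: a glob re-export (`pub use module::*;`) whose items are not yet consumed elsewhere in the crate can trigger an `unused_imports` warning, and the attribute silences it per-item instead of crate-wide. A minimal, self-contained illustration (module and function names are hypothetical):

```rust
// Hypothetical module mirroring the re-export pattern in the hunks above.
mod store {
    pub mod settings_store {
        pub fn default_port() -> u16 {
            4000
        }
    }

    // Without the attribute, rustc may warn when nothing in the crate uses
    // the re-exported names yet.
    #[allow(unused_imports)]
    pub use self::settings_store::*;
}

fn main() {
    // The item is reachable both through the module path and the re-export.
    assert_eq!(store::settings_store::default_port(), store::default_port());
}
```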