4 changes: 2 additions & 2 deletions openhands/usage/cli/installation.mdx
@@ -65,8 +65,8 @@ description: Install the OpenHands CLI on your system
```bash
docker run -it \
--pull=always \
- -e AGENT_SERVER_IMAGE_REPOSITORY=docker.openhands.dev/openhands/runtime \
- -e AGENT_SERVER_IMAGE_TAG=1.2-nikolaik \
+ -e AGENT_SERVER_IMAGE_REPOSITORY=ghcr.io/openhands/agent-server \
+ -e AGENT_SERVER_IMAGE_TAG=1.10.0-python \
-e SANDBOX_USER_ID=$(id -u) \
-e SANDBOX_VOLUMES=$SANDBOX_VOLUMES \
-v /var/run/docker.sock:/var/run/docker.sock \
4 changes: 2 additions & 2 deletions openhands/usage/environment-variables.mdx
@@ -36,7 +36,7 @@
| `RUNTIME` | string | `"docker"` | Runtime environment (`docker`, `local`, `cli`, etc.) |
| `DEFAULT_AGENT` | string | `"CodeActAgent"` | Default agent class to use |
| `JWT_SECRET` | string | auto-generated | JWT secret for authentication |
| `RUN_AS_OPENHANDS` | boolean | `true` | Whether to run as the openhands user |
| `VOLUMES` | string | `""` | Volume mounts in format `host:container[:mode]` |

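For orientation, core variables like these are typically passed to the container when launching OpenHands. The following is a minimal sketch using values from the table above — the volume mount path is a placeholder, and the other flags required by the installation guide are omitted for brevity:

```bash
# Sketch: overriding a few core settings at launch (paths are placeholders,
# other required docker run flags omitted).
docker run -it --rm \
  -e RUNTIME=docker \
  -e DEFAULT_AGENT=CodeActAgent \
  -e VOLUMES="$HOME/projects:/workspace:rw" \
  docker.openhands.dev/openhands/openhands:1.3
```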
## LLM Configuration Variables
@@ -58,12 +58,12 @@
| `LLM_NUM_RETRIES` | integer | `8` | Number of retry attempts |
| `LLM_RETRY_MIN_WAIT` | integer | `15` | Minimum wait time between retries (seconds) |
| `LLM_RETRY_MAX_WAIT` | integer | `120` | Maximum wait time between retries (seconds) |
| `LLM_RETRY_MULTIPLIER` | float | `2.0` | Exponential backoff multiplier |
| `LLM_DROP_PARAMS` | boolean | `false` | Drop unsupported parameters without error |
| `LLM_CACHING_PROMPT` | boolean | `true` | Enable prompt caching if supported |
| `LLM_DISABLE_VISION` | boolean | `false` | Disable vision capabilities for cost reduction |
| `LLM_CUSTOM_LLM_PROVIDER` | string | `""` | Custom LLM provider name |
| `LLM_OLLAMA_BASE_URL` | string | `""` | Base URL for Ollama API |
| `LLM_INPUT_COST_PER_TOKEN` | float | `0.0` | Cost per input token |
| `LLM_OUTPUT_COST_PER_TOKEN` | float | `0.0` | Cost per output token |
| `LLM_REASONING_EFFORT` | string | `""` | Reasoning effort for o-series models (`low`, `medium`, `high`) |
@@ -87,7 +87,7 @@
| `AGENT_ENABLE_LLM_EDITOR` | boolean | `false` | Enable LLM-based editor |
| `AGENT_ENABLE_JUPYTER` | boolean | `false` | Enable Jupyter integration |
| `AGENT_ENABLE_HISTORY_TRUNCATION` | boolean | `true` | Enable history truncation |
| `AGENT_ENABLE_PROMPT_EXTENSIONS` | boolean | `true` | Enable skills (formerly known as microagents, also called prompt extensions) |
| `AGENT_DISABLED_MICROAGENTS` | list | `[]` | List of skills to disable |

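As a quick, hedged illustration, the agent toggles above can be exported before starting OpenHands; the values below are illustrative, not recommendations:

```bash
# Sketch: toggling optional agent features (names from the table above).
export AGENT_ENABLE_JUPYTER=true
export AGENT_ENABLE_LLM_EDITOR=false
export AGENT_ENABLE_PROMPT_EXTENSIONS=true
```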
## Sandbox Configuration Variables
@@ -107,11 +107,11 @@
| `SANDBOX_RUNTIME_STARTUP_ENV_VARS` | dict | `{}` | Environment variables for runtime |
| `SANDBOX_BROWSERGYM_EVAL_ENV` | string | `""` | BrowserGym evaluation environment |
| `SANDBOX_VOLUMES` | string | `""` | Volume mounts (replaces deprecated workspace settings) |
- | `AGENT_SERVER_IMAGE_REPOSITORY` | string | `""` | Runtime container image repository (e.g., `docker.openhands.dev/openhands/runtime`) |
- | `AGENT_SERVER_IMAGE_TAG` | string | `""` | Runtime container image tag (e.g., `1.2-nikolaik`) |
+ | `AGENT_SERVER_IMAGE_REPOSITORY` | string | `""` | Runtime container image repository (e.g., `ghcr.io/openhands/agent-server`) |
+ | `AGENT_SERVER_IMAGE_TAG` | string | `""` | Runtime container image tag (e.g., `1.10.0-python`) |
| `SANDBOX_KEEP_RUNTIME_ALIVE` | boolean | `false` | Keep runtime alive after session ends |
| `SANDBOX_PAUSE_CLOSED_RUNTIMES` | boolean | `false` | Pause instead of stopping closed runtimes |
| `SANDBOX_CLOSE_DELAY` | integer | `300` | Delay before closing idle runtimes (seconds) |
| `SANDBOX_RM_ALL_CONTAINERS` | boolean | `false` | Remove all containers when stopping |
| `SANDBOX_ENABLE_GPU` | boolean | `false` | Enable GPU support |
| `SANDBOX_CUDA_VISIBLE_DEVICES` | string | `""` | Specify GPU devices by ID |
@@ -178,7 +178,7 @@
| `ANTHROPIC_API_KEY` | string | `""` | Anthropic API key |
| `GOOGLE_API_KEY` | string | `""` | Google API key |
| `AZURE_API_KEY` | string | `""` | Azure API key |
| `TAVILY_API_KEY` | string | `""` | Tavily search API key |

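If it helps, provider keys can be supplied the same way as the variables above. A sketch with placeholder values, again omitting the other `docker run` flags covered in the installation guide:

```bash
# Sketch: supplying provider keys at launch (values are placeholders).
docker run -it --rm \
  -e ANTHROPIC_API_KEY="sk-ant-placeholder" \
  -e GOOGLE_API_KEY="google-key-placeholder" \
  docker.openhands.dev/openhands/openhands:1.3
```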
## Server Configuration Variables

10 changes: 4 additions & 6 deletions openhands/usage/llms/local-llms.mdx
@@ -1,6 +1,6 @@
---
title: Local LLMs
description: When using a local LLM, OpenHands may have limited functionality. It is highly recommended that you use GPUs to serve local models for an optimal experience.
---

## News
@@ -32,7 +32,7 @@

![image](./screenshots/01_lm_studio_open_model_hub.png)

3. Search for **"Qwen3-Coder-30B-A3B-Instruct"**, confirm you're downloading from the official Qwen publisher, then proceed to download.

![image](./screenshots/02_lm_studio_download_devstral.png)

@@ -50,7 +50,7 @@

![image](./screenshots/04_lm_studio_setup_devstral_part_1.png)

5. Enable the "Show advanced settings" switch at the bottom of the Model settings flyout to show all the available settings.
6. Set "Context Length" to at least 22000 (for lower VRAM systems) or 32768 (recommended for better performance) and enable Flash Attention.
7. Click "Load Model" to start loading the model.

@@ -68,24 +68,22 @@
1. Check [the installation guide](/openhands/usage/run-openhands/local-setup) and ensure all prerequisites are met before running OpenHands, then run:

```bash
- docker pull docker.openhands.dev/openhands/runtime:1.2-nikolaik
-
docker run -it --rm --pull=always \
- -e AGENT_SERVER_IMAGE_REPOSITORY=docker.openhands.dev/openhands/runtime \
- -e AGENT_SERVER_IMAGE_TAG=1.2-nikolaik \
+ -e AGENT_SERVER_IMAGE_REPOSITORY=ghcr.io/openhands/agent-server \
+ -e AGENT_SERVER_IMAGE_TAG=1.10.0-python \
-e LOG_ALL_EVENTS=true \
-v /var/run/docker.sock:/var/run/docker.sock \
-v ~/.openhands:/.openhands \
-p 3000:3000 \
--add-host host.docker.internal:host-gateway \
--name openhands-app \
- docker.openhands.dev/openhands/openhands:1.2
+ docker.openhands.dev/openhands/openhands:1.3
```

2. Wait until the server is running (see log below):
```
Digest: sha256:e72f9baecb458aedb9afc2cd5bc935118d1868719e55d50da73190d3a85c674f
- Status: Image is up to date for docker.openhands.dev/openhands/openhands:1.2
+ Status: Image is up to date for docker.openhands.dev/openhands/openhands:1.3
Starting OpenHands...
Running OpenHands as root
14:22:13 - openhands:INFO: server_config.py:50 - Using config class None
@@ -124,11 +122,11 @@

## Advanced: Alternative LLM Backends

This section describes how to run local LLMs with OpenHands using alternative backends like Ollama, SGLang, or vLLM — without relying on LM Studio.

### Create an OpenAI-Compatible Endpoint with Ollama

- Install Ollama following [the official documentation](https://ollama.com/download).
- Example launch command for Qwen3-Coder-30B-A3B-Instruct:

```bash
@@ -139,7 +137,7 @@
ollama pull qwen3-coder:30b
```
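
Before pointing OpenHands at it, you can sanity-check Ollama's OpenAI-compatible endpoint. This is only a quick probe and assumes the default port `11434` and the model pulled above:

```bash
# Quick check against Ollama's OpenAI-compatible API (default port assumed).
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "qwen3-coder:30b", "messages": [{"role": "user", "content": "Say hello"}]}'
```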

### Create an OpenAI-Compatible Endpoint with vLLM or SGLang

First, download the model checkpoint:

@@ -147,10 +145,10 @@
huggingface-cli download Qwen/Qwen3-Coder-30B-A3B-Instruct --local-dir Qwen/Qwen3-Coder-30B-A3B-Instruct
```

#### Serving the model using SGLang

- Install SGLang following [the official documentation](https://docs.sglang.ai/start/install.html).
- Example launch command (with at least 2 GPUs):

```bash
SGLANG_ALLOW_OVERWRITE_LONGER_CONTEXT_LEN=1 python3 -m sglang.launch_server \
Expand All @@ -165,7 +163,7 @@
#### Serving the model using vLLM

- Install vLLM following [the official documentation](https://docs.vllm.ai/en/latest/getting_started/installation.html).
- Example launch command (with at least 2 GPUs):

```bash
vllm serve Qwen/Qwen3-Coder-30B-A3B-Instruct \
@@ -217,10 +215,10 @@
2. Enable the **Advanced** toggle at the top of the page.
3. Set the following parameters, if you followed the examples above:
- **Custom Model**: `openai/<served-model-name>`
- For **Ollama**: `openai/qwen3-coder:30b`
- For **SGLang/vLLM**: `openai/Qwen3-Coder-30B-A3B-Instruct`
- **Base URL**: `http://host.docker.internal:<port>/v1`
Use port `11434` for Ollama, or `8000` for SGLang and vLLM.
- **API Key**:
- For **Ollama**: any placeholder value (e.g. `dummy`, `local-llm`)
- For **SGLang** or **vLLM**: use the same key provided when starting the server (e.g. `mykey`)
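
If you prefer to skip the UI, roughly the same configuration can be pre-seeded through environment variables when launching the container. This is a sketch: it assumes `LLM_MODEL`, `LLM_BASE_URL`, and `LLM_API_KEY` are honored as described in the environment-variables reference, and it reuses the SGLang/vLLM values from the examples above:

```bash
# Sketch: pre-seeding the LLM settings at launch (variable names assumed,
# values mirror the SGLang/vLLM example above).
docker run -it --rm --pull=always \
  -e LLM_MODEL="openai/Qwen3-Coder-30B-A3B-Instruct" \
  -e LLM_BASE_URL="http://host.docker.internal:8000/v1" \
  -e LLM_API_KEY="mykey" \
  -e LOG_ALL_EVENTS=true \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -v ~/.openhands:/.openhands \
  -p 3000:3000 \
  --add-host host.docker.internal:host-gateway \
  --name openhands-app \
  docker.openhands.dev/openhands/openhands:1.3
```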
2 changes: 1 addition & 1 deletion openhands/usage/run-openhands/local-setup.mdx
@@ -3,7 +3,7 @@
description: Getting started with running OpenHands on your own.
---

## Recommended Methods for Running OpenHands on Your Local System

### System Requirements

@@ -68,11 +68,11 @@

### Start the App

#### Option 1: Using the CLI Launcher with uv (Recommended)

We recommend using [uv](https://docs.astral.sh/uv/) for the best OpenHands experience. uv provides better isolation from your current project's virtual environment and is required for OpenHands' default MCP servers (like the [fetch MCP server](https://github.com/modelcontextprotocol/servers/tree/main/src/fetch)).

**Install uv** (if you haven't already):

See the [uv installation guide](https://docs.astral.sh/uv/getting-started/installation/) for the latest installation instructions for your platform.

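For reference, on macOS and Linux the standalone installer from the uv documentation is a one-liner; see the linked guide for Windows and other installation methods:

```bash
# Installs uv via the official standalone installer (macOS/Linux).
curl -LsSf https://astral.sh/uv/install.sh | sh
```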
@@ -130,7 +130,7 @@
-p 3000:3000 \
--add-host host.docker.internal:host-gateway \
--name openhands-app \
- docker.openhands.dev/openhands/openhands:1.2
+ docker.openhands.dev/openhands/openhands:1.3
```

</Accordion>
@@ -213,8 +213,8 @@

To enable search functionality in OpenHands:

1. Get a Tavily API key from [tavily.com](https://tavily.com/).
2. Enter the Tavily API key in the Settings page under the `LLM` tab > `Search API Key (Tavily)`.
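
Alternatively — a sketch rather than a documented step here — the key can be supplied via the `TAVILY_API_KEY` variable listed in the environment-variables reference, assuming it is read from the environment at startup:

```bash
# Sketch: providing the Tavily key via the environment (value is a placeholder).
export TAVILY_API_KEY="tvly-placeholder"   # CLI launcher method
# For the Docker method, pass it through instead:
#   docker run ... -e TAVILY_API_KEY="tvly-placeholder" ...
```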

For more details, see the [Search Engine Setup](/openhands/usage/advanced/search-engine-setup) guide.

4 changes: 2 additions & 2 deletions openhands/usage/troubleshooting/troubleshooting.mdx
@@ -86,8 +86,8 @@
```bash
docker run -it --rm \
-e SANDBOX_VSCODE_PORT=41234 \
- -e AGENT_SERVER_IMAGE_REPOSITORY=docker.openhands.dev/openhands/runtime \
- -e AGENT_SERVER_IMAGE_TAG=latest \
+ -e AGENT_SERVER_IMAGE_REPOSITORY=ghcr.io/openhands/agent-server \
+ -e AGENT_SERVER_IMAGE_TAG=1.10.0-python \
-v /var/run/docker.sock:/var/run/docker.sock \
-v ~/.openhands:/.openhands \
-p 3000:3000 \
@@ -126,7 +126,7 @@
git remote set-url origin https://github.com/OpenHands/OpenHands.git
```
* Update Docker image references from `ghcr.io/all-hands-ai/` to `ghcr.io/openhands/`
* Find and update any hardcoded references:
```bash
git grep -i "all-hands-ai"
git grep -i "ghcr.io/all-hands-ai"