1 change: 1 addition & 0 deletions apps/app/src/app/utils/index.ts
@@ -35,6 +35,7 @@ const FRIENDLY_PROVIDER_LABELS: Record<string, string> = {
anthropic: "Anthropic",
google: "Google",
openrouter: "OpenRouter",
lilac: "Lilac",
};

const humanizeModelLabel = (value: string) => {
1 change: 1 addition & 0 deletions packages/docs/docs.json
@@ -35,6 +35,7 @@
"pages": [
"how-to-get-an-anthropic-api-key",
"how-to-connect-a-custom-provider",
"how-to-connect-lilac",
"importing-a-skill"
]
},
174 changes: 174 additions & 0 deletions packages/docs/how-to-connect-lilac.mdx
@@ -0,0 +1,174 @@
---
title: "How to connect Lilac to OpenWork"
description: "Connect Lilac's OpenAI-compatible API and use models like GLM 5.1 and Kimi K2.5"
---

[Lilac](https://getlilac.com) provides affordable GPU inference through an OpenAI-compatible API at `api.getlilac.com/v1`. It supports chat completions, streaming, tool/function calling, reasoning, and structured output; some models (like Kimi K2.5) also support vision/image inputs. See [Lilac's OpenAI compatibility docs](https://docs.getlilac.com/inference/openai-compatibility) for the full feature matrix.
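
Since the endpoint is OpenAI-compatible, tool calling should follow the standard OpenAI `tools` schema. Here is a minimal sketch; the `get_weather` function is hypothetical, and you should check Lilac's compatibility docs for any deviations:

```shell
# Sketch of a tool-calling request. The get_weather function below is a
# made-up example; any function following the standard OpenAI schema works.
PAYLOAD='{
  "model": "z-ai/glm-5.1",
  "messages": [{"role": "user", "content": "What is the weather in Paris?"}],
  "tools": [{
    "type": "function",
    "function": {
      "name": "get_weather",
      "description": "Get current weather for a city",
      "parameters": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"]
      }
    }
  }]
}'

# Sanity-check the JSON locally before sending (catches shell quoting mistakes):
echo "$PAYLOAD" | python3 -m json.tool > /dev/null && PAYLOAD_OK=yes

# Then POST it (requires LILAC_API_KEY to be set):
# curl https://api.getlilac.com/v1/chat/completions \
#   -H "Authorization: Bearer $LILAC_API_KEY" \
#   -H "Content-Type: application/json" \
#   -d "$PAYLOAD"
```

Uncomment the `curl` lines and set `LILAC_API_KEY` to send the request.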

You can get an API key from the [Lilac console](https://console.getlilac.com). See [Lilac's API key guide](https://docs.getlilac.com/inference/api-keys) for details.

## Quick setup (built-in provider)

Lilac is available as a built-in provider in OpenWork. To connect:

1. Open **Connect providers** from the sidebar or settings.
2. Find **Lilac** in the provider list and click it.
3. Paste your Lilac API key into the API key field. Keys are stored locally by OpenCode.
4. Open the model picker in a session and select a Lilac model (e.g. `Lilac · GLM 5.1` or `Lilac · Kimi K2.5`).

That's it — no manual configuration needed. If you don't see Lilac in the provider list yet, see the manual setup below.

<Note>You can also set the API key as an environment variable (`export LILAC_API_KEY="your-api-key"`) if you prefer the CLI path.</Note>

## Manual setup (desktop)

If Lilac doesn't appear as a built-in provider yet, add it manually to your workspace `.opencode.json` (or `~/.config/opencode/opencode.json` for a global setup):

```json
{
"provider": {
"lilac": {
"npm": "@ai-sdk/openai-compatible",
"name": "Lilac",
"options": {
"baseURL": "https://api.getlilac.com/v1",
"apiKey": "your-api-key"
},
"models": {
"z-ai/glm-5.1": {
"name": "GLM 5.1"
},
"moonshotai/kimi-k2.5": {
"name": "Kimi K2.5"
}
}
}
}
}
```

<Note>You can also set the API key via the `LILAC_API_KEY` environment variable instead of `options.apiKey`.</Note>

Reload the workspace when OpenWork asks, then select a Lilac model from the model picker.

## OpenWork Cloud setup (for teams)

Use this when you want your OpenWork org to manage the Lilac credential and model list centrally, so team members don't each need to configure it locally.

1. Open `LLM Providers` in OpenWork Cloud.
2. Click `Add Provider`.
3. Switch to `Custom provider`.
4. Paste the following JSON:

```json
{
"id": "lilac",
"name": "Lilac",
"npm": "@ai-sdk/openai-compatible",
"env": [
"LILAC_API_KEY"
],
"doc": "https://docs.getlilac.com/inference/openai-compatibility",
"api": "https://api.getlilac.com/v1",
"models": [
{
"id": "z-ai/glm-5.1",
"name": "GLM 5.1",
"attachment": false,
"reasoning": true,
"tool_call": true,
"structured_output": true,
"temperature": true,
"release_date": "2026-04-07",
"last_updated": "2026-04-07",
"open_weights": true,
"limit": {
"context": 202800,
"input": 202800,
"output": 131072
},
"modalities": {
"input": [
"text"
],
"output": [
"text"
]
}
},
{
"id": "moonshotai/kimi-k2.5",
"name": "Kimi K2.5",
"attachment": true,
"reasoning": true,
"tool_call": true,
"structured_output": true,
"temperature": true,
"release_date": "2026-01-27",
"last_updated": "2026-01-27",
"open_weights": true,
"limit": {
"context": 262144,
"input": 262144,
"output": 32768
},
"modalities": {
"input": [
"text",
"image"
],
"output": [
"text"
]
}
}
]
}
```

5. Paste your shared `LILAC_API_KEY` credential.
6. Choose `People access` and/or `Team access`.
7. Click `Create Provider`.

Then import it into the desktop app:

1. Open `Settings -> Cloud`.
2. Choose the correct `Active org`.
3. Under `Cloud providers`, click `Import`.
4. Reload the workspace when OpenWork asks.

For more detail on the Cloud provider flow, see [Adding a custom LLM provider](/cloud-custom-llm-providers).

## Verifying the connection

After setup, open the model picker in a session. You should see `Lilac · GLM 5.1` and `Lilac · Kimi K2.5` (or whichever models you configured). Select one and send a message to confirm the connection is working.

You can also verify the API directly:

```bash
curl https://api.getlilac.com/v1/chat/completions \
-H "Authorization: Bearer $LILAC_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"model": "z-ai/glm-5.1",
"messages": [
{"role": "user", "content": "Hello!"}
]
}'
```

## Model details

| Model | Model ID | Context | Vision | Reasoning | Tool calling | Input price | Output price |
| --- | --- | --- | --- | --- | --- | --- | --- |
| GLM 5.1 | `z-ai/glm-5.1` | 202,800 | No | Yes (on by default) | Yes | $0.90/M tokens | $3.00/M tokens |
| Kimi K2.5 | `moonshotai/kimi-k2.5` | 262,144 | Yes | Yes (on by default) | Yes | $0.40/M tokens | $2.00/M tokens |

Pricing is from [Lilac's models page](https://docs.getlilac.com/inference/models) and may change.
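
As a rough cost sketch using the table above (token counts are illustrative):

```shell
# Estimate the cost of one GLM 5.1 request at the per-token prices above:
# 100k input tokens at $0.90/M plus 2k output tokens at $3.00/M.
COST=$(awk 'BEGIN { printf "$%.3f", (100000 / 1000000) * 0.90 + (2000 / 1000000) * 3.00 }')
echo "$COST"   # prints $0.096
```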

## More Lilac resources

- [Available models](https://docs.getlilac.com/inference/models) — full model catalog with pricing
- [OpenAI compatibility](https://docs.getlilac.com/inference/openai-compatibility) — supported endpoints and features
- [Connect to local coding tools](https://docs.getlilac.com/inference/local-tools) — Lilac's own guide for connecting to local dev tools
- [Pricing](https://docs.getlilac.com/inference/pricing) — per-token pricing details