The tool arguments are **very important** for parsing the tool calls from the model appropriately.
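As a minimal sketch of why the argument schema matters: tool calls come back from the model as a JSON string of arguments, and that string only parses cleanly if it matches the declared schema. The `click` tool below is hypothetical, in the OpenAI function-calling format; Proxy Lite's actual tool definitions may differ.

```python
import json

# Hypothetical tool schema in the OpenAI function-calling format.
# The real tool definitions ship with Proxy Lite and may differ.
tools = [
    {
        "type": "function",
        "function": {
            "name": "click",
            "description": "Click an element on the page",
            "parameters": {
                "type": "object",
                "properties": {"element_id": {"type": "integer"}},
                "required": ["element_id"],
            },
        },
    }
]

# A tool call returned by the model carries its arguments as a JSON string,
# which parses into a dict matching the declared schema:
raw_arguments = '{"element_id": 42}'
parsed = json.loads(raw_arguments)
```

If the serving arguments do not match the schema the model was trained with, the argument strings may not parse at all, which is why the flags above matter.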

> **Important:** To serve the model locally, install vLLM and transformers with `uv sync --all-extras`. Qwen-2.5-VL support is not yet available in the latest release of `transformers`, so installation from source is required.
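A sketch of that setup, assuming the repository's `uv` project configuration; the source-install line is one way to get `transformers` from its main branch until a release includes Qwen-2.5-VL:

```shell
# Install vLLM and transformers via the project's extras
uv sync --all-extras

# Qwen-2.5-VL needs transformers from source until the next release
uv pip install git+https://github.com/huggingface/transformers
```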
You can set the `api_base` to point to your local endpoint when calling Proxy Lite:
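A minimal sketch using only the standard library, assuming vLLM's OpenAI-compatible server is listening on `localhost:8000` (adjust host and port to your setup); the OpenAI Python client works the same way with `base_url` set to this endpoint:

```python
import json
from urllib import request

# Hypothetical local endpoint; match this to your vLLM server's host/port.
api_base = "http://localhost:8000/v1"

payload = {
    "model": "convergence-ai/proxy-lite",
    "messages": [{"role": "user", "content": "Hello"}],
}

# Build an OpenAI-style chat completions request against the local endpoint.
req = request.Request(
    f"{api_base}/chat/completions",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
# response = request.urlopen(req)  # uncomment once the server is running
```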