fix: update llm.py file path reference

Update the GitHub link for the llm.py file in the custom model dialog
from backend/services/llm.py to the correct path, backend/core/services/llm.py.
Frank An 2025-09-23 14:03:55 +08:00
parent f19c1a6f55
commit 276addb10b
1 changed file with 1 addition and 1 deletion


@@ -77,7 +77,7 @@ export const CustomModelDialog: React.FC<CustomModelDialogProps> = ({
<DialogHeader>
<DialogTitle>{mode === 'add' ? 'Add Custom Model' : 'Edit Custom Model'}</DialogTitle>
<DialogDescription>
-              Kortix Suna uses <b>LiteLLM</b> under the hood, which makes it compatible with over 100 models. You can easily choose any <a href="https://openrouter.ai/models" target="_blank" rel="noopener noreferrer" className="text-blue-500 hover:text-blue-600 underline">OpenRouter model</a> by prefixing it with <code className="bg-muted px-1.5 py-0.5 rounded text-sm font-mono">openrouter/</code> and it should work out of the box. If you want to use other models besides OpenRouter, you might have to modify the <a href="https://github.com/kortix-ai/suna/blob/main/backend/services/llm.py" target="_blank" rel="noopener noreferrer" className="text-blue-500 hover:text-blue-600 underline">llm.py</a>, set the correct environment variables, and rebuild your self-hosted Docker container.
+              Kortix Suna uses <b>LiteLLM</b> under the hood, which makes it compatible with over 100 models. You can easily choose any <a href="https://openrouter.ai/models" target="_blank" rel="noopener noreferrer" className="text-blue-500 hover:text-blue-600 underline">OpenRouter model</a> by prefixing it with <code className="bg-muted px-1.5 py-0.5 rounded text-sm font-mono">openrouter/</code> and it should work out of the box. If you want to use other models besides OpenRouter, you might have to modify the <a href="https://github.com/kortix-ai/suna/blob/main/backend/core/services/llm.py" target="_blank" rel="noopener noreferrer" className="text-blue-500 hover:text-blue-600 underline">llm.py</a>, set the correct environment variables, and rebuild your self-hosted Docker container.
</DialogDescription>
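
For context (not part of this commit): the dialog text above describes the naming convention for OpenRouter models, where the model id is prefixed with openrouter/ so LiteLLM can route it. A minimal TypeScript sketch of that convention; the helper name and model id below are illustrative, not taken from the repo.

```typescript
// Hypothetical helper (not from the Suna codebase): ensures a model id uses
// the "openrouter/" prefix described in the dialog text above.
function toOpenRouterModelId(modelId: string): string {
  return modelId.startsWith("openrouter/") ? modelId : `openrouter/${modelId}`;
}

// Example: "anthropic/claude-3.5-sonnet" -> "openrouter/anthropic/claude-3.5-sonnet"
console.log(toOpenRouterModelId("anthropic/claude-3.5-sonnet"));
```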