# Custom LLM

#### Overview

The **Custom LLM Provider** feature allows users to connect their own Large Language Model (LLM) providers to an AI agent. This is useful when a client already has their own LLM subscription and wants AI agents to run using their own API keys and models instead of the company’s default system provider.

All AI agents must be connected to an LLM. By default, agents use system-provided models. This feature enables users to add and manage their own LLM providers.

This enables:

* Using the client’s own API keys and billing
* Selecting specific models per AI agent
* Managing provider credentials and capabilities centrally

### LLM Provider Configuration

This guide explains how to add and manage a **Custom LLM Provider** when creating or configuring an AI agent.

#### Step 1: Open LLM Provider Settings

While creating or editing an AI agent, navigate to the **Basic Information** tab.

<figure><img src="https://3400071099-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FypCKq44Psaly6khgrY7x%2Fuploads%2FuuBxwpEHwOpEdk1kIkzT%2FScreenshot%202026-01-04%20221908.png?alt=media&#x26;token=d6fd3a42-2d61-4d95-8169-44726fe11ce9" alt=""><figcaption></figcaption></figure>

In the **LLM Provider** section, the **Connected Provider Type** and **LLM Model** fields are pre-populated by default with system-provided models. These system models are immediately available and require no additional configuration.

<figure><img src="https://3400071099-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FypCKq44Psaly6khgrY7x%2Fuploads%2Fv8foGedmeQ79wPH2ObjJ%2FScreenshot%202026-01-04%20221958.png?alt=media&#x26;token=55ae1d21-3cb2-4b98-9463-5daba8e1a83f" alt=""><figcaption></figcaption></figure>

<figure><img src="https://3400071099-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FypCKq44Psaly6khgrY7x%2Fuploads%2FFcNAJgANGQOM9707WNGh%2FScreenshot%202026-01-04%20222010.png?alt=media&#x26;token=af48d809-9bbb-4269-ace7-73e0c123c062" alt=""><figcaption></figcaption></figure>

If you want to continue using the built-in models:

* Select **System Provider** from the Connected Provider Type dropdown.
* Choose an LLM model from the existing list provided by the system.

#### Step 2: Add a New LLM Provider

If you want to use your own LLM account instead of the system provider, click the **Add / Manage LLM Provider** button.

<figure><img src="https://3400071099-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FypCKq44Psaly6khgrY7x%2Fuploads%2FdDy8DbhZVogtTis9MjIK%2FScreenshot%202026-01-04%20222030.png?alt=media&#x26;token=d48477d7-935f-4df0-96be-25352ea4b5a1" alt=""><figcaption></figcaption></figure>

A dialog box will open. Enter the details of your custom LLM provider here.

<figure><img src="https://3400071099-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FypCKq44Psaly6khgrY7x%2Fuploads%2FpDDE8PGN03gaU7p65AFQ%2FScreenshot%202026-01-04%20222122.png?alt=media&#x26;token=75717506-afe1-4383-88e1-897c543d5c28" alt=""><figcaption></figcaption></figure>

In the provider configuration dialog:

1. Select the **Provider Type** from the dropdown. Available options include:
   * OpenAI
   * Google AI (Gemini)
   * Amazon Bedrock
   * Ollama
   * VLLM
   * Custom Provider (OpenAI compatible)

<figure><img src="https://3400071099-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FypCKq44Psaly6khgrY7x%2Fuploads%2FWsuveQQa6KCPvOujoUs4%2FScreenshot%202026-01-04%20222130.png?alt=media&#x26;token=efbfd7bd-23ad-4fb1-92ed-83b4c7262f3f" alt=""><figcaption></figcaption></figure>

2. After selecting the provider type, enter a **name** for the provider. This name helps you identify the provider later when selecting it for an AI agent.

<figure><img src="https://3400071099-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FypCKq44Psaly6khgrY7x%2Fuploads%2FXwUQj7Dyx18RIEtlyLQq%2FScreenshot%202026-01-04%20222151.png?alt=media&#x26;token=dcb278e3-4f45-4999-b673-5aea26e3ee8a" alt=""><figcaption></figcaption></figure>

#### Step 3: Enter Provider Credentials

Next, enter the credentials required to connect your LLM account.

1. Paste your **API Key** into the API Key field.
2. If required by the selected provider, enter the **Access URL** or **Base URL**.
3. Optionally, configure token-related limits such as:
   * Maximum completion tokens
   * Maximum output tokens
   * Maximum total tokens
4. Select the **models supported by your provider** that you want to make available for use with AI agents.

Once all required information is entered, click **Connect Provider** to save the provider configuration.
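The rules above amount to a simple completeness check: an API key for hosted providers, a base URL for self-hosted or custom ones, positive token limits if set, and at least one model selected. A minimal sketch of that check, where all field names are illustrative assumptions rather than this product's actual API:

```python
# Illustrative pre-flight validation of a provider configuration.
# Field names ("api_key", "base_url", ...) are assumptions for this sketch.
SELF_HOSTED = {"Ollama", "VLLM", "Custom Provider"}

def validate_provider(config: dict) -> list[str]:
    """Return a list of problems; an empty list means the config looks complete."""
    problems = []
    provider_type = config.get("provider_type")
    # Hosted providers need a key; self-hosted ones need a reachable URL.
    if provider_type not in SELF_HOSTED and not config.get("api_key"):
        problems.append("API key is required")
    if provider_type in SELF_HOSTED and not config.get("base_url"):
        problems.append("Base URL is required for self-hosted providers")
    # Token limits are optional, but must be positive when present.
    for limit in ("max_completion_tokens", "max_output_tokens", "max_total_tokens"):
        value = config.get(limit)
        if value is not None and value <= 0:
            problems.append(f"{limit} must be a positive integer")
    if not config.get("models"):
        problems.append("Select at least one supported model")
    return problems

print(validate_provider({"provider_type": "Ollama"}))
# ['Base URL is required for self-hosted providers', 'Select at least one supported model']
```

A configuration that passes such a check should line up with what **Connect Provider** expects, though the actual validation performed by the product may differ.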

<figure><img src="https://3400071099-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FypCKq44Psaly6khgrY7x%2Fuploads%2FjtC0zSzF24priq7YSvxE%2FScreenshot%202026-01-04%20222402.png?alt=media&#x26;token=e1adbff7-aa38-45ac-ab4a-fcd278bb5487" alt=""><figcaption></figcaption></figure>

#### Step 4: Select the Custom Provider for Your AI Agent

After the provider is successfully connected, return to the AI agent’s **Basic Information** screen.

<figure><img src="https://3400071099-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FypCKq44Psaly6khgrY7x%2Fuploads%2F3e86yZFR7L5d7R8xCxwf%2FScreenshot%202026-01-04%20225129.png?alt=media&#x26;token=c35d2c65-e764-4f7f-b11f-a2cc58c41175" alt=""><figcaption></figcaption></figure>

1. From the **Connected Provider Type** dropdown, select the custom provider you just added.
2. Open the **LLM Model** dropdown and choose one of the models associated with that provider.

A single provider can have multiple models available, allowing you to select the model that best fits the purpose of your AI agent.

#### Step 5: Complete and Create the AI Agent

Finish configuring the remaining AI agent settings as needed.

Once complete, click **Create** to finalize the AI agent.\
The AI agent will now run using your custom LLM provider and the selected model.

### Managing an Existing LLM Provider

If a custom LLM provider has already been added, it can be updated at any time.

1. Click **Add / Manage LLM Provider**.
2. Open the **Manage Current Provider** tab.
3. Select the provider you want to modify and click **Manage**.

<figure><img src="https://3400071099-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FypCKq44Psaly6khgrY7x%2Fuploads%2FHLOba84F3WtiTJ8fFCBv%2FScreenshot%202026-01-04%20222558.png?alt=media&#x26;token=95db41ab-3ba5-43c3-9fe8-3847bfa7fd19" alt=""><figcaption></figcaption></figure>

From here, you can:

* Update the provider name
* Replace or rotate the API key
* Change the Access URL or Base URL
* Adjust token limits
* Modify supported capabilities

<figure><img src="https://3400071099-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FypCKq44Psaly6khgrY7x%2Fuploads%2FL6AGj2Qni1cayYx86QQk%2FScreenshot%202026-01-04%20222611.png?alt=media&#x26;token=0100d367-c68a-46e2-8c3b-502994b85022" alt=""><figcaption></figcaption></figure>

After making changes, click **Update Provider** to apply them. Changes are reflected immediately across all AI agents that use the provider.
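The immediate-update behavior is consistent with agents holding a reference to the provider record rather than a copy of its credentials, so rotating a key in one place is visible everywhere at once. A toy sketch of that pattern (all names and structures here are illustrative, not the product's internals):

```python
# Toy model: agents reference a provider by id, so updating the provider
# record is instantly visible to every agent that uses it.
providers = {"prov-1": {"name": "Team OpenAI", "api_key": "sk-old"}}
agents = [
    {"name": "Support Bot", "provider_id": "prov-1"},
    {"name": "Sales Bot", "provider_id": "prov-1"},
]

def rotate_key(provider_id: str, new_key: str) -> None:
    """Replace the stored key; agents see it on their next lookup."""
    providers[provider_id]["api_key"] = new_key

rotate_key("prov-1", "sk-new")
keys = {a["name"]: providers[a["provider_id"]]["api_key"] for a in agents}
print(keys)  # both agents now resolve to "sk-new"
```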
