Custom LLM

Overview

The Custom LLM Provider feature allows users to connect their own Large Language Model (LLM) providers to an AI agent. This is useful when a client already has their own LLM subscription and wants AI agents to run using their own API keys and models instead of the company’s default system provider.

All AI agents must be connected to an LLM. By default, agents use system-provided models. This feature enables users to add and manage their own LLM providers.

This enables:

  • Using the client’s own API keys and billing

  • Selecting specific models per AI agent

  • Managing provider credentials and capabilities centrally

LLM Provider Configuration

This guide explains how to add and manage a Custom LLM Provider when creating or configuring an AI agent.

Custom LLM Provider – Step-by-Step Guide

Step 1: Open LLM Provider Settings

While creating or editing an AI agent, navigate to the Basic Information tab.

In the LLM Provider section, the Connected Provider Type and LLM Model fields are pre-populated with models provided by the system by default. These system models are immediately available and require no additional configuration.

If you want to continue using the built-in models:

  • Select System Provider from the Connected Provider Type dropdown.

  • Choose an LLM model from the existing list provided by the system.

Step 2: Add a New LLM Provider

If you want to use your own LLM account instead of the system provider, click the Add / Manage LLM Provider button.

A provider configuration dialog will open, where you enter the details of the custom LLM. In this dialog:

  1. Select the Provider Type from the dropdown. Available options include:

    • OpenAI

    • Google AI (Gemini)

    • Amazon Bedrock

    • Ollama

    • vLLM

    • Custom Provider (OpenAI compatible)

  2. After selecting the provider type, enter a name for the provider. This name helps you identify the provider later when selecting it for an AI agent.
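To make the two fields in this step concrete, here is a minimal sketch of how a provider entry could be represented and validated. The field names, type identifiers, and validation rules are illustrative assumptions, not the product's actual schema.

```python
# Hypothetical internal shape of a Step 2 provider entry.
# Type identifiers below are assumed names, not the product's real values.
SUPPORTED_PROVIDER_TYPES = {
    "openai", "google_ai", "amazon_bedrock",
    "ollama", "vllm", "custom_openai_compatible",
}

def new_provider(provider_type: str, name: str) -> dict:
    """Validate and create a provider entry (illustrative only)."""
    if provider_type not in SUPPORTED_PROVIDER_TYPES:
        raise ValueError(f"Unknown provider type: {provider_type}")
    if not name.strip():
        raise ValueError("Provider name must not be empty")
    return {"provider_type": provider_type, "name": name.strip()}

provider = new_provider("ollama", "Team Ollama Box")
```

The descriptive name matters later: it is what you will see in the Connected Provider Type dropdown when assigning the provider to an agent.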

Step 3: Enter Provider Credentials

Next, enter the credentials required to connect your LLM account.

  1. Paste your API Key into the API Key field.

  2. If required by the selected provider, enter the Access URL or Base URL.

  3. Optionally, configure token-related limits such as:

    • Maximum completion tokens

    • Maximum output tokens

    • Maximum total tokens

  4. Select the models supported by your provider that you want to make available for use with AI agents.

Once all required information is entered, click Connect Provider to save the provider configuration.

Step 4: Select the Custom Provider for Your AI Agent

After the provider is successfully connected, return to the AI agent’s Basic Information screen.

  1. From the Connected Provider Type dropdown, select the custom provider you just added.

  2. Open the LLM Model dropdown and choose one of the models associated with that provider.

A single provider can have multiple models available, allowing you to select the model that best fits the purpose of your AI agent.
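The provider-to-model relationship above can be sketched as a one-to-many mapping: one connected provider, several selectable models, one choice per agent. The names and structures here are hypothetical.

```python
# Illustrative sketch: a connected provider exposes several models,
# and each AI agent selects exactly one of them.
providers = {
    "Team Ollama Box": ["llama3", "mistral", "phi3"],  # hypothetical names
}

def select_model(provider_name: str, model: str) -> dict:
    """Validate that the chosen model belongs to the chosen provider."""
    available = providers.get(provider_name, [])
    if model not in available:
        raise ValueError(f"{model} is not available on {provider_name}")
    return {"provider": provider_name, "model": model}

agent_llm = select_model("Team Ollama Box", "mistral")
```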

Step 5: Complete and Create the AI Agent

Finish configuring the remaining AI agent settings as needed.

Once complete, click Create to finalize the AI agent. The AI agent will now run using your custom LLM provider and the selected model.

Managing an Existing LLM Provider

If a custom LLM provider has already been added, it can be updated at any time.

  1. Click Add / Manage LLM Provider.

  2. Open the Manage Current Provider tab.

  3. Select the provider you want to modify and click Manage.

From here, you can:

  • Update the provider name

  • Replace or rotate the API key

  • Change the Access URL or Base URL

  • Adjust token limits

  • Modify supported capabilities

After making changes, click Update Provider to apply them. The updates are immediately reflected across all AI agents that use the provider.
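One way to picture why an update propagates instantly is that agents reference a shared provider record rather than holding their own copy, so rotating a key once updates every agent. This is a sketch of that design idea under assumed data structures, not the product's actual implementation.

```python
# Hypothetical illustration: agents hold a reference to one shared
# provider record, so a single update is visible to all of them.
provider = {"name": "Team Ollama Box", "api_key": "sk-old"}
agent_a = {"name": "Support Bot", "llm_provider": provider}
agent_b = {"name": "Sales Bot", "llm_provider": provider}

provider["api_key"] = "sk-new"  # rotate the key once, centrally
# Both agents now resolve to the new key via the shared record.
```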
