Set Up LLM Connections
Add LLM connections to use AI models through CheckMate during live Sessions. Connections can be personal or Organization-level. Personal connections use your own AI provider account, while Organization connections are configured by an administrator and available to all users in the Organization.
Query your connections directly from the Notes panel during a live Session. The Prevail Native Solution is automatically available to all Prevail Members, with no setup needed.
Set Up Personal Account Connections
Set up a CheckMate connection using your personal AI provider account. You must provide your own API key and any additional credentials required by your provider. Refer to your provider’s documentation for instructions on generating an API key and locating any required account identifiers.
To enable connection setup options on your User profile, contact Customer Success at CustomerSuccess@prevail.ai.
Select a Template
Add a connection to a supported provider using a template, which automatically populates most fields. To configure the settings, select a model, enter your API key, and name the connection. The connection name is what you type after @ in the Notes panel to query the AI model.
The following settings are selected by default:
- Active — Enables the connection for use in the Notes panel.
- Supports Native Threads — Allows you to continue the conversation when you expand the Notes panel in a new tab.
- On the Navigation menu, click the User icon.
- Click Profile.
- Click the Settings tab.
- In the LLM Connections section, click Add Connection.
- In the Select a Provider group, select a provider template.
- In the Basic Information section, in the Connection Name box, enter a name to identify the connection.
The connection name is the name you type after @ in the Notes panel to query this AI model.
- Optional: In the Endpoint Configuration and Default Parameters groups, update any values that differ from your provider account settings.
Provider templates automatically populate these fields.
- Optional: If your provider requires a separate endpoint for document requests, in the Endpoint Configuration group, select Enable Document Endpoint, then enter the path provided by your AI provider.
- In the Default Parameters group, in the Model drop-down, select a model.
The Temperature and Max Tokens fields are automatically populated by the provider template.
- In the Authentication section, in the API Key field, provide your API key.
- If your provider requires additional configuration, complete the fields in the Provider-Specific Settings section.
- Optional: In the Features group, select Supports Prompt Caching to allow the provider to cache repeated or similar prompts, which reduces API costs.
Active is pre-selected and makes the connection available in the Notes panel.
Supports Native Threads is pre-selected and allows you to continue the conversation when you expand the Notes panel in a new tab.
- Optional: To validate the connection before saving, click Test Connection.
A successful test displays a response from the provider.
- Click Create Connection.
The connection is available from the Notes panel.
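For reference, the Default Parameters in the steps above correspond to the request body that most chat-style providers accept. The sketch below shows that mapping, assuming an OpenAI-compatible chat format; the function and field names are illustrative and do not reflect CheckMate's internal implementation.

```python
# Sketch of the request body a chat-style provider typically receives.
# Field names follow the common OpenAI-compatible chat format; they are
# illustrative assumptions, not CheckMate internals.

def build_chat_payload(model, prompt, temperature=0.7, max_tokens=1024):
    """Assemble a chat request from the connection's Default Parameters."""
    return {
        "model": model,              # Model drop-down
        "temperature": temperature,  # lower values = more consistent responses
        "max_tokens": max_tokens,    # caps the length of each response
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_chat_payload("example-model", "Summarize today's session.")
```

Provider templates pre-fill Temperature and Max Tokens with values suited to the selected model, which is why the template procedure only asks you to choose a model and supply an API key.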
Select a Custom Configuration
Manually add a connection to any supported provider without using a template. You must provide all endpoint, parameter, and authentication settings. Refer to your provider’s documentation for the required values.
- On the Navigation menu, click the User icon.
- Click Profile.
- Click the Settings tab.
- In the LLM Connections section, click Add Connection.
- In the Select a Provider group, click Custom Configuration.
- In the Connection Name box, enter a name to identify the connection.
The connection name is the name you type after @ in the Notes panel to query this AI model.
- In the API Format drop-down, select the message format your provider uses.
- In the Authentication Type drop-down, select the authentication method your provider requires.
- In the Endpoint Configuration group, in the Base URL box, enter the base URL for your provider's API endpoint.
- In the Chat Endpoint box, enter the path for chat and completion requests.
- Optional: If your provider requires a separate endpoint for document requests, select Enable Document Endpoint, then enter the path provided by your AI provider.
- In the Default Parameters group, in the Model box, enter the model identifier for the AI model you want to use.
- In the Temperature box, enter a value. Lower values produce more consistent responses; higher values produce more varied responses.
- In the Max Tokens box, enter the maximum number of tokens allowed in a response.
- In the Authentication group, enter your API key and any additional credentials required by your provider.
- Optional: In the Features group, select Supports Prompt Caching to allow the provider to cache repeated or similar prompts, which reduces API costs.
- Optional: Select Supports Native Threads to continue the conversation when you expand the Notes panel in a new tab.
- Verify that Active is selected.
Active makes the connection available in the Notes panel.
- Optional: To validate the connection before saving, click Test Connection.
A successful test displays a response from the provider.
- Click Create Connection.
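With a custom configuration, the Base URL, Chat Endpoint, and API key combine to form the target of each request. The sketch below shows one plausible combination, assuming bearer-token authentication; all names and values are illustrative, not CheckMate's internal implementation.

```python
# Sketch of how a custom configuration's endpoint settings combine into
# a request target. Assumes bearer-token authentication; the URL, key,
# and function names are illustrative assumptions.

def build_request_target(base_url, chat_endpoint, api_key):
    """Join Base URL and Chat Endpoint; build authentication headers."""
    url = base_url.rstrip("/") + "/" + chat_endpoint.lstrip("/")
    headers = {
        "Authorization": f"Bearer {api_key}",  # Authentication group
        "Content-Type": "application/json",
    }
    return url, headers

url, headers = build_request_target(
    "https://api.example.com/v1",  # Base URL box
    "chat/completions",            # Chat Endpoint box
    "example-api-key",             # API Key field
)
```

This is why the Base URL and Chat Endpoint are entered separately: providers that also expose a document endpoint reuse the same Base URL with a different path, which is what Enable Document Endpoint configures.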
Manage Your Connections
From the LLM Connections group, you can view your connections, set a default connection, start a chat, and test, edit, or delete individual connections.
Set a Default Connection
Set a default connection to control which connection appears first in the @ autocomplete menu.
- In the Default Connection drop-down, select a connection name.
- Click Set Default.
My Connections
Connections you created using your personal AI provider account appear in the My Connections group. You can start a chat, test, edit, or delete these connections.
Organization Connections
Your Organization connections appear in the Organization Connections group. If you belong to multiple Organizations, connections from each Organization are listed. Click Start Chat to use a connection. Organization administrators can edit or delete these connections from the Organization page.
Set Up Organization Connections
Organization administrators can add CheckMate connections that are available to all members of the Organization. You must provide an API key and any additional credentials required by your provider. Refer to your provider’s documentation for instructions on generating an API key and locating any required account identifiers.
To enable connection setup options for your Organization, contact Customer Success at CustomerSuccess@prevail.ai.
Select a Template
Add a connection to a supported provider using a template, which automatically populates most fields. To configure the settings, select a model, enter your API key, and name the connection. The connection name is what you type after @ in the Notes panel to query the AI model.
The following settings are selected by default:
- Active — Enables the connection for use in the Notes panel.
- Supports Native Threads — Allows you to continue the conversation when you expand the Notes panel in a new tab.
- On the Navigation menu, click Organizations.
- Click the Organization name.
- In the LLM Connections section, click Add Connection.
- In the Select a Provider group, select a provider template.
- In the Basic Information section, in the Connection Name box, enter a name to identify the connection.
The connection name is the name you type after @ in the Notes panel to query this AI model.
- Optional: In the Endpoint Configuration and Default Parameters groups, update any values that differ from your provider account settings.
Provider templates automatically populate these fields.
- Optional: If your provider requires a separate endpoint for document requests, in the Endpoint Configuration group, select Enable Document Endpoint, then enter the path provided by your AI provider.
- In the Default Parameters group, in the Model drop-down, select a model.
The Temperature and Max Tokens fields are automatically populated by the provider template.
- In the Authentication section, in the API Key field, provide your API key.
- If your provider requires additional configuration, complete the fields in the Provider-Specific Settings section.
- Optional: In the Features group, select Supports Prompt Caching to allow the provider to cache repeated or similar prompts, which reduces API costs.
Active is pre-selected and makes the connection available in the Notes panel.
Supports Native Threads is pre-selected and allows you to continue the conversation when you expand the Notes panel in a new tab.
- Optional: To validate the connection before saving, click Test Connection.
A successful test displays a response from the provider.
- Click Create Connection.
Select a Custom Configuration
Manually add a connection to any supported provider without using a template. You must provide all endpoint, parameter, and authentication settings. Refer to your provider’s documentation for the required values.
- On the Navigation menu, click Organizations.
- Click the Organization name.
- In the LLM Connections section, click Add Connection.
- In the Select a Provider group, click Custom Configuration.
- In the Connection Name box, enter a name to identify the connection.
The connection name is the name you type after @ in the Notes panel to query this AI model.
- In the API Format drop-down, select the message format your provider uses.
- In the Authentication Type drop-down, select the authentication method your provider requires.
- In the Endpoint Configuration group, in the Base URL box, enter the base URL for your provider's API endpoint.
- In the Chat Endpoint box, enter the path for chat and completion requests.
- Optional: If your provider requires a separate endpoint for document requests, select Enable Document Endpoint, then enter the path provided by your AI provider.
- In the Default Parameters group, in the Model box, enter the model identifier for the AI model you want to use.
- In the Temperature box, enter a value. Lower values produce more consistent responses; higher values produce more varied responses.
- In the Max Tokens box, enter the maximum number of tokens allowed in a response.
- In the Authentication group, enter your API key and any additional credentials required by your provider.
- Optional: In the Features group, select Supports Prompt Caching to allow the provider to cache repeated or similar prompts, which reduces API costs.
- Optional: Select Supports Native Threads to continue the conversation when you expand the Notes panel in a new tab.
- Verify that Active is selected.
Active makes the connection available in the Notes panel.
- Optional: To validate the connection before saving, click Test Connection.
A successful test displays a response from the provider.
- Click Create Connection.
Manage Organization Connections
The LLM Connections table displays all connections configured for your Organization, including the connection name, provider, format, and status. Use the table options to start a chat, test a connection, edit settings, or delete a connection.