Advanced Conversational Settings (LLM Provider, external tools)

This section details the configuration options for managing core aspects of the digital human's conversational behavior, including the selection of the underlying Large Language Model (LLM) provider and model, as well as external tool integrations.

The platform currently offers integration with the following LLM providers: OpenAI and Groq. We are actively developing support for further LLM providers, which will be available in upcoming updates.

To configure the conversational parameters, a deployed digital human with the operation_mode parameter set to "oc" is required.

operation_mode = "oc"
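As an illustration, the relevant fragment of the deployment request body would set this parameter as follows (any surrounding fields and the deployment endpoint itself depend on your setup and are not shown here):

```python
# Fragment of a deployment request body; only operation_mode comes from
# this section — all other deployment fields depend on your setup.
deployment_payload = {
    "operation_mode": "oc",  # required for configuring conversational parameters
}
```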

Configure the LLM settings

Use the PUT /head/{id}/conversation-settings endpoint to configure the LLM settings for the Digital Human. The following parameters are required:

provider:

Specify the LLM provider to be used.

  • Expected value: string
  • Valid values:
    • "openai": Use an OpenAI model.
    • "groq": Use a Groq model.
  • Default value: "openai"

llm_name:

Specify the exact LLM model.

  • Expected value: string
  • Possible values: Refer to your chosen LLM provider's documentation (Groq, OpenAI) for a list of available models (e.g., "gpt-3.5-turbo", "gpt-4" for OpenAI).
  • Default value: "gpt-4o-mini"

max_llm_tokens:

Specify the maximum number of tokens per response.

  • Expected value: integer
  • Possible values: Refer to your chosen LLM provider's documentation for the maximum tokens per response.
  • Default value: 8000

api_key:

Set the API key of your LLM provider service.

  • Expected value: string

Example request:

Request body


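For illustration, a request body assembled from the parameters above might look as follows (the "conversationSettings" wrapper follows the GET note in this section, and the API key value is a placeholder):

```python
# Hypothetical body for PUT /head/{id}/conversation-settings, built from
# the parameters documented above. The api_key value is a placeholder.
llm_settings_body = {
    "conversationSettings": {
        "provider": "openai",       # "openai" or "groq"
        "llm_name": "gpt-4o-mini",  # any model listed by the chosen provider
        "max_llm_tokens": 8000,     # provider-specific upper bound applies
        "api_key": "YOUR_PROVIDER_API_KEY",
    }
}
```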
Use the GET /head/{id}/conversation-settings endpoint to retrieve the LLM settings for the Digital Human. Note that when updating any parameter, you must provide the whole "conversationSettings" field, including any previously configured parameters that you do not want to change but that differ from the defaults in the PUT method.
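Because PUT replaces the whole "conversationSettings" object, a safe update pattern is to retrieve the current settings, change only the field you need, and send the full object back. A minimal sketch of the merge step (the helper name and example values are illustrative; only the endpoint behavior comes from this section):

```python
import json

def merge_setting(current_settings: dict, field: str, value) -> dict:
    """Take the response of GET /head/{id}/conversation-settings, change a
    single field, and return the full object to send back via PUT."""
    merged = json.loads(json.dumps(current_settings))  # cheap deep copy
    merged["conversationSettings"][field] = value
    return merged

# Example: lower the token limit while preserving every other setting.
current = {"conversationSettings": {"provider": "groq",
                                    "llm_name": "some-groq-model",  # illustrative
                                    "max_llm_tokens": 8000}}
updated = merge_setting(current, "max_llm_tokens", 4000)
```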

Configure the external tools integration

Configure external tools that can be invoked by the digital human via HTTP POST requests. This enables the digital human to interact with external services and APIs during a conversation.

Use the PUT /head/{id}/conversation-settings endpoint to configure the external tools integration.



Each tool must be defined with the following parameters:

tools: array An array of tool definitions. Each element in the array represents a single external tool.

  • name: string The name of the external tool or the function it performs. This name should be descriptive and concise as it will be used to identify the tool when invoked during a conversation.
    • Example: "get_weather"
  • description: string A brief description of the tool's functionality. This description should clearly explain what the tool does and how it can be used by the LLM.
    • Example: "Retrieves the current weather for a specified location."
  • parameters: array An array of parameter definitions that the LLM will use to interact with the tool. Each element defines a single parameter.
    • name: string The name of the parameter. Use clear and descriptive names.
      • Example: "location"
    • description: string A description of the parameter, explaining its purpose and what kind of value the LLM should provide.
      • Example: "The city for which to retrieve the weather, e.g., San Francisco"
    • type: string The data type of the parameter. This helps the LLM understand the expected format of the value. Common types include:
      • "string": For text values.
      • "integer": For whole numbers.
  • url: string The webhook URL of the external tool's endpoint that will receive the POST request. This is the address where the LLM will send the data to invoke the tool.
  • api_key: string The API key, if required, for authenticating with the external tool's API. Provide it if the external service is authenticated via the headers of the POST request.



Example request:

Request body
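Putting the tool fields together, a request body for PUT /head/{id}/conversation-settings registering one external tool might look as follows (the "conversationSettings" wrapper, the webhook URL, and the key value are illustrative assumptions):

```python
# Hypothetical external-tools body; the "conversationSettings" wrapper is
# assumed to match the LLM settings above, and the URL/key are placeholders.
tools_body = {
    "conversationSettings": {
        "tools": [
            {
                "name": "get_weather",
                "description": "Retrieves the current weather for a specified location.",
                "parameters": [
                    {
                        "name": "location",
                        "description": "The city for which to retrieve the weather, e.g., San Francisco",
                        "type": "string",
                    }
                ],
                "url": "https://example.com/webhooks/get_weather",
                "api_key": "EXTERNAL_TOOL_API_KEY",  # sent via POST request headers
            }
        ]
    }
}
```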