# MCP Server Integration
## Overview

UNITH digital humans support two powerful integration methods to extend conversational capabilities: [custom tools](https://docs.unith.ai/tools-usage-guide) and MCP (Model Context Protocol) servers. These integrations enable your digital human to interact with external services, trigger workflows, and perform complex operations beyond standard conversation. The integration method you choose depends on your digital human's operation mode and your specific use-case requirements.

## Understanding Custom Tools vs. MCP Servers

### Custom Tools

Workflow automation endpoints that connect your digital human to external services and business processes.

**Primary use cases:**

- Zapier workflow integration
- n8n automation sequences
- Custom webhook endpoints

**Examples:**

- Trigger a Zapier workflow to create support tickets in Jira
- Execute an n8n workflow to update customer data across multiple systems
- Automate email sequences through marketing platforms

**Compatibility:** Open Conversation (OC) mode only. To learn more about operation modes, please check [Create a Digital Human](https://docs.unith.ai/create-a-digital-human).

### MCP Servers

Powerful external services that handle complex, multi-step operations using the Model Context Protocol.

**Primary use cases:**

- Advanced calculators with multiple functions
- Database query systems
- Complex workflow automation
- Third-party service integrations with stateful operations

**Examples:**

- Perform multi-step mathematical calculations
- Query and analyze database records
- Execute sophisticated business-logic workflows
- Integrate with external APIs requiring context preservation

**Compatibility:** Open Conversation mode only. To learn more about operation modes, please check [Create a Digital Human](https://docs.unith.ai/create-a-digital-human).

## Important Limitations

- **Azure OpenAI:** MCP servers are currently not available when using Azure OpenAI models.
- **Provider dependency:** MCP availability depends on your LLM provider. Currently supported: OpenAI (gpt-4o-mini recommended).
- **Mutual exclusivity:** In Open Conversation mode, when MCP servers are configured, custom tools are not loaded. The system uses either MCP servers or custom tools, not both simultaneously.

## Configuration Guide

### Basic Configuration Structure

All tool and MCP server configurations are defined within the `conversationSettings` object:

```json
{
  "conversationSettings": {
    "chat_model_settings": {
      "provider": "openai",
      "llm_name": "gpt-4o-mini",
      "max_llm_tokens": 8096,
      "api_key": "your-api-key"
    },
    "tools": [
      // your custom tools go here
    ],
    "mcp_servers": [
      // your MCP servers go here
    ]
  }
}
```

### MCP Server Configuration

#### Adding an MCP Server

MCP servers are defined as JSON objects within the `mcp_servers` array.

Example configuration:

```json
{
  "mcp_servers": [
    {
      "server_label": "calculator",
      "server_url": "https://your-mcp-server.com/sse",
      "require_approval": "never",
      "allowed_tools": ["add", "subtract", "multiply", "divide"],
      "headers": {
        "Authorization": "Bearer your-token",
        "Content-Type": "application/json"
      }
    }
  ]
}
```

#### MCP Server Parameters

| Parameter | Required | Description | Example |
| --- | --- | --- | --- |
| `server_label` | ✅ | Identifier for this MCP server | `"calculator"` |
| `server_url` | ✅ | MCP server SSE endpoint | `"https://server.com/sse"` |
| `require_approval` | ❌ | When to request user permission | `"never"`, `"always"`, `"sometimes"` |
| `allowed_tools` | ❌ | Specific tools to enable from this server | `["add", "subtract"]` |
| `headers` | ❌ | Custom HTTP headers for authentication | `{"Authorization": "Bearer token"}` |

## Prompt-Based Tool Selection

**Critical concept:** [Tool](https://docs.unith.ai/tools-usage-guide) and MCP server selection is entirely prompt-based. The AI agent decides which tools to use based on your descriptions and the user's request, so high-quality descriptions are essential for effective tool usage.

For MCP servers, the system automatically retrieves tool information from the MCP server, but the agent still relies on the tool descriptions provided by the server for selection decisions:

- Ensure your MCP server provides clear, descriptive tool schemas
- Test tool selection with various user queries
- Monitor which tools are being called and refine descriptions as needed
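The required/optional split in the MCP server parameters lends itself to a quick client-side pre-flight check before you send a configuration. The sketch below is a hypothetical helper, not part of the UNITH API; it assumes the snake_case field names used in this guide.

```python
# Hypothetical pre-flight lint for an mcp_servers entry. Field names
# (server_label, server_url, require_approval) follow this guide; the
# UNITH platform may validate differently.
REQUIRED = ("server_label", "server_url")
APPROVAL_VALUES = {"never", "always", "sometimes"}

def lint_mcp_server(entry: dict) -> list[str]:
    """Return a list of human-readable problems; an empty list means OK."""
    problems = []
    for field in REQUIRED:
        if not entry.get(field):
            problems.append(f"missing required field: {field}")
    if entry.get("require_approval", "never") not in APPROVAL_VALUES:
        problems.append(f"require_approval must be one of {sorted(APPROVAL_VALUES)}")
    url = entry.get("server_url", "")
    if url and not url.startswith("https://"):
        problems.append("server_url should be an HTTPS SSE endpoint")
    return problems

# A well-formed entry produces no problems; a partial one names what is missing.
entry = {
    "server_label": "calculator",
    "server_url": "https://your-mcp-server.com/sse",
    "require_approval": "never",
    "allowed_tools": ["add", "subtract"],
}
assert lint_mcp_server(entry) == []
assert lint_mcp_server({"server_label": "x"}) == ["missing required field: server_url"]
```

Running a check like this before deployment catches missing required fields early, rather than discovering them through failed tool calls in conversation.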
## Complete Configuration Examples

### Open Conversation with MCP Server

```json
{
  "conversationSettings": {
    "chat_model_settings": {
      "provider": "openai",
      "llm_name": "gpt-4o-mini",
      "max_llm_tokens": 8096
    },
    "mcp_servers": [
      {
        "server_label": "business-calculator",
        "server_url": "https://mcp-example.com/calculator/sse",
        "require_approval": "never",
        "allowed_tools": ["calculate_roi", "calculate_margin", "forecast_revenue"],
        "headers": {
          "Authorization": "Bearer mcp-token-abc123",
          "Content-Type": "application/json"
        }
      }
    ]
  }
}
```

## Important Notes

- **Tool selection is prompt-based:** The AI agent decides which tools to use based solely on your descriptions and the user's query. Invest time in crafting clear, specific tool descriptions for optimal performance.
- **MCP server limitations:** MCP servers are only available in Open Conversation mode with OpenAI models.
- **Mutual exclusivity:** In Open Conversation mode, when MCP servers are configured, custom tools are not loaded. Choose either MCP servers or custom tools, not both.
- **Provider requirements:** The MCP implementation uses OpenAI's protocol and is optimized for gpt-4o-mini; other models may have varying levels of support.
- **Authentication:** Secure your tool endpoints with API keys. Pass authentication credentials via the `api_key` parameter or custom headers.
- **Performance:** Tool execution adds latency to responses. Optimize your tool endpoints for fast response times.
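The mutual-exclusivity rule can be illustrated with a short sketch. The helper below is hypothetical (not a UNITH API); it assumes the `tools` / `mcp_servers` keys from this guide and mirrors the documented behavior that configured MCP servers take precedence over custom tools.

```python
# Hypothetical illustration of the mutual-exclusivity rule: in Open
# Conversation mode, configured MCP servers win and custom tools are
# not loaded. Field names are taken from this guide.
def active_integration(conversation_settings: dict) -> str:
    """Report which integration path the agent would actually load."""
    if conversation_settings.get("mcp_servers"):
        return "mcp_servers"  # custom tools are ignored when MCP servers exist
    if conversation_settings.get("tools"):
        return "tools"
    return "none"

settings = {
    "chat_model_settings": {"provider": "openai", "llm_name": "gpt-4o-mini"},
    "tools": [{"name": "create_ticket"}],
    "mcp_servers": [{"server_label": "calculator",
                     "server_url": "https://your-mcp-server.com/sse"}],
}
# Both are configured here, but only the MCP servers would be loaded.
assert active_integration(settings) == "mcp_servers"
```

Keeping both keys populated, as above, is therefore a configuration smell: the `tools` entries are silently ignored, so remove whichever path you do not intend to use.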