AI Agent Settings

Configure your AI agent for optimal performance and user experience.

Agent Settings

Configure your AI agent's basic settings.

Visibility: Public

Agent name: 急救 AI Agent 一号 (First Aid AI Agent No. 1)

Agent ID: XYBKGaAMfk

Test AI agent

Choose the default language the agent will communicate in.

Default language: Chinese (Mandarin)

Specify additional languages which callers can choose from.

The first message the agent will say. If empty, the agent will wait for the user to start the conversation.

The system prompt is used to determine the persona of the agent and the context of the conversation.

Variables like {{ user_name }} in your prompts and first message will be replaced with actual values when the conversation starts.
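As an illustration of how this substitution behaves, here is a minimal Python sketch. The configuration keys (default_language, first_message, system_prompt) and the render helper are hypothetical stand-ins, not this platform's actual schema or API.

```python
import re

# Hypothetical agent configuration; key names are illustrative only.
agent_config = {
    "default_language": "Chinese (Mandarin)",
    "additional_languages": ["English"],
    "first_message": "Hello {{ user_name }}, how can I help you today?",
    "system_prompt": "You are a calm first-aid assistant talking to {{ user_name }}.",
}

def render(template: str, variables: dict) -> str:
    """Replace {{ var }} placeholders with runtime values before the conversation starts."""
    return re.sub(
        r"\{\{\s*(\w+)\s*\}\}",
        lambda m: str(variables.get(m.group(1), m.group(0))),
        template,
    )

print(render(agent_config["first_message"], {"user_name": "Alice"}))
# Hello Alice, how can I help you today?
```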

Select which provider and model to use for the LLM.

If your chosen LLM is unavailable or a request fails, we will redirect the conversation to a fallback LLM.
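The fallback behaviour roughly corresponds to the pattern sketched below. The client function and model names are placeholders; the platform's real routing logic is not documented here.

```python
import random

def call_llm(model: str, messages: list[dict]) -> str:
    """Stand-in for a real chat-completion request; fails randomly to exercise the fallback path."""
    if random.random() < 0.3:
        raise RuntimeError(f"{model} unavailable")
    return f"[{model}] reply"

PRIMARY_MODEL = "deepseek-v3"                                # the model selected in the settings
FALLBACK_MODELS = ["fallback-model-a", "fallback-model-b"]   # hypothetical fallbacks

def generate_reply(messages: list[dict]) -> str:
    last_error: Exception | None = None
    for model in [PRIMARY_MODEL, *FALLBACK_MODELS]:
        try:
            return call_llm(model=model, messages=messages)
        except Exception as err:                             # unavailable, timeout, rate limit, ...
            last_error = err
    raise RuntimeError("All LLMs failed") from last_error

print(generate_reply([{"role": "user", "content": "Hello"}]))
```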

Currently, the LLM cost is covered by us. In the future, this cost will be passed on to you.

Selected model: DeepSeek V3

Temperature is a parameter that controls the creativity or randomness of the responses generated by the LLM.

Presets: Precise | Balanced | Creative

Configure the maximum number of tokens the LLM can generate. A limit is applied only when the value is greater than 0.
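A rough sketch of how the temperature and max-token settings typically appear in a chat-completion request. The preset-to-temperature mapping and the request shape are assumptions, not the platform's documented values.

```python
# Assumed mapping of the Precise / Balanced / Creative presets to temperature
# values; the platform's exact numbers may differ.
TEMPERATURE_PRESETS = {"precise": 0.2, "balanced": 0.7, "creative": 1.0}

request = {
    "model": "deepseek-v3",                           # illustrative model id
    "messages": [{"role": "user", "content": "..."}],
    "temperature": TEMPERATURE_PRESETS["balanced"],   # higher = more random output
    "max_tokens": 512,                                # 0 would mean no explicit limit
}
```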

Knowledge Base: Provide the LLM with domain-specific information to help it answer questions more accurately.

Use RAG

Retrieval-Augmented Generation (RAG) increases the agent's maximum Knowledge Base size. The agent will have access to relevant pieces of the attached Knowledge Base during answer generation.
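The sketch below shows the general shape of what RAG does at answer time: score the attached Knowledge Base chunks against the user's question, keep the most relevant pieces, and prepend them to the prompt. A real system would use vector embeddings; a simple word-overlap score stands in for that here, and the example chunks are invented.

```python
import re

knowledge_base = [
    "Cool a minor burn under running water for at least ten minutes.",
    "For adult CPR, push hard and fast in the centre of the chest.",
]

def tokens(text: str) -> set[str]:
    return set(re.findall(r"\w+", text.lower()))

def retrieve(question: str, chunks: list[str], top_k: int = 1) -> list[str]:
    """Return the top_k chunks sharing the most words with the question."""
    q = tokens(question)
    return sorted(chunks, key=lambda c: len(q & tokens(c)), reverse=True)[:top_k]

question = "What should I do about a burn?"
context = "\n".join(retrieve(question, knowledge_base))
prompt = f"Answer using this context:\n{context}\n\nQuestion: {question}"
print(prompt)
```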

Provide the agent with tools it can use to help users.

end_call (System tool): Gives the agent the ability to end the call with the user.
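System tools like end_call are usually exposed to the LLM through function calling. The schema and dispatcher below follow the common JSON-schema convention and are a sketch, not the platform's exact tool format.

```python
# Hypothetical tool schema in the common function-calling style.
END_CALL_TOOL = {
    "name": "end_call",
    "description": "End the call with the user once the conversation is finished.",
    "parameters": {
        "type": "object",
        "properties": {
            "reason": {"type": "string", "description": "Why the call is being ended."},
        },
        "required": [],
    },
}

def hang_up(reason: str) -> None:
    print(f"Call ended: {reason}")        # stand-in for the real telephony hang-up

def handle_tool_call(name: str, arguments: dict) -> None:
    """When the LLM decides to call end_call, terminate the call instead of speaking again."""
    if name == "end_call":
        hang_up(arguments.get("reason", "conversation complete"))

handle_tool_call("end_call", {"reason": "user said goodbye"})
```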

Create and manage secure secrets that can be accessed across your workspace.
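Secrets are meant to be referenced by name rather than pasted into the configuration. The sketch below uses an environment variable as a stand-in for the workspace secret store; the secret name CRM_API_KEY is hypothetical.

```python
import os

def get_secret(name: str) -> str:
    """Stand-in for the workspace secret store; reads from the environment here."""
    value = os.environ.get(name)
    if value is None:
        raise KeyError(f"Secret {name!r} is not configured")
    return value

# Example: authenticating a hypothetical webhook tool without hard-coding the key.
webhook_headers = {"Authorization": f"Bearer {get_secret('CRM_API_KEY')}"}
```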