type (string)
The type of AI agent to be used for metadata extraction. Value is always ai_agent_extract.
Example: ai_agent_extract

access_state (string)
The state of the AI Agent capability. Possible values are enabled and disabled.
Example: enabled

basic_image (object)
AI agent processor used to handle images.

  is_custom_instructions_included (boolean)
  True if the system message contains the custom instructions placeholder; false otherwise.
  Example: false

  llm_endpoint_params (object)
  The parameters for the LLM endpoint specific to a model. One of AI LLM endpoint params AWS, AI LLM endpoint params Google, AI LLM endpoint params IBM, or AI LLM endpoint params OpenAI.

  model (string)
  The model used by the AI agent to handle images. For specific model values, see the available models list.
  Example: azure__openai__gpt_4o_mini

  num_tokens_for_completion (integer)
  The number of tokens for completion.
  Example: 8400

  prompt_template (string)
  The prompt template contains contextual information of the request and the user prompt. When passing prompt_template parameters, you must include inputs for {user_question} and {content}. {current_date} is optional, depending on the use.
  Pattern: (\{user_question\}[\s\S]*?\{content\}|\{content\}[\s\S]*?\{user_question\})

  system_message (string)
  System messages help the LLM understand its role and what it is supposed to do.
  Example: You are a helpful travel assistant specialized in budget travel

  warnings (array of strings)
  Warnings concerning the tool.
  Example: ["MODEL_INACTIVE"]
basic_text (object)
AI agent processor used to handle basic text.

  is_custom_instructions_included (boolean)
  True if the system message contains the custom instructions placeholder; false otherwise.
  Example: false

  llm_endpoint_params (object)
  The parameters for the LLM endpoint specific to a model. One of AI LLM endpoint params AWS, AI LLM endpoint params Google, AI LLM endpoint params IBM, or AI LLM endpoint params OpenAI.

  model (string)
  The model used by the AI agent for basic text. For specific model values, see the available models list.
  Example: azure__openai__gpt_4o_mini

  num_tokens_for_completion (integer)
  The number of tokens for completion.
  Example: 8400

  prompt_template (string)
  The prompt template contains contextual information of the request and the user prompt. When passing prompt_template parameters, you must include inputs for {user_question} and {content}. {current_date} is optional, depending on the use.
  Pattern: (\{user_question\}[\s\S]*?\{content\}|\{content\}[\s\S]*?\{user_question\})

  system_message (string)
  System messages help the LLM understand its role and what it is supposed to do.
  Example: You are a helpful travel assistant specialized in budget travel

  warnings (array of strings)
  Warnings concerning the tool.
  Example: ["MODEL_INACTIVE"]
custom_instructions (string)
Custom instructions for the AI agent.
Example: This is a custom instruction

description (string)
The description of the AI agent.
Example: This is ASK Agent

long_text (object)
AI agent processor used to handle longer text.
  embeddings (object)
  The embeddings used by the AI agent.

    model (string)
    The model used for the AI agent for calculating embeddings.
    Example: azure__openai__text_embedding_ada_002

    strategy (object)
    The strategy used for calculating embeddings (see the id and num_tokens_per_chunk values in the example below).

  is_custom_instructions_included (boolean)
  True if the system message contains the custom instructions placeholder; false otherwise.
  Example: false

  llm_endpoint_params (object)
  The parameters for the LLM endpoint specific to a model. One of AI LLM endpoint params AWS, AI LLM endpoint params Google, AI LLM endpoint params IBM, or AI LLM endpoint params OpenAI.

  model (string)
  The model used by the AI agent for longer text. For specific model values, see the available models list.
  Example: azure__openai__gpt_4o_mini

  num_tokens_for_completion (integer)
  The number of tokens for completion.
  Example: 8400

  prompt_template (string)
  The prompt template contains contextual information of the request and the user prompt. When passing prompt_template parameters, you must include inputs for {user_question} and {content}. {current_date} is optional, depending on the use.
  Pattern: (\{user_question\}[\s\S]*?\{content\}|\{content\}[\s\S]*?\{user_question\})

  system_message (string)
  System messages help the LLM understand its role and what it is supposed to do.
  Example: You are a helpful travel assistant specialized in budget travel

  warnings (array of strings)
  Warnings concerning the tool.
  Example: ["MODEL_INACTIVE"]
Example ai_agent_extract configuration:
{
"type": "ai_agent_extract",
"access_state": "enabled",
"basic_image": {
"is_custom_instructions_included": false,
"llm_endpoint_params": {
"type": "openai_params",
"frequency_penalty": 1.5,
"presence_penalty": 1.5,
"stop": "<|im_end|>",
"temperature": 0,
"top_p": 1
},
"model": "azure__openai__gpt_4o_mini",
"num_tokens_for_completion": 8400,
"prompt_template": "It is `{current_date}`, consider these travel options `{content}` and answer the `{user_question}`.",
"system_message": "You are a helpful travel assistant specialized in budget travel",
"warnings": [
"MODEL_INACTIVE"
]
},
"basic_text": {
"is_custom_instructions_included": false,
"llm_endpoint_params": {
"type": "openai_params",
"frequency_penalty": 1.5,
"presence_penalty": 1.5,
"stop": "<|im_end|>",
"temperature": 0,
"top_p": 1
},
"model": "azure__openai__gpt_4o_mini",
"num_tokens_for_completion": 8400,
"prompt_template": "It is `{current_date}`, consider these travel options `{content}` and answer the `{user_question}`.",
"system_message": "You are a helpful travel assistant specialized in budget travel",
"warnings": [
"MODEL_INACTIVE"
]
},
"custom_instructions": "This is a custom instruction",
"description": "This is ASK Agent",
"long_text": {
"embeddings": {
"model": "azure__openai__text_embedding_ada_002",
"strategy": {
"id": "basic",
"num_tokens_per_chunk": 64
}
},
"is_custom_instructions_included": false,
"llm_endpoint_params": {
"type": "openai_params",
"frequency_penalty": 1.5,
"presence_penalty": 1.5,
"stop": "<|im_end|>",
"temperature": 0,
"top_p": 1
},
"model": "azure__openai__gpt_4o_mini",
"num_tokens_for_completion": 8400,
"prompt_template": "It is `{current_date}`, consider these travel options `{content}` and answer the `{user_question}`.",
"system_message": "You are a helpful travel assistant specialized in budget travel",
"warnings": [
"MODEL_INACTIVE"
]
}
}
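
A configuration like the one above is typically supplied as an ai_agent override on an AI request rather than stored on its own. The sketch below assumes the object is passed as the ai_agent field of a Box AI extract call; the endpoint path, request fields, and token handling are assumptions made for illustration and should be verified against the Box API reference.

import requests

# Hypothetical sketch: placeholder token and file ID, assumed endpoint.
ACCESS_TOKEN = "..."   # OAuth access token or developer token (placeholder)
FILE_ID = "12345"      # placeholder file ID

# A partial override; fields omitted here would fall back to agent defaults.
ai_agent = {
    "type": "ai_agent_extract",
    "basic_text": {
        "model": "azure__openai__gpt_4o_mini",
        "num_tokens_for_completion": 8400,
    },
}

response = requests.post(
    "https://api.box.com/2.0/ai/extract",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={
        "prompt": "Extract the travel dates and destinations mentioned in this document.",
        "items": [{"id": FILE_ID, "type": "file"}],
        "ai_agent": ai_agent,
    },
)
response.raise_for_status()
print(response.json())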
