Sends an AI request to supported LLMs and returns an answer specifically focused on the user's question given the provided context.
The AI agent used to handle queries.
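As a point of reference only (this is not the complete set of agent options), the override can look like the TypeScript sketch below; the field names and model values are illustrative and simply mirror the curl example further down.

// Illustrative shape of an ai_agent override; values mirror the curl example below.
const aiAgent = {
  type: 'ai_agent_ask',
  long_text: {
    model: 'azure__openai__gpt_4o_mini',
  },
  basic_text: {
    model: 'azure__openai__gpt_4o_mini',
  },
};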
The history of prompts and answers previously passed to the LLM. This provides additional context to the LLM in generating the response.
"Here is the first draft of your professional email about public APIs."
The answer previously provided by the LLM.
"2012-12-12T10:53:43-08:00"
The ISO 8601 formatted timestamp of when the previous answer to the prompt was created.
"Make my email about public APIs sound more professional."
The prompt previously provided by the client and answered by the LLM.
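As a rough illustration (the type and helper names here are hypothetical, not part of any Box SDK), a client can keep multi-turn context by appending each completed exchange to the array it later sends as dialogue_history:

// Hypothetical local types mirroring the dialogue_history fields described above.
interface DialogueHistoryEntry {
  prompt: string;     // the question previously sent to the LLM
  answer: string;     // the answer the LLM returned for that question
  created_at: string; // ISO 8601 timestamp of when that answer was created
}

// Append the latest exchange so a follow-up request carries its context.
function appendTurn(
  history: DialogueHistoryEntry[],
  prompt: string,
  answer: string,
): DialogueHistoryEntry[] {
  return [...history, { prompt, answer, created_at: new Date().toISOString() }];
}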
true
A flag to indicate whether citations should be returned.
The items to be processed by the LLM, often files.
Note: Box AI handles documents with text representations up to 1MB in size, or a maximum of 25 files, whichever comes first.
If the file size exceeds 1MB, the first 1MB of text representation will be processed.
If you set the mode parameter to single_item_qa, the items array can have one element only.
"123"
The ID of the file.
"file"
The type of the item. Currently the value can only be file.
Value is always file
"This is file content."
The content of the item, often the text representation.
"multiple_item_qa"
The mode specifies whether this request is for a single item or multiple items. If you select single_item_qa, the items array can contain only one element. Selecting multiple_item_qa allows you to provide up to 25 items.
Value is one of multiple_item_qa, single_item_qa
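The mode and items constraints above are easy to trip over, so a client may want to validate the payload before sending it. The sketch below is a hypothetical TypeScript guard based only on the limits stated above (one item for single_item_qa, up to 25 for multiple_item_qa); it is not part of any SDK.

type AiAskMode = 'single_item_qa' | 'multiple_item_qa';

interface AiAskItem {
  id: string;
  type: 'file';
  content?: string; // optional text to use instead of the stored representation
}

// Hypothetical client-side check mirroring the documented limits.
function validateItems(mode: AiAskMode, items: AiAskItem[]): void {
  if (mode === 'single_item_qa' && items.length !== 1) {
    throw new Error('single_item_qa requires exactly one item');
  }
  if (mode === 'multiple_item_qa' && (items.length < 1 || items.length > 25)) {
    throw new Error('multiple_item_qa accepts between 1 and 25 items');
  }
}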
"What is the value provided by public APIs based on this document?"
The prompt provided by the client to be answered by the LLM. The prompt's length is limited to 10,000 characters.
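Because the prompt is limited to 10,000 characters, a client may want to check or trim it before sending the request. The helper below is an illustrative TypeScript sketch, not an SDK function.

const MAX_PROMPT_LENGTH = 10_000; // documented limit for the prompt field

// Hypothetical helper: trim an over-long prompt rather than sending an invalid request.
function clampPrompt(prompt: string): string {
  return prompt.length > MAX_PROMPT_LENGTH
    ? prompt.slice(0, MAX_PROMPT_LENGTH)
    : prompt;
}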
A successful response including the answer from the LLM.
An unexpected server error.
An unexpected error.
curl -i -X POST "https://api.box.com/2.0/ai/ask" \
     -H "content-type: application/json" \
     -H "authorization: Bearer <ACCESS_TOKEN>" \
     -d '{
       "mode": "single_item_qa",
       "prompt": "What is the value provided by public APIs based on this document?",
       "items": [
         {
           "type": "file",
           "id": "9842787262"
         }
       ],
       "dialogue_history": [
         {
           "prompt": "Make my email about public APIs sound more professional",
           "answer": "Here is the first draft of your professional email about public APIs",
           "created_at": "2013-12-12T10:53:43-08:00"
         }
       ],
       "include_citations": true,
       "ai_agent": {
         "type": "ai_agent_ask",
         "long_text": {
           "model": "azure__openai__gpt_4o_mini",
           "prompt_template": "It is `{current_date}`, and I have $8000 and want to spend a week in the Azores. What should I see?"
         },
         "basic_text": {
           "model": "azure__openai__gpt_4o_mini"
         }
       }
     }'
await client.ai.createAiAsk({
mode: 'multiple_item_qa' as AiAskModeField,
prompt: 'Which direction sun rises?',
items: [
new AiItemBase({
id: fileToAsk1.id,
type: 'file' as AiItemBaseTypeField,
content: 'Earth goes around the sun',
}),
new AiItemBase({
id: fileToAsk2.id,
type: 'file' as AiItemBaseTypeField,
content: 'Sun rises in the East in the morning',
}),
],
} satisfies AiAsk);
client.ai.create_ai_ask(
CreateAiAskMode.MULTIPLE_ITEM_QA.value,
"Which direction sun rises?",
[
AiItemBase(
id=file_to_ask_1.id,
type=AiItemBaseTypeField.FILE.value,
content="Earth goes around the sun",
),
AiItemBase(
id=file_to_ask_2.id,
type=AiItemBaseTypeField.FILE.value,
content="Sun rises in the East in the morning",
),
],
)
await client.Ai.CreateAiAskAsync(requestBody: new AiAsk(
    mode: AiAskModeField.MultipleItemQa,
    prompt: "Which direction sun rises?",
    items: Array.AsReadOnly(new []
    {
        new AiItemBase(id: fileToAsk1.Id, type: AiItemBaseTypeField.File) { Content = "Earth goes around the sun" },
        new AiItemBase(id: fileToAsk2.Id, type: AiItemBaseTypeField.File) { Content = "Sun rises in the East in the morning" }
    })));
try await client.ai.createAiAsk(requestBody: AiAsk(
    mode: AiAskModeField.multipleItemQa,
    prompt: "Which direction sun rises?",
    items: [
        AiItemBase(id: fileToAsk1.id, type: AiItemBaseTypeField.file, content: "Earth goes around the sun"),
        AiItemBase(id: fileToAsk2.id, type: AiItemBaseTypeField.file, content: "Sun rises in the East in the morning")
    ]
))
BoxAIResponse response = BoxAI.sendAIRequest(
    api,
    "What is the content of the file?",
    Collections.singletonList(new BoxAIItem("123456", BoxAIItem.Type.FILE)),
    BoxAI.Mode.SINGLE_ITEM_QA
);
items = [{
    "id": "1582915952443",
    "type": "file",
    "content": "More information about public APIs"
}]
ai_agent = {
    "type": "ai_agent_ask",
    "basic_text_multi": {
        "model": "openai__gpt_3_5_turbo"
    }
}
answer = client.send_ai_question(
    items=items,
    prompt="What is this file?",
    mode="single_item_qa",
    ai_agent=ai_agent
)
print(answer)
BoxAIResponse response = await client.BoxAIManager.SendAIQuestionAsync(
    new BoxAIAskRequest
    {
        Prompt = "What is the name of the file?",
        Items = new List<BoxAIAskItem>() { new BoxAIAskItem() { Id = "12345" } },
        Mode = AiAskMode.single_item_qa
    }
);
client.ai.ask(
{
prompt: 'What is the capital of France?',
items: [
{
type: 'file',
id: '12345'
}
],
mode: 'single_item_qa'
})
.then(response => {
/* response -> {
"answer": "Paris",
"created_at": "2021-10-01T00:00:00Z",
"completion_reason": "done"
} */
});
{
"answer": "Public APIs are important because of key and important reasons.",
"citations": [
{
"content": "Public APIs are key drivers of innovation and growth.",
"id": "123",
"type": "file",
"name": "The importance of public APIs.pdf"
}
],
"completion_reason": "done",
"created_at": "2012-12-12T10:53:43-08:00"
}
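When include_citations is set to true, the response includes the citations array shown above alongside the answer. The TypeScript sketch below shows one way to consume such a response; the interface names are illustrative, and the fields listed are only those that appear in the sample response.

// Illustrative types for the sample response shown above.
interface AiCitation {
  content: string; // the passage the answer drew on
  id: string;      // ID of the cited item
  type: string;    // currently "file"
  name: string;    // name of the cited file
}

interface AiAskResponse {
  answer: string;
  citations?: AiCitation[]; // present when include_citations was true
  completion_reason: string;
  created_at: string;
}

// Print the answer followed by the sources it cites.
function printAnswer(response: AiAskResponse): void {
  console.log(response.answer);
  for (const citation of response.citations ?? []) {
    console.log(`- ${citation.name} (${citation.id}): ${citation.content}`);
  }
}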