AWS Claude Haiku 4.5
The AWS Claude Haiku 4.5 model is optimized for high-volume, low-latency applications. It provides strong coding performance with improved speed and cost efficiency compared to larger models.
Model details
| Item | Value | Description |
|---|---|---|
| Model name | AWS Claude Haiku 4.5 | The name of the model. |
| Model category | Standard | The category of the model: Standard or Premium. |
| API model name | aws__claude_4_5_haiku | The name of the model that is used in the Box AI API for model overrides. This exact name must be supplied for the override to work; see the example after this table. |
| Hosting layer | Amazon Web Services (AWS) | The trusted organization that securely hosts the LLM. |
| Model provider | Anthropic | The organization that provides this model. |
| Release date | October 15, 2025 | The release date for the model. |
| Knowledge cutoff date | February 2025 | The date after which the model does not get any information updates. |
| Input context window | 200k tokens | The number of tokens supported by the input context window. |
| Maximum output tokens | 64k tokens | The number of tokens that can be generated by the model in a single request. |
| Empirical throughput | Not specified | The number of tokens the model can generate per second. |
| Open source | No | Specifies if the model's code is available for public use. |
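
As an illustration of how the API model name above might be used, the sketch below sends a Box AI question about a single file while overriding the default model with AWS Claude Haiku 4.5. It assumes the `POST /2.0/ai/ask` endpoint and the `ai_agent` override shape (`basic_text.model`), plus placeholder values for the access token and file ID; consult the Box AI API reference for the authoritative request schema.

```python
import requests

# Placeholder credentials and file ID: replace with real values.
ACCESS_TOKEN = "YOUR_DEVELOPER_TOKEN"
FILE_ID = "1234567890"

# Request body that overrides the default model using the exact
# API model name from the table above (aws__claude_4_5_haiku).
payload = {
    "mode": "single_item_qa",
    "prompt": "Summarize this document in three bullet points.",
    "items": [{"id": FILE_ID, "type": "file"}],
    "ai_agent": {
        "type": "ai_agent_ask",
        "basic_text": {"model": "aws__claude_4_5_haiku"},
    },
}

response = requests.post(
    "https://api.box.com/2.0/ai/ask",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json=payload,
)
response.raise_for_status()

# The response includes the generated answer text.
print(response.json().get("answer"))
```

The same model name can be used in other Box AI agent override fields (for example, long-text or text-generation agents) following the structure documented in the Box AI API reference.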
Additional documentation
For additional information, see the official AWS Claude Haiku 4.5 documentation.