A large catalog of hosted LLMs, compatible with the OpenAI API

Mistral: Ministral 8B

Input: $0.0004/1k tokens
Output: $0.0004/1k tokens
Model ID: mistralai/ministral-8b
Context length: 128,000 · Modality: text->text · Tokenizer: Mistral · Updated: 2024-10-17
Ministral 8B is an 8B parameter model featuring a unique interleaved sliding-window attention pattern for faster, memory-efficient inference. Designed for edge use cases, it supports up to 128k context length and excels in knowledge and reasoning tasks. It outperforms peers in the sub-10B category, making it perfect for low-latency, privacy-first applications.

Model Parameters

Architecture

Modality: text->text
Tokenizer: Mistral

Limits

Context length: 128,000
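Since the catalog exposes an OpenAI-compatible API, a chat request to this model can be built with the standard chat-completions payload shape. The sketch below constructs such a payload for `mistralai/ministral-8b`; the base URL and API key are placeholders (the page does not state the actual endpoint), so substitute your provider's values before sending.

```python
import json

# Placeholders: the real endpoint and key come from your provider,
# not from this model page.
BASE_URL = "https://api.example.com/v1"
API_KEY = "YOUR_API_KEY"

def build_chat_request(prompt: str, max_tokens: int = 256) -> dict:
    """Build an OpenAI-compatible chat-completion payload for Ministral 8B."""
    return {
        "model": "mistralai/ministral-8b",  # model ID from the listing above
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

payload = build_chat_request("Explain interleaved sliding-window attention briefly.")
print(json.dumps(payload, ensure_ascii=False, indent=2))
```

This payload would be POSTed to `{BASE_URL}/chat/completions` with an `Authorization: Bearer {API_KEY}` header, exactly as with the OpenAI API; only the base URL and model ID differ.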