A vast catalog of online large models, compatible with the OpenAI API

Databricks: DBRX 132B Instruct

Prompt: $0.0043/1k tokens
Completion: $0.0043/1k tokens
databricks/dbrx-instruct
Context length: 32,768 · Modality: text->text · Tokenizer: Other · Updated: 2024-03-29
DBRX is a new open source large language model developed by Databricks. At 132B total parameters, it outperforms existing open source LLMs such as Llama 2 70B and Mixtral-8x7B on standard industry benchmarks for language understanding, programming, math, and logic. It uses a fine-grained mixture-of-experts (MoE) architecture, with 36B parameters active on any given input, and was pre-trained on 12T tokens of text and code data. Compared to other open MoE models like Mixtral-8x7B and Grok-1, DBRX is fine-grained, meaning it uses a larger number of smaller experts. See the launch announcement and benchmark results here.

Tags: moe
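
Because the service advertises an OpenAI-compatible API, the model ID above can be used directly with the standard OpenAI Python SDK. The sketch below shows a minimal chat-completion call; the base_url, the PROVIDER_API_KEY environment variable, and the example prompt are placeholders for illustration, not values taken from this page.

```python
# Minimal sketch: calling databricks/dbrx-instruct through an OpenAI-compatible endpoint.
# base_url and the PROVIDER_API_KEY environment variable are placeholders for illustration.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://api.example.com/v1",   # placeholder: the provider's OpenAI-compatible endpoint
    api_key=os.environ["PROVIDER_API_KEY"],  # placeholder: your API key
)

response = client.chat.completions.create(
    model="databricks/dbrx-instruct",        # model ID listed above
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Explain what a mixture-of-experts model is in two sentences."},
    ],
    max_tokens=512,
)

print(response.choices[0].message.content)
```

The chat-completions endpoint is expected to apply the model's ChatML instruct format (listed under Architecture below) on the server side, so no manual prompt templating should be needed.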

Model Parameters

Architecture

Modality: text->text
Tokenizer: Other
Instruct type: chatml

Limits

Context length: 32,768
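
As a rough illustration of how the limits and pricing listed above combine, the sketch below estimates the cost of a single request, assuming the listed $0.0043/1k rate applies to both prompt and completion tokens; the example token counts are hypothetical.

```python
# Rough cost estimate based on the listed rate ($0.0043 per 1k tokens) and a sanity check
# against the 32,768-token context window. The example token counts are hypothetical.
PRICE_PER_1K_TOKENS = 0.0043
CONTEXT_LENGTH = 32_768

def estimate_cost(prompt_tokens: int, completion_tokens: int) -> float:
    """Return the estimated USD cost of one request."""
    total = prompt_tokens + completion_tokens
    if total > CONTEXT_LENGTH:
        raise ValueError(f"{total} tokens exceeds the 32,768-token context window")
    return total / 1000 * PRICE_PER_1K_TOKENS

print(f"Estimated cost: ${estimate_cost(2_000, 500):.4f}")  # 2,500 tokens ≈ $0.011
```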