Massive Online LLMs · OpenAI API Compatible

Mistral: Mixtral 8x7B (base)

Input: $0.0022/1K tokens
Output: $0.0022/1K tokens
mistralai/mixtral-8x7b
Context length: 32,768 | Modality: text->text | Tokenizer: Mistral | Updated: 2023-12-10
Mixtral 8x7B is a pretrained generative Sparse Mixture-of-Experts model by Mistral AI. It incorporates 8 experts (feed-forward networks) per layer for a total of 47B parameters. This is the base model, not fine-tuned for instructions; see Mixtral 8x7B Instruct for an instruct-tuned variant. Tag: moe
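The parameter count hints at how sparse MoE differs from naive replication: only the feed-forward blocks are duplicated per expert while the attention layers are shared, which is why the total is roughly 47B rather than 8 × 7B = 56B. The sketch below is a minimal illustration of the core routing idea, not Mixtral's actual implementation; the module sizes are made up, and the top-2-of-8 routing follows Mistral AI's public description. A router scores each token, picks 2 of the 8 expert FFNs, and mixes their outputs, so only a fraction of the parameters is active per forward pass.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoELayer(nn.Module):
    """Minimal sparse Mixture-of-Experts feed-forward layer.

    Hypothetical dimensions for illustration only -- not Mixtral's real sizes.
    """
    def __init__(self, d_model=512, d_ff=2048, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        # 8 independent feed-forward "experts"
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.SiLU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )
        # Router scores each token against each expert
        self.router = nn.Linear(d_model, n_experts)

    def forward(self, x):  # x: (batch, seq, d_model)
        logits = self.router(x)                         # (B, S, n_experts)
        weights, idx = logits.topk(self.top_k, dim=-1)  # choose top-2 experts per token
        weights = F.softmax(weights, dim=-1)            # normalize over the chosen experts
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[..., k] == e                 # tokens routed to expert e in slot k
                if mask.any():
                    out[mask] += weights[..., k][mask].unsqueeze(-1) * expert(x[mask])
        return out

moe = SparseMoELayer()
tokens = torch.randn(2, 16, 512)
print(moe(tokens).shape)  # torch.Size([2, 16, 512])
```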

Model Parameters

Architecture

Modality: text->text
Tokenizer: Mistral
Instruct type: none

Limits

Context length: 32,768
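Since the service advertises OpenAI API compatibility, the model can be called with any OpenAI client pointed at the provider's endpoint. The sketch below assumes the official `openai` Python SDK (v1+); the base URL and API key are placeholders, not values from this page. Because this is the base (non-instruct) model, plain text completion tends to fit better than chat-style prompting. For cost estimation: at $0.0022 per 1K tokens, a call consuming 1,000 prompt tokens and generating 500 tokens would run about (1,000 + 500) / 1,000 × $0.0022 ≈ $0.0033.

```python
from openai import OpenAI

# Placeholder endpoint and key -- substitute your provider's actual values.
client = OpenAI(
    base_url="https://example-provider.com/v1",  # hypothetical OpenAI-compatible endpoint
    api_key="YOUR_API_KEY",
)

# Base (non-instruct) model: plain text completion is the natural interface.
response = client.completions.create(
    model="mistralai/mixtral-8x7b",
    prompt="The Mixture of Experts architecture works by",
    max_tokens=128,
)
print(response.choices[0].text)
```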