Mistral: Mixtral 8x7B (base)
mistralai/mixtral-8x7b
Context length: 32,768
text->text
Mistral
Updated 2023-12-10
Mixtral 8x7B is a pretrained generative Sparse Mixture-of-Experts model by Mistral AI. Each layer incorporates 8 experts (feed-forward networks), for a total of 47B parameters; a router activates only 2 of the 8 experts per token. This is the base model (not fine-tuned for instruction following); see Mixtral 8x7B Instruct for an instruct-tuned variant.

Tags: moe
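
To make the routing described above concrete, here is a minimal sketch of a sparse MoE layer in PyTorch: a learned router scores the 8 expert feed-forward networks and runs only the top 2 per token. The dimensions, class names, and SiLU activation are illustrative choices, not Mixtral's actual configuration.

```python
import torch
import torch.nn.functional as F

class SparseMoELayer(torch.nn.Module):
    """Illustrative sparse MoE layer: a router picks the top-k of
    num_experts feed-forward networks per token; only those run."""

    def __init__(self, d_model=512, d_ff=2048, num_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.router = torch.nn.Linear(d_model, num_experts, bias=False)
        self.experts = torch.nn.ModuleList([
            torch.nn.Sequential(
                torch.nn.Linear(d_model, d_ff),
                torch.nn.SiLU(),
                torch.nn.Linear(d_ff, d_model),
            )
            for _ in range(num_experts)
        ])

    def forward(self, x):  # x: (tokens, d_model)
        # Score all experts, keep only the top-k per token,
        # and renormalize their weights with a softmax.
        logits = self.router(x)
        weights, indices = logits.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)

        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            # Find the tokens routed to expert e and run only those.
            token_idx, slot = (indices == e).nonzero(as_tuple=True)
            if token_idx.numel() == 0:
                continue  # expert unused for this batch
            out[token_idx] += weights[token_idx, slot].unsqueeze(-1) * expert(x[token_idx])
        return out

# Example: 4 tokens flow through the layer; each uses only 2 of 8 experts.
layer = SparseMoELayer()
tokens = torch.randn(4, 512)
print(layer(tokens).shape)  # torch.Size([4, 512])
```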
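
For completion-style inference against the model ID above, a sketch assuming an OpenAI-compatible endpoint; the base URL, API-key variable, and prompt are assumptions, not part of this card. Since this is a base model, it takes a raw prompt rather than a chat template.

```python
# Sketch: querying the base model through an assumed OpenAI-compatible
# endpoint. The base URL and env var below are assumptions.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",  # assumed endpoint
    api_key=os.environ["OPENROUTER_API_KEY"],  # assumed env var
)

# Base (non-instruct) model: send a raw prompt via the completions API.
resp = client.completions.create(
    model="mistralai/mixtral-8x7b",
    prompt="The Mixtral architecture routes each token to",
    max_tokens=32,
)
print(resp.choices[0].text)
```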