A large catalog of online LLMs, compatible with the OpenAI API

Microsoft: Phi-3.5 Mini 128K Instruct

Pricing: $0.0004/1k tokens (prompt) · $0.0004/1k tokens (completion)
microsoft/phi-3.5-mini-128k-instruct
Context length: 128,000 · Modality: text->text · Tokenizer: Other · Updated: 2024-08-21
Phi-3.5 models are lightweight, state-of-the-art open models. They were trained with the Phi-3 datasets, which include both synthetic data and filtered, publicly available website data, with a focus on high-quality, reasoning-dense properties. Phi-3.5 Mini has 3.8B parameters and is a dense decoder-only transformer using the same tokenizer as Phi-3 Mini. The models underwent a rigorous enhancement process, incorporating supervised fine-tuning, proximal policy optimization, and direct preference optimization to ensure precise instruction adherence and robust safety measures. When assessed against benchmarks covering common sense, language understanding, math, code, long context, and logical reasoning, Phi-3.5 models showed robust, state-of-the-art performance among models with fewer than 13 billion parameters.
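Because the service exposes an OpenAI-compatible API, this model can be called with any OpenAI-style client. Below is a minimal sketch using the official openai Python SDK; the base URL and API key are placeholders (assumptions), not values taken from this page.

# Minimal sketch: chat completion against an OpenAI-compatible endpoint.
# base_url and api_key are placeholders; substitute the provider's actual values.
from openai import OpenAI

client = OpenAI(
    base_url="https://example.com/v1",  # placeholder endpoint (assumption)
    api_key="YOUR_API_KEY",             # placeholder key (assumption)
)

response = client.chat.completions.create(
    model="microsoft/phi-3.5-mini-128k-instruct",
    messages=[
        {"role": "user", "content": "Summarize the key properties of Phi-3.5 Mini."},
    ],
)
print(response.choices[0].message.content)

The same request works with any client that speaks the OpenAI chat-completions protocol; only the model identifier above comes from this listing.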

Model Parameters

Architecture

Modality: text->text
Tokenizer: Other
Instruct type: phi3

Limits

Context length: 128,000