ai21/jamba-1-5-mini
Context length: 256,000
text->text
Other
Updated 2024-08-23
Jamba 1.5 Mini is the world’s first production-grade Mamba-based model, combining SSM and Transformer architectures to deliver a 256K-token context window with high efficiency. It supports nine languages and handles a range of writing and analysis tasks as well as or better than comparably sized small models. It also requires less memory and processes long inputs faster than earlier architectures. Read the AI21 announcement to learn more.
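Since the listing identifies the model by a gateway-style slug, below is a minimal sketch of calling it through an OpenAI-compatible chat completions endpoint. The base URL, environment variable name, and prompt are assumptions for illustration, not part of this listing.

```python
# Minimal sketch: querying ai21/jamba-1-5-mini via an OpenAI-compatible API.
# base_url and the API-key environment variable are assumed; adjust them to
# whatever gateway actually serves this model slug.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",   # assumed gateway endpoint
    api_key=os.environ["OPENROUTER_API_KEY"],  # assumed env var name
)

response = client.chat.completions.create(
    model="ai21/jamba-1-5-mini",
    messages=[
        {"role": "user", "content": "Summarize the key risks in this report: ..."},
    ],
    max_tokens=512,
)

print(response.choices[0].message.content)
```

The long context window means large documents can usually be passed directly in the prompt rather than chunked, subject to the gateway's own request-size limits.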