mistralai/codestral-mamba
Context length: 256,000
text->text
Mistral
Updated 2024-07-19
A 7.3B-parameter Mamba-based model designed for code and reasoning tasks.

- Linear-time inference, allowing for theoretically infinite sequence lengths
- 256k-token context window
- Optimized for quick responses, especially beneficial for code productivity
- Performs comparably to state-of-the-art transformer models on code and reasoning tasks
- Available under the Apache 2.0 license for free use, modification, and distribution
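As a rough illustration of how such a model is typically invoked, the sketch below builds a chat-completions request body for it. This assumes an OpenAI-compatible API; the endpoint URL is hypothetical, and the exact schema and authentication depend on your provider.

```python
import json

# Hypothetical endpoint -- replace with your provider's actual URL.
API_URL = "https://api.example.com/v1/chat/completions"

def build_request(prompt: str, max_tokens: int = 512) -> dict:
    """Build a single-turn request body for the codestral-mamba model."""
    return {
        "model": "mistralai/codestral-mamba",
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
        # A low temperature suits deterministic code-generation tasks.
        "temperature": 0.2,
    }

payload = build_request("Write a Python function that reverses a string.")
print(json.dumps(payload, indent=2))
```

Only the payload is constructed here; sending it (e.g. with `requests.post`) additionally requires an API key for your provider.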