Example notebooks for the Mixtral-8x7B models on Databricks

mistralai/Mixtral-8x7B-v0.1 and mistralai/Mixtral-8x7B-Instruct-v0.1 are pretrained generative Sparse Mixture of Experts models.

  • It outperforms Llama 2 70B on most benchmarks tested by Mistral AI.
  • It gracefully handles a context of 32k tokens.
  • It handles English, French, Italian, German, and Spanish.
  • It shows strong performance in code generation.
  • It can be fine-tuned into an instruction-following model that achieves a score of 8.3 on MT-Bench (a minimal usage sketch follows this list).
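
As a minimal usage sketch (not taken from the example notebooks), the snippet below loads mistralai/Mixtral-8x7B-Instruct-v0.1 with the Hugging Face transformers library and generates a completion using the model's chat template. It assumes transformers and accelerate are installed and that enough GPU memory is available for the full model in bfloat16; the prompt and generation parameters are illustrative only.

```python
# Minimal sketch: load Mixtral-8x7B-Instruct and generate one completion.
# Assumes transformers + accelerate are installed and sufficient GPU memory is available.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision to reduce memory footprint
    device_map="auto",           # shard layers across available GPUs (requires accelerate)
)

# The Instruct model expects the [INST] ... [/INST] chat format;
# apply_chat_template builds it from a list of messages.
messages = [{"role": "user", "content": "Write a Python function that reverses a string."}]
inputs = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)

outputs = model.generate(inputs, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```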

Mixtral 8x7B is a high-quality sparse mixture-of-experts (SMoE) model with open weights, licensed under Apache 2.0.
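
On Databricks, a model loaded this way is typically logged to MLflow before it is registered or served. The following is a hedged sketch using MLflow's transformers flavor; the artifact path, input example, and registered model name are hypothetical placeholders, not names used by the notebooks in this folder.

```python
# Sketch: log a Mixtral-8x7B-Instruct pipeline with MLflow's transformers flavor.
# Assumes mlflow and transformers are installed; all names below are placeholders.
import torch
import mlflow
from transformers import pipeline

pipe = pipeline(
    "text-generation",
    model="mistralai/Mixtral-8x7B-Instruct-v0.1",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

with mlflow.start_run():
    mlflow.transformers.log_model(
        transformers_model=pipe,
        artifact_path="mixtral-8x7b-instruct",           # hypothetical artifact path
        input_example="[INST] What is MLflow? [/INST]",  # example prompt in the Instruct format
        registered_model_name="mixtral_8x7b_instruct",   # hypothetical registry name
    )
```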