Mistral releases new AI models optimized for edge devices


French AI startup Mistral has released its first generative AI models designed to be run on edge devices, like laptops and phones.

The new family of models, which Mistral is calling “Les Ministraux,” can be used or tuned for a variety of applications, from text generation to working in conjunction with more capable models to complete tasks.

There are two Les Ministraux models available — Ministral 3B and Ministral 8B — both of which have a context window of 128,000 tokens, meaning they can ingest roughly the length of a 50-page book.

“Our most innovative customers and partners have increasingly been asking for local, privacy-first inference for critical applications such as on-device translation, internet-less smart assistants, local analytics, and autonomous robotics,” Mistral writes in a blog post. “Les Ministraux were built to provide a compute-efficient and low-latency solution for these scenarios.”

The Les Ministraux models are available for download — but only for research purposes. Mistral is requiring developers and companies interested in self-deployment to contact it for a commercial license.

Otherwise, developers can use Ministral 3B and Ministral 8B on Mistral's cloud platform. Ministral 8B costs 10 cents per million input/output tokens (~750,000 words), while Ministral 3B costs 4 cents per million input/output tokens.
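For readers weighing the hosted option, those listed prices make cost estimates simple arithmetic. The sketch below is illustrative only: the per-token rates come from the pricing above, while the model identifiers and the 5-million-token workload are hypothetical examples, not Mistral's official API names.

```python
# Back-of-the-envelope cost estimate for the hosted Ministral models,
# using the per-million-token prices quoted above (illustrative only).

PRICE_PER_MILLION_TOKENS = {
    "ministral-8b": 0.10,  # $0.10 per million input/output tokens
    "ministral-3b": 0.04,  # $0.04 per million input/output tokens
}

def estimate_cost(model: str, total_tokens: int) -> float:
    """Return the estimated cost in USD for processing total_tokens
    (input plus output combined) with the given model."""
    return PRICE_PER_MILLION_TOKENS[model] * total_tokens / 1_000_000

# Example: a hypothetical workload of 5 million tokens per day.
for model in PRICE_PER_MILLION_TOKENS:
    print(f"{model}: ${estimate_cost(model, 5_000_000):.2f} per day")
```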


