Mistral 3 is now available to run on Ollama:
|
- Ministral 3: a model designed for local & edge deployment that achieves the best cost-to-performance ratio of any open-source model.
- Mistral Large 3: a state-of-the-art open model and one of the best permissive models in the world.
Get started
ollama run ministral-3
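Beyond the CLI, the same model can be called over Ollama's local REST API. A minimal sketch of the request body for the `/api/chat` endpoint follows; the server listens on `http://localhost:11434` by default when `ollama serve` is running, and the prompt here is purely illustrative:

```python
import json

# Build the JSON body that Ollama's /api/chat endpoint expects.
# The prompt is illustrative; any chat message works.
body = {
    "model": "ministral-3",
    "messages": [{"role": "user", "content": "Why is the sky blue?"}],
    "stream": False,  # return one complete response instead of streamed chunks
}
print(json.dumps(body, indent=2))
```

POST this body to `http://localhost:11434/api/chat` (for example with `curl`) while the Ollama server is running to get a chat completion back.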
Ministral 3 is available in three parameter sizes:
ollama run ministral-3:14b
ollama run ministral-3:8b
ollama run ministral-3:3b

Ministral 3 is also available to run via Ollama's cloud:

ollama run ministral-3:14b-cloud
ollama run ministral-3:8b-cloud
ollama run ministral-3:3b-cloud
Mistral Large 3
Mistral Large 3 is available to run via Ollama's cloud:
ollama run mistral-large-3:675b-cloud
We are actively working on adding support for the reasoning variants of Ministral 3, along with local support for Mistral Large 3.
If you have any feedback, please reply directly to this email.
❤️ Ollama