Mistral, the company sometimes considered Europe’s great hope for AI, is releasing several updates to its AI assistant, Le Chat ...
Mistral’s Le Chat runs at speeds that rival those of OpenAI’s and Meta’s assistants, but it wasn’t available on mobile devices until Feb. 6.
Le Chat's Flash Answers feature runs on Cerebras Inference, which is touted as the 'fastest AI inference provider'.
As a reminder, Mistral develops its own large language models ... It also releases a number of open-weight models under the Apache 2.0 license. Mistral hopes to position itself as a credible ...
The new 24B-parameter LLM 'excels in scenarios where quick, accurate responses are critical.' In fact, the model can be run ...
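For readers curious what "open-weight" means in practice, here is a minimal sketch of loading such a model locally with Hugging Face transformers. The repository id, hardware settings, and prompt are illustrative assumptions, not details taken from Mistral's announcement.

```python
# Minimal sketch: running an open-weight Mistral model locally with Hugging Face
# transformers. The repo id below is an assumption. Check Mistral's Hugging Face
# organization for the exact name of the Small 3 release.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "mistralai/Mistral-Small-24B-Instruct-2501"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    device_map="auto",   # requires the `accelerate` package
    torch_dtype="auto",  # use the checkpoint's native precision
)

messages = [{"role": "user", "content": "Describe yourself in one sentence."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=64)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

Note that a 24B-parameter model needs roughly 48 GB of memory for the weights alone at 16-bit precision, so consumer setups typically rely on quantized variants.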
French AI startup Mistral unveils a breakthrough 24B-parameter language model that matches the performance of models three ...
Mistral’s model is called Mistral Small 3.