News
The open-source MiMo model has 7 billion parameters and outperformed OpenAI’s o1-mini and Alibaba Group Holding’s QwQ-32B-Preview, part of the Qwen series of models, in maths reasoning and ...