Update README.md

commit d3e5e7dfca (parent 47142ce962)
Author: ai-modelscope
Date:   2025-05-14 00:10:25 +08:00

2 changed files with 11 additions and 6 deletions


@@ -28,7 +28,7 @@ library_name: transformers
 |
 <a href="https://www.modelscope.cn/organization/XiaomiMiMo" target="_blank">🤖️ ModelScope</a>
 &nbsp;|
-<a href="https://github.com/XiaomiMiMo/MiMo/blob/main/MiMo-7B-Technical-Report.pdf" target="_blank">📔 Technical Report</a>
+<a href="https://arxiv.org/abs/2505.07608" target="_blank">📔 Technical Report</a>
 &nbsp;|
 <br/>
 </div>
@@ -203,7 +203,7 @@ Example script
 ```py
 from transformers import AutoModel, AutoModelForCausalLM, AutoTokenizer
-model_id = "XiaomiMiMo/MiMo-7B-Base"
+model_id = "XiaomiMiMo/MiMo-7B-RL"
 model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)
 tokenizer = AutoTokenizer.from_pretrained(model_id)
 inputs = tokenizer(["Today is"], return_tensors='pt')
@@ -221,16 +221,18 @@ print(tokenizer.decode(output.tolist()[0]))
 ## V. Citation
 ```bibtex
-@misc{xiaomi2025mimo,
-  title={MiMo: Unlocking the Reasoning Potential of Language Model From Pretraining to Posttraining},
+@misc{coreteam2025mimounlockingreasoningpotential,
+  title={MiMo: Unlocking the Reasoning Potential of Language Model -- From Pretraining to Posttraining},
   author={{Xiaomi LLM-Core Team}},
   year={2025},
-  url={https://github.com/XiaomiMiMo/MiMo},
+  eprint={2505.07608},
+  archivePrefix={arXiv},
+  primaryClass={cs.CL},
+  url={https://arxiv.org/abs/2505.07608},
 }
 ```
 ## VI. Contact
 Please contact us at [mimo@xiaomi.com](mailto:mimo@xiaomi.com) or open an issue if you have any questions.
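For reference, the README example edited in the second hunk (truncated there after the tokenizer call, with the `generate`/`decode` lines visible only as hunk context) can be sketched end to end roughly as follows. This is an illustrative sketch, not code from the repository: the `generate_continuation` helper and its `max_new_tokens` default are invented names, and only the model id and the transformers calls come from the README itself.

```python
MODEL_ID = "XiaomiMiMo/MiMo-7B-RL"  # the id this commit switches the example to


def generate_continuation(prompt: str, max_new_tokens: int = 20) -> str:
    """Continue `prompt` with the MiMo-7B-RL checkpoint (hypothetical helper)."""
    # transformers is imported lazily so that importing this module does not
    # require the dependency or trigger the (multi-GB) checkpoint download.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, trust_remote_code=True)
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    inputs = tokenizer([prompt], return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output.tolist()[0])


if __name__ == "__main__":
    # Mirrors the README's "Today is" prompt; downloads the model on first run.
    print(generate_continuation("Today is"))
```

The model load is kept inside the helper (and the call behind `__main__`) so the example can be inspected or imported without pulling the checkpoint.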