mirror of
https://www.modelscope.cn/openai-mirror/gpt-oss-120b.git
synced 2026-04-02 10:02:56 +08:00
Update README.md
@@ -13,7 +13,7 @@ tags:
 <p align="center">
   <a href="https://gpt-oss.com"><strong>Try gpt-oss</strong></a> ·
   <a href="https://cookbook.openai.com/topic/gpt-oss"><strong>Guides</strong></a> ·
-  <a href="https://openai.com/index/gpt-oss-model-card"><strong>Model card</strong></a> ·
+  <a href="https://arxiv.org/abs/2508.10925"><strong>Model card</strong></a> ·
   <a href="https://openai.com/index/introducing-gpt-oss/"><strong>OpenAI blog</strong></a>
 </p>
 
@@ -166,3 +166,17 @@ The gpt-oss models are excellent for:
 
 Both gpt-oss models can be fine-tuned for a variety of specialized use cases.
 
+This larger model `gpt-oss-120b` can be fine-tuned on a single H100 node, whereas the smaller [`gpt-oss-20b`](https://huggingface.co/openai/gpt-oss-20b) can even be fine-tuned on consumer hardware.
+
+# Citation
+```bibtex
+@misc{openai2025gptoss120bgptoss20bmodel,
+      title={gpt-oss-120b & gpt-oss-20b Model Card},
+      author={OpenAI},
+      year={2025},
+      eprint={2508.10925},
+      archivePrefix={arXiv},
+      primaryClass={cs.CL},
+      url={https://arxiv.org/abs/2508.10925},
+}
+```
|
||||