DeepSeek LLM: Scaling Open-Source Language Models with Longtermism

Xiao Bi, Deli Chen, Guanting Chen, Shanhuang Chen, Damai Dai, Chengqi Deng, Honghui Ding, Kai Dong, Qiushi Du, Zhe Fu, Huazuo Gao, Kaige Gao, Wenjun Gao, Ruiqi Ge, Kang Guan, Daya Guo, Jianzhong Guo, Guangbo Hao, Zhewen Hao, Ying He, Wenjie Hu, Panpan Huang, Erhang Li, Guowei Li, Jiashi Li, Yao Li, Y.K. Li, Wenfeng Liang, Fangyun Lin, A.X. Liu, Bo Liu, Wen Liu, Xiaodong Liu, Xin Liu, Yiyuan Liu, Haoyu Lu, Shanghao Lu, Fuli Luo, Shirong Ma, Xiaotao Nie, Tian Pei, Yishi Piao, Junjie Qiu, Hui Qu, Tongzheng Ren, Zehui Ren, Chong Ruan, Zhangli Sha, Zhihong Shao, Junxiao Song, Xuecheng Su, Jingxiang Sun, Yaofeng Sun, Minghui Tang, Bingxuan Wang, Peiyi Wang, Shiyu Wang, Yaohui Wang, Yongji Wang, Tong Wu, Y. Wu, Xin Xie, Zhenda Xie, Ziwei Xie, Yiliang Xiong, Hanwei Xu, R.X. Xu, Yanhong Xu, Dejian Yang, Yuxiang You, Shuiping Yu, Xingkai Yu, B. Zhang, Haowei Zhang, Lecong Zhang, Liyue Zhang, Mingchuan Zhang, Minghua Zhang, Wentao Zhang, Yichao Zhang, Chenggang Zhao, Yao Zhao, Shangyan Zhou, Shunfeng Zhou, Qihao Zhu, Yuheng Zou*

DeepSeek-AI

Abstract

The rapid development of open-source large language models (LLMs) has been truly remarkable. However, the scaling laws described in previous literature present varying conclusions, which casts a dark cloud over scaling LLMs. We delve into the study of scaling laws and present our distinctive findings that facilitate the scaling of large-scale models in two commonly used open-source configurations, 7B and 67B. Guided by the scaling laws, we introduce DeepSeek LLM, a project dedicated to advancing open-source language models with a long-term perspective. To support the pre-training phase, we have developed a dataset that currently consists of 2 trillion tokens and is continuously expanding. We further conduct supervised fine-tuning (SFT) and direct preference optimization (DPO) on DeepSeek LLM Base models, resulting in the creation of DeepSeek Chat models.
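For orientation, the "varying conclusions" in prior scaling-law literature concern how a fixed compute budget $C$ should be split between model scale and data scale. A standard compute-optimal formulation from earlier work (Hoffmann et al., 2022), given here only as background rather than as this report's own fit, allocates

$$N_{\mathrm{opt}} \propto C^{a}, \qquad D_{\mathrm{opt}} \propto C^{b}, \qquad a + b \approx 1,$$

where $N$ is the number of model parameters and $D$ the number of training tokens; the disagreement across studies lies in the estimated exponents $a$ and $b$, which determine whether extra compute should go mostly to larger models or to more data.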