Songxin Zhang - Data-centric LLM training.pdf

ID: 169171, PDF, 26 pages, 3.88 MB

Data-centric LLM training
Songxin Zhang, SUSTech. July 5th, 2024. (1/26)

Limit of Data Scaling (2/26)

Intelligence emerging with data scaling (3/26)

BLOOM, June 2022. ROOTS is the dataset used by Hugging Face to train BLOOM (176B parameters). It compiles 0.34T tokens from 498 different data sources, including classic datasets used in NLP research from 2008 to 2021.

[Figure: training-corpus sizes in trillions of tokens. GPT-2: 0.0085, C4: 0.066, OPT: 0.17, The Pile: 0.24, ROOTS: 0.34, GPT-3: 0.5.]

Intelligence emerging with data scaling (4/26)

Today, less than two years later, we are training models smaller than BLOOM and GPT-3 on far more tokens. LLaMA-3 8B is trained on 15T tokens.

[Figure: training-token counts for recent LLMs, including GPT-3, GPT-NeoX, MT-NLG, Gopher, Chinchilla, LaMDA, RedPajama, Stability, and Llama 3.]

Large Models are running out of data (5/26)

There isn't enough data to train much more capable LLMs.

With limited data, how can we train large models data-efficiently? (6/26)

Scaling law (Hoffmann et al. 2022):

    L(N, D) = E + A/N^α + B/D^β    (1)

Parametric fit. We fit a parametric model of the loss and display contours (left) and isoFLOP slices (right). For each isoFLOP slice, we include a corresponding dashed line in the left plot. In the left plot, we show the efficient frontier in blue, which is a line in log-log space. Specifically, the curve goes through each iso-loss contour at the point with the fewest FLOPs. We project the optimal model size given the Gopher FLOP budget to be 40B parameters (Hoffmann et al. 2022).

With more data-efficient training, we can reach a better loss L with the same (N, D).

References:
Hoffmann, Jordan, Sebastian Borgeaud, Arthur Mensch, Elena Buchatskaya, Trevor Cai, Eliza Rutherford, Diego de Las Casas, et al. 2022. "Training Compute-Optimal Large Language Models." https://arxiv.org/abs/2203.15556
Villalobos, Pablo, Anson Ho, Jaime Sevilla, Tamay Besiroglu, Lennart Heim, and Marius Hobbhahn. 2024. "Will We Run Out of Data? Limits of LLM Scaling Based on Human-Generated Data." https://arxiv.org/
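The parametric scaling law above can be evaluated numerically. Below is a minimal sketch, assuming the fitted constants reported in Hoffmann et al. 2022 (E ≈ 1.69, A ≈ 406.4, B ≈ 410.7, α ≈ 0.34, β ≈ 0.28) and the common approximation C ≈ 6ND for total training FLOPs; the closed-form optimum comes from substituting D = C/(6N) into L and setting the derivative to zero, and the Gopher budget figure is approximate.

```python
# Sketch of the Chinchilla parametric loss and compute-optimal allocation.
# Constants are the fitted values reported in Hoffmann et al. 2022; treat
# them (and the 5.76e23 FLOP Gopher budget) as approximate assumptions.
E, A, B = 1.69, 406.4, 410.7
alpha, beta = 0.34, 0.28

def loss(N: float, D: float) -> float:
    """L(N, D) = E + A / N^alpha + B / D^beta.
    N: model parameters, D: training tokens."""
    return E + A / N**alpha + B / D**beta

def compute_optimal(C: float) -> tuple[float, float]:
    """Minimize L(N, D) subject to a FLOP budget C ~ 6 * N * D.
    Substituting D = C / (6 N) and solving dL/dN = 0 gives a closed form."""
    G = (alpha * A / (beta * B)) ** (1 / (alpha + beta))
    N_opt = G * (C / 6) ** (beta / (alpha + beta))
    return N_opt, C / (6 * N_opt)

# Roughly Gopher's training budget in FLOPs.
N_opt, D_opt = compute_optimal(5.76e23)
print(f"optimal N ~ {N_opt:.3g} params, D ~ {D_opt:.3g} tokens")
print(f"loss at optimum: {loss(N_opt, D_opt):.3f}")
```

The fit implies that at Gopher's budget the optimal model is tens of billions of parameters (the slide projects 40B), far smaller than Gopher's 280B: the better spend is on more tokens.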
