Keynote: Running LLMs in the Cloud (Miley Fu, Developer Advocate, Second State)

Running LLMs in the Cloud
Miley Fu, WasmEdge
GitHub/Twitter: mileyfu
https:/

Calling in Open Source Technology

Embed LLM into your container app
https:/

docker run --rm -p 8080:8080 --name api-server secondstate/llama-3-8b-nomic-1.5:latest --ctx-size 4096

Video demo here

Key features
- Tightly coupled LLM and application
- Matches prompts, quantization, and runtime with the exact version of the LLM
- The container app always works regardless of LLM upgrade cycles
- Lightweight: only 5GB, as opposed to a 10GB PyTorch app
- Portable: the same binary app inside the container works on multiple CPUs and GPUs; develop on a Mac and deploy on Nvidia
- Easy to embed into Rust/JS/Python apps
- Works with existing container tools, such as K8s
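The api-server container above exposes port 8080, and LlamaEdge's API server speaks an OpenAI-compatible HTTP API. As a minimal client sketch (assuming the standard /v1/chat/completions route; the model name "llama-3-8b" is a placeholder, and the actual served name can be listed via /v1/models), a chat request from a Python app might look like this:

import requests

# Assumes the LlamaEdge api-server container from the slide is running locally, e.g. via:
#   docker run --rm -p 8080:8080 --name api-server secondstate/llama-3-8b-nomic-1.5:latest ...
BASE_URL = "http://localhost:8080/v1"

resp = requests.post(
    f"{BASE_URL}/chat/completions",
    json={
        "model": "llama-3-8b",  # placeholder; query GET /v1/models for the served name
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "What is WasmEdge?"},
        ],
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])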

LlamaEdge

Real-world use cases
1. Personal LLMs: Gaia Network, where users run personal LLMs with an embedded knowledge base.
2. AI OS: Open Interpreter as a local LLM provider; 51.5k stars.
3. Finance: financial analytics bot.
4. Hardware: robot voice control.
5. Education: UC Berkeley TA.
6. Game: the open-source game engine Cocos AI uses Wasm to run AI models that enhance gameplay experiences enabled by NPCs.
Use a Gaia Net image
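The personal-LLM and knowledge-base use cases rely on text embeddings, and the container image from the earlier slide bundles a nomic-1.5 embedding model alongside Llama 3 8B. Assuming the same api-server also exposes the OpenAI-style /v1/embeddings route (and using "nomic-embed-text-v1.5" as a placeholder model name), embedding a knowledge-base snippet could look like this sketch:

import requests

BASE_URL = "http://localhost:8080/v1"

# Embed a knowledge-base snippet so it can be stored in a vector store and
# retrieved later to ground the chat model's answers.
resp = requests.post(
    f"{BASE_URL}/embeddings",
    json={
        "model": "nomic-embed-text-v1.5",  # placeholder; check GET /v1/models for the served name
        "input": ["WasmEdge is a lightweight WebAssembly runtime for cloud-native applications."],
    },
    timeout=60,
)
resp.raise_for_status()
vector = resp.json()["data"][0]["embedding"]
print(f"{len(vector)}-dimensional embedding")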

LlamaEdge: easy LLM deployment + inference
- Single cross-platform binary (automagically takes advantage of local hardware accelerators).
- Compile and test apps on one machine (e.g., a Mac) and deploy them to another cloud server (e.g., Nvidia CUDA 12).
- The app can be moved around and deployed to new hardware by K8s.
- Package the Wasm app into a Docker image as an embedded AI/LLM service. Only 1/100 the size of a Python runtime.
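Because K8s can reschedule the same container onto new nodes, it helps to have a quick readiness check for the LLM service. A small sketch, assuming the server exposes the OpenAI-style /v1/models listing:

import requests

BASE_URL = "http://localhost:8080/v1"

def service_ready(base_url: str = BASE_URL) -> bool:
    # Returns True once the containerized LLM service answers with a non-empty model list.
    try:
        resp = requests.get(f"{base_url}/models", timeout=5)
        resp.raise_for_status()
        models = [m["id"] for m in resp.json().get("data", [])]
        print("Serving models:", models)
        return bool(models)
    except requests.RequestException:
        return False

if __name__ == "__main__":
    print("ready" if service_ready() else "not ready")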

WasmEdge
https:/
Living knowledge server
https:/

AI Ecosystem
CNCF + LF AI & Data Survey: how your company uses GenAI tools
- Different use cases
- Modalities/models in use
- Challenges to adoption
- The role of open source in adoption decisions

Calling for contributors
YouTube
Stay in Touch
