Snapdragon and Qualcomm branded products are products of Qualcomm Technologies, Inc. and/or its subsidiaries.

On-device AI and its thermal implications
Nader Nikfar, Sr. Dir. of Technology, Qualcomm Technologies, Inc.
Hot Chips, Aug. 2024

Agenda
1. What is on-device AI?
2. Benefits & importance
3. Use-case evolution
4. Thermal implications
5. Potential solutions
6. Summary

What is on-device AI?

AI is transforming: devices, machines, and things are becoming more intelligent.

Intelligence is moving towards edge devices
- On-device AI runs machine learning on devices such as smartphones, laptops, and cars, instead of in the cloud.
- On-device AI utilizes on-chip processors such as the NPU, CPU, and GPU; the SoC consists of several processors integrated on the same die.
- Current on-device AI can support inference on multiple platforms.
- The cloud is currently necessary for pooling big data and training AI models, and it complements on-device processing.
- On-device and AI go hand in hand.
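The slides do not include code, but as a minimal sketch of what "inference on multiple platforms" can look like in practice, the Python snippet below loads a model with ONNX Runtime and requests Qualcomm's QNN execution provider (NPU) first, falling back to the CPU provider. The model path, input shape, and availability of the QNN provider in the installed build are assumptions for illustration, not details from the talk.

```python
# Minimal sketch of on-device inference. Assumes a local ONNX model
# ("model.onnx") and an ONNX Runtime build that includes Qualcomm's QNN
# execution provider; names and shapes here are illustrative only.
import numpy as np
import onnxruntime as ort

# Prefer the NPU-backed provider when available, fall back to CPU.
session = ort.InferenceSession(
    "model.onnx",
    providers=["QNNExecutionProvider", "CPUExecutionProvider"],
)
print("Active providers:", session.get_providers())

# Run a single inference entirely on the device: no network round trip.
input_name = session.get_inputs()[0].name
dummy_input = np.random.rand(1, 3, 224, 224).astype(np.float32)
outputs = session.run(None, {input_name: dummy_input})
print("Output shape:", outputs[0].shape)
```

If the QNN provider is not present in the installed build, ONNX Runtime falls back to the next provider in the list, so the same script can also run purely on the CPU or be retargeted to a GPU provider.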
AI apps enabled by on-device generative AI

Benefits & importance

Advantages of on-device inference
- Automotive: generative AI can be used for ADAS/AD to help improve driving policy by predicting the trajectory and behavior of various agents.
- Inference running entirely in the cloud is problematic for real-time applications that are latency-sensitive and mission-critical, such as autonomous driving. Such applications cannot afford the round-trip time, nor can they rely on critical functions operating under variable wireless coverage.
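To put the round-trip argument in perspective, here is an assumed back-of-envelope calculation (not a figure from the presentation) of how far a vehicle travels while waiting for an inference result; the speed and latency numbers are illustrative assumptions.

```python
# Back-of-envelope sketch: distance a vehicle covers while waiting for an
# inference result. The latency figures below are illustrative assumptions,
# not measurements from the presentation.
def distance_traveled_m(speed_kmh: float, latency_ms: float) -> float:
    """Distance covered (meters) at a given speed during a given latency."""
    return (speed_kmh / 3.6) * (latency_ms / 1000.0)

speed = 100.0  # km/h, highway driving
for label, latency in [("on-device (~20 ms)", 20.0),
                       ("cloud round trip (~150 ms)", 150.0),
                       ("cloud, degraded coverage (~500 ms)", 500.0)]:
    print(f"{label}: {distance_traveled_m(speed, latency):.1f} m traveled")
```

Even an optimistic cloud round trip covers several car lengths at highway speed, which is the gap that keeping inference on the device avoids.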
Use-case evolution

         | 2015          | 2016-2022                | 2023                                                                 | 2023+
Models   | Simple CNN    | Transformer/LSTM/RNN/CNN | 10B LLMs/LVMs/LMMs                                                   | 10B+ LLMs/LVMs/LMMs
Use case | Audio/Speech  | Audio/Speech, Camera     | Video, LLM-powered assistants, Stable Diffusion/ControlNet          | Multi-modal genAI models
Hardware | Scalar, Vector | Scalar, Vector, Tensor  | Scalar, Vector, Tensor, Micro Tile Inferencing, Transformer Support | Transformer Support, Multi-Modal AI
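For a rough sense of the scale implied by the 10B+ parameter models in the last column, the sketch below estimates weight-only memory footprints at a few precisions; the parameter counts and bit widths are generic assumptions, not numbers from the slides.

```python
# Rough sketch of weight-memory footprint for on-device LLMs at different
# precisions. Model sizes and bits-per-parameter are generic assumptions;
# activations and KV cache are ignored.
GIB = 1024 ** 3

def weight_footprint_gib(num_params: float, bits_per_param: int) -> float:
    """Approximate weight storage in GiB."""
    return num_params * bits_per_param / 8 / GIB

for params in (1e9, 7e9, 10e9):
    for bits in (16, 8, 4):
        size = weight_footprint_gib(params, bits)
        print(f"{params / 1e9:>4.0f}B params @ {bits:>2}-bit weights: ~{size:4.1f} GiB")
```

Even with 4-bit weights, a 10B-parameter model occupies several GiB on the device, which hints at why sustained on-device genAI workloads lead to the thermal implications the deck turns to next.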