Glenn Ge, Ph.D., MBA
Co-founder & CEO, TetraMem Inc., USA
Analog In-Memory Computing with Multilevel Memristive Devices for High Performance Computing

The AI and AI Chip Market
AI applications will add $30 trillion to the global equity market capitalization during the next two decades (Source: ARK Big Ideas 2022), and the AI chip market will reach $150 billion by 2030 with a 30% CAGR (Source: Global X 2023).

OpenAI's Sora Ignites Increased Computing Demand
One minute of OpenAI's Sora video may take over an hour to generate. Without new, more efficient computing, roughly 18,460,000 A100 GPUs would be needed to generate the volume of video watched daily on TikTok.*
*Source: video image from OpenAI; analysis from Minsheng Securities, 2024
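The 18,460,000-A100 figure is Minsheng Securities' estimate and depends on their assumptions, which the slide does not spell out. A minimal sketch of how this kind of estimate is formed is below; only the "over an hour per minute of video" figure comes from the slide, and the daily video volume is an illustrative placeholder, not the analysts' number.

```python
# Back-of-the-envelope GPU-count estimate in the style of the slide's claim.
# The generation-time figure comes from the slide; the daily video volume
# below is a HYPOTHETICAL placeholder chosen only for illustration.

a100_hours_per_video_minute = 1.0    # slide: 1 minute of Sora video takes over an hour to generate
daily_video_minutes = 4.4e8          # hypothetical daily minutes of video to generate
hours_per_day = 24.0                 # each A100 running around the clock

a100s_needed = daily_video_minutes * a100_hours_per_video_minute / hours_per_day
print(f"A100s needed: {a100s_needed:,.0f}")  # ~18 million with these placeholder inputs
```

Whatever the exact inputs, the required GPU count scales linearly with the generation time per minute of video, which is why more efficient compute is the lever the rest of the talk focuses on.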
Customer Pain-Points for Energy Efficiency
AMD's plenary talk by Dr. Lisa Su, ISSCC 2023

In-Memory Computing (IMC) Solution for AI Computing
[Figure: traditional von Neumann architecture (ALUs, L1/L2/L3 cache memory, and DRAM memory, with data movement between them) versus an IMC architecture built from an array of IMC units]
Data is processed in the same physical location where it is stored, with minimal intermediate data movement and storage = low power consumption
Massively parallel computation across a crossbar array architecture with device-level-grain cores = high throughput
Computing by physical laws (Ohm's law and Kirchhoff's current law) = low latency (see the sketch below)
Superior architecture, but the right device is the key.
Tang et al., 2019 Symposium on VLSI Circuits
Note: nvCIM = nonvolatile compute-in-memory; PE = processing element
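The low-latency claim rests on the crossbar physics: each memristive device's conductance G encodes a weight, a read voltage V drives each row, Ohm's law gives a per-device current I = G·V, and Kirchhoff's current law sums those currents along each column, so an entire matrix-vector multiply happens in one analog step. A minimal numerical sketch of this idealized behavior is below, ignoring non-idealities such as wire resistance, device noise, and DAC/ADC quantization; the array size and values are arbitrary.

```python
import numpy as np

# Idealized memristive crossbar: conductances G (siemens) at the cross-points
# encode the weight matrix; read voltages V drive the rows.
rng = np.random.default_rng(0)
rows, cols = 4, 3
G = rng.uniform(1e-6, 1e-4, size=(rows, cols))  # multilevel conductance states
V = rng.uniform(0.0, 0.2, size=rows)            # row read voltages (volts)

# Ohm's law per device:               I_ij = G_ij * V_i
# Kirchhoff's current law per column: I_j  = sum_i G_ij * V_i
column_currents = V @ G  # the analog array produces this MVM result in a single step

print("Column currents (A):", column_currents)
```

In a real tile these column currents are then digitized by ADCs; the point is that the multiply-accumulate work is done by the physics of the array rather than by ALUs shuttling data through a memory hierarchy.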
SRAM: very fast and low energy (pJ level) for data movement, but very limited capacity (a few KB to a few hundred MB on chip)
DRAM: large capacity (GB), but high energy (1000 pJ level) and slow speed (ns)
Computing Memory: Memory Device with Special Attributes