Qualcomm’s new Cloud AI 100 chips push tera operations per second (TOPS) performance to greater than 50 for the DM.2e, about 200 TOPS for the DM.2, and 400 TOPS for the PCIe chip.
Image: Qualcomm

Qualcomm has a new set of AI accelerator chips and a developer kit to stake a bigger claim in the fast-growing edge computing space.

John Kehrli, senior director of product management, said that Qualcomm sees a significant need in the AI computation market for low power consumption, 5G connectivity, and low latency.

“We see explosive growth in edge computing and we are focusing on inference,” he said in a press briefing about the new products.

There are three versions of the new chip: PCIe, DM.2, and DM.2e. Qualcomm is sampling the new chip to multiple customers now and plans to start shipping in the first half of 2021. Qualcomm sees these uses for the AI accelerator:

  1. Data centers and cloud edge
  2. Standalone 5G edge appliance
  3. Advanced driver assistance systems at level 3 and above
  4. 5G infrastructure


Kehrli said Qualcomm’s new Cloud AI 100 chip is designed for high and medium tera operations per second (TOPS). On the AI performance continuum, high TOPS is 50 or more, medium is five to 10, and low is less than five. The design of the Cloud AI 100 chips pushes the TOPS performance to greater than 50 for the DM.2e, about 200 TOPS for the DM.2, and 400 TOPS for the PCIe chip.

Kehrli also said Qualcomm sees an architectural shift in AI cloud computing with infrastructure using purpose-built accelerators gaining a 10x improvement in speed and power over more traditional designs.

Kehrli shared a chart that compared the performance of Qualcomm’s new Cloud AI 100 chips to other suppliers using the ResNet-50 benchmark. Qualcomm’s Cloud AI 100 PCIe chip performed as well as Nvidia’s A100 while using less energy. The Cloud AI 100 DM.2 and DM.2e chips were comparable to Intel and Nvidia chips while using less than 50 watts of power.

“In a world where total cost of ownership really matters, we think this is very compelling,” Kehrli said.

The new chips are designed to do image classification, object detection and tracking, machine translation, and product recommendations for online shopping platforms.

Kehrli said he expects the first commercial deployments of the new chip to be in edge computing rather than data centers, including smart city installations, retail stores, and manufacturing.

Keith Kressin, senior vice president and general manager for computing and edge cloud, said that he expects to spend time optimizing certain performance factors for the chip based on customer needs.

The hardware architecture of the chips includes:

  • Up to 400 TOPS
  • Power ranges from 15W for the DM.2e to 25W for the DM.2 and 75W for the PCIe/HHHL
  • Up to 16 AI cores
  • Precision: INT8, INT16, FP16, FP32
  • Up to 144 MB on-die SRAM
  • 4×64 LPDDR4x (2.1GHz) with inline ECC
  • Up to eight lanes of PCIe Gen 3/4
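As a quick illustration of the total-cost-of-ownership argument, the peak TOPS and power figures above work out to the following performance per watt. This is a back-of-the-envelope sketch using the listed peak numbers, not measured benchmark results:

```python
# Performance-per-watt from Qualcomm's stated peak specs.
# These are illustrative calculations, not measured benchmarks;
# ">50 TOPS" for the DM.2e is treated as 50 here.
variants = {
    "DM.2e": {"tops": 50,  "watts": 15},
    "DM.2":  {"tops": 200, "watts": 25},
    "PCIe":  {"tops": 400, "watts": 75},
}

for name, spec in variants.items():
    efficiency = spec["tops"] / spec["watts"]
    print(f"{name}: ~{efficiency:.1f} TOPS/W")
```

By this rough measure the DM.2 comes out ahead at about 8 TOPS/W, which is consistent with Qualcomm positioning the mid-range module for power-constrained edge appliances.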

Qualcomm is also releasing a development kit to go along with the new chips. The company is calling the kit a one-stop shop that includes the Cloud AI 100 chip, the Snapdragon 865 Mobile Platform Module, and the Snapdragon X55 Modem-RF system. The kit includes two SDKs:

  • AIC apps SDK: compiler, simulator, and sample code
  • AIC platform SDK: runtime, APIs, kernel drivers, and tools

Kehrli said the development kit is a greenfield opportunity for Qualcomm and a tool for customers to get started quickly in this emerging space. The kit comes with a precompiled app for demo purposes or customers can compile their own app and run it immediately.

Kehrli said that the pre-certified 5G module will make life easier for customers who don’t have a lot of experience in working with telcos.

Qualcomm will provide the development kit to customers in October 2020.