According to a report by Phoronix, Intel quietly confirmed the existence of a new ASIC, the Versatile Processing Unit (VPU), via a new Linux driver posted yesterday. This new unit is designed to accelerate AI inference for deep learning applications and will arrive in 14th Gen Meteor Lake processors.
For the uninitiated, AI inference refers to using a trained AI network to make predictions, and it is a key part of all modern AI workflows. With the tech industry so heavily focused on AI algorithms, it only makes sense for Intel to put these “AI cores” into its chips to meet consumer and developer demand. You can think of this new unit as Intel’s rough equivalent to Nvidia’s Tensor cores.
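If the distinction from training is unclear, the toy NumPy sketch below shows what inference amounts to: a forward pass through a network whose weights are already fixed. The weights here are made-up placeholder values purely for illustration and have nothing to do with Intel’s hardware.

```python
import numpy as np

# Weights for a tiny, already-trained two-layer network
# (made-up values standing in for whatever training produced).
W1 = np.array([[0.5, -0.2], [0.1, 0.8]])
b1 = np.array([0.0, 0.1])
W2 = np.array([0.7, -0.4])
b2 = 0.05

def infer(x: np.ndarray) -> float:
    """Forward pass only: no gradients, no weight updates."""
    hidden = np.maximum(0.0, x @ W1 + b1)  # ReLU activation
    return float(hidden @ W2 + b2)         # scalar prediction

# Inference: feed new data through the frozen network to get a prediction.
print(infer(np.array([1.0, 2.0])))
```

Hardware like the VPU exists to run exactly this kind of forward-pass math at far higher throughput and lower power than general-purpose CPU cores.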
This silent announcement comes six years after Intel acquired AI processing unit specialist Movidius. Movidius’ designs were revolutionary at the time; its VPUs delivered impressive performance per watt from a heterogeneous package that combined several additional processors dedicated to specific tasks. With this specialized architecture, Movidius’ silicon could pull off 4 TOPS in a 1.5W power envelope, which works out to roughly 2.7 TOPS per watt.
In layman’s terms, this chip had a level of power efficiency other companies in the industry could only dream of at the time — including Nvidia.
Without a doubt, this new unit was designed at least in part by the team behind Movidius. Unfortunately, we don’t know how powerful the Versatile Processing Unit will be in Intel’s Meteor Lake processors. However, after years of research and development since the acquisition, we expect Intel and Movidius’ new ASIC to perform very well.
Thanks to the driver patch notes, all we currently know is some of the chip’s internals. Intel’s VPU will include a memory management unit for translating VPU addresses to host DMA addresses and isolating user workloads, a RISC-based microcontroller, a Neural Compute Subsystem, and a Network on Chip.
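To make the MMU’s role concrete, here is a deliberately simplified, hypothetical sketch of the idea: a per-workload table that turns VPU-side addresses into host DMA addresses and refuses lookups from the wrong context. This is only an illustration of the concept, not code from Intel’s driver.

```python
# Toy model of what a VPU MMU does conceptually. Each user workload gets its
# own context, and translation only succeeds for pages that context has been
# granted, which is how workloads stay isolated from one another.
class ToyVpuMmu:
    def __init__(self):
        # context id -> {VPU address: host DMA address}
        self.contexts: dict[int, dict[int, int]] = {}

    def map_page(self, ctx: int, vpu_addr: int, host_dma_addr: int) -> None:
        self.contexts.setdefault(ctx, {})[vpu_addr] = host_dma_addr

    def translate(self, ctx: int, vpu_addr: int) -> int:
        mapping = self.contexts.get(ctx, {})
        if vpu_addr not in mapping:
            raise PermissionError(f"context {ctx} has no mapping for {vpu_addr:#x}")
        return mapping[vpu_addr]

mmu = ToyVpuMmu()
mmu.map_page(ctx=1, vpu_addr=0x1000, host_dma_addr=0x8000_0000)
print(hex(mmu.translate(1, 0x1000)))  # 0x80000000
# mmu.translate(2, 0x1000) would raise: workload 2 can't touch workload 1's memory.
```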
Meteor Lake is still two CPU generations away, so it will be some time before Intel brings this product to market. But apparently, this chip is intended specifically for client CPUs. That said, we don’t know whether server chips will get this VPU (or perhaps a beefed-up variant), but this news at least guarantees we’ll see the AI-focused unit on the consumer side of the market.