Surface and AI
NPUs are here to help you take AI innovations to the next level. Check out this solution brief to learn more about how NPUs work, and to discover how Surface devices leverage them to power AI-intensive workloads.
Difference Between CPU, GPU, and NPU
The CPU, or Central Processing Unit, is the general-purpose brain of the computer: it executes software instructions and handles a wide range of tasks, but it is not optimized for any single workload. The GPU, or Graphics Processing Unit, specializes in rendering 2D and 3D graphics and excels at processing data in parallel; it can accelerate AI workloads, but it is less power-efficient for sustained on-device inference than a dedicated accelerator. The NPU, or Neural Processing Unit, is integrated into the system on a chip (SoC) and is purpose-built for deep learning tasks, enabling faster inference with an architecture that reduces the need for memory access.
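As a rough illustration (not part of this brief), the short Python sketch below uses ONNX Runtime to list the execution providers available on a machine, which is one practical way an application can see whether a CPU, GPU, or NPU backend is available for inference. The provider names in the mapping are common examples only; what actually appears depends on the installed packages, drivers, and hardware.

```python
import onnxruntime as ort

# List the execution providers this installation of ONNX Runtime can use.
providers = ort.get_available_providers()
print("Available execution providers:", providers)

# Illustrative mapping from provider name to processor type (assumption:
# these backends may or may not be present on a given device).
processor_for_provider = {
    "CPUExecutionProvider": "CPU - general-purpose, runs any model",
    "DmlExecutionProvider": "GPU - parallel compute via DirectML",
    "QNNExecutionProvider": "NPU - dedicated neural accelerator",
}
for provider in providers:
    print(f"{provider}: {processor_for_provider.get(provider, 'other accelerator')}")
```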
Optimizing CPU Workload with NPU
By offloading AI-related tasks such as automatic framing, portrait blur, and voice focus to the NPU, these features can run concurrently while the CPU remains free to handle other work, such as running PowerPoint presentations and Outlook. This enables more efficient multitasking and improved performance without overloading the CPU.
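For developers, a minimal sketch of what this kind of offloading can look like in code: the snippet below asks ONNX Runtime to run a model on an NPU-backed execution provider when one is available and to fall back to the CPU otherwise. The model file name, input shape, and the choice of QNNExecutionProvider are assumptions for illustration; the provider that actually targets the NPU depends on the device and the installed runtime.

```python
import numpy as np
import onnxruntime as ort

# Prefer an NPU-backed provider when present; fall back to the CPU otherwise.
# "model.onnx" and the 1x3x224x224 input are placeholders, not a real workload.
session = ort.InferenceSession(
    "model.onnx",
    providers=["QNNExecutionProvider", "CPUExecutionProvider"],
)

input_name = session.get_inputs()[0].name
dummy_input = np.random.rand(1, 3, 224, 224).astype(np.float32)

# Run one inference; the heavy AI work stays off the CPU when the NPU provider is used.
outputs = session.run(None, {input_name: dummy_input})
print("Executed with providers:", session.get_providers())
```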
Operations Per Second of the NPU
The NPU in the Surface Pro 9 with 5G can process up to 15 trillion operations per second (15 TOPS). This level of performance enables advanced AI features while keeping power consumption low and improving overall device efficiency.
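To put 15 TOPS in perspective, the back-of-envelope calculation below divides the NPU's peak rate by a hypothetical per-inference operation count. The 5-billion-operation figure is an assumption, not a Surface specification, and real-world throughput is always lower than the theoretical peak because of memory bandwidth, precision, and scheduling overhead.

```python
# Back-of-envelope only: the per-inference figure is hypothetical,
# and peak TOPS is never fully reached in practice.
npu_peak_ops_per_second = 15e12   # 15 TOPS, per the Surface Pro 9 with 5G
ops_per_inference = 5e9           # assumed: a network needing ~5 billion operations

theoretical_inferences_per_second = npu_peak_ops_per_second / ops_per_inference
print(f"Theoretical peak: {theoretical_inferences_per_second:,.0f} inferences per second")
```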