Saturday, April 13, 2024
iOS/Mac

Apple executives explain Apple Silicon & Neural Engine in new interview


Apple leaders talk about Apple Silicon



Apple executives recently talked about Apple Silicon in an interview, explaining the Neural Engine and the company’s chip design process.

Laura Metz, Director of Product Marketing, Anand Shimpi of Hardware Engineering, and Tuba Yalcin from the Pro Workflow team joined the interview to discuss the Apple Silicon design process and some of the newest Macs.

They also discussed the Neural Engine found in Apple Silicon. Introduced with the A11 Bionic chip inside the iPhone 8 and iPhone X, the Neural Engine is a computing component specialized for specific tasks, as opposed to the general-purpose computing that the CPU delivers.

The Neural Engine is optimized for machine learning and neural network tasks in areas like photography. For example, the 16-core Neural Engine in the M2 Pro and M2 Max chips can perform up to 15.8 trillion operations per second.
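On the software side, apps generally don't program the Neural Engine directly. A minimal sketch using Apple's Core ML framework shows how a developer opts in: the `computeUnits` setting lets Core ML schedule work across the CPU, GPU, and Neural Engine as it sees fit (the model type here is hypothetical; any compiled Core ML model would work the same way):

```swift
import CoreML

// Configure Core ML to use all available compute units,
// including the Neural Engine where supported.
let config = MLModelConfiguration()
config.computeUnits = .all  // alternatives: .cpuOnly, .cpuAndGPU

// "MyImageClassifier" is a hypothetical generated model class;
// Xcode produces one for each .mlmodel added to a project.
// let model = try MyImageClassifier(configuration: config)
```

The point of the design is that the framework, not the app, decides which layers of a model run on the Neural Engine versus the CPU or GPU.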

Next, they discuss transistors, the foundation of computer chip technology. As semiconductor processes become more advanced over time, manufacturers are able to add more transistors to chips during fabrication.

Moving to a smaller transistor technology is one of the ways Apple can deliver more features, more performance, better efficiency, and longer battery life. As transistors get smaller, manufacturers can fit more of them on a chip, which can translate into additional CPU and GPU cores.

Apple also has a dedicated Pro Workflow team with experts across music, video, photography, 3D visual effects, and more. The team puts together complex workflows designed to test the limits of Apple Silicon and gives feedback to the engineers.
