Forget the Super Retina Display. The iPhone X’s headline feature is mobile machine learning

by Basil Shikin on Sep 12, 2017

Today, Apple announced the hotly-anticipated iPhone X on the 10th anniversary of the device that changed smartphones forever. The iPhone X features a 5.8-inch edge-to-edge OLED display, which Apple calls Super Retina Display. This display means the iPhone X won’t rely on Touch ID anymore (there isn’t room for a home button), and will instead use facial recognition to unlock. While it’s easy to say that the iPhone X display is its headline feature, it’s actually the prevalence of machine learning throughout the iPhone experience that’s the real star.

Face ID, a secure facial unlock

One of the biggest ways machine learning is put to use on the iPhone X is Face ID, Apple’s all-new facial unlock technology. According to Apple, the iPhone X’s TrueDepth camera will map your face with over 30,000 infrared dots and learn over time how you look, regardless of whether you’re wearing glasses, you’ve grown a beard, or you’ve put on a hat. Apple achieves this by tapping multiple neural networks to build a mathematical model of your face, and the company claims Face ID is 20 times more secure than Touch ID.

Image credit: Apple

Face ID is a big deal, as competitors have struggled to make a truly secure facial unlock. Samsung has its own facial unlock feature, but it’s easily circumvented by a static picture. Microsoft’s Windows Hello, which relies on two cameras, fares better, offering biometric security that even identical twins can’t defeat, a claim Apple hasn’t made for Face ID. To bolster security, Apple keeps facial recognition data in a “secure enclave,” which means it can only be decrypted on the iPhone itself. It’ll be interesting to see whether Face ID lives up to its security claims.
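
Apps don’t get access to the face data itself, but they can piggyback on Face ID through Apple’s LocalAuthentication framework, the same API that already handles Touch ID. The Swift sketch below shows the general shape of gating a feature behind a biometric unlock; the function name and prompt string are ours, chosen for illustration.

```swift
import LocalAuthentication

// A minimal sketch of gating a feature behind biometric unlock.
// LAContext abstracts the biometric hardware, so the same code path
// covers both Touch ID and Face ID devices.
func unlockWithBiometrics(completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var error: NSError?

    // Check that some form of biometric authentication is available.
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &error) else {
        completion(false)
        return
    }

    // Prompt the user; on an iPhone X this triggers the Face ID scan.
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your account") { success, _ in
        DispatchQueue.main.async {
            completion(success)
        }
    }
}
```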

Custom A11 Bionic processor

To power Face ID, Apple created the A11 Bionic chip with an onboard Neural Engine that can perform up to 600 billion operations per second. This custom hardware lets Apple distribute machine learning work across the CPU and GPU to optimize performance. Apple isn’t alone in this approach; ARM is designing its next generation of processors with machine learning workloads in mind.
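
For developers, the practical route to that hardware is Core ML, the on-device machine learning framework Apple shipped with iOS 11; Core ML, not the app, decides how to split inference between the CPU and GPU. Here’s a rough Swift sketch of running a bundled model through the Vision framework. “FlowerClassifier” is a hypothetical stand-in for any .mlmodel you might bundle, not an Apple-provided model.

```swift
import CoreML
import Vision
import UIKit

// A rough sketch of on-device inference with Core ML and Vision.
// "FlowerClassifier" is a hypothetical class that Xcode would generate
// from a bundled .mlmodel file; Core ML decides whether to run it on
// the CPU or GPU.
func classify(image: UIImage, completion: @escaping (String?) -> Void) {
    guard let cgImage = image.cgImage,
          let visionModel = try? VNCoreMLModel(for: FlowerClassifier().model) else {
        completion(nil)
        return
    }

    // Vision scales and crops the image to the model's expected input size.
    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        let best = (request.results as? [VNClassificationObservation])?.first
        completion(best?.identifier)   // e.g. "tulip"
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```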

iPhone X cameras

Image credit: Apple

It’s this A11 Bionic chip that powers Face ID and the image recognition inside the camera app. Beyond taking portraits with simulated bokeh and studio lighting effects, the iPhone X’s cameras are also tuned for AR and perform real-time image and motion analysis. That means the iPhone X can understand a scene and use machine learning to optimize photos of your subject.
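
Third-party apps can do their own real-time analysis with the Vision framework, which accepts camera frames directly. The sketch below is one plausible wiring, assuming an AVCaptureSession is configured elsewhere to deliver frames to this delegate; it simply counts faces per frame as a stand-in for whatever analysis an app actually needs.

```swift
import AVFoundation
import Vision

// A minimal sketch of real-time frame analysis: each frame from the
// camera is handed to the Vision framework, which can detect faces,
// rectangles, text, and more on-device. The AVCaptureSession that
// feeds this delegate is assumed to be set up elsewhere.
final class FrameAnalyzer: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let faceRequest = VNDetectFaceRectanglesRequest { request, _ in
        let faces = request.results as? [VNFaceObservation] ?? []
        print("Detected \(faces.count) face(s) in this frame")
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
        try? handler.perform([faceRequest])
    }
}
```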

While Animojis are a silly way to communicate with friends, the technology that powers them is impressive. Animojis use the TrueDepth camera to analyze more than 50 different facial muscle movements, which are translated onto a dozen different animated emoji, including a chicken, a unicorn, and a robot.
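
The same facial tracking is exposed to developers through ARKit: on the iPhone X, a face-tracking session reports “blend shape” coefficients, one value per facial movement, which an app can map onto a 3D character much as Animoji does. The sketch below just prints a single coefficient; apart from the ARKit APIs, the class and method names are ours.

```swift
import ARKit

// A minimal sketch of the face tracking that drives Animoji-style
// effects. On an iPhone X, ARKit's face configuration uses the
// TrueDepth camera and reports "blend shapes": coefficients from
// 0 to 1 for individual facial movements.
final class FaceTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        // e.g. drive a character's jaw from how far the user's mouth is open.
        let jawOpen = face.blendShapes[.jawOpen]?.floatValue ?? 0
        print("Jaw open: \(jawOpen)")
    }
}
```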

The iPhone X is an undeniably impressive smartphone, but what it’s packing under the hood in regard to machine learning is what makes it an industry leader. By building a custom processor around machine learning, Apple shows that it believes in a future where machine learning improves every little aspect of our lives, from taking better photos to frictionless biometric security. For all the hype that machine learning and AI get, the iPhone X is the clearest example to date of how these technologies are solving real-world problems today, not in the distant future.

Basil Shikin is AppLovin’s VP of Engineering.

We’re hiring! Apply here.