Apple’s TrueDepth iPhone Camera Tech is More Than Just a Biometrics Tool

When Apple debuted Animoji (the precursor to Memoji) with the iPhone X in 2017, it did so with the introduction of a new camera sensor – what Apple calls a TrueDepth camera. This sensor is what enables Apple’s Face ID system, which creates a 3D map of your face with terrific accuracy.

Today the technology can be leveraged by developers through ARKit 6, Apple’s development platform for augmented reality apps. It’s likely to play at least a small part in the upcoming Apple VR/AR headset, but despite Apple’s focus on the sensor, it hasn’t been widely adopted by Android competitors. In fact, neither Samsung nor Google have modern alternatives as powerful as Apple’s TrueDepth camera tech. Despite that limited adoption, it’s more than just a biometric tool for getting into your phone – here are the other cool things it’s used for.
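If you’re curious what that looks like in practice, here’s a minimal sketch of how a developer might spin up an ARKit face-tracking session on the TrueDepth camera. The FaceTracker class name is my own invention; the ARKit calls themselves are the real API.

```swift
import ARKit

// A minimal sketch of starting an ARKit face-tracking session on the TrueDepth
// camera. The FaceTracker class name is illustrative; the ARKit calls are real.
final class FaceTracker: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        // Face tracking is only supported on devices with a TrueDepth camera.
        guard ARFaceTrackingConfiguration.isSupported else {
            print("No TrueDepth camera on this device")
            return
        }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    // Called whenever the sensor updates its 3D map of the face.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // faceAnchor.transform holds the face's position and orientation;
            // faceAnchor.geometry is the fitted 3D mesh of the face.
            print("Face tracked at", faceAnchor.transform.columns.3)
        }
    }
}
```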

What is a TrueDepth camera?

A TrueDepth camera projects more than 30,000 invisible infrared dots onto your face, like a mesh net draped over your features. It’s a proprietary technology that Apple owns and, as we briefly touched on already, it’s the only technology of its type on a smartphone, excluding a handful of devices from around the time Apple first announced it. Competitors tend to rely on the standard selfie camera rather than a true depth sensor, which usually only captures a flat 2D image.

Apple’s version takes the features that make up your face and digitises them in 3D.
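That raw depth data is also exposed to developers through AVFoundation. Here’s a rough sketch, assuming a class name of my own (DepthCapture), of how an app might pull per-pixel depth frames from the TrueDepth camera:

```swift
import AVFoundation

// A rough sketch of pulling raw depth frames from the TrueDepth camera via
// AVFoundation. DepthCapture is a placeholder name; the framework calls are real.
// (A real app also needs an NSCameraUsageDescription entry in Info.plist.)
final class DepthCapture: NSObject, AVCaptureDepthDataOutputDelegate {
    private let session = AVCaptureSession()
    private let depthOutput = AVCaptureDepthDataOutput()

    func start() throws {
        // The front-facing TrueDepth camera is exposed as its own device type.
        guard let device = AVCaptureDevice.default(.builtInTrueDepthCamera,
                                                   for: .video,
                                                   position: .front) else {
            print("This device has no TrueDepth camera")
            return
        }
        let input = try AVCaptureDeviceInput(device: device)
        guard session.canAddInput(input), session.canAddOutput(depthOutput) else { return }
        session.addInput(input)
        session.addOutput(depthOutput)
        depthOutput.setDelegate(self, callbackQueue: DispatchQueue(label: "depth.queue"))
        session.startRunning()
    }

    // Each frame carries a per-pixel depth map rather than a flat 2D image.
    func depthDataOutput(_ output: AVCaptureDepthDataOutput,
                         didOutput depthData: AVDepthData,
                         timestamp: CMTime,
                         connection: AVCaptureConnection) {
        let map = depthData.depthDataMap // a CVPixelBuffer of depth values
        print("Depth frame:", CVPixelBufferGetWidth(map), "x", CVPixelBufferGetHeight(map))
    }
}
```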

It’s also used to animate Memoji, translating your facial movements onto the character – you know, those fun personalised emojis that iPhone users get access to (which, let’s be honest, are mostly pointless). It can also be used in third-party apps developed with the TrueDepth sensor in mind (such as some filters on Messenger video calls and some pretty fun games).
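Under the hood, those animations are driven by ARKit’s blend shapes: roughly 50 named coefficients describing how open your jaw is, how closed each eye is, and so on. A hedged sketch of how an app might read them each frame (applyToAvatar is a hypothetical stand-in for your own animation code):

```swift
import ARKit

// A sketch of reading ARKit's blend shapes each frame and driving a character
// rig with them. applyToAvatar is a hypothetical stand-in for the app's own code.
func handle(faceAnchor: ARFaceAnchor) {
    let shapes = faceAnchor.blendShapes // named coefficients, each 0.0–1.0

    let jawOpen   = shapes[.jawOpen]?.floatValue ?? 0
    let blinkLeft = shapes[.eyeBlinkLeft]?.floatValue ?? 0
    let smile     = shapes[.mouthSmileLeft]?.floatValue ?? 0

    // Map the coefficients onto whatever model you're animating.
    applyToAvatar(jawOpen: jawOpen, blinkLeft: blinkLeft, smile: smile)
}

// Placeholder for the app's own rig/animation code.
func applyToAvatar(jawOpen: Float, blinkLeft: Float, smile: Float) {}
```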

Beyond Face ID and third-party apps, TrueDepth’s most useful application is possibly vtubing, a subgenre of live streaming where the streamer appears as an animated model instead of on a webcam. Vtubers use TrueDepth facial tracking through apps on an iPhone to capture their facial expressions, which are mapped onto the model shown on the live stream.

While Android phones and webcams can capture and process similar facial data, the vtubing community widely considers iPhones with the TrueDepth sensor to be more responsive to facial movements than a webcam or Android phone. GirlDM, a popular vtuber, gives a great live demonstration of the differences below.

As is the case with much of Apple’s proprietary tech, exactly how it works is shrouded in secrecy, but the TL;DR is: a TrueDepth camera captures a 3D map of your face, providing the data behind Face ID and powering augmented reality applications.

Which iPhones have a TrueDepth camera?

Apple has included TrueDepth technology on every iPhone it has released since the iPhone X in 2017 (excluding the 2020 iPhone SE and the 2022 iPhone SE). This includes models in the iPhone 11, 12, 13, and 14 ranges. Additionally, all iPad Pro models since 2018 have shipped with a TrueDepth camera and Face ID support.
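For developers, it’s worth noting that apps generally don’t hard-code that device list. A simple runtime check against ARKit, sketched below, tells you whether the face-tracking hardware is present:

```swift
import ARKit

// Ask ARKit at runtime whether the face-tracking hardware
// (i.e. a TrueDepth camera) is present on this device.
if ARFaceTrackingConfiguration.isSupported {
    print("TrueDepth face tracking is available on this device")
} else {
    print("No TrueDepth camera – fall back to the regular selfie camera")
}
```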

Anyway, to be honest, I just wanted to put this in writing, as I watch a lot of vtubers and have been thinking about the tech quite a bit. It’s really cool, but sorry to my iPhone-loving editor Asha – it’s not enough to bring me back to Apple’s walled garden.