Body and hand pose detection in iOS 14 will provide new ways to interact with your iPhone without touching the screen


Starting with iOS 14 and macOS Big Sur, developers will be able to detect human body and hand poses in photos and videos in their applications using Apple’s updated Vision framework, as explained in this WWDC 2020 session.
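For developers, the capability surfaces as new request types in the Vision framework. The sketch below shows roughly how body and hand pose requests could be run against a still image; it assumes you already have a `CGImage` (for example, from a photo library asset).

```swift
import Vision

// Sketch: run body- and hand-pose requests on a still image.
// Assumes `image` is a CGImage obtained elsewhere (e.g. from a photo).
func detectPoses(in image: CGImage) throws {
    let bodyRequest = VNDetectHumanBodyPoseRequest()
    let handRequest = VNDetectHumanHandPoseRequest()

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([bodyRequest, handRequest])

    // Each observation exposes normalized joint locations (origin at lower left).
    let bodies = bodyRequest.results ?? []
    let hands = handRequest.results ?? []
    print("Found \(bodies.count) body pose(s) and \(hands.count) hand pose(s)")
}
```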


This functionality will allow applications to analyze people’s poses, movements, and gestures, enabling a wide variety of potential features. Apple provides some examples, including an exercise app that could automatically track a user’s workout, a safety training app that could help employees use correct ergonomics, and a media-editing app that could find photos or videos with similar poses.

Hand pose detection, in particular, promises to offer a new way of interacting with applications. Apple’s demo showed a person holding their thumb and forefinger together and then being able to draw in an iPhone app without touching the screen.
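As a rough illustration of how such a pinch gesture might be recognized, the sketch below compares the thumb-tip and index-tip joints from a hand pose observation. The 0.05 distance threshold and 0.3 confidence cutoff are illustrative assumptions, not values from Apple.

```swift
import Foundation
import Vision

// Sketch: decide whether a detected hand is "pinching" (thumb and forefinger together).
// The distance threshold and confidence cutoff are illustrative guesses.
func isPinching(_ hand: VNHumanHandPoseObservation) -> Bool {
    guard
        let thumbTip = try? hand.recognizedPoint(.thumbTip),
        let indexTip = try? hand.recognizedPoint(.indexTip),
        thumbTip.confidence > 0.3, indexTip.confidence > 0.3
    else { return false }

    // Joint locations are normalized to the image (0...1 on each axis).
    let distance = hypot(thumbTip.location.x - indexTip.location.x,
                         thumbTip.location.y - indexTip.location.y)
    return distance < 0.05
}
```

An app could run this check on each camera frame and treat the moment the pinch begins as the equivalent of a touch-down event for drawing.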


Additionally, apps could use the framework to overlay emoji or graphics on a user’s hands that mirror a specific gesture, such as a peace sign.


Another example is a camera app that automatically triggers photo capture when it detects the user making a specific hand gesture in the air.

The framework can detect multiple hands or bodies in a scene, but the algorithms may not work as well with people who are wearing gloves, bent over, facing down, or wearing flowing or robe-like clothing. The algorithms can also struggle when a person is near the edge of the frame or partially obstructed.
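In these difficult cases, individual joints can come back with low confidence scores, so a common pattern is to drop unreliable points before drawing or analyzing them. The sketch below shows this filtering for a body pose observation; the 0.3 cutoff is an illustrative assumption, not an Apple recommendation.

```swift
import Vision

// Sketch: keep only the body joints Vision is reasonably confident about.
// The 0.3 cutoff is an illustrative assumption.
func reliableJoints(in body: VNHumanBodyPoseObservation)
    -> [VNHumanBodyPoseObservation.JointName: CGPoint] {
    guard let points = try? body.recognizedPoints(.all) else { return [:] }
    return points
        .filter { $0.value.confidence > 0.3 }
        .mapValues { $0.location }
}
```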

Similar functionality is already available through ARKit, but it is limited to augmented reality sessions and only works with the rear camera on compatible iPhone and iPad models. The updated Vision framework gives developers many more possibilities.
