Augmented Reality is used by Apple in FaceTime eye correction


Apple has been adding several useful features to iOS 13. One of these is an eye-correction feature for FaceTime video calls that is based on Augmented Reality. It was spotted by app designer Mike Rundle in the most recent public beta.

He reports that an option named “FaceTime Attention Correction” is now available. According to the feature’s tooltip, activating it makes your eye contact with the camera more accurate during video calls. This addresses a flaw common to most video-calling apps: when you look at the other person on your screen rather than at the camera, they see you looking slightly away. The result is a subtle disconnect in which neither party appears to be looking at the other.

To remedy this, Apple subtly alters your video stream. The new feature utilises the firm’s ARKit augmented reality software and the TrueDepth camera built into the latest iPhones. FaceTime uses the TrueDepth camera to build a detailed map of your face, then passes this data to ARKit, which renders a slightly adjusted version of your eyes and nose so that your gaze appears directed at the camera.
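Apple has not published how FaceTime does this, but the TrueDepth face data it would draw on is exposed to developers through ARKit’s face-tracking API. A minimal sketch of reading the per-eye transforms and estimated gaze point from an `ARFaceAnchor` (these property names are part of ARKit’s public API; the class name and the correction step itself are illustrative assumptions, not Apple’s implementation):

```swift
import ARKit

// Illustrative sketch: a face-tracking session that reads the
// TrueDepth-derived gaze data ARKit exposes. The properties read
// below are real ARKit API; FaceTime's actual correction logic
// is private, so nothing here claims to reproduce it.
class GazeReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires a TrueDepth camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // Point where the eyes converge, in face-anchor coordinates.
            let gaze = face.lookAtPoint
            // Per-eye transforms, which a renderer could use to
            // re-aim the eyes toward the camera.
            let left = face.leftEyeTransform
            let right = face.rightEyeTransform
            print("gaze:", gaze,
                  "left eye:", left.columns.3,
                  "right eye:", right.columns.3)
        }
    }
}
```

An app would call `start()` and receive continuous updates; the gap between `lookAtPoint` and the camera axis is the kind of signal a gaze-correction effect could compensate for.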



FaceTime uses augmented reality to change where your eyes appear to focus during video calls
