Microsoft's AI-powered "eye contact" feature for the Surface Pro X is finally rolling out, months after it was first announced for the ARM-powered device last year.
The feature is arriving as part of Windows 10 Insider Preview Build 20175, which suggests it's probably not far from a broad release.
According to Microsoft, the feature relies on the "artificial intelligence capabilities of the Microsoft SQ1 processor" to adjust where your eyes appear to be looking during a video call, so that you always seem to be making eye contact with the camera – even when you're actually looking at your display. Once enabled, eye contact correction is applied automatically to any app that uses the front-facing camera (such as Zoom, Skype, or Google Meet). However, it only works when the Surface Pro X is in landscape orientation.
In an interview with VentureBeat last autumn, Microsoft explained that the technology depends on the unique AI capabilities of the ARM-based SQ1 processor, which is why the feature is exclusive to the Surface Pro X and unavailable on ordinary x86 Windows PCs. The processing involved simply consumes too much power to be practical on a conventional machine.
If that sounds familiar, it's because Microsoft isn't the only company working on AI-corrected gaze: Apple has been developing a similar FaceTime attention correction feature for some time. Early versions appeared in iOS 13 betas last year before being pulled from the final release. However, the feature is listed on Apple's iOS 14 website, so iPhone and iPad users shouldn't have to wait too long to fake eye contact on FaceTime calls.