In today’s video call-first world, fostering natural connections is more important than ever before. One of the natural indicators of interest is eye contact, which is lost in the gap between the lens and the screen on video calls. Enter Microsoft’s Eye Contact feature, which adjusts one’s gaze in video calls and recordings so the user appears to be looking directly at the camera while looking at the screen.
This feature is available now on the Surface Pro X, and works in video calling apps like Microsoft Teams, Skype, and Webex Meet, among others.
To find out more about this feature, Microsoft Technical Fellow Stevie Bathiche answered a few questions about the technology.
What is the inspiration behind Eye Contact?
Nothing is more powerful than making a connection with another person. The human face has 43 muscles and can evoke about 21 expressions. Your face has a language all to itself. Social science has shown that when two people talk to each other, more often than not they are looking at each other’s faces. Making eye contact helps communicate the micro-emotions and intent behind what we are saying.
When you can’t make eye contact with somebody, the mind thinks the other person is not paying attention, and the language is stunted. Many devices on the market today don’t help, because the camera is not located where the person is looking. Using the dedicated AI silicon in the Microsoft SQ1 (more on that below) in Surface Pro X, along with the 5.0MP front-facing camera that captures 1080p full HD video, we have made it easier than ever to enable connections that make conversations feel natural.
What impact do you think this feature will have given the significance of video calling today?
Given the growing significance of remote work, learning, and virtual health appointments, for example, we realise that the user-facing camera is more important than ever, especially for accessing human connection while travel is limited. Eye Contact adjusts the user’s gaze across all these scenarios, helping people make eye contact as if they were right in the same room. The feature can simply be toggled on or off inside the Surface App; once enabled, it is automatically applied any time you use the camera, so it works across any video calling service (e.g. Microsoft Teams) or even when recording a video.
Do you have any data or insights on the psychology behind the impact Eye Contact has on people connecting through screens?
There are many studies on the positive influence of eye contact on communication, such as building trust and better mutual understanding. Young influencers have trained themselves to look at the camera when recording. This can be difficult for some people, like children, and makes it hard for them to connect. It can also be especially difficult when your work requires you to keep looking at the person you are speaking with, for example as a doctor or in legal professions. We hope Eye Contact enables greater inclusiveness across demographics.
Was there anything particularly challenging in the development process? Anything that was difficult to overcome that people would be surprised to hear?
There were two kinds of challenges. One was purely technical:
Since we wanted any application on Surface Pro X to have access to Eye Contact without extra work on the part of app developers, we made a conscious choice to integrate it as a feature of the camera. In principle, just as your camera can be adjusted for focus, resolution, or brightness, we wanted it to be capable of Eye Contact. Compared to developing an AI feature in a standalone application, this required a lot of teamwork from across the company.
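To illustrate why camera-level integration means app developers need no extra work, here is a minimal, purely hypothetical Python sketch. The names (`CameraPipeline`, `fake_gaze_correction`, and so on) are illustrative stand-ins, not Microsoft’s actual driver or camera API: the idea is simply that a processing stage inserted into the capture pipeline transforms every frame before any application reads it.

```python
from typing import Callable, List

# Toy stand-in for image pixel data (rows of pixel values).
Frame = List[List[int]]

class CameraPipeline:
    """Simulated capture pipeline: sensor -> processing stages -> app.

    Hypothetical sketch only; real camera stacks expose a similar idea
    through driver-level transforms, not this class.
    """

    def __init__(self, sensor: Callable[[], Frame]):
        self._sensor = sensor
        self._stages: List[Callable[[Frame], Frame]] = []

    def add_stage(self, stage: Callable[[Frame], Frame]) -> None:
        # A stage could be focus, brightness, or (hypothetically)
        # gaze correction -- apps never see this list.
        self._stages.append(stage)

    def read_frame(self) -> Frame:
        # Apps call this as usual; every enabled stage runs first.
        frame = self._sensor()
        for stage in self._stages:
            frame = stage(frame)
        return frame

def fake_sensor() -> Frame:
    # Pretend the sensor produced a 2x2 all-black frame.
    return [[0, 0], [0, 0]]

def fake_gaze_correction(frame: Frame) -> Frame:
    # Placeholder transform standing in for gaze correction:
    # brightens each pixel so the effect is visible in the output.
    return [[px + 1 for px in row] for row in frame]

pipeline = CameraPipeline(fake_sensor)
pipeline.add_stage(fake_gaze_correction)

# Any "app" reading from the pipeline gets transformed frames
# automatically, with no change to its own code.
print(pipeline.read_frame())  # → [[1, 1], [1, 1]]
```

Because the transform lives in the pipeline rather than in each application, toggling it on or off in one place changes what every consumer of the camera sees.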
The second challenge was perceptual. Given that eyes send such strong signals of attention and emotion, changes to a few pixels can have a huge impact on the perception of the whole image. Our team ensured the AI algorithm maintains people’s identity and intention when communicating. We believe image-modifying features should be opt-in for any user, which is why we provide a system-wide control for the user.
Last year Microsoft introduced the Surface Pro X with custom Microsoft silicon. What did that innovation enable?
The custom Microsoft SQ1 chipset inside Surface Pro X not only enabled our thinnest and lightest Surface Pro with great battery life and fast charging; it is also the first to integrate an AI chip and enable AI offload. This allows the feature to be more power-efficient, which means you shouldn’t feel an impact on your device when using it. Without this, the amount of compute needed to enable such features would take a very perceptible toll on your battery life, especially if you are doing multiple hours of calls and recordings. In fact, it makes Surface Pro X the first Windows PC to fully offload AI onto a specialised chip.
Will Eye Contact be available on other Surface devices in the future?
We believe a variety of these AI experiences will be infused across the Surface product line. Which experiences, and when, will be a function of the capability of the device (such as its AI acceleration ability), the AI software platform, and customer needs.
What is one technology trend you’re most excited about in the year ahead?
Certainly the amount of innovation we are seeing in AI at the edge, and how it enables so much compute with so little power, followed closely by advances in machine learning itself. With Eye Contact, we are working around a hardware constraint: we cannot place the camera right behind the display (not yet!), so it sits in the top bezel. The Eye Contact feature is really pioneering these new types of AI-first experiences. It helps drive the right engineering across the entire stack, from the application a user experiences, to the runtime infrastructure in Windows, all the way down to the silicon.
This is just the beginning of a very exciting future. Our devices are going to help us connect better with other people and be more productive than ever before, all while maintaining their beauty, portability, and battery life.