Microsoft unveiled a smart mirror that can read human faces and determine a user's emotional state. Is this a good thing... or even a real thing?
According to an old saying, mirrors never lie. But if Microsoft has its way, someday mirrors may be able to lie right to the faces looking into them.
At the InnovFest UnBound 2016 technology conference held in Singapore this month, Microsoft revealed a mirror that contained both a hidden camera and a display screen. The demo showed that the mirror could read human emotions and then adjust what was displayed accordingly.
The Microsoft Magic Mirror uses facial recognition to identify the user staring into it. That technology also allows the smart mirror to read eight human emotions, including anger, happiness, and surprise. Not being a morning person, I hope it can also read "hasn't had their coffee yet."
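Services like this typically return a confidence score for each emotion category rather than a single verdict, and the displayed result is just the highest-scoring label. A minimal sketch of that last step, with invented score values for illustration (Microsoft has not published the Magic Mirror's internals):

```python
# Hypothetical per-emotion confidence scores, as an emotion-recognition
# service might return for one face. The eight categories match those
# commonly used by such services; the numbers are invented.
scores = {
    "anger": 0.01, "contempt": 0.02, "disgust": 0.01, "fear": 0.00,
    "happiness": 0.85, "neutral": 0.08, "sadness": 0.01, "surprise": 0.02,
}

def dominant_emotion(scores):
    """Return the emotion label with the highest confidence score."""
    return max(scores, key=scores.get)

print(dominant_emotion(scores))  # -> happiness
```

The catch, of course, is that a low-margin "winner" among eight scores still gets displayed as if it were certain.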
According to an article from CNBC, Izzat Khair, a member of Microsoft Singapore's developer experience team, envisions the Magic Mirror offering users the weather, traffic reports, and Twitter and Facebook feeds as they comb their hair and brush their teeth. Presumably, the smart mirror will also tap into the Internet of Things to gather other useful information, such as that needed for a smoother commute into the office.
Khair pointed out the potential business uses of the Magic Mirror technology. Advertisers could use the mirror to display targeted marketing materials based not only on the location of the device but on the user standing in front of it. All of which sounds great until you dig just a little deeper.
I have family members who spend time in the bathroom each weekday morning getting ready for work, and they always have the radio or television on so they can gauge the weather and the commuting traffic. Getting that information from the mirror right in front of you amounts to the same thing--so I get it.
However, at the same time, I don't want my mirror also reading my emotional state. I don't express my emotions with the expectation that a machine algorithm will read and interpret them. No algorithm is good enough for that, and likely none ever will be.
I look at my personal experience with Facebook as a prime example. Facebook treats me like a college kid in a fraternity that throws wild parties every night. (For the record, I'm 54 and my partying days are long gone.) But because I am a single male, it assumes I am only interested in finding female companions to ease my loneliness. Based on that misreading of the situation, Facebook bombards me with what its algorithm has determined is appropriate advertising.
Is the facial recognition technology Microsoft has developed for its Magic Mirror going to be much better than Facebook's? I have my doubts. More than that, I don't want my mirror, or any other computing device, trying to "understand" me and what makes me tick. Heck, I am not sure I understand myself most of the time. How can I expect some mirrored machine to do it for me?
Microsoft and other technology companies are spending a great deal of time and resources trying to create devices that can understand and predict human behavior. Those companies believe that armed with that knowledge, such devices can deliver a better user experience. To date, as far as I am concerned, no device has even come close--and that includes Microsoft's Magic Mirror.
- Microsoft acquires Solair: Plans to own enterprise IoT
- Windows 10 supports AllJoyn making the Internet of Things possible
- Windows 10 on the Raspberry Pi: What you need to know
- Microsoft Surface Hub introduces low-cost telepresence in 4K
Do you really want your technological devices reading your emotions? Why? What are the benefits?