Facebook’s smart glasses could lead to Black Mirror-style privacy concerns
Smartphones vs. smart glasses
Even though people are now used to being photographed in public, they typically expect the photographer to raise a smartphone to compose the shot. Augmented reality glasses fundamentally disrupt this sense of normalcy. The public setting may be the same, but the scale and manner of recording have changed.
Such deviations from the norm have long been recognized by researchers as a violation of privacy. My group’s research has found that people in the vicinity of nontraditional cameras want a more tangible sense of when their privacy is being compromised because they find it difficult to know whether they are being recorded.
Absent the typical physical gestures of taking a photo, people need better ways to convey whether a camera or microphone is recording. Facebook has already been warned by the European Union that the LED indicating that a pair of Ray-Ban Stories is recording is too small.
In the longer term, however, people might become accustomed to smart glasses as the new normal. Our research found that although young adults worry about others recording their embarrassing moments on smartphones, they have adjusted to the pervasive presence of cameras.
Smart glasses as a memory aid
An important application of smart glasses is as a memory aid. If you could record or “lifelog” your entire day from a first-person point of view, you could simply rewind or scroll through the video at will. You could examine the video to see where you left your keys, or you could replay a conversation to recall a friend’s movie recommendation.
Our research studied volunteers who wore lifelogging cameras for several days. We uncovered several privacy concerns – this time, for the camera wearer. Considering who, or what algorithms, might have access to the camera footage, people may worry about the detailed portrait it paints of them.
Who you meet, what you eat, what you watch, and what your living room really looks like without guests are all recorded. We found that people were especially concerned about the places being recorded, as well as their computer and phone screens, which formed a large fraction of their lifelogging history.
Popular media already has its take on what can go horribly wrong with such memory aids. “The Entire History of You” episode of the TV series “Black Mirror” shows how even the most casual arguments can lead to people digging through lifelogs for evidence of who said exactly what and when. In such a world, it is difficult to just move on. It’s a lesson in the importance of forgetting.
Psychologists have pointed to the importance of forgetting as a natural human coping mechanism to move past traumatic experiences. Maybe AI algorithms can be put to good use identifying digital memories to delete. For example, our research has devised AI-based algorithms to detect sensitive places like bathrooms and computer and phone screens, which were high on the worry list in our lifelogging study. Once detected, footage can be selectively deleted from a person’s digital memories.
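The selective-deletion idea above can be sketched in a few lines. This is only an illustration, not the authors’ actual system: the scene classifier here is a hypothetical stand-in (a real pipeline would run a trained vision model on each frame), and the label names are invented for the example.

```python
# Illustrative sketch: filtering "sensitive" frames out of a lifelog.
# Labels and the classifier are hypothetical placeholders.

SENSITIVE_LABELS = {"bathroom", "computer_screen", "phone_screen"}

def classify_frame(frame):
    # Stand-in for a scene-recognition model; here each frame is a dict
    # that already carries a label for demonstration purposes.
    return frame.get("label", "unknown")

def redact_lifelog(frames):
    """Return only the frames not classified as sensitive."""
    return [f for f in frames if classify_frame(f) not in SENSITIVE_LABELS]

lifelog = [
    {"t": 0, "label": "kitchen"},
    {"t": 1, "label": "bathroom"},
    {"t": 2, "label": "computer_screen"},
    {"t": 3, "label": "living_room"},
]
print([f["t"] for f in redact_lifelog(lifelog)])  # → [0, 3]
```

In a real deployment the deletion step would remove the flagged footage from storage rather than merely filtering it at read time.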
X-ray specs of the digital self?
However, smart glasses have the potential to do more than simply record video. It’s important to prepare for the possibility of a world in which smart glasses use facial recognition, analyze people’s expressions, look up and display personal information, and even record and analyze conversations. These applications raise important questions about privacy and security.
We studied the use of smart glasses by people with visual impairments. We found that these potential users were worried about the inaccuracy of artificial intelligence algorithms and their potential to misrepresent other people.
Even if accurate, they felt it was improper to infer someone’s weight or age. They also questioned whether it was ethical for such algorithms to guess someone’s gender or race. Researchers have also debated whether AI should be used to detect emotions, which can be expressed differently by people from different cultures.
Augmenting Facebook’s view of the future
I have only scratched the surface of the privacy and security considerations for augmented reality glasses. As Facebook charges ahead with augmented reality, I believe it’s critical that the company address these concerns.
I am heartened by the stellar list of privacy and security researchers Facebook is collaborating with to make sure its technology is worthy of the public’s trust, especially given the company’s recent track record.
But I can only hope that Facebook will tread carefully and ensure that its view of the future includes the concerns of these and other privacy and security researchers.
This article has been updated to clarify that future Facebook augmented reality glasses will not necessarily be in the Ray-Ban Stories product line and that, while the company’s goals include identifying people, the Ego4D research data was not collected using facial recognition technology.
Article by Apu Kapadia, Professor of Computer Science, Indiana University
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Story by The Conversation, an independent news and commentary website produced by academics and journalists.