Last weekend, Microsoft employees set up a booth at the Ballard Locks, attached some cameras to their computer monitors with blue painter’s tape, and asked passersby to volunteer some very personal information.
This was done in the name of improving Windows Hello, which logs users into Windows 10 by recognizing their face or fingerprints. Many people, including a GeekWire reporter, let Microsoft create a 3D scan of their face, which the company in turn would use to improve Hello’s facial recognition technology.
Viewed in the context of the current digital privacy debate, that event is striking. Today, the FBI and Apple testified during a House Judiciary Committee hearing. At issue is the bureau’s request that Apple help unlock an iPhone that belonged to a shooter in the San Bernardino massacre. Apple has received industry support. Before Apple general counsel Bruce Sewell said the FBI’s proposed fix would “break the encryption system which protects personal information on every iPhone,” Microsoft president and chief legal officer Brad Smith told an audience in San Francisco that “the path to hell starts at the back door.”
Zoom out, and these two circumstances — one dominating the international news cycle, the other an aside on a local website — highlight a disconnect in the public’s views on digital security. They say something about who we trust and who we think we should trust. While Apple fiercely fights a court order to give the government back-door access, passersby willingly handed Microsoft far more sensitive data, protected by far flimsier digital walls.
“People are so willing to give up very vital data,” said Barbara Endicott-Popovsky, executive director of the University of Washington’s Center for Information Assurance and Cybersecurity. “The kind of identity theft that you could commit by having somebody’s biometrics scares the heck out of me. … If identity theft now, with credit cards being stolen, is hard to resolve, you tell me what it’s going to be like when your biometrics are stolen.”
Apple is fighting to preserve encryption of its users’ sensitive data (which includes fingerprints). In newer versions of its iOS operating system, the user’s passcode is the only key to that encryption, and it is one Apple itself cannot access. Apple also built in a feature that erases the phone’s data after 10 failed passcode attempts. It is this feature the FBI is asking Apple to disable, so that agency hackers can then guess the passcode on their own.
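The mechanism described above can be sketched in a few lines of code. This is a simplified, hypothetical model for illustration only — the class name, the key-derivation parameters, and the attempt counter are all assumptions, not Apple’s actual implementation — but it captures the two ideas at play: a key derived from the passcode rather than stored anywhere, and a wipe triggered by too many wrong guesses.

```python
import hashlib
import os

# Hypothetical sketch of passcode-based encryption with an attempt limit.
# Illustration only; not how iOS actually implements Data Protection.

MAX_ATTEMPTS = 10


class LockedDevice:
    def __init__(self, passcode: str):
        # Derive a key from the passcode with a slow key-derivation
        # function, so the key itself never has to be stored.
        self._salt = os.urandom(16)
        self._verifier = self._derive(passcode)
        self._failed_attempts = 0
        self._wiped = False

    def _derive(self, passcode: str) -> bytes:
        # A deliberately slow KDF makes each guess expensive.
        return hashlib.pbkdf2_hmac("sha256", passcode.encode(), self._salt, 100_000)

    def unlock(self, guess: str) -> bool:
        if self._wiped:
            raise RuntimeError("device data has been erased")
        if self._derive(guess) == self._verifier:
            self._failed_attempts = 0
            return True
        self._failed_attempts += 1
        if self._failed_attempts >= MAX_ATTEMPTS:
            # The feature at issue in the case: too many wrong guesses
            # destroys the data, making brute-force guessing useless.
            self._wiped = True
        return False
```

In this toy model, removing the wipe in `unlock` is exactly the kind of change the FBI requested: the encryption stays intact, but an attacker can try passcodes indefinitely.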
“There is already a door on the iPhone,” FBI Director James Comey said. “We are asking Apple to take the vicious guard dog away and let us pick the lock.”
For its part, Apple argues such a move would create a chink in iOS that would leave everyone’s iPhones susceptible to hackers. The argument comes even though Apple has complied with government requests for data (particularly in China) in the past.
At stake for Apple is customer trust. Ever since Edward Snowden revealed the National Security Agency was collecting citizens’ metadata, consumers have grown increasingly concerned about privacy. That’s why Apple, Microsoft, Google and others are constantly beefing up encryption. That’s why Apple doesn’t give itself access to customer passcodes. Consumers have, by and large, made clear they don’t want the government snooping around their data, and tech companies are trying to assure them that won’t happen.
But one must consider a fundamental facet of the current security debate — the government is asking for access to data we’ve willingly given to Apple (or Google or Facebook or Microsoft). Amid skepticism of government, our skepticism of private-sector companies seems minimal in contrast.
Todd Bishop is a founder of GeekWire and the writer who volunteered to let Microsoft scan his face over the weekend. “In terms of privacy, it did initially cause me some pause,” he said of the event. “I said, ‘Do I really want my face in a Microsoft database?’ … But, you know, what does Microsoft not already know about me? What more could they possibly get just from having a scan of my face? Maybe that’s naive, but to me, the risk seemed pretty low.”
Bishop’s decision boiled down to trust in the company: Microsoft has trusted him, as a reporter, with sensitive information in the past, and his reading of Microsoft’s privacy agreement made clear that his face scan was to be used only for research. Microsoft declined to supply 425 Business a copy of the agreement, nor did the company clarify whether the scans will be deleted after the Windows Hello research concludes.
What if the booth were operated by the FBI instead of Microsoft? “No, I don’t think I would have done it,” Bishop said. “I would have suspected an ulterior motive.”
Bishop’s skepticism of government is not rare; as Endicott-Popovsky points out, our country was founded in part on distaste for an overreaching state. Yet even if Microsoft, Apple, or Facebook intends nothing malicious with our data, that intent does not inherently protect the data from those who do plan malicious acts.
The argument only gets more complicated when intent is considered. Given access to data, the federal government typically uses it for monitoring and investigations. Given access to data, tech companies use it to sell tailored advertisements, customize news feeds, and automatically learn your interests, favorite restaurants, and schedule. Corporate access to our data is immense; the best evidence is the effectiveness, perhaps even addictiveness, of products that know an awful lot about us.
Even privacy-focused Apple, a hardware company at its core, has relatively easy access to personal data, if for no other reason than that customers store it on the company’s devices. That proximity is why the feds asked Apple for help in the first place; FBI agents did not throw up their arms in frustration, assuming Apple’s encryption was so good that even Apple’s own employees couldn’t find a workaround.
According to polls, the general public is torn between Apple and the FBI. At the same time, most people seem worried about the government having too much access to our data. Yet we remain largely unaware of the security risks all of our data faces and blissfully accepting of tech companies’ motives.
“When you don’t question, that’s when you leave yourself open,” Endicott-Popovsky said. “What do companies do to your data? They sell it — it’s worth something. Facebook is free, right? Wrong. You’re paying them with your data. They’re monetizing your data.”
As data passes through more hands — whether via business-to-business transactions, hacking, or user input — it becomes easier and easier for other entities — governmental, criminal, and corporate — to get hold of it. Be it a volunteered face scan or an FBI investigation, none of our digital information is completely safe. Perhaps the grander debate isn’t what Apple or the FBI should do with data, but what we should do to safeguard our most sensitive information.
This post has been edited to reflect Microsoft’s refusal to comment for this story.