Apple released several new features this month under the guise of expanding its protections for young people, at least two of which substantially undermine the company’s longstanding dedication to user privacy. One initiative involves scanning images sent to and from child accounts in Messages, which breaks Apple’s pledge to provide end-to-end encryption in communications. And when such commitments are breached, the door opens to new harms; this is what makes breaking encryption so insidious.
Edward Snowden, the former Intelligence Community officer and whistleblower, has a unique perspective on what Apple is doing. Snowden released documents that provided a public window into the NSA and its international intelligence partners’ secret mass surveillance programs and capabilities. These revelations generated unprecedented attention around the world on privacy intrusions and digital security, leading to a global debate on the issue.
Snowden had the following to say about the new Apple features:
By now you’ve probably heard that Apple plans to push a new and uniquely intrusive surveillance system out to many of the more than one billion iPhones it has sold, which all run the behemoth’s proprietary, take-it-or-leave-it software. This new offensive is tentatively slated to begin with the launch of iOS 15—almost certainly in mid-September—with the devices of its US user-base designated as the initial targets. We’re told that other countries will be spared, but not for long.
You might have noticed that I haven’t mentioned which problem it is that Apple is purporting to solve. Why? Because it doesn’t matter.
Having read thousands upon thousands of remarks on this growing scandal, it has become clear to me that many understand it doesn’t matter, but few if any have been willing to actually say it. Speaking candidly, if that’s still allowed, that’s the way it always goes when someone of institutional significance launches a campaign to defend an indefensible intrusion into our private spaces. They make a mad dash to the supposed high ground, from which they speak in low, solemn tones about their moral mission before fervently invoking the dread spectre of the Four Horsemen of the Infopocalypse, warning that only a dubious amulet—or suspicious software update—can save us from the most threatening members of our species.
Suddenly, everybody with a principled objection is forced to preface their concern with apologetic throat-clearing and the establishment of bona fides: I lost a friend when the towers came down, however… As a parent, I understand this is a real problem, but…
Apple’s new system, regardless of how anyone tries to justify it, will permanently redefine what belongs to you, and what belongs to them.
The task Apple intends its new surveillance system to perform—preventing their cloud systems from being used to store digital contraband, in this case unlawful images uploaded by their customers—is traditionally performed by searching their systems. While it’s still problematic for anybody to search through a billion people’s private files, the fact that they can only see the files you gave them is a crucial limitation.
Now, however, that’s all set to change. Under the new design, your phone will now perform these searches on Apple’s behalf before your photos have even reached their iCloud servers, and—yada, yada, yada—if enough “forbidden content” is discovered, law-enforcement will be notified.
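The mechanism Snowden describes, on-device matching of photo fingerprints against a database of known contraband fingerprints, with a report triggered only once enough matches accumulate, can be sketched in miniature. Everything below is a hypothetical illustration for clarity: the function names, the threshold value, and the use of a cryptographic hash in place of a real perceptual hash are all assumptions, not Apple's actual implementation.

```python
# Illustrative sketch of threshold-based fingerprint matching.
# Hypothetical code: not Apple's actual system, which uses a
# similarity-preserving perceptual hash rather than SHA-256.
import hashlib

REPORT_THRESHOLD = 3  # hypothetical: minimum matches before a report fires


def fingerprint(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash of an image's contents."""
    return hashlib.sha256(image_bytes).hexdigest()


def scan_library(photos: list[bytes], blocklist: set[str]) -> bool:
    """Return True if enough photos match the blocklist to trigger a report.

    In the scheme described above, this check runs on the user's own
    device, before the photos ever reach the cloud.
    """
    matches = sum(1 for photo in photos if fingerprint(photo) in blocklist)
    return matches >= REPORT_THRESHOLD
```

The point the sketch makes concrete is the one Snowden is driving at: the search logic and the blocklist live on the device, so what gets searched for is decided entirely by whoever supplies that list.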
Apple’s objectives are admirable: protecting children from strangers who use communication tools to lure and exploit them, and restricting the dissemination of child sexual abuse content. And it is evident that there are no simple solutions to child endangerment. However, scanning and flagging users’ messages and photos creates a major risk of privacy violations. It both introduces a security flaw in Messages and ignores the reality of where abuse occurs most frequently, how harmful communications happen, and what young people truly need to feel safe online.
One particular frustration for me is that I know some people at Apple, and I even like some people at Apple—bright, principled people who should know better. Actually, who do know better. Every security expert in the world is screaming themselves hoarse now, imploring Apple to stop, even those experts who in more normal circumstances reliably argue in favor of censorship. Even some survivors of child exploitation are against it. And yet, as the OG designer Galileo once said, it moves.
The whole thing is worth reading, and I recommend you do so. Snowden concludes by warning us that “every iPhone will search itself for whatever Apple wants, or for whatever Apple is directed to want. They are inventing a world in which every product you purchase owes its highest loyalty to someone other than its owner. To put it bluntly, this is not an innovation but a tragedy, a disaster-in-the-making.”
Will that be Android or Blackberry?