This article is part of the On Tech newsletter. You can sign up here to receive it weekdays.
Technology that makes our smartphones hard to break into without a password makes us safer from hackers and other unwanted intruders. But it also makes it tougher for the police to do their job.
For years, American law enforcement officials have said companies like Apple and Google should make it easier to break phone passcodes so they can better investigate crimes. Most technology experts say it’s impossible to make a phone that the good guys can easily get into without the bad guys exploiting the same opening.
My colleague Jack Nicas, who wrote on Wednesday about this standoff between cops and techies, talked to me about a big revelation: Law enforcement officials are able to get into phones far more often than previously understood. Jack also said that the fight over phone security has resulted in a messy but largely effective middle ground on safety — even if no one in this debate will say so.
Shira: Don’t we want the police and prosecutors to get into phones of people accused of crimes?
Jack: Most people would agree that it’s appropriate in some cases for law enforcement to get data from a suspect’s smartphone. And with court warrants they do, using tools from companies like Grayshift and Cellebrite to unlock encrypted iPhones and other smartphones and extract data.
What worries civil rights advocates is that we’ve seen that the police aren’t using these code-breaking technologies only in the most serious cases, but also in shoplifting, drug and assault investigations. And years of records collected by the nonprofit Upturn show that these tools are used by far more law enforcement agencies than we’d known.
Why should law-abiding people care? People might say they have nothing to hide.
We know how much information our phones have, including our entire location history, passwords to accounts, all our text messages and potentially embarrassing photos. That’s what makes smartphones so valuable in police investigations — and what makes it doubly important for there to be caution in when and why police search them.
Upturn found that many U.S. law enforcement agencies had few restrictions, if any, on the circumstances in which they can break into people’s phones and on what they do with the data. More public transparency about how law enforcement uses these tools would be a welcome extra layer of accountability.
If the police frequently break password locks on smartphones, what’s their complaint?
It sometimes takes days, weeks or months for law enforcement to use technology tools to break into a phone, and many investigations are time sensitive. In those situations, cops and prosecutors don’t care about Apple’s worries over cybersecurity.
Our colleagues’ investigation last year into child sexual abuse imagery showed the trade-offs of technology that keeps our digital activity private: It lets criminals hide horrible activity. That’s true of smartphone locks, too.
Absolutely. That’s why I don’t think we’ve seen the end of this debate — because there’s no easy solution.
Can smartphones keep us safe from bad guys and also let cops catch criminals? Is there a middle ground?
That middle ground may be the messy status quo.
Smartphones for the most part are well protected from break-ins by criminals and hackers. However, in cases where many of us agree that law enforcement should be able to access phone data, there are tools to help them break in. Those methods don’t require tech companies to create a “backdoor” — a software opening that security experts worry could be exploited by criminals or authoritarian governments.
Huh. So neither cops nor Apple would admit it, but this impasse … it’s good?
Perhaps as good as it can get. Completely impenetrable phones would be bad for public safety, while “backdoors” in encryption would be bad for cybersecurity. Neither side is terribly satisfied with this workaround, but it mostly works.
Big Tech makes strange bedfellows
I practically get hives when companies’ purported principles clash with their behavior.
Jack wrote about one of the allegations in the U.S. government’s antitrust lawsuit against Google: A main way that Google preserves its alleged monopoly is through agreements that make its search service the built-in choice for people using Apple’s computers, iPhones and the Siri voice assistant.
I also want to draw attention to Apple’s role here. This is the hives part.
Apple for years has trash-talked companies like Google that sell digital ads based on our activity and interests. “They’re gobbling up everything they can learn about you and trying to monetize it. We think that’s wrong,” Apple’s chief executive, Tim Cook, said in 2015. He’s repeated some version of that regularly.
But here’s the thing: It’s hypocritical for Apple to say that digital advertising machines are ruining our lives while simultaneously taking billions of dollars each year from Google, thereby strengthening one of those digital advertising machines.
An Apple executive was asked in a congressional hearing last year why the company bashes Google but takes its money. He said that Google’s search engine is the best.
Hmm. Even if Google search cured cancer, it is not mandatory for Apple to take Google’s money for anything.
Companies can conduct business how they wish. The government is not saying that Apple is doing anything wrong. But surely if Apple wanted to apply its principles about “creepy” tactics of companies like Google, it could act differently.
What if Apple didn’t take Google’s money, and when people bought a new iPhone they had the choice of which search engine they wanted to use?
Apple could also remind people every few months to try an alternative like Bing or DuckDuckGo. Apple could, if it wanted, make its own search engine. This would be costly and possibly unwise, but hey, Apple has the money to put its principles to work if it wished to do so.
Before we go …
- Google’s low-key chief executive gets the attention he probably didn’t want: My colleague Dai Wakabayashi introduces us to Sundar Pichai, Google’s boss who now leads a company in the government’s cross hairs. “He has surrounded himself with other serious, buttoned-up career Google managers who bring a lot of boring to the table,” Dai writes, delightfully.

- Technology is regularly abused to hurt the vulnerable: Tech nerds have worried for years about “deep fake” technology that digitally alters images and videos being used for political propaganda. But the tech is mostly abused to harass women. One service is creating unauthorized and faked images of women and girls with their clothing removed, The Washington Post reported.

- When members of Congress go big on Twitch: About 435,000 people at one point tuned into the web streaming site Twitch on Tuesday to see Representative Alexandria Ocasio-Cortez play the murder mystery video game Among Us to encourage young people to vote, The Verge wrote. My colleague Taylor Lorenz recently explained how this game has “begun to serve as a default social platform for young people stuck in quarantine.”
Hugs to this
A sprawling hedgehog “highway” in Britain lets the prickly cuties safely roam through people’s yards via miniature ramps, staircases and fence holes. Here is a video!
We want to hear from you. Tell us what you think of this newsletter and what else you’d like us to explore. You can reach us at ontech@nytimes.com.
If you don’t already get this newsletter in your inbox, please sign up here.