Last week, the FBI warned iPhone and Android users to stop texting and to use an encrypted messaging platform instead. The news made global headlines, with cyber experts urging smartphone users to switch to fully encrypted platforms such as WhatsApp, Signal and Facebook Messenger. But the FBI also has a serious warning for U.S. citizens using those platforms: even those apps, it says, must change.
While China has denied any involvement in the ongoing cyberattacks on U.S. telco networks, describing this as "a pretext to smear China," government agencies are clear that Salt Typhoon hackers, linked to China's Ministry of State Security, have infiltrated multiple networks, putting both metadata and actual content at risk.
Encrypting content is certainly the answer, and the FBI's advice to citizens seemed clear-cut: "use a cell phone that automatically receives timely operating system updates, responsibly managed encryption and phishing-resistant MFA for email, social media and collaboration tool accounts."
What was missed in almost all the reports covering Salt Typhoon was the precise wording of the FBI's warning. "Responsibly managed encryption" is a game-changer. None of the messaging platforms that cyber experts and the media urged SMS/RCS users to switch to are "responsibly managed" under this definition.
The FBI has now expanded on the wording of its warning last week, telling me “law enforcement supports strong, responsibly managed encryption. This encryption should be designed to protect people’s privacy and also managed so U.S. tech companies can provide readable content in response to a lawful court order.”
This doesn't mean giving the FBI or other agencies a direct line into content; it means the tech platforms (Meta, Apple, Google) should have the means, the keys, to provide content when warranted to do so by a court. Right now they cannot, and police chiefs and other agencies describe this situation as "going dark" and want it to change.
FBI Director Christopher Wray warns that “the public should not have to choose between safe data and safe communities. We should be able to have both—and we can have both… Collecting the stuff—the evidence—is getting harder, because so much of that evidence now lives in the digital realm. Terrorists, hackers, child predators, and more are taking advantage of end-to-end encryption to conceal their communications and illegal activities from us.”
This is a dilemma. Apple, Google and Meta all make a virtue of their own lack of access to user content. Apple, by way of example, assures that “end-to-end encrypted data can be decrypted only on your trusted devices where you’re signed in to your Apple Account. No one else can access your end-to-end encrypted data—not even Apple—and this data remains secure even in the case of a data breach in the cloud.”
“Unfortunately,” Wray said, “this means that even when we have rock-solid legal process—a warrant issued by a judge, based on probable cause—the FBI and our partners often can’t obtain digital evidence, which makes it even harder for us to stop the bad guys… the reality is we have an entirely unfettered space that’s completely beyond fully lawful access—a place where child predators, terrorists, and spies can conceal their communications and operate with impunity—and we’ve got to find a way to deal with that problem.”
The dilemma is that if Google or Meta or even Apple does have the keys, as used to be the case, then the end-to-end encryption enclave falls away. How would users feel if Google could access their currently encrypted content when required to, or simply when it wanted to? This is as much about distrust of big tech as trust or otherwise of law enforcement. And, as ever, while the argument runs one way in the U.S. and Europe, the same technical back doors would exist in the Middle East, Africa, China, Russia and Southeast Asia, regions with a different view on privacy and state monitoring activities.
There are just three providers of end-to-end encrypted messaging that matter: Apple, Google and Meta, although Signal provides a smaller alternative favored by security experts. These are the "U.S. tech companies" the FBI says should change platforms and policies to "provide readable content in response to a lawful court order."
Last week's FBI warning highlights that Google and Apple only provide such encryption within their respective Android and iPhone walled gardens. That leaves Meta as the world's provider of cross-platform, end-to-end encrypted messaging, with WhatsApp and Facebook Messenger each counting their user bases in the billions.
In response to last week's FBI warning and its push for "responsibly managed" encryption, Meta told me that "the best way to protect and secure people's communications is end-to-end encryption. This recent attack makes that point incredibly clear and we will continue to provide this technology to people who rely on WhatsApp." Signal hasn't yet provided a response. What is clear, though, is there is still no appetite across big tech to make any such changes. And they've proven willing to fight to protect encryption even if it means exiting countries or even regions.
But the U.S. is different: for these tech companies, the U.S. is home. This debate will change if, and only if, there's a change in public attitudes, a push from users to change these apps to enable such warranted access. Without that shift in public sentiment, the politics are fraught with risk. "Our country," Wray said, "has a well-established, constitutional process for balancing individual privacy interests with law enforcement's need to access evidence to protect the American people."
There are no signs at all yet of that change coming. Users want security and privacy. End-to-end encryption has become table stakes for iPhone and Android, and it is expanding, as we saw with Facebook Messenger's recent update, not retracting.
Deputy U.S. Attorney General Rod Rosenstein first pushed "responsible encryption" in 2017, under the first Trump presidency. "Encryption is a foundational element of data security and authentication," he said, "essential to the growth and flourishing of the digital economy, and we in law enforcement have no desire to undermine it."
Rosenstein warned that “the advent of ‘warrant-proof’ encryption is a serious problem… The law recognizes that legitimate law enforcement needs can outweigh personal privacy concerns. Our society has never had a system where evidence of criminal wrongdoing was totally impervious to detection… But that is the world that technology companies are creating.”
In response, the EFF said Rosenstein's "'Responsible Encryption' demand is bad and he should feel bad… DOJ has said that they want to have an 'adult conversation' about encryption. This is not it. The DOJ needs to understand that secure end-to-end encryption is a responsible security measure that helps protect people."
The argument against “responsible encryption” is very simple. Content is either secure or it’s not. If someone else has a key to your content, regardless of the policies protecting its use, then your content is exposed and at risk. That’s why the security community feels so strongly about this—it’s seen as black and white, as binary.
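That binary nature can be made concrete with a toy sketch (a throwaway XOR construction for illustration only, not a real cipher; all names here are hypothetical): the math makes no distinction between key holders, so a platform-held or escrowed copy of a key decrypts content exactly as the user's own devices do.

```python
import hashlib
from itertools import count

def keystream(key: bytes, length: int) -> bytes:
    # Derive a pseudo-random keystream from the key.
    # Toy construction, NOT a production cipher.
    out = b""
    for i in count():
        if len(out) >= length:
            break
        out += hashlib.sha256(key + i.to_bytes(4, "big")).digest()
    return out[:length]

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # XOR against the keystream: the same call encrypts and decrypts.
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

user_key = b"held-only-on-the-user-devices"
message = b"meet at noon"
ciphertext = xor_cipher(user_key, message)

# End-to-end model: only the endpoints hold user_key.
assert xor_cipher(user_key, ciphertext) == message

# "Responsibly managed" model: a second, escrowed copy of the key
# exists. Whoever holds it (the platform, a court, or an attacker
# who steals it) recovers the content identically.
escrowed_key = user_key  # a copy, by design
assert xor_cipher(escrowed_key, ciphertext) == message

# Without any copy of the key, the ciphertext stays opaque.
assert xor_cipher(b"wrong-key", ciphertext) != message
```

The assertions are the whole argument in miniature: the cipher cannot tell a lawful key holder from an unlawful one, which is why security experts treat any second key as a back door rather than a safeguard.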
Seven years later, the debate has not changed. And in the U.S., Europe and elsewhere, 2025 looks like the year it ignites all over again.