When Meta Platforms began rolling out encryption for Facebook direct messages this month, it was advancing a project that members of its safety staff have long warned would end in disaster.
The encryption feature, which Meta has said will also be implemented on its Instagram app, is the culmination of work first announced in 2019 as part of a push to enhance users’ privacy. Even before that, some Meta employees such as David Erb had warned internally that such encryption would limit the ability to detect and report child sexual abuse on Meta’s platforms.
In 2018, Erb was an engineering director at Facebook running a “community integrity” team focused on detecting harmful user behavior. When the team began studying inappropriate interactions between adults and minors on Facebook, it determined that the most frequent way adults found children to prey upon was Facebook’s “People You May Know” algorithm.
“It was a hundred times worse than any of us expected,” Erb said in a recent interview. “There were millions of pedophiles targeting tens of millions of children.”
Soon after, Erb learned that top executives were discussing encrypting Facebook messages. Worried that would prevent Meta from combating the child-safety problems his group had uncovered, Erb told his manager that he would resign if encryption were implemented on Facebook messages, he says. Within days, he was removed from his role and placed on leave. He ultimately resigned.
Meta has called privacy a fundamental human right, and privacy advocates have long urged tech companies to embrace such “end-to-end” encryption, which blocks anyone other than senders and receivers from viewing messages. When Chief Executive Mark Zuckerberg announced the encryption plan in March 2019, he said that “the future of communication will increasingly shift to private, encrypted services where people can be confident what they say to each other stays secure.”
He also acknowledged the dangers, saying the move would require working with law enforcement to help prevent the foreseeable abuse. “When billions of people use a service to connect, some of them are going to misuse it for truly terrible things like child exploitation, terrorism, and extortion,” Zuckerberg wrote.
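In practice, end-to-end encryption means a message is scrambled on the sender’s device and unscrambled only on the recipient’s; the platform relaying it sees only ciphertext. Here is a minimal sketch of that principle using the open-source PyNaCl library; the keys and message are illustrative, not Meta’s implementation:

```python
# A minimal sketch of end-to-end encryption using the PyNaCl library.
# Illustrative only: this is not Meta's production design, just the core
# idea that only the endpoints' keys can open a message.
from nacl.public import PrivateKey, Box

# Each party generates a key pair on their own device; private keys
# never leave the device, so the platform only ever sees public keys.
sender_key = PrivateKey.generate()
recipient_key = PrivateKey.generate()

# The sender encrypts using their private key and the recipient's public key.
ciphertext = Box(sender_key, recipient_key.public_key).encrypt(b"hello")

# The server relaying the ciphertext cannot read it. Only the recipient,
# holding the matching private key, can decrypt it.
plaintext = Box(recipient_key, sender_key.public_key).decrypt(ciphertext)
assert plaintext == b"hello"
```

The safety debate turns on exactly this property: the same math that shuts out hackers and governments also shuts out the platform’s own investigators.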
Meta spokesman Andy Stone said the company has since worked to address those risks and has “spent years developing robust safety measures on Facebook and Instagram to prevent and combat abuse or unlawful activity.”
While other companies, including Apple, have implemented such encryption on their messaging platforms, and Meta’s own WhatsApp platform is already the world’s largest encrypted messaging app, those services don’t generally connect users with strangers.
Facebook and Instagram both have expansive search and recommendation functions that let users find and message people they don’t know. Adding encryption to such social networking services cloaks communications among strangers—including predators contacting children.
Other social-media networks with these discovery features don’t offer encryption. TikTok fully disabled messaging for users younger than 16 in 2020, and requires older teens to opt in to send and receive messages. It says it doesn’t offer encryption because “we place a premium on ensuring that our younger users have a safe experience.” YouTube eliminated private messaging in 2019, saying it wanted to “focus on improving public conversations.”
Meta said it has taken measures to make messaging safer for underage users. It bars users who tell the platform they are older than 19 from initiating conversations with teens who don’t follow them, and works to prevent adults who behave suspiciously from interacting with teenagers’ accounts.
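Such a rule reduces to a simple check. Here is a rough sketch; the function and field names are hypothetical, since Meta hasn’t published its enforcement logic:

```python
# Hypothetical sketch of the messaging restriction described above.
# Function and field names are invented for illustration; Meta has not
# published its actual enforcement logic.
def may_initiate_conversation(sender_age: int,
                              recipient_age: int,
                              recipient_follows_sender: bool) -> bool:
    """Bar self-reported adults over 19 from starting conversations
    with teens who don't follow them."""
    sender_is_adult = sender_age > 19
    recipient_is_teen = recipient_age < 18
    if sender_is_adult and recipient_is_teen:
        return recipient_follows_sender
    return True
```

Note that the rule keys off self-reported ages, which is why the restriction applies to users who tell the platform they are older than 19.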
Privacy advocates say encryption is essential in the digital age to protect communications from the prying eyes of governments, hackers and the businesses that own the platforms. “Everyone needs safety, and in a world where our information is everywhere for use and abuse by criminals, cops, and corporations alike, encryption—and cybersecurity more generally—should be a priority for all,” the American Civil Liberties Union wrote in October.
Law-enforcement agencies and child-safety advocates, meanwhile, have long worried that encryption will make it harder to combat child predators.
‘Too old for you’
As Meta staffers prepared to implement encryption for Facebook messages, a Chicago-area man named Karl Quilter was using Facebook and two other messaging platforms to solicit sexual photos from young girls in the Philippines, according to court filings. On Facebook, his alias was “Mathew Jones.”
In May 2020, Quilter used Facebook to begin messaging a girl about two weeks before her 16th birthday. He told her he was 55 years old, “too old for you,” but, days later, asked for photos of her breasts and vagina. In exchange, he wrote, he would send money for her to buy medication for herself and food for her family.
The girl uploaded three nude photos, saying: “I did this to help my brothers and sister and to eat more for a day.”
Quilter continued to message the girl on Facebook, telling her he would visit the Philippines that December, and asking her to promise that she would give him her virginity.
Facebook investigators discovered the messages and turned them over to authorities. The company, until recently, had protocols allowing certain employees to screen messages for potential child sexual abuse and imminent risks to human life. Federal investigators discovered that Quilter had been using the same Facebook account to solicit sex content from at least eight other girls in the Philippines.
Quilter was arrested in November 2020, a month before his planned trip. He was sentenced this year to 30 years in federal prison after pleading guilty to sexually exploiting children.
“Their trust and safety team’s ability to access messages was instrumental,” said Brian Fitzgerald, head of the Department of Homeland Security Chicago office that investigated Quilter. “A random stranger shouldn’t be able to—off of first contact—go to encrypted communications with a minor.”
‘A uniquely Meta recipe for disaster’
Before rolling out encryption on Facebook, Meta considered measures to limit the circumstances under which messages would be fully encrypted and weighed allowing some employees to collect additional data or access messages in specific cases. Vaishnavi J, Meta’s former head of youth policy, said there was internal discussion about exempting child accounts on Facebook and Instagram from encryption.
The measures were rejected, she and other former employees told The Wall Street Journal, both to avoid potential liability for what happened to teenagers on its platforms and because retaining such access would require staff to keep spending time and resources on problematic behavior. J lost her job earlier this year amid layoffs that were part of what Zuckerberg has branded Meta’s “year of efficiency.”
“Meta will consistently say this is an industrywide issue. But no other company is melding recommendations with encrypted messaging,” she said. “That’s a uniquely Meta recipe for disaster.”
Meta’s Stone called it “patently absurd” to claim that cost or labor savings factored into the decision not to exempt underage users from encryption.
Guy Rosen, now Meta’s chief information security officer, told employees in May 2019 that encrypting messages would require a different approach to detecting bad actors, one built on the unencrypted data the company could still access.
“We need to be laser focused on doing everything we can to keep users safe, within the limits of what’s possible in a private, encrypted messaging surface,” he wrote in an internal Facebook post reviewed by the Journal.
Some employees voiced support for the move. Others responded with concern. One wrote that introducing encryption at such a scale would create “unknown unknowns”—a range of potential abuses that the company couldn’t entirely anticipate.
“I personally agree with these concerns and voiced them loudly,” Rosen replied. “The absolute best thing we can do is execute the crap out of this.”
Rosen, through Meta, declined to comment.
There were limits to the solutions Meta’s safety staff could pursue. After learning about the encryption plans, Erb consulted with WhatsApp colleagues experienced with encryption about how to address child-exploitation concerns. They said Facebook would have to limit its recommendation and content-discovery features to avoid the sort of predation Erb feared, he recalls.
Meta executives rejected his team’s proposal to stop recommending minors to adults via its “People You May Know” algorithm, Erb says, and there was little appetite for the mass removal of the millions of accounts his team had identified as engaged in potential child exploitation.
Stone disputed that Meta hadn’t adequately responded to the team’s findings, saying they triggered the company’s effort to stop recommending teens to suspected abusers, an effort that within months led to restrictions on millions of accounts behaving suspiciously. Since 2020, Stone said, Meta has removed 32 child-exploitation networks comprising 160,000 accounts.
Meta has built an internal tool to make better use of unencrypted data, in hopes of compensating for its lost access to messages. The tool, Macsa, short for “Malicious Child Safety Actor,” flags accounts showing suspicious patterns, such as adults repeatedly blocked by young users or accounts whose posts were removed for containing child-exploitation content.
After a Journal article in June showed how Instagram’s algorithms were connecting and promoting accounts openly devoted to underage-sex content, Meta took measures to enhance Macsa. It expanded the algorithm to incorporate 60 different behavioral cues. If users trigger too many of them, Meta will restrict their ability to find and contact children, or potentially disable their accounts.
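That description amounts to a threshold scheme over behavioral signals. Here is a minimal sketch of the idea; the cue names and threshold are invented for illustration, since Meta hasn’t disclosed Macsa’s internals:

```python
# Hypothetical sketch of a threshold-based behavioral-cue system like the
# one described above. The cue names and threshold are invented; Meta has
# not published Macsa's actual signals.
SUSPICIOUS_CUES = {
    "repeatedly_blocked_by_minors",
    "posts_removed_for_child_exploitation",
    "mass_friend_requests_to_minors",
    # ...the real system reportedly tracks 60 behavioral cues
}
RESTRICTION_THRESHOLD = 3  # illustrative value

def evaluate_account(triggered_cues: set[str]) -> str:
    """Map the number of suspicious cues an account trips to an action."""
    hits = len(triggered_cues & SUSPICIOUS_CUES)
    if hits >= RESTRICTION_THRESHOLD:
        # Per the article: restrict the ability to find and contact
        # children, or potentially disable the account.
        return "restrict_or_disable"
    return "no_action"
```

Because the signals come from account metadata and behavior rather than message contents, a system like this keeps working after encryption.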
Current and former employees say Macsa is valuable, but not a substitute for the investigative capacity that encryption would preclude.
Larry Magid, CEO of ConnectSafely, a nonprofit that promotes online safety, said that while encrypted messaging could complicate investigations, it also could make children safer by shielding their communications from sexual predators or others who could use their information to target them.
“Some minors will use messaging apps to share information, location, medical concerns, mental health issues, sexuality information and other highly personal or confidential information with a trusted person,” said Magid, who is a member of Meta’s safety advisory council and whose group gets financial support from tech companies including Meta.
Lured into leaving
When a 15-year-old California girl disappeared last year shortly before her quinceañera party, her friends and little sister told police that she’d been talking to a man named “Angel” on Instagram.
Police identified Angel as 38-year-old Daniel Navarro, but he denied knowing where the girl, referred to as A.T. in court papers, was. Instagram messages between them, obtained through a search warrant, told a different story. Navarro had been soliciting graphic images from A.T. for months, and saying that he’d impregnate her, the court papers say.
Navarro lured her into leaving her home, and took her to Mexico, the court papers say. Authorities used the messages to obtain an arrest warrant, apprehended him as he returned to the U.S., and recovered A.T. from his father’s house in Tijuana. Navarro is awaiting trial on charges of enticing and transporting a minor with the intent to engage in criminal sexual activity. He has pleaded not guilty.
Meta is the largest source of tips to the National Center for Missing and Exploited Children, which passes them on to law enforcement. In 2022, Meta reported more than 26 million incidents of child exploitation to the center. More than 20 million of those reports relied on the contents of Instagram and Facebook messages the company detected without user assistance, work that won’t be possible with encrypted messaging.
“Unless the minors are actually reporting it to the platform, there will be no knowledge that it’s happening,” said John Shehan, a senior vice president at the center.
Meta says it has succeeded in getting young users to flag inappropriate messages more often, which will allow the company to adjudicate more reports of abuse under encryption.
Blackmailed on Instagram
Jennifer Buta woke up one morning last year to find that her 17-year-old son, Jordan DeMay, had taken his life.
The prior evening, alleged scammers in Nigeria, using a hacked account, had posed as a girl flirting with him on Instagram to entice him into sending a nude photograph, according to court filings. They then threatened to send it to everyone he knew unless he paid them $1,000. DeMay didn’t have enough money to meet their demands, and said he would kill himself if they went through with the threat.
“Good. Do that fast,” the scammers replied.
DeMay killed himself less than six hours after first contact with the scammers. Before he took his life, he deleted the conversation from his phone. Police recovered the messages from Instagram, along with the alleged scammers’ communications with more than 100 other victims, material that paved the way for the indictment and extradition of two men to the U.S. They have pleaded not guilty and are awaiting trial.
“If police hadn’t been able to get access to those messages, I might still be wondering why Jordan killed himself,” Buta said. “Encrypting messages on Instagram will create a breeding ground for people who can have access to our children thinking that ‘I’m never going to get caught.’”
Write to Jeff Horwitz at jeff.horwitz@wsj.com and Katherine Blunt at katherine.blunt@wsj.com