Signal Foundation Warns Against EU’s Plan to Scan Private Messages for CSAM

Jun 18, 2024 · Newsroom · Privacy / Encryption

A controversial European Union proposal to scan users' private messages to detect child sexual abuse material (CSAM) poses severe risks to end-to-end encryption (E2EE), warned Meredith Whittaker, president of the Signal Foundation, which maintains the privacy-focused messaging service of the same name.

“Mandating mass scanning of private communications fundamentally undermines encryption. Full stop,” Whittaker said in a statement on Monday.

“Whether this happens via tampering with, for instance, an encryption algorithm’s random number generation, or by implementing a key escrow system, or by forcing communications to pass through a surveillance system before they’re encrypted.”

The response comes as lawmakers in Europe are putting forth regulations to fight CSAM with a new provision called “upload moderation,” which allows messages to be scrutinized before they are encrypted.

A recent report from Euractiv revealed that audio communications are excluded from the ambit of the law and that users must consent to this detection under the service provider’s terms and conditions.

“Those who do not consent can still use parts of the service that do not involve sending visual content and URLs,” it further reported.
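The “upload moderation” mechanism described above — inspecting content on the sender's device before it is encrypted — can be sketched conceptually. Everything here is an illustrative assumption: the function names are invented, and real proposals contemplate perceptual hashing and classifiers rather than the exact-match SHA-256 lookup used below. The point of the sketch is that the plaintext is examined while it is still readable, which is why critics argue the scheme sidesteps the end-to-end guarantee.

```python
import hashlib

# Hypothetical blocklist of digests of known flagged content
# (illustrative only; real systems use perceptual hashes, not SHA-256).
BLOCKLIST = {hashlib.sha256(b"known-flagged-image-bytes").hexdigest()}

def upload_moderation(plaintext: bytes) -> bool:
    """Return True if the content matches the blocklist.

    Runs on the sender's device *before* encryption: the message
    is inspected in the clear, regardless of the cipher used later.
    """
    return hashlib.sha256(plaintext).hexdigest() in BLOCKLIST

def send(plaintext: bytes, encrypt):
    """Block flagged content pre-encryption; otherwise encrypt and send."""
    if upload_moderation(plaintext):
        return None  # message stopped before it is ever encrypted
    return encrypt(plaintext)

# Toy XOR stand-in for a real cipher, just to make the demo runnable.
toy_encrypt = lambda m: bytes(b ^ 0x5A for b in m)

blocked = send(b"known-flagged-image-bytes", toy_encrypt)
allowed = send(b"hello", toy_encrypt)
```

Note that the strength of `encrypt` is irrelevant here: because `upload_moderation` sees the plaintext first, the scanning step is the security boundary, not the cipher — which is the core of the criticism quoted in this article.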

Europol, in late April 2024, called on the tech industry and governments to prioritize public safety, warning that security measures like E2EE could prevent law enforcement agencies from accessing problematic content, reigniting an ongoing debate about balancing privacy against combating serious crimes.

It also called for platforms to design security systems in such a way that they can still identify and report harmful and illegal activity to law enforcement, without delving into the implementation specifics.

iPhone maker Apple famously announced plans to implement client-side screening for CSAM, but called them off in late 2022 following sustained blowback from privacy and security advocates.

“Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types,” the company said at the time, explaining its decision. It also described the mechanism as a “slippery slope of unintended consequences.”

Signal’s Whittaker further said calling the approach “upload moderation” is a word game that’s tantamount to inserting a backdoor (or a front door), effectively creating a security vulnerability ripe for exploitation by malicious actors and nation-state hackers.

“Either end-to-end encryption protects everyone, and enshrines security and privacy, or it’s broken for everyone,” she said. “And breaking end-to-end encryption, particularly at such a geopolitically volatile time, is a disastrous proposition.”
