Signal Foundation Warns Against EU's Plan to Scan Private Messages for CSAM

Jun 18, 2024 - 4:22 p.m.
The Hacker News
thehackernews.com


A controversial proposal put forth by the European Union to scan users’ private messages for detection of child sexual abuse material (CSAM) poses severe risks to end-to-end encryption (E2EE), warned Meredith Whittaker, president of the Signal Foundation, which maintains the privacy-focused messaging service of the same name.

“Mandating mass scanning of private communications fundamentally undermines encryption. Full stop,” Whittaker said in a statement on Monday.

“Whether this happens via tampering with, for instance, an encryption algorithm’s random number generation, or by implementing a key escrow system, or by forcing communications to pass through a surveillance system before they’re encrypted.”

The response comes as lawmakers in Europe are putting forth a regulation to fight CSAM that includes a new provision called “upload moderation,” which would allow messages to be scrutinized before they are encrypted.
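To make the idea of “upload moderation” concrete, the sketch below models client-side scanning that runs before encryption. It is purely illustrative: the hash list, function names, and crypto placeholders are invented for this example, the proposal does not prescribe any particular mechanism, and real detection systems typically rely on perceptual hashing rather than exact digests.

```python
import hashlib

# Hypothetical blocklist of digests of known illegal material (placeholder value, not real data).
KNOWN_DIGESTS = {"<digest of a known image>"}

def matches_blocklist(attachment: bytes) -> bool:
    """Exact-hash check against the blocklist. Real deployments generally use
    perceptual hashing so that re-encoded or resized copies still match."""
    return hashlib.sha256(attachment).hexdigest() in KNOWN_DIGESTS

def send_attachment(attachment: bytes, encrypt, transmit, report):
    """Illustrates why scanning 'ahead of encryption' sits outside the E2EE
    guarantee: the plaintext is inspected, and possibly reported, before the
    encrypt step ever runs."""
    if matches_blocklist(attachment):
        report(attachment)             # plaintext leaves the E2EE boundary here
        return
    transmit(encrypt(attachment))      # only non-flagged content gets end-to-end encrypted
```

The security objection in the article follows directly from this shape: whoever controls the blocklist or the scanning code decides what escapes the encrypted channel, which is why critics treat the mechanism as a backdoor regardless of what it is called.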


A recent report from Euractiv revealed that audio communications are excluded from the ambit of the law and that users must consent to this detection under the service provider’s terms and conditions.

“Those who do not consent can still use parts of the service that do not involve sending visual content and URLs,” it further reported.

Europol, in late April 2024, called on the tech industry and governments to prioritize public safety, warning that security measures like E2EE could prevent law enforcement agencies from accessing problematic content, reigniting the ongoing debate over balancing privacy with the fight against serious crime.

It also called on platforms to design security systems in such a way that they can still identify and report harmful and illegal activity to law enforcement, without specifying how this could be achieved.

iPhone maker Apple famously announced plans to implement client-side screening for CSAM, but abandoned the idea in late 2022 following sustained blowback from privacy and security advocates.


“Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types,” the company said at the time, explaining its decision. It also described the mechanism as a “slippery slope of unintended consequences.”

Signal’s Whittaker further said calling the approach “upload moderation” is a word game that’s tantamount to inserting a backdoor (or a front door), effectively creating a security vulnerability that’s ripe for exploitation by malicious actors and nation-state hackers.

“Either end-to-end encryption protects everyone, and enshrines security and privacy, or it’s broken for everyone,” she said. “And breaking end-to-end encryption, particularly at such a geopolitically volatile time, is a disastrous proposition.”

Update

Encrypted service providers Proton and Threema have also come out strongly against the so-called Chat Control bill, stating that the law's passage could severely hamper the privacy and confidentiality of EU citizens and civil society members.

“It doesn’t matter how the EU Commission is trying to sell it – as ‘client-side scanning,’ ‘upload moderation,’ or ‘AI detection’ – Chat Control is still mass surveillance,” the Swiss company said. “And regardless of its technical implementation, mass surveillance is always an incredibly bad idea.”

Several other organizations, including Access Now, the Electronic Frontier Foundation, Internet Freedom Foundation, the Center for Democracy and Technology, Mozilla, and Privacy International have also signed a joint statement urging the EU to reject proposals that scan user content.

Decision Over Scanning Messages for CSAM Delayed

EU lawmakers have delayed a vote over proposed legislation that could lead to messaging services having to scan photos, videos, and links to detect possible CSAM amid continued pushback from privacy advocates, Politico reported on June 20.

Several countries including Austria, Czechia, Germany, Poland, and the Netherlands were expected to abstain or oppose the law due to cybersecurity and privacy concerns, indicating a lack of consensus among EU members.

Digital rights activist and Pirate Party MEP Patrick Breyer, who has been a vocal critic of the plan, said the postponement “should be celebrated” but emphasized that the “Orwellian chat control” has to be revised to ensure encryption and anonymity.

