
New EU rules require chat apps to scan private messages for child abuse


The European Commission has proposed controversial new regulations that would require chat apps like WhatsApp and Facebook Messenger to selectively scan users’ private messages for child sexual abuse material (CSAM) and grooming behavior. The proposal is similar to plans suggested by Apple last year, but goes much further, critics say.

After a draft of the regulation leaked earlier this week, privacy experts condemned it in the strongest terms. “This document is the most terrifying thing I’ve ever seen,” cryptography professor Matthew Green tweeted. “It describes the most sophisticated mass surveillance machinery ever deployed outside of China and the USSR. Not an exaggeration.”

Jan Penfrat of digital advocacy group European Digital Rights (EDRi) echoed the concern, saying: “This looks like a shameful blanket #surveillance law entirely unfitting for any free democracy.” (A comparison of the PDFs shows that the differences between the leaked draft and the final proposal are only cosmetic.)

The regulation would establish a number of new obligations for “online service providers” – a broad category that includes app stores, hosting companies and any provider of “interpersonal communication services”.

The most extreme obligations would apply to communication services such as WhatsApp, Signal and Facebook Messenger. If a company in this group receives a “detection order” from the EU, it would be required to scan the messages of selected users for known child sexual abuse material, as well as previously unseen CSAM and any messages containing “grooming” or “solicitation of children.” These last two categories of content would require the use of machine vision tools and AI systems to analyze the context of images and text messages.

(By contrast, Apple’s proposal last year would have scanned messages only for known examples of CSAM, which reduces the chance of errors. After heavy criticism that the feature would harm user privacy, Apple removed references to it from its site and postponed its rollout indefinitely.)
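To make that distinction concrete, here is a minimal, purely illustrative Python sketch. Every name and value in it is hypothetical, and real systems use perceptual hashes (such as PhotoDNA) and proprietary machine learning models rather than the stand-ins below; the point is only that matching known material is a lookup against a fixed hash list, while flagging unseen material depends on a model score and a threshold.

import hashlib

KNOWN_HASHES = {
    "0" * 32,  # placeholder entry; real lists hold hashes of verified material
}

def matches_known_content(image_bytes: bytes) -> bool:
    # Stand-in for a perceptual hash such as PhotoDNA. A match is a simple
    # lookup, so errors are largely limited to hash collisions.
    return hashlib.md5(image_bytes).hexdigest() in KNOWN_HASHES

def classifier_score(message_text: str) -> float:
    # Placeholder for a trained ML model; a real system would run a neural
    # network over the image or text here.
    return 0.42

def flag_unseen_content(message_text: str, threshold: float = 0.9) -> bool:
    # Detecting previously unseen material or grooming means comparing a model
    # score to a threshold, so both false positives and false negatives are
    # possible, which is the error-proneness critics point to.
    return classifier_score(message_text) >= threshold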

“Detection Orders” would be issued by individual EU countries, and the Commission claims these would be “targeted and specified” to reduce privacy breaches. However, the regulation isn’t clear on how these orders would be targeted — whether they’d be limited to individuals and groups, for example, or apply to much broader categories.

Critics of the regulation say such detection orders could be used in a broad and invasive way to target large groups of users. “The proposal creates the opportunity for [the orders] to be targeted, but doesn’t require it,” Ella Jakubowska, a policy advisor at EDRi, told The Verge. “It leaves the door completely open to much more generalized surveillance.”

Privacy experts say the proposal could also seriously undermine (and maybe even break) end-to-end encryption. The proposal doesn’t explicitly call for an end to encrypted services, but experts say that requiring companies to install in their systems whatever software the EU deems necessary to detect CSAM would, in effect, make robust end-to-end encryption impossible. And because of the EU’s influence on digital policy elsewhere in the world, the same measures could spread around the globe, including to authoritarian states.

“There is no way to do what the EU proposal aims to do other than for governments to read and scan user messages on a massive scale,” Joe Mullin, senior policy analyst at the digital rights group Electronic Frontier Foundation, told CNBC. “If passed into law, the proposal would be a disaster for user privacy, not just in the EU but around the world.”

In addition to the encryption issues, the Commission’s decision to target previously unseen CSAM and grooming behavior has also drawn criticism. Detecting this content would require algorithmic scanners, which, according to the Commission, would preserve the anonymity of the users being scanned. But experts say such tools are error-prone and would lead to innocent individuals being surveilled by their governments.

“There was a stir when Apple suggested something similar just for finding known [CSAM] content. But when you introduce ambiguity and these contextual scenarios, where AI-based tools are notoriously unreliable, the challenges are much greater,” says EDRi’s Jakubowska. “You only have to look at how unreliable spam filters are. They’ve been in our email for 20 years, but how many of us still get spam in our inboxes and miss legitimate emails? That really shows the limitation of these technologies.”

Jakubowska said: “This whole proposal is based on mandating things that are technically unfeasible, if not impossible.”

