Will WhatsApp be forced to “scan” all your photos?

Over the past few hours, the heads of encrypted messaging applications have come out strongly against a draft European regulation. The text could profoundly change the way WhatsApp, Signal and iMessage work.

Will your photos on WhatsApp soon be “scanned” by the application? The platform, a subsidiary of Meta (formerly Facebook), in principle lets users send encrypted messages and has absolutely no desire to scan them. But the European Union could well force it to, along with other encrypted messaging services such as iMessage (Apple) and Signal.

The goal: to require these encrypted messaging services to analyze content in order to detect possible exchanges of child sexual abuse material, which would in practice break the encryption that protects the secrecy of correspondence.

• What is end-to-end encryption?

As a reminder, end-to-end encryption is a technology that seals content (text or images) so that only the sender and the recipient can read it. Concretely, the encryption keys are held only on the users’ devices and are applied to every exchange. WhatsApp, Apple (for iMessage) and Signal therefore have no access to the content of users’ conversations.
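
To illustrate the principle, here is a minimal sketch in Python, using the third-party cryptography library, of how two devices can agree on a shared key and exchange a message that the server relaying it cannot read. This is a deliberately simplified illustration of the idea, not the actual protocol used by WhatsApp or Signal (which rely on the more elaborate Signal protocol with rotating keys).

```python
# Minimal illustration of end-to-end encryption: only the two devices
# ever hold the key; the relaying server sees only ciphertext.
# Requires the third-party "cryptography" package (pip install cryptography).
import os
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305

# Each device generates its own key pair; only the public halves are exchanged.
alice_private = X25519PrivateKey.generate()
bob_private = X25519PrivateKey.generate()

# Both sides derive the same symmetric key from the key exchange.
def derive_key(own_private, peer_public):
    shared = own_private.exchange(peer_public)
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"e2e-demo").derive(shared)

alice_key = derive_key(alice_private, bob_private.public_key())
bob_key = derive_key(bob_private, alice_private.public_key())

# Alice encrypts; anyone in the middle only ever sees the ciphertext.
nonce = os.urandom(12)
ciphertext = ChaCha20Poly1305(alice_key).encrypt(nonce, b"Hello Bob", None)

# Bob, holding the same derived key, is the only one who can decrypt.
print(ChaCha20Poly1305(bob_key).decrypt(nonce, ciphertext, None))  # b'Hello Bob'
```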

Such protection of correspondence is not always to the authorities’ taste, especially since, to date, no one has managed to break the encryption of WhatsApp, iMessage or Signal to intercept communications. In France, Interior Minister Gérald Darmanin has, for example, spoken out in favor of deactivating this encryption for the benefit of the police.

• What would the new regulation entail?

Since the first discussions in 2021, the draft European Union regulation has aimed to require WhatsApp, iMessage and Signal to analyze the content exchanged by users, in particular photos and videos, so that an algorithm can assess whether the images depict child sexual abuse. In the event of a positive match, the platforms would then have to pass the information on to the authorities in order to identify and apprehend suspected offenders.

According to a recent version of the working document, cited by the Euractiv website, users would then face two choices: accept that all their photos and videos are “scanned” automatically, or keep the encryption but give up sending photos and videos on WhatsApp and other encrypted messaging services. The scheme has been denounced by whistleblower Edward Snowden, who called it “mass surveillance” in a tweet also shared by the head of WhatsApp.

• Which applications are affected?

The applications mainly affected would be those that enable end-to-end encryption by default, starting with WhatsApp, Signal and iMessage. But other services such as Messenger, which also offers end-to-end encryption, would be affected as well.

“Scanning people’s messages as proposed by the European Union breaks encryption. It’s surveillance and it’s a dangerous path,” denounced Will Cathcart, the head of WhatsApp, on June 18.

For her part, Meredith Whittaker, president of the encrypted messaging app Signal, has for many months criticized this push for “surveillance”, which, in her view, “weakens encryption and creates significant flaws”.

Although iMessage is also affected, Apple has not spoken publicly on the subject. In 2021, the company announced a system to analyze users’ photos on iCloud in order to detect known child sexual abuse material. Faced with privacy criticism of the project, Apple ultimately abandoned it.

However, the tech industry lobby CCIA, which also represents Apple in Europe, has positioned itself against the text as it currently stands.

• What are the chances of adoption of the regulation?

Despite the intense outcry, the entry into force of this potential regulation, slated for 2026, is still far from assured. The version currently under discussion comes from the Council of the European Union. If it is approved there, it would then still need the backing of the European Parliament.

Since Parliament has already spoken out against measures that would break end-to-end encryption, negotiations in the coming months could be particularly tough.

• Is such a regulation legal?

Even if all European representatives managed to agree on generalized analysis of photos and videos on encrypted messaging applications, a legal risk would still hang over the regulation: the draft has in fact been reviewed by the legal service of the Council of the European Union, which reached a conclusion that was critical, to say the least.

According to its conclusions, such a tool could conflict with the Charter of Fundamental Rights of the European Union, which notably protects Europeans’ private lives, particularly because of the risk of generalized surveillance of all European Internet users without any targeting based on specific indications of potentially illegal behavior.

“The regime should apply to people for whom there are reasonable grounds to believe that they are involved in one way or another (in child sexual abuse activities, editor’s note),” the legal service thus specifies.

• Would the tool against child sexual abuse be effective?

Alongside the risks this regulation would pose to privacy, there is also the question of how effective such tools would be in the fight against child sexual abuse. The European Parliament’s Committee on Civil Liberties, Justice and Home Affairs commissioned an impact study on the matter in 2023.

In the document, the experts distinguish between several situations. The first is the detection of child sexual abuse material already identified by the authorities on the web. To make this fight more effective, international databases exist that contain a “digital fingerprint” of each known child abuse photo and video.

In practice, a unique sequence of letters and numbers, a hash, is associated with each image. Under the new regulation, each image sent on WhatsApp would be run through the same algorithm to produce its fingerprint and check whether that code already exists in the database, and therefore whether it is previously identified child sexual abuse material.
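
As an illustration, a minimal Python sketch of this fingerprint comparison might look like the following. The database entry here is purely hypothetical, and real systems (PhotoDNA, PDQ and the like) use perceptual hashes that tolerate resizing and recompression rather than a plain cryptographic hash, but the matching principle is the same.

```python
# Hedged sketch of fingerprint matching against a database of known material.
# Real deployments use perceptual hashes (e.g. PhotoDNA/PDQ), not SHA-256,
# so that slightly altered copies of an image still match.
import hashlib

# Hypothetical database of fingerprints of already-identified images.
KNOWN_FINGERPRINTS = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(image_bytes: bytes) -> str:
    """Turn an image into a fixed-length code (here a SHA-256 hex digest)."""
    return hashlib.sha256(image_bytes).hexdigest()

def is_known_material(image_bytes: bytes) -> bool:
    """Check whether the image's fingerprint already exists in the database."""
    return fingerprint(image_bytes) in KNOWN_FINGERPRINTS

# Example: an outgoing photo is fingerprinted and checked before being sent.
outgoing_photo = b"...raw image bytes..."
print(is_known_material(outgoing_photo))  # False unless the fingerprint matches
```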

While this approach is particularly reliable, detecting images and videos that have never been recorded by the authorities is much more complex: artificial intelligence must then be used to estimate the nature of the content, this time with a very significant risk of error, especially given the immense volume of messages exchanged on these applications.
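
To give a sense of scale, here is a back-of-the-envelope calculation with purely illustrative numbers (the volumes and error rates below are assumptions, not figures from the impact study): even a classifier that is rarely wrong produces an enormous number of false alerts when applied to billions of messages.

```python
# Back-of-the-envelope illustration of the base-rate problem.
# All numbers below are assumptions for illustration only.
daily_images = 5_000_000_000   # hypothetical images sent per day across the services
illegal_share = 0.000_001      # hypothetical share that is actually illegal
false_positive_rate = 0.001    # hypothetical 0.1% false-positive rate of the AI

illegal = daily_images * illegal_share
legal = daily_images - illegal
false_alerts = legal * false_positive_rate

print(f"Actually illegal images per day: {illegal:,.0f}")
print(f"False alerts per day:            {false_alerts:,.0f}")
# Even with a seemingly small error rate, false alerts dwarf real cases,
# which is the kind of workload impact the study warns about.
```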

The impact study thus mentions “an increase in content identified as illegal and a drop in the reliability of the tool”, with “a considerable impact” on the authorities’ workload.

The document also mentions another risk, this time not a technical one: offenders adapting to these new tools by “continuing their activities on the dark web, where their identification will be more difficult”.
