WhatsApp Will Not Adopt Apple’s New Child Safety Measures
WhatsApp head Will Cathcart has confirmed that the messaging app will not implement Apple’s new child safety measures.
He made the announcement in a series of tweets posted on his Twitter account.
“People have asked if we’ll adopt this system for WhatsApp. The answer is no,” the first tweet in the thread reads.
Cathcart added that he is concerned about the new measures, calling them the wrong approach and a setback for user privacy.
WhatsApp Refuses to Adopt Apple’s Child Safety Measures
I read the information Apple put out yesterday and I’m concerned. I think this is the wrong approach and a setback for people’s privacy all over the world.
People have asked if we’ll adopt this system for WhatsApp. The answer is no.
— Will Cathcart (@wcathcart) August 6, 2021
WhatsApp’s refusal to adopt Apple’s new child safety measures stems from concerns over user privacy. What exactly are these new child safety measures?
On Thursday, Apple announced a plan meant to stop the spread of child sexual abuse material (CSAM). The tech company’s plan includes safety measures in three different areas.
One of these child safety measures will come through the Messages app, which will gain new tools to warn both children and their parents about CSAM. Apple will also introduce new technology in iOS and iPadOS that “will allow Apple to detect known CSAM images stored in iCloud Photos.”
For Siri and Search, Apple plans to add more resources that can help both children and their parents stay safe from CSAM online. For example, users will be able to ask Siri how to report CSAM.
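To give a rough sense of what “detecting known images” means, here is a minimal, hypothetical Python sketch of matching local photos against a database of known fingerprints. It is not Apple’s actual system: Apple describes an on-device perceptual hashing technology (NeuralHash) with cryptographic safeguards, while this toy example simply compares SHA-256 digests. The KNOWN_HASHES set, the “photos” directory, and the function names are illustrative assumptions.

```python
# Conceptual sketch only: compares exact SHA-256 digests of local files
# against a list of known hashes. Apple's announced system instead uses a
# perceptual hash (NeuralHash) and additional privacy protections.
import hashlib
from pathlib import Path

# Hypothetical database of known-image fingerprints (placeholder value only).
KNOWN_HASHES = {
    "0" * 64,  # not a real hash; stands in for a database entry
}

def file_digest(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def flag_matches(photo_dir: Path) -> list[Path]:
    """Return photos whose digests appear in the known-hash database."""
    return [
        photo for photo in sorted(photo_dir.glob("*.jpg"))
        if file_digest(photo) in KNOWN_HASHES
    ]

if __name__ == "__main__":
    for match in flag_matches(Path("photos")):
        print(f"possible match: {match}")
```

Because an exact digest changes if a single pixel changes, real systems use perceptual hashes that tolerate resizing and re-encoding; that difference is also at the heart of the abuse concerns critics raise below.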
WhatsApp’s Concerns Over Apple’s Child Safety Measures
The latest additions to Apple’s child safety measures have caused concerns over violations of user privacy. In particular, the feature that will enable Apple to scan images stored in iCloud Photos has drawn criticism.
In another tweet, Cathcart wrote, “Instead of focusing on making it easy for people to report content that’s shared with them, Apple has built software that can scan all the private photos on your phone — even photos you haven’t shared with anyone. That’s not privacy.”
The WhatsApp head also raised concerns over how this tool could be abused by governments, spyware companies, and even Apple itself. “This is an Apple built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control,” Cathcart said.
Professors, Organizations, and More Express Concern
Professors, organizations, and many others have shared the same concerns expressed by WhatsApp’s head. The Verge pointed to tweets by Johns Hopkins University associate professor Matthew Green, who discussed how the child safety measures Apple plans to implement could be abused.
Epic Games CEO Tim Sweeney, politician Brianna Wu, whistleblower Edward Snowden, and the Electronic Frontier Foundation (EFF) have also shared their concerns and criticisms of Apple’s new child safety measures.
Apple’s filtering of iMessage and iCloud is not a slippery slope to backdoors that suppress speech and make our communications less secure. We’re already there: this is a fully-built system just waiting for external pressure to make the slightest change. https://t.co/f2nv062t2n
— EFF (@EFF) August 5, 2021
This is the worst idea in Apple history, and I don’t say that lightly.
It destroys their credibility on privacy. It will be abused by governments. It will get gay children killed and disowned. This is the worst idea ever. https://t.co/M2EIn2jUK2
— Brianna Wu (@BriannaWu) August 5, 2021
Written by Isabella James