
After more than a year of silence, Apple announced that it has abandoned its child sexual abuse material (CSAM) detection program for iCloud.



CTOnews.com, December 8 — In a statement shared with WIRED, Apple announced today that it has abandoned its child sexual abuse material (CSAM) detection program for the iCloud cloud service. The feature, announced in August 2021, was designed to detect CSAM images stored in iCloud, but it has been controversial since it was unveiled.

Illustration: Jacqui VanLiew

Apple initially said that CSAM detection would arrive in updates to iOS 15 and iPadOS 15 by the end of 2021, but the company later postponed the feature based on "feedback from customers, advocacy groups, researchers and others." Now, after a year of silence, Apple has abandoned the CSAM detection program entirely.

CTOnews.com notes that Apple's plan drew criticism from a wide range of individuals and organizations, including security researchers, the Electronic Frontier Foundation (EFF), politicians, policy groups, university researchers, and even some Apple employees.

Some critics argued that the feature would create a "back door" into devices that governments or law enforcement agencies could use to monitor users. Another concern was false positives, including the possibility that someone could deliberately add CSAM images to another person's iCloud account to get that account flagged.

Apple said in a statement:

After extensive consultation with experts and gathering feedback on the child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that first launched in December 2021. We have decided not to move forward with the CSAM detection tool for iCloud Photos that we previously proposed.

Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all.


