
Apple responds, explaining its reasons for abandoning the child sexual abuse material (CSAM) detection program.


CTOnews.com, September 1 -- Apple announced last December that it would stop its plan to detect child sexual abuse material (CSAM) in the iCloud cloud service, following strong opposition from media outlets, non-profit organizations and consumers.

The feature, announced in August 2021, was intended to detect CSAM images stored in iCloud, and it proved controversial from the moment it was unveiled.

Illustration: Jacqui VanLiew

Apple initially said CSAM detection would arrive in the iOS 15 and iPadOS 15 updates by the end of 2021, but the company ultimately postponed the feature based on "feedback from customers, advocacy groups, researchers and others."

Heat Initiative, a child safety group, told Apple this week that it was organizing a campaign demanding that the company "detect, report and remove" CSAM from iCloud and provide users with richer tools for reporting CSAM content to Apple.

CTOnews.com, citing Wired, reports that Apple issued a rare response to Heat Initiative, in which Erik Neuenschwander, Apple's director of user privacy and child safety, outlined the reasons for abandoning the development of iCloud CSAM scanning:

Child sexual abuse material is abhorrent, and we are committed to breaking the chain of coercion and influence that makes children susceptible to it.

However, scanning every user's privately stored iCloud data would create new threat vectors for data thieves to find and exploit.

This approach could also lead to many unintended consequences: scanning for one type of content, for instance, opens the door to bulk surveillance and could create a desire to search other encrypted messaging systems across content types.

Apple previously said when shutting down its CSAM program:

After extensive consultation with experts and after gathering feedback on the child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature, which first launched in December 2021. We have also decided not to move forward with the CSAM detection tool for iCloud Photos that we previously proposed.

Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates and other companies to help protect young people, preserve their right to privacy, and make the Internet a safer place for children and for us all.

Related reading:

"After more than a year of silence, Apple explains why it abandoned the iCloud child sexual abuse material (CSAM) detection program"
