At a briefing on Thursday, Apple confirmed earlier rumors of plans to roll out new technology in iOS, macOS, watchOS, and iMessage that will detect potential child sexual abuse material (CSAM) in images on user devices.
The latest versions of iOS and iPadOS, which will be released this fall, will include new cryptographic techniques intended to limit the spread of child sexual abuse material while preserving user privacy.
The most important thing about the new feature is that it scans images on the device as they are backed up to iCloud; the scanning happens on the device itself, not on Apple's servers after upload. For years, Apple has used hash-based systems to identify child abuse images sent by email, and similar features are used by Gmail and other cloud-based email services. The feature announced today applies the same kind of scan to users' own images stored in iCloud Photos, even if they are never sent to other users.
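The general shape of hash-based matching can be sketched as follows. This is an illustration only: Apple's system uses a perceptual hash (so near-duplicates still match) checked against a database supplied by child-safety organizations, and neither the database nor the matching protocol is public. The sketch below stands in an ordinary SHA-256 over raw bytes, which only catches exact byte-for-byte copies, and the database entries are hypothetical placeholders.

```python
import hashlib

# Hypothetical stand-in for a database of hashes of known prohibited
# images. A real deployment would hold perceptual hashes, not SHA-256.
KNOWN_HASHES = {
    hashlib.sha256(b"known-prohibited-image-bytes").hexdigest(),
}

def matches_known_database(image_bytes: bytes) -> bool:
    """Return True if this image's hash appears in the known database."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

print(matches_known_database(b"known-prohibited-image-bytes"))  # True
print(matches_known_database(b"an-ordinary-holiday-photo"))     # False
```

Because only hashes are compared, the matching side never needs to see the image content of non-matching photos, which is what allows this kind of check to run without exposing ordinary user images.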
Apple said it will have no access to images that do not match known prohibited material, and that the risk of the system mistakenly flagging an account as containing inappropriate material is extremely low. To catch any remaining errors, Apple specialists will manually review all system reports. Users will not be able to access the database of known child abuse imagery, nor determine which of their images have been flagged by the system.
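One way to see why the account-level false-flag risk can be driven so low is simple arithmetic. The figures and the threshold mechanism below are assumptions for illustration, not numbers from Apple's announcement: if a single image is falsely matched with probability p, and an account is only escalated to human review after k independent matches, the chance of falsely escalating the account is roughly p to the power k.

```python
def account_false_flag_probability(p: float, k: int) -> float:
    """Probability that k independent false matches all occur,
    assuming each image is falsely matched with probability p."""
    return p ** k

# Hypothetical numbers: p = one in a million per image, threshold of 5.
print(account_false_flag_probability(1e-6, 5))  # ≈ 1e-30
```

Even with a generous per-image error rate, requiring several matches before any human review makes an accidental account flag astronomically unlikely.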
Apple commissioned a technical assessment of the system from three independent cryptographers, who found it mathematically sound. The experts note that the system will significantly increase the likelihood of catching people who possess or distribute prohibited photos and videos, while its accuracy makes it extremely unlikely that images containing no illegal material will be flagged.
Along with the new measures in iCloud Photos, Apple announced two additional systems to protect children. The Messages app will scan attached images on child accounts to detect potentially explicit content; when such content is detected, the image is blurred and a warning is shown. In addition, parents can be notified if a child receives or sends an explicit image.
The other system concerns Siri. The voice assistant will explain to users that searching for content depicting child abuse is harmful, and will provide links to resources that offer help with this problem.