UPDATE: After announcing plans in August to scan iPhones, iPads and other Apple devices for Child Sexual Abuse Material (CSAM), Apple has now said that it will be delaying the launch. The decision comes after Apple recently released an updated paper that sought to allay concerns that the new scanning system could be abused by authoritarian governments.
Apple has said: "Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."
Previously, Apple had sought to assuage privacy concerns by stating that it would not rely on a single government-affiliated database to identify CSAM. Instead, it said it would only flag images whose hashes appeared in databases from at least two groups with different national affiliations. This plan was designed to prevent any single government from being able to secretly insert unrelated content for censorship purposes.
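For anyone curious how that "two databases" safeguard works in principle, the short Python sketch below shows the general idea: only hashes supplied by more than one independent organization make it into the matching set. The hash values and sources here are placeholders, not real data.

```python
# Illustrative only: the hash values and sources below are made up.
ncmec_hashes = {0xA1B2, 0xC3D4, 0xE5F6}       # hypothetical US-affiliated database
second_org_hashes = {0xC3D4, 0xE5F6, 0x0A0B}  # hypothetical database from another jurisdiction

# Only hashes present in BOTH databases become eligible for on-device matching,
# so no single government-affiliated source can add entries on its own.
eligible_hashes = ncmec_hashes & second_org_hashes
print(sorted(eligible_hashes))  # [50132, 58870], i.e. 0xC3D4 and 0xE5F6
```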
However, despite this attempt to appease its users, it appears that the backlash was still strong enough to make Apple rethink its plans. At present, it's unclear how long this delay will be.
ORIGINAL STORY: As camera phone technology progresses, user privacy has become a hot-button topic – and Apple's latest announcement could add another dimension to this ever-evolving issue. Apple has published a post explaining its new child safety features, which are broken down into three areas intended to limit the spread of Child Sexual Abuse Material (CSAM) and protect children from predators.
Apple will be introducing new communication tools that will enable parents to play a more informed role in helping their children navigate the internet. Using on-device machine learning, the Messages app will help safeguard against sensitive content, with features such as keeping sexually explicit photos blurred and warning the child before they view them. The parents will even be informed if the child does decide to view the image.
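To picture how that flow fits together, here is a minimal, hypothetical sketch in Python. None of these function names are real Apple APIs; the on-device classifier is stubbed out with a fake score, and the 0.9 threshold is purely an assumption for illustration.

```python
def explicit_score(image_bytes: bytes) -> float:
    """Stand-in for the on-device ML model; returns a fake probability."""
    return 0.95  # pretend the model flagged this image as sensitive

def handle_incoming_photo(image_bytes: bytes,
                          is_child_account: bool,
                          parent_alerts_on: bool) -> dict:
    """Return the actions the Messages app would take for this photo."""
    THRESHOLD = 0.9  # assumed cut-off for "sexually explicit"
    actions = {"blurred": False, "child_warned": False, "parents_notified": False}
    if is_child_account and explicit_score(image_bytes) >= THRESHOLD:
        actions["blurred"] = True       # the photo stays blurred in the thread
        actions["child_warned"] = True  # the child sees a warning before viewing
        child_views_anyway = True       # in practice this would come from the UI
        if child_views_anyway and parent_alerts_on:
            actions["parents_notified"] = True
    return actions

print(handle_incoming_photo(b"...", is_child_account=True, parent_alerts_on=True))
```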
However, another of the new changes is proving controversial. Apple has advised that "new technology in iOS and iPadOS will allow Apple to detect known CSAM images stored in iCloud Photos. This will enable Apple to report these instances to the National Center for Missing and Exploited Children [NCMEC]."
The post goes on to say that "Apple's method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices."
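To give a rough sense of what on-device hash matching with a reporting threshold can look like – this is not Apple's NeuralHash or its cryptographic protocol, just a toy illustration – the Python sketch below computes a simple 64-bit average hash for each photo, compares it against a set of known hashes, and only flags anything once the number of matches passes a threshold. The database, distance cut-off and threshold are all placeholder assumptions (Apple has cited a figure of around 30 matches, but treat the numbers below as illustrative).

```python
# Toy illustration of hash matching with a report threshold. This is NOT
# Apple's NeuralHash or its private set intersection scheme -- just a simple
# 64-bit "average hash" so the general idea is visible. Requires Pillow.
from PIL import Image

def average_hash(path: str) -> int:
    """Shrink to 8x8 grayscale, then set a bit for every pixel brighter than the mean."""
    pixels = list(Image.open(path).convert("L").resize((8, 8)).getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for value in pixels:
        bits = (bits << 1) | (1 if value > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Hypothetical database of known hashes -- in the real system this is supplied
# by NCMEC and other child safety organizations and stored in a blinded form.
KNOWN_HASHES: set[int] = set()
MATCH_DISTANCE = 5       # assumed: how close a hash must be to count as a match
REPORT_THRESHOLD = 30    # assumed: matches needed before anything is flagged

def count_matches(photo_paths: list[str]) -> int:
    """Count photos whose hash lands near any known hash."""
    return sum(
        1 for path in photo_paths
        if any(hamming(average_hash(path), known) <= MATCH_DISTANCE
               for known in KNOWN_HASHES)
    )

def should_flag(photo_paths: list[str]) -> bool:
    # Nothing is reported until the match count crosses the threshold.
    return count_matches(photo_paths) >= REPORT_THRESHOLD
```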
Despite Apple's focus on easing privacy concerns in the post, there are some who still have their doubts. Matthew Green, a Johns Hopkins University professor and cryptographer, reacted to the news by saying (via PetaPixel), "This is a really bad idea. These tools will allow Apple to scan your iPhone photos for photos that match a specific perceptual hash, and report them to Apple servers if too many appear. Initially I understand this will be used to perform client side scanning for cloud-stored photos. Eventually it could be a key ingredient in adding surveillance to encrypted messaging systems."
With the recent news that a new, particularly powerful piece of spyware is able to harvest photographs and secretly record its victims, it's clear that data protection and privacy rights are only going to become more contentious topics as camera phone technology evolves.