There is criticism of Apple's new child sexual abuse material (CSAM) detection software, which will run on US users' devices.
The new software is reportedly capable of detecting such material on a user's device before an image is backed up to iCloud.
The concerns about the software are not over searching for users who hold these kinds of images, but rather over what a government could do if it were given access to the system.
WhatsApp has said they find the decision to introduce this kind of spying tool “very concerning”.
Apple, however, has offered a rebuttal, stating that the system uses “new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy”.
The system will flag potential matches automatically using an AI, and a human reviewer will then check the images to confirm whether the match is correct.
Apple can then disable the user's account and alert law enforcement to the indecent material found on the device.
Apple states that the new technology offers benefits over the existing methods available to it.
Apple will only check images that the AI has flagged as possible child sexual abuse material.
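To make that matching-and-review flow concrete, here is a minimal, purely illustrative sketch in Python. It is not Apple's implementation: the real system is understood to rely on a perceptual hash (NeuralHash) combined with cryptographic threshold techniques, none of which are reproduced here, and the hash database, threshold value and function names below are all assumptions made for the example.

```python
import hashlib

# Hypothetical values only; Apple's real system uses a perceptual hash and
# cryptographic threshold schemes, not a plain SHA-256 lookup set.
KNOWN_CSAM_HASHES = {"deadbeef" * 8}   # placeholder database of known-image hashes
REVIEW_THRESHOLD = 30                  # hypothetical number of matches before review

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash; a real one tolerates resizing and cropping."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_before_upload(images: list[bytes]) -> list[bytes]:
    """Return the images whose hashes appear in the known-hash database."""
    return [img for img in images if image_hash(img) in KNOWN_CSAM_HASHES]

def flag_for_human_review(images: list[bytes]) -> list[bytes]:
    """Only if enough matches accumulate are images queued for a human reviewer,
    who then decides whether the account is disabled and authorities alerted."""
    matched = matches_before_upload(images)
    return matched if len(matched) >= REVIEW_THRESHOLD else []
```

The point of the threshold in this sketch is that a single chance match never reaches a human; only a user whose library repeatedly matches the known database is flagged, which is broadly how Apple has described limiting false positives.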
The question has to be asked, though: if you were taking photos of your child having a bath to send to your mother, would Apple also be looking at those?
Would you find a police officer at your door for taking a picture of your children, and how would you explain yourself to the authorities?
Police already have the right to hold an individual's phone if a criminal charge is brought against them, and they can then scan the device to look for information.
We also know that delays caused by COVID have left some of those who have been charged without their phones for several months.
Sajid Javid has praised Apple for taking this step to help protect children from those who would do them harm.
However, the question remains: why does the step need to be so drastic? WhatsApp has managed to report more than 400,000 cases to the US National Center for Missing and Exploited Children without breaking encryption.