- A new report alleges that Apple plans to subvert iPhone privacy in the name of stopping child abuse.
- Reportedly, the company plans to scan user photos for evidence of child abuse. If found, the algorithm would push that photo to a human reviewer.
- The prospect of Apple employees reviewing perfectly legal photos of a user’s children, flagged in error, is certainly concerning.
Over the past few years, Apple has pushed hard to solidify its reputation as a privacy-focused company. It frequently cites its “walled garden” approach as a boon for privacy and security.
However, a new report from the Financial Times throws that reputation into question. According to the report, Apple is planning to roll out a new system that would rifle through user-created photos and videos on Apple products, including the iPhone. The reason Apple would sacrifice iPhone privacy in this way: to hunt for child abusers.
The system is allegedly known as “neuralMatch.” Essentially, it would use software to scan user-created images on Apple products. If the software finds any media that could depict child abuse, including child pornography, a human employee would be notified. That employee would then assess the photo and decide what action should be taken.
Apple declined to comment on the allegations.
iPhone privacy coming to an end?
Obviously, the exploitation of children is a huge problem, and one that any human with a heart knows should be dealt with swiftly and vigorously. However, the idea of someone at Apple viewing innocuous photos of your kids that neuralMatch accidentally flagged as illegal seems like an all-too-real possibility.
There’s also the worry that software designed to spot child abuse today could be trained to spot something else tomorrow. What if, instead of child abuse, it were drug use, for example? How far is Apple willing to go to help governments and law enforcement catch criminals?
It’s possible Apple could make this system official within a matter of days. We’ll need to wait and see how the public reacts, if and when that happens.