Apple is doubling down on its efforts to combat child abuse. The company is now poised to scan every photo uploaded to iCloud to check for child sexual abuse material and flag potential perpetrators. Apple is not alone here: other tech companies that offer cloud storage face similar pressure to police the content stored on their platforms.
Apple unveiled the initiative at a tech conference. From now on, images backed up to the company’s online storage service, iCloud, will be screened for illegal material. This marks something of a first for Apple, which has frequently clashed with authorities by refusing to break into suspects’ phones to aid investigations.
“Apple is dedicated to protecting children throughout our ecosystem wherever our products are used, and we continue to support innovation in this space. As part of this commitment, Apple uses image-matching technology to help find and report child exploitation. Much like spam filters in email, our systems use electronic signatures to find suspected child exploitation. Accounts with child exploitation content violate our terms and conditions of service, and any accounts we find with this material will be disabled,” according to Jane Horvath, Apple’s chief privacy officer.
Horvath added that while Apple will not remove encryption from its messaging services in order to catch criminals, the company is willing to use other technologies to screen for child sexual abuse material.
It’s unclear how the company checks for child abuse images as it did not elaborate on the process.
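Apple has not confirmed any technical details, but Horvath’s spam-filter analogy suggests signature matching: comparing each uploaded image against a database of digital fingerprints of known illegal material. The sketch below is purely illustrative, not Apple’s method. It assumes a hypothetical KNOWN_SIGNATURES database and an “uploads” folder, and it uses a plain cryptographic hash for simplicity, whereas real systems rely on perceptual hashes (PhotoDNA-style) that still match after resizing or re-encoding.

```python
import hashlib
from pathlib import Path

# Hypothetical database of "electronic signatures" (hash digests) of known
# illegal images, as would be supplied by a clearinghouse. Placeholder value only.
KNOWN_SIGNATURES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def image_signature(path: Path) -> str:
    """Compute a signature for an image file.

    A SHA-256 digest is used here purely for illustration; production systems
    use perceptual hashes that survive resizing and re-encoding, which a
    cryptographic hash does not.
    """
    return hashlib.sha256(path.read_bytes()).hexdigest()

def flag_uploads(upload_dir: Path) -> list[Path]:
    """Return uploaded files whose signatures match the known-bad database."""
    flagged = []
    for file in upload_dir.glob("*"):
        if file.is_file() and image_signature(file) in KNOWN_SIGNATURES:
            flagged.append(file)
    return flagged

if __name__ == "__main__":
    # Scan a hypothetical local folder standing in for an iCloud upload queue.
    for match in flag_uploads(Path("uploads")):
        print(f"Flagged for review: {match}")
```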
3 thoughts on “Apple Is Now Scanning Your Photos To Check For Child Abuse”
“This marks something of a first for Apple, which has frequently clashed with authorities by refusing to break into suspects’ phones to aid investigations.”
The presumption of guilt is disturbing here.
This is the USA, not a third-world dirt hole.
The Goob hath ruled on this issue.
How do we know the perverts heading Apple aren’t trying to add to their collection?
I wonder if Apple would mind if I take a scan thru all their private files. Just for security, of course.