Apple's Algorithms Scan Your Photos in iCloud
At CES 2020 in Las Vegas, Jane Horvath, Apple's chief privacy officer, confirmed that the company's algorithms scan users' photos stored in iCloud. It was Apple's first official appearance at CES in 27 years.
According to Horvath, the scanning is used to detect images that show signs of child abuse or sexual exploitation. The practice is permitted under Apple's user agreement, which the company last updated in 2019 and which states that Apple may use a user's personal information to ensure the security of the account.
However, the Apple representative did not say exactly how the company checks images: it may use PhotoDNA, the image-matching system also used by Facebook, Twitter, and Google, or some other technology.
Horvath said: "Apple is dedicated to protecting children throughout our ecosystem wherever our products are used, and we continue to support innovation in this space. We have developed robust protections at all levels of our software platform and throughout our supply chain. As part of this commitment, Apple uses image-matching technology to help find and report child exploitation. Much like spam filters in email, our systems use electronic signatures to find suspected child exploitation. We validate each match with individual review. Accounts with child exploitation content violate our terms and conditions of service, and any accounts we find with this material will be disabled."
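The "electronic signatures" Horvath describes point to signature-based matching: each image is reduced to a compact fingerprint (a perceptual hash) that survives resizing and recompression, and fingerprints of uploaded photos are compared against a database of known illegal material. Apple has not disclosed its method and PhotoDNA itself is proprietary, so the sketch below only illustrates the general technique with a simple "average hash"; the file name, threshold, and known-hash value are all hypothetical.

```python
# A minimal sketch of perceptual-hash image matching, illustrating the
# general idea behind systems like PhotoDNA. This is NOT Apple's or
# Microsoft's implementation. Requires Pillow (pip install Pillow).
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Reduce an image to a 64-bit fingerprint: convert to grayscale,
    shrink to size x size, then set one bit per pixel depending on
    whether it is brighter than the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Count the bits that differ between two fingerprints."""
    return bin(a ^ b).count("1")

# Hypothetical usage: compare an uploaded photo against one known hash.
KNOWN_HASH = 0x0F0F0F0F0F0F0F0F  # placeholder value for illustration
uploaded = average_hash("uploaded_photo.jpg")
if hamming_distance(uploaded, KNOWN_HASH) <= 5:  # threshold is illustrative
    print("Possible match; flag for human review")
```

Because the fingerprint depends on coarse brightness patterns rather than exact bytes, two visually identical images hash to nearly the same value even after recompression, which is why a small Hamming distance, rather than exact equality, signals a match. Production systems use far more robust fingerprints and compare against large hash databases rather than a single value.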
Horvath also did not specify whether flagged material indicating child abuse or sexual violence is reported to law enforcement agencies.