This is a continuation of my previous blog covering R. 4(2) of the new IT Guidelines, which also touched on Apple's dilemma over creating a backdoor to unlock an iPhone in the San Bernardino case in the name of National Security. The current blog relates to the scanning of users' data to identify CSAM (Child Sexual Abuse Material) content, but at the client's side. Would this feature negate Apple's long-standing repute of adhering to user privacy, come what may? I'll elucidate this point by point.
The new iPadOS 15 / iOS 15 / watchOS 8 / macOS will use applications of cryptography to scan images for sexually explicit content, aiming to reduce or limit the spread of CSAM online while adhering to users' privacy. If a match is found, it would be reported to the concerned authority; Apple contends there is only a 1-in-a-trillion chance per year of an account being incorrectly flagged, and one can appeal to Apple if wrongly flagged.
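To get an intuition for where a figure like "1 in a trillion per year" can come from, here is a toy back-of-the-envelope sketch in Python. Every number below is an illustrative assumption of mine, not one of Apple's published parameters; the point is only that requiring several independent matches before an account is flagged drives the per-account error rate far below the per-image error rate.

```python
from math import comb

def p_account_flagged(p_fp: float, n_images: int, threshold: int) -> float:
    """Probability that at least `threshold` of `n_images` innocent images
    each independently trip a false match with probability `p_fp`
    (a plain binomial tail)."""
    return sum(
        comb(n_images, k) * p_fp ** k * (1 - p_fp) ** (n_images - k)
        for k in range(threshold, n_images + 1)
    )

# Toy, hypothetical numbers -- NOT Apple's:
p_fp = 1e-3   # assumed per-image false-match rate
n = 1_000     # assumed images uploaded per account per year

print(p_account_flagged(p_fp, n, threshold=1))   # ~0.63: flagging on a single match
print(p_account_flagged(p_fp, n, threshold=10))  # ~1e-7: requiring ten matches
```

With these toy numbers, flagging on a single match would wrongly flag about 63% of accounts in a year, while requiring ten matches drives that down to roughly one in ten million; shrink the per-image rate and raise the threshold, and you reach figures of the order Apple claims.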
Now, Apple contends that instead of scanning images in the cloud itself, the system performs on-device matching against a DB of known CSAM image hashes. Before an image is stored on iCloud, this on-device matching is performed using a cryptographic technique called private set intersection, which creates a safety voucher, along with additional encrypted data about the image, that is uploaded to iCloud. Further, using a technique called threshold secret sharing, the contents of the vouchers cannot be interpreted by Apple until an account crosses a threshold of matches; once it does, Apple manually reviews each report to confirm a match, disables the account, and sends a report to the authority. The user can appeal to have their account reinstated if they think it has been wrongly flagged.
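To make the threshold-secret-sharing step concrete, here is a drastically simplified sketch in Python. This is not Apple's actual protocol; there is no NeuralHash, no private set intersection, and no blinded database here, only the core idea: a key that would decrypt the voucher payloads is split into Shamir shares, one share arriving with each matching voucher, and the key is unrecoverable until the threshold number of shares is in hand.

```python
import secrets

# Toy Shamir threshold secret sharing over GF(P); illustrative only.
P = 2**127 - 1  # a Mersenne prime; all share arithmetic is modulo P

def make_shares(secret: int, threshold: int, count: int):
    """Embed `secret` as the constant term of a random polynomial of
    degree (threshold - 1) and hand out one point of it per share."""
    coeffs = [secret] + [secrets.randbelow(P) for _ in range(threshold - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, count + 1)]

def recover(shares):
    """Lagrange-interpolate the polynomial at x = 0 to read the secret back."""
    acc = 0
    for xi, yi in shares:
        num = den = 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        acc = (acc + yi * num * pow(den, -1, P)) % P
    return acc

# Pretend this key decrypts per-image voucher payloads; the server gets
# one share per matching upload, so nothing is readable below threshold.
key = secrets.randbelow(P)
shares = make_shares(key, threshold=3, count=5)
print(recover(shares[:3]) == key)  # True: threshold met, manual review possible
print(recover(shares[:2]) == key)  # False (almost surely): key stays hidden
```

The design point this illustrates is why the threshold protects users below it: with fewer than the threshold number of shares, every candidate key is equally consistent with what the server holds, so Apple learns nothing about those accounts.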
Is it a breach of Privacy, the very Privacy on which the foundation of Apple is built?
Now, if we see it from the perspective of S. 67B (a) to (e) of the IT Act, which prohibits publishing or transmitting any material depicting Children in a Sexually Explicit Act in Electronic Form, BUT excludes books, paintings, drawings, images, etc. that are:
§ In the Interest of Science, Literature, Art or Learning, or Other Objects of General Concern;
§ Which is kept or used for bona fide Heritage or Religious Purposes.
And further, S. 67C clearly defines the Preservation and Retention of Information by Intermediaries for a certain duration.
The question is: Is Apple truly in breach of Privacy?
The answer could differ, as there are several interpretations circulating. Also, in Apple's Privacy Policy, it is mentioned under Apple's Use of Personal Data:
Security and Fraud Prevention:
To protect individuals, employees, and Apple and for loss prevention and to prevent fraud, including to protect individuals, employees, and Apple for the benefit of all our users, and prescreening or scanning uploaded content for potentially illegal content, including child sexual exploitation material.
Comply with Law:
To comply with applicable law — for example, to satisfy tax or reporting obligations, or to comply with a lawful governmental request.
And further, in their iCloud Agreement, it is written:
….Apple reserves the right at all times to determine whether content is appropriate & in compliance with this agreement, and may screen, move, refuse, modify and/or remove Content at any time….
So, Apple, as an intermediary, might pre-scan the images (on-device, though that is debatable) while adhering to its stringent User Privacy Policy, first giving the flagged Individual a chance for explanation rather than complaining first to the Authority, since the Individual may be the victim of a cyber attack or may truly be complicit. 😇
© Pranav Chaturvedi