A single comment by @JonD and its replies.
  • iOS 15 update: scans photos on phones for 'inappropriate' images.
    Obviously child abuse images are universally inappropriate, but there is (among other things) concern about whether governments could force Apple to turn similar technology against anything the regime of the day deems inappropriate.

    https://www.independent.co.uk/life-style/gadgets-and-tech/apple-iphone-photo-scanning-csam-b1903328.html

    Earlier this month, Apple announced that it would be adding three new features to iOS, all of which are intended to fight against child sexual exploitation and the distribution of abuse imagery. One adds new information to Siri and search, another checks messages sent to children to see if they might contain inappropriate images, and the third compares photos on an iPhone with a database of known child sexual abuse material (CSAM) and alerts Apple if it is found.

    It is the last of those three features that has proven especially controversial. Critics say that the feature contravenes Apple's commitment to privacy, and that it could in the future be used to scan for other kinds of images, such as political pictures on the phones of people living in authoritarian regimes.
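The matching described in the article can be sketched in miniature. Apple's actual system reportedly uses a perceptual hash ("NeuralHash") combined with cryptographic techniques such as private set intersection and threshold reporting; the sketch below is a deliberately simplified assumption that substitutes exact SHA-256 digests and plain set membership, and every name in it is illustrative rather than taken from Apple's implementation.

```python
import hashlib

def image_digest(image_bytes: bytes) -> str:
    # Exact-match digest of the raw bytes. A real perceptual hash would
    # tolerate re-encoding and resizing; SHA-256 matches identical bytes only.
    return hashlib.sha256(image_bytes).hexdigest()

def scan_library(photos: dict[str, bytes], known_digests: set[str]) -> list[str]:
    # Return the names of photos whose digest appears in the known database.
    return [name for name, data in photos.items()
            if image_digest(data) in known_digests]

# Hypothetical usage: flag any photo matching the known-bad digest set.
known = {image_digest(b"flagged-example")}
photos = {"holiday.jpg": b"harmless", "other.jpg": b"flagged-example"}
print(scan_library(photos, known))  # -> ['other.jpg']
```

The controversy the critics raise maps directly onto `known_digests`: nothing in the matching step itself constrains what goes into that database, which is why the same mechanism could in principle be pointed at any category of image.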
