• iOS 15 update. Scans photos on phones for 'inappropriate' images.
    Obviously child abuse images are universally inappropriate, but (among other things) there is concern about whether governments could force Apple to enable similar tech for whatever the regime of the day deems inappropriate.

    https://www.independent.co.uk/life-style/gadgets-and-tech/apple-iphone-photo-scanning-csam-b1903328.html

    Earlier this month, Apple announced that it would be adding three new features to iOS, all of which are intended to fight against child sexual exploitation and the distribution of abuse imagery. One adds new information to Siri and search, another checks messages sent to children to see if they might contain inappropriate images, and the third compares photos on an iPhone with a database of known child sexual abuse material (CSAM) and alerts Apple if it is found.

    It is the latter of those three features that has proven especially controversial. Critics say that the feature is in contravention of Apple's commitment to privacy, and that it could in the future be used to scan for other kinds of images, such as political pictures on the phones of people living in authoritarian regimes.
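
    Apple's published design doesn't inspect image content directly: it derives a perceptual "NeuralHash" fingerprint from each photo and checks it against a database of fingerprints of known CSAM, with private set intersection and a match threshold layered on top so Apple learns nothing until enough matches accumulate. As a rough illustration of just the matching step (not Apple's actual code), here is a minimal Swift sketch; the fingerprint database is a hypothetical placeholder, and SHA-256 stands in for the perceptual hash, which the real system needs so that resized or re-encoded copies still match.

    ```swift
    import Foundation
    import CryptoKit

    // Hypothetical known-fingerprint database. In Apple's design this
    // ships on-device as a blinded table derived from NCMEC's hash
    // list; an empty placeholder keeps the sketch self-contained.
    let knownFingerprints: Set<String> = []

    // Stand-in fingerprint. Apple's system uses NeuralHash, a
    // perceptual hash that matches visually similar images; SHA-256
    // is substituted here only so the example compiles and runs.
    func fingerprint(of imageData: Data) -> String {
        SHA256.hash(data: imageData)
            .map { String(format: "%02x", $0) }
            .joined()
    }

    // On-device check: a photo is flagged only if its fingerprint is
    // already in the database. The real protocol additionally requires
    // a threshold number of matches before Apple can decrypt anything.
    func matchesKnownDatabase(_ imageData: Data) -> Bool {
        knownFingerprints.contains(fingerprint(of: imageData))
    }
    ```

    The controversy follows directly from this shape: the matching code is content-agnostic, so whoever controls the fingerprint database controls what gets flagged.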

  • There's a really interesting Sam Harris podcast on this subject:
    https://samharris.org/podcasts/213-worst-epidemic/
    It's been a while since I listened, but it tells the story of how Facebook was the largest platform for child pornography going, but only because it was unencrypted, so the number of reports and prosecutions could actually be counted. When Facebook encrypted, those numbers dropped to zero, and the reality is that every platform that employs encryption gets to abdicate responsibility for hosting such material because 'they don't know it exists'. The numbers from Facebook were absolutely staggering though, so, painfully, they all know it does exist; and the longer a platform has been around and proven to be well encrypted, the higher those figures get.
