  • iOS 15 update. Scans photos on phones for 'inappropriate' images.
    Obviously child abuse images are universally inappropriate, but there is concern (among other things) about whether governments could force Apple to enable similar tech for things the regime of the day deems inappropriate

    https://www.independent.co.uk/life-style/gadgets-and-tech/apple-iphone-photo-scanning-csam-b1903328.html

    Earlier this month, Apple announced that it would be adding three new features to iOS, all of which are intended to fight against child sexual exploitation and the distribution of abuse imagery. One adds new information to Siri and search, another checks messages sent to children to see if they might contain inappropriate images, and the third compares photos on an iPhone with a database of known child sexual abuse material (CSAM) and alerts Apple if it is found.

    It is the latter of those three features that has proven especially controversial. Critics say that the feature is in contravention of Apple's commitment to privacy, and that it could in the future be used to scan for other kinds of images, such as political pictures on the phones of people living in authoritarian regimes.
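
    To make the third feature concrete: the core idea is matching image fingerprints against a database of known-bad hashes, on device. The sketch below is a deliberately simplified stand-in — Apple's actual system uses a perceptual hash ("NeuralHash") plus cryptographic protocols such as private set intersection and threshold secret sharing, none of which are shown here; `KNOWN_BAD_HASHES`, `fingerprint`, and `scan_on_device` are hypothetical names, and an exact SHA-256 match stands in for perceptual matching.

    ```python
    import hashlib

    # Stand-in for the database of hashes of known abuse imagery.
    # (This sample value is just the SHA-256 of the empty byte string.)
    KNOWN_BAD_HASHES = {
        "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
    }

    def fingerprint(image_bytes: bytes) -> str:
        """Exact-match stand-in for a perceptual hash of an image."""
        return hashlib.sha256(image_bytes).hexdigest()

    def scan_on_device(photos: list[bytes]) -> list[int]:
        """Return indices of photos whose fingerprint is in the database.

        In the scheme as described, nothing would leave the device
        unless this list of matches is non-empty.
        """
        return [i for i, p in enumerate(photos)
                if fingerprint(p) in KNOWN_BAD_HASHES]

    photos = [b"", b"holiday-photo bytes"]
    print(scan_on_device(photos))  # [0] — only the first entry matches
    ```

    The point of the on-device design is visible even in this toy version: the match check runs locally against a pre-distributed hash set, so non-matching photos never need to be reported anywhere.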

  • Tricky one for a number of reasons.

    Individual integrity vs the potential of saving children from trafficking, abuse, etc. Personally, I'm leaning towards it being a good thing, and that we need to accept some carefully-thought-through privacy invasion for the benefit of less protected individuals in society. If an AI bot needs to look through my iCloud photos to save kids, I'm fine with that as long as it works the way Apple has outlined (all scanning happens on device, with zero communication with servers unless potentially shady images are found).

    I don't think I'd be OK with any company other than Apple doing this, though. Apple has proved itself willing to protect end users several times when governments have asked it to open phones.

    Of course there might be a huge gray area, and also cases where Apple has cooperated that we don't know about, but it seems like a wasted opportunity not to help vulnerable kids when we have the technology to do so.
