Tricky one for a number of reasons.
Individual integrity vs the potential of saving children from trafficking, abuse, etc. Personally, I'm leaning towards it being a good thing, and that we need to accept some carefully thought-through privacy invasion for the benefit of less protected individuals in society. If an AI bot needs to look through my iCloud photos to save kids, I'm fine with that as long as it happens the way Apple has outlined their system (all scanning happens on device, zero comms with servers unless potentially shady images are found).
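The on-device flow described above can be sketched roughly like this. This is purely illustrative: the function names and hash database are hypothetical, and Apple's actual system uses a perceptual "NeuralHash" plus private set intersection and a match threshold, not a plain cryptographic hash lookup.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for Apple's perceptual NeuralHash; a real perceptual
    # hash matches visually similar images, not just identical bytes.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of known-CSAM hashes shipped to the device.
KNOWN_BAD_HASHES = {image_hash(b"known-bad-image")}

def scan_on_device(photos: list[bytes]) -> list[int]:
    """Return indices of matching photos. The key point of the design:
    nothing is sent to any server unless this list is non-empty."""
    return [i for i, p in enumerate(photos)
            if image_hash(p) in KNOWN_BAD_HASHES]

matches = scan_on_device([b"holiday-photo", b"known-bad-image"])
```

The privacy argument hinges on that last property: the comparison happens entirely on the device, so non-matching photos are never exposed.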
I don't think I'd be ok with any company other than Apple doing this, though. Apple has proved itself willing to protect end users several times when governments have asked them to open phones.
Of course there might be a huge gray area, and also cases where Apple has cooperated that we don't know about, but it seems like a wasted opportunity not to help vulnerable kids when we have the technology to do so.
iOS 15 update. Scans photos on phones for 'inappropriate' images.
Obviously child abuse images are universally inappropriate, but (among other things) there's concern about whether governments could force Apple to apply similar tech to things the regime of the day deems inappropriate.
https://www.independent.co.uk/life-style/gadgets-and-tech/apple-iphone-photo-scanning-csam-b1903328.html