Saw that story. Entirely unsurprising - machine learning systems are good at finding loopholes in the rules humans set for them. Like the system that was supposed to detect early signs of tuberculosis in chest X-rays: it initially seemed successful, but it turned out it had just learned to spot when somebody had used an X-ray machine sited near a respiratory medicine ward. Or the systems that were supposed to learn to win at computer games but instead learned to crash them so they couldn't lose; the list goes on.
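For anyone curious about the mechanism, here's a minimal sketch of that failure mode (synthetic data, scikit-learn; the "scanner flag" feature is invented for illustration). A spurious feature that merely correlates with the label in training - like which machine took the X-ray - gets weighted far more heavily than a weak genuine signal, so the model looks great in validation and falls apart once the correlation breaks:

```python
# Minimal sketch: a classifier "succeeds" by latching onto a spurious
# feature (a scanner-site flag) instead of the weak real signal.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000

# Training data: the ward-scanner flag tracks the label 95% of the time,
# while the genuine feature is buried in noise.
y_train = rng.integers(0, 2, n)
scanner_flag = np.where(rng.random(n) < 0.95, y_train, 1 - y_train)
real_signal = y_train + rng.normal(0, 2.0, n)
X_train = np.column_stack([real_signal, scanner_flag])

model = LogisticRegression().fit(X_train, y_train)
print("accuracy with the shortcut intact:", model.score(X_train, y_train))

# Deployment: scanner assignment no longer correlates with disease,
# so the model is left leaning on a feature that now means nothing.
y_test = rng.integers(0, 2, n)
X_test = np.column_stack([y_test + rng.normal(0, 2.0, n),
                          rng.integers(0, 2, n)])
print("accuracy once the shortcut breaks:", model.score(X_test, y_test))
```

The first score comes out around 95%, the second barely above chance - nothing in the training metrics warns you, which is exactly why these systems seem successful right up until they don't.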
People in positions of power like AI because it offers them a way of avoiding accountability. That's a particularly lethal attitude when those systems will be deciding who to kill.