"We coded the car to expect cyclists to get the fuck out of the way...", etc.
To simplify things (again by a large margin) and turn this into a kind of thought experiment: if 9 out of 10 cyclists one encounters get out of the way, one can expect cyclists to get out of the way 90% of the time -- most of the time, but not all of the time. For the 10% that don't get out of the way we also have statistical experience of how many end up in crashes, and so on. Throw in a cost function and we can compare the expected cost of each action; if we only have data about crashes (labels), the network "learns" a set of actions from that. If the chance of crashing with another car (when swerving to avoid the cyclist) is significantly higher than the chance of hitting the cyclist (when not avoiding), and the cost of hitting another car is higher -- perhaps causing multiple crashes with some probability -- then the cheaper action might be to "ignore" the cyclist and risk that crash. A rough sketch of this calculation follows below.

In this example, the more cyclists get out of the way, the more likely a car is to "ignore" them and risk a crash. If this were a collective strategic game, the best strategy for cyclists would be to collectively not get out of the way, since that would force cars to avoid them -- it raises the cost of ignoring cyclists. Unfortunately the cost to an individual cyclist of being hit is very high, so at the individual level it is the worst strategy. What this ultimately means is that cyclists will have to keep increasing their avoidance, and cars will increasingly ignore them in traffic.
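To make the expected-cost argument concrete, here is a minimal sketch of the comparison between "ignore" and "avoid". All the numbers (yield probability, crash probabilities, crash costs) are made-up illustrative assumptions, not anything learned from real data; the point is only how the decision flips with the cyclists' yield rate.

```python
def expected_cost(p_cyclist_yields, p_crash_if_ignored, cost_hit_cyclist,
                  p_crash_if_avoided, cost_hit_car):
    # "ignore": a crash with the cyclist only happens if they fail to yield
    ignore = (1 - p_cyclist_yields) * p_crash_if_ignored * cost_hit_cyclist
    # "avoid": swerve into the other lane and risk a (possibly multi-car) crash
    avoid = p_crash_if_avoided * cost_hit_car
    return {"ignore": ignore, "avoid": avoid}

# 90% of cyclists get out of the way; hitting another car is assumed costlier
costs = expected_cost(p_cyclist_yields=0.9,
                      p_crash_if_ignored=0.8, cost_hit_cyclist=20.0,
                      p_crash_if_avoided=0.2, cost_hit_car=30.0)
print(costs)                       # {'ignore': 1.6, 'avoid': 6.0}
print(min(costs, key=costs.get))   # 'ignore' -> the car risks the cyclist crash

# If cyclists collectively stop yielding, ignoring them becomes the expensive option
costs = expected_cost(p_cyclist_yields=0.0,
                      p_crash_if_ignored=0.8, cost_hit_cyclist=20.0,
                      p_crash_if_avoided=0.2, cost_hit_car=30.0)
print(min(costs, key=costs.get))   # 'avoid' -> the car is forced to give way
```

With a high yield rate the expected cost of ignoring the cyclist stays low, so "ignore" wins; only when yielding collapses does "avoid" become the cheaper action, which is the collective-strategy point above.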
An "interesting" google car crash. They were edging out into another lane but expected a bus to give way and it didn't http://www.theguardian.com/technology/2016/feb/29/google-self-driving-car-accident-california The solution seems to be that they will add code to suggest that larger vehicles won't necessarily give way