"The solution seems to be that they will add code to suggest that larger vehicles won't necessarily give way"
You don't "add code". Procedural code is, in general, a poor way to address these issues. These cars are driven by, to (over)simplify things, multistage probability matrices that encode "learned features" and "learned responses". Instead of code like "if you see a big object then ...", one now has new data on possible scenarios. That data can be used to synthesize a class of new ground-truth examples to train the system; the "big vehicle" situation enters only as examples. Inside the network, some set of characteristics might end up representing those "larger vehicle" cases, or it might be something else entirely that, within the corpus of data, leads to the "right decisions". In fact, imagine that with enough data one can see a trend showing, for example, that the chance a public bus will cut off a driver during rush hour is.... That is why the race is to collect data.
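A minimal sketch of that idea in Python (the feature names, the scenario generator, and the labeling model are all invented for illustration): instead of a hand-written "if big vehicle then yield" rule, you synthesize labeled scenarios that include large-vehicle cases and let a small classifier learn how often the other vehicle gives way.

```python
# Hypothetical sketch: no explicit "big vehicle" rule anywhere; the behaviour
# falls out of the (synthetic) data the classifier is trained on.
import numpy as np

rng = np.random.default_rng(0)

def synthesize_scenarios(n):
    """Generate made-up merge scenarios as (features, yielded?) pairs."""
    vehicle_length_m = rng.uniform(3.5, 18.0, n)      # car ... articulated bus
    closing_speed_ms = rng.uniform(0.0, 10.0, n)
    gap_m = rng.uniform(1.0, 30.0, n)
    rush_hour = rng.integers(0, 2, n).astype(float)
    # Assumed ground-truth model used only to label the synthetic data:
    # larger vehicles and rush-hour traffic yield less often.
    p_yield = 1 / (1 + np.exp(-(2.0 + 0.1 * gap_m
                                - 0.25 * vehicle_length_m
                                - 0.2 * closing_speed_ms
                                - 1.0 * rush_hour)))
    labels = (rng.uniform(size=n) < p_yield).astype(float)
    features = np.column_stack([vehicle_length_m, closing_speed_ms, gap_m, rush_hour])
    return features, labels

X, y = synthesize_scenarios(5000)
X = (X - X.mean(axis=0)) / X.std(axis=0)              # normalize features

# Train a logistic-regression "yield predictor" by gradient descent.
w, b = np.zeros(X.shape[1]), 0.0
for _ in range(2000):
    p = 1 / (1 + np.exp(-(X @ w + b)))
    w -= 0.5 * X.T @ (p - y) / len(y)
    b -= 0.5 * np.mean(p - y)

# The weight on vehicle length simply ends up negative because the data says
# large vehicles yield less often; nobody coded that in.
print(dict(zip(["vehicle_length", "closing_speed", "gap", "rush_hour"], w.round(2))))
```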
Normally (I don't know of any end-to-end network for this application) one uses multiple networks of different designs, for example: CNNs for image recognition, some variant of an RNN/LSTM to learn action detection, and an RNN for control. The future of control might be reinforcement learning (see the DeepMind paper in last year's Nature), but right now... well, controlling cars is indeed simple enough...
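For illustration only, a rough PyTorch sketch of such a multi-network layout; the module names, sizes, and output heads are assumptions, not anyone's actual stack. A small CNN extracts per-frame features, an LSTM aggregates them over time for action detection, and a final head produces control outputs.

```python
import torch
import torch.nn as nn

class PerceptionCNN(nn.Module):
    """Per-frame feature extractor (stands in for a real image-recognition CNN)."""
    def __init__(self, feat_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, feat_dim), nn.ReLU(),
        )
    def forward(self, frames):                 # (batch, 3, H, W)
        return self.net(frames)                # (batch, feat_dim)

class DrivingPolicy(nn.Module):
    """CNN features -> LSTM over time -> action logits and control outputs."""
    def __init__(self, feat_dim=128, hidden=256, n_actions=4):
        super().__init__()
        self.cnn = PerceptionCNN(feat_dim)
        self.lstm = nn.LSTM(feat_dim, hidden, batch_first=True)
        self.action_head = nn.Linear(hidden, n_actions)   # e.g. cut-in / yield / ...
        self.control_head = nn.Linear(hidden, 2)          # steering, throttle
    def forward(self, clips):                  # (batch, time, 3, H, W)
        b, t = clips.shape[:2]
        feats = self.cnn(clips.flatten(0, 1)).view(b, t, -1)
        states, _ = self.lstm(feats)
        return self.action_head(states), self.control_head(states[:, -1])

# Smoke test with random "video" input.
policy = DrivingPolicy()
clips = torch.randn(2, 8, 3, 64, 64)           # 2 clips of 8 frames each
action_logits, controls = policy(clips)
print(action_logits.shape, controls.shape)     # (2, 8, 4) and (2, 2)
```

One reason to keep the stages separate is that the perception CNN can be trained or pretrained on its own labeled image data, independently of the driving data used for the temporal and control parts.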
Sometimes what is learned is different from what one expects. The classical example of this is an old story about one of the first networks the military used to spot tanks. The network was trained with all kinds of pictures. Out in the field it failed, so it was back to the drawing board. It turned out that all the photos containing tanks had been taken on cloudy days and all the photos without tanks on sunny days. The network did not learn to recognize tanks; it learned to tell cloudy days from sunny ones.
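A toy demonstration of that failure mode, with synthetic 8x8 "photos" and a deliberately naive classifier (everything here is invented): during training the tank label is perfectly confounded with sky brightness, so the model learns brightness and collapses when the confound is reversed.

```python
import numpy as np

rng = np.random.default_rng(1)

def make_images(n, tank_label, cloudy):
    """8x8 grayscale images: optional bright 'tank' patch on a cloudy or sunny sky."""
    base = 0.2 if cloudy else 0.8                     # cloudy = dark background
    imgs = base + 0.05 * rng.standard_normal((n, 8, 8))
    if tank_label:
        imgs[:, 3:5, 3:5] += 0.3                      # faint "tank" patch
    return imgs.reshape(n, -1), np.full(n, float(tank_label))

# Training set with the confound: every tank photo is cloudy, every non-tank sunny.
Xa, ya = make_images(500, tank_label=1, cloudy=True)
Xb, yb = make_images(500, tank_label=0, cloudy=False)
X, y = np.vstack([Xa, Xb]), np.concatenate([ya, yb])

# Logistic regression on raw pixels, trained by gradient descent.
w, b = np.zeros(X.shape[1]), 0.0
for _ in range(500):
    p = 1 / (1 + np.exp(-(X @ w + b)))
    w -= 0.1 * X.T @ (p - y) / len(y)
    b -= 0.1 * np.mean(p - y)

def accuracy(Xs, ys):
    return np.mean(((Xs @ w + b) > 0) == ys.astype(bool))

print("train accuracy:", accuracy(X, y))          # looks great in-distribution...
# ...but flip the confound (tanks on sunny days) and it falls apart, because
# the model learned "dark sky", not "tank".
Xc, yc = make_images(500, tank_label=1, cloudy=False)
Xd, yd = make_images(500, tank_label=0, cloudy=True)
print("flipped-confound accuracy:", accuracy(np.vstack([Xc, Xd]),
                                             np.concatenate([yc, yd])))
```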
These days the networks have many more parameters and the data volumes are significantly larger. Still... One of the networks used by a major player, for example, was trained to recognize school buses and seemed to learn only the black and yellow stripes.
An "interesting" google car crash. They were edging out into another lane but expected a bus to give way and it didn't http://www.theguardian.com/technology/2016/feb/29/google-self-driving-car-accident-california The solution seems to be that they will add code to suggest that larger vehicles won't necessarily give way