I'm actually more impressed that they are trying to code "believed the bus would slow".
Understanding the expected behavior of other drivers is critical to making self driving cars work. And that seems like a pretty hard thing for a computer to figure out.
> Understanding the expected behavior of other drivers is critical to making self driving cars work. And that seems like a pretty hard thing for a computer to figure out.
You're definitely right.
With regard to this particular instance, based on my experiences in many cities in different countries, buses pulling out into traffic when they don't technically have the right of way is an expected behavior. :)
Unless I'm reading the report wrong, it appears that the bus did have the right of way. Indicating intent and hoping will get you in trouble with more than just buses.
> Unless I'm reading the report wrong, it appears that the bus did have the right of way.
Yeah, I was more trying to make an observation about the general behavior of buses. When I read the article there was no mention of fault or reference to a report, so I didn't know. But I do know that buses disregarding right of way is fairly common. "With regard to this instance" was probably too strong/literal a phrase to use to tie in my observation.
Then we'll have the problem of us humans trying to predict the behaviour of the machine, which, unintuitively, will be harder than predicting that of other humans. Of course the system will be much safer once no humans are involved in the decisions at all.
I think I have achieved 95% accuracy in predicting driver behavior as much as 6-10 cars ahead of me. I say 95% conservatively, though I can't recall ever having made the wrong guess.