07-01-2016, 11:42 PM
I'm actually surprised a self-driving car got something this simple so wrong.
This is why I asked about the kind of sensors used.
If optical, I'm not so surprised. If radar, I am surprised.
Maybe the Tesla brain KNEW about the truck and decided there was nothing that could be done, and that any other course of action would have resulted in more deaths.
Without knowing all, or at least more, of the AI details (such as the speeds involved), and speaking as a human driver not relying on Autopilot, I don't see how applying the brakes, even at the last second, would be worse than not applying them at all. I understand not turning or swerving, assuming the electronics even had the ability to weigh that option. But not braking at all? No way.
I think it more likely a failure of sensors, a failure of sensors to handle a particular situation, or a failure of programming to handle a particular situation.
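To show why "not braking at all" is so hard to accept if the sensors actually reported the obstacle, here's a minimal sketch of the kind of last-resort braking check an automatic emergency braking system might run. This is purely hypothetical on my part, not Tesla's actual logic; the function name, the 0.8 g deceleration, and the safety margin are all my own assumptions.

# Hypothetical last-resort braking check; NOT Tesla's actual logic.
def should_emergency_brake(range_m, closing_speed_mps,
                           max_decel_mps2=8.0, margin=1.2):
    """Return True once braking can no longer be postponed.

    range_m           -- sensor-reported distance to the obstacle (m)
    closing_speed_mps -- rate at which the gap is shrinking (m/s)
    max_decel_mps2    -- assumed full-braking deceleration (~0.8 g)
    margin            -- safety factor applied to the stopping distance
    """
    if closing_speed_mps <= 0:
        return False  # not closing on anything
    # Distance needed to brake from the current closing speed to zero.
    stopping_distance_m = closing_speed_mps ** 2 / (2.0 * max_decel_mps2)
    # Command full braking once the gap shrinks to the stopping distance
    # (plus margin); even a "too late" application still sheds speed.
    return range_m <= stopping_distance_m * margin

# Example: 40 m away, closing at 29 m/s (~65 mph) -> stopping needs ~53 m,
# so full braking should already be commanded.
print(should_emergency_brake(40.0, 29.0))

The point of the sketch is just that once a detected obstacle is inside the stopping distance, braking always reduces the impact speed, so the "it decided not to brake" theory only makes sense if the trailer was never detected in the first place.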
In CA, and in most states, I'm guessing, the Tesla would absolutely have the right of way, with the truck required to yield to it. However, if the truck makes a turning movement, the Tesla would be required to yield its right of way if not doing so would cause a collision.
Many states' authorities list the suspect/at-fault vehicle as Driver 1, V1, V01, etc. Not all states are the same, but the FHP's investigation might be declaring the rig at fault. NHTSA could change all that, or confirm it. Maybe.
Seems like a stretch.
I don't think so. Be that as it may, it's a standard investigative yardstick to help establish responsibility and possible cause.
Not only that, but since the "driver" had it in autopilot, he probably wasn't paying as much attention as a driver normally should.
Which to me seems to support the "not noticing the truck" supposition, at least until it was too late. Autopilot or no, I'm sure the driver would have tried to brake, if he had noticed the truck in time.
I wonder what a human driver could have done.
Don't we all. First, as mentioned, without the option of Autopilot, the driver presumably would have paid more attention. Depending on his distance to the truck, he should have seen the tractor before it started the turn. He should have seen it moving before the trailer was 'lost' in the sun. Again, this is supposition.
I think it possible, and maybe likely, that the truck driver saw the Tesla and assumed he had enough lead to start the turn while still giving the car enough time to stop safely, even though he knew that if it didn't stop, there would be a collision. But who doesn't stop?
When starting his turn, did the driver allow enough time for the Tesla to stop safely? Did the driver see it before turning? Was there enough time for the Tesla driver to stop safely? Autopilot or no, I think at this point in legislation and technology, the driver still has the ultimate responsibility. I don't for a second believe the driver has a reasonable expectation to absolutely trust the autopilot without oversight.
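For what it's worth, here's the back-of-the-envelope math behind "enough time to stop safely." The 65 mph speed, 1.5-second reaction time, and 0.7 g braking are my own round-number assumptions for illustration, not figures from the investigation.

# Rough stopping-distance estimate; all inputs are assumed, not from the report.
MPH_TO_MPS = 0.44704
G = 9.81  # m/s^2

speed_mps = 65 * MPH_TO_MPS      # ~29 m/s
reaction_time_s = 1.5            # perception + reaction for an attentive driver
decel_mps2 = 0.7 * G             # solid braking on dry pavement

reaction_distance_m = speed_mps * reaction_time_s          # ~44 m
braking_distance_m = speed_mps ** 2 / (2 * decel_mps2)     # ~61 m
total_m = reaction_distance_m + braking_distance_m         # ~105 m (~345 ft)

print(f"reaction {reaction_distance_m:.0f} m + braking {braking_distance_m:.0f} m "
      f"= {total_m:.0f} m ({total_m * 3.281:.0f} ft)")

Under those assumptions the car needs on the order of 100 meters to come to a stop, so the truck driver would have had to start his turn with the Tesla at least a football field away for a "safe stop" to be a reasonable expectation.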