NTSB lays partial blame on Apple for fatal Tesla crash involving employee
#1

The U.S. National Transportation Safety Board in a hearing on Tuesday knocked Apple for failing to enforce policies restricting employees from using smartphones while driving, a lapse that may have cost engineer Walter Huang his life.

In 2018, Huang was involved in a fatal crash in Mountain View, Calif., when the Autopilot system on his Tesla Model X failed to recognize an obstacle, plowing the car headlong into a highway barrier at 71 miles per hour. Two cars subsequently hit Huang's vehicle and the Tesla's high-voltage battery was breached, resulting in a fire. Huang succumbed to his injuries after being transported to a nearby hospital.


At the hearing today, NTSB classified Huang, who was playing a game on a company-issued mobile device at the time of the crash, as a distracted driver, reports CNBC. Both Apple and Tesla are partially to blame for the engineer's death, according to remarks made by NTSB officials.

Tesla was taken to task for failing to prevent misuse of Autopilot, a driver assistance feature included in its range of cars and SUVs. The system's forward collision warning failed to alert Huang to the approaching barrier, and its automatic emergency braking did not activate prior to impact.

Bruce Landsberg, a vice chairman for NTSB, called Tesla's Autosteer feature "completely inadequate." The feature is designed to navigate tight roads and keep vehicles in their lane while traveling at highway speeds.


The automaker has been criticized for muddying already murky automated driving waters with its Autopilot branding. Consumers are either not clear about vehicle automation limitations or disregard warnings that require drivers to supervise onboard systems at all times and intervene when necessary.

Tesla's Autopilot is classified as a level 2 automated system, far from a theoretical level 5 self-driving car capable of acting as a user's personal chauffeur. Still, as noted by NTSB Chairman Robert Sumwalt, Huang was "using level 2 automation as if it were full automation."


Sumwalt also laid blame on Apple, saying in a statement, "The driver in this crash was employed by Apple -- a tech leader. But when it comes to recognizing the need for a company PED policy, Apple is lagging because they don't have such a policy."

The comments dovetailed with NTSB arguments revolving around employer responsibility. During the hearing, NTSB officials noted companies need to enact strict policies prohibiting the use of cellphones while driving. Apple currently lacks such rules.

"We expect our employees to follow the law," Apple said in a statement to CNBC.


Beyond corporate measures, NTSB argued a case for technological solutions like a lock-out mechanism that restricts mobile device access when a vehicle is in motion. Apple has incorporated such a capability into its iOS mobile operating system with a "Do Not Disturb While Driving" option, but the feature is disabled by default.

https://forums.appleinsider.com/discussi...mployee/p1



"We expect our employees to follow the law," Apple said

By extension, the NTSB should have chided CA for not enacting stricter laws sooner than eight years ago, to avoid creating a culture and climate in which driving while distracted by your phone has tacit approval.

What the NTSB has done is given merit to lawsuits against Apple for 'their part'. I can almost hear the words 'knowingly', 'reasonable to assume', 'logical conclusion' and similar, being penned to paper.

If I'm not mistaken, this is the same collision in which the driver previously reported a problem with the Tesla auto-navigating that barrier and/or stretch of road.

The NTSB also cited equipment failure as contributing to the collision. Even so, the driver had prior knowledge of the specific problem and chose to drive while distracted.

In this instance, blaming Tesla doesn't make sense to me, for obvious reasons.

Blaming Apple is even more ridiculous.
Reply
#2
What a bunch of crap. How about holding the driver 100% responsible for his own actions and stupidity.

It isn’t his employer’s fault he was using his phone while driving, nor is it Tesla’s fault he repeatedly failed to follow vehicle instructions to grasp the wheel.
Reply
#3
C(-)ris wrote:
What a bunch of crap. How about holding the driver 100% responsible for his own actions and stupidity.

It isn’t his employer’s fault he was using his phone while driving, nor is it Tesla’s fault he repeatedly failed to follow vehicle instructions to grasp the wheel.

In a friend's Model 3 that I rode in, if you didn't have "hands on" for three minutes (during which it would warn you SEVERAL times), the car would signal, pull off the road, and STOP.... So he was at least actively TRYING to defeat the car's safety system.

It's the DRIVER'S responsibility... not the car, not the phone, not the employer, and not his mommy.
Reply
#4
Yes, an absurd decision - but it's only going to get worse.
Reply
#5
What a bunch of crap. How about holding the driver 100% responsible for his own actions and stupidity.


Indeed.

The NTSB is offering opinion as fact, fact that is in fact unsubstantiated conjecture.

'If you hadn't called it autopilot/autosteer this would have been less likely to happen' is what they're saying.

No doubt they've made suppositions in the past, but they're basically saying 'the driver believed that because the system was called Autopilot, he was fine driving while distracted. This could have been avoided. He chose to ignore California law and use his phone while driving, but if his employer had told him not to, this could have been avoided.'

Talk about having an agenda.
Reply
#6
As much as I believe Tesla’s naming of Autopilot is misleading and dangerous, it is 100% the driver’s fault for not being in control of his vehicle.
Reply
#7
pRICE cUBE wrote:
As much as I believe Tesla’s naming of Autopilot is misleading and dangerous, it is 100% the driver’s fault for not being in control of his vehicle.

Only if you believe that the largest, most vocal crowd should determine what a word means. If you actually look up the definition of autopilot and how the term is used in aviation (the only place it was used previously), you'd find that Tesla has used it correctly. Sadly, any idiot can get a driver's license. Thankfully, pilots have to have training, so they know what autopilot actually means and does. Maybe you should have to pass a test or go through a safety course to get a "Tesla" license to make sure that everyone who buys one can actually operate it effectively. Similar to a motorcycle endorsement.
Reply
#8
If anyone is interested, here is the detailed NTSB report on the analysis of the cell phone and cell service usage.

https://dms.ntsb.gov/public/62500-62999/...632606.pdf
Reply
#9
If anyone is interested, here is the detailed NTSB report on the analysis of the cell phone and cell service usage.


I've looked through it, and the depth of investigation is interesting.

But nowhere in their 'deep dive' do I see anything that shows Apple as having any collateral responsibility for the collision.

At this point, I don't know if it's been proven that the Tesla had trouble navigating the barrier it crashed into. Maybe that's already been demonstrated one way or the other and I've missed it.

But the driver thought it did, and whether or not that was the case, he believed it to be true. Given that, my feeling is the sole onus falls on him to avoid what he believed to be a known hazard, and not to be playing a game on his phone.

I'd be more interested in reading the NTSB's official summation, and how they arrived at their conclusions.

Will/did they mention CA's several laws prohibiting cellphone use while driving?

Will/did they show some evidence that Tesla's use of Autopilot and Autosteer led to the collision, or conversely, that had they not used that branding, this collision would not have occurred?

I bet not, in either case.

Fire requires three conditions to exist. Removing only one absolutely prevents or stops fire.

In the case of this collision, adding an advisory by Apple not to use the phone while driving and Tesla not using Autopilot and Autosteer as marketing terms cannot absolutely prevent a similar incident.

That responsibility falls upon the driver, who at the very least violated a long-standing rule of the road as defined by the CVC.
Reply
#10
RAMd®d wrote:
I'd be more interested in reading the NTSB's official summation, and how they arrived at their conclusions.

There's a lot of reading material on the NTSB website.

https://www.ntsb.gov/news/press-releases...00225.aspx The press release about this crash
https://ntsb.gov/investigations/pages/HWY18FH011.aspx The home page for this accident. Click on the Docket link and you'll see a list of 58 documents.
Reply

