UPDATE, July 2, 9:00 am: "There was a portable DVD player in the vehicle," Sergeant Kim Montes of the Florida Highway Patrol told Automotive News in a telephone interview.
The story about the man who died in a crash while driving his Tesla Model S in Autopilot mode on a highway in Florida is all over the news. When the story broke a few days ago, I assumed at first that it had happened this week. I was surprised to learn that it actually happened on May 7. I do this stuff for a living (sort of), so I monitor news about Tesla and Elon Musk fairly closely, and until Wednesday there was not a single report anywhere on the internet about a fatal crash involving a Tesla back in May. Then Tesla apparently learned of the crash and NHTSA got involved.
At first, details were sketchy. We heard that a tractor-trailer was making a left turn at an intersection on a four-lane highway with a wide median strip. There were no traffic lights at the intersection. These sorts of road junctions are quite common in Florida and other parts of the country. Whether the tractor-trailer was turning left or executing a U-turn is still unclear to me.
In any event, my first reaction was that a large vehicle like that should not have pulled out in front of oncoming traffic. Perhaps the truck driver was partially at fault? If you are building an autonomous driving system, how do you program it to anticipate every stupid thing a human being is capable of, whether it is a truck turning into the path of the car or a clueless pedestrian stepping off a curb into traffic while engrossed in the morning newspaper and sipping a flat white?
Now, details are beginning to emerge and they are disturbing. First, the driver of the Tesla was Joshua D. Brown, of Canton, Ohio. He was 40 years old and had been a Navy SEAL for 11 years, leaving the Navy in 2008, according to the Pentagon. He was a very tech-savvy fellow who founded his own company specializing in internet networks and cameras, according to a report in the Dallas Morning News. He posted videos about his Tesla experiences on several occasions, including one showing how the Autopilot system in his car once saved him from a potentially dangerous collision when a white commercial truck swerved suddenly into his lane.
A woman driving on the same highway and in the same direction as Brown says she was doing 85 mph when Brown's Model S flew past her. That information is not included in the official traffic accident report filed by the Florida Highway Patrol, but it is surely something known to Tesla, which has access to all of the data stored in the car's computer system. It is included in a story at Teslarati.
The driver of the tractor-trailer, Frank Baressi, age 62, told the Dallas Morning News in a telephone interview that the Tesla driver was "playing Harry Potter on the TV screen." He added, "It was still playing when he died," and, "He went so fast through my trailer I didn't see him." Baressi admits he did not actually see the video, but says he could hear the movie still playing when the Tesla finally came to a stop several hundred yards down the road. Tesla Motors says it is not possible to watch videos on the Model S touchscreen, but word is that Brown had a portable DVD player in the car.
The really scary part is that not only did the sensors in Brown's car fail to detect the tractor-trailer directly in front of it, but the car also continued down the highway for several hundred yards after its roof was sheared off. It finally came to a stop in the yard of a home owned by Bobby Vankavelaar, who told ABC Action News that the Tesla traveled "hundreds of yards from the point of impact, through a fence into an open field, through another fence and then avoided a bank of trees before being unable to swerve and miss a power pole that eventually stopped the car a few feet away."
So, is this a 'perfect storm' event? A driver going much too fast while distracted, coupled with a truck driver who looked but did not see a car approaching from the opposite direction, and a white trailer set against a brightly lit sky? And what impact will this have on the advent of autonomous driving technology? Mike Harley, an analyst at Kelley Blue Book, says systems like Tesla's, which rely heavily on cameras, "aren't sophisticated enough to overcome blindness from bright or low contrast light." He says to expect more deaths as autonomous technology is refined.
Karl Brauer, a senior analyst with Kelley Blue Book, said the crash is a huge blow to Tesla’s reputation. “They have been touting their safety and they have been touting their advanced technology,” he said. “This situation flies in the face of both.”
Really, Karl? We said in a previous post that fatalities will continue to occur even as more cars start driving themselves. The difference is that, statistically, the likelihood of a fatal accident should be lower for autonomous cars than for cars operated by human drivers. This unfortunate incident occurred after Teslas worldwide had accumulated more than 130 million miles in Autopilot mode without a fatality. In the US, there is roughly one traffic death for every 100 million miles driven. By that crude comparison, the fatality rate per mile on Autopilot so far is about 23% lower than for human drivers, though a single incident is far too small a sample to prove anything.
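For what it's worth, here is the back-of-the-envelope arithmetic behind that comparison, using the round numbers cited above. This is a quick sketch, not a rigorous statistical analysis, and both figures are rough estimates.

```python
# Back-of-the-envelope comparison of fatality rates per mile,
# using the round numbers cited in this post (both are rough estimates).
human_miles_per_fatality = 100_000_000   # roughly one death per 100 million miles driven
autopilot_miles_so_far = 130_000_000     # one fatality in 130+ million Autopilot miles

human_rate = 1 / human_miles_per_fatality
autopilot_rate = 1 / autopilot_miles_so_far

reduction = 1 - autopilot_rate / human_rate
print(f"Autopilot fatality rate is about {reduction:.0%} lower per mile")
# -> about 23% lower, though one data point proves very little either way
```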
Elon Musk continues to remind people that Autopilot is still only an aid. Drivers must remain alert, aware, and ready to take control at any time. My wife says autonomous cars are like boats towing water skiers. Most states require there be two people in the boat — one to steer and one to watch the skiers. She thinks if you are going to use computers to drive your car while you nap or watch videos, another responsible adult should be required to watch the road ahead.
Her idea is not as fanciful as it may sound. At the beginning of the automotive age, drivers entering a city were required to have a person on foot walk in front of the car sounding a klaxon to warn the citizenry that a motorized vehicle was approaching. Will regulators require something similar now that we know our machines can fail to protect us from all danger?
My guess is that the momentum of technology cannot be denied. Absent deliberate malfeasance on the part of a manufacturer, the world will accept that there is always an element of risk. The technology will get better. Within a few years, if IHS Automotive's forecast of 20 million autonomous cars sold worldwide every year comes to pass, the chances of a fatal accident in a self-driving car will probably be closer to one in every 500 million miles.
Tesla is very lucky this was a single car accident. If another driver had died as a result of being impaled by a speeding Tesla with its roof sheared off, the odds are an army of trial lawyers would descend on the victim’s family, begging for a chance to be the first to sue Tesla. In the end, the fate of autonomous driving technology may not be determined as much by regulators as by the courts.
Screenshot via Teslarati