Earlier this month, a Tesla in Utah crashed into a stopped fire truck at 60 mph. The story got a lot of attention from the media, which prompted billionaire Tesla CEO Elon Musk to go on a Twitter tirade about biased reporting.
Today, the Associated Press obtained files on the Utah crash from local authorities, which appear to show that the car was under the control of Autopilot at the time and accelerated in the seconds leading up to the crash. The driver hit the brakes manually moments before impact.
Musk’s tweets after news of the accident first emerged took aim at the fact that other car crashes don’t get the same kind of attention from the media:
What’s actually amazing about this accident is that a Model S hit a fire truck at 60mph and the driver only broke an ankle. An impact at that speed usually results in severe injury or death.
— Elon Musk (@elonmusk) May 14, 2018
To a degree, he has a point. Humans are extremely bad at driving, and a couple of crashed Teslas doesn’t mean the technology is less safe than human-piloted cars. But the idea that because something is better than what came before it, it doesn’t get to be scrutinized is plainly wrong. More homes burned down before asbestos came into use, but would early reports of lung problems caused by asbestos have been the kind of story newspapers shouldn’t have covered in the 1930s? Lead pipes brought clean water to tens of millions of households, driving a new era of sanitary living conditions that enabled urbanization, but the side effects are still worth studying.
Tesla, by deliberate choice, and in no small part because of Musk’s nonstop PR tour, is among the front-runners in bringing driver-assist technologies to cars. Everything from the name (Autopilot) to the promotional videos of hands-free driving gives the impression that less attention is needed to drive a Tesla than a regular car. News stories about a new technology failing in sometimes-fatal ways aren’t meant to say that driver-assist technologies are bad and should be banned; they simply raise awareness of the side effects of relying too heavily on one particular new technology (Autopilot), and sometimes raise the question of whether there’s a slower but safer way of introducing these new technologies to the public.
If Musk is truly invested in convincing people that his cars are safer, Tesla should release far more data on Autopilot’s safety record. There’s currently one public statistic about the safety of Tesla vehicles before and after Autopilot, and the NTSB, which originally released the number, has since said that it’s flawed, at best.
Human drivers are bad. Anyone who has tried to drive in the left lane of I-95 can tell you that, and the fact that people die in car crashes due to human error every day isn’t news anymore. Reporting on the failure of a new technology doesn’t mean the new technology is worse than the status quo; it simply makes people aware of the problems of adopting new technologies before they’re fully ready for the mainstream.