This kind of serious trouble (from the article):
The Department of Justice is currently investigating Tesla over a series of accidents, some of them fatal, that occurred while its autonomous software was in use. In the DoJ's eyes, Tesla's marketing and communications departments sold the software as a fully autonomous system, which is far from the truth. As a result, some consumers used it as such, with tragic results. Many of these accidents occurred after Tesla went visual-only, meaning those cars were running the allegedly less capable software.
Consequently, Tesla faces severe ramifications if the DoJ's investigation leads to charges.
And of course:
The report even found that Musk rushed the release of FSD (Full Self-Driving) before it was ready and that, according to former Tesla employees, the software still isn't safe for public road use. In fact, a former test operator went on record saying that the company is "nowhere close" to having a finished product.
So even though it seems to work for you, the people who created it don’t seem to think it’s safe enough to use.
My neighborhood has roundabouts. A couple of times, when there was no traffic around, I've let autopilot attempt to navigate them. It works, mostly, but it's quite unnerving. AP wants to go through them way faster than I would drive through them myself.