


Tesla Autopilot crash is a reminder that your car still can’t drive itself


(Image credit: Tesla)

Another Tesla has been involved in a high-profile accident, with a reportedly Autopilot-controlled Tesla Model 3 colliding with an emergency vehicle last week, specifically a Florida Highway Patrol cruiser near Orlando.

While nobody was seriously injured, these kinds of stories highlight the inherent flaws in current autonomous car software, be it Autopilot or something else. And that's exactly why you're told to keep an attentive driver at the wheel at all times.

  • Everything you need to know about the Tesla Model 3
  • Tesla Model 3 vs Tesla Model Y: Which 'cheap' Tesla is right for you?
  • Plus: Mercedes EQE is a luxury electric car with the range to beat Tesla

According to CNN the incident happened on Interstate 4 just before 5 a.m. ET. The Orange County trooper had stopped to assist a broken-down vehicle, only for a Tesla Model 3 to hit the side of the patrol car and then crash into the broken-down Mercedes, narrowly missing the trooper in question. The patrol car did have its emergency lights flashing at the time.

The drivers of both the Tesla and the Mercedes were left with minor injuries, though nobody was seriously hurt. The Tesla's driver also confirmed to the trooper on the scene that the car was in Autopilot mode at the time of the crash.

Florida police said the crash would be reported to Tesla and the National Highway Traffic Safety Administration (NHTSA), the latter of which is currently investigating Tesla Autopilot.

The NHTSA claims that Teslas have collided with emergency vehicles, including police cars and ambulances, at least 11 times between January 2018 and July 2021. The incidents happened in nine different states, and most of them apparently took place at night.

What's more, the NHTSA said that the scenes had featured emergency vehicle lights, flares, illuminated arrow boards and road cones prior to each accident.

In this instance, it's not clear whether the driver was misusing Autopilot or not. However, it's another warning of why drivers shouldn't be too trusting of Autopilot, or any other semi-autonomous driver assistance tech. It may seem like the car is capable of driving itself, but it's not effective enough to completely replace the driver.

Autopilot is not a self-driving car system, no matter what it sounds like

Tesla itself has said that Autopilot could "do the wrong thing at the worst time," which is when the driver is needed to take control. If the driver isn't paying attention, or worse, has actually got out of the driver's seat, then the car is essentially left to its own devices when serious situations occur.

Semi-autonomous driver assistance tech is a massive help, especially on longer journeys, but it's not an alternative to actually driving. Even if terms like 'Autopilot' and 'Full Self Driving' make it sound like the car is able to do everything for you.

Tesla CEO Elon Musk has consistently defended the name Autopilot, claiming it's based on the autopilot used in planes, which was built to assist an attentive pilot. But that hasn't stopped the automaker from landing in hot water.

German courts have ruled that the name Autopilot, alongside marketing that suggested Teslas could drive themselves, is misleading. Likewise, the NHTSA has asked the FTC to investigate Tesla's use of the name Autopilot as a form of false advertising, though it isn't clear what the FTC's response was.

Tesla also needs to do more to stop people being able to get out of the driver's seat while Autopilot is engaged. Currently the system uses sensors in the steering wheel to check if the driver's hands are present, and will disengage if the seatbelt is unbuckled.

However, testing has shown these safety measures are terrifyingly easy to get around. Weights on the steering wheel can mimic the presence of hands, and drivers could, in theory, sit on top of a buckled seat belt to give themselves the freedom to leave the driver's seat.

This is not a problem exclusive to Tesla, with other tests showing that autonomous driving safety measures are just as easy to cheat. And the biggest problem these systems all share is the lack of weight sensors in the driver's seat, checking whether someone is actually there or not.

Clearly, something needs to be done across the board to stop this happening. Keeping someone in the driver's seat isn't going to stop them from getting distracted or taking their eyes off the road, but it's a good start. In the meantime, just remember that your 'autonomous' car isn't. We still have a long way to go, at the very least several years, before your own car will be driving you around without needing any supervision.

  • More: Tesla hatchback: $25K price, release, possible range and more

Tom is Tom's Guide's Automotive Editor, which means he can usually be found knee deep in the stats of the latest and best electric cars, or checking out some sort of driving gadget. It's a long way from his days as editor of Gizmodo UK, when pretty much everything was on the table. He's usually found trying to squeeze another giant Lego set onto the shelf, draining very large cups of coffee, or complaining that Ikea won't let him buy the stuff he really needs online.

Source: https://www.tomsguide.com/news/tesla-autopilot-crash-is-a-reminder-that-your-car-still-cant-drive-itself
