
Tesla owners choose ‘Full Self-Driving’ software, meaning thousands can soon hit the road
Owners of Tesla vehicles are now able to activate ‘Full Self-Driving (FSD)’ software following its release early on Saturday, much to the horror of regulators, who say it is both unregulated and largely untested.
Chief Executive Elon Musk said Tesla drivers would be able to request a ‘beta’ version of its software starting Friday. But only those rated as ‘good drivers’ by Tesla’s insurance calculator would be able to use the system.
Owners will need to agree to have their driving monitored, and only when their driving is deemed ‘good’ over a seven-day period will beta access be granted.
But the software arrives while federal vehicle safety authorities are investigating the carmaker for possible safety defects following a series of crashes into parked emergency vehicles.

Tesla has rolled out a long-awaited software update that allows customers to request access to its controversial Full Self-Driving Beta (FSD Beta) program

A disclaimer states ‘the currently enabled features require active driver supervision and do not make the vehicle autonomous.’ There is also a message that appears on the upgraded screen, warning drivers that ‘it may do the wrong thing at the worst time’

One Tesla driver posted some images of the software upgrade onto social media

Elon Musk has said the firm is starting full self-driving slowly and cautiously ‘because the world is a complex and messy place’
Tesla sparked controversy by allowing some 2,000 people to test the unfinished technology on public roads since October, but Musk claims there have been no accidents among the beta users.
‘FSD beta system at times can seem so good that vigilance isn’t necessary, but it is. Also, any beta user who isn’t super careful will get booted,’ Musk tweeted.
The beta offers features that allow vehicles to navigate city streets, change lanes and make left and right turns.
Tesla has said the FSD beta even warns drivers that it ‘may do the wrong thing at the worst time, so you must always keep your hands on the wheel.’

In several tweets, Musk has made lofty predictions about achieving fully self-driving cars
Early beta tests of the FSD system showed it struggling with roundabouts and left turns. It would also suddenly veer towards pedestrians in the street and cross the double-yellow lines in the center of the road, directly into the path of oncoming traffic.
This weekend’s software release is available to those who purchased the $10,000 software upgrade, or to those with a subscription from Tesla costing about $100 to $200 per month – although drivers will still need to pass the safety monitoring.
It will see drivers scored on a 0-to-100 scale. Drivers will be assessed on five factors: forward collision warnings per 1,000 miles, instances of hard braking, aggressive turning, unsafe following and forced disengagements of the Autopilot system.
Tesla will then use a formula to calculate their score, with most drivers likely to score above 80.


A Tesla video demonstrates how Autopilot features work
‘These are combined to estimate the likelihood that your driving could result in a future collision,’ Tesla explained.
It’s not clear what score would need to be achieved in order to access FSD.
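Tesla has not published the exact formula in this article, but purely as an illustration, a 0-to-100 score built from the five monitored factors might be computed along the following lines. The function name safety_score, the linear penalty form and every weight below are hypothetical placeholders, not Tesla’s real coefficients.

```python
# Illustrative sketch only: the weights below are hypothetical placeholders,
# not Tesla's published coefficients.

def safety_score(fcw_per_1000_miles: float,
                 hard_braking_pct: float,
                 aggressive_turning_pct: float,
                 unsafe_following_pct: float,
                 forced_disengagements: int) -> float:
    """Combine the five monitored factors into a 0-to-100 score.

    Higher inputs indicate riskier driving, so each factor lowers the score.
    """
    predicted_risk = (
        0.5 * fcw_per_1000_miles        # forward collision warnings per 1,000 miles
        + 1.5 * hard_braking_pct        # share of braking events deemed 'hard'
        + 1.0 * aggressive_turning_pct  # share of turns deemed aggressive
        + 1.2 * unsafe_following_pct    # share of time spent following too closely
        + 5.0 * forced_disengagements   # forced Autopilot disengagements
    )
    # Clamp into the published 0-to-100 range.
    return max(0.0, min(100.0, 100.0 - predicted_risk))

# Example: a cautious driver with few risk events lands well above 80,
# consistent with the article's note that most drivers likely score above 80.
print(safety_score(0.8, 1.0, 0.5, 2.0, 0))  # -> 95.2
```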
Worryingly, Musk has shared his own concerns over the self-driving software, noting ‘we need to make Full Self-Driving work in order for it to be a compelling value proposition.’
Investigators are still looking at FSD’s predecessor, known as Autopilot, which steers vehicles from highway on-ramps to off-ramps. The software can also park cars.
Last month, the National Highway Traffic Safety Administration opened an investigation into about a dozen crashes involving parked emergency vehicles while Autopilot was engaged.

Although the company has not specifically commented on the investigation, Tesla has repeatedly argued Autopilot is safer than cars being driven manually.
The move to rapidly roll out the feature is drawing criticism from regulators and industry peers, who say the company is taking a hasty approach to an issue that requires careful study and an emphasis on safety.
‘I do think that their product is misleading and overall leads to further misuse and abuse,’ National Transportation Safety Board Chair Jennifer Homendy told the Washington Post.
‘I’d just ask [Musk] to prioritize safety as much as he prioritizes innovation and new technologies … safety is just as important, if not more important, than the development of the technology itself.
‘Tesla has not responded to any of our requests [regarding safety and previous crashes]. From our standpoint they’ve ignored us — they haven’t responded to us and if those are not addressed and you’re making additional upgrades, that’s a problem,’ Homendy said.

National Transportation Safety Board Chair Jennifer Homendy, pictured, has voiced concern over the company’s plans for self-driving cars
‘It is incumbent on a federal regulator to take action and ensure public safety,’ Homendy said. ‘I am happy that they’ve asked for crash information from all manufacturers and they’re taking an initial step with Tesla on asking for crash information on emergency vehicles. But they need to do more.’
Tesla’s cars ‘aren’t actually fully self-driving,’ added industry group the Chamber of Progress.
‘The underlying issue here is that in case after case, Tesla’s drivers take their eyes off the road because they believe they are in a self-driving car. They aren’t.’
Scrutiny from US safety regulators, who opened an investigation into Tesla’s driver-assistance system, follows 11 accidents feared to have occurred because the system has trouble spotting parked emergency vehicles.
The National Highway Traffic Safety Administration (NHTSA) said the investigation covers 765,000 vehicles, nearly everything Tesla has sold domestically since 2014.
In the 11 crashes identified over the past three years, 17 people were injured and one was killed.
That deadly accident happened on Interstate 70 in Cloverdale, Indiana, in December 2019, when passenger Jenna Monet, 23, was killed after the Tesla being driven by her husband Derrick slammed into the back of a parked fire engine.
Two US senators also called on the Federal Trade Commission to investigate Tesla, saying it misled consumers and endangered the public by marketing its driving automation systems as fully self-driving.
The 11 crashes have occurred when Teslas on Autopilot or Traffic Aware Cruise Control hit vehicles at scenes where first responders have used flashing lights, flares, an illuminated arrow board or cones warning of hazards.
The crashes into emergency vehicles cited by NHTSA began on January 22, 2018 in Culver City, California, near Los Angeles.
That incident saw a Tesla using Autopilot strike a firetruck parked partially in the travel lanes with its lights flashing. Crews were handling another crash at the time.
Since then, NHTSA said there were crashes in Laguna Beach, California; Norwalk, Connecticut; Cloverdale, Indiana; West Bridgewater, Massachusetts; Cochise County, Arizona; Charlotte, North Carolina; Montgomery County, Texas; Lansing, Michigan; and Miami, Florida.
https://www.dailymail.co.uk/news/article-10028917/Tesla-owners-choose-self-driving-software-meaning-thousands-soon-hit-road.html