Tesla is being criticized by motor-safety experts for using YouTubers to beta test its self-driving technology rather than trained safety drivers.
After signing non-disclosure agreements, these would-be influencers film their experiences on the road, Vice first reported, using Tesla’s Full Self-Driving (FSD) Beta to navigate busy streets.
Urban policy expert David Zipper criticized the FSD in a tweet, posting a clip in which one Seattle beta tester’s steering wheel suddenly spins right and the car lurches toward a crosswalk.
‘Whoa, s–t! Sorry, it gave up there,’ the driver announces with an apologetic wave to pedestrians. ‘NOT the move.’
Galileo Russell, a.k.a. Seattle YouTuber ‘HyperChange,’ posted a video trying out version 10.1 of Tesla’s Full Self-Driving beta. In the clip, the steering wheel suddenly spins right and the car lurches toward pedestrians until Russell grabs the wheel
Zipper, a visiting fellow at the Harvard Kennedy School, called out the clip as evidence of why Tesla’s FSD is dangerous.
‘Not sure why FSD is such a safety hazard – especially for pedestrians and cyclists?’ he wrote. ‘Check out this video.’
In February, the National Transportation Safety Board (NTSB) slammed the automaker for using its drivers as ‘guinea pigs’ to test its autonomous driving technology before it is officially approved.
The clip is a snippet from a longer video that Russell posted Sunday on his channel, HyperChange.
Russell, who has more than 155,000 followers on YouTube, is among those in the Early Access Program who get to try out the FSD upgrade for free rather than pay the $200 monthly fee
Russell tried out version 10.1 with what he called the ‘Monorail test,’ weaving between the pillars of Seattle’s Monorail, and later riding on the bumpy cobblestone streets of Pike Place Market.
Both the YouTubers and Tesla are interested in ‘edge cases,’ according to Vice, unusual scenarios that put FSD to the test.
Tesla sparked controversy by giving the unfinished technology to 2,000 people since October to use on public roads, but CEO Elon Musk claims there have been no accidents.
‘FSD beta system at times can seem so good that vigilance isn’t necessary, but it is,’ Musk tweeted on September 17. ‘Also, any beta user who isn’t super careful will get booted.’
DailyMail.com reached out to Musk on Twitter for comment. Tesla does not have a traditional media relations department.
The beta offers features that allow vehicles to navigate and change lanes on public streets and enable left and right turns.
Tesla has said the FSD beta even warns drivers that it ‘may do the wrong thing at the worst time, so you must always keep your hands on the wheel.’
‘I drive around downtown Seattle, doing the Monorail test [three times], where it performed better than ever … and driving right by Pike Place market,’ Russell, who has 155,000 followers on YouTube, wrote in the video’s description.
‘Overall, I would say 10.1 is a light improvement over the last version, mostly because of the monorail test,’ he added.
Though the video is titled ‘Epic Monorail Success,’ at various points Russell comments on how the car seems to want to make unrequested turns and even proceed through a red light.
Full Self-Driving beta is not actually autonomous driving, according to the U.S. Department of Transportation’s six-level definition.
It’s Level 2 driving, which requires drivers to remain aware and in control at all times.
As Vice points out, Russell and the other EAP YouTubers have consented to be a part of an experiment—and filmed doing it—but the pedestrians, bicyclists and other motorists they may endanger have not.
According to Vice, in order to get free access to version 10.1 of Tesla’s Full Self-Driving beta, drivers must sign a non-disclosure agreement promising not to speak with the media
According to the site, the non-disclosure agreements signed by beta testers like Russell prohibit them from speaking to or giving test rides to the media.
In exchange for being part of the Early Access Program, the testers get free access to pre-release version 10.1 of Tesla’s FSD software, which usually costs $200 per month.
There’s also pressure to present a positive picture of their experience with FSD and Tesla.
‘Do remember that there are a lot of people that want Tesla to fail,’ the contract reads. ‘Don’t let them mischaracterize your feedback and media posts.’
HOW DOES TESLA’S AUTOPILOT WORK?
Autopilot uses cameras, ultrasonic sensors and radar to see and sense the environment around the car.
The sensor and camera suite provide an awareness of the surroundings that a driver alone would not have.
A powerful onboard computer processes these inputs in a matter of milliseconds to help make driving, the company says, ‘safer and less stressful.’
Autopilot is intended to be used only with a fully attentive driver—it does not turn a Tesla into a truly self-driving car.
Before enabling Autopilot, drivers must agree to ‘keep your hands on the steering wheel at all times’ and to always ‘maintain control and responsibility for your car.’
If insufficient torque is applied to the steering wheel, Autopilot delivers an escalating series of visual and audio warnings.
If drivers repeatedly ignore the warnings, they are locked out from using Autopilot during that trip.
EAP members are encouraged to ‘share on social media responsibly and selectively,’ Vice reported. ‘Consider sharing fewer videos, and only the ones that you think are interesting or worthy of being shared.’
In its February letter to the National Highway Traffic Safety Administration (NHTSA), the NTSB called for stricter controls over who uses automated driving systems on public roads.
Tesla was named 16 times in the document, mainly due to the fact it released its ‘Full Self-Driving’ (FSD) beta version to the public ‘with limited oversight or reporting requirements.’
Although the NTSB faulted the Elon Musk-led firm for its lack of safeguards, the agency also slammed NHTSA for its ‘hands-off approach’ to monitoring the company.
Tesla first launched its FSD beta program in October 2020 to a limited number of customers who were deemed ‘expert and careful drivers.’
It now has more than 1,000 drivers testing the beta version, though it’s not clear how many are paying subscribers and which are Early Access Program members getting it for free.
In the letter, NTSB chair Robert Sumwalt noted that it was Tesla’s lack of appropriate safeguards and NHTSA’s inaction that led to the deaths of drivers in Florida.
A number of Teslas using Autopilot have also been involved in accidents with parked emergency response vehicles.
By releasing its Level 2 Autopilot system, the letter said, ‘Tesla is testing on public roads a highly automated AV [automated vehicle] technology but with limited oversight or reporting requirements.’
WHAT ARE THE SIX LEVELS OF SELF-DRIVING AUTOMATION?
Level Zero – The full-time performance by the human driver of all aspects of the dynamic driving task, even when enhanced by warning or intervention systems.
Level One – A small amount of control is accomplished by the system such as adaptive braking if a car gets too close.
Level Two – The system can control the speed and direction of the car allowing the driver to take their hands off temporarily, but they have to monitor the road at all times and be ready to take over.
Level Three – The driver does not have to monitor the system at all times in some specific cases, such as on highways, but must be ready to resume control if the system requests.
Level Four – The system can cope with all situations automatically within defined use, but it may not be able to cope with all weather or road conditions. The system will rely on high-definition mapping.
Level Five – Full automation. System can cope with all weather, traffic and lighting conditions. It can go anywhere, at any time in any conditions.
Tesla’s Model 3 sedan – one of the world’s most advanced road-legal cars with autonomous elements – currently operates at Level Two autonomy. It is equipped for Level Three autonomy, which may be introduced in a future software update.