
YouTube has removed a video that shows Tesla drivers carrying out their own safety tests to determine whether the electric vehicle's Full Self-Driving (FSD) capabilities would make it automatically stop for children walking across or standing in the road.

The video was originally posted on Whole Mars Catalog's YouTube channel. It involved Tesla owner and investor Tad Park testing Tesla's FSD feature with his own kids.

"Tesla Full Self-Driving Beta doesn't run over kids. Thanks to @minimalduck and @tadpark for your help on this!" the video's caption read.

YouTube has specific rules against content that "endangers the emotional and physical well-being of minors." YouTube spokesperson Ivy Choi said the video violated its policies against harmful and dangerous content.

Park told CNBC that the car was never traveling more than eight miles per hour. "I'm very confident that it's going to detect my kids, and I'm also in control of the wheel so I can brake at any time," Park said.

As of August 18th, the video had over 60,000 views on YouTube. It was also posted to Twitter, where it remains available to watch.

A video and ad campaign posted to Twitter showed Tesla vehicles apparently failing to detect child-sized dummies placed in front of them, colliding with the dummies instead. Tesla fans weren't buying it, sparking a debate on Twitter about the feature's limitations.

The National Highway Traffic Safety Administration (NHTSA) issued a statement warning against using children to test automated driving technology. "No one should risk their life, or the life of anyone else, to test the performance of vehicle technology," the agency said.

Tesla's FSD software doesn't make a vehicle fully autonomous. It's available to Tesla drivers for an additional $12,000 (or a $199/month subscription).

Earlier this month, the California DMV accused Tesla of making false claims about Autopilot and FSD. The agency alleges the names of both features wrongly imply that they enable vehicles to operate autonomously.

In June, the NHTSA released data about driver-assist crashes for the first time; the data included over two dozen Tesla crashes, some of which have been fatal.


What do you think?

Written by Nuked
