Tesla’s self-driving car fails to recognize child-sized dummy, collides with it in test

A video of a Tesla is going viral on Twitter. Those sharing the video claim that Tesla’s self-driving car failed during testing: in the footage, the car does not recognize a child-sized dummy in its path and hits it.

According to the report, the Dawn Project found that the beta software in the latest version of Tesla’s Full Self-Driving system fails to recognize a child in the car’s path.

The advocacy group is demanding on Twitter that the car be banned, saying Elon Musk must come forward and prove that it will not harm a child on the road.

For its part, Tesla claims that these experiments were conducted under extremely controlled conditions at a test track in California.

Hit at a speed of 40 km/h

Even in this tightly controlled test, the collision reportedly occurred at a speed of 40 km per hour. The car in question was a Model 3 equipped with Tesla’s latest Full Self-Driving beta software.

According to the UK newspaper The Guardian, Dan O'Dowd, founder of the advocacy group, has called the FSD software extremely dangerous. In a tweet, he said that around 100,000 people are currently driving cars with Full Self-Driving mode on public roads, which he described as very dangerous for society.
