Tesla Autopilot Drives Itself Into a Picture of a Road

If you need more proof that electronic driving aids don’t make a car autonomous, here it is

YouTuber Mark Rober put his Tesla Model Y’s Autopilot technology to the test by setting up a Looney Tunes-style trap.

Rober set up a styrofoam wall with a picture of a road printed on it in the middle of an actual road to see how the system reacts. The test sheds light on the differences between cameras and lidar.

Rober put the Model Y, whose Autopilot relies on cameras, head-to-head against a Lexus RX-based prototype fitted with lidar.

Autopilot passes the first two tests, which include stopping for a mannequin standing in the road and stopping for a mannequin that runs into the road, but it doesn’t detect the mannequin in fog or in rain.

Lidar sees the kid-sized mannequin regardless of the weather conditions.

The final test draws inspiration from the famous Wile E. Coyote versus Road Runner cartoons—meep, meep! The wall stretches the entire width of the road and it blends in surprisingly well with the landscape surrounding it.

A wall like that hopefully can’t fool a human driver, and it doesn’t fool the lidar-equipped Lexus, either. The prototype detects that it’s speeding toward a wall and stops without drama.

The Model Y starts driving toward the wall at about 15:00 in the video. Rober reaches 40 mph, engages Autopilot, and crashes right through the wall.

The test took place in broad daylight, without rain or fog, so poor visibility can’t be blamed for the Tesla failing to recognize the wall. One of the key differences between the two technologies is that the lidar scans the road ahead and measures the distance to the wall itself.

What’s printed on the wall isn’t taken into account by the car’s brain. Autopilot’s cameras, on the other hand, rely on what they see, and in this case, that’s a road.
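To make that distinction concrete, here is a minimal, self-contained sketch in Python. It is not any manufacturer’s actual code; the numbers, labels, and function names are invented for illustration. It simply contrasts a geometry-based braking decision (lidar measures range, no matter what is painted on the obstacle) with an appearance-based one (a camera pipeline acts on what its classifier thinks it sees).

```python
# Conceptual sketch only -- invented values, not real vehicle software.

def lidar_decision(ranges_m: list[float], brake_threshold_m: float = 30.0) -> str:
    """Lidar measures distance directly: a wall returns laser pulses no
    matter what is painted on it, so braking triggers once the nearest
    return in the driving corridor falls below the threshold."""
    nearest = min(ranges_m) if ranges_m else float("inf")
    return "BRAKE" if nearest < brake_threshold_m else "CONTINUE"

def camera_decision(scene_label: str) -> str:
    """A camera-only pipeline acts on what its vision stack reports.
    If the (hypothetical) classifier labels the scene 'clear road' --
    which a photorealistic picture of a road could plausibly produce --
    no braking is requested."""
    return "BRAKE" if scene_label == "obstacle" else "CONTINUE"

# The painted wall: lidar still gets returns at roughly 28 m ahead,
# while the assumed camera classifier reports an open road.
print(lidar_decision([28.4, 29.1, 30.7]))  # -> BRAKE
print(camera_decision("clear road"))       # -> CONTINUE
```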

Only two cars were included in the test, but I’m guessing that many other camera-based semi-autonomous systems would have failed as well; the error isn’t Tesla-specific by any means.

And, granted, the odds of encountering a wall that looks almost exactly like a road while you’re commuting are pretty low, but the test does a pretty good job of highlighting what each technology is (and, crucially, isn’t) capable of.

And, on a secondary level, the Wile E. Coyote test shows that Autopilot doesn’t make a car autonomous by any stretch of the imagination. It’s a Level 2 system, meaning the driver needs to keep both eyes on the road and both hands on the steering wheel.

Tesla CEO Elon Musk once called lidar “a crutch,” yet this isn’t the first video to highlight the limitations of camera-based electronic driving aids.

In 2022, a Model Y failed to detect a dummy that was standing in the middle of a road at night.

See more here: thedrive.com


Comments (4)

  • Howdy

    The lack of foresight by the ‘genius’ is astounding. When you add in the battering ram capabilities of the cyberchump, it’s well past time these vehicles were removed from the road.

    Have you watched ‘whistlindiesel’ testing a cyberchump to destruction against an F150? In two parts:

    https://youtu.be/PK_EJ3DyiiA

  • Aaron

    No lack of foresight by Musk; it’s all planned, all about money and control. How long will we continue to believe the show these con artists are presenting?

  • John Galt

    I’d love to see how ‘self driving’ works in rain, fog, snow. I’m inclined to rely on my Mk II eyeballs and onboard neural processor to make decisions, TYVM.

  • Dave

    ALL EVs, Unsafe at any Speed!🔥🔥🔥
