US Regulators Investigate Tesla's Self-Driving System: Red Light Violations and Crashes (2025)

Imagine cruising down the highway in a car that promises to drive itself, only to find out it's been breaking traffic laws and causing accidents—scary stuff, right? That's the heart of the latest drama unfolding with Tesla's self-driving tech, and it's got regulators on high alert. But here's where it gets controversial: Is this cutting-edge innovation worth the risks, or is it putting lives in danger? Let's dive in and unpack this story step by step, so even if you're new to the world of autonomous vehicles, you'll get the full picture.

American automotive safety officials have kicked off a formal probe into Tesla's lineup of cars fitted with its advanced full self-driving (FSD) system, following a string of incidents in which the technology allegedly violated road safety rules and caused crashes. The National Highway Traffic Safety Administration (NHTSA), the key federal body overseeing vehicle safety, announced that Tesla's driver-assistance feature, which still requires the person behind the wheel to stay alert and ready to step in, has induced vehicle behavior that defies traffic laws.

This initial assessment marks the starting point in a process that could lead to a widespread recall if the NHTSA determines the cars present a serious threat to public safety. To put it simply, a recall means Tesla might have to fix or update the software in millions of vehicles to prevent further issues, much like how a faulty airbag could lead to a massive fix across an entire model line.

According to the agency, the investigation covers nearly 2.9 million Tesla vehicles equipped with FSD, after reports that some ran red lights or crossed into oncoming lanes while the system was active. Digging deeper, the NHTSA highlighted six specific incidents in which a Tesla with FSD engaged drove through an intersection against a red signal and collided with other vehicles. Tragically, four of these crashes injured one or more people. Tesla hasn't yet issued a statement in response to inquiries from Reuters.

Beyond that, the NHTSA has logged 18 consumer complaints plus one media account claiming that Teslas using FSD at intersections failed to come to a complete stop at red lights, didn't stop at all, or displayed the wrong traffic signal status on the car's screen. Some users even reported that the system gave no warning of its planned actions as it approached a red light, leaving drivers in the dark about what was happening.

For beginners wondering what FSD really is, think of it as Tesla's most sophisticated driving aid—far beyond its basic Autopilot mode. It's designed to handle complex tasks like navigating city streets, but crucially, it still requires a human driver to supervise and take control if things go awry. Tesla's own site emphasizes this, stating that FSD is meant for attentive operators who keep their hands on the wheel and are always prepared to intervene. While the company aims to improve it over time, it stresses that current versions don't make the car fully autonomous, meaning it's not ready to drive without human oversight.

This isn't Tesla's first rodeo with regulatory scrutiny. The NHTSA has been examining FSD for about a year now. Back in October 2024, they launched an inquiry into 2.4 million Tesla vehicles equipped with the system after reports of four collisions in challenging visibility conditions, such as bright sunlight, mist, or dust in the air. One of these, from 2023, sadly ended in a fatality.

And this is the part most people miss: while Tesla touts FSD as a game-changer for safer, more efficient travel, critics argue it may be creating new dangers by lulling drivers into complacency. Is the tech evolving faster than our ability to regulate it, or should we hit the brakes on such ambitious features until they're truly foolproof? Does the promise of hands-free driving outweigh the potential for misuse and accidents?

Reuters contributed reporting to this story.

