
Being a Tesla self-driving vehicle tester sounds like the most stressful job ever

If the job description listed “must eliminate all sense of legal duty, moral ethics, and humanity” as responsibilities, would you apply? According to a new report, that’s essentially the role of a test driver for Tesla.

Autonomous vehicles are the future. If the tech companies have their way, self-driving cars will be the only future. However, many of us live in reality and understand that the software is nowhere near where it needs to be for that sci-fi level of automation. But the people with the funds will keep trying to speed up the process, even if that means beta testing it on public roads with all of us as guinea pigs.

Business Insider has revealed details of a specialized team of Tesla test drivers called Project Rodeo. This test team is trained to push the limits of the automaker’s Full Self-Driving (FSD) and Autopilot software. What exactly is the limit? Crashing. But the general rule seems to be to get as close as possible to colliding with something or someone. The scarier the situation, the better.

“You’re practically running on adrenaline for the entire eight-hour shift,” said one former test driver. “There’s a feeling that you’re on the verge of something going very wrong.”

Nine current and former Project Rodeo drivers and three Autopilot engineers from California, Florida, and Texas were interviewed for the report. Most asked to remain anonymous. The situations they describe are revealing but not surprising. Although FSD-related accidents have been well documented, none of those interviewed were involved in any.

Project Rodeo is a test group made up of smaller teams. For example, the “gold manual” team drives by the book, follows traffic laws, and doesn’t use any driver-assistance features. The opposite end of that spectrum is the “critical intervention” team. Riding more than driving, critical-intervention testers let the software handle every aspect of the drive. They step in, or “intervene,” only to avoid a collision.

Part of the reason test drivers wait until the eleventh hour to take over manually is that it gives the software time to react and make the right, or wrong, decision. The more data collected, especially in real-world scenarios, the easier it is for engineers to adjust and update the software.

“We want the data to know what led the car to that decision,” said a former Autopilot engineer. “If you keep intervening too early, we don’t really get to the exact moment where we say, OK, we understand what happened.”

However, this leads to vehicles being allowed to run red lights, cross double yellow lines, ignore stop signs, and exceed speed limits, all on public roads. Even when a situation became uncomfortable for the driver, supervisors would say they took control too soon. As a result, Project Rodeo drivers, even those in non-critical-intervention roles, felt pressured to maintain risky driving situations, or sometimes create them entirely, to test the software and keep their jobs.

John Bernal, a former test driver and data analyst, said he was never told to deliberately break the law to collect data, but it was strongly implied. “My training was to wait until the wheels touched the white line before I could slam on the brakes,” he said.

On top of that, certain drives were dedicated solely to training the software to recognize and adapt to “vulnerable road users,” such as pedestrians, cyclists, and wheelchair users. One former tester said that, while driving with his coach, his vehicle came within three feet of a cyclist before he hit the brakes.

“I vividly remember this guy jumping off his bike,” he said. “I was terrified. The car came at him, and all I could do was hit the brakes.” Apparently, his coach was actually happy about it, telling him that his delayed reaction was “perfect” and exactly what they wanted him to do. “It seemed like the goal was almost to simulate an unpredictable accident and then prevent it at the last second.”

Cruise and Waymo are also developing autonomous vehicles, but they say they conduct rigorous software testing in controlled environments or feel their autonomous systems are “fundamentally different” from Tesla’s. Hmm, then why do these companies experience the same problems with vehicles that don’t read the room, per se? In the case of Uber’s now-shuttered autonomous vehicle division, sometimes the results were deadly.

“If you have a parent who holds the bike the whole time, they never learn,” said a former Tesla engineer. Ultimately, data is king. For these autonomous-tech companies now at the mercy of shareholders, it’s a high-risk, high-reward environment that the public didn’t sign up for.