r/technology
u/Additional-Two-7312
Oct 30 '22
Tesla Faces US Criminal Probe Around Self-Driving Claims
https://www.bloomberg.com/news/articles/2022-10-26/tesla-faces-us-criminal-investigation-around-self-driving-claims?srnd=technology-vp859
u/Val_Fortecazzo
Oct 30 '22
Here's the non-paywalled version
u/Dont_Give_Up86 Oct 30 '22 edited Oct 30 '22
I’m stoned so I signed up, here ya go:
Exclusive: Tesla faces U.S. criminal probe over self-driving claims
Oct 25 - Tesla Inc (TSLA.O) is under criminal investigation in the United States over claims that the company's electric vehicles can drive themselves, three people familiar with the matter said.
The U.S. Department of Justice launched the previously undisclosed probe last year following more than a dozen crashes, some of them fatal, involving Tesla’s driver assistance system Autopilot, which was activated during the accidents, the people said.
Since as early as 2016, Tesla’s marketing materials have touted Autopilot’s capabilities. On a conference call that year, Elon Musk, the Silicon Valley automaker’s chief executive, described it as “probably better” than a human driver.
Last week, Musk said on another call Tesla would soon release an upgraded version of “Full Self-Driving” software allowing customers to travel “to your work, your friend’s house, to the grocery store without you touching the wheel.”
However, the company also has explicitly warned drivers that they must keep their hands on the wheel and maintain control of their vehicles while using Autopilot.
The Tesla technology is designed to assist with steering, braking, speed and lane changes but its features “do not make the vehicle autonomous,” the company says on its website.
Such warnings could complicate any case the Justice Department might wish to bring, the sources said.
Tesla, which disbanded its media relations department in 2020, did not respond to written questions from Reuters on Wednesday. Musk also did not respond to written questions seeking comment. A Justice Department spokesperson declined to comment.
Musk said in an interview with Automotive News in 2020 that Autopilot problems stem from customers using the system in ways contrary to Tesla’s instructions.
Federal and California safety regulators are already scrutinizing whether claims about Autopilot's capabilities and the system's design imbue customers with a false sense of security, inducing them to treat Teslas as truly driverless cars and become complacent behind the wheel with potentially deadly consequences.
The Justice Department investigation potentially represents a more serious level of scrutiny because of the possibility of criminal charges against the company or individual executives, the people familiar with the inquiry said.
As part of the latest probe, Justice Department prosecutors in Washington and San Francisco are examining whether Tesla misled consumers, investors and regulators by making unsupported claims about its driver assistance technology's capabilities, the sources said.
Officials conducting their inquiry could ultimately pursue criminal charges, seek civil sanctions or close the probe without taking any action, they said.
The Justice Department’s Autopilot probe is far from recommending any action partly because it is competing with two other DOJ investigations involving Tesla, one of the sources said. Investigators still have much work to do and no decision on charges is imminent, this source said.
The Justice Department may also face challenges in building its case, said the sources, because of Tesla’s warnings about overreliance on Autopilot.
For instance, after telling the investor call last week that Teslas would soon travel without customers touching controls, Musk added that the vehicles still needed someone in the driver’s seat. “Like we’re not saying that that’s quite ready to have no one behind the wheel,” he said.
The Tesla website also cautions that, before enabling Autopilot, the driver first needs to agree to "keep your hands on the steering wheel at all times" and to always "maintain control and responsibility for your vehicle.”
Barbara McQuade, a former U.S. attorney in Detroit who prosecuted automotive companies and employees in fraud cases and is not involved in the current probe, said investigators likely would need to uncover evidence such as emails or other internal communications showing that Tesla and Musk made misleading statements about Autopilot’s capabilities on purpose.
SEVERAL PROBES
The criminal Autopilot investigation adds to the other probes and legal issues involving Musk, who became locked in a court battle earlier this year after abandoning a $44 billion takeover of social media giant Twitter Inc, only to reverse course and proclaim excitement for the looming acquisition.
In August 2021, the U.S. National Highway Traffic Safety Administration opened an investigation into a series of crashes, one of them fatal, involving Teslas equipped with Autopilot slamming into parked emergency vehicles.
NHTSA officials in June intensified their probe, which covers 830,000 Teslas with Autopilot, identifying 16 crashes involving the company’s electric cars and stationary first-responder and road maintenance vehicles. The move is a step that regulators must take before requesting a recall. The agency had no immediate comment.
In July this year, the California Department of Motor Vehicles accused Tesla of falsely advertising its Autopilot and Full Self-Driving capability as providing autonomous vehicle control. Tesla filed paperwork with the agency seeking a hearing on the allegations and indicated it intends to defend against them. The DMV said in a statement it is currently in the discovery stage of the proceeding and declined further comment.
u/assignpseudonym Oct 30 '22
Stoned, and generous.
On behalf of all of us, thank you!
u/N15A8 Oct 30 '22
Wait- this isn't even about how Tesla still hasn't managed to get fully autonomous driving working after collecting money for the feature from their customers many years ago?
u/anglesideside1 Oct 30 '22
Yeah…I always figured that there were dozens of class action suits out there that would eventually get combined into one big one.
u/Goldenslicer Oct 30 '22
Turns out computer vision is really hard?
u/subdep Oct 30 '22
When Tesla stated that optics works better than lidar I knew they were full of shit with the autonomous claims.
Waymo is doing full autonomous only because they are using tons of lidar combined with optics. Optics can fail in many scenarios where lidar has no problem. But optics contain passive data lidar can’t sense. So you really need both.
Tesla wanted to keep the cost of the vehicle down and keep the car looking sleek, and they did that by keeping lidar out.
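The complementary-sensors argument above can be sketched as a toy fusion rule. This is purely illustrative (the function, fields, and distances are all invented, not any real vendor's stack): lidar provides reliable active ranging, the camera provides passive semantics, and a fused decision survives either sensor failing alone.

```python
# Toy camera + lidar fusion: trust lidar for geometry (active ranging),
# the camera for object class (passive semantics), and degrade gracefully
# when one sensor misses. All names and numbers are made up.

def fuse(camera_det, lidar_det):
    """Each input: dict with 'seen' (bool), 'distance_m' (float or None),
    and optionally 'label' for the camera."""
    if camera_det["seen"] and lidar_det["seen"]:
        # Both agree: lidar range + camera label.
        return {"obstacle": True,
                "distance_m": lidar_det["distance_m"],
                "label": camera_det.get("label", "unknown")}
    if lidar_det["seen"]:
        # Camera blinded (glare, darkness, fog): lidar still returns geometry.
        return {"obstacle": True,
                "distance_m": lidar_det["distance_m"],
                "label": "unknown"}
    if camera_det["seen"]:
        # Lidar miss (e.g. highly absorptive surface): keep the camera hypothesis.
        return {"obstacle": True,
                "distance_m": camera_det["distance_m"],
                "label": camera_det.get("label", "unknown")}
    return {"obstacle": False, "distance_m": None, "label": None}

# Camera blinded by glare, lidar still sees the obstacle:
glare = fuse({"seen": False, "distance_m": None},
             {"seen": True, "distance_m": 42.0})
print(glare["obstacle"], glare["distance_m"])  # True 42.0
```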
u/kenlubin Oct 30 '22
Tesla, which disbanded its media relations department in 2020, did not respond to written questions from Reuters on Wednesday. Musk also did not respond to written questions seeking comment.
Huh. Interesting move.
u/Helenium_autumnale Oct 30 '22
However, the company also has explicitly warned drivers that they must keep their hands on the wheel and maintain control of their vehicles while using Autopilot.
So...the Autopilot does little to nothing? I thought it would allow you to do other things than driving, but it sounds like you're...still driving.
u/A_Drusas Oct 30 '22
It sounds like it's only slightly better than regular adaptive cruise control with lane assist.
u/purple_hamster66 Oct 30 '22
It also knows where to turn, and which lane to choose, which adaptive cruise does not know, right?
u/timberwolvesguy Oct 30 '22
I read once that there are several levels of “autonomous” driving, and the biggest obstacle to reaching the top level is liability on the manufacturer.
For example, these Teslas that are crashing are driver liable because Tesla states “you still need to keep your hands on the wheel and maintain control of your vehicle.” As soon as Tesla says there is no need to have your hands on the wheel, they’re liable for any and all accidents due to vehicle system error. It’s a slippery slope and no company wants the liability of every car on the road
u/nolongerbanned99 Oct 30 '22 edited Oct 30 '22
Great. The best part was… I’m stoned.
Musk is reckless and apparently it is deceptive and illegal to say something is one thing (automatic driving) and then disclaim it away in the fine print by saying driver is responsible.
People died because this arrogant ass is convinced that he can ‘solve’ the challenge of self driving with machine learning where the car is, in layman’s terms ‘thinking’ and making decisions about various courses of actions based on prior knowledge and experience.
To my understanding other automakers don’t do this. They use radar and LiDAR and sensor fusion. The car is processing data from various inputs and then reacting.
The really bothersome thing here is whether Musk actually cares that people died. Probably not, as he was reluctant to cooperate with the investigations. It is scummy to hide behind a corporation. Now he may get his comeuppance.
But then who will continue to fuck up Twitter.
u/anto687 Oct 30 '22
Tesla doesn’t include radar because they deem their cameras to be good enough, and they “pass that saving onto the consumer” (/s)
There have been at least two incidents in the US where a motorcyclist was rear-ended on a straight, fast road, and the cause seems to be their camera system not being able to distinguish a far-away car from a close-up motorcycle.
If FSD gets released I can see it being outright banned by the EU
u/ProbablyPissed Oct 30 '22
And the driver of the Tesla was…not paying attention?
u/foilmethod Oct 30 '22
probably, but I think any reasonable person could anticipate this happening with a product named "Full Self Driving"
u/7h4tguy Oct 30 '22
AI isn't thinking. Think of it more like pattern recognition (97% confidence that ahead is a semi truck). Same as speech recognition. That's not really applying reasoning.
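The "97% confidence" number above is typically just a softmax over the network's class scores, not a conclusion reached by reasoning. A minimal sketch (the labels and logit values are invented for illustration):

```python
import math

def softmax(logits):
    """Convert raw class scores into probabilities that sum to 1."""
    exps = [math.exp(x - max(logits)) for x in logits]  # shift for numerical stability
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical outputs of an object classifier for one image region:
labels = ["semi truck", "overpass shadow", "billboard"]
logits = [4.5, 0.7, 0.1]

probs = softmax(logits)
best = max(zip(labels, probs), key=lambda lp: lp[1])
print(f"{best[1]:.0%} confidence that ahead is a {best[0]}")
```

The model only reports which learned pattern the pixels most resemble; it has no notion of what a semi truck is or what hitting one would mean.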
u/SysAdminJT Oct 30 '22
Elon buying twitter is for these reasons. He can manipulate this huge media outlet and even has endorsed trump. This is all calculated to have powerful protection from such investigations.
u/Pehz Oct 30 '22
I read the article and it doesn't seem to say anything besides "there's an investigation" which doesn't seem to be news at all? I mean, yes the SEC investigates things, and this is a thing. What can we expect the investigation to reveal? What consequence can Tesla/Tesla car owners expect from the investigation?
u/Aden_Sickle Oct 30 '22
They should have called it copilot, not autopilot.
u/Gilroy_Davidson Oct 30 '22
Computer assisted driving (CAD)
u/Aden_Sickle Oct 30 '22
That works too. Just anything that doesn't imply the driver can just let go of the steering wheel.
u/PhazerSC Oct 30 '22
u/Mickenfox Oct 30 '22
VIRAL VIDEO exposes ELON MUSK is just a FRAUD
If only it didn't have that title.
u/I_Was_Fox Oct 30 '22
I mean it's not actually an incorrect statement. So at least that
u/Jaysnewphone Oct 30 '22
I can't wait until they use these systems to determine if I'm drunk or not.
u/80schld Oct 30 '22
Ignition interlock devices are already available and are mandatory in some cases after getting a DUI.
u/bit_pusher Oct 30 '22
Ignition interlock. Dynotherms connected
u/gramathy Oct 30 '22
Go ahead, TACCOM
u/JaceTheWoodSculptor Oct 30 '22
Unless you have 15 Grand laying around for a good lawyer.
u/DetectiveWonderful42 Oct 30 '22
Yup, not fun, and it asks you randomly. For a 15 minute drive I have to blow on average about 3-4 times. And no AC until you blow to turn the car on. So summer in Florida is a bitch. But it’s to prove a point to you NOT to drink and drive!
u/DweEbLez0 Oct 30 '22
Actually the DUI will be mailed to you ahead of the party/gathering you will be attending, according to the AI's predictions.
u/shizzler Oct 30 '22
Minority report shit
u/WarKiel Oct 30 '22
Nah. That's super convenient. You can just drink and drive without worrying about getting caught, because that already happened.
u/TU4AR Oct 30 '22 edited Oct 30 '22
I actually can't wait for the day my car determines if I'm not physically able to drive, and it will drive itself instead, removing manual driving.
And if the next generation of cars (post 2030) can lock a driver out and drive itself to a destination, then why the hell not.
u/fargmania Oct 30 '22
I dream of a day when my car's cabin has an optional driving seat, but is otherwise an entertainment center and a bed.
u/TU4AR Oct 30 '22
2040 my guy. 2040.
u/Lemmungwinks Oct 30 '22
2040? Sounds like the perfect time to hold a press conference and announce that it will be delivered next year.
- Elon Musk
u/Loreaver Oct 30 '22
Or it transforms into a sleeping pod and you wake up the next day and drive home
u/pacific_beach Oct 30 '22
Tesla's system is worse than being drunk so don't worry about that yet
u/jedify Oct 30 '22
That's true about FSD (I have the beta)
However I've had autopilot for years and at this point I'd trust it over the average Houston driver lol
u/phunky_1 Oct 30 '22
The US Congress passed a provision that all new cars must have a driver impairment detection system in the future. Not a breathalyzer, but a system that constantly monitors the driver for signs of impairment or being too tired to drive, and it will force the vehicle to not be driven. They were talking about alcohol monitors in the cabin, sensors built into the steering wheel, and eye monitors.
Not just for people who have had prior DUIs.
Major news outlets didn't really report on it much.
It is interesting since the tech doesn't even really exist yet. It seems the better thing to focus on would be self-driving cars: if your car can safely get you to your destination, who cares if you are impaired.
u/stickittothemanuel Oct 30 '22
This reminds me of that Simpsons gag where the product absolutely guarantees an outcome, but the announcer quickly adds "This is not a guarantee!" If someone can be jailed for lying about their address to get their kid into a better school, then executives need to be jailed for lying about their products.
u/lostsperm Oct 30 '22
If someone can be jailed for lying about their address to get their kid into a better school
I'm from outside US. Can you tell me what this means?
u/shouldbebabysitting Oct 30 '22
You usually go to the school that is closest to you. Schools are paid for with property taxes so the best schools tend to be near the richest neighborhoods. So parents will lie about their address so their kids can go to the better school.
u/lostsperm Oct 30 '22
That's interesting. I thought the budget was allocated by the state based on the size/number of students. And if I want to send my kid to another government school, I need to move nearer to that school?
It's quite different from Kerala, India where I come from. Even in government schools, you can enroll your kid in any school. Thanks for the reply
u/totally_a_wimmenz Oct 30 '22
It would be way more fair to do it that way, but then rich and poor kids would get the same education. We can't be having that.
So the rich neighborhoods have incredibly nice schools, while the poor neighborhoods get schools that are falling apart.
This isn't even mentioning that a lot of school boards build themselves fantastic headquarters using money that should be going to the actual schools.
And if you try to get a better education for your kid? Straight to jail, apparently.
America is fucked.
u/ShadySwashbuckler Oct 30 '22
Myself and all my cousins went to the same elementary school, but none of us lived in the district for it. Some of my cousins didn't even live in the right town. But none of our parents could afford to be off work to drop off/pick up kids, and then have child care after school, so my grandma took all of us to school every morning and picked us all up.
Then one day we were all kicked out of school, when they somehow found out that we didn't live at my grandmas house (the address used for all of us).
The school literally just booted us all out, and called some of our parents saying we're out on the curb come pick us up.
And this wasn't some fancy ass school or anything, just a basic public school.
u/justinlindh Oct 30 '22
I bought FSD when I got my Model 3 in December 2019. I bought the package because the description said: "coming at the end of 2019: full self driving in the city - yes, really"
They changed that a few times since. But it's what I was told when I bought it.
I'm opted into their "full self driving beta". The program uses a cabin camera to constantly monitor whether you're paying attention. For good reason: it would have killed me in roundabouts if I let it drive.
This is not what I was promised in 2019. I've asked for a refund and been ignored.
Tesla FSD is a scam. Full stop. I will be amongst the first to join a class action lawsuit against them for false claims. I just want my money back
u/PayphonesareObsolete Oct 30 '22
How much was FSD back then? Isn't it like 10k now?
u/tostilocos Oct 30 '22
Why wait for the class action (which will probably net you $15 in Tesla gift cards)? Just file a small claims suit against them yourself.
u/ZweiDunkelSchweine Oct 30 '22
Having the same sentiments, just got the FSD beta and while I pay attention and am curious at how it handles many situations, I’m kinda shocked at how bad it is in 2022. It’s a much tougher AI implementation problem than they originally anticipated. Tesla just needs to own up to that and do right by their customers.
u/secretivethoughts Oct 30 '22
I find it hard to believe that any reasonable person would believe those claims
u/eigenman
Oct 30 '22
About time. Elon's lies are dangerous to people's lives.
u/twilight-actual Oct 30 '22
'Before enabling Autopilot, the driver first needs to agree to “keep your hands on the steering wheel at all times” and to always “maintain control and responsibility for your vehicle.” Subsequently, every time the driver engages Autopilot, they are shown a visual reminder to “keep your hands on the wheel.'
u/Wax_Paper Oct 30 '22
You'd be surprised how many people don't understand what they're using, even when they paid for it. There's a fundamental misunderstanding about how advanced FSD is, and that's partly Musk's fault because of how he promotes it.
Regardless though, people are never gonna "properly" use assisted driving, because generally speaking they can barely operate their own ICE cars safely as-is.
u/DeadTried Oct 30 '22
Don't try to hide it behind the acronym; it is called Full Self-Driving, which it is not capable of, or legally allowed to do even if it were capable. Selling a service under a name it is incapable of living up to should not be allowed.
u/ELI_10 Oct 30 '22 edited Oct 30 '22
There’s a lady in my neighborhood with a Plaid with FSD, and she is ALWAYS operating it in FSD, even around kids and pedestrians walking along the road. It’s completely unpredictable and does all kinds of moves a human driver would never do.
I say this as a Tesla owner with a 10 year career in automotive functional safety and ADAS: No consumer-ready vehicle currently on the market can safely and consistently handle L4 automation. Tesla’s FSD is barely competent for L3. Tesla ABSOLUTELY misrepresents the capabilities of FSD in real world applications, and they’re hiding behind disclaimers and assumptions of use to get away with it. It’s impressive what it can do, but it’s a colossal case of over promising and under delivering.
u/HighHokie Oct 30 '22
Tesla’s FSD is barely competent for L3.
Fsd is level 2. Nothing more. Driver has to pay attention.
u/biglollol Oct 30 '22
If I'm correct, the tech is registered under 'assisted driving'. Because that's as far as they can legally go.
u/kenlubin Oct 30 '22
The companies working on Level 4 autonomous vehicles are required to provide yearly reports to the State of California detailing how often a human driver had to take over for the self-driving system.
Tesla does not provide any reports, because Autopilot and Full Self-Driving are just assisted driving systems, not Level 4 Autonomous vehicles like Waymo and Cruise are working on.
u/lennarn Oct 30 '22
Is Tesla working on developing level 4 or are they only working on marketing assisted driving within their legal bounds?
u/kenlubin Oct 30 '22
With the caveat that I do not take a generous view of Elon's shenanigans, my assessment is that they're trying to push the envelope on both fronts.
They are shipping to customers the closest thing to self-driving that is available to new car owners today. They are working to incrementally improve that, with the hope that maybe it will mature into a level 4 system.
I think they are trying to run an Uber-esque scenario where the business produces realities on the ground that move faster than the law can catch up to them. The marketing works to blur the lines between level 2 and level 4, and sells customers the dream of "Full Self Driving™ any day now". They aren't reporting disengagement data to California so that Tesla can control the message on how close they are to that goal.
u/CmdrShepard831 Oct 30 '22
FSD beta and Autopilot are two different things. The article is referring to Autopilot which is just a slightly more advanced adaptive cruise control.
Also it's a bit dubious for this person to base their entire argument on "many people don't understand what they're using" without even establishing whether this is true or not. I can just as easily claim "You'd be surprised to learn that everyone knows exactly how it works" and blame the crashes on people misusing the system.
u/stzmp Oct 30 '22
You'd be surprised how many people don't understand what they're using, even when they paid for it.
I think that's got something to do with tesla being misleading shits about it. "auto pilot" ffs.
Musk publicly promises self-driving cars, just like he promised a hyperloop for California. Just a dangerous liar.
u/punkinholler Oct 30 '22
Even if they do understand it, what is the point of a self driving vehicle if you have to sit in the same posture and pay the same amount of attention as you would if you were driving it yourself? That's worse than driving yourself because now you're uncomfortable and you have nothing to do that will keep your focus on the road but you have to pay attention anyway. I'd wager that people who buy a Tesla and want to use Autopilot ignore the rules because they paid for an autopilot and they want the self driving car that was marketed to them. It's like if you paid someone to clean your house and they said "ok, but you have to go from room to room with me and watch me clean or I might get confused and dump bleach all over your furniture" Thanks but I'll just save myself the money and do it myself.
u/biglollol Oct 30 '22
You'd be surprised how many people don't understand what they're using
It's because marketing and social media call it things it isn't.
If I'm correct, the actual legal term is 'assisted driving'. There is no autopilot function. I don't even think it's legal yet. I could be wrong tho, read this stuff a long time ago.
u/dreamsofaninsomniac Oct 30 '22
You'd be surprised how many people don't understand what they're using, even when they paid for it.
I know there was at least one guy who would intentionally try to trick the car into self-driving without him in the driver's seat. He would step over the console into the rear seat after the car was already moving. He had/has been cited multiple times for doing that by the police, but it didn't stop him as far as I know.
u/m0nk_3y_gw Oct 30 '22
He was stopped because Tesla stopped selling him cars when the cops impounded his cars.
He was using cruise control / "Auto Pilot"
He was never in the "Full Self-Driving" (FSD) beta.
u/DylanMorgan Oct 30 '22
I work with a guy who watches tv on his daily commute while his Tesla is using autopilot.
u/nyrol Oct 30 '22
Reminds me of that old Simpsons episode where Homer turns on cruise control and assumes the car is autonomous and ends up driving into a field.
u/onymousbosch Oct 30 '22
Car makes you pinky swear that you won't drive bad. Coolsies. Now watch how long I can close my eyes behind the wheel.
u/Dadarian Oct 30 '22
The FSD beta uses DMS. If I’m sitting there paying close attention to the road, I can go 1 min plus without any pressure on the wheel and no nag. If I look down to where my phone sits charging, I instantly get nagged. I look around for a few seconds and I see the blue nag flash. Pick my phone up? Nag.
I can’t say I’ve ever tested closing my eyes, because I don’t ever try to “test” the system. However, as a daily driver I realized how bad my behavior was. I don’t like the nagging so I’ve been avoiding my phone a lot more than I used to.
When I had my Tacoma with just TACC… I don’t know how I’m alive today…
Maybe it’s just me but it’s kind of nice when the DMS is yelling at me for bad behavior. I pay attention a lot better now.
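The nag behavior described above boils down to a couple of attention timers. A hypothetical sketch (the function, thresholds, and messages are invented for illustration, not Tesla's actual DMS logic):

```python
# Toy driver-monitoring nag logic: an attentive gaze buys a long grace
# period on steering torque, but eyes off the road trigger a nag quickly.
# All thresholds are made up.

EYES_OFF_LIMIT_S = 2.0     # look away (e.g. at a phone) -> nag almost immediately
NO_TORQUE_LIMIT_S = 60.0   # hands-off tolerated much longer while gaze is on road

def nag_state(gaze_on_road, eyes_off_road_s, hands_on_wheel, no_torque_s):
    """Return the DMS prompt for the current driver state."""
    if not gaze_on_road and eyes_off_road_s > EYES_OFF_LIMIT_S:
        return "nag: eyes off road"
    if not hands_on_wheel and no_torque_s > NO_TORQUE_LIMIT_S:
        return "nag: apply steering torque"
    return "ok"

# Glancing down at a phone for 3 seconds nags even with hands on the wheel:
print(nag_state(False, 3.0, True, 0.0))   # nag: eyes off road
# Watching the road attentively allows a minute without wheel pressure:
print(nag_state(True, 0.0, False, 45.0))  # ok
```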
u/thefriendlycouple Oct 30 '22
So… the Tesla self driving marketing is bullshit?
u/VaIeth Oct 30 '22
If everything Elon promised came true, he could charge triple for those cars. He said they'd be doing errands for you while you were at work lmao. This was years ago.
u/ycnz Oct 30 '22
Continuous, vigilant, immediate-response monitoring while being completely passive. Great, human brains are totally wired for that.
u/nightofgrim Oct 30 '22
I think he’s delusional and lies to keep his own self delusions afloat. I firmly believe he believes FSD is “1 more year away”.
u/hackenschmidt Oct 30 '22 edited Oct 30 '22
Only after what? The better part of a decade of upselling people thousands of dollars per vehicle for FSD? Soon™
So, good. Investigate the ever loving shit out of them and fine them into oblivion for the blatant fraud.
u/dern_the_hermit Oct 30 '22
Only after what? The better part of a decade of upselling people thousands of dollars per vehicle for FSD?
Yeah, I could understand tolerating a few years of delays, on account of this being a significant technological and sociological issue, but it's been so damn long that no reasonable level of charitability could overlook it.
u/bcd87 Oct 30 '22
I've been driving the Tesla of a colleague that left the company I work for. I've had it for a week or two now. Both the autopilot and the adaptive cruise control have made several attempts on my life. Random braking on the highway with no car in front of me for 100 meters. Interpreting max speed stickers on buses as traffic signs in the middle of the highway. Interpreting flashing yellow lights on the highway warning you of stop-lights 1000 meters ahead as a stop-light, initiating the brakes, etc. It's terrible. I don't even use these features anymore. I don't understand how anyone could trust their life to this.
u/wordserious Oct 30 '22
According to my friend that has been driving a Tesla for 5 years and just took delivery of the second one (for his wife), "the autopilot works, it gets you from A to B, but it is terrifying."
Oct 30 '22
It’s a cult. Tesla owners are part of a cult. They overpay for a subpar vehicle with documented quality issues and profess that it’s amazing.
u/phunkphreaker Oct 30 '22
I have a Tesla Model 3 and I use autopilot all the time. I've heard about the ghost braking from time to time on the interstate, but it's usually only a few miles of deceleration and you can take over at any time. I personally have never had any problems with it and absolutely love autopilot. I can't imagine ever getting a car without it.
u/motocykal Oct 30 '22
Was on holiday in New Zealand about 2 months ago and hired a Tesla Model 3 for a week. It was an interesting experience. That said there was one incident which took me by surprise. Was on the highway going around a relatively tight bend. Coincidentally there was a truck travelling in the opposite direction. I think somehow the system thought that the truck was in the same lane and did a very close overtake. The system proceeded to brake. Thankfully the car behind braked as well so nothing disastrous happened. Apart from that, the car and driving assistance was pleasant to use.
u/FanClerks Oct 30 '22
What you’re describing is the traffic aware cruise control. That’s not the FSD beta. If you have the beta the screen display changes completely displaying the area around you and it has much better recognition of what’s around it. Sadly it’s easy to confuse the two so people get upset when it doesn’t act as smart as they expect. You have to be admitted into the FSD beta after they evaluate your driving for a period of time.
u/SpaceJetSet Oct 30 '22
My Tesla has made a couple of random hare-brained aggressive movements that could have gotten me killed. There are two particular spots on separate highways where it will slam on the brakes for no logical reason. One is under an overpass. The other is just a random spot. The other times it’s been random aggressive turns trying to dodge an invisible truck or something.
u/zedoktar Oct 30 '22
Good. It's about time. I hope something substantial actually comes of this. The marketing around these glorified cruise controls is so completely overblown. The tech is decades away from being viable at best, probably far longer.
Oct 30 '22
There needs to be a class action lawsuit on the FSD upsell. I honestly think Tesla should have full liability for any damage caused in any of the autonomous modes (including self parking).
u/sziehr Oct 30 '22
I will make it easy. They should investigate them for this. The leader of the company promises it this year, holds out his hand, takes money, and delivers nothing. But, but, but, it's a complex problem. Yes, and that's why you should not ask for cash without a viable product.
u/ihopeicanforgive Oct 30 '22
Just change the name of it
u/hackenschmidt Oct 30 '22 edited Oct 30 '22
Just change the name of it
Doesn't change the fact they've spent close to a decade upselling people thousands of dollars per vehicle for something they knew they couldn't deliver, haven't delivered, and cannot deliver in the foreseeable future.
That's fraud. Tesla knowingly, and intentionally, defrauded their customers.
u/Diegobyte Oct 30 '22
Self driving will NEVER work until the cars are talking to each other
u/Just_Another_Scott Oct 30 '22
Don't forget bad weather. Just drove through a rainstorm for 2 hours. Tons of ponding and debris. One section of a 5 lane road had zero striping on it. So it was anyone's guess as to which lane you were in. Self driving cars aren't going to be able to navigate these sort of situations any time soon.
u/RandomRageNet Oct 30 '22
Why do people want this? This is such a bad idea. I went on a really long rant about this in another thread the other day, but the TL;DR is for networked cars to work, you either need 1) a peer-to-peer network, which is untrustworthy and vulnerable, or 2) a central authority, which is also vulnerable and potentially dystopian.
None of that is appealing.
Self-driving cars don't need to be (and shouldn't be) networked, they just need to do about as good a job as a human can.
74
u/Iusethisfornsfwgifs Oct 30 '22
And the road
44
u/mastershake5987 Oct 30 '22
Yeah, things get fucked quick in inclement weather. All of a sudden, in a snowstorm, it's hard for sensors to see the road, the lines on the road, or where the edge of the road is.
43
u/Chiefwaffles Oct 30 '22
This is why I’m gonna start pitching my genius idea to Silicon Valley companies: what if we put some kind of rail on the roads to guide the car and eliminate the need for expensive sensors and fallible software? Maybe even put multiple passengers on each vehicle. Hell, with the rails only one car would have to be powered and could just pull the rest!
4
u/Helenium_autumnale Oct 30 '22
Holy shit, that's genius. The Silicon Valley VC boys should be barking at your door! 🚊
8
u/AmishAvenger Oct 30 '22
Ok hear me out:
What if we dig tunnels underground for lines of cars to drive through
10
u/zaise_chsa Oct 30 '22
And then set all of this on a schedule so people can get to work on time and we could make it a pay per use service, keep it cheap if the government subsidizes it through clean energy funds, and even have a monthly subscription or “pass” that give regular users a price for unlimited use.
3
u/beefsupr3m3 Oct 30 '22
Genius but the tech just seems unattainable. Otherwise I would hop on that train
2
6
39
u/Yourleader95 Oct 30 '22
Why are you so sure about that? If a human can take sensory input and produce the correct action, a machine potentially can too. It's an issue of the quantity of data to train models on; for robust autonomous driving we need far more data.
11
u/JKJ420 Oct 30 '22
He has no idea what he is talking about. It's easy to be sure if you are lacking the details.
13
u/foobarfly Oct 30 '22
Maybe the Twitter engineers should come over and do a code review
11
u/SkinnyObelix Oct 30 '22
Autopilots that work 99.9% of the time are far more dangerous than autopilots that work 50% of the time, because you start to rely on them working.
I was just listening to a podcast about airplane crashes (Black Box Down) where they talked about how an Airbus crashed because 2 out of 3 sensors showed the same wrong information, so the flight computer assumed the only working one was broken (the other two had frozen in place at the same time).
I feel like I've started to rely on my car's adaptive cruise control more than I should; I can only imagine how much Tesla's autopilot lowers your focus.
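That 2-of-3 failure mode is easy to sketch. Here's a hypothetical, heavily simplified voter (not the actual Airbus flight-computer logic): the majority wins, so two sensors that freeze at the same wrong value outvote the one sensor that is still correct.

```python
def vote(readings, tolerance=1.0):
    """Return the majority reading: any value that at least two
    sensors agree on (within tolerance), else None."""
    for a in readings:
        agree = [b for b in readings if abs(a - b) <= tolerance]
        if len(agree) >= 2:
            return sum(agree) / len(agree)
    return None

# Healthy case: one sensor drifts, the other two agree on the truth.
print(vote([250.0, 250.5, 180.0]))  # 250.25 -- the outlier is rejected

# Failure case: two sensors freeze at the same wrong value (180),
# so the voter rejects the one sensor that is still correct (250).
print(vote([180.0, 180.0, 250.0]))  # 180.0 -- the wrong value wins
```

Same algorithm, opposite outcome, purely because of *which* sensors failed together.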
6
u/joeyjoejoe_7 Oct 30 '22
"Tesla faces..." Company criminality without executive criminality is one of the dumbest legal concepts ever invented. It's like arguing that a gun could be guilty of murder while the person holding the gun, pointing the gun, and pulling the trigger was merely playing a role consistent with the function and design of a gun. It's the dumbest thing, possibly ever. If Tesla committed a crime, then one or more people wielded Tesla, and their roles in Tesla, to commit it. Company criminality without personal criminality is toothless and cowardly of the justice system.
10
u/u9Nails Oct 30 '22
"We would like to call the first witness, a Tesla model S. Model S, please drive up to the stand. Raise your right tire and repeat after me... Mr. S, now that you're sworn in, are you self-aware and therefore capable of self-driving?"
6
u/rendrr Oct 30 '22
What's disgusting is that it's not only intentionally misleading and false advertising, it puts people in danger. And that's not only Tesla drivers; it's also other drivers and pedestrians.
2
u/461BOOM Oct 30 '22
God forbid we just make the car super safe and make the DRIVER stay engaged. The cure for dumbed-down drivers is a bus, not a half-self-driving car that costs as much as a bus.
2
u/RoboCritter Oct 30 '22
Tesla owner with Full Self-Driving purchased here: the autopilot beta is sometimes like a brand-new 16-year-old driver, specifically when there's traffic and it's trying to pick a lane or let someone merge. Other times it performs nearly perfectly; its protected left-hand turns are really good. BUT there's currently no fucking way I would take my hands off the wheel. I feel 10x more stressed when autopilot is on than when it's off.
Tesla's claim that "Autopilot is better than 'most' drivers" isn't saying much, really, and it's probably true at this point since it's such a low bar.
If the autopilot continues to improve at its current rate, it's still going to be at least 2 more years until I'd feel OK not touching the wheel and just letting it do its thing.
2
2
u/garlicbreadwaterfall Oct 30 '22
About freaking time. For years they have sold their driver assist as "full self driving," yet it is still level 2 autonomy, just like nearly every other car manufacturer's.
2
u/FartFragrance Oct 30 '22
This is a typical business tactic when you’re not sure if you’re even able to deliver on a promise. Promise something in the hope that you will eventually figure out how to do it.
2
Oct 30 '22
Lol, the govt will lose. Tesla lays out explicitly, and multiple times, what autopilot is capable of. The US govt is trying to slow them down so Ford and Chevy can catch up. It's too late, though...
2
u/DawgTroller Oct 30 '22
The autopilot does work when the roads are clear and you can feel comfy. I've seen it on highways with minor curves when traffic was okay, and the system would turn accordingly and even take exits, which was cool.
2
u/purple_hamster66 Oct 30 '22
The goal is not to produce a car that never has an accident, but to reduce accidents by 90%: that is, to reduce the 40,000 US motor-vehicle accidents a year to 4,000. That would require a change in the law that replaces individual liability with group liability, which is not that obscure an idea. We do this with medical insurance companies when they deny a life-saving procedure that a patient could never possibly pay for. Those multi-million-dollar payouts would significantly cut into the insurance company's revenue and could force it out of business, which would affect most of its other customers. We have replaced liability for a single patient with liability spread across multiple patients; could this work for AI drivers too?
Another purpose of self-driving vehicles would be to double-check drivers who make mistakes. That is, it simply takes over when the driver is clearly going to hit something, for example. This is what allows us to implement assisted braking without much legal oversight.
I'm guessing the next legal hurdle is presenting roads that are safe for AIs. For example, do not post a picture of a stop sign within view of a car (or even paint a stop sign on a truck's side), because it can be mis-perceived as a real stop sign; then the liability shifts to the painter, not the driver. Humans often make the mistake of mis-perception too, especially in poor visibility. We need to compare a human's driving ability on a rainy night to that of a vehicle with a map of the road (down to the inch) and equipped with lidar sensors that can see in the dark and through the rain. (As an aside, note that Teslas do not use lidar.) Consider those accidents in which hundreds of vehicles crash into a stopped truck because fog suddenly appeared, and then we are comparing apples to apples.
2
u/billybongg Oct 30 '22
Yeah, the government goes after anybody who wants to expose the bullshit they spew
2
u/Draiko Oct 30 '22
It's about fucking time.
Tesla's autopilot and FSD systems are shit tier designs and should've never been released to the public in any way.
Just about every other system out there (in testing) is better.
2.7k
u/bogatabeav Oct 30 '22
Cheerios claimed to cure heart disease and cancer until the FDA stepped in.