- cross-posted to:
- [email protected]
- [email protected]
- [email protected]
- [email protected]
Tesla reportedly asked highway safety officials to redact information about whether driver-assistance software was in use during crashes. Elon Musk’s Tesla has faced investigations into Autopilot, including an ongoing NHTSA probe of more than 800,000 Teslas after several crashes.
Tesla’s business model is pushing half-baked sloppy software into production and putting lives in danger.
Don’t you agree to stay alert and in control while using a Tesla, though? I don’t mean to make excuses, but isn’t that why they’re allowed to deploy this software?
Tesla advertises their cars as “our cars drive themselves”.
Fair point!
Since everybody knows that problems just magically disappear when their consequences are hidden.
Especially really smart businessmen!
The original New Yorker article is an incredible read. I recommend it
Was it this one?
https://www.newyorker.com/magazine/2023/08/28/elon-musks-shadow-rule
Yes. Ronan Farrow is an excellent journalist
Is it? Admittedly, when I was reading it the first time I had to skim a lot because I was in a rush and had to put my phone away.
I want to get back to it; in fact I will now, since it’s my lunch break. But my first thought was that it was kinda tame. Yeah, the stuff about the government not liking its dependence on his services was good, but General Milley sounded like they got along great. It was weird.
This is the best summary I could come up with:
Tesla directed the National Highway Traffic Safety Administration to redact information about whether driver-assistance software was being used by vehicles involved in crashes, The New Yorker reported as part of an investigation into Elon Musk’s relationship with the US government.
"The Vehicle Safety Act explicitly restricts NHTSA’s ability to release what the companies label as confidential information.
Autopilot, which is meant to help on highways, is the driver-assist software built into all Teslas, while Full Self-Driving is a $15,000 beta add-on.
Full Self-Driving is more advanced, and allows cars to change lanes, recognize stop signs and lights, and park, Tesla says.
The agency later announced another investigation, into Autopilot, after identifying 11 crashes since 2018 in which Teslas hit vehicles at first-responder sites.
A Department of Justice criminal investigation has also been underway, with Tesla confirming in February that the DOJ requested documents about the Autopilot and Full Self-Driving features.
The original article contains 446 words, the summary contains 155 words. Saved 65%. I’m a bot and I’m open source!
I don’t see the problem for Tesla. Regardless of whether Autopilot is active, the driver is responsible.
The driver is fully responsible, but Tesla is also making big bucks with misleading marketing about how good its driver assistance is. So it’s more profitable to keep people unaware of its actual capabilities.
Am I confusing Autopilot and FSD?
There has been little progress in the Autopilot software, while FSD is much more feature-rich.
Most mid- to high-end car manufacturers are making money on L2 driver assistance.
Legally, yes, but in my opinion, when you market your car as self-driving, you share a certain level of responsibility if it self-drives into an accident.
Until they actually aren’t:
https://m.youtube.com/watch?v=6Kf3I_OyDlI&pp=ygUYdGVzbGEgc2VsZiBkcml2aW5nIGZhaWwg
Here is an alternative Piped link: https://piped.video/watch?v=6Kf3I_OyDlI&
Piped is a privacy-respecting open-source alternative frontend to YouTube.
I’m open source; check me out on GitHub.