Judge finds ‘reasonable evidence’ Tesla knew self-driving tech was defective: Ruling clears way for lawsuit brought against company over fatal crash in 2019 in which Stephen Banner was killed near Miami
What if one of them is dirty? What if you’re driving with the sun right in front of you? What about a foggy winter day? The big problem here isn’t even what the cameras are or aren’t capable of, but that there is little to no information on all the situations Tesla’s autopilot will fail in. We are only really learning that one deadly accident at a time. The documentation of the autopilot’s capabilities is extremely lacking; it’s little more than “trust me bro” and “keep your hands on the wheel”.
The fact that it can’t even handle railroad crossings and thinks trains are a series of trucks and buses that blink in and out of existence and randomly change direction does not make me wanna blindly trust it.
Do you work in the field? Sun, fog, etc. are all things that can be handled with exposure adjustments. It’s one area where a camera is more versatile than our eyes.
All that being said, my experience is from indirect work on OpenPilot, not Tesla: a system that isn’t commonly used by the average person and doesn’t make claims of commercial FSD.
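For anyone curious what “exposure adjustments” means in practice, here’s a minimal sketch of a proportional auto-exposure loop, assuming a camera whose exposure OpenCV can set manually. The target luminance and gain constants are illustrative, not from any production system, and the `CAP_PROP_AUTO_EXPOSURE` value needed for manual mode varies by driver:

```python
import cv2

TARGET_MEAN = 118   # mid-grey target luminance (0-255); illustrative value
GAIN = 0.05         # proportional step size; would be tuned per sensor

cap = cv2.VideoCapture(0)
# Switch to manual exposure; the magic value is driver-dependent
# (1 works on many V4L2 drivers, others expect 0.25).
cap.set(cv2.CAP_PROP_AUTO_EXPOSURE, 1)
exposure = cap.get(cv2.CAP_PROP_EXPOSURE)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Measure scene brightness on the luma channel.
    luma = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    error = TARGET_MEAN - float(luma.mean())
    # Proportional controller: sun glare drives the mean up, so exposure
    # gets pulled down; a dim foggy scene pulls it back up.
    exposure += GAIN * error
    cap.set(cv2.CAP_PROP_EXPOSURE, exposure)
```

A real driving stack would be far more involved (region-of-interest metering, HDR bracketing, per-camera calibration), but the control idea is the same: the sensor is actively steered toward usable contrast instead of being stuck at one setting the way a human eye can be when staring into low sun.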