- cross-posted to:
- [email protected]
- [email protected]
Judge finds ‘reasonable evidence’ Tesla knew self-driving tech was defective::Ruling clears way for lawsuit brought against company over fatal crash in 2019 in which Stephen Banner was killed near Miami
With two offset cameras, depth estimation is reliable, especially when a wide-angle and a narrow-angle lens are offset from each other. This is what OpenPilot does with the Comma 3 (FOSS self-driving).
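For a rough sense of how two offset cameras recover depth, here is a minimal sketch of the standard pinhole stereo relation (depth = focal length × baseline / disparity). The focal length and baseline below are made-up numbers for illustration, not Comma 3 specs or OpenPilot code.

```python
import numpy as np

# Pinhole stereo model: Z = f * B / d, where
#   f = focal length in pixels, B = baseline between the two cameras (m),
#   d = disparity (horizontal pixel shift of the same point between the two images).
# These values are assumptions for illustration only.
FOCAL_PX = 910.0      # focal length in pixels (assumed)
BASELINE_M = 0.12     # distance between the two lenses in meters (assumed)

def depth_from_disparity(disparity_px: np.ndarray) -> np.ndarray:
    """Convert a disparity map (pixels) to metric depth (meters)."""
    d = np.clip(disparity_px, 1e-3, None)   # guard against zero disparity
    return FOCAL_PX * BASELINE_M / d

# A point shifted 5 px between the two views sits about 22 m away with these parameters.
print(depth_from_disparity(np.array([5.0])))   # -> [21.84]
```

The takeaway is that depth precision degrades as disparity shrinks, which is why distant objects are where stereo gets noisy.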
Radar is better, but some automotive radar seems to only be great at short ranges (from my experience with my fork of OP in combination with radar built into a vehicle).
What if one of them is dirty? What if you are driving with the sun right in front of you? What about a foggy winter day? The big problem here isn’t even what the cameras are or aren’t capable of, but that there is little to no information on all the situations Tesla’s autopilot will fail in. We are only really learning that one deadly accident at a time. The documentation of the autopilot’s capabilities is extremely lacking; it’s little more than “trust me bro” and “keep your hands on the wheel”.
The fact that it can’t even handle railroad crossings and thinks trains are a series of trucks and buses that blink in and out of existence and randomly change direction does not make me want to blindly trust it.
Do you work in the field? Sun/fog/etc are all things that can be handled with exposure adjustments. It’s one place a camera is more versatile than our eyes.
All that being said, my experience is from indirect work on OpenPilot, not from Tesla, so it’s a system that isn’t commonly used by the average person and doesn’t make claims of commercial FSD.
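To illustrate the exposure-adjustment point above, here is a minimal sketch of a proportional auto-exposure loop. The target brightness, step limits, and exposure range are assumptions for illustration, not values from any real camera stack.

```python
import numpy as np

def adjust_exposure(frame: np.ndarray, exposure_ms: float,
                    target_mean: float = 110.0,
                    min_ms: float = 0.05, max_ms: float = 30.0) -> float:
    """Return the exposure time (ms) to use for the next frame.

    Proportional control: scale exposure by the ratio of the desired mean
    brightness to the measured mean, clamped to an assumed sensor range.
    """
    mean = float(frame.mean()) + 1e-6          # avoid division by zero
    ratio = target_mean / mean
    ratio = float(np.clip(ratio, 0.5, 2.0))    # limit the step size per frame
    return float(np.clip(exposure_ms * ratio, min_ms, max_ms))

# Example: a frame washed out by direct sun (mean ~220) drives exposure down.
bright_frame = np.full((720, 1280), 220, dtype=np.uint8)
print(adjust_exposure(bright_frame, exposure_ms=5.0))   # ~2.5 ms
```

This is the basic idea behind handling glare and low contrast: the camera can trade exposure time for dynamic range frame to frame, faster than a human pupil adapts.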
No it’s not. The world is filled with optical illusions that even our powerful brains can’t process, and yet you expect two webcams to handle them. And depth is not the only thing that’s needed when it comes to autonomous driving; accurate distance is an absolute requirement. Case in point: two bikers (if not more) were killed because they had two tail lights instead of one, and Tesla thought it was a car far away instead of a motorcycle close by. It ran them over as if they were not there. We as humans would see this rider and realize it’s a motorcycle, first because of sound and second because our brain is better at reasoning, and we’d avoid the situation. This is why cars MUST have more sensors: because the processing is lacking so much.
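To make the tail-light confusion concrete, here is a small sketch of the geometry. A single camera only measures the angle between the two lights, so with the assumed (not measured) spacings below, a close motorcycle and a distant car can subtend exactly the same angle.

```python
import math

# The angular separation between two tail lights is roughly
# atan(lateral spacing / distance). A camera alone sees only the angle,
# so a narrow pair of lights up close can look identical to a wide pair far away.
# The spacings are rough assumptions for illustration.
CAR_LIGHT_SPACING_M = 1.5    # typical car tail-light spacing (assumed)
BIKE_LIGHT_SPACING_M = 0.3   # two lights on one motorcycle (assumed)

def angular_separation_rad(spacing_m: float, distance_m: float) -> float:
    """Angle subtended by two lights spaced spacing_m apart at distance_m."""
    return math.atan2(spacing_m, distance_m)

# A motorcycle at 8 m and a car at 40 m subtend the same angle,
# so a camera-only system needs extra cues (or sensors) to tell them apart.
print(angular_separation_rad(BIKE_LIGHT_SPACING_M, 8.0))   # ~0.0375 rad
print(angular_separation_rad(CAR_LIGHT_SPACING_M, 40.0))   # ~0.0375 rad
```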