Looking at it another way: we’re all guinea pigs if we consider untested public policy that should work in theory.
Self-driving is not untested, but the problem is that, deep down, AI is just a lot of statistically derived rules, and life is random and will inevitably find a loophole. Statistically it’s still less likely to kill you on average, maybe even if you exclude drunk driving, street racing, and the like.
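Roughly what I mean by “statistically derived rules,” as a toy sketch (every number and label below is made up for illustration): a classifier fit to past data has no built-in notion of “I don’t know,” so an input it has never seen still gets forced into one of the known categories.

```python
# Hypothetical toy example: a nearest-centroid "rule" learned from past data.
# The two features are invented (say, apparent density and rigidity of a
# sensor return); real systems just have vastly more of them.

def nearest_centroid(point, centroids):
    """Pick whichever class centroid is closest - a purely statistical rule."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist2(point, centroids[label]))

# "Training" here boils down to averaging past observations per class.
centroids = {
    "wall":      (0.9, 0.9),  # looks dense, looks rigid
    "open_road": (0.1, 0.1),  # looks sparse, not rigid
}

# A sheet of cardboard: fairly dense-looking but not rigid. The rule was
# never given a third option, so it has to pick a side - that's the loophole.
print(nearest_centroid((0.85, 0.2), centroids))  # -> wall
```

Swap in millions of parameters instead of two centroids and the failure is the same in kind, just much harder to predict.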
It’s really a philosophical question: would you take dying by your own fuck-up over dying because an AI confused a piece of cardboard for a brick wall or pedestrian?
I think the sweet spot is having the AI back up the human instead of the other way around, but that won’t sell as well as reading a book on your commute.
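For what it’s worth, the “AI backs up the human” arrangement is simple to sketch (the threshold and names below are hypothetical, not any shipping system): the human’s input passes through untouched unless the system predicts an imminent collision.

```python
# Hypothetical sketch of human-drives, AI-intervenes arbitration.

def arbitrate(human_throttle, time_to_collision_s, min_ttc_s=1.5):
    """Return the command actually sent to the drivetrain."""
    if time_to_collision_s < min_ttc_s:
        # The AI only acts when the human is about to hit something.
        return {"throttle": 0.0, "brake": 1.0, "source": "ai_override"}
    # Otherwise the human stays fully in control.
    return {"throttle": human_throttle, "brake": 0.0, "source": "human"}

print(arbitrate(0.4, time_to_collision_s=8.0)["source"])  # -> human
print(arbitrate(0.4, time_to_collision_s=0.9)["source"])  # -> ai_override
```

This is basically what automatic emergency braking already does; the pitch is just far weaker than “read a book while it drives.”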
> It’s really a philosophical question: would you take dying by your own fuck-up over dying because an AI confused a piece of cardboard for a brick wall or pedestrian?
It’s a “philosophical question” that implies that we must choose between manual driving and AI driving that can be confused by a piece of cardboard.
There’s nothing saying that Tesla’s full self driving is something we have to accept. Musk himself artificially limited the solution by disallowing lidar (amongst other bullshit decisions).
We’re not at the point of philosophical questions yet, IMO, and we shouldn’t get locked into the false dichotomy of manual driving or Musk’s version of automatic driving when there are other, much safer and more reasonable solutions, both inside automobiles and in alternatives such as expanded public transit.
The process of it becoming good enough will be more gradual, with corporate interests lobbying for whatever is most marketable at the time. There will be no singular convention on driving where the philosophical question at the root of policy gets resolved. Research will show new drivers benefit most, so at first first-time drivers might be obliged to have some AI backup, and then there will be incremental movement whenever the political climate is favorable.
Not to go off on a tangent, but I think gun control is a useful example of what I’m talking about: it’s so easy to make people fight bitterly over minutiae while ignoring the core philosophical questions entirely (government monopoly on violence, civilian relationship to government, civilian disarmament). An earnest discussion of those would likely be more disruptive to either overarching agenda than losing any court case, by calling other policy into question - like militarized police who do not even see themselves as civilians anymore.
So nVidia releases a better self-driving AI than Tesla, and everyone is comfortable with letting it drive on the highway for you. Each step will be fairly uncontroversial until at some point we’re all comfortable with the thing and someone wants to make it mandatory only for some small segment of drivers, which itself will not draw much controversy because classic non-AI cars with manual transmissions and such will only be in the realm of enthusiasts and collectors.
> Not to go off on a tangent, but I think gun control is a useful example of what I’m talking about: it’s so easy to make people fight bitterly over minutiae while ignoring the core philosophical questions entirely (government monopoly on violence, civilian relationship to government, civilian disarmament),
All I see online is people (poorly) discussing these questions. The government doesn’t have a monopoly on violence or even legal violence, but someone always brings it up. If you really think this junk isn’t being discussed, I’d like to know what you’re even reading.
> Each step will be fairly uncontroversial until at some point we’re all comfortable with the thing and someone wants to make it mandatory only for some small segment of drivers, which itself will not draw much controversy because classic non-AI cars with manual transmissions and such will only be in the realm of enthusiasts and collectors.
Yeah, see, here you go again, arguing it’s inevitable that we’ll have to accept some junky AI driving with tradeoffs.
It’s obviously the view of auto manufacturers (and especially auto manufacturers that masquerade as tech companies) that this is the future of transit for people, but who knows what the future holds?
They’ve been at it for more than a decade and it still sucks.
As someone who works on AV SW for a living: it’s really not a big deal, assuming you’ve got certain limits already in place.
However, unlike Tesla, we’re not just handing this out to random people who clicked “I agree” on the screen. We’ve got tons of dedicated training and have to demonstrate we can react to stuff and take over under worst-case conditions, and take incidents like this really seriously.
It’s funny that he says “Oh, this is why it’s not released to the public,” because I did some driving with a Model 3 on the latest version of FSD within the last few weeks, and in a one-hour drive had plenty of “Oh shit” moments like this. So yeah, they’ll totally release garbage like this to the public, no doubt about it.
Why is it on any car, even his, on public roads? Why should untested, unregulated software be controlling thousands of pounds of metal at all?
Because he has more money than you, which gives him more rights than you.
Removed by mod
Subscription based democracy?
Only sign up for the laws you want, with a quid pro quo/golden rule stipulation?
Removed by mod
“Lay thine hands upon yur boomstick”
Don’t forget dying because of someone else’s fuck-up.
Yes, good point. Ends-justify-the-means arguments basically come down to people trusting some rigid system more than the humans around them, even if it means losing control and being helpless themselves.
Simulations show people will get out of your way 95% of the time, and it avoids nasty traffic jams.
Sure, and we do hundreds of thousands of miles of simulation on each SW build before it’ll be okayed for even driving on site, then it has to pass additional tests before it’s allowed on public roads with a test driver.
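The staged gating described above can be thought of as a simple pipeline (the stage names and mileage threshold here are made up for illustration, not our actual process): each environment is a gate a build must clear before reaching the next.

```python
# Hypothetical sketch of staged release gating for an AV software build.

REQUIRED_SIM_MILES = 100_000  # made-up threshold

def release_stage(sim_miles, sim_passed, closed_course_passed):
    """Return the furthest environment a build is cleared for."""
    if sim_miles < REQUIRED_SIM_MILES or not sim_passed:
        return "blocked"                       # failed or insufficient simulation
    if not closed_course_passed:
        return "on_site_only"                  # may drive on the test site
    return "public_roads_with_test_driver"     # cleared for supervised road use

print(release_stage(250_000, True, False))  # -> on_site_only
print(release_stage(250_000, True, True))   # -> public_roads_with_test_driver
```

The point is that each environment is a gate, not that any particular threshold is the right one.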
Leaks from SW development are not encouraging fans.
The software is probably better tested than a good percentage of human drivers on the road in America, and definitely a better driver than some subset of that group. Good 'nuff, right?
(But seriously, get this crap off the streets along with the people who shouldn’t have licenses.)