this post was submitted on 10 Jun 2023
56 points (100.0% liked)

More or less: Tesla's Autopilot is not as safe as Tesla would have you believe.

top 30 comments
[–] iam8bitwolf@beehaw.org 20 points 1 year ago (3 children)

They're running a beta test on the general public - only the thing they're testing is a two-ton ball of metal and explosive material regularly traveling at 45 mph (about 70 km/h). They even have the gall to charge for the ability to beta test it. I really hope this gets regulated at some point; otherwise this is just the beginning.

[–] darkmugglet@lemm.ee 6 points 1 year ago* (last edited 1 year ago)

IMO, this is the problem. Any normal person doing this would be in prison. Something like automated driving should be strictly regulated. I own a Mach-E, and while its self-driving features are limited, it errs so far on the side of caution that you have no choice but to pay attention to the road. As it should be.

[–] Knoll0114@lemmy.world 6 points 1 year ago (1 children)

I'm shocked it isn't already regulated. I get that it's a developing technology, but cars can be murderous.

[–] ShadowAether@sh.itjust.works 1 points 1 year ago

Where I am, SAE Level 3 is effectively banned: you need authorization to test it on public roads, while SAE Level 2 is allowed. There are also SAE Level 5 vehicles in operation today; they're just on private roads/property, and nearly all of them are regulated, only under workplace safety laws instead of driving laws.

[–] lumi@beehaw.org 1 points 1 year ago

Maybe a controversial opinion, but I'm glad they're charging for it. I wish there were a better way to vet who gets to be a beta tester, but at least by charging money they ensure that only people who care about the technology get to use it.

Maybe I'm jaded, but it seems like drivers in general have gotten worse post-pandemic, and I wouldn't trust 90% of them with autonomous driving features in their current state.

[–] jjagaimo@lemmy.ca 15 points 1 year ago (1 children)

While this is undeniably tragic in every instance, I can't help but point out that the title had my sleep-deprived brain thinking, "how the hell did the car crash that many times and keep on driving?"

I'm so relieved I wasn't the only fool, thank you lol

[–] Wiitigo@lemmy.world 13 points 1 year ago (2 children)

Still almost exactly half the crash rate of human-only drivers. Therefore, we should ban human-only driving.

[–] RandomBit@sh.itjust.works 9 points 1 year ago (1 children)

I don't think this is a fair comparison, since an Autopilot crash is a two-stage failure: the Autopilot and then the driver both failed to avoid the crash. The statistics do not include the incidents where Autopilot would have crashed but the human took control and prevented it. If all instances of human intervention were included, I doubt Autopilot would be ahead.

[–] Kepler@lemmy.world 1 points 1 year ago (1 children)

If all instances of human intervention were included, I doubt Autopilot would be ahead.

Why would you interpret non-crashes due to human intervention as crashes? If you're doing that for autopilot non-crashes, you've gotta be consistent and also do that for non-autopilot non-crashes, which is basically... all of them.

[–] RandomBit@sh.itjust.works 3 points 1 year ago

If a human crashes and their action/vehicle is responsible for the crash, the crash should be attributed to the human (excepting mechanical failure, etc.). I believe that if an advanced safety system, such as automatic braking, prevents a crash that otherwise would have occurred, the prevented crash should also be included in the human tally. Likewise, if Autopilot would have crashed if not for the intervention of the driver, the prevented crash should be attributed to Autopilot.

As has often been studied, the major problem for autonomous systems is that until they are better than humans WITHOUT human intervention, the result can be worse than both. People are much less likely to pay full attention and have the same reaction times if the autonomous system is in full control the majority of the time.
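
To make the attribution argument concrete, here is a minimal sketch with entirely made-up numbers (Tesla does not publish intervention counts, so every figure below is a hypothetical placeholder):

```python
# Hypothetical illustration: if crashes prevented by driver takeover were
# attributed to Autopilot, the headline comparison can flip.
# All numbers are invented placeholders, not real statistics.

human_rate = 2.0      # human-only crashes per million miles (hypothetical)
autopilot_rate = 1.0  # reported Autopilot crashes per million miles (hypothetical)
interventions = 1.5   # crashes prevented by driver takeover, per million miles (hypothetical)

# Count each prevented crash against Autopilot, as argued above.
adjusted_rate = autopilot_rate + interventions

print(f"Reported: Autopilot {autopilot_rate} vs human {human_rate}")
print(f"Adjusted: Autopilot {adjusted_rate} vs human {human_rate}")
# With these made-up numbers, Autopilot looks better as reported (1.0 < 2.0)
# but worse once prevented crashes are attributed to it (2.5 > 2.0).
```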

[–] darkmugglet@lemm.ee 4 points 1 year ago (3 children)

You're missing the point -- with a human driver there is accountability. If I, as a human, cause an accident, I have either criminal or civil liability. The question of "who is at fault" gets murky. And then you have the fact that Tesla is not obligated to report the crashes. And the failures of automated driving are very different from human errors.

I don't think anyone is suggesting that we ban autonomous driving. But it needs better oversight and accountability.

[–] Locrin@lemmy.world 3 points 1 year ago

In these cases the human is still accountable. Do you think that if a Tesla plowed into a kindergarten while using Autopilot, the driver would avoid punishment? The driver is using a feature of the car. It tells you to stay alert and be prepared to take over on short notice. Those crashing are the idiots who sit in the back seat, go to sleep, or play on their phones while Autopilot is on. The only self-driving scenario right now where I would favour punishing the company if something went wrong is those taxis where you are purely a passenger.

Sit behind the wheel and you are responsible for what happens.

[–] Faceman2K23@discuss.tchncs.de 1 points 1 year ago

My main issue with Tesla's Autopilot is its branding and the way they advertise it.

Almost every non-tech person I talk to about things like that thinks it is 100% a hands-off robot driver, and that is a very, VERY dangerous idea.

It's a very good system, and it is improving with every update, but it is far from the idea that many people have in their heads.

The videos you see of people sleeping on Autopilot are worrying. Do Teslas not have driver-attention monitoring? If I look away from the road for 5 seconds in my Mazda, it lets me know very loudly that it wants me to pay attention; if I were to fall asleep, it would do its best to wake me up. When I use its very simple and limited self-driving function, I can't take my hands off the wheel for more than about 10 seconds before it alerts me.

[–] Fubarberry@aiparadise.moe 1 points 1 year ago

I'm all for more accountability, but it's still better than human driving. Cutting human car deaths in half in exchange for murky accountability is clearly a worthwhile trade.

[–] communist@beehaw.org 10 points 1 year ago (1 children)

They mention the crash rate being lower than a human's; what is the actual crash rate?

[–] Saik0Shinigami@lemmy.saik0.com 0 points 1 year ago (1 children)

Irrelevant; that's what it is. Considering that a human is still ultimately responsible when they're behind the wheel, whether or not "autopilot" is running, it's the human that should be attributed the lower crash rate.

Otherwise you risk incidents like this one, where the human intervenes in a near miss and actively stops the car from causing a severe accident, being counted as "pro-autopilot" when it was the human who actually stopped the event from occurring.

[–] communist@beehaw.org 1 points 1 year ago

No, the crash rate is definitely relevant, but I do get what you're saying about safety being human-caused.

[–] SmugBedBug@sh.itjust.works 7 points 1 year ago

Doesn't it also have trouble recognizing children as pedestrians? I thought I remembered reading that.

[–] zomtecos@feddit.de 4 points 1 year ago (1 children)

In an ideal world, automation could free up human labour for more meaningful work (instead of driving taxis/Ubers/etc.): science, development, or even social work.

But as we are not in an ideal world, this would never happen, as it would require equal and very good education, where you see every student for what they are: essential brain capital which must not be left ignored.

So the reason we get automated driving is: because we hate monotonously driving the same route through the traffic jam every day.

[–] communist@beehaw.org 1 points 1 year ago

I think we really need to know what the actual crash rate is before we decide on that.

Even more, we need the crash rate broken down by environment. I figure AI does worse than humans in snowy conditions, but in most highway conditions I bet it can do better. Perhaps we could regulate it based on weather conditions if we had such data.

I don't know, but I think we need more data before we say anything.
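
As a minimal sketch of what such a per-environment comparison could look like (every rate below is an invented placeholder; no such public dataset is cited in this thread):

```python
# Sketch of a per-environment crash-rate comparison. All rates are
# hypothetical placeholders, not real data.

crash_rates = {  # condition: (autonomous, human) crashes per million miles
    "dry highway": (0.8, 1.5),
    "urban streets": (2.5, 2.0),
    "snow/ice": (4.0, 2.8),
}

for condition, (auto_rate, human_rate) in crash_rates.items():
    verdict = "autonomy ahead" if auto_rate < human_rate else "humans ahead"
    print(f"{condition}: autonomous {auto_rate} vs human {human_rate} -> {verdict}")
```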

[–] amanneedsamaid@sopuli.xyz 0 points 1 year ago (3 children)

I hate banning technology and stifling innovation, but let's ban automobile self-driving technology: no one needs it, and the inherent risks and ethical dilemmas are not worth it at all.

[–] Knoll0114@lemmy.world 7 points 1 year ago (2 children)

To be fair, "no one needs it" isn't entirely true. There are many reasons someone who needs to get around might not be able to drive: some people with epilepsy, senior citizens, teenagers going to work, etc. I don't need it, but I'd love the convenience and stress relief of never having to drive again. Public transport could help with some of this, but some areas just aren't populated enough for truly good public transport.

[–] amanneedsamaid@sopuli.xyz 4 points 1 year ago (1 children)

I think solutions like better public transportation, or government services so people could get free rides (as some companies already offer), are better options. A computer driving a car has too many real-world consequences that outweigh the convenience.

[–] Stormyfemme@beehaw.org 3 points 1 year ago (1 children)

The solution is always better public transit, but I'd be shocked if any of us saw it approach even passable levels in our lifetime here in the States. Timelines for small projects stretch on for a decade; massive ones can't even get off the ground. I wish it weren't true, but I've basically given up on it. Maybe I'll move to Europe some day to have access to transit options.

[–] Knoll0114@lemmy.world 4 points 1 year ago

Even in Europe, though, rural areas are a thing. I've lived in Australia and the UK and travelled extensively in Europe. Many European cities have excellent public transport, but if you need to get to a small town for whatever reason, you often can't. In Australia it's definitely better in the major cities than in US major cities, but there are so few people and it's such a large country that outside of those really big cities there's very little.

[–] Manticore@beehaw.org 2 points 1 year ago

The issue you've described, though, is not about self-driving technology. It's that 'driving' is the only form of transport, and thus the only way that anybody can ever be independent.

It's that too many areas design their infrastructure around the personal car and make it impossible to get around without one. With one, it means sitting in traffic for hours at a time (because everybody else is in a car, too): stretches of noisy, rumbling multi-lane roads without walkways or crossings; bike lanes that are non-existent or pressed up against fast-moving car traffic. And because walking/cycling isn't an option, we have more people driving than ever - children being driven to and from school or sports, driving down to a store 100m away to pick up eggs, etc.

Cars spend ~95% of the time parked somewhere, and 4% of the time moving a single person. They're incredibly inefficient, and yet they've been painted as a symbol of 'freedom' and 'independence' that sees massive amounts of land converted into parking spaces to accommodate something magnitudes larger than a person, one per person.

Cities that design around subway trains and bus lanes from the get-go have far smoother commutes. Smaller villages designed around trams and cycling are quiet, pleasant, and walkable. Both of them offer independence to a population that cannot drive - either practically or financially.

If self-driving car-sharing were already available, then I'd be more likely to agree. Car-sharing (not ride-sharing, but hiring cars per minute via an app) is the best way for car-based infrastructure to migrate towards lower traffic. Ripping up roads for trains is expensive, but knowing you can use a town car to visit your friend, then a van to help them move, and park neither of them in your driveway, would really help.

But right now self-driving cars are a passion project. They're not actually practical; they're just exciting and expensive. If accessibility for our blind, elderly, and impoverished population is the concern here, then billionaires funding self-driving cars that population can't ever afford is not the answer.

[–] minishoemaze@beehaw.org 5 points 1 year ago

One use case could be senior citizens who aren't ready to give up driving entirely. I'm sure it's not easy to admit that your vision and reaction time are deteriorating to the point that you're a danger on the road. As long as we live in a car-centric society, I hope the tech has solidified by the time I reach that point.

[–] darkmugglet@lemm.ee 2 points 1 year ago (1 children)

For me, the problem is one of justice. If I, as a meat sack, kill someone, I am liable, and most likely criminally liable, for it. When AI commits manslaughter, then what? A company has the financial incentive and very little of the legal exposure, because it's outsourced to the owner. Effectively, the human operator trusting Evil Corp gets the raw end of the deal.

IMO, each version of the software should get a legal license.

[–] amanneedsamaid@sopuli.xyz 2 points 1 year ago

Exactly, let's just live with every driver being responsible for their own actions.
