Self-driving car discussion catch-all

Jonman wrote:
Gremlin wrote:

I'm still in favor of the rules that Volvo and other non-Uber companies have been pushing towards: every collision that a self-driving car is involved in should be the manufacturer's liability.

Yes, but it's a lot more complicated than that.

True, but at the point where it's car manufacturers suing each other to prove that their car is safer than the other guy's, I'm going to say we've vastly improved transportation safety.

Jonman wrote:

The good news is that the very nature of a self-driving car means that it's collecting a huge amount of data that can be recorded for post-accident analysis. Crash-survivable accident data recorders need to be mandated as part of the law for self-driving cars.

Which is why I'm disturbed when I hear about self-driving car manufacturers trying to cut down on the number and quality of the sensors to save costs.

Uber's car couldn't tell if there was a person in front of it and didn't have the sensors to resolve the image detection system's confusion.

RawkGWJ wrote:

I don’t know if these features have ever actually helped me.

I'm very upset that none of these features seem to be designed by anyone with any degree of HCI-in-crisis-situation experience. They should hire some commercial airline cockpit designers or something. Alert fatigue is a thing that we've known about for ages.

Jonman wrote:

What about the self-driving Volvo that crashes into a self-driving Volkswagen - which manufacturer has liability?

Statistically unlikely, if not impossible. Only way it should happen is if one of the cars has sensor failure, tire blowout, etc., something catastrophic. And the car with the malfunction should be obvious.

Stele wrote:
Jonman wrote:

What about the self-driving Volvo that crashes into a self-driving Volkswagen - which manufacturer has liability?

Statistically unlikely, if not impossible. Only way it should happen is if one of the cars has sensor failure, tire blowout, etc., something catastrophic. And the car with the malfunction should be obvious.

Statistically CERTAIN in terms of entire fleet risk, you mean. Sensors fail. Software bugs are an inescapable fact of complex systems. Situations unforeseen by the designers occur. Maintenance errors are commonplace.

Jonman wrote:
Stele wrote:
Jonman wrote:

What about the self-driving Volvo that crashes into a self-driving Volkswagen - which manufacturer has liability?

Statistically unlikely, if not impossible. Only way it should happen is if one of the cars has sensor failure, tire blowout, etc., something catastrophic. And the car with the malfunction should be obvious.

Statistically CERTAIN in terms of entire fleet risk, you mean. Sensors fail. Software bugs are an inescapable fact of complex systems. Situations unforeseen by the designers occur. Maintenance errors are commonplace.

Right, but it's very easy to place responsibility on the vehicle that experienced the error, and it'll be a rare enough occurrence that it'll be covered by simple CODB (cost of doing business). No one holds the pilot of an aircraft financially responsible when it experiences a mechanical fault. I don't believe they even hold the pilot responsible in the event of pilot error (but I'm less certain about that one).

I do not want to live in a world where every tenth (or more!) car has 360-degree video surveillance that can and will be given to the government at their request alongside the video surveillance from internet-enabled doorbells and the like.

bnpederson wrote:

I do not want to live in a world where every tenth (or more!) car has 360-degree video surveillance that can and will be given to the government at their request alongside the video surveillance from internet-enabled doorbells and the like.

You may want to look for off-planet transportation. You're going to need it in 5-10 years.

thrawn82 wrote:

Right, but it's very easy to place responsibility on the vehicle that experienced the error, and it'll be a rare enough occurrence that it'll be covered by simple CODB (cost of doing business). No one holds the pilot of an aircraft financially responsible when it experiences a mechanical fault. I don't believe they even hold the pilot responsible in the event of pilot error (but I'm less certain about that one).

You're entirely wrong about that.

In the aviation world, incorrect maintenance that results in accidents shifts the liability onto the maintainer who dun screwed up. A pilot has liability for pilot error (which is why pilots carry insurance, just like drivers. As do airlines).

Manufacturers can be liable for design errors, too.

I appreciate the discussion about edge cases and acknowledge the inevitability of these kinds of issues, but even if the automated vehicle fatalities are an astronomical 5000 per year, it will be the greatest public safety improvement in transportation since... ever.

Jonman wrote:
thrawn82 wrote:

Right, but it's very easy to place responsibility on the vehicle that experienced the error, and it'll be a rare enough occurrence that it'll be covered by simple CODB (cost of doing business). No one holds the pilot of an aircraft financially responsible when it experiences a mechanical fault. I don't believe they even hold the pilot responsible in the event of pilot error (but I'm less certain about that one).

You're entirely wrong about that.

In the aviation world, incorrect maintenance that results in accidents shifts the liability onto the maintainer who dun screwed up. A pilot has liability for pilot error (which is why pilots carry insurance, just like drivers. As do airlines).

Manufacturers can be liable for design errors, too.

Like I said, I wasn't certain about pilot errors, I'd just never heard of a pilot being on the financial hook for an accident. Pilots maintain their airliners now too? I had no idea the airlines were cutting costs THAT hard.

Jokes aside, I still find the most reasonable scheme to be that the manufacturer guarantees against software and hardware faults, and the pilot takes responsibility if they become a 'driver' by taking manual control (and then all the current rules for drivers apply).

Those hardware and software faults are still going to be vanishingly rare compared to the 17,000 daily accidents that result from human drivers.

Paleocon wrote:

I appreciate the discussion about edge cases and acknowledge the inevitability of these kinds of issues, but even if the automated vehicle fatalities are an astronomical 5000 per year, it will be the greatest public safety improvement in transportation since... ever.

Keep in mind that horses were far more dangerous than cars on a per-mile basis.

thrawn82 wrote:

Those hardware and software faults are still going to be vanishingly rare compared to the 17,000 daily accidents that result from human drivers.

Yes, and.

One driver making a mistake affects one car.

One design engineer making a mistake potentially affects every car.

thrawn wrote:

Like I said, I wasn't certain about pilot errors, I'd just never heard of a pilot being on the financial hook for an accident.

You're right, because those pilots have legal liability, not financial, and end up in jail, not writing a check.

Jonman wrote:
thrawn82 wrote:

Those hardware and software faults are still going to be vanishingly rare compared to the 17,000 daily accidents that result from human drivers.

Yes, and.

One driver making a mistake affects one car.

One design engineer making a mistake potentially affects every car.

I am a transportation engineer with a minor in computer science. I don't think that many of the programmers / CS-major types that these companies are working with truly grok the nature of the system that they're entering into (not to mention the people steering the ship). It's huge and complex and messy, and you have a responsibility to everybody around you to operate in as safe a manner as you can. Transportation engineers have worked to mitigate the inherent safety faults and lapses of human drivers for decades, and while it will be easy for computers to be safer than the average human, that doesn't seem to be a focus in the same way that being first to market is. Civil engineers are just trained to look at these systems differently, and we talk a lot about codes of ethics and our service and duty to the public.

I was going to pick a nit about how one driver making a mistake can affect many cars - but really, every crash situation is a result of a complex interplay of the environment, driver actions, reactions, and lack thereof, mechanical properties of the vehicle, etc. It's not something we fully understand or have boiled down to the type of mathematical model that computers trade in. This is where machine learning comes in, certainly, but that's a black box process that further entrenches the biases of its programmers.

(I'm not sure if any of that made sense, and it certainly wasn't directed at anybody in particular. I just think the idea that autonomous vehicles are the domain of mechanical engineers and computer scientists rather than transportation engineers is dangerous and wrong. But we're currently stuck in a reactionary position.)

Jonman wrote:
thrawn82 wrote:

Those hardware and software faults are still going to be vanishingly rare compared to the 17,000 daily accidents that result from human drivers.

Yes, and.

One driver making a mistake affects one car.

One design engineer making a mistake potentially affects every car.

And this seems like a bit of a red herring to me. What makes this true about self-driving cars, such that we shouldn't go in that direction, that isn't already true about every car that's been released in the past decade that operates on a computer-controlled drive-by-wire system?

thrawn82 wrote:

What makes this true about self-driving cars, such that we shouldn't go in that direction, that isn't already true about every car that's been released in the past decade that operates on a computer-controlled drive-by-wire system?

The fact that current generations aren't "Full Authority" drive-by-wire. You still have a physical linkage between the steering wheel and front wheels, and a hydraulic linkage to the brakes. The throttle, sure, that's you making a suggestion to the ECU and getting what you get.

Which of course leads me to mention that there _have_ been events in those systems.
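
To put the throttle point in pseudo-terms: the pedal is just one input to the ECU's arbitration logic, which is free to overrule you. A toy sketch in Python (every name and limit here is hypothetical, not any real ECU's logic):

```python
# Toy sketch of throttle-by-wire arbitration. Every name and limit here is
# hypothetical; real ECU logic is far more involved. The point is that the
# pedal is a request, not a command, and there's no mechanical fallback.
def throttle_command(pedal_pct: float, rpm: int, traction_ok: bool) -> float:
    request = max(0.0, min(100.0, pedal_pct))  # clamp the driver's request

    if rpm >= 6500:       # rev limiter: the ECU overrules the driver outright
        return 0.0
    if not traction_ok:   # traction control trims the request back
        request = min(request, 20.0)

    return request        # what the throttle plate actually receives

assert throttle_command(100.0, 7000, True) == 0.0   # driver asks, ECU refuses
assert throttle_command(80.0, 3000, False) == 20.0  # wheelspin caps torque
```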

thrawn82 wrote:
Jonman wrote:

One design engineer making a mistake potentially affects every car.

And this seems like a bit of a red herring to me. What makes this true about self-driving cars, such that we shouldn't go in that direction, that isn't already true about every car that's been released in the past decade that operates on a computer-controlled drive-by-wire system?

Or every car ever released? The Pinto's design flaw certainly affected all of them. Any time a vehicle is subject to a manufacturer's recall, it's because they've found a design flaw that needs to be addressed.

Jonman wrote:

One design engineer making a mistake potentially affects every car.

That also means one patch fixes every driver.

cheeze_pavilion wrote:
Jonman wrote:

One design engineer making a mistake potentially affects every car.

That also means one patch fixes every driver.

Just like Windows!

*ulp*

BadKen wrote:
cheeze_pavilion wrote:
Jonman wrote:

One design engineer making a mistake potentially affects every car.

That also means one patch fixes every driver.

Just like Windows!

*ulp*

There are already a whole lot of B.O.B.s out there on the road...

ActualDragon wrote:

(I'm not sure if any of that made sense, and it certainly wasn't directed at anybody in particular. I just think the idea that autonomous vehicles are the domain of mechanical engineers and computer scientists rather than transportation engineers is dangerous and wrong. But we're currently stuck in a reactionary position.)

Yeah, I think a big part of the problem is that CS isn't as much of an engineering profession as we like to think we are, and we're not used to thinking about the consequences of the things that we're building. I mean, when one of the discussions at the academic level is still stuck at "what does an ethics class for a CS major look like," I'm rather uncomfortable with my students working on things that can potentially kill people.

(Which is why I mostly teach about how to make video games instead.)

I keep wondering if self-driving cars will be able to identify pedestrians, or whether people will simply be detected as just another object. If you think about it from the machine's perspective, what combination of algorithms and sensors would be needed to separate people from street signs? Thermographic imagery? How would that work when people are bundled up in the winter, or in hot areas like Arizona? Movement? That doesn't bode well for people standing on a corner. I'm honestly not sure how it would work.

Nevin73 wrote:

I keep wondering if self-driving cars will be able to identify pedestrians, or whether people will simply be detected as just another object.

I don't have the research at hand, but computer vision and video/image processing make up a huge field in computer and electrical engineering. My understanding is that pixels are grouped by things like color and value/imputed depth (best paired with lidar) into likely objects, which are compared against known models. So a pedestrian might be identified as being shaped mostly human-like (whether or not they're wearing a bulky winter coat), while a pole may be skinnier, taller, and without movement. As I posted above, part of the problem with Uber's algorithm is that it didn't identify pedestrians unless they were in a crosswalk, a simplification likely made to decrease the workload. Another major issue right now is properly identifying and categorizing cyclists as separate from pedestrians.

It's a very difficult problem in the field, and one of the few driving tasks that humans still perform better than computers, but the research is more advanced than one might think. Machine learning is doing a lot to improve these algorithms, and there's also the issue of technology transfer between academia and industry.
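
If you want to poke at this yourself, modern libraries make the object-detection half surprisingly accessible. Here's a minimal sketch using torchvision's COCO-pretrained Faster R-CNN (a generic off-the-shelf detector, emphatically not what any manufacturer ships; "camera_frame.jpg" is a stand-in for a single camera frame):

```python
# Minimal object-detection sketch with an off-the-shelf pretrained model.
# COCO treats "person" and "bicycle" as separate classes, which hints at why
# categorizing cyclists separately from pedestrians is its own problem.
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

PERSON, BICYCLE = 1, 2  # COCO label indices used by torchvision's models

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

frame = to_tensor(Image.open("camera_frame.jpg").convert("RGB"))

with torch.no_grad():
    result = model([frame])[0]  # dict of boxes, labels, and scores

for label, score, box in zip(result["labels"], result["scores"], result["boxes"]):
    if label.item() in (PERSON, BICYCLE) and score.item() > 0.5:
        kind = "pedestrian" if label.item() == PERSON else "cyclist"
        print(f"{kind} at {box.tolist()} (confidence {score.item():.2f})")
```

A real pipeline then has to fuse detections like these with lidar returns and track them over time; the classifier is only the first step.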

Thanks, I wasn't sure if it would be attempting to map human shapes, which seems crazy complicated (or just dangerous to anyone who is an outlier to the typical size/shape).

Nevin73 wrote:

Thanks, I wasn't sure if it would be attempting to map human shapes, which seems crazy complicated (or just dangerous to anyone who is an outlier to the typical size/shape).

It's actually easier to map human shapes than it is to use one of the other methods you mentioned (which would require different sensors, etc.). Machine learning has brought us fast image classification that can quickly and reasonably accurately tell whether the things it has been trained on are present in an image. Such models can, of course, make use of thermal data if they are trained on it, but mostly they aren't.

Basically, this XKCD from 2014 is slightly obsolete because the deep learning revolution that kicked off in 2015 has led us to having bird scene description AI today.

It's still got some major blind spots, of course, if you'll pardon the pun: it's not magic. The "reasonably" in "reasonably accurately" is very different if you're trying to detect birds picked up by a trail cam versus detecting a pedestrian, where you really don't want false negatives.
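
To make the false-negative point concrete, here's a toy example with made-up score distributions (fabricated numbers, not real detector output): the same model turns into a very different system depending on where you set the decision threshold.

```python
# Toy illustration with fabricated score distributions: how the decision
# threshold trades missed detections (false negatives) against false alarms.
import numpy as np

rng = np.random.default_rng(0)
scores_with = rng.beta(8, 2, size=100_000)     # scores on frames WITH a person
scores_without = rng.beta(2, 8, size=100_000)  # scores on frames WITHOUT one

def rates(threshold):
    miss = np.mean(scores_with < threshold)             # missed pedestrians
    false_alarm = np.mean(scores_without >= threshold)  # phantom detections
    return miss, false_alarm

# A bird counter might happily trade misses for fewer false alarms...
print("threshold 0.7 -> miss %.2f%%, false alarm %.2f%%" %
      tuple(100 * r for r in rates(0.7)))
# ...while a pedestrian detector pushes the threshold way down and eats the
# false alarms (and the phantom braking) to drive the miss rate toward zero.
print("threshold 0.2 -> miss %.2f%%, false alarm %.2f%%" %
      tuple(100 * r for r in rates(0.2)))
```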

Gremlin wrote:

Basically, this XKCD from 2014 is slightly obsolete because the deep learning revolution (...)

To be fair, she did say to give them five years ;-)

ActualDragon wrote:
Gremlin wrote:

Basically, this XKCD from 2014 is slightly obsolete because the deep learning revolution (...)

To be fair, she did say to give them five years ;-)

Sounds like her research team made some breakthroughs.

Self-driving car crashes into a wrecked truck.

Baron Of Hell wrote:

Self-driving car crashes into a wrecked truck.

Can we avoid Russia Today as a source of anything please?

Paleocon wrote:
Baron Of Hell wrote:

Self-driving car crashes into a wrecked truck.

Can we avoid Russia Today as a source of anything please?

Why?

Baron Of Hell wrote:
Paleocon wrote:
Baron Of Hell wrote:

Self-driving car crashes into a wrecked truck.

Can we avoid Russia Today as a source of anything please?

Why?

Because that supports a propaganda arm of the Russian government.

There's a decent Jalopnik article with video sourced from Taiwan (where this happened).