
Self-Driving Uber Car Kills Arizona Pedestrian



"Certainly, our lidar is capable of clearly imaging Elaine and her bicycle in this situation. However, our lidar doesn't make the decision to put on the brakes or get out of her way."

 

 

So what good is it if the driver isn't paying attention? Might as well not have the stupid thing. No alarm to warn the driver that something is in the way; it just keeps on truckin'. Stupid!


 

 

So what good is it if the driver isn't paying attention? Might as well not have the stupid thing. No alarm to warn the driver that something is in the way; it just keeps on truckin'. Stupid!

You missed the point. The VEHICLE is supposed to apply the brakes, not the sensor system. Velodyne is saying we only supply the sensor system and we’re not responsible if the brakes were not activated.


You missed the point. The VEHICLE is supposed to apply the brakes, not the sensor system. Velodyne is saying we only supply the sensor system and we’re not responsible if the brakes were not activated.

I didn't miss the point. This system is absolutely useless. It does nothing to warn the driver of a possible collision or override the brake or steering system. It might as well not even be there at all.


I didn't miss the point. This system is absolutely useless. It does nothing to warn the driver of a possible collision or override the brake or steering system. It might as well not even be there at all.

 

That’s because the sensor is just a sensor. It can’t apply the vehicle’s brakes or steer. Uber’s software has to interpret those sensor signals and it has to apply the brakes or take evasive action. Not the sensor.

 

Uber’s software has to be capable of applying the brakes and/or taking evasive steering action. It just didn’t work for some reason.


Any kind of warning would also have to come from the Uber software which is interpreting the sensor readings. Not from the sensor itself.
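The division of labor being described here (the sensor only reports; a separate piece of driving software decides and actuates) can be sketched roughly as follows. All class, field, and threshold names are hypothetical, invented for illustration; this is not Uber's or Velodyne's actual code.

```python
# Minimal sketch of the sense -> decide -> actuate split discussed above.
# All names and thresholds are hypothetical, for illustration only.

from dataclasses import dataclass


@dataclass
class LidarReading:
    """What a sensor like a lidar can supply: detections, nothing more."""
    obstacle_detected: bool
    distance_m: float


class VehicleController:
    """Stand-in for the driving software: the only component in this
    sketch that can command the brakes or issue a warning."""

    def __init__(self) -> None:
        self.brake_commanded = False

    def process(self, reading: LidarReading) -> None:
        # The sensor merely reports; braking is this software's decision.
        if reading.obstacle_detected and reading.distance_m < 30.0:
            self.brake_commanded = True


controller = VehicleController()
controller.process(LidarReading(obstacle_detected=True, distance_m=25.0))
print(controller.brake_commanded)  # prints: True
```

Note that `LidarReading` deliberately has no method that touches the brakes; only the controller does, which is the point the posts above are making.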

This is kind of like saying they only sold you the tires. They have no control over what wheels you decide to mount them on or forget to tighten the lug nuts. Instead of one standard system, the companies are buying parts and building their own system.


Uber disabled safety component in self-driving car before fatal crash, company alleges...

 

So they didn't even tell the driver it was inoperative. Great!

 

http://www.foxnews.com/us/2018/03/27/uber-disabled-safety-component-in-self-driving-car-before-fatal-crash-company-alleges.html

What they disabled is the Volvo version of adaptive cruise control. That's unrelated to the self-driving system.


This is kind of like saying they only sold you the tires. They have no control over what wheels you decide to mount them on or forget to tighten the lug nuts. Instead of one standard system, the companies are buying parts and building their own system.

 

You can’t buy a self-driving car from Velodyne. Velodyne is a parts supplier. The vehicle manufacturer (in this case Uber) takes a Volvo car, adds the Velodyne sensors and its own software, and creates a self-driving car. It’s up to the Uber software to take the sensor inputs and determine how and when to brake, steer, accelerate, etc.

 

If the sensor was working but the vehicle didn’t brake then that’s on Uber, not Velodyne.

 

If the sensor wasn’t working then Uber is still responsible because it’s their vehicle, but they could take action against Velodyne if they can prove the sensor failed.

 

In no circumstance would a sensor be able to activate the brakes on a car. That’s like saying the adaptive cruise sensor should be applying the brakes. It’s not the sensor; it’s the PCM reacting to the sensor inputs that determines when to go or stop. And in this case the software controlling that was Uber’s.


In this situation it’s clear that the Uber software is the controller, not the subsystem, so all integration of parts, control, and validation lies with Uber.

Remember that this trial in Arizona was exactly that, a trial, or more correctly beta testing in public. I think the stipulations on observers remaining vigilant needed to be stressed more than they were.


They had to disable it to allow the Uber software to take over control.

So they disabled it and didn't do anything to replace what they disabled. Then they put a "driver" in the car, didn't tell him or her that the system was inoperative, and expected the texting driver to be just fine in a self-driving car that can kill people. I get it now.

I'm sorry but somebody had their head up their ass and should be held responsible for the death of that woman. I hope the family sues the pants off them.


So they disabled it and didn't do anything to replace what they disabled. Then they put a "driver" in the car, didn't tell him or her that the system was inoperative, and expected the texting driver to be just fine in a self-driving car that can kill people. I get it now.

I'm sorry but somebody had their head up their ass and should be held responsible for the death of that woman. I hope the family sues the pants off them.

Oh good grief. They had to disable the factory system to allow the Uber software to control everything; otherwise they would be competing with each other. And of course the Uber software was SUPPOSED to brake for an obstacle. It just didn’t work for some reason, and now that’s the million-dollar question: why not? It was obviously able to operate the brakes in normal driving. Either the software didn’t pay attention to the sensors, the sensors didn’t register the obstacle, or the software tried to apply the brakes but something didn’t work. The only way to find out for sure is to review the system logs.
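Reviewing the system logs, as suggested above, amounts to checking each link in the chain in order: did the sensor register the obstacle, did the software command the brakes, did the actuator respond. A minimal sketch of that triage, with entirely made-up log fields:

```python
# Hypothetical log triage: which link in the chain failed?
# The record fields and their values are invented for illustration;
# real system logs would look nothing like this simple dict.

def diagnose(log: dict) -> str:
    """Walk the chain front to back and report the first broken link."""
    if not log.get("sensor_detected"):
        return "sensor never registered the obstacle"
    if not log.get("brake_commanded"):
        return "software saw the obstacle but never commanded the brakes"
    if not log.get("brake_actuated"):
        return "brake command sent but the vehicle did not respond"
    return "full chain worked"


# Example: the sensor saw the obstacle, but the software never acted.
print(diagnose({"sensor_detected": True,
                "brake_commanded": False,
                "brake_actuated": False}))
# prints: software saw the obstacle but never commanded the brakes
```

The order of the checks matters: a failure earlier in the chain masks everything downstream, which is why the logs have to be read front to back.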


It's not good grief. Somebody f'd up and should be held accountable. I don't understand why you don't see it that way. If these cars are under beta testing, there should be safeguards built in, and the operator should be paying complete attention. Ever see that big RED button on the dash of a prototype car being tested on the streets? It's a kill switch, and it's there for a reason, because things can go wrong in a hurry. There was nothing in that car to bring the system down, and that operator should not have been texting, period.


It's not good grief. Somebody f'd up and should be held accountable. I don't understand why you don't see it that way. If these cars are under beta testing, there should be safeguards built in, and the operator should be paying complete attention. Ever see that big RED button on the dash of a prototype car being tested on the streets? It's a kill switch, and it's there for a reason, because things can go wrong in a hurry. There was nothing in that car to bring the system down, and that operator should not have been texting, period.

Where did I say or imply that? You were saying they disabled the ability for the car to stop and that Velodyne was responsible because they said they couldn’t stop the car. But you misunderstood their comments.

 

It was Uber’s car and Uber’s software so Uber is to blame for any malfunction.

 

However, that doesn’t necessarily translate to criminal or civil liability for the incident.


That's my point. Someone seemed to think this contributed to the accident when it had to be disconnected so it didn't compete with the Uber system.

Right, but coupe3w doesn’t seem to understand that.

 

Uber disabled the Volvo systems because THEIR SOFTWARE took over those functions using the Velodyne sensors which are supposed to be way better than any stock vehicle system.

 

The fact that this software was disabled doesn’t mean anything. The reason these companies are saying anything is to make sure the public knows that their company is not responsible for the accident. The Volvo safety systems were disabled therefore they could not have responded. Velodyne only makes the sensors and their sensor doesn’t have the ability to apply the brakes.

 

The UBER software is supposed to take the input from the sensors and drive the car including applying the brakes. That is what failed because the car never even attempted to brake. So it’s Uber’s responsibility. We just don’t know which component failed - their software or the sensor or the vehicle itself.


Right, but coupe3w doesn’t seem to understand that.

 

Uber disabled the Volvo systems because THEIR SOFTWARE took over those functions using the Velodyne sensors which are supposed to be way better than any stock vehicle system.

 

The fact that this software was disabled doesn’t mean anything. The reason these companies are saying anything is to make sure the public knows that their company is not responsible for the accident. The Volvo safety systems were disabled therefore they could not have responded. Velodyne only makes the sensors and their sensor doesn’t have the ability to apply the brakes.

 

The UBER software is supposed to take the input from the sensors and drive the car including applying the brakes. That is what failed because the car never even attempted to brake. So it’s Uber’s responsibility. We just don’t know which component failed - their software or the sensor or the vehicle itself.

Okay, now I understand. The two software packages would not work together, so they disabled the Volvo software and sensors. Still, though: no backup or override system in place in case of a malfunction, and a driver who wasn't paying attention with a beta system. Can we agree on that?


Of course. My guess is Uber just didn’t anticipate that the software wouldn’t function properly and they didn’t train the backup driver accordingly. This is a common problem with software developers who don’t have a lot of real world experience like us old guys.


Of course. My guess is Uber just didn’t anticipate that the software wouldn’t function properly and they didn’t train the backup driver accordingly. This is a common problem with software developers who don’t have a lot of real world experience like us old guys.

 

That happens with pretty much anything; real-world experience is the best teacher. Then again, software/IT development 101 is to always have a backup, or at least a backup plan.

