
Self-driving cars can kill you?

135 Comments

  • HP.80 Victor Posts: 1,118
    Forum Member
    You can't blame the autonomous mode entirely here. If US trucks were legally obliged under Federal statutes to fit the same rear and lateral under-run protection that trucks in the UK have to, then perhaps this guy would've survived. If such protection had been fitted, the car wouldn't have travelled so far under the trailer, and the windscreen and roof would not have been subject to such drastic deformation, which in turn caused the fatal penetration into the passenger cell.

    Then again, if the driver wasn't too busy watching Harry Potter perhaps he'd have noticed the truck and managed to avoid a collision entirely.
  • ianx Posts: 9,190
    Forum Member
    https://en.wikipedia.org/wiki/Bridget_Driscoll
    Bridget Driscoll (1851 – 17 August 1896) was the first pedestrian victim of an automobile collision in Great Britain. As she and her teenage daughter May and her friend Elizabeth Murphy crossed Dolphin Terrace in the grounds of the Crystal Palace in London, Driscoll was struck by an automobile belonging to the Anglo-French Motor Carriage Company that was being used to give demonstration rides. One witness described the car as travelling at "a reckless pace, in fact, like a fire engine".

    Although the car's maximum speed was 8 miles per hour (13 km/h) it had been limited deliberately to 4 miles per hour (6.4 km/h), the speed at which the driver, Arthur James Edsall of Upper Norwood, claimed to have been travelling. His passenger, Alice Standing of Forest Hill, alleged he modified the engine to allow the car to go faster, but another taxicab driver examined the car and said it was incapable of exceeding 4.5 miles per hour (7.2 km/h) because of a low-speed engine belt. The accident happened just a few weeks after a new Act of Parliament had increased the speed limit for cars to 14 miles per hour (23 km/h), from 2 miles per hour in towns and 4 miles per hour in the countryside.

    The jury returned a verdict of "accidental death" after an inquest enduring some six hours, and no prosecution was made. The coroner, Percy Morrison, (Croydon division of Surrey) said he hoped "such a thing would never happen again." The Royal Society for the Prevention of Accidents estimate 550,000 people had been killed on UK roads by 2010.

    Plus ça change, plus c'est la même chose.
  • alanwarwic Posts: 28,396
    Forum Member
    https://www.theguardian.com/science/2016/jun/14/statistically-self-driving-cars-are-about-to-kill-someone-what-happens-next
    The US has one death per 4 billion miles yet Tesla claim their 1 death per 130 million which for now is 30 times higher.

    The scary thing is the risk to others, possibly unaccounted for in the 30 times increased risk.
    No driver is ever going to admit their car has killed if it makes them instantly guilty of manslaughter by negligence.

    Reading that link, it does seem quite incredible that the stupid-acting things have not killed again and again.
  • ianx Posts: 9,190
    Forum Member
    alanwarwic wrote: »
    https://www.theguardian.com/science/2016/jun/14/statistically-self-driving-cars-are-about-to-kill-someone-what-happens-next
    The US has one death per 4 billion miles yet Tesla claim their 1 death per 130 million which for now is 30 times higher.

    The scary thing is the risk to others, possibly unaccounted for in the 30 times increased risk.
    No driver is ever going to admit their car has killed if it makes them instantly guilty of manslaughter by negligence.
    I get the feeling that you've heard other people use the phrase 'statistical significance', but you've not really understood what they meant by it.
  • njp Posts: 27,583
    Forum Member
    ianx wrote: »
    I get the feeling that you've heard other people use the phrase 'statistical significance', but you've not really understood what they meant by it.
    There's that, and there's also the question of where he conjured 1 death per 4 billion miles from. The referenced figures give 1.08 deaths per 100 million miles. So he should be singing Tesla's praises!
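njp's correction is easy to sanity-check. A couple of lines of Python (purely illustrative; the figures are those quoted in the thread, not independently verified) compare the two rates:

```python
# Fatality rates as quoted in the thread (not independently verified).
us_deaths_per_100m_miles = 1.08   # US average from the referenced figures
tesla_miles_per_death = 130e6     # Tesla's claimed Autopilot figure

us_miles_per_death = 100e6 / us_deaths_per_100m_miles
print(round(us_miles_per_death / 1e6, 1))  # ~92.6 million miles per death, US average
# Tesla's claimed interval between deaths is longer, i.e. a lower rate:
print(tesla_miles_per_death > us_miles_per_death)  # True
```

On those numbers, the claimed Autopilot rate is indeed somewhat better than the US average, which is njp's point.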
  • zx50 Posts: 91,270
    Forum Member
    Nilrem wrote: »
    That completely ignores another problem with nearly silent cars.

    Those who don't have a choice about looking for them, the same people that are the reason things like crossings have beepers.
    Namely those with eye problems.

    Yeah, I suppose there's that. I would assume those with bad enough eye problems would have someone with them though.
  • alanwarwic Posts: 28,396
    Forum Member
    njp wrote: »
    There's that, and there's also the question of where he conjured 1 death per 4 billion miles from. The referenced figures give 1.08 deaths per 100 million miles. So he should be singing Tesla's praises!
    Seems I misread the fatalities Wiki as 7 billion km when it meant 7 per billion km for the USA.

    In a way, the irresponsible way Musk has set up the auto drive would be acceptable, but for the fact that it puts other vehicles at far more risk than the driver.

    Freedom to risk one's own life is OK, but the auto drive as designed puts others at more risk, even more so with auto speeding.

    Another interesting aspect is that it spies on the driver, sending data back to Musk, so they know who is speeding!
  • njp Posts: 27,583
    Forum Member
    alanwarwic wrote: »
    Seems I misread the fatalities Wiki as 7 billion km when it meant 7 per billion km for the USA.

    In a way, the irresponsible way Musk has set up the auto drive would be acceptable, but for the fact that it puts other vehicles at far more risk than the driver.

    Freedom to risk one's own life is OK, but the auto drive as designed puts others at more risk, even more so with auto speeding.

    Another interesting aspect is that it spies on the driver, sending data back to Musk, so they know who is speeding!
    Asserting that autopilot mode puts others at greater risk is not the same thing as showing it. As I said before, the issue is whether or not the technology is a net benefit. This truth will only emerge from the accident statistics over a prolonged period.

    I'm fairly sure you get to choose whether or not you want to share your vehicle data with Musk, although I suspect most do.
  • alanwarwic Posts: 28,396
    Forum Member
    There seem to be plenty of those auto-driving Teslas running into others.
    http://www.dailymail.co.uk/sciencetech/article-3281562/Tesla-autopilot-fail-videos-emerge-Terrifying-footage-shows-happens-autonomous-driving-goes-wrong.html

    "With videos such as these emerging on YouTube, many have questioned the legality of Tesla's latest software update.

    The firm is able to get around regulations, because, in most of the US, laws on self-driving cars remain ambiguous."

    So, playing fast and loose with that ambiguity, Tesla lets auto drive break the speed limits too, likely because that is what the customer wants.
  • Jellied Eel Posts: 33,091
    Forum Member
    alanwarwic wrote: »
    So, playing fast and loose with that ambiguity, Tesla lets auto drive break the speed limits too, likely because that is what the customer wants.

    Practically every car manufacturer does this, i.e. cars aren't restricted to 70mph but to 155mph. There could even be situations where accelerating and temporarily exceeding the speed limit lets you avoid an accident. But I think I'd prefer a car that chose to slow down instead.
  • gomezz Posts: 44,625
    Forum Member
    zx50 wrote: »
    Yeah, I suppose there's that. I would assume those with bad enough eye problems would have someone with them though.
    Are you for real? :o
  • alanwarwic Posts: 28,396
    Forum Member
    With Musk, it is certainly a case of marketing needs conflicting with actual safety.
  • alanwarwic Posts: 28,396
    Forum Member
    "A spokesman for the National Highway Traffic Safety Administration, Bryan Thomas, declined to say why the agency waited until late June to begin a formal inquiry into an accident that happened in May, or why the agency did not require Tesla to notify owners about a possible problem."

    "With a federal investigation underway, Tesla has declined to respond to many questions about the Florida crash, including why it did not make details of the accident public for nearly two months — and then not until regulators announced their inquiry.

    In addition, Tesla did not respond to emails on Friday about when the company would disclose more information about the accident, or about any plans for possibly alerting vehicle owners about the dangers of misusing the Autopilot feature."
    http://www.nytimes.com/2016/07/02/business/a-fatality-forces-tesla-to-confront-its-limits.html
  • Karis Posts: 6,380
    Forum Member
    zx50 wrote: »
    I agree. Electric self-driving cars will be fully 'awake' all the time, and they won't suffer from impatience and whatnot while travelling. The number of traffic accidents will likely be VERY low once there's a load of them on the road. From the videos I've watched, they seem to be nice and quiet as well.

    I agree; people can't cherry pick what is acceptable loss and what's not. This is merely another thing the media can focus on to create more hysteria.

    When they're all self-drive the world will be a much safer place. But these are first generation cars - still in development - they'll be so very much safer when they're finally released.
  • zx50 Posts: 91,270
    Forum Member
    Karis wrote: »
    I agree; people can't cherry pick what is acceptable loss and what's not. This is merely another thing the media can focus on to create more hysteria.

    When they're all self-drive the world will be a much safer place. But these are first generation cars - still in development - they'll be so very much safer when they're finally released.

    Exactly. The software that drives them will be improved more and more as time goes by, and they will become safer because of it. They're, as you say, still in the development stage, but once they're ready to be released, they'll probably be a lot safer. They'll basically just get safer after each problem is fixed.
  • Jellied Eel Posts: 33,091
    Forum Member
    njp wrote: »
    I'm fairly sure you get to choose whether or not you want to share your vehicle data with Musk, although I suspect most do.

    From what I've read, it's optional. But presumably data logged by the car would be available to the investigators. It's a sad case, and there's a bit of a whiff about it-

    http://www.theregister.co.uk/2016/06/30/tesla_autopilot_crash_leaves_motorist_dead/

    "The vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S," Tesla said in a statement on Thursday.

    "Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied."


    From the local newspaper report, it happened here-

    https://www.google.co.uk/maps/@29.4105302,-82.5386546,388m/data=!3m1!1e3

    Which looks like a nice clear, straight section of road, and probably enough contrast between road, foliage and sky to have been able to spot a lorry. And conversely, the lorry driver should have been able to spot the Tesla.
  • treefr0g Posts: 23,655
    Forum Member
    It seems very odd that the car is totally reliant on cameras and does not use proximity sensors to judge distance.
  • and101 Posts: 2,688
    Forum Member
    treefr0g wrote: »
    It seems very odd that the car is totally reliant on cameras and does not use proximity sensors to judge distance.

    It does use radar and ultrasonic sensors to determine distance. The problem in this case was that the lorry was side-on to the car, and US lorries do not have bars running along the side between the wheels like they do in the UK, so at the height of the sensors there was free air and nothing to detect.
  • d'@ve Posts: 45,530
    Forum Member
    Jellied Eel wrote: »
    From what I've read, it's optional. But presumably data logged by the car would be available to the investigators. It's a sad case, and there's a bit of a whiff about it-

    http://www.theregister.co.uk/2016/06/30/tesla_autopilot_crash_leaves_motorist_dead/

    "The vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S," Tesla said in a statement on Thursday.

    "Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied."


    From the local newspaper report, it happened here-

    https://www.google.co.uk/maps/@29.4105302,-82.5386546,388m/data=!3m1!1e3

    Which looks like a nice clear, straight section of road, and probably enough contrast between road, foliage and sky to have been able to spot a lorry. And conversely, the lorry driver should have been able to spot the Tesla.

    This is bullshit by Tesla.

    I find it impossible to believe that a driver who was paying attention to the road ahead would fail to notice a moving lorry/trailer/tractor in that situation whatever, the colour, contrast or state of the sky. The human visual system is finely honed over thousands of years to notice even the slightest change in contrast or the slightest movement. It's what we do best, in comparison with technology of any kind.

    If the driver had been paying attention, he would have seen the obstacle and should have been able to take appropriate action; clearly that did not happen with this system (unless perhaps he was speeding grossly).
  • njp Posts: 27,583
    Forum Member
    d'@ve wrote: »
    This is bullshit by Tesla.
    How is it bullshit? He obviously failed to notice, unless you think he chose an unusual way of committing suicide. Accidents happen every single day because people fail to notice things. The only issue here is whether or not his use of autopilot mode made the accident more likely. I don't think we know.
  • starry_rune Posts: 9,006
    Forum Member
    How many people are killed and injured annually in human driven vehicles?

    https://www.youtube.com/watch?v=u13cDmKpo-4

    https://www.youtube.com/watch?v=7EtLisQG1gk
  • and101 Posts: 2,688
    Forum Member
    How many people are killed and injured annually in human driven vehicles?

    https://www.youtube.com/watch?v=u13cDmKpo-4

    https://www.youtube.com/watch?v=7EtLisQG1gk

    Based on the 2012-13 statistics for the UK there were 1713 deaths and 195,723 casualties for 244.4 billion car miles driven. That is one death for every 142.67 million miles and one casualty for every 1.25 million miles.
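and101's per-mile rates can be reproduced directly from the quoted figures. A short Python check (purely illustrative, using the numbers as stated above) confirms the arithmetic:

```python
# UK 2012-13 road casualty figures as quoted in the post above.
deaths = 1713
casualties = 195723
car_miles = 244.4e9  # total car miles driven

miles_per_death = car_miles / deaths
miles_per_casualty = car_miles / casualties
print(round(miles_per_death / 1e6, 2))     # ~142.67 million miles per death
print(round(miles_per_casualty / 1e6, 2))  # ~1.25 million miles per casualty
```

Both figures match the ones stated in the post.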
  • Jellied Eel Posts: 33,091
    Forum Member
    d'@ve wrote: »
    If the driver was paying attention, he would have seen the obstacle and should have been able to take appropriate action but clearly this is not so with the system (unless perhaps he was speeding grossly).

    Conversely, the lorry driver probably should have been able to see the traffic coming and that it wasn't clear to turn. From my own driving experiences in the US, crossing traffic is meant to wait for a safe gap. We don't know if the lorry driver decided to make the turn and assumed traffic would slow for him though.

    I suspect there's blame on both sides, and complacency around what the 'autopilot' could do played a part.
  • alanwarwic Posts: 28,396
    Forum Member
    http://www.nytimes.com/2016/07/05/business/tesla-and-google-take-different-roads-to-self-driving-car.html

    "Engineers using onboard video cameras to remotely monitor the results were alarmed by what they observed — a range of distracted-driving behavior that included falling asleep.

    “We saw stuff that made us a little nervous,” Christopher Urmson, a former Carnegie Mellon University roboticist who directs the car project at Google, said at the time.

    The experiment convinced the engineers that it might not be possible to have a human driver quickly snap back to “situational awareness,” the reflexive response required for a person to handle a split-second crisis."

    As I suggested, humans will be slow to switch back on, and that time lag is enough to make the void in the system very high risk.