View Full Version : Tesla kills YouTuber
Brown was an enthusiastic booster of his 2015 Tesla Model S and in April praised its sophisticated Autopilot system for avoiding a crash when a commercial truck swerved into his lane on an interstate. He published a video of the incident online. "Hands down the best car I have ever owned and use it to its full extent," Brown wrote.
https://www.youtube.com/watch?v=9I5rraWJq6E
Joshua D. Brown of Canton, Ohio, the 40-year-old owner of a technology company, was killed May 7 in Williston, Florida, when his car's cameras failed to distinguish the white side of a turning tractor-trailer from a brightly lit sky and didn't automatically activate its brakes.
http://www.northjersey.com/news/tesla-driver-s-death-using-car-s-autopilot-probed-by-nhtsa-1.1624600
I saw this video on YouTube originally and had to fight the urge to post a "you kids today" style rant about relying on these systems when they're not fully baked but I figured why waste my breath shouting at the wind?
Insomniac
07-01-16, 10:31 AM
It's tragic. Unfortunately it's being reported that the failure was Tesla's Autopilot. Hopefully this is a reminder to all Tesla drivers that they can't check out. Reading about the scene, it seems like he didn't even know it was about to happen (ignoring the self-serving description of the accident by the truck driver).
I am surprised the enormous smug cloud surrounding him didn't cushion the collision.
Napoleon
07-01-16, 12:48 PM
https://www.youtube.com/watch?v=9I5rraWJq6E
Hey, I know exactly where that video was taken. It is not that far from where I sit. So the guy was from Canton, Ohio.
I have a client in Williston FL and will be meeting with them at the end of July. If I remember, I'll ask them about this.
Tifosi24
07-01-16, 02:00 PM
That's the first time I've seen that video. I am impressed that the self-driving system was able to pick up the merging truck, but that is something any average driver should be able to handle. Granted, we knew what was going to happen, but there wasn't anything particularly horrible about the truck's lane change.
Tesla's Take: https://www.teslamotors.com/blog/tragic-loss
It's tragic. Unfortunately it's being reported that the failure was Tesla's Autopilot. Hopefully this is a reminder to all Tesla drivers that they can't check out. Reading about the scene, it seems like he didn't even know it was about to happen (ignoring the self-serving description of the accident by the truck driver).
Yes, a driver turning left in front of traffic is almost certainly at fault. But the truck driver claims that he had already begun his turn when the Tesla crested a rise coming the other way. Google Street view appears to confirm that there's limited visibility at that location.
https://goo.gl/maps/VokWxmz1hvE2
Witness reports claim that the Tesla was travelling at a high rate of speed, although the claim that he was doing over 85 seems unlikely given that Autopilot is supposed to be limited to 10 mph over the speed limit. Depending on how fast the Tesla was travelling, it's not impossible that the turning driver couldn't see it before he started his turn. Is the turning driver responsible when it's not possible to see the oncoming vehicle, and the oncoming vehicle then makes no attempt to avoid the collision?
https://www.youtube.com/watch?v=gQNMvHbL3jU
Another important question: How can a vehicle like this suffer serious damage from a crash and continue 300 yards through two fences, hitting and shearing off a power pole before stopping? Shouldn't everything go into emergency stop mode as soon as there are crash forces or at worst when the vehicle leaves the road?
Oh, another thing... I call BS on Tesla's explanation. They claim that the truck wasn't seen because it was bright white against a brightly lit sky. Playing with Google Earth shows that the Tesla would have been cresting a rise and looking down at the truck backed by the road and the green median and trees. We're talking about the broad side of a truck not being seen at all before impact. That should flunk an eye test even in Florida.
Google Earth also shows that the crest the Tesla came over is 300 meters from the intersection. That would take under 10 seconds to cover at 75 mph with the Tesla not braking at all. A rough calculation for an 80-foot semi crossing a 20-foot road suggests that the turn takes 15 seconds or more.
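For anyone who wants to check my arithmetic, here's the back-of-the-envelope version. All the inputs are the assumptions from this post; the 5 mph turning speed for the semi is my own guess, not a measured value:

```python
# Sanity check of the timing claims: how long the Tesla takes to reach the
# intersection from the crest vs. how long the semi needs to clear the road.
CREST_TO_INTERSECTION_M = 300   # distance from the rise to the crossing (Google Earth estimate)
TESLA_SPEED_MPH = 75            # assumed speed, no braking
TRUCK_LENGTH_FT = 80            # tractor-trailer
ROAD_WIDTH_FT = 20              # roadway being crossed
TRUCK_TURN_SPEED_MPH = 5        # guessed speed for a semi making a slow left turn

MPH_TO_MPS = 0.44704
FT_TO_M = 0.3048

# Time for the Tesla to cover the crest-to-intersection distance.
tesla_time_s = CREST_TO_INTERSECTION_M / (TESLA_SPEED_MPH * MPH_TO_MPS)

# The truck must travel its own length plus the road width to clear the lane.
clear_dist_m = (TRUCK_LENGTH_FT + ROAD_WIDTH_FT) * FT_TO_M
truck_time_s = clear_dist_m / (TRUCK_TURN_SPEED_MPH * MPH_TO_MPS)

print(f"Tesla covers {CREST_TO_INTERSECTION_M} m in {tesla_time_s:.1f} s")
print(f"Truck needs roughly {truck_time_s:.1f} s to clear the crossing")
```

With those numbers the Tesla arrives in about 9 seconds while the truck needs well over that to clear, which is the point: if the crossing really takes that long, the truck had to have started its turn before the Tesla ever crested the rise.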
Tesla's view of the intersection:
Napoleon
07-01-16, 09:02 PM
Apparently this guy had a bunch of speeding tickets. NEOhio media is wall-to-wall on this.
Yes, a driver turning left in front of traffic is almost certainly at fault.
Why, is this like you are always at fault if you hit someone from behind no matter what (serious question)?
Another important question: How can a vehicle like this suffer serious damage from a crash and continue 300 yards through two fences, hitting and shearing off a power pole before stopping? Shouldn't everything go into emergency stop mode as soon as there are crash forces or at worst when the vehicle leaves the road?
Great questions, which never occurred to me. I am embarrassed to admit the first thing I thought this evening when seeing the footage they played on local tv news was "how did the car continue with him obviously being decapitated?"
Thank God he didn't kill anyone else (and he is at fault for that accident).
Napoleon
07-01-16, 09:06 PM
Oh, another thing... I call BS on Tesla's explanation.
They were just caught a week or two ago in a flagrant violation of federal law on reporting problems with their cars.
Insomniac
07-03-16, 01:55 PM
I mostly meant that, given he didn't brake, the problem wasn't Autopilot. You'd hope it would've prevented it, but it's also not supposed to be used the way he used it. The car probably has data on the crash to break it all down, though. My only guess on the car is that there are no sensors that high up to send that type of signal to the car. It seems crazy at first, but maybe that's not the typical place crash sensors are placed. Perhaps some other sensors are there, but are non-critical. You would not want a sensor failure to cause the car to do something unexpected. With the era of driverless cars coming, there's going to be a headline at some point where the passenger arrives at their destination dead.
I've read some about this on Tesla forums and I think I see part of the problem. While Musk's PR machine may give the impression that their so-called "Autopilot" is practically on par with Google's self-driving cars, it's really not. Musk claims the system didn't brake because it couldn't see the white truck against the bright sky, but the truth is that it's not designed to look for that kind of lateral crossing traffic anyway.
While Musk calls it "smarter than a human" and may give the general public the notion that it's practically autonomous, there are a whole array of situations that it can't or won't deal with and owners are warned that they have to pay attention and be ready to take over at any time. That hasn't prevented some Tesla owners from treating it as though it's "smarter than a human" with predictable results.
Recently a Google scientist called Tesla irresponsible (http://www.autoblog.com/2016/06/02/google-scientist-tesla-autopilot-roll-out-irresponsible/) for rolling out a system that lulls the driver into a false sense of safety. Of course you can argue that drivers come to rely on just about any driver-aid technology and become less attentive or competent as a result. We're seeing this same challenge with aircraft systems today.
Why, is this like you are always at fault if you hit someone from behind no matter what (serious question)?
In almost all cases where there is no traffic control, left-turning traffic is responsible for yielding to through traffic coming the other way. So normally it's an open-and-shut case: the turning driver is responsible.
A couple of things may mitigate fault in this case: if the truck driver can show that he couldn't see the Tesla because of the crest in the road, or if he can show that he could have made the turn if the Tesla weren't speeding.
It seems very likely that the driver just did what truck drivers do: decided that he had time to make the turn if the oncoming driver did the normal thing and slowed down to let him through. I'm not sure where the law will fall in that case.
The good news is that with telemetry from the Tesla, and if they did a good job collecting data at the accident scene, we should eventually get a very clear view of what happened.
datachicane
07-03-16, 07:29 PM
Seeing reports from witnesses that a Harry Potter movie was still playing on a portable DVD player in the Tesla after the crash.
Insomniac
07-04-16, 12:05 PM
In almost all cases where there is no traffic control, left-turning traffic is responsible for yielding to through traffic coming the other way. So normally it's an open-and-shut case: the turning driver is responsible.
A couple of things may mitigate fault in this case: if the truck driver can show that he couldn't see the Tesla because of the crest in the road, or if he can show that he could have made the turn if the Tesla weren't speeding.
It seems very likely that the driver just did what truck drivers do: decided that he had time to make the turn if the oncoming driver did the normal thing and slowed down to let him through. I'm not sure where the law will fall in that case.
The good news is that with telemetry from the Tesla, and if they did a good job collecting data at the accident scene, we should eventually get a very clear view of what happened.
I'd assume they've already decided no blame for the truck driver. The accident was nearly two months ago; they must've concluded and issued citations by now. They probably couldn't even determine the Tesla's speed because there were no skid marks. I'd bet you're right on what happened. The big difference is that the Tesla driver failed to see the truck or take any action as well.
They may also have footage since this driver liked to record driving. He paid the ultimate price, so they may've decided there was no need to publicly apportion blame.
I'd also say, on the Tesla PR side, the reality of what the system did or didn't do is probably way too complex to explain simply, so they probably oversimplified it. I mean, it's surely designed to detect obstructions/stopped traffic and stop, and that failed here too.
Airbags did not deploy
http://www.techinsider.io/tesla-airbag-didnt-deploy-in-autopilot-crash-2016-7
Insomniac
07-04-16, 03:49 PM
Airbags did not deploy
http://www.techinsider.io/tesla-airbag-didnt-deploy-in-autopilot-crash-2016-7
That might be connected to why the car just kept going. Perhaps the whole electronic system was severed/shut down immediately.
Tesla Model X goes off the road and crashes in Montana, driver blames the Autopilot (http://electrek.co/2016/07/11/tesla-model-x-crash-montana-driver-blames-autopilot/)
some interesting discussion in the comments section
Insomniac
07-11-16, 11:11 AM
Tesla Model X goes off the road and crashes in Montana, driver blames the Autopilot (http://electrek.co/2016/07/11/tesla-model-x-crash-montana-driver-blames-autopilot/)
some interesting discussion in the comments section
Unless Autopilot overpowered the driver's steering input, I blame the driver. One thing that's going to be interesting is how insurance companies start to handle this.
Napoleon
07-11-16, 11:12 AM
There is a pretty well-known political blog on the left I follow whose owner/main contributor regularly asserts that completely self-driving cars are just not going to happen and are just "vaporware". Today he linked to the opinion piece below, which appeared in the NY Times, and the last bit about Google pushing its timeline for when they will be on the road out to 30 years is interesting.
http://www.nytimes.com/2016/07/10/opinion/sunday/silicon-valley-driven-hype-for-self-driving-cars.html?ref=opinion&_r=2
Here again I think the problem is Tesla's hype machine creating unrealistic expectations for their "Autopilot". Yeah, the driver acknowledged that they understood they're responsible for paying attention, etc., but Tesla's PR still sells it as "smarter than a human." Smarter than a human who would just drive into the side of a truck, I suppose.
Yes, the driver is ultimately responsible for the crash but Musk has created unrealistic expectations by being irresponsible in the way he has promoted it. He needs that continuous hype to sustain his confidence game.
It will happen. Sooner than 30 years, I hope. There's a vast number of baby boomers who will be better off being driven in their advancing years by some sort of AI. I can envision a small runabout, similar to a Smart car with easy entrance and exit, being sold specifically to the elderly as a grocery getter, a doctor-appointment conveyance, and a shuttle to dinner at 4:30 pm.
Insomniac
07-12-16, 11:03 AM
I really hope it's less than 30 years too. I think there's a lot to be said about how everyone else clearly calls these assistive technologies. Tesla and Musk haven't done themselves any favors with the name or how they talk about it. They say it is better than humans. But the reality is it makes humans better, as it may catch something the human misses. I'd rather have the technology than not, but would not rely on it to drive.
Tesla Ramping Up For Autopilot 2.0 With LIDAR, More Cameras (http://gas2.org/2016/07/10/tesla-ramping-autopilot-2-0-lidar-cameras/)
Tesla's Crash May Speed up Race for This New Technology (http://fortune.com/2016/07/12/tesla-autopilot-technology-sensor-lidar/)
What Tesla's Elon Musk Misses About Self-Driving Cars (http://fortune.com/2016/07/11/elon-musk-tesla-self-driving-cars/)
GM sees self-driving cars as gradual rollout (http://phys.org/news/2016-07-gm-self-driving-cars-gradual-rollout.html)
"There isn't going to be a particular moment or day when we see it—it will unfold in a gradual way, but it will be a lot faster than people are expecting," said GM president Dan Ammann at the Fortune Brainstorm Tech conference in Colorado.
Other manufacturers will soon jump on the bandwagon, spurring additional tech development and refining the concept. Within a few short years the Tesla as we know it now will appear dated as other, more advanced systems are developed. That has been the story of tech development for the past several decades.
It will happen. Sooner than 30 years, I hope. There's a vast number of baby boomers who will be better off being driven in their advancing years by some sort of AI. I can envision a small runabout, similar to a Smart car with easy entrance and exit, being sold specifically to the elderly as a grocery getter, a doctor-appointment conveyance, and a shuttle to dinner at 4:30 pm.
Fine, but if my car decides that I'm too old to drive, then it had better be a sweet ride.
First YouTubers, then the sport :eek:
Will autonomous cars kill motorsports? (http://readwrite.com/2016/07/09/autonomous-cars-may-make-motorsport-niche-competition-tl4/)
“The threat we face in motorsport is autonomous vehicles,” said Di Grassi. “In the future, people in general will lack the experience of normal driving on the road. So if you don’t drive, you won’t get the passion and feeling for motorsport. Because less people will drive I believe it will become a niche sport and not a mass sport—as I believe it was in the 1990s and 2000s.”
What happens when these cats are playing Pokemon GO with autopilot enabled? :saywhat: :gomer:
Called it...
http://thesmokinggun.com/documents/stupid/pokemon-go-car-crash-362895
:shakehead:
chop456
07-14-16, 02:43 AM
What happens when these cats are playing Pokemon GO with autopilot enabled? :saywhat: :gomer:
Called it...
http://thesmokinggun.com/documents/stupid/pokemon-go-car-crash-362895
:shakehead:
http://time.com/4405221/pokemon-go-teen-hit-by-car/
“The Pokémon game took her across a major highway at 5 o’clock in the evening, which is rush hour,” her mother, Tracy Nolan, said in an interview with local Channel 11 News. “Parents, don’t let your kids play this game because you don’t want to go through what I went through last night. I really thought I was losing my daughter.”
Amazing that a cell phone can pick up a 15-year old human and take them across a highway. :gomer:
http://time.com/4405221/pokemon-go-teen-hit-by-car/
Amazing that a cell phone can pick up a 15-year old human and take them across a highway. :gomer:
Not to get off topic, but I must... :)
I sent my missive as a father to my DDs to NOT download and install this crapware. If you need to 'parent' by using an app to get your kids outside and to be active...you SUCK as a 'parent'. Period.
Back on topic...
http://www.thedailybeast.com/articles/2016/07/14/why-tesla-s-cars-and-autopilot-aren-t-as-safe-as-elon-musk-claims.html
Half the peeps I encounter on the road can't even use a turn signal. :saywhat: :irked: State law here is wipers on, lights on...guess the % I see following the law. Perhaps 25%. :mad:
If you need to 'parent' by using an app to get your kids outside and to be active...you SUCK as a 'parent'. Period.
Well, maybe some people just have s***** kids. Ever think about that?
:p
;)
Tesla's Autopilot: Too Much Autonomy Too Soon (http://www.consumerreports.org/tesla/tesla-autopilot-too-much-autonomy-too-soon/)
"By marketing their feature as ‘Autopilot,’ Tesla gives consumers a false sense of security," says Laura MacCleery, vice president of consumer policy and mobilization for Consumer Reports. "In the long run, advanced active safety technologies in vehicles could make our roads safer. But today, we're deeply concerned that consumers are being sold a pile of promises about unproven technology. 'Autopilot' can't actually drive the car, yet it allows consumers to have their hands off the steering wheel for minutes at a time. Tesla should disable automatic steering in its cars until it updates the program to verify that the driver's hands are on the wheel."
Well, maybe some people just have s***** kids. Ever think about that?
:p
;)
Oh, having volunteered for years at my DD's elementary, I get that. Ever heard of golf, tennis, biking, hiking, swimming...shizz. I even had mine gardening, and they WALKED WDW and DL at 4 & 5 (when many parental units ask questions on FB about strollers for those ages). Go figure. :shakehead:
Tesla's Autopilot: Too Much Autonomy Too Soon (http://www.consumerreports.org/tesla/tesla-autopilot-too-much-autonomy-too-soon/)
Too many idiots out there to trust it, and we have so much construction around NW Cbus, it's just nuts. Used to be roundabout city, but now it's orange barrel city. We have a 'bypass' route nearby that's 25mph, but peeps use it like a road course...and with cars parked on the side of the road. :irked: I'd also love to see how autopilot would do here with IIRC 30 roundabouts. And of course, the old adage is if you want instant idiots, just add water. :saywhat:
Can this augmented reality software project Forza Motorsport or Gran Turismo on my windshield?
I like driving, but I'm not earning enough points. :gomer:
http://s32.postimg.org/yse1p44t1/Forza_Motorsport_6_Demo_Reflection.jpg
Can this augmented reality software project Forza Motorsport or Gran Turismo on my windshield?
I like driving, but I'm not earning enough points. :gomer:
Quite certain that GT will be the next Pokemon GO combined with autopilot. :gomer: :saywhat:
Tesla's Autopilot: Too Much Autonomy Too Soon (http://www.consumerreports.org/tesla/tesla-autopilot-too-much-autonomy-too-soon/)
This may be the first time ever that I've agreed with Consumer Reports.
Can this augmented reality software project Forza Motorsport or Gran Turismo on my windshield?
I like driving, but I'm not earning enough points. :gomer:
http://s32.postimg.org/yse1p44t1/Forza_Motorsport_6_Demo_Reflection.jpg
I had a Pontiac Grand Prix that had a head-up display. I liked it very much.
Garmin (https://buy.garmin.com/en-US/US/on-the-road/discontinued/hud-head-up-display-/prod134348.html) makes an aftermarket model, probably others as well. Have no idea how effective they are.
Seems appropriate to bump this thread. A recent study has found that Tesla has the highest fatal accident rate among all car brands.
https://www.roadandtrack.com/news/a62919131/tesla-has-highest-fatal-accident-rate-of-all-auto-brands-study/
https://www.iseecars.com/most-dangerous-cars-study#v=2024
This seemed surprising because they tout their safety features as being superior to human drivers. But other studies confirm that they have among the worst drivers. This is what you get when you convince people that their car will keep them safe.
https://www.lendingtree.com/insurance/brand-incidents-study/
And then, as if on cue.. Four Dead In Fire As Tesla Doors Fail To Open After Crash (https://myelectricsparks.com/four-dead-tesla-doors-fail-open-crash-fire/)
The crash is still under investigation, but my understanding is the first Tesla was traveling at a high rate of speed, left the roadway, and struck the parked Tesla. The result was a double fatality and a subsequent lithium-ion battery fire.
https://www.youtube.com/watch?v=06zpy6EkpEQ
nissan gtp
11-21-24, 11:57 AM
Seems appropriate to bump this thread. A recent study has found that Tesla has the highest fatal accident rate among all car brands.
https://www.roadandtrack.com/news/a62919131/tesla-has-highest-fatal-accident-rate-of-all-auto-brands-study/
https://www.iseecars.com/most-dangerous-cars-study#v=2024
This seemed surprising because they tout their safety features as being superior to human drivers. But other studies confirm that they have among the worst drivers. This is what you get when you convince people that their car will keep them safe.
https://www.lendingtree.com/insurance/brand-incidents-study/
The cars are relatively affordable (the Model 3), accelerate much faster than almost any driver is used to, and the FSD nonsense encourages people to let the car (try to) drive. Many seem to work around what few "pay attention" features there are. It's like riding with a nervous teenage driver. I have a Model 3 and would never pay for FSD, and wouldn't use it if it were free. IMO software that steers the car should not be allowed.
Powered by vBulletin® Version 4.2.5 Copyright © 2024 vBulletin Solutions Inc. All rights reserved.