
Tesla Autopilot, fatal crash into side of truck

(OP)
CBC News

The Telegraph

"[It didn't notice] the white side of the tractor trailer against a brightly lit sky, so the brake was not applied."

RE: Tesla Autopilot, fatal crash into side of truck

It's surprising that the front radar would likewise have failed to detect the truck. Moreover, since the truck made the left turn in front of the car, there should have been a time when both the cameras and the radar detected the tractor, and the autopilot should have had a sufficient history file to say: wait, something passed in front of us earlier; did it really complete its turn yet?

We had problems with a tracker along the same lines: no history file, no track history.

TTFN
I can do absolutely anything. I'm an expert!
FAQ731-376: Eng-Tips.com Forum Policies forum1529: Translation Assistance for Engineers
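A toy sketch of the kind of track-history check described above (class and method names are invented for illustration, not any real autopilot API):

```python
class TrackHistory:
    """Toy tracker memory: remember objects that entered our path and
    refuse to call the way clear until each one is confirmed gone."""
    def __init__(self):
        self.open_crossings = set()   # track IDs seen entering our path

    def update(self, track_id, entered_path=False, confirmed_clear=False):
        if entered_path:
            self.open_crossings.add(track_id)
        if confirmed_clear:
            self.open_crossings.discard(track_id)

    def path_assumed_clear(self):
        # Losing sight of a track is NOT the same as the track clearing.
        return not self.open_crossings

t = TrackHistory()
t.update("truck-17", entered_path=True)
# Sensors then lose the truck (white trailer, bright sky); with no
# explicit clear confirmation, the system should stay suspicious:
print(t.path_assumed_clear())  # False
```

The point of the sketch is the asymmetry: a track only leaves the set on a positive confirmation, never by simply dropping off the sensors.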

RE: Tesla Autopilot, fatal crash into side of truck

I must say that Tesla's response is rather blasé. Yes, one fatality in 130 million miles is slightly better than the USA average, but I bet when you correct for demographics that changes significantly.

Cheers

Greg Locock


New here? Try reading these, they might help FAQ731-376: Eng-Tips.com Forum Policies http://eng-tips.com/market.cfm?
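To put numbers on the small-sample caveat: with a single fatality, the uncertainty band on the rate is enormous. A stdlib-only sketch of the exact Poisson confidence interval (the US-average figure of roughly 1.1 fatalities per 100 million vehicle-miles is an approximate public statistic, not taken from this thread):

```python
import math

def poisson_ci(k, alpha=0.05):
    """Exact (Garwood) confidence interval for a Poisson count k,
    found by bisection on the tail probabilities (stdlib only)."""
    def cdf(lam, n):
        # P(X <= n) for X ~ Poisson(lam)
        return math.exp(-lam) * sum(lam**i / math.factorial(i) for i in range(n + 1))
    def root(f, lo=1e-12, hi=100.0):
        # f is monotone increasing with f(lo) < 0 < f(hi)
        for _ in range(200):
            mid = (lo + hi) / 2
            if f(mid) < 0:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2
    lower = 0.0 if k == 0 else root(lambda lam: (1 - cdf(lam, k - 1)) - alpha / 2)
    upper = root(lambda lam: alpha / 2 - cdf(lam, k))
    return lower, upper

# One fatality in 130 million miles, expressed per 100 million miles:
lo, hi = poisson_ci(1)
print(f"point estimate {1/1.3:.2f}, 95% CI {lo/1.3:.3f} to {hi/1.3:.2f} per 100M miles")
# The interval spans roughly 0.02 to 4.3, which comfortably contains
# the US average; one event supports no safety claim either way.
```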

RE: Tesla Autopilot, fatal crash into side of truck

That's not an easy thing to really watch for. You think, well, "look higher," but seeing a bridge or a street light and stopping suddenly could be disastrous too.

Keith Cress
kcress - http://www.flaminsystems.com

RE: Tesla Autopilot, fatal crash into side of truck

At the height of the tractor trailer, it's quite likely the radar had no cross-section to get a return signal from.

Dan - Owner
http://www.Hi-TecDesigns.com

RE: Tesla Autopilot, fatal crash into side of truck

It's a new and complicated technology, and there are bound to be bugs to work out. That said, I have no doubt it will result in a more pleasant, safer, and easier mode of transportation. I certainly hope this doesn't result in a massive backlash against the technology, because it's definitely something we could use considering the number of human-error-related accidents on the road.

Professional and Structural Engineer (ME, NH, MA)
American Concrete Industries
www.americanconcrete.com

RE: Tesla Autopilot, fatal crash into side of truck

(OP)
"...height of the tractor trailer..."

In other words, perhaps the radar system design implicitly 'assumes' that the car is only two feet tall, and thus fails to ensure that the way ahead is of sufficient height. If that's the case, then that seems like a fairly major system design process flaw.

This tragic incident is so 'blatant' (driving straight into the side of a truck) that it will hopefully reduce the industry hype about the near term future of self-driving vehicles.

How many more blind spots do these systems have?

RE: Tesla Autopilot, fatal crash into side of truck

(OP)
Might as well anticipate a point...

Some will take the position that as long as the accident rate is equal to or at least slightly better than humans, then it's all okay. "It's an improvement."

I suspect that such levels would prove to be unacceptable. For example, the concentration of liability on one doorstep might be financially impossible.

It seems that there's an implicit requirement that they be much, much safer, and that they totally avoid such systematic 'dumb' mistakes.

They've still got a lot of work to do.

RE: Tesla Autopilot, fatal crash into side of truck

Do we have a video/screen capture of what the vehicle/driver saw?

Professional and Structural Engineer (ME, NH, MA)
American Concrete Industries
www.americanconcrete.com

RE: Tesla Autopilot, fatal crash into side of truck

So next we will see rules that trucks can't be painted white.

RE: Tesla Autopilot, fatal crash into side of truck

I'd kind of make the point that a big part of the reason so many drivers are so poor at it these days is because the cars are so good. Power everything, automatic everything, infotainment systems, drivers are more and more insulated / isolated from both the physical and mental driving environment.

But, I guess the only thing to do is go all the way and let the machines do it all.

Regards,

Mike

The problem with sloppy work is that the supply FAR EXCEEDS the demand

RE: Tesla Autopilot, fatal crash into side of truck

Quote (cranky108)

So next we will see rules that trucks can't be painted white.

Well maybe - or maybe rules that say that white trucks will need to have some appliqué feature to make them more machine-identifiable.

We used not to have a rule to say that vehicles over 6m (20 ft) long had to have steady amber lights at intervals down the side so you can tell (visually) when someone stops their truck across a junction ahead of you at night. We (in the UK) do now.

Imposing new rules on one class of road users to protect against other road users' perceptual deficiencies (like not being able to see in the dark) is nothing new and (with certain exceptions) tends to find acceptance eventually.

A.

RE: Tesla Autopilot, fatal crash into side of truck

Quote (Mike)

I'd kind of make the point that a big part of the reason so many drivers are so poor at it these days is because the cars are so good. Power everything, automatic everything, infotainment systems, drivers are more and more insulated / isolated from both the physical and mental driving environment.

But, I guess the only thing to do is go all the way and let the machines do it all.

As a pilot I often see the argument that automation or information overload causes accidents or makes us worse pilots. Overall, the conclusion I've drawn (and I believe this is supported by studies) is that it's not automation or excessive use of computers that drives these accidents; rather, it's our lack of preparedness to identify problems and take control when things do go wrong.

In the case of this fatal car crash, this does expose a flaw: how is a driver supposed to tell when the automation isn't working? Many "famous" crashes in aviation occurred when the autopilot turned off, the pilot wasn't aware it was no longer in control, and the plane was allowed to slowly fly itself into a crash. This is about my only concern with the systems in self-driving cars from an engineering standpoint; we need to see what the car is reacting to before or while it's reacting (or not reacting). Perhaps the driver saw the truck and expected the car to brake for him, and then wasn't ready to intervene when it didn't.

In aviation we follow strict procedures to ensure the autopilot functions properly on the ground, can be overridden, and presents the proper warnings when it is disabled. In addition, it's always emphasized during training to be aware of what mode the autopilot is in and be prepared to take control at any moment.

In my educated opinion, autopilots definitely make flying a safer and more comfortable experience. While I was trained on a simple trainer aircraft I currently fly a modern cockpit aircraft with multifunction displays and a full autopilot. I will "hand fly" the vast majority of the time but when workload is high it definitely improves safety if I do not need to strictly monitor my altitude and heading while I focus on reading an approach chart or focus on preparing to land.

But with this convenience/safety feature comes another skill that becomes the crux of the issue: you must be mentally prepared to respond to an abnormal situation. Many aviation crashes occur when automation fails to perform as expected and pilots fixate on getting the automation to perform the action they want, rather than just reverting to manually flying and dealing with the problem once reestablished on their appropriate course. Still, the vast majority of pilots understand this and practice this skill regularly.

Of course you will have people who use this feature as a crutch. Some general aviation pilots are chided for simply "following the magenta line" (the color used to display GPS course guidance), a reference to their lack of non-automated flying skills. You see this in drivers today as well, such as people who drive into lakes because their GPS told them to. Of course these people will abuse self-driving features, but they will exist regardless of the technology given them. I'd rather have a computer system with software that can be updated driving toward me than a person looking down to read that important text they just got.

Overall the best example of this is to remind yourselves that we used to need two pilots and a flight engineer to safely operate an airliner. Does anyone lament the loss of the flight engineer position as a safety issue? In fact, many modern charter aircraft are flown by a single pilot these days. This is only made possible by increased automation, and yet flight safety continues to improve while the cost of flights continues to decrease. While this is a little "apples and oranges," the assertion that automation = worse pilots/drivers isn't quite as black and white as it may seem.

Professional and Structural Engineer (ME, NH, MA)
American Concrete Industries
www.americanconcrete.com

RE: Tesla Autopilot, fatal crash into side of truck

Years ago there was a concern that people were setting the cruise control for 25 mph so they would not speed in school zones. That alarmed people, because the 25 mph limit was intended to give drivers more time to react.

What happened is that law enforcement increased in school zones, which produced many more tickets for people not watching their speed. So people started setting their cruise control to 25 mph to avoid tickets for driving too fast.

Now most cruise controls can't be set for anything below 35 mph, so drivers must watch their speed, and not so much the road. (Has technology improved things?)

On the same topic, I thought that trucks were required to have mid-trailer reflectors so they can be seen. The same is now being required on rail cars because of the number of accidents with trains.

But still, the comment about the white trailer and bright sky is bothersome when a red reflector should have been all it took to prevent this. Hence my comment about not allowing white trailers.

Come to think of it, I think mid-trailer lights are also required, but being daylight they likely were not on.

RE: Tesla Autopilot, fatal crash into side of truck

Perhaps the sides of all trucks should have this painted on the side:

[image not preserved]

Or for easier computer recognition:

[image not preserved]

Keith Cress
kcress - http://www.flaminsystems.com

RE: Tesla Autopilot, fatal crash into side of truck

Quote (TehMightyEngineer)

another skill that becomes the crux of the issue. You must be mentally prepared to respond to an abnormal situation.

Never more true than in an accident that was reported yesterday. Imagine a ship motoring through a wreck-infested patch of water on autopilot in the middle of the night when the system throws in an undemanded 30° turn towards the rocks. Should the watchkeeper:

a. Disengage the autopilot and attempt to manoeuvre to safety in hand steering?
b. Apply Astern propulsion to stop the ship in the water giving him time to think before sorting himself out?
c. Throttle back a wee bittie, then wander off to wake the skipper?



A.

RE: Tesla Autopilot, fatal crash into side of truck

What I've been hearing is that the Tesla's self-driving system apparently "saw" the truck, but mis-categorized it as an overhead sign board.

RE: Tesla Autopilot, fatal crash into side of truck

Is this a Darwin Award-worthy event? Trusting your life to a computer.

RE: Tesla Autopilot, fatal crash into side of truck

Quote (cranky)

Is this a darwin award worthy event? Trust your life on a computer.

So.. what are you saying about everyone who flies in an Airbus? :)

Keith Cress
kcress - http://www.flaminsystems.com

RE: Tesla Autopilot, fatal crash into side of truck

I see that speculation is starting to appear in the media that the driver might have been watching a DVD at the time of the accident. That wouldn't help any "diverse redundancy" claims in the safety case.

A.

RE: Tesla Autopilot, fatal crash into side of truck

(OP)
TME "...video... capture of what the vehicle/driver saw?"

It's a near-certainty that the Tesla Autopilot system captures video and stores it upon a crash. So there should be video, as well as lots of related data.

Given that the Autopilot drove straight into the side of a truck, I wonder how devastatingly embarrassing the video would be, or if it shows something that provides some rational explanation.

This is certainly 'a learning moment' for the over-hyped Self-Driving Car industry, the regulators and public.

RE: Tesla Autopilot, fatal crash into side of truck

AI trying to kill itself because it realizes what it artificially is.

RE: Tesla Autopilot, fatal crash into side of truck

I assume that the detectors are mounted in the grill.
The detectors saw the space under the trailer and deduced that there was enough space for them to pass under the trailer.
All the detectors did make it safely under the trailer. It was only the upper part of the car above the detectors that did not clear the trailer.
As for reflectors, not much use if the trailer is backlit and there is no source of illumination to reflect.

Bill
--------------------
"Why not the best?"
Jimmy Carter

RE: Tesla Autopilot, fatal crash into side of truck

This is NOT a case of the autopilot causing a crash, this is a case of the autopilot failing to prevent a crash caused by others.
The truck failed to yield right of way. Many times I've run up on an 18-wheeler that pulled out because there was enough time for cross traffic to stop or even slow down, but not enough for the truck to clear the crossing; the adage that might makes right.

Hopefully the driver of the truck was properly cited, involuntary manslaughter?

True, this particular driver pushed the Darwin button a few times before and got lucky as seen by the videos posted. This time not so lucky.

Hydrae

RE: Tesla Autopilot, fatal crash into side of truck

(OP)
"The truck failed to yield right of way."

I've read quite a few of the news items on this incident and I've seen nothing to indicate any such thing. The consistent story is that the Tesla failed to brake either to slow down or stop while the truck made a perfectly normal left turn across traffic (I'm interpreting this to mean: plenty of room).

This happened weeks ago. If the trucker had failed to yield, then he'd likely have been ticketed by now based on the local police investigation.

It's possible I've missed something somewhere.

RE: Tesla Autopilot, fatal crash into side of truck

Quote (VE1BLL)

It's a near-certainty that the Tesla Autopilot system captures video and stores it upon a crash. So there should be video. As well as lots of related data.

Except the car never realized it had even been in a wreck. It continued driving at the same speed for over three hundred yards! It finally failed to stay on the road and went off-roading at high speed. It steered safely between two big trees and then... absolutely pegged a power pole, stoving in the front end about 5 feet.

I think this points out the need for 'secondary sensors' that should be present to detect a crash and get the whole show stopped.

Keith Cress
kcress - http://www.flaminsystems.com

RE: Tesla Autopilot, fatal crash into side of truck

(OP)
OMG...

If the car didn't notice the roof being sheared off, then they've got some redesign to do. Perhaps take a cue from aircraft, which can have 'frangible switches' scattered around various crash-sensitive locations.

Still, assuming the Tesla finally noticed the last pole, the circular buffer for video likely holds at least several minutes, and these days could easily hold hours.
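A minimal sketch of the circular-buffer idea, using a fixed-length deque that discards the oldest frames and freezes its contents on a crash trigger (entirely illustrative; all names are invented, not Tesla's implementation):

```python
from collections import deque

class CrashVideoBuffer:
    """Toy pre-crash video recorder: keeps only the most recent
    `seconds` of frames; on a crash trigger the current contents
    are frozen for later retrieval."""
    def __init__(self, seconds, fps):
        self.frames = deque(maxlen=seconds * fps)  # old frames fall off automatically
        self.saved = None

    def push(self, frame):
        if self.saved is None:          # stop overwriting once a crash is latched
            self.frames.append(frame)

    def on_crash(self):
        self.saved = list(self.frames)  # snapshot the pre-crash history

buf = CrashVideoBuffer(seconds=2, fps=30)   # a 60-frame window
for i in range(100):                        # 100 frames in; only the last 60 remain
    buf.push(i)
buf.on_crash()
# buf.saved now holds frames 40..99; later pushes are ignored.
```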



RE: Tesla Autopilot, fatal crash into side of truck

""This is certainly 'a learning moment' for the over-hyped Self-Driving Car industry, "

That'd be the industry that has consistently pointed out that Tesla's Autopilot is not a Level 4 autonomous car. So, a fairly uninformed comment at best.

Cheers

Greg Locock


New here? Try reading these, they might help FAQ731-376: Eng-Tips.com Forum Policies http://eng-tips.com/market.cfm?

RE: Tesla Autopilot, fatal crash into side of truck

That's contemporary product development though. Put something out there woefully unfinished, let the early users cobble together their own use cases, then bend the original design to suit the new requirements. If all goes well, the original design will still be malleable enough to bend into the new shape. Who'd ever have thought about planning for remote operation (selfie sticks) when putting cameras into telephones?

Steve

RE: Tesla Autopilot, fatal crash into side of truck

What annoys me about this is that for decades people have been pointing out that autopilots with human oversight are the wrong approach, but Tesla's half-arsed beta is more of the same. We aren't very good at intervening at a moment's notice when the machine packs up.

Cheers

Greg Locock


New here? Try reading these, they might help FAQ731-376: Eng-Tips.com Forum Policies http://eng-tips.com/market.cfm?

RE: Tesla Autopilot, fatal crash into side of truck

(OP)
GL "...industry that has consistently pointed out... ... fairly uninformed comment at best."

The complete quote also included "...the regulators and public."

"This is certainly 'a learning moment' for the over-hyped Self-Driving Car industry, the regulators and public."

The sentence should be parsed as follows:
1) The Self-Driving Car industry is over-hyped.
2) This is 'a learning moment' for industry*, regulators, public.

(* 'Industry' includes Tesla, but others may learn as well.)

Both of these points are defensible. This tragic incident provides clear evidence on both points.

I wasn't intending to blame all members of the industry for the over-hype, if that was your concern. ...and apologies if I've misinterpreted your concern.

The 'technology press' is likely the main contributor to the over-hype. But some in industry seem to use hype as a marketing tool.

None of these groups are monolithic. Some parts of industry may be responsible and cautious, while others in industry are literally running 'beta' tests on public highways; even while some members of the public may have too much misplaced faith in the over-hyped and still-immature technology.

Apologies that the quoted sentence caused concerns. Parsed correctly and understanding who is 'industry', I think that it's defensible.

PS: Forum posts by their nature are compact, and subtle meaning and intent can be lost. It requires far too much time to craft posts that cannot be misinterpreted.


RE: Tesla Autopilot, fatal crash into side of truck

VE1BLL

Look at the Washington post link in Bimr's post. It has the crash report.

The intersection is uncontrolled in the sense of having no lights, so those turning left must yield right of way to oncoming traffic. Cross traffic, which has a stop sign, must yield as well.

You can also look at the intersection on google earth 29°24'38.71" 82°32'22.44".

Hydrae

RE: Tesla Autopilot, fatal crash into side of truck

That would be 29°24'38.71"N 82°32'22.44"W (The original co-ordinates defaulted to somewhere in the Himalayas!)

This intersection looks as flat as can be with nothing that would have obstructed visibility between the two vehicles.

RE: Tesla Autopilot, fatal crash into side of truck

(OP)
"...Washington post link in Bimr's post. It has the crash report."

The Washington Post has a map (which I've seen before) extracted from a report. But I don't see any crash report, or even any link to a crash report.

This happened weeks ago. If the trucker had failed to yield, then he'd likely have been ticketed by now. There's no mention anywhere that I've seen of the trucker being ticketed, so it seems likely that there was adequate room.

If the truck had suddenly pulled into the Tesla's path, and the Tesla then applied full brakes, then that would be a different story. That's not what happened here.

RE: Tesla Autopilot, fatal crash into side of truck

There wasn't enough room; otherwise the car wouldn't have hit the truck. Leaving enough room for the car to safely give up its right of way, had the car or its occupant actually applied the brakes, isn't the same as leaving enough room to turn left in front of the car.

I'm with a few others here. Tesla was beta testing their system on the public and it caused someone to die.

I also agree with the comments about people not being capable of suddenly taking over the controls when they're not really paying attention. Even airplane pilots who have trained for that situation have issues when this occurs.

RE: Tesla Autopilot, fatal crash into side of truck

This sort of brings up the question: are these computers hackable, or do they have some sort of fail-safe against a rogue program? Not because the person behind the wheel may not notice, but because some have been talking about driving without a person behind the wheel at all.

RE: Tesla Autopilot, fatal crash into side of truck

Certainly, in California, a left-turning vehicle must yield to a vehicle coming straight-on, under the law.

gotta love having GPS coordinates in the traffic crash report, although, someone's GPS is WAY OFF, like about 200 ft, but good enough.

TTFN
I can do absolutely anything. I'm an expert!
FAQ731-376: Eng-Tips.com Forum Policies forum1529: Translation Assistance for Engineers

RE: Tesla Autopilot, fatal crash into side of truck

I heard an eyewitness said that when they got to the car a Harry Potter movie was playing on the screen. This would explain what appears to have been no pre-contact reaction whatsoever by the Tesla driver, yet the police report states the Tesla driver was "not distracted".

Keith Cress
kcress - http://www.flaminsystems.com

RE: Tesla Autopilot, fatal crash into side of truck

(OP)
byrdj, thank you for the link to the 4-page report. Failure to Yield is listed. I stand corrected on that point.



RE: Tesla Autopilot, fatal crash into side of truck

I have to wonder whether the Tesla's speed could have been retrieved from the event data recorder within the short period of time between the crash itself late on one day and issuance of the report the next. And if not, how was it estimated?

If the Tesla was in fact speeding, the "failure to yield" call against the truck driver should go away . . .

Norm

RE: Tesla Autopilot, fatal crash into side of truck

I realize that's what's on the report. But if that "65" number wasn't taken directly from the EDR, and the short amount of time between the crash and the report suggests that it might not have been, how do we know how accurate it is? That it wasn't just an assumption based on the posted speed in the absence of immediately available and solid evidence to the contrary?

Norm

RE: Tesla Autopilot, fatal crash into side of truck

Some of that information on the police report is just going to represent lack of evidence to the contrary. "No distraction" may mean "he wasn't on his cell phone." The "65 mph" may just mean "no evidence of higher speed". (Don't they estimate speed from the skid marks? And there weren't any here, right?)

RE: Tesla Autopilot, fatal crash into side of truck

What screen would the movie be playing on exactly? I highly doubt you could play a movie on the in-dash system while the car is moving.

RE: Tesla Autopilot, fatal crash into side of truck

If I recall correctly, one of the first "updates" that Tesla had to make to the Autopilot system was to limit the speed to not much over whatever the posted speed limit is (IIRC 5 mph over).

The movie was apparently playing on a portable DVD player - not on the vehicle's own screen. The vehicle itself is not capable of playing movies on its own screen while driving, but there's nothing it can do to stop people from using separate equipment to do so.

RE: Tesla Autopilot, fatal crash into side of truck

Did I just see another rollover crash reported 9 days after the event?

RE: Tesla Autopilot, fatal crash into side of truck

You know, it occurs to me that the accident scene must have been pretty gruesome already. But would have been worse if that car had just navigated its way on home and parked in the garage or whatever the normal procedure was. And that's bound to happen at some point with self-driving cars, too.

RE: Tesla Autopilot, fatal crash into side of truck

Does that mean that the computer can be charged with "leaving the scene of an accident"? ;-)

John R. Baker, P.E. (ret)
EX-Product 'Evangelist'
Irvine, CA
Siemens PLM:
UG/NX Museum:

The secret of life is not finding someone to live with
It's finding someone you can't live without

RE: Tesla Autopilot, fatal crash into side of truck

So it didn't avoid TWO major objects in its path... a semi and a tree.

Dan - Owner
http://www.Hi-TecDesigns.com

RE: Tesla Autopilot, fatal crash into side of truck

(OP)
Very old advice: "A.I. is hard."


RE: Tesla Autopilot, fatal crash into side of truck

JSteven - skid mark lengths are worthless for estimating the speed of ABS-equipped vehicles. Generally, there aren't any, by ABS system intent.

MacGyver - I'm guessing that all bets were off as far as autopilot being able to do anything right from the moment of the first "failure to avoid".


Norm

RE: Tesla Autopilot, fatal crash into side of truck

See the report linked above. The car hit the truck prior to smashing through two fences prior to hitting a utility pole. I count three more failures to avoid after leaving the scene of the first failure to avoid.

RE: Tesla Autopilot, fatal crash into side of truck

From what I understand, Tesla uses a radar unit in the grille for obstacle avoidance and a camera behind the rear-view mirror for lane detection. I would expect the autopilot software to disconnect due to a system fault after the camera was knocked off the car.

The above is also likely why it drove under the truck. The radar unit in the grille is looking ahead, not up, hence it didn't think anything was in its path. Not ensuring the road is clear for the full height of the car is a rather large deficiency in the system.
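For intuition only, a toy elevation-geometry calculation (every number below is an assumption for illustration, not a Tesla spec): a grille-mounted radar with a narrow vertical beam loses the underside of a high trailer entirely in the last few metres, by which point braking can no longer help even if earlier returns were dismissed as overhead structure.

```python
import math

# Every number below is an illustrative assumption, not a Tesla spec.
sensor_height = 0.5    # m: radar mounted low, in the grille
half_beam_deg = 5.0    # degrees: vertical half-angle of the radar beam
trailer_bottom = 1.3   # m: underside of a typical semi-trailer floor
speed = 29.0           # m/s: roughly 65 mph

# Range inside which the trailer underside sits entirely above the beam:
#   sensor_height + r * tan(half_beam) = trailer_bottom
r_blind = (trailer_bottom - sensor_height) / math.tan(math.radians(half_beam_deg))
t_blind = r_blind / speed
print(f"trailer leaves the beam inside {r_blind:.1f} m ({t_blind:.2f} s to impact)")
```

With these assumed numbers the trailer floor rises out of the beam inside roughly 9 m, a fraction of a second before impact at highway speed.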

There is also this incident where it appears the "Summons" feature was accidentally activated and the car drove under a trailer.

http://jalopnik.com/man-claims-his-tesla-model-s-c...

And the new rollover incident on the Pennsylvania Turnpike Friday being reported.

I see no mention of this portable DVD player in the accident report, and you'd think that would be a rather important detail to note.

RE: Tesla Autopilot, fatal crash into side of truck

I can't see it losing its lane sensor without disconnecting; it would be baffling if it didn't have that feature. It could well be that the autopilot did disconnect at impact, and that explains the long travel: without a human or computer to operate the brakes, it could have coasted the distance. The accident report shows the Tesla traveling roughly straight after the crash toward the bank; I suspect it accidentally threaded between the two trees and then struck the pole, which finally took all its speed and spun it out.

The Tesla Model S is only 56.5 inches high according to Google, while a typical semi-trailer floor sits at about 52 inches. Thus, if it missed the wheels it could easily pass under while smashing the roof down, yet still keep much of its speed.
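A quick sketch of that clearance arithmetic (heights as quoted above):

```python
# Heights as quoted in the post above; simple arithmetic.
car_height_in = 56.5       # Model S overall height
trailer_floor_in = 52.0    # typical semi-trailer floor height
overlap_in = car_height_in - trailer_floor_in
print(f"roof/trailer overlap: {overlap_in:.1f} in")  # roof/trailer overlap: 4.5 in
# Only ~4.5 in of roof structure engages the trailer, so the car sheds
# its roof but keeps most of its kinetic energy and keeps rolling.
```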

Professional and Structural Engineer (ME, NH, MA)
American Concrete Industries
www.americanconcrete.com

RE: Tesla Autopilot, fatal crash into side of truck

It would have to miss the trailer's landing gear as well. Hitting that might even be worse than hitting the trailer's wheels.

Apparently the trailer in question did not have the side skirts that are being pitched for fuel-economy reasons; those would presumably have registered with the autopilot, which could then have taken whatever actions were appropriate. Can Tesla's autopilot execute a high-speed lane change and back again at an autocross level of intensity?

Never mind that the failure to see an obstruction 52" above the pavement when the Tesla itself is taller than that is really troubling. Even if you're willing to sacrifice the roof and optimistically assume that it would get cleanly sheared off, the top of any occupants' heads is likely to be at or slightly higher than 52" as well.


Norm

RE: Tesla Autopilot, fatal crash into side of truck

Quote (Norm)

Can Tesla's autopilot execute a high speed lane change maneuver and back again at an autocross level of intensity?

I don't know about a maneuver that extreme, but the driver in the fatal accident previously posted a video of his Tesla making an abrupt maneuver onto the shoulder to avoid a truck that tried to occupy the same space as the Tesla. However, I believe the safest response to being cut off will always be to brake straight ahead.

Agreed that missing an obstruction below the roof of the car seems like something that shouldn't happen.

Professional and Structural Engineer (ME, NH, MA)
American Concrete Industries
www.americanconcrete.com

RE: Tesla Autopilot, fatal crash into side of truck

Why is there repeated mention of the sensor being incapable of seeing objects as high as the bottom of the trailer? Was there any factual statement released about the height limits of the sensor?

I only saw a statement saying the sensor was unable to differentiate the white truck from the bright clear sky behind it. This would seem to insinuate the problem is not one of geometry but of optical analysis.

Have I missed something else?

RE: Tesla Autopilot, fatal crash into side of truck

Quote (Tesla PR Letter)

The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer

Quote (Elon Musk twitter post)

Radar tunes out what looks like an overhead road sign to avoid false braking events

The impression (my own, not cited) is that the visual camera didn't distinguish the trailer due to the bright sky/light while the radar incorrectly categorized it as an overhead sign (possibly because the camera was not recognizing the truck's trailer).
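If that impression is right, the failure is a conjunctive-fusion trap: each channel has its own plausible reason to dismiss the obstacle, and requiring both to agree before braking dismisses it entirely. A schematic sketch with entirely hypothetical logic (not Tesla's actual code):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    in_path: bool        # object geometrically ahead of the car
    classification: str  # what the channel's classifier calls it

def should_brake(camera: Optional[Detection], radar: Optional[Detection]) -> bool:
    """Hypothetical conjunctive fusion: brake only when both channels
    independently confirm a braking-relevant obstacle. Tuning the radar
    to ignore 'overhead sign' returns suppresses false braking, but it
    also means two individually reasonable dismissals add up to none."""
    cam_ok = camera is not None and camera.in_path
    radar_ok = (radar is not None and radar.in_path
                and radar.classification != "overhead_sign")
    return cam_ok and radar_ok

# White trailer against a bright sky: the camera reports nothing and the
# radar bins its large return as an overhead sign. No brake is applied.
print(should_brake(None, Detection(in_path=True, classification="overhead_sign")))  # False
```

The design choice being illustrated: AND-fusion trades missed detections for fewer false alarms, which is exactly the trade-off Musk's "false braking events" remark describes.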

Professional and Structural Engineer (ME, NH, MA)
American Concrete Industries
www.americanconcrete.com

RE: Tesla Autopilot, fatal crash into side of truck

@TehMightyEngineer: I see, I interpreted a bit differently before, but I think you're probably right.

Personally, I think the biggest fault lies in the user's decision to give complete control to a system not deserving of it. It's never been advertised as being able to replace driver attention, and you have to hit an 'OK' button (or similar) every time you turn on the Autopilot feature, agreeing that you, as the driver, understand you have to keep your hands on the wheel and pay attention, etc.

However, it's also called AUTOPILOT, which, in common parlance, basically means you can relinquish control and let the machine take over. I think the name is a bit unfortunate, but I don't know how much responsibility that puts on Tesla for any 'deception'. Obviously we're all accustomed to just hitting 'OK' without reading EULA-like text on a screen, so the effectiveness of those dialog windows only exists in a courtroom.

It's unfortunate. I don't put any blame on Tesla, personally. They never advertised their system as a foolproof object-avoidance guarantee - just as an augmented system designed to assist a driver in their normal driving habits. I think people have trouble discerning the current state of the technology from the very public discussions from Tesla about what they /want/ it to eventually be. I am guessing early adopters may be more prone to overestimating the car's capabilities, as they are most likely very eager to see the new technology and may have their rose-tinted glasses on at times.

Or maybe it was simply a mistake born of the lapses in attention that all drivers are capable of.

RE: Tesla Autopilot, fatal crash into side of truck

JNieman: I'd agree with that assessment (and Volvo engineers had a similar sentiment earlier, though obviously they have bias given that Tesla is a competitor).

Professional and Structural Engineer (ME, NH, MA)
American Concrete Industries
www.americanconcrete.com

RE: Tesla Autopilot, fatal crash into side of truck

The two hypotheses seem to be the sensors were looking too low, or couldn't see a light colored trailer against a light colored sky. What about those dark colored things called tires that must have passed several times at sensor level in front of the high portion of the trailer?

RE: Tesla Autopilot, fatal crash into side of truck

Quote (TME)

However, I believe the safest action to being cutoff will always be to brake straight ahead.
One of the primary intentions of ABS is to allow maneuverability without requiring the driver to modulate the braking effort himself, so the actions of steering and braking shouldn't be considered mutually exclusive.

FWIW, Autopilot should be better at mixing the two because it isn't limited by such subjectives as panic or unfamiliarity with what the car is capable of doing and how to make that happen.

I've watched that video where the Tesla avoided getting hit, but getting around the much bigger truck would have required something more extreme at least in lateral travel if not necessarily in lateral and yaw accelerations.


Norm

RE: Tesla Autopilot, fatal crash into side of truck

Quote (Norm)

FWIW, Autopilot should be better at mixing the two because it isn't limited by such subjectives as panic or unfamiliarity with what the car is capable of doing and how to make that happen.

Very good point; this was the primary reason my normal response while driving would be to brake straight ahead, but you would expect (hope?) that an autopilot could have a more proactive response without worsening the situation.

Professional and Structural Engineer (ME, NH, MA)
American Concrete Industries
www.americanconcrete.com

RE: Tesla Autopilot, fatal crash into side of truck

I totaled my little Ranger pickup a while back. How it happened was that I was in the right lane on a freeway, had traffic behind me and to my left, and a truck moving very slowly merged in front of me. The mistake I made was assuming that the truck had an acceleration lane, when in fact, due to long-term construction, there was none, so he had to go into the lane or stop. The key to avoiding that wreck was to anticipate that somebody else is going to do something and hit your brakes even when there's currently zero obstructions in your path. It makes me curious to what extent they can work that into the software.

A similar issue: You're driving down a freeway at speed in the middle lane, and for some reason, traffic is backed up and stopped in the right lane. Your lane is perfectly clear. What do you do? Well, past experience shows that somebody in that right lane is going to pull in front of you, and if you wait until it happens to do anything, you have problems.

In this case, a major part of avoiding that wreck would have been realizing what was going on when that truck started moving. Not seeing it when it was right in front of the car is bad enough. Not seeing it move into that position at a truck's typical snail's pace is worse.

It'll be interesting to see what this costs Tesla.

RE: Tesla Autopilot, fatal crash into side of truck

JStephen: that's one aspect where I actually imagine a full driverless car would excel. For example, Google's car has shown that it can keep track of objects that are normally obscured from the driver, and keep track of them in all directions. Thus, a car sufficiently programmed and experienced might realize that the likely outcome of the situation you were in would be vehicles merging into your lane, and may be able to anticipate it to some degree.

Professional and Structural Engineer (ME, NH, MA)
American Concrete Industries
www.americanconcrete.com

RE: Tesla Autopilot, fatal crash into side of truck

(OP)
The following is speculation, but probably true: The Autopilot would have seen the truck (the tractor part of the tractor-trailer), and it would have seen the truck's rear wheels under the front of the trailer. The truck's tires are presumably black, and presumably perfectly visible. These image features would have been exiting the lane towards the Tesla's right, and the Autopilot then presumably decided that the space behind the truck was empty. But even then, the system seemingly decided to do a 65 mph flyby mere feet behind the black tires that were still moving across the road in front of it. That's not something that any sensible driver would ever do.
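The missing "history file" behavior described above could be sketched as a simple crossing-track memory: once something is seen crossing the lane, the path is not declared clear until the track - not just the currently visible part of it - has confirmably exited. A toy illustration, with every threshold and structure invented for the sketch:

```python
class CrossingTrackMonitor:
    """Toy track-history gate: remembers objects seen crossing the lane and
    refuses to call the path clear until each one was last seen well clear
    of it.  Purely illustrative; thresholds and layout are assumptions."""

    CLEAR_LATERAL_M = 4.0          # roughly beyond lane edge plus shoulder

    def __init__(self):
        self.last_seen = {}        # track_id -> last lateral offset (m)

    def observe(self, track_id, lateral_m):
        self.last_seen[track_id] = lateral_m

    def lose(self, track_id):
        """Sensor dropout (e.g. glare): only forget the track if it was
        last seen well clear of the lane; otherwise assume the rest of it
        (say, a trailer behind a tractor) may still block the path."""
        if abs(self.last_seen.get(track_id, 0.0)) > self.CLEAR_LATERAL_M:
            self.last_seen.pop(track_id, None)

    def path_clear(self):
        return not self.last_seen

m = CrossingTrackMonitor()
m.observe("tractor", -1.0)     # tractor seen crossing, still in our lane
m.lose("tractor")              # lost mid-crossing (white trailer vs. sky?)
print(m.path_clear())          # False: something crossed and never exited
```

With this kind of memory, "I can't see it any more" and "it has left the road" are different states, which is exactly the distinction the speculation above says was missing.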

If it's a system design failure, then the nicest thing we can say is that they're not done yet. It would be preferable if it were a simple hardware failure, as opposed to such a blatant system design failure. TBD what really caused this failure.

I half-expect that the US DoT will request that Tesla remotely disable the Autopilot feature on the entire fleet until it's finished, tested and properly certified. If this requires major hardware modifications, then it opens a new can of worms.

Very old advice: "A.I. is hard." (<- Many ships have crashed upon those rocks, so to speak.)

RE: Tesla Autopilot, fatal crash into side of truck

Is there any reason to believe that all of the sensors on the car were working, properly calibrated, pointed the right way, unobstructed, etc.?
Probably something that Tesla engineers can pull out of a log file, but is there also a diagnostic that the driver can access? Or a warning?

STF

RE: Tesla Autopilot, fatal crash into side of truck

What do you mean not finished? What does it claim to do that it cannot?

RE: Tesla Autopilot, fatal crash into side of truck

(OP)
"What does it claim to do that it cannot?"

Legal disclaimers aside, there's an implied capability of driving down the road without crashing into the sides of trucks that happen to be painted white. One can argue if this was an 'implied capability', or a naïve assumption. In any case, it's a blatant system failure of something. I'm sure that Tesla is looking into it...

There's been a huge amount of hype about 'self-driving cars', in the context of "A.I.". These sorts of failures are quite revealing. It all goes back to "A.I. is hard."

It's not very interesting (in terms of technology) to get into the legal disclaimers, or if the trucker failed to yield. The system failure (it not braking) is the main point. It's relevant to the hype, the actual state of development, the fitness for purpose, and how many years into the future until they'll be ready for widespread deployment on public roads.

I acknowledge that others may have differing opinions. Which is fine.

RE: Tesla Autopilot, fatal crash into side of truck

I suppose an interesting question is whether you can build a good enough driver by accreting lots of small tasks and skills, or do you need a mighty monolithic platform? Tesla have obviously decided that the first approach may be sufficient

Cheers

Greg Locock


New here? Try reading these, they might help FAQ731-376: Eng-Tips.com Forum Policies http://eng-tips.com/market.cfm?

RE: Tesla Autopilot, fatal crash into side of truck

I have laser cruise control on my van. I don't use it often because it will just slow down behind a slowpoke driver. So I switch it off, overtake, and switch it back on only when the lane is clear or the cars ahead are moving at a good pace. I would not have confidence using an autopilot driving system.

RE: Tesla Autopilot, fatal crash into side of truck

It makes sense to put the blame on the driver, but performing BETA testing with people's lives is a dumb thing to do.

Reading about the system further, it uses the windshield mounted camera as an "eye" to classify objects, lane markers, signs etc. Lots of image gathering and learning has been done so the fancy processing can figure out what it's actually looking at. It classifies what the objects are so the system can know what these objects might do or what type of hazard they present. The camera is the primary system used to detect objects and plan the driving path. The detection system comes from Mobileye with Tesla apparently doing their own self-learning algorithms. The system also has a forward facing radar unit in the grill. To me, that says the radar unit, as a minimum, is a backup sensor used to ensure physical objects in front of the car are properly detected so the car doesn't drive into them.

So, sure, the camera didn't realize it was a trailer and possibly classified it as an overhead sign by mistake. But the radar unit also failed to tell the system there was a blockage the (whole) car could not drive past, and the white of the trailer has no relevance to how the radar operates. If the radar unit didn't pick out the object, that also tells me it's not looking high enough to ensure objects aren't in the way of the roof of the car.

The system detecting objects probably 30-40' apart on each side of its intended path and deciding it was OK to drive between them at full speed also seems to be a logic failure. With objects limiting the space for the car to pass, at some point it should decide the path is clear but has a lower level of "safeness", and so proceed with an appropriately elevated level of caution.
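The "lower level of safeness" idea could be expressed as an advisory speed scaled by lateral clearance. A hypothetical sketch, with every number invented for illustration:

```python
def advisory_speed(set_speed_mph, gap_width_ft, vehicle_width_ft=7.0,
                   comfortable_margin_ft=6.0):
    """Hypothetical caution rule: full speed only with a comfortable margin
    on both sides; scale down linearly as the gap tightens; refuse to
    proceed if the vehicle plus a minimum margin does not fit.
    All numbers are illustrative, not from any real system."""
    clearance = gap_width_ft - vehicle_width_ft
    if clearance <= 1.0:                     # does not safely fit: stop
        return 0.0
    full_speed_clearance = 2 * comfortable_margin_ft   # margin each side
    factor = min(1.0, clearance / full_speed_clearance)
    return set_speed_mph * factor

print(advisory_speed(65, 40))   # wide gap: full 65 mph
print(advisory_speed(65, 12))   # tight 12 ft gap: roughly 27 mph
print(advisory_speed(65, 7.5))  # barely wider than the car: 0 mph
```

Any monotone rule of this shape captures the point being made: threading a narrow, moving gap at an undiminished 65 mph should never fall out of the planner as "path clear, proceed".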

I agree, "A.I. is hard."

RE: Tesla Autopilot, fatal crash into side of truck

Quote (LionelHutz)

but performing BETA testing with peoples lives is a dumb thing to do.

I wouldn't say I disagree with this, but I would say there are exceptions. Statistically (not enough data, I know, but for argument's sake let's say there is), it appears that the Tesla autopilot results in fewer accidents when used properly, according to Tesla. To me this is not very different from the FEMA and SAC joint venture response to the Northridge earthquake. They were effectively implementing steel seismic designs that were still in a "beta" testing phase. However, they showed that a flaw existed in the then-current designs, and their fix was determined to improve performance even while still in a "beta" testing phase.

I don't fault Tesla for offering this "beta" feature as long as it was demonstrated beforehand that it statistically improves safety. If they offered it without qualifying its safety and just lucked out that it was safer than the average driver, then I entirely agree that it was a dumb thing to do.

Professional and Structural Engineer (ME, NH, MA)
American Concrete Industries
www.americanconcrete.com

RE: Tesla Autopilot, fatal crash into side of truck

"Statistics" necessarily being "statistics to date" may be enough to say that using the buying public for such beta testing was somewhat better than totally dumb . . . provided that every customer was fully aware that they would be in that situation (and by a lot more than some fine print buried in a hundreds-of-pages-long owner manual).

But statistical grounds are still not enough to take it out of the realm of "personally risky" to each owner, given the near-infinite range of possibilities whose breadth you'd be trying to predict from a finite and ultimately small sample.


Norm

RE: Tesla Autopilot, fatal crash into side of truck

Regarding statistics, Tesla appears to be playing fast and loose there too with their metric of fatalities per mile driven, inasmuch as they conflate all road miles for cars controlled by gray matter with those controlled by chips, which are only supposed to be driven on limited- or controlled-access highways (inherently safer on a miles-driven basis). How many of Tesla's 130 million miles were driven on roads with no at-grade intersections? Nearly all of them, would be my guess, for this very reason.

The Tesla must only be driven on those types of highways because it doesn't know how to resolve traffic control devices, bicycles, pedestrians, or (as brutally evident) cross-traffic. Regardless of what it thought the big white trailer was, its computer managed to thread a pretty narrow window to kill the driver. It had to have seen the tractor, trailer wheels, and landing gear pull in front of it, and the rear of the trailer entering the intersection after it, and thought, "hey, I have one second to shoot this 25' gap that is going to open up in front of me under this road sign that just appeared." I read somewhere that the manufacturer of the sensor suite says it just doesn't do cross-traffic. At this point, it's a novelty.

RE: Tesla Autopilot, fatal crash into side of truck

Interesting that since this happened, I've gotten at least three ad emails from Tesla touting their safety performance.

Please remember: we're not all guys!

RE: Tesla Autopilot, fatal crash into side of truck

Who was it that said "The best defense is a good offense"?

John R. Baker, P.E. (ret)
EX-Product 'Evangelist'
Irvine, CA
Siemens PLM:
UG/NX Museum:

The secret of life is not finding someone to live with
It's finding someone you can't live without

RE: Tesla Autopilot, fatal crash into side of truck

Sounds like another autopilot crash, but this time the autopilot was engaged outside the recommended conditions. The road had no center marking, which is not an approved operating condition. Can't say this was an autopilot failure. Maybe more a case of did-not-read-the-manual, or who-cares-what-the-manual-said.

RE: Tesla Autopilot, fatal crash into side of truck

"The road had no center marking, which is not an approved operating condition."

This sort of implies that the driver has to keep track of and remember all the caveats. Seems to me that the program ought to have its own check, "Road center marking not detected, disengaging in 5 seconds unless center markings are re-acquired."
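The suggested check is essentially a watchdog timer on lane-marking confidence. A minimal hypothetical sketch of such a grace-period state machine (timings and behavior invented for illustration):

```python
class LaneMarkingWatchdog:
    """Hypothetical grace-period check: warn the moment markings are lost,
    disengage if they are not re-acquired within GRACE_S seconds.
    Not any manufacturer's actual logic."""
    GRACE_S = 5.0

    def __init__(self):
        self.lost_since = None
        self.engaged = True

    def step(self, t, markings_detected):
        """t: current time in seconds; returns the system state."""
        if not self.engaged:
            return "disengaged"
        if markings_detected:
            self.lost_since = None            # markings re-acquired
            return "ok"
        if self.lost_since is None:
            self.lost_since = t               # start the grace period
            return "warning"
        if t - self.lost_since >= self.GRACE_S:
            self.engaged = False              # grace period expired
            return "disengaged"
        return "warning"

w = LaneMarkingWatchdog()
print(w.step(0.0, True))    # ok
print(w.step(1.0, False))   # warning: markings lost, countdown running
print(w.step(6.1, False))   # disengaged: not re-acquired within 5 s
```

As a later post notes, five seconds may already be too long at highway speed; the sketch only shows where such a check would live, not what the right number is.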

TTFN
I can do absolutely anything. I'm an expert!
FAQ731-376: Eng-Tips.com Forum Policies forum1529: Translation Assistance for Engineers

RE: Tesla Autopilot, fatal crash into side of truck

Quote (Cranky108)

The road had no center marking...

Looks to me like the highway center is well marked by a wide median strip, which the Tesla did not cross.

RE: Tesla Autopilot, fatal crash into side of truck

JStephen - some of the other chatter on teslarati is more than a little concerning.

From http://www.teslarati.com/tesla-model-x-crash-monta...

Quote:

TMC member electricity says, “I’m starting to fear Tesla will limit AP even more and screw it up for the rest of us because of the stupidity of others.”


Norm

RE: Tesla Autopilot, fatal crash into side of truck

Quote (IRstuff)

This sort of implies that the driver has to keep track of and remember all the caveats.

This would actually be similar to many aviation autopilots, which have certain limits outside of which the autopilot is not intended to be used; usually minimum airspeed and cross-wind, but often various others. Obviously there is a higher level of proficiency and responsibility associated with getting a pilot's license than a driver's license, so it may be a little too much apples and oranges. Still, as long as the limitations are made crystal clear in the documentation and are reasonable, I think we can have a few things in this world that are not 100% idiot-proof. But I agree that, where practical, the more checks by which the autopilot can warn when you've left the defined limits, the better.

Professional and Structural Engineer (ME, NH, MA)
American Concrete Industries
www.americanconcrete.com

RE: Tesla Autopilot, fatal crash into side of truck

The main difference is that pilots KNOW that ANYTHING they screw up, they die. They are mostly very conscientious about pre-flight checks, checklists, etc. If starting up a Tesla required going through pre-drive checks that required sign-offs and inspections, things might be a bit more professional.

TTFN
I can do absolutely anything. I'm an expert!
FAQ731-376: Eng-Tips.com Forum Policies forum1529: Translation Assistance for Engineers

RE: Tesla Autopilot, fatal crash into side of truck

Quote:

pre-drive checks that required sign-offs

I like it! Heck, they have BIG displays, put up a rules quiz that changes each time. The first question is, "will you be using the auto-drive today"? If they answer NO the quiz ends. If they answer YES then a question or two that have to actually be read to answer correctly.

Keith Cress
kcress - http://www.flaminsystems.com

RE: Tesla Autopilot, fatal crash into side of truck

How long before we have the first case of a drunken driver using the autopilot to get home?

Professional and Structural Engineer (ME, NH, MA)
American Concrete Industries
www.americanconcrete.com

RE: Tesla Autopilot, fatal crash into side of truck

Quote (TME)

Still, I think as long as the limitations are made crystal clear in the documentation and are reasonable then I think we can have a few things in this world that are not 100% idiot proof. But I agree that, if practical, the more checks where the autopilots can warn when you've left the defined limits, the better.
I suspect that most car owners don't look at the owner manual that came with their car for much more than finding out how to operate the infotainment system, whatever other comfort & convenience features, and maybe the HVAC. Even among people having at least some claim to car enthusiasm there is considerable lack of awareness of O.M. content relative to the things that could really matter - vehicle dynamic assist systems, "driving modes", and other topics related to the actual driving of their cars.

The more serious the potential consequences resulting from error, the closer any thing needs to be to 100% idiot proof. As fast as circumstances can change, expecting a mostly inattentive driver to recognize that a warning was issued and take the correct actions - all within the time between when the AP issued its warning and the point of no return - may not be realistic.


Norm

RE: Tesla Autopilot, fatal crash into side of truck

The "blame Tesla" stance (not necessarily people here, but the cumulative opinions I've seen in general) is kind of hard for me to get behind. It feels like Tesla provided a product that has made things better for the driver, but because it didn't make it /perfect/ for the driver, it's somehow 'in the wrong'. The driver is better off for having the feature... so what good does it do to rub Tesla's nose in this incident and condemn them? It seems like it would only serve to set the precedent that it's not worth innovating at all, because if you try to improve a little bit, you'll only be tarnished for not improving a lot.

Reminds me of the tired of engineer/manager joke:

Quote (anyone/no one/someone)

A man in a hot air balloon realized he was lost. He reduced altitude and spotted a woman below. He descended a bit more and shouted, "Excuse me, can you help me? I promised a friend I would meet him an hour ago, but I don't know where I am."

The woman below replied, "You're in a hot air balloon hovering approximately 30 feet above the ground. You're between 40 and 41 degrees north latitude and between 59 and 60 degrees west longitude."

"You must be an engineer," said the balloonist. "I am," replied the woman, "How did you know?"

"Well," answered the balloonist, "everything you told me is, technically correct, but I've no idea what to make of your information, and the fact is I'm still lost. Frankly, you've not been much help at all. If anything, you've delayed my trip."

The woman below responded, "You must be in Management." "I am," replied the balloonist, "but how did you know?"

"Well," said the woman, "you don't know where you are or where you're going. You have risen to where you are due to a large quantity of hot air. You made a promise which you've no idea how to keep, and you expect people beneath you to solve your problems. The fact is you are in exactly the same position you were in before we met, but now, somehow, it's my fault."

Note, I'm not saying they don't need to improve, or should stop trying to improve. I just don't /fault/ them for a driver /negligently/ driving 'hands free' with a feature that tells you in no uncertain terms that it isn't for 'hands free' use.

RE: Tesla Autopilot, fatal crash into side of truck

Tesla probably can't be faulted for any individual driver driving 'hands free'. Not with whatever legal disclaimers exist in their product documentation, if that sort of CYA approach is deemed sufficient.

Failure to be more pro-active in keeping hands free driving from happening is a different story, as 'hands free' driving if only for personal experimentation is clearly a predictable occurrence. Especially since Tesla freely discloses that AP is still in beta.


Norm

RE: Tesla Autopilot, fatal crash into side of truck

I seem to remember a juvenile joke from years ago.
A not very smart person bought a motor home.
He headed out on the freeway.
He engaged cruise control, and then left the drivers seat to make himself a cup of coffee in the galley....

For years I thought that it was just a silly joke.
Now I wonder.
The height of irony would be if the Tesla delivering the Darwin Award impacted a truck while on auto-pilot.
I wonder if it would be better to send the Darwin Award by drone?

Bill
--------------------
"Why not the best?"
Jimmy Carter

RE: Tesla Autopilot, fatal crash into side of truck

How did the radar unit fail to sense a row of posts with a guardrail on top?

Why would the AP even engage if the road is questionable?

Why would the AP not disengage if the road was good but turned questionable?

Why isn't there some detection method to ensure the driver's hands are still on the wheel since, according to Tesla, the driver's hands need to be on the wheel?

With these incidents occurring, and the more I read about the system, the more convinced I am that Tesla is attempting to rely primarily on the camera because they believe a camera-based system is the future. However, it would appear the so-called AI system behind the camera needs to get A LOT smarter and do A LOT more learning about what it's seeing before it's ready. And I'm not fully convinced a camera system can ever be ready. As a really basic description, the system relies on pattern or shape recognition, but there is an almost infinite number of patterns or shapes possible out on the road. So a system that takes some new pattern or shape and tries to "best fit" it to the patterns or shapes in its learned history will make identification mistakes.

The worst part is that when wrong, you get a hazard misidentified as something benign which results in the system driving on oblivious to the fact there even is a hazard. No warnings occur and the driver needs to catch the system acting up in time or the car crashes full speed into the hazard.

The other situation is misidentifying a benign object and taking evasive action, resulting in a crash because surrounding drivers don't expect the car to suddenly brake or swerve.
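The "best fit" failure mode described above can be made concrete with a toy nearest-neighbor classifier: forced to pick the closest known shape, it labels a novel hazard as something benign with no hint that it is guessing. Everything here (the feature, the prototype library, the threshold) is invented for illustration:

```python
def classify(feature, library, max_distance=None):
    """Toy 1-D nearest-neighbour 'shape' classifier.  `feature` here stands
    in for something like the height of an object's bottom edge in metres;
    real systems use rich multi-dimensional descriptors.  With
    max_distance=None it always returns its best fit, however poor."""
    label, proto = min(library.items(), key=lambda kv: abs(kv[1] - feature))
    if max_distance is not None and abs(proto - feature) > max_distance:
        return "unknown -- proceed with caution"
    return label

# Hypothetical prototype library: bottom-edge heights of known classes.
library = {"overhead sign": 5.0, "car ahead": 0.3}

# A crossing trailer's side, bottom edge ~3.5 m up, fits neither prototype
# well -- but forced best-fit confidently calls it the benign one:
print(classify(3.5, library))                     # 'overhead sign'
# With an explicit distance gate, the same input is flagged as novel:
print(classify(3.5, library, max_distance=1.0))   # 'unknown -- proceed...'
```

The second call illustrates the point made above: the dangerous output isn't "I don't know", it's a confident wrong label, and only a system that reports goodness-of-fit can tell the difference.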

RE: Tesla Autopilot, fatal crash into side of truck

@NormPeterson

Should every car maker pro-actively keep people from driving while under the influence of drugs? I mean obviously that's a predictable occurrence. I think the onus you suggest putting on them is unreasonable when compared to current practice/norms.

@LionelHutz
I could see it being beneficial to engage the AP in questionable conditions as it has many other features besides object avoidance or lane-following. After all, it's meant to /assist/ the driver, not replace interaction of the driver.

I'm betting a sensor to ensure a hand is (or hands are) on the wheel would be pretty easy to defeat, and, like I mentioned in my response to NormPeterson, I think it's unreasonable to require them to go so much further than every other common car on the market.

As to the frequency of negatives vs positives in computer assisted driving, the decision is a pretty big one to weigh. Obviously proponents have been touting statistics of these intelligent cars being x% safer than general drivers. I remain thoroughly unconvinced, personally, because I don't think we've had a good analysis of the data and I've never seen mention of how they interpret the data. I doubt the current Tesla driver is a typical representation of all drivers in general. It could be that Tesla drivers, as they are right now, are a safer-driving demographic to begin with.

One thing I do find difficult is equating a computer mistake with a human mistake. If a human causes an accident, I mostly think "c'est la vie" and move on. When machines/computers/software make mistakes with similar results, I find myself angrier or more frustrated with it. It's less acceptable for machines to foul up, in my mind, I suppose. That's one reason I tend not to care for increased automation or reliance on machine systems in my personal life. I very much prefer to screw things up directly, rather than have a machine screw it up for me :)

RE: Tesla Autopilot, fatal crash into side of truck

re: impairment; no, the car should not be responsible, even though there are some lock-outs that can be used on cars. The liability problem just shifts to the false negatives who are impaired but somehow manage to get the car started and die. Whose liability is that?

I think the main issue is still with the name "AutoPilot," which connotes way more than it really is. Thus, expectation and reality are not aligned, and that is clearly the fault of Tesla.

TTFN
I can do absolutely anything. I'm an expert!
FAQ731-376: Eng-Tips.com Forum Policies forum1529: Translation Assistance for Engineers

RE: Tesla Autopilot, fatal crash into side of truck

The problem with sloppy work is that the supply FAR EXCEEDS the demand

RE: Tesla Autopilot, fatal crash into side of truck

Well, of course it pisses you off when a machine is supposed to make your life safer but it kills you instead.

Engaging the AP is self-defeating if road conditions make the AP incapable of automatically piloting the car. If you wanted driving assistance, the system could provide visual cues to hazards, nudges in the right direction, or assisted braking. But allowing a driver to engage a system that is supposed to completely take over driving when it's incapable of safely doing so is rather dumb.

Of course, the AP has the same problem deciding if it's capable of driving as it does not recognizing hazards. Basically, it's still too "dumb" to know it can't safely drive the car in the present conditions.

Sorry, but providing a safety feature even though it can be defeated isn't the same as deciding to not bother with a safety feature because it could be defeated. And not bothering is based on comparing to other self driving cars that are common in the marketplace now?

RE: Tesla Autopilot, fatal crash into side of truck

Quote (LionelHutz)

But to allow a driver to engage a system that is supposed to completely take over driving when it's incapable of safely taking over the driving duties is rather dumb.

1) The system is not supposed to completely take over driving.
2) Yes, it would be stupid to let "Jesus take the wheel" and engage a feature that can't operate in poor conditions and expect it to work flawlessly and beyond all advertised capabilities. It's also stupid to turn off your headlights on a rural country road at night while driving 60mph but there's no interlock to prevent it.

There ain't no power in the 'verse that can keep people from making dumb (or to put it politely: uninformed) decisions, and in the end, you are the driver and still responsible for the vehicle. Yes, one day we will hopefully have automatic conveyance, but it is not today, nor tomorrow, nor next year. It's unreasonable to expect more than that.

I heavily agree with IRStuff "I think the main issue is still with the name "AutoPilot," which connotes way more than it really is. Thus, expectation and reality are not aligned, and that is clearly the fault of Tesla." Marketing screws the pooch again, imo.


EDIT: Fixed errant 'spoiler tag'

RE: Tesla Autopilot, fatal crash into side of truck

True, but the experience of the aircraft industry is that having an automatic system that can do the boring stuff but flicks control back to the pilot at a moment's notice is an invitation to chaos, and unfortunately on the road you don't have the luxury of 30,000 ft to sort things out.

Cheers

Greg Locock


New here? Try reading these, they might help FAQ731-376: Eng-Tips.com Forum Policies http://eng-tips.com/market.cfm?

RE: Tesla Autopilot, fatal crash into side of truck

Evidently you don't have the height of a typical trailer (billboard?) to figure it out, either...

Dan - Owner
http://www.Hi-TecDesigns.com

RE: Tesla Autopilot, fatal crash into side of truck

There's an inherent risk in using automation for commuter tasks. How often have we come home with no recollection of how we got there from work? What we repeatedly do without injury fades into the background; it's a form of inattentional blindness. It takes great discipline and training, neither of which is common among drivers, to even begin to keep such things at the top of the risk watchlist. Even trained pilots still manage to land at the wrong airstrip, as happened a week or so ago, even with a 3-person cockpit crew; very often they're so engrossed in the mechanics of the landing that they fail to notice the orange gorilla walking across the stage, or worse, are afraid to contradict the pilot about his landing choice.

TTFN
I can do absolutely anything. I'm an expert!
FAQ731-376: Eng-Tips.com Forum Policies forum1529: Translation Assistance for Engineers

RE: Tesla Autopilot, fatal crash into side of truck

Quote:

1) The system is not supposed to completely take over driving.

Say what? The system drives the car all on its own. The system doesn't rely on the human driver for any gas pedal, brake pedal, or steering input to drive the car down the road, avoiding obstacles and staying in its lane. It completely takes over the driving duties, with the human driver ONLY monitoring it by being prepared to take control back.


Quote:

"Road center marking not detected, disengaging in 5 seconds unless center markings are re-acquired."

I found this rather comical in how useless it would be. You could be dead long before those 5 seconds expire if the car can't figure out where the lane is.


Quote:

It's also stupid to turn off your headlights on a rural country road at night while driving 60mph but there's no interlock to prevent it.

I cannot turn off the headlights when driving at night in the 2011 vehicle I drove to work today.

RE: Tesla Autopilot, fatal crash into side of truck

Interesting note about the owner's manual; however, many used cars don't come with one.

RE: Tesla Autopilot, fatal crash into side of truck

I saw recently (forget where) that it could take as long as 11 seconds for a driver to fully re-engage himself in the business of driving.

On interlocks . . . when a logical system has the potential for doing something incredibly wrong, while still thinking that it's doing everything exactly right and better than a human driver could, you need some means of determining that the human is paying at least a modicum of attention to the road ahead. Maybe more than one method, so that there's some redundancy.

A human driver choosing to drive at speed down a country road at night without headlights . . . serious questions exist whether he should be driving at all. It's not something he'd be unaware of, like he would a logical oversight or mis-step in some programming.


Norm

RE: Tesla Autopilot, fatal crash into side of truck

"You could be dead long before that 5 seconds expires if the car can't figure out where the lane is."

That was not intended for an autonomous system, but for a driver's aid, which has radically different constraints.


"A human driver choosing to drive at speed down a country road at night without headlights"

There are certainly lots of drivers driving without lights in the city, making them massively hazardous to other drivers.

TTFN
I can do absolutely anything. I'm an expert!
FAQ731-376: Eng-Tips.com Forum Policies forum1529: Translation Assistance for Engineers

RE: Tesla Autopilot, fatal crash into side of truck

NormPeterson,

Ontario driver charged after using headlamp instead of headlights: police

They are out there.

The saving grace of aircraft autopilots is that the pilot stays in the seat and pays attention. If the pilot engaged the autopilot and then headed back to the passenger area to see if he could score a stewardess, then disengagement of the autopilot would be very much more dangerous.

--
JHG

RE: Tesla Autopilot, fatal crash into side of truck

The railroads use a method whereby, if you don't move a control or press something every so often, the engine will shut off and the brakes will be activated. That would be enough to make someone pay a little attention.

But another question: How does a Tesla handle school zones?
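That railroad "alerter" logic is simple enough to sketch as a reset-on-input countdown. This is a toy illustration only; the class name, method names, and the 30-second timeout are all my own assumptions, not any railroad's actual specification:

```python
import time

class DeadManSwitch:
    """Sketch of a railroad-style alerter: any operator input resets a
    countdown; if it expires, a penalty action (cut power, apply the
    brakes) is latched on. All names and the timeout are illustrative."""

    def __init__(self, timeout_s=30.0, clock=time.monotonic):
        self.timeout_s = timeout_s
        self.clock = clock
        self.last_input = clock()
        self.penalty_applied = False

    def control_input(self):
        """Call whenever the operator moves any control or presses the button."""
        self.last_input = self.clock()

    def poll(self):
        """Call periodically; returns True once the penalty brake is commanded."""
        if self.clock() - self.last_input > self.timeout_s:
            self.penalty_applied = True
        return self.penalty_applied
```

Note the latch: once the penalty applies, a late button press doesn't release it, which matches how the real systems force a full reset.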

RE: Tesla Autopilot, fatal crash into side of truck

The railroads use an external traffic control system that will stop other traffic from using the track that a stopped train is on well before that stopped train gets hit, and there's no cross traffic.

An "autopilot" equipped vehicle that decides it doesn't want to drive any further but which doesn't get a response from its human driver when it requests the human to take over control ... should do what? Stop in the middle of a live traffic lane so that traffic coming from behind can plow into it? (illegal to stop in a live traffic lane in many places - and rightly so) Pull over to the side of the road (what if there's no breakdown lane)?

Circumstances that an "autopilot" can not deal with don't necessarily come with many seconds or minutes of warning before they happen.

RE: Tesla Autopilot, fatal crash into side of truck

My daughter is driving my old 1993 Oldsmobile. You can't turn the headlights off at night.

Bill
--------------------
"Why not the best?"
Jimmy Carter

RE: Tesla Autopilot, fatal crash into side of truck

(OP)
Cars in Canada generally have Daytime Running Lights.

Far too many cars have a 'User Interface' design / regulatory flaw in that the unwary will be driving on a dark and stormy night, with their DRLs dimly illuminating their forward path, but the rear of their car blacked out like WWII London.

The better designed cars simply refuse to allow this condition. Irrespective of the minimum regulations or headlight control settings, the rear of the car is illuminated when appropriate.

RE: Tesla Autopilot, fatal crash into side of truck

My 1992 Buick always has the headlights on at night. In my 2011 Tacoma I need to turn them on. Seems backwards. The Tacoma doesn't even have daytime running lights.

RE: Tesla Autopilot, fatal crash into side of truck

Virtually all GM products have 'daytime running lights' as standard equipment. And while there were ways to disable this in some older models, I don't think that's an option anymore with the newer cars (I never saw where that was possible with my 2013 GMC Terrain, but it was with my previous car, a 2001 Chevy Blazer). When I was in Denmark a few years ago, we were told that 'daytime running lights' were required by law, and I was told that was one reason why GM decided to make them standard, since several countries were starting to require them anyway.

Along those same lines, here in California, you're now required to turn on your headlights when your wipers are on. With my 2013 Terrain, while the 'running lights' are on all the time, when I turn the wipers on the full headlight/taillight system comes on. Of course, it rains so infrequently in California that I suspect a lot of people, with cars that do not offer this automatic feature, are technically breaking state law when it actually does perchance rain.

John R. Baker, P.E. (ret)
EX-Product 'Evangelist'
Irvine, CA
Siemens PLM:
UG/NX Museum:

The secret of life is not finding someone to live with
It's finding someone you can't live without

RE: Tesla Autopilot, fatal crash into side of truck

Quote (VE1BLL)

Far too many cars have a 'User Interface' design / regulatory flaw in that the unwary will be driving on a dark and stormy night, with their DRLs dimly illuminating their forward path, but the rear of their car blacked out like WWII London.

That is my pet driving peeve at the moment. These poor idiots drive around thinking their "lights" are on while the back is dark. You can flash them, yell at them that their tail lights are off, and nothing works, because their dim little minds see some headlight.

I stopped next to a car and tried to explain it to two guys and they never got it.

Keith Cress
kcress - http://www.flaminsystems.com

RE: Tesla Autopilot, fatal crash into side of truck

(OP)
KC "...tried to explain..."

The internationally-accepted hand gesture sequence is to first represent the two ends of their vehicle, by holding up two fingers in a repeated upward thrusting movement; followed by an indication that only one end of their vehicle is actually illuminated, by waving the same hand with just one finger extended.


RE: Tesla Autopilot, fatal crash into side of truck

Quote (Norm)

I suspect that most car owners don't look at the owner manual that came with their car for much more than finding out how to operate the infotainment system, whatever other comfort & convenience features, and maybe the HVAC.

My lovely wife doesn't read _any_ of the owner manuals for _anything_.

Instead, she demands that I explain the operation of the whatever it is, even if I have never operated it, even if I have never seen it, while she is operating it, and has already forced it into undocumented modes of operation that no one understands or can reproduce.

I have become her PDA and her IT guy, and have failed to meet her expectations, yet again.

Mike Halloran
Pembroke Pines, FL, USA

RE: Tesla Autopilot, fatal crash into side of truck

On my truck, and many other vehicles, the 'daytime running lights' that stay on, even at night, are quite dim, and at least on my truck - are simply the amber lights at the corners, also used as turn signals. It's a 2003 Silverado.

To share a random eBay pic, here's a similar switch, below. There are three settings. Auto, off, on.



I wasn't aware that there were vehicles where it was simply IMPOSSIBLE to turn off headlights when it was dark. That seems really crappy, to me. There are times I want the truck turned on, but don't want the lights on, yet. I always thought it polite to turn off the lights when pulling into a parking spot if your lights were going to be shining through a window into someone's face, or if you're wanting to NOT blind people as you pull up toward them in a parking lot or other off-road place.

No wonder some people are being acclimated to the idea that their cars should be responsible for them, instead of the other way around!

RE: Tesla Autopilot, fatal crash into side of truck

When it comes time to drive through the Christmas lights in the area (paid presentation, slow drive through a park), the people with cars that do not allow the headlights to be turned off get nasty sneers aimed in their direction.

Dan - Owner
http://www.Hi-TecDesigns.com

RE: Tesla Autopilot, fatal crash into side of truck

Oh yes, and in fast food drive-thru as well... it's always nice to not flood the low-sitting car in front of you, if you drive a full sized truck. Even just the reflection in the mirrors can be annoying to people.

Hence my surprise that some vehicles don't allow it /at all/. I wonder how hard it is to 'fix' that, if my next vehicle comes that way. Not enough to sway the purchasing decision, but it would annoy me.

RE: Tesla Autopilot, fatal crash into side of truck

I think you will find, at least on newer GM products, that the 'Auto/Off/On' switch only controls whether the full headlight/taillight mode comes on automatically at dusk (and goes off at dawn) or is controlled manually. The status of the 'daytime running lights' is not affected by the setting of this switch.

John R. Baker, P.E. (ret)
EX-Product 'Evangelist'
Irvine, CA
Siemens PLM:
UG/NX Museum:

The secret of life is not finding someone to live with
It's finding someone you can't live without

RE: Tesla Autopilot, fatal crash into side of truck

I'm surprised that it took them this long to weigh in on this issue:

Consumer Reports calls on Tesla to disable Autopilot function

http://www.marketwatch.com/story/consumer-reports-...

John R. Baker, P.E. (ret)
EX-Product 'Evangelist'
Irvine, CA
Siemens PLM:
UG/NX Museum:

The secret of life is not finding someone to live with
It's finding someone you can't live without

RE: Tesla Autopilot, fatal crash into side of truck

One thing that makes it harder to know what's going on with newer headlights: in older cars, driving at night required you to at least actively turn on the lights, because the dashboard lights were linked to the running lights. Newer cars with plasma or LED displays are always lit, so there's no indication that the headlights are off.

TTFN
I can do absolutely anything. I'm an expert!
FAQ731-376: Eng-Tips.com Forum Policies forum1529: Translation Assistance for Engineers

RE: Tesla Autopilot, fatal crash into side of truck

Since this has devolved into a discussion about headlights, which isn't really analogous to a semi-guided missile hurtling down the highway, I thought I'd weigh in. My '02 GM (I'm a miser) automatically turns on the headlights when it gets dark. I can switch the main lamps off, but the corner/parking/tail lights (yellows and reds) stay on. Under no circumstances can I make the car completely dark at night, nor can I ever turn the tail lamps off. At all other times I have daytime running lights (unless in park during the day, then I'm dark), or the headlamps if I actively turn them on during the day. If I drive into a tunnel or sit under an overpass for more than a few seconds, the headlights come on. Seems like a pretty straightforward, simple setup.

Until an active cruise/autopilot/autonomous car is better than a human, yes, it should require the driver to still be engaged with the system. It is an assist for a driver (like passive cruise control), not a replacement. I'm all for hand/eye monitoring/tracking to keep the systems engaged.

And yes, if the system detects the driver has disengaged, it should pull to the shoulder, stop, and put the hazards on. And absent a safe pull off area, it should proceed to the next available safe location. If it's good enough to drive the car without hands on the wheel, it should be good enough to accomplish these things. And if it's not good enough for that, just have it disengage as well and roll out of the gas.
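That fallback behaviour amounts to a small decision policy. Here's a toy sketch of the priority order described above; the state names, function name, and inputs are all hypothetical, not anything Tesla actually implements:

```python
from enum import Enum, auto

class Fallback(Enum):
    CONTINUE = auto()
    STOP_ON_SHOULDER = auto()         # stop, hazards on
    PROCEED_TO_SAFE_SPOT = auto()     # no shoulder: drive to next safe area
    DISENGAGE_AND_COAST = auto()      # system not capable: hand back, roll out of the gas

def fallback_action(driver_engaged, shoulder_available, can_self_drive):
    """Illustrative fallback policy for a disengaged driver:
    prefer the shoulder; with no shoulder, continue to the next safe
    location if the system is capable; otherwise disengage and coast."""
    if driver_engaged:
        return Fallback.CONTINUE
    if shoulder_available:
        return Fallback.STOP_ON_SHOULDER
    if can_self_drive:
        return Fallback.PROCEED_TO_SAFE_SPOT
    return Fallback.DISENGAGE_AND_COAST
```

The ordering is the whole design choice: never stop dead in a live lane if a better option exists, and never keep driving autonomously if the system has already admitted it can't.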

RE: Tesla Autopilot, fatal crash into side of truck

That pic from E-bay is interesting.
My 01 Sierra has a similar switch. There is no off position. In Canada, the positions are Automatic, ParkLights, and Headlights on Override.
In my cars with LED indicators, the illumination level of the LEDs drops when the headlights automatically turn on.
If I am unsure whether the headlights are on, I flick the dimmer switch. Daytime running lights do not have a high beam indicator.

Bill
--------------------
"Why not the best?"
Jimmy Carter

RE: Tesla Autopilot, fatal crash into side of truck

On my 2013 GMC Terrain, the navigation package is displayed on an LED panel which, when the actual headlights come on, irrespective as to how that was triggered, automatically goes into 'nighttime' mode, where the map's background goes from light to dark and the brightness level drops significantly, so there's no question that you're now in full headlight/taillight mode.

John R. Baker, P.E. (ret)
EX-Product 'Evangelist'
Irvine, CA
Siemens PLM:
UG/NX Museum:

The secret of life is not finding someone to live with
It's finding someone you can't live without

RE: Tesla Autopilot, fatal crash into side of truck

Clearly a problem with the seat to steering wheel interface.

RE: Tesla Autopilot, fatal crash into side of truck

".......irrespective as to how that was triggered............."

RE: Tesla Autopilot, fatal crash into side of truck

I was alluding to the fact that the headlights can be turned-on by either the 'dusk' event or by manually turning the headlight switch to 'ON'.

John R. Baker, P.E. (ret)
EX-Product 'Evangelist'
Irvine, CA
Siemens PLM:
UG/NX Museum:

The secret of life is not finding someone to live with
It's finding someone you can't live without

RE: Tesla Autopilot, fatal crash into side of truck

This may be a dumb question, as I only drive older trucks because of the annoying new features, but if your headlights come on when you turn on the windshield wipers, do they also come on when you wash the windshield?

Also, I don't believe driver's tests ask whether people can identify hand signals like a bike rider might offer.

Also, as far as self-driving cars go, be aware that it is possible for a car to slow down without using the brakes. I have shown that to many tailgaters, as they believe my brake lights will warn them (but I only do this when tapping down the cruise control doesn't annoy them enough).

Brian, I believe you misunderstood: my reference was to the railroads' dead-man detection system, not to the PTC they are now installing.

RE: Tesla Autopilot, fatal crash into side of truck

No, the headlights don't turn on because it's raining, but rather because YOU turned on the wipers. In other words, there isn't a 'rain' sensor controlling this (although I've heard that they're working on this sort of thing).

John R. Baker, P.E. (ret)
EX-Product 'Evangelist'
Irvine, CA
Siemens PLM:
UG/NX Museum:

The secret of life is not finding someone to live with
It's finding someone you can't live without

RE: Tesla Autopilot, fatal crash into side of truck

"Also I don't believe the drivers tests ask if people can identify hand signs like from a bike rider might offer."

Certainly, the bikers I've seen recently haven't learned, or have forgotten, the correct hand signal for a left turn maneuver. They're all doing this sort of left, downward wave thing, usually done concurrently with the merge into the carpool lane from the carpool lane boundary. It's basically a BS thing to prevent the CHP from citing them for an incorrect lane change, even though they're already violating the between-car threading rules.

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! https://www.youtube.com/watch?v=BKorP55Aqvg
FAQ731-376: Eng-Tips.com Forum Policies forum1529: Translation Assistance for Engineers

RE: Tesla Autopilot, fatal crash into side of truck

I thought the between-cars thing was legal in California (first time I ever saw it was there).

RE: Tesla Autopilot, fatal crash into side of truck

Quote:

I thought the between-cars thing was legal in California (first time I ever saw it was there).

It is. But I believe the rule is that they must not go more than 5 MPH faster than the cars they're passing.

Two weeks ago in Oakland I watched in fascination as a guy on a Harley swerved thru multiple lanes going around cars, constantly changing which lanes he was splitting at about 35MPH, while we were stuck at about 2MPH. I'm sure he's probably dead now.

Keith Cress
kcress - http://www.flaminsystems.com

RE: Tesla Autopilot, fatal crash into side of truck

My last bike accident was a somersault over the bonnet of a car being let across two lanes of stationary traffic I was filtering through. Not technically my fault, but I could have prevented it with better behaviour. It wrecked my bike and could so easily have wrecked me. It changed my attitude. Whenever I see a bike fast filtering past me these days, my heart sinks and I feel myself internally chanting "organ donor, organ donor".

Steve

RE: Tesla Autopilot, fatal crash into side of truck

SomptingGuy; Glad you lived to be smarter another day. I seem to know a lot of people who have chronic disabilities caused by car wrecks they've been in. It's very sad really.


Quote (V1EBLL)

The internationally-accepted hand gesture sequence

HA! I'm in California.. Hand jesters gesture from cars are only used to suspend gunfire for reloading.

Keith Cress
kcress - http://www.flaminsystems.com

RE: Tesla Autopilot, fatal crash into side of truck

In California, a hand jester is something different from a hand gesture. Do you want to rephrase that?

RE: Tesla Autopilot, fatal crash into side of truck

That came to mind, Steve, as did somebody opening their door to dump out cold coffee cup, etc.

RE: Tesla Autopilot, fatal crash into side of truck

Hand jester:

RE: Tesla Autopilot, fatal crash into side of truck

I haven't gone through everything, but I'm curious to find out if the trailer had conspicuity tape on it? I thought all trailers (and rail cars) have to have reflective stripe elements along the side for visibility, but I admit that I see quite a few without them, so I don't know if this is true or even more of a regional thing.

RE: Tesla Autopilot, fatal crash into side of truck

I think the tape is, at best, an acceptable alternative to reflex reflectors (when the tape itself and its installation meet reflector requirements).

Not sure if AP would have seen the tape any better than the reflectors or the huge box that they'd both be attached to . . .


Norm

RE: Tesla Autopilot, fatal crash into side of truck

Now someone has introduced a self driving bus. So if you are at the bus stop and the bus just drives by, it really might not be personal.

RE: Tesla Autopilot, fatal crash into side of truck

Reflector tape works great at night, against headlights. In the daytime, not so much. But that does suggest Tesla could trivially improve its performance with an IR cut-filtered camera operating in the NIR plus an illuminator (or possibly just the daytime running lights), which would pick up the reflector tape and/or see such objects better, since the horizon is more likely to be dark in the NIR, particularly above 850 nm.
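The key property being exploited is that retroreflective tape returns far more of a co-located illuminator's light than any diffuse surface does. A minimal sketch of the idea, comparing an illuminator-on frame against an illuminator-off frame and thresholding the gain; the function name, frame setup, and the 4x threshold are all assumed, illustrative values:

```python
import numpy as np

def detect_retroreflectors(nir_frame, ambient_frame, min_gain=4.0):
    """Flag candidate retroreflector pixels by the brightness gain between
    an NIR frame taken with the illuminator on and one with it off.
    Diffuse surfaces brighten only modestly; retroreflective tape aimed
    back at a co-located camera brightens dramatically. Illustrative only."""
    eps = 1.0  # avoid divide-by-zero in fully dark pixels
    gain = (nir_frame.astype(float) + eps) / (ambient_frame.astype(float) + eps)
    return gain >= min_gain
```

In practice you'd also need synchronization between illuminator and shutter, and rejection of specular glints, but the ratio test is the core of it.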

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! https://www.youtube.com/watch?v=BKorP55Aqvg
FAQ731-376: Eng-Tips.com Forum Policies forum1529: Translation Assistance for Engineers

RE: Tesla Autopilot, fatal crash into side of truck

A collision avoidance system shouldn't rely on characteristics of the object about to be hit. It shouldn't rely on the object about to be hit having reflectors, reflective tape, or being painted white, black, or pink with purple dots.

RE: Tesla Autopilot, fatal crash into side of truck

That's the ideal case, and you could probably build something that could do that, but it would probably cost more than the car itself. We built an obstacle avoidance system at a previous company, and our best estimate was about $60k, and that was only one of the 3 or 4 modalities required.

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! https://www.youtube.com/watch?v=BKorP55Aqvg
FAQ731-376: Eng-Tips.com Forum Policies forum1529: Translation Assistance for Engineers

RE: Tesla Autopilot, fatal crash into side of truck

(OP)
When I taught my eldest son to drive, I taught him to be "looking for empty road" (as opposed to a long list of possible obstacles).

I explained to him that "looking for other cars" (a common algorithm) might lead him to fail to notice bicycles, motorcycles, pedestrians or an escaped hippopotamus.

"Looking for empty road" is the more general (i.e. not dangerously incorrect) solution.

It also correctly covers off the situation where you can't see the empty road because of the six-foot tall snow bank that blocks the view.

I wonder what driving algorithm concept Tesla used?
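The "looking for empty road" idea maps naturally onto free-space (occupancy) reasoning rather than object-class detection: instead of asking "is there a car/bike/hippo ahead?", ask "is every cell of my intended path positively confirmed empty?" A toy grid sketch of that distinction; the function name and grid representation are my own illustration:

```python
def path_is_clear(occupancy, path_cells):
    """occupancy maps (row, col) -> 'free', 'occupied', or 'unknown'.
    The path is drivable only if every cell is positively confirmed free.
    'unknown' (e.g. hidden behind a six-foot snow bank) counts as NOT
    clear, which is exactly the conservative behavior argued for above;
    an object-list approach would miss anything not in its list."""
    return all(occupancy.get(cell) == "free" for cell in path_cells)
```

Note that a cell with no data at all also fails the test, so "can't see the road" and "road is occupied" are both handled by the same rule.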

RE: Tesla Autopilot, fatal crash into side of truck

"I wonder what driving algorithm concept Tesla used?" And will they be modified by what we learn from Pokemon Go?

RE: Tesla Autopilot, fatal crash into side of truck

I can't help but wonder whether this is fall-out from the recent autopilot failure.

http://www.marketwatch.com/story/mobileye-shares-p...

While I've never had any 'professional' contact with any Tesla people, I have visited SpaceX several times as part of my former job (now retired) and knew several people who work there. One of the things that I learned about Elon Musk, not directly but from talking to SpaceX people, is that he's always willing to do it himself. That is, when he has a problem with a supplier, he's just as likely to bring the work 'in-house' as to go and find another company to do business with. He's apparently done this several times with respect to critical components for his Falcon rockets, particularly engine and fuel system parts, at least that's the impression I've gotten. Perhaps this is something similar taking place.

John R. Baker, P.E. (ret)
EX-Product 'Evangelist'
Irvine, CA
Siemens PLM:
UG/NX Museum:

The secret of life is not finding someone to live with
It's finding someone you can't live without

RE: Tesla Autopilot, fatal crash into side of truck

(OP)
From recode link: "But to be honest, the Mobileye system is so easy to reproduce; you don’t need the best talent. I did it in a couple of months. Our software can already do more than Mobileye’s."

Clearly an industry in desperate need of some adult supervision.


RE: Tesla Autopilot, fatal crash into side of truck

Very dumb of the driver to /replace/ 911 emergency services with the /hope/ that his car went where he wanted.

At the least, he should have dialed 911 and kept them apprised of his location and intent to proceed to the hospital so that emergency services were there as a backup, or to keep him from careening out of control in the event of a car control failure.

Glad it worked out for him, though. That's a nice story.

RE: Tesla Autopilot, fatal crash into side of truck

I'm sure this is going on somewhere in the Tesla hierarchy.


CODE

WIN | LOSE
 1      1 

Keith Cress
kcress - http://www.flaminsystems.com

RE: Tesla Autopilot, fatal crash into side of truck

Bigger problem is that the Tesla could have just motored down to the hospital, found its own parking space, and sat there waiting for the guy to get out...forever. I guess they need to build "detect dead driver" functionality into these things.

RE: Tesla Autopilot, fatal crash into side of truck

(OP)
One could try to argue that it would be logical to equally weight the "lives saved" and "those killed" by a given technology or process.

But that's not how it works.

The BBC podcast series (that I linked to in another thread) gave the example of EMS helicopters, which handle 400,000 cases a year (USA, if I recall correctly) and have undoubtedly saved thousands of lives per year. But when crashes of EMS helicopters killed a couple dozen people in one year, the NTSB still began a program of corrective action.

It's not good enough that the balance sheet is in your favour. The values we silently apply are not symmetrical.

The asymmetry even shows up financially. E.g., the EMS helicopter performs a mission that saves a life, and they invoice somebody for, say, $2000. A job well done. But if they mess up and crash, killing somebody, they might be liable for several million dollars. The asymmetry is at least 1000:1.

Another example is the huge Takata airbag recall. Those airbags have undoubtedly saved tens of thousands of lives in their expected operation. But some much smaller number of people (dozens?) have allegedly been killed by fragments, so a massive recall is underway.

There are endless similar examples.

One won't get very far arguing about 'greater good' and balance sheets. The regulators and authorities enforcing our value system generally won't allow it.

There might be the occasional historical exception that's been grandfathered in by default.

All this philosophy probably aligns with the 'First, do no harm' mantra.

RE: Tesla Autopilot, fatal crash into side of truck

You need to update your helicopter flight pricing. I believe you will find flights range between $30K and $80K. There is currently an uproar over family economic disasters, with people losing their homes etc. due to the exorbitant prices charged when they may not have even needed the flight. So, you're looking at only about 50 victims to cover the $3M heli.

But I totally get your valid point.

Keith Cress
kcress - http://www.flaminsystems.com

RE: Tesla Autopilot, fatal crash into side of truck

(OP)
Yep. It was a guess, as indicated by "say". But you're correct that it's a bit low.

News: "...total 750 EHS LifeFlight missions last year. Nova Scotia currently spends $3.6 million per year for EHS LifeFlight helicopter service." Pretty good value. It was in the news because they're tightening up the airworthiness rules, and a certain old helicopter is no longer permitted to land on the hospital roofs. Costs will be rising.

So our local system is about Cdn $4,800 per mission, on average. I'm not sure if that's a full accounting. As far as I know, the patient only pays a relatively small fee.

YMMV. US pricing on anything medical is notorious. EMS helicopter service, or a $750 box of tissue at the bedside.

RE: Tesla Autopilot, fatal crash into side of truck

Quote (VE1BLL)

So our local system is about Cdn $4,800 per mission, on average. I'm not sure if that's a full accounting. As far as I know, the patient only pays a relatively small fee.

Considering that your average medivac heli is going to cost somewhere in the realm of $1300-$1600 per hour to operate, I'd say that $4,800 per mission is a reasonably fair estimate of what the owner actually spends.

RE: Tesla Autopilot, fatal crash into side of truck

Just playing devil's advocate... does the $1300-$1600 value include the cost of having trained paramedics ready to go, or does this just represent the helicopter maintenance and operating costs?

(I live in Canada, and am counting my lucky stars based on this conversation...)

RE: Tesla Autopilot, fatal crash into side of truck

(OP)
I don't see how the regulators [USA] would fail to instruct Tesla to remotely disable the system.

RE: Tesla Autopilot, fatal crash into side of truck

Quote (marty007)

Just playing devil's advocate... does the $1300-$1600 value include the cost of having trained paramedics ready to go, or does this just represent the helicopter maintenance and operating costs?

(I live in Canada, and am counting my lucky stars based on this conversation...)

Your average medivac chopper is something in the mid-size range, like a Bell 222 or Agusta A-109.
The Bell 206 is also very common.

Helis in that range, you're looking at $900-$1000 per hour, wet, conservatively. Tack on another $100-$150 per hour for a high-hours pilot, plus anywhere from $100 an hour to $400 an hour for the medical crew (paramedic/paramedic on the low end, nurse/physician crew on the high end).

I'd guess the average mission is in the 2-3 hour range, which leaves a little, but not exorbitant, profit for the operator. Certainly nothing like charging $100,000 for a service that requires $4,000 of outlay.
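Putting the figures from this post into a trivial cost model makes the comparison easy; the function name is mine, and the default rates are just midpoints of the ranges quoted above (~$900-$1,000/hr wet for the heli, $100-$150/hr pilot, $100-$400/hr medical crew):

```python
def mission_cost(hours, heli_rate=950.0, pilot_rate=125.0, crew_rate=250.0):
    """Rough per-mission operating cost: flight hours times the sum of the
    hourly rates for airframe (wet), pilot, and medical crew. Defaults are
    midpoints of the ranges quoted in the thread, purely illustrative."""
    return hours * (heli_rate + pilot_rate + crew_rate)
```

At the midpoint rates a 3-hour mission comes out just under $4k, in the same ballpark as the ~Cdn $4,800 per-mission figure quoted earlier, and nowhere near the $30K-$80K billings being discussed.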

RE: Tesla Autopilot, fatal crash into side of truck

So at roughly $1,500/hr all in, at 3 hours, that's <$5k... but they're charging $40k. Yeah, I'd call that a "little" profit

Dan - Owner
http://www.Hi-TecDesigns.com

RE: Tesla Autopilot, fatal crash into side of truck

What about liability insurance for the medi-vac operator...

John R. Baker, P.E. (ret)
EX-Product 'Evangelist'
Irvine, CA
Siemens PLM:
UG/NX Museum:

The secret of life is not finding someone to live with
It's finding someone you can't live without

RE: Tesla Autopilot, fatal crash into side of truck

Quote (MacGyverS2000)

So at roughly $1,500/hr all in, at 3 hours, that's <$5k... but they're charging $40k. Yeah, I'd call that a "little" profit

I was referencing VE1BLL's quote that the average mission for Canada is a $4800 expense. Pretty much right on target.

Quote (JohnRBaker)

What about liability insurance for the medi-vac operator...

Cheaper than you'd think as a per-hour cost. Remember that a medivac operator is operating thousands of hours per year.

RE: Tesla Autopilot, fatal crash into side of truck

Quote (jgKRI)

Your average medivac chopper is something in the mid-size range, like a Bell 222 or Augusta A-109.
The Bell 206 is also very common.

Helis in that range, you're looking at $900-$1000 per hour, wet, conservatively. Tack on another $100-$150 per hour for a high-hours pilot, plus anywhere from $100 an hour to $400 an hour for the medical crew (paramedic/paramedic on the low end, nurse/physician crew on the high end).

I'd guess the average mission is in the 2-3 hour range, which leaves a little, but not exorbitant, profit for the operator. Certainly nothing like charging $100,000 for a service that requires $4,000 of outlay.

$4500 for a 3-hour mission vs. $40,000 bills. Again, I'm glad I'm in Canada...

RE: Tesla Autopilot, fatal crash into side of truck

One of the questions I had was "What did that truck actually look like", and I never could find any pictures of it when searching previously. I did find some links to the Preliminary NTSB report, which includes one photograph of the truck:

http://www.ntsb.gov/investigations/AccidentReports...

Can't see it there, but I remember long ago my employer had a trackhoe that was parked for the night, well off the road out by the fenceline. A driver (maybe a drunk driver, I forget) ran off the road and into that trackhoe. Reports from the scene said there was still hair and blood on the counterweight when the crew got there the following morning.

RE: Tesla Autopilot, fatal crash into side of truck

Thanks for that. There's something fundamentally WRONG with a driver's assistant that misses an obstacle that gigantic. The supposed argument that the system thought it was an overhead sign is absurd; the system was incorrectly designed if that was actually the case.

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! https://www.youtube.com/watch?v=BKorP55Aqvg
FAQ731-376: Eng-Tips.com Forum Policies forum1529: Translation Assistance for Engineers

RE: Tesla Autopilot, fatal crash into side of truck

It's interesting that the China crash article seems to focus on how Tesla feels the driver was at fault because his hands were not on the wheel. If Tesla can detect that hands are on the wheel, and they require hands on the wheel, then why doesn't the system warn and disconnect when hands are not on the wheel?

It's ridiculous that a system deemed usable by the general public can't avoid a disabled car that was in its path.

Both of these point out the fundamental flaw in expecting a "learning" system that makes a best guess based on its database of objects to work reliably all the time. Tesla thought they were close enough, yet the system still didn't correctly detect the side of a truck trailer or a car partially in the lane, which are rather simple cases compared to what else is out there.

RE: Tesla Autopilot, fatal crash into side of truck

They have a name for a system that requires you to keep your hands on the wheel, your attention on the road, etc. They call it "driving"!

I keep seeing these conditions placed on the "auto-pilot" system for "proper" use... but those conditions keep taking us right back to having no auto-pilot system at all. Collision avoidance system, maybe? Okay, then call it that and don't try to control the car 100% of the time, only that 0.00001% of the time when a problem arises. Of course, that raises a completely different set of concerns... what happens when the car suddenly tries to wrest control away from the driver and the driver knows best?

Dan - Owner
http://www.Hi-TecDesigns.com

RE: Tesla Autopilot, fatal crash into side of truck

Some things simply take time to get drivers used to doing a different thing. When ABS first came out, there were avoidable accidents because drivers attempted to override the ABS control of the brakes.

The bottom line that these accidents reveal is that we are still a ways from a true auto-pilot system. The DARPA Grand Challenge required way more sensors than any of these cars have, and you really need a suite of sensors to cover all the possible and probable contingencies, since no single sensor can do all the jobs all the time.

What might really be an issue in the future is what happens if EVERY car is equipped with radars, sonars, and ladars; avoidance of interference from other cars' emissions will be an interesting problem to solve. Just imagine if you're surrounded by cars pinging you and each other while your poor sensors are doing the same and trying to sort out all the resultant clutter.

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! https://www.youtube.com/watch?v=BKorP55Aqvg
FAQ731-376: Eng-Tips.com Forum Policies forum1529: Translation Assistance for Engineers

RE: Tesla Autopilot, fatal crash into side of truck

(OP)
Apparently, one is supposed to drive around like this:



As far as I know, that's an official Tesla marketing photo.

RE: Tesla Autopilot, fatal crash into side of truck

(OP)

Quote (TTFN winky smile)

...you really need a suite of sensors to cover all the possible and probable contingencies...

Microphones to hear the emergency vehicle sirens, or (ultimately) the terrified passengers screaming.

Smoke detectors, to wake up the dozing passengers, in case the car catches fire.

They're already starting to realize that Tesla Autopilot, retroactively renamed Autopilot 1.0, needs additional hardware.

http://bgr.com/2016/08/18/tesla-autopilot-2-0-feat...

Next, they'll sadly discover that the CPU isn't powerful enough.

When they finally get a system that's actually acceptable, the ratio of the solution complexity (acceptable/initial) will be a moderately large number. Like dozens-to-one.

RE: Tesla Autopilot, fatal crash into side of truck

So why would Tesla be driving in that lane, and not the lane to the right? I've never seen in the drivers manual where it describes why you would want to drive in the middle left lane.

Also, I can see a case where you are trying to pull over because you are being pulled over, and the car takes off, leaving you involved in a high speed chase away from the police.

Actually a CO2 detector in a car is a good idea.

And the statement "since no single sensor can do all the jobs all the time" sort of implies that no human can do the job.

But if a self driving car is ever perfect, then we won't need seat belts, air bags, and car crumple zones.

RE: Tesla Autopilot, fatal crash into side of truck

"When they finally get a system that's actually acceptable, the ratio of the solution complexity (acceptable/initial) will be a moderately large number. "

as was the case with the Grand Challenge competitors.

" sort of implies that no human can do the job."

and it can't [no lights, rain, fog, sleet], but it's way better than most sensors because the brain makes up for missing data by inferring it. There's no way a human driver would have failed to notice that truck. The biggest issue with the human isn't the sensing, per se, it's the boredom and alertness. Computers don't get bored or lose alertness, yet.

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! https://www.youtube.com/watch?v=BKorP55Aqvg
FAQ731-376: Eng-Tips.com Forum Policies forum1529: Translation Assistance for Engineers

RE: Tesla Autopilot, fatal crash into side of truck

Quote (cranky108)


So why would Tesla be driving in that lane, and not the lane to the right?

On a highway like that, that would be the lane that I would consistently drive in.

John R. Baker, P.E. (ret)
EX-Product 'Evangelist'
Irvine, CA
Siemens PLM:
UG/NX Museum:

The secret of life is not finding someone to live with
It's finding someone you can't live without

RE: Tesla Autopilot, fatal crash into side of truck

As someone who usually drives the speed limit, I can tell you that driving in the lanes on the left when there is little traffic can be a concern. There always seem to be people who want to drive much faster than the speed limit. So I would stay in the next lane over.

I would assume the Tesla would follow the speed limit (which would be a plus if every car did that).

So when is the driving theory of the computer going to be hacked so people can go a little faster? Sort of like changing the chip on your car to make it accelerate faster (who cares what it was intended for).

RE: Tesla Autopilot, fatal crash into side of truck

If you look at the picture, the Tesla WAS IN THE NEXT LANE OVER from the so-called 'high-speed' lane. I tend to drive the speed limit and so when there are three or more lanes, I drive in the one NEXT to the 'high-speed' lane, as the Tesla was doing in the posted photo.

John R. Baker, P.E. (ret)
EX-Product 'Evangelist'
Irvine, CA
Siemens PLM:
UG/NX Museum:

The secret of life is not finding someone to live with
It's finding someone you can't live without

RE: Tesla Autopilot, fatal crash into side of truck

(OP)
"...if a self driving car is ever perfect, then we won't need seat belts, air bags, and car crumple zones."

What about speed limits?

If self-driving cars are going to be 'perfect', then they can drive as fast as they like.

Right?

RE: Tesla Autopilot, fatal crash into side of truck

You'll still need crumple zones, seat belts, and air bags with a perfect self-driving system. There are road disasters that develop at such a great rate, e.g. a truck's wheel disintegrates or a train derails, that they leave even an instantly reacting, pre-prepared (expert precautionary) computer unable to fully prevent serious damage, simply because the mechanisms of cars and the available road space limit the possible responses: maximum braking application, maximum turning speed...

http://www.eng-tips.com/threadminder.cfm?pid=1529
Use translation assistance for Engineers forum

Note the rules include No Student posting

RE: Tesla Autopilot, fatal crash into side of truck

Quote (cranky)

So why would Tesla be driving in that lane, and not the lane to the right? I've never seen in the drivers manual where it describes why you would want to drive in the middle left lane.

I doubt you'll find it in any driver training manual, but one lane over makes it easier for traffic merging onto the highway - this usually being on the right - to do so without requiring either that traffic or you to do anything special to accommodate each other. The less you have to do, the easier it is on the people following to not have to do anything special to fit what you did. Basically it avoids having both the merging driver and the driver already on the highway trying to force a "me first/you're going to wait" mentality on the other, or both doing an "after you, Alphonse" dance to the disadvantage of everybody in the vicinity.

The effects of highway crown are usually greater in the extreme right and extreme left lanes. That's where the puddles will be deeper when it rains, and these will be the lanes that sacrifice some width to snow clearing operations.

On a long highway stretch with no interchanges and little or no traffic you might as well drive in the right lane, subject to any consequences of the weather.


Norm
(edited to eliminate having to scroll horizontally)

RE: Tesla Autopilot, fatal crash into side of truck

Some states, but NOT California, reserve the far left lane of a multi-lane highway for passing only. And some states also require that anyone driving less than the posted speed limit do so in the far right lane.

John R. Baker, P.E. (ret)
EX-Product 'Evangelist'
Irvine, CA
Siemens PLM:
UG/NX Museum:

The secret of life is not finding someone to live with
It's finding someone you can't live without

RE: Tesla Autopilot, fatal crash into side of truck

So a self driving car should change the way it drives depending on the state or country it is in. Correct?

I notice that people from California tend to ignore the yield signs when coming into traffic, and often they find they run out of acceleration lane.

Not that I dislike people from California, it's just the largest state driving difference that I see very much of.

And what about construction zones, where the speed limit changes, but not on fixed basis?

RE: Tesla Autopilot, fatal crash into side of truck

When we first moved to California 36 years ago (in fact, we arrived 36 years ago TOMORROW, which also happened to be my birthday) one of the first things we were told was to not look at driving as needing to 'fight traffic'. If that's your attitude, you're going to lose as traffic is bigger than we are. While you may never get to where you love it, you will have to learn to respect it and try to live with it as best as you can. And one other thing you learn here is that the local gendarme's primary job, when it comes to dealing with drivers, is to assure the safe flow of traffic, which goes well beyond just enforcing the speed limits.

John R. Baker, P.E. (ret)
EX-Product 'Evangelist'
Irvine, CA
Siemens PLM:
UG/NX Museum:

The secret of life is not finding someone to live with
It's finding someone you can't live without

RE: Tesla Autopilot, fatal crash into side of truck

What makes the failure to yield thing worse is that more people now follow too closely, so there are no spaces between cars. I believe this to be a young people problem with not wanting to wait for anything.

They must really hate me because I slow down for tailgaters.

RE: Tesla Autopilot, fatal crash into side of truck

Quote:

They must really hate me because I slow down for tailgaters.
I hope you're not doing this in anything but the rightmost lane when traveling on a multi-lane highway. In many places it would otherwise be against the law.

RE: Tesla Autopilot, fatal crash into side of truck

(OP)
The correct approach for tailgaters is to take the missing '2-second rule' gap and add it to the front. You need to allow a 4-second gap, double the normal, in front of you to compensate for the tailgater.

If you live where you can't leave any gap, move! smile

Then, perhaps not quite so correctly, you can subtly oscillate your speed up and down by an imperceptible one or two mph (or kmh), with a repetition rate matching their tailgating response time - but carefully out of phase. They'll oscillate increasingly wildly and give up and go around. If you're subtle enough (e.g. do not look in mirror, use peripheral vision only), they won't even know what happened. An important detail in gun-toting states.
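The gap arithmetic above is easy to sketch. A minimal, hypothetical helper (the 60 mph speed is just an illustrative assumption, not from the thread):

```python
# Minimal sketch of the "2-second rule" gap arithmetic above: how far a
# car travels during a 2-second (normal) vs. 4-second (tailgater-
# compensating) following gap. Speeds are illustrative assumptions.

def gap_distance_ft(speed_mph: float, gap_seconds: float) -> float:
    """Distance in feet covered during the chosen following gap."""
    feet_per_second = speed_mph * 5280 / 3600  # convert mph to ft/s
    return feet_per_second * gap_seconds

normal = gap_distance_ft(60, 2)   # 176 ft at 60 mph
doubled = gap_distance_ft(60, 4)  # 352 ft, absorbing the tailgater's missing gap
```

Doubling the gap in front exactly restores the total spacing the tailgater gave up behind.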

RE: Tesla Autopilot, fatal crash into side of truck

ah yes, passive aggressive driving styles...

I try to be as little interference to other road users as possible, while bending the law as little as possible to safely do so. E.g., I will accelerate above my cruising speed in order to complete a passing manoeuver in the left lane more quickly, in order to make way for faster traffic behind me, if any.

"Schiefgehen wird, was schiefgehen kann" - das Murphygesetz

RE: Tesla Autopilot, fatal crash into side of truck

On a road with multiple lanes in the same direction the FIRST correct response for tailgaters, if you are in anything but the rightmost travel lane, is to move right. Let them by. If you need to speed up or slow down a little to expedite moving right, so be it.

After that, I like VE1BLL's suggestion smile but I've found that after moving right, the tailgaters are seldom an issue.

Rule of thumb: Hemi has this right as well; present as little interference to other road users as possible. If someone else wants to drive faster than you do, get out of their way (as long as it is safe to do so). If someone wants to tailgate, let them tailgate somebody else.

Will self-driving logic address this ... ? ? ?

RE: Tesla Autopilot, fatal crash into side of truck

The news this morning was touting self driving semis. I wonder what their sensors won't recognize.

RE: Tesla Autopilot, fatal crash into side of truck

Maybe they won't see deer on the road, or they might over react.

My normal commute includes a two lane road with passing lanes every few miles, then it turns into a divided 4 lane with several ill timed stop lights, and a speed change or two.

Seem to find tailgaters on the two lane part. And although parts allow passing, there is usually too much traffic.

At times several road kill deer can be seen along the road, though I rarely see a car or truck nearby (perhaps people keep driving).

Amazing that people seem to miss the signs that say "Keep right except to pass". So I would ask if the self driving cars or trucks can understand the signs any better? Or is it like the GPS that never gets updated.
What subroutine allows it to avoid road damage, or blown tire parts?

RE: Tesla Autopilot, fatal crash into side of truck

(OP)
"...self driving semis. I wonder what their sensors won't recognize?"

White Tesla cars.

RE: Tesla Autopilot, fatal crash into side of truck

Amazing that people seem to miss the signs that say "Keep right except to pass".

>> not surprising at all; the right lanes are slower, so why get stuck in them?

So I would ask if the self driving cars or trucks can understand the signs any better?

>> All that stuff has been digitized; which is why almost all the nav systems show speed limits, one-way, and number of turn lanes, etc.

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! https://www.youtube.com/watch?v=BKorP55Aqvg
FAQ731-376: Eng-Tips.com Forum Policies forum1529: Translation Assistance for Engineers

RE: Tesla Autopilot, fatal crash into side of truck

With the number of signs out there these days, I'm not surprised when people miss them. In some cases I think we've reached sensory overload (misc online google image search...):

RE: Tesla Autopilot, fatal crash into side of truck

Now you bring up getting used to ABS as if that system has been perfected in the last 20 or 25 years of widespread deployment. That is a complete piss-poor analogy to use when saying we just need to get used to it. We should get used to these self driving cars working "most" of the time without knowing exactly when they might just try to kill us.

The ABS systems on the vehicles I have owned have performed like complete and utter crap on certain snow covered roads. I have to drive according to how piss-poorly the system works instead of driving according to the road conditions, because the system takes away almost all the braking force. It can easily add 3 or 4 car lengths to the stopping distance in cases where I'm only going speeds in the 25-30 mph range. And this includes it disengaging at 15 mph when I begin to apply real braking force that actually stops the car.

RE: Tesla Autopilot, fatal crash into side of truck

^ I use the (mechanical cable-operated) hand-brake to intentionally lock the rear wheels in situations like that, and there are times when I wish I could do the same to the front. The ABS is powerless to overrule a mechanical cable going straight to the brake pads. Heaven help if your "parking brake" is a pushbutton that doesn't overrule the ABS.

RE: Tesla Autopilot, fatal crash into side of truck

I still remember my first ABS rental car. I am glad there was no traffic on the primary highway, as there was NO braking as I approached it at 60 MPH on the washboard dirt access road.

I just pulled the fuse for the ABS.

RE: Tesla Autopilot, fatal crash into side of truck

I've found that the ABS systems work quite well, at least they have for me. Now it is true that you have to know how to properly utilize them, and that's where the problem occurs. Eventually this problem will tend to mitigate itself as new drivers will be learning only on ABS-equipped cars, and therefore they will hopefully NOT BE TOLD TO PUMP THE BRAKES!!!

John R. Baker, P.E. (ret)
EX-Product 'Evangelist'
Irvine, CA
Siemens PLM:
UG/NX Museum:

The secret of life is not finding someone to live with
It's finding someone you can't live without

RE: Tesla Autopilot, fatal crash into side of truck

ABS works well enough most times it's needed. But there are a few situations where it can and does get it "wrong", to where an experienced driver who's paying attention can do better. A road contour that suddenly angles down and unloads the tires enough is another example. Like in the washboard road situation, the problem seems to be that ABS systems generally can't tell whether a tire is sliding because increasing brake torque has overpowered a reasonably constant amount of contact patch grip, or because the amount of contact patch grip itself has suddenly decreased.

Brian - too much rear brake (or a combination of wheel brake + engine compression braking) in a low-grip situation can be a risky proposition if, for example, you're going downhill at the time.

In my own experience, ABS has been most useful in preventing expensive tires from getting flatspotted under unusual conditions at a certain motorsports activity.


Norm

RE: Tesla Autopilot, fatal crash into side of truck

They still running demolition-derbies at county fairs?

John R. Baker, P.E. (ret)
EX-Product 'Evangelist'
Irvine, CA
Siemens PLM:
UG/NX Museum:

The secret of life is not finding someone to live with
It's finding someone you can't live without

RE: Tesla Autopilot, fatal crash into side of truck

No question too much rear brake causes stability issues. That's why I'd also like to be able to lock the fronts, overruling the ABS, but I can't.

Usually these situations aren't happening at 120 km/h anyhow; usually it's more like 20 km/h and the feeling of slowly but inexorably approaching and passing the stop sign while the ABS refuses to allow any braking at all even though a locked wheel at that speed isn't much of a stability issue. Also if this is happening while I'm approaching the back of another vehicle, I really don't care about stability at that point. I just want it to STOP. Doesn't matter if it's frontwards or sideways or backwards, I just don't want to slide into the other car!

And Mr Baker, I suspect by your published location in your signature that you wouldn't have a few months of this to deal with every year smile

RE: Tesla Autopilot, fatal crash into side of truck

I should add that the handbrake overrule of the ABS is also useful to overcome terminal understeer on similarly snow-covered surfaces. Turn steering wheel ... nothing happens ... momentary judicious handbrake application, and now the car turns. Of course, getting the car to point a different way, and actually changing its direction of motion, are two different things, but getting pointed the right way seems to be half the battle. I think what happens is that the wheel creates a bigger wedge of snow when it's a bit sideways than when it's straight, and the rear brake application transfers a bit of weight forward to help with steering up front.

Will self-driving be able to do this??

RE: Tesla Autopilot, fatal crash into side of truck

(OP)
My car's ABS allows me to lock up the wheels when (for example) slowly creeping down an icy hill covered in snow.

Used normally, it's the usual 30 Hz ABS chatter. But press the brake pedal just a bit firmer, and one may (by choice) lock up the wheels. I presume only under specific conditions, such as slow speeds.

This slow speed ABS override feature can be very useful in certain conditions. Studded winter tires sometimes need a few seconds to scrub through the raft of snow to reach the ice, where they can provide some braking. Then one releases the brakes enough to regain steering, before going off the edge of the road. Repeat for length of hill. Kinda like manual ABS, but the physics of snow on ice with studded tires needs a cycle time of about four or five seconds to penetrate the snow rafting under the tires, not the usual 30 Hz ABS.

Mercedes. They do seem to know what they're doing.

RE: Tesla Autopilot, fatal crash into side of truck

Typically even old school ABS does not intervene at all below a particular speed, of the order of 8 mph. That is quite an interesting thought on ice, since if you manage to lock all 4 wheels then the vehicle speed is zero, and so they'll stay locked. I'd guess that more modern systems model the apparent mu and adjust their strategy accordingly.
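That all-wheels-locked blind spot can be sketched. A simplified, hypothetical estimator (the function names and numbers here are illustrative, not any real ABS implementation): if the reference speed is derived only from wheel sensors, it collapses to zero when every wheel locks, so no slip is detectable.

```python
# Simplified, hypothetical sketch of why a wheel-speed-only ABS goes
# blind when all four wheels lock on ice: the reference (vehicle) speed
# estimate reads zero, so no slip is detectable and the wheels stay locked.

def reference_speed(wheel_speeds):
    """Naive ABS reference speed: assume the fastest wheel is not slipping."""
    return max(wheel_speeds)

def slip_ratio(ref_speed, wheel_speed):
    """Fraction by which a wheel lags the reference speed (0 = rolling freely)."""
    if ref_speed == 0:
        return 0.0  # car "looks" stopped; ABS sees nothing to correct
    return (ref_speed - wheel_speed) / ref_speed

moving = [20.0, 20.0, 19.0, 10.0]  # one wheel slipping: slip is detected
locked = [0.0, 0.0, 0.0, 0.0]      # all four locked on ice: slip reads zero
```

This is why a mu-modelling strategy, as speculated above, would need information beyond the wheel sensors alone.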

Cheers

Greg Locock


New here? Try reading these, they might help FAQ731-376: Eng-Tips.com Forum Policies http://eng-tips.com/market.cfm?

RE: Tesla Autopilot, fatal crash into side of truck

Just a question about self driving trucks: who puts the chains on the tires? It's just that at times chains are required.
Maybe that will increase the number of jobs for roadside assistance. Which, from my viewpoint, needs to be improved.
And maybe it will bring back full service filling stations. (Clean that lens, sir and/or ma'am.)

How do self driving cars handle tire blowouts?

Good point about snow tires, although I don't use studded snow tires myself. The all season tires just don't perform as well as snow tires.
Maybe there needs to be automotive drivers training on things like tires, braking, and how to fix many common problems, like a flat tire. Maybe also to not take fast curves with large things in the back of your truck.

RE: Tesla Autopilot, fatal crash into side of truck

Quote (BrianPetersen)


And Mr Baker, I suspect by your published location in your signature that you wouldn't have a few months of this to deal with every year

If you're referring to driving in the rain, then yes, we're only confronted with that situation a few months out of each year.

But that being said, it's amazing how it seems that when that first rainfall of the season hits, collectively, nearly everyone in Southern California has totally forgotten how to drive, period. The most accidents, many of them admittedly minor, seem to occur on those first few days of rain after a long period of rain-free weather. Granted, some of this can be attributed (and always is by the local media) to the fact that many of the roads get a covering of oil that doesn't mean much when the roads are dry but can become a real problem when the first rain starts to fall.

John R. Baker, P.E. (ret)
EX-Product 'Evangelist'
Irvine, CA
Siemens PLM:
UG/NX Museum:

The secret of life is not finding someone to live with
It's finding someone you can't live without

RE: Tesla Autopilot, fatal crash into side of truck

Rain?

Subtract quite a few degrees C.

RE: Tesla Autopilot, fatal crash into side of truck

If you meant 'ice' or 'snow', that was NEVER made clear in your post. And BTW, it snows in Southern California every winter, just that it keeps its distance, up in the mountains, so that you can go visit it on the weekends if you want winky smile

John R. Baker, P.E. (ret)
EX-Product 'Evangelist'
Irvine, CA
Siemens PLM:
UG/NX Museum:

The secret of life is not finding someone to live with
It's finding someone you can't live without

RE: Tesla Autopilot, fatal crash into side of truck

Interesting, but do they work on the trucks with trailers? The doubles and triples.

Local deliveries would not make as much sense as the long haul, where there is a shortage of drivers. And the long haul is where changing weather, road conditions, weigh stations, fuel stops, and self reliance are required. Granted, over 50% of long haul goes by rail, but much of it does not.

In fact UPS does put trucks on rail from CA to NY. But there are still routes between cities that don't have direct rail service (Kansas city to Denver comes to mind).


RE: Tesla Autopilot, fatal crash into side of truck

"we're only confronted with that situation a few months out of each year"

We wish winky smile Typically, it's only about 35 rainy days per year. The storms are so spaced out that it typically takes a couple of storms before everyone is acclimated to rainy driving and the oil sheen has washed out.

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! https://www.youtube.com/watch?v=BKorP55Aqvg
FAQ731-376: Eng-Tips.com Forum Policies forum1529: Translation Assistance for Engineers

RE: Tesla Autopilot, fatal crash into side of truck

Perhaps, but you have to admit that those "35 rainy days" are spread over "a few months" each year...

John R. Baker, P.E. (ret)
EX-Product 'Evangelist'
Irvine, CA
Siemens PLM:
UG/NX Museum:

The secret of life is not finding someone to live with
It's finding someone you can't live without

RE: Tesla Autopilot, fatal crash into side of truck

LOL, of course ABS works ~perfectly~ in sunny California. Too bad it fails when presented with other real-world driving challenges.

RE: Tesla Autopilot, fatal crash into side of truck

The maximum braking force is generated just before the tire starts to slide or skid. I see ABS as a second chance and a warning to ease off the brakes and apply just a little less force on the brake pedal.
I have verified this by experiment: coming up to the corner stop sign at the same speed and distance from the stop sign, applying the most braking that I can without triggering the ABS one time, and letting the ABS buzz away the next time. Yes, 3 or 4 times as much braking distance with ABS.
Use ABS as a warning and back off the d___ brake pedal.

Quote:

Interesting, but do they work on the trucks with trailers? The doubles and triples.

Well, a "Super B Tridem" will have 30 tires available for braking and only 8 tires available for pulling traction. The automatic chains are just used on the driving wheels.
Ever watched "Ice Road Truckers"? At times they are only running with chains on 2 or 4 out of 8 driving tires.

Quote:

They still running demolition-derbies at county fairs?

Out here we run "Combine Crunches" at the rodeos. Instead of cars we use old agricultural combine-harvesters.
While we seem to be losing more rights and privileges every year, out here it is still legal to operate a motor vehicle while under the influence of country music.

Bill
--------------------
"Why not the best?"
Jimmy Carter

RE: Tesla Autopilot, fatal crash into side of truck

Ever hear of the American Auto Duelist Association? Great testing for self driving cars.

RE: Tesla Autopilot, fatal crash into side of truck

Quote (waross)

The maximum braking force is generated just before the tire starts to slide or skid.

This is not actually the case. Maximum force between a tire and paved surface is achieved at a non-zero slip ratio, between 10%-20% slip depending on tire design, surface type, and other variables.

Quote (waross)

Yes, 3 or 4 time as much braking distance with ABS.

3 or 4 TIMES?

As in, if stopping under threshold braking took 50 feet, an ABS stop took 200 feet?

I don't believe you. IF that is the case, your vehicle has something very severely wrong with it.

The point of ABS is NOT to achieve the shortest possible braking distance- the point of ABS is to minimize braking distances for the vast overwhelming majority of drivers, who in an emergency situation will slam the brake pedal down with every muscle fiber they possess.
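The peaked force-versus-slip behavior described above can be sketched with a simplified "Magic Formula" style tire model. The coefficients below are illustrative placeholders, not values for any particular tire or surface:

```python
import math

# Simplified Pacejka-style curve for normalized longitudinal force vs.
# slip ratio. B, C, D are illustrative assumptions; real values depend
# on tire design, surface type, and other variables, as noted above.

def tire_force(slip, B=10.0, C=1.9, D=1.0):
    """Normalized braking force as a function of slip ratio (0..1)."""
    return D * math.sin(C * math.atan(B * slip))

# Sweep slip ratios from 0 (free rolling) to 1 (fully locked wheel).
slips = [i / 1000 for i in range(1001)]
peak_slip = max(slips, key=tire_force)
# With these coefficients the force peaks near 11% slip - inside the
# 10-20% band - while a fully locked wheel (slip = 1.0) generates
# noticeably less force than the peak.
```

The shape, not the exact numbers, is the point: peak grip occurs at a non-zero slip ratio, and force falls off again toward a locked wheel.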

RE: Tesla Autopilot, fatal crash into side of truck

jgKRI - In certain snow conditions, maximum braking force is generated with the wheels locked. And yes, I can achieve MANY times less braking distance compared to the way the ABS wants to do it. The main purpose of ABS is not to achieve minimum braking, it's to help maintain control of the vehicle while ramming down the brake pedal as hard as you can. The problem is when it decides to achieve this goal by almost not applying the brakes.

RE: Tesla Autopilot, fatal crash into side of truck

I can't swear to the actual % increase in stopping distance, but when I first got an SUV (Yukon) it scared me with the distance it required for a panic stop. Like Waross says, I had to relearn to "let off". While replacing the factory supplied tires helped a lot, it wasn't till I pulled the ABS fuse that I felt I could get a decent emergency stop.

I am now scared when I have an SUV full of kids tailgating, as I KNOW it cannot stop if needed.

RE: Tesla Autopilot, fatal crash into side of truck

Happens on dry surfaces, too. It feels like those old GM cars where the harder you pushed the brakes, the more they faded and the less they worked. You learned to ride them just at the fade point. ABS is the same!?

RE: Tesla Autopilot, fatal crash into side of truck

Well it is a fairly easy test to set up, so we can quit the hand waving, and I'd agree that some ABS systems are worse than others on snow and gravel. But I am a bit surprised to hear that people think they can beat ABS on dry hard surfaces, in normal traffic situations. Equally I doubt the ABS would beat a reasonably skilled driver who was primed for the braking event and not fatigued. I doubt it was banned from Formula 1 because it made the cars uncompetitive.

Cheers

Greg Locock


New here? Try reading these, they might help FAQ731-376: Eng-Tips.com Forum Policies http://eng-tips.com/market.cfm?

RE: Tesla Autopilot, fatal crash into side of truck

If WAROSS was referring to a gravel road or an icy road and his ABS/no ABS braking test (he didn't say) then I would have to agree. I've tried this myself and especially on ice but also on gravel the ABS causes longer braking distance. Quite a disturbing amount at times.

As Waross says, ease off the brake just below ABS threshold to get a good stop, if the road surface is anything but bare and dry pavement. It takes some practice on various roads and conditions before you get a feel for it for any given car or truck. I tend to do this "practice" when nobody else is in the car with me. However, I have made it a point to test the ABS on pavement during test-drives, with the dealership salesman beside me. I enjoy scaring those guys.

STF

RE: Tesla Autopilot, fatal crash into side of truck

And Uber is planning on using self-driving cars for taxis. It will have a driver, but will they really be paying attention?

RE: Tesla Autopilot, fatal crash into side of truck

Which "driver"?

Or do you mean both of them?


Norm

RE: Tesla Autopilot, fatal crash into side of truck

coloeng, I'd say not, see: GregLocock (Automotive), 3 Jul 16 09:26 & 12 Jul 16 18:45 for example.

Few things are more disengaging than riding herd on automated systems, i.e. watch, but don't touch.

Regards,

Mike

The problem with sloppy work is that the supply FAR EXCEEDS the demand

RE: Tesla Autopilot, fatal crash into side of truck

IRStuff- that is true, but another video is out there somewhere, showing a minor collision with object on the left as well.

RE: Tesla Autopilot, fatal crash into side of truck

(OP)
BBC News report on Tesla and its supplier MobilEye squabbling

BBC Newshour Extra 'Driving into the future' (as opposed to the side of a truck). 50m audio podcast, downloadable MP3.

RE: Tesla Autopilot, fatal crash into side of truck

Quote (Reuters)

Tesla removed a Chinese term for "self-driving" from its China website after a driver in Beijing who crashed in
"autopilot" mode complained that the car maker overplayed the function's capability and misled buyers.

Keith Cress
kcress - http://www.flaminsystems.com

RE: Tesla Autopilot, fatal crash into side of truck

Nah, I think I read about that back when it happened (side rails to keep cars from going under trucks, that is). Must be a slow news day.

Let's propose instead, that all autos should have big posts sticking up on the corner like a four-poster bed, that'd do the same thing, right? or is it only a good idea if somebody else is paying for it?

RE: Tesla Autopilot, fatal crash into side of truck

It's interesting that American truckers have invested some money in under-trailer aero skirts in order to save a few bucks on fuel.
Certainly a reasonable amount of structure behind the skirts wouldn't add a lot to the cost of the skirts.
... BUT, whether the structure was sturdy enough to bounce the car off with little damage to the truck, or even if the structure were designed to dissipate substantial energy, say by decelerating the car within the width of the trailer, that crash would not have been survivable either way; witness the damage from the tree.

Mike Halloran
Pembroke Pines, FL, USA

RE: Tesla Autopilot, fatal crash into side of truck

One point is that skirts may have made the trailer more visible, so that the automated system could have avoided the crash.
Apart from the color of the sky, I understood that there was a type of proximity sensor which did not scan high enough above the ground to see the trailer. It would have seen/detected a skirt extending close to the ground.

Bill
--------------------
"Why not the best?"
Jimmy Carter

RE: Tesla Autopilot, fatal crash into side of truck

That's my point. The sensor may have ignored higher items assuming that they were signboards. The sensor should have reacted to a skirt that extended close to the ground. Either a light skirt for aerodynamic advantage or a substantial skirt for protection.
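As a toy illustration of that failure mode (purely hypothetical threshold and logic, not any real sensor suite's), a pipeline that rejects returns whose lowest edge sits well above the road, to avoid braking for bridges and signboards, would also reject a bare trailer underside, while a low-hanging skirt would survive the filter:

```python
# Hypothetical sketch of overhead-object rejection in an obstacle filter.
# The threshold and classes are invented for illustration only.

from dataclasses import dataclass


@dataclass
class RadarReturn:
    range_m: float   # distance to the target, metres
    height_m: float  # estimated height of the target's lowest edge above road

# Assumed threshold: returns whose lowest edge is above this are treated
# as overhead structure (bridges, signboards) and ignored for braking.
OVERHEAD_REJECT_M = 1.0


def braking_targets(returns):
    """Keep only returns low enough to be treated as obstacles."""
    return [r for r in returns if r.height_m < OVERHEAD_REJECT_M]


trailer_body = RadarReturn(range_m=60.0, height_m=1.2)  # bare trailer underside
side_skirt = RadarReturn(range_m=60.0, height_m=0.3)    # aero skirt near road

print(braking_targets([trailer_body]))  # empty list: no braking target
print(braking_targets([side_skirt]))    # skirt passes the height gate
```

With those assumed numbers, the bare trailer is filtered out exactly like a signboard would be, while the skirted trailer presents a return the filter keeps.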

Bill
--------------------
"Why not the best?"
Jimmy Carter

RE: Tesla Autopilot, fatal crash into side of truck

Sideskirts or the lack thereof are not the issue. The "autopilot" system was being used in a manner for which it was not designed (which was keeping the car between the lines on a limited-access highway and not rear-ending the car in front of it). It was not equipped to recognize and manage traffic controls and intersections. Otherwise it would have seen the tractor/rig the second before it saw the gap under the billboard it thought it could drive under and said, "Hey, I'm doing 70 and there is crossing traffic immediately in front of me, should I be worried about this?" IRstuff points that out in the second post of this thread.

Another novel idea, rather than lowering everything that the car might run into so it can see it, would be to raise the sensor's height to say, maybe the level at which things won't decapitate you if you try to drive under them. But that's the trouble with relying on a camera and some proximity sensors to paint a picture of a dynamic environment precisely enough that you can navigate a car at speed through it. The LIDAR approach is much more robust (and expensive, and complicated).

Any car that I am ceding authority for decision making to had better be able to tell a semi-truck from a billboard from a Sasquatch without any special modifications to those things.

RE: Tesla Autopilot, fatal crash into side of truck

If you look at the number of cars with burned-out lights, and extend that to self-driving cars, where the owner would not have replaced a defective camera, then I would expect this to happen more often.

On the other hand, if self-driving cars either won't self-drive, or simply will not move with a defective camera, then a one-off event is nothing but a computer glitch.

Just reboot the car and replace the passenger.

The debate over whether you trust a self-driving car more than a human-driven one is the same question as whether you trust the driver of your car pool.

A bigger debate is who gets the ticket for careless driving?

RE: Tesla Autopilot, fatal crash into side of truck

JohnRBaker,

Why is the government responsible for this? How about the lobbyists? What is to stop a trucking firm from installing this on their own, rules or no rules?

Would a skirt have prevented this fatality? Did the car even brake?

--
JHG

RE: Tesla Autopilot, fatal crash into side of truck

Cost will stop a trucking firm from doing this on their own. Margins are slim in that business and if there is no economic payback to do something, and no rule dictating that "thou shalt", it doesn't get done because then one would be at a cost disadvantage to one's competitors who aren't doing it, either.

I have my doubts that such a structure would actually stand up to being hit by a car at 70 mph. I think they're mostly meant to reduce the chance of pedestrians and bicyclists being run over in city traffic.

Rear under-ride protection on box trailers needs work, too. Better designs are available, but they're not mandated, so they don't get used.

RE: Tesla Autopilot, fatal crash into side of truck

drawoh, I never said anything of the sort. I only posted the item because it provided an additional point of view and commentary.

John R. Baker, P.E. (ret)
EX-Product 'Evangelist'
Irvine, CA
Siemens PLM:
UG/NX Museum:

The secret of life is not finding someone to live with
It's finding someone you can't live without

RE: Tesla Autopilot, fatal crash into side of truck

Just going by the Yahoo column, no surprises that he was speeding and not minding the guidance of the car. He had demonstrated as much on his Youtube videos.

This does bother me:

Quote:

The NTSB report disclosed that the Tesla Model S uses a proprietary system to record a vehicle's speed and other data, which authorities cannot access with the commercial tools used to access information from event data recorders in most other cars. For that reason, the NTSB said it "had to rely on Tesla to provide the data in engineering units using proprietary manufacturer software."

Not possible to collect vehicle data on a Tesla without going to the factory? Do these things not have a CANBUS port? Can the NTSB validate the data that Tesla provided?
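For contrast with the proprietary EDR path, the standardized diagnostic side of most cars is open: SAE J1979 defines mode-01 PIDs readable over the OBD-II port, e.g. PID 0x0D for vehicle speed, where the single data byte is the speed in km/h. A minimal decode sketch (illustrative only; crash-recorder data is a separate, often manufacturer-specific, channel):

```python
# Decode a standard OBD-II vehicle-speed response (SAE J1979, mode 0x01,
# PID 0x0D). A positive response echoes 0x41, then the PID, then one data
# byte A, where speed = A km/h.

def decode_speed(frame: bytes) -> int:
    """Return vehicle speed in km/h from a mode-01 PID-0D response frame."""
    if len(frame) < 3 or frame[0] != 0x41 or frame[1] != 0x0D:
        raise ValueError("not a PID 0x0D speed response")
    return frame[2]  # single data byte A = speed in km/h


# Example response frame: 0x41 (positive reply), 0x0D (PID), 0x4A = 74 km/h
print(decode_speed(bytes([0x41, 0x0D, 0x4A])))  # 74
```

The point is that this live-data protocol is standardized precisely because regulation required it; nothing comparable was mandated for the recorded crash data the NTSB wanted.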

STF

RE: Tesla Autopilot, fatal crash into side of truck

That's not a problem unique to Tesla. Part of the Toyota throttle cable farrago was that only Toyota could access the detailed data, from memory. The USA could usefully introduce legislation making the content of vehicle black boxes accessible to others.

Cheers

Greg Locock


New here? Try reading these, they might help FAQ731-376: Eng-Tips.com Forum Policies http://eng-tips.com/market.cfm?

RE: Tesla Autopilot, fatal crash into side of truck

That's unlikely to happen in the current Congress and administration; the companies would merely cry, "Burdensome regulation!" and the pols would scurry away.

Besides, accessibility is only part of it. There would need to be a general protocol, which would require a standard. There would need to be some assurance that the hardware could survive crashes that it currently would not, etc. The end result would be an orange box à la airplane recorders, which is not cheap.

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! https://www.youtube.com/watch?v=BKorP55Aqvg
FAQ731-376: Eng-Tips.com Forum Policies forum1529: Translation Assistance for Engineers Entire Forum list http://www.eng-tips.com/forumlist.cfm

RE: Tesla Autopilot, fatal crash into side of truck

And 'black box' data from cars can and has been used in court. I know this first-hand, as several years ago a coworker of mine was found guilty, based at least in part on the contents of his SUV's 'black box', after his involvement in an accident where three members of a family were killed. He was driving under the influence and the data showed that he never touched his brakes. He's still serving a prison sentence for murder.

John R. Baker, P.E. (ret)
EX-Product 'Evangelist'
Irvine, CA
Siemens PLM:
UG/NX Museum:

The secret of life is not finding someone to live with
It's finding someone you can't live without

RE: Tesla Autopilot, fatal crash into side of truck

The data exists, yes, but, as was mentioned, it's in proprietary formats. The hardware might survive most crashes, but the percentage is probably lower than what something like an NTSB would need or want.

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! https://www.youtube.com/watch?v=BKorP55Aqvg
FAQ731-376: Eng-Tips.com Forum Policies forum1529: Translation Assistance for Engineers Entire Forum list http://www.eng-tips.com/forumlist.cfm

RE: Tesla Autopilot, fatal crash into side of truck

Here's the 500 pages of info - it's not a single report, it's the entire docket (like the entire collection of documents produced):

HWY16FH018 Docket

Also, I'm not sure if anybody has ever listened to this, but the podcast 99% Invisible, which is about design and architecture, did a 2-part podcast on "the automation paradox" in which automation actually makes things more unsafe in an emergency because it hands over control to an un-attentive human, whose skills have atrophied, at the worst possible time. The first part was about the Air France 447 crash, the second is about self-driving cars.

I think it's totally relevant here. The link is here: Johnnycab (Automation Paradox, Pt. 2). In this case, Tesla calls their system "Autopilot". Telling the driver "oh, you have to keep your hands on the wheel" when every incentive is to just treat it as a fully autonomous car may not hold a lot of water. I wonder what the liability courts will say.

RE: Tesla Autopilot, fatal crash into side of truck

It wasn't my fault, sir, please don't deactivate me. I told him not to go, but he's faulty, malfunctioning. Kept babbling on about his mission. - C-3PO concerning R2-D2

Guess you'd have to sue for deactivation.

Richard Feynman's Problem Solving Algorithm
1. Write down the problem.
2. Think very hard.
3. Write down the answer.

RE: Tesla Autopilot, fatal crash into side of truck

Quote (olynyk)

... in which automation actually makes things more unsafe in an emergency...

This is not a new issue.

As soon as it became technically possible to automate a vehicle with computers, engineers HAD to figure out what level of automation was appropriate.

Reprinted from "Digital Apollo" by David Mindell https://books.google.ca/books?id=gXYItzQARVoC&...



While it was easy to identify the extremes (as in the cartoon above), the best middle-ground was hotly debated. It was a very contentious issue, with astronauts pushing hard to have maximum control over the spacecraft, program managers pushing just as hard to ensure 100% mission success (by eliminating human error) - all of it offset by insightful engineers and specialists who figured out just how much command and control fidelity a trained human can handle in a computerized system before becoming overloaded.

NB. This is probably why Neil Armstrong was chosen to be the first man to walk on the moon. His mastery over computer control systems was unmatched by most other astronauts and it saved his life over and over through X-15 flying, NASA training, and Gemini. Everyone in the NASA chain of command knew that no computer could kill him. He made sure to prove it one last time, 25 seconds before touchdown!

STF

RE: Tesla Autopilot, fatal crash into side of truck

The recent collision of a US naval vessel and a rather gigantic cargo ship is an excellent example of such failures. One would have thought that all those radars and bodies on the bridge would have noticed that they were going to collide. I can't even begin to picture why the US vessel was even that close to the other ship: http://abcnews.go.com/US/navy-destroyers-deadly-co...

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! https://www.youtube.com/watch?v=BKorP55Aqvg
FAQ731-376: Eng-Tips.com Forum Policies forum1529: Translation Assistance for Engineers Entire Forum list http://www.eng-tips.com/forumlist.cfm

RE: Tesla Autopilot, fatal crash into side of truck

Yes, and the original Mercury capsule didn't even have a window until the astronauts demanded one be added.

John R. Baker, P.E. (ret)
EX-Product 'Evangelist'
Irvine, CA
Siemens PLM:
UG/NX Museum:

The secret of life is not finding someone to live with
It's finding someone you can't live without

RE: Tesla Autopilot, fatal crash into side of truck

Designed by the original nerds for their pet chimps.
No real pilot will get into anything without a window.

Richard Feynman's Problem Solving Algorithm
1. Write down the problem.
2. Think very hard.
3. Write down the answer.

RE: Tesla Autopilot, fatal crash into side of truck

SparWeb - I took Mindell's course on the history of Apollo in 2010 ("Digital Apollo" was one of the required texts for the course.) Easily the greatest, most interesting, most memorable course I've ever taken. He had 12 students have dinner with Commander David Scott! He's a great teacher and writer.

RE: Tesla Autopilot, fatal crash into side of truck

Olynyk,
Where's the smiley icon for "brimming over with jealousy?" wink

Now that you've got me googling his name again, I see he has published a new book. Thanks!

STF
