
# Tesla Autopilot 2

## Tesla Autopilot

(OP)
I think it's worth breaking Tesla out from Uber; both the hardware and software are different.

So, the second crash of a Tesla into a stationary firetruck looks like more than a coincidence. Even if Autopilot wasn't engaged (the police say it was), Automatic Emergency Braking should have stopped this.

From Tesla's website:

Standard Safety Features
These active safety technologies, including collision avoidance and automatic emergency braking, have begun rolling out through over-the-air updates

Automatic Emergency Braking

Designed to detect objects that the car may impact and applies the brakes accordingly

Side Collision Warning

Warns the driver of potential collisions with obstacles alongside the car

Front Collision Warning

Helps warn of impending collisions with slower moving or stationary cars

So, the questions that need to be asked are: which of these were fitted to the crash vehicle? AEB is widely available on other cars, but according to Tesla forums it is possible that it was remotely disabled. According to one user, you can set AEB to warn only. That is a bizarre choice of UI design.

Anyway, I think so far there have been three AP collisions with large objects on the road in front of the car in good viewing conditions: the idiot with the truck crossing the road, and two fire trucks.

Cheers

Greg Locock

New here? Try reading these, they might help FAQ731-376: Eng-Tips.com Forum Policies http://eng-tips.com/market.cfm?

### RE: Tesla Autopilot

Of the systems I've seen, there are significant limitations on auto-braking's ability based on approach speed. I think the one from Volvo handles a delta-V of 30 mph max; I saw a demo of it, and they used really big targets and rather low approach speeds.

"In tests in which the vehicle traveled at under 30 miles per hour, systems designed to prevent crashes successfully avoided collisions 60% of the time. Meanwhile, automatic emergency brakes designed to only reduce crash severity were able to completely avoid crashes in only 33% of test scenarios, according to AAA." http://fortune.com/2016/08/24/aaa-automatic-brakin...

I see a lot of articles based on the same AAA test in 2016.

It seems like the tech is perfectly fine to prevent fender-benders in stop and go traffic and as the delta-V departs from that it becomes increasingly unreliable across the board.

### RE: Tesla Autopilot

What that means is that the average crash severity will increase. Not because things are getting worse, but simply because the bottom end of the curve gets cut off. That can be seen, correctly as far as I can tell, as an overall improvement, or it could be seen as a degradation, because percentage-wise the severe crashes increase. One perception is damning while the other is encouraging.

### RE: Tesla Autopilot

(OP)
Reading the Tesla forums is interesting. If AP is activated then AEB should never be necessary; AP is supposed to offer a kinder, gentler, more predictable driving pattern. AEB is designed to avoid false positives: basically, if it detects an unavoidable collision it jams on the brakes until the speed has dropped by 25 mph, or the steering wheel is touched (that's a failure mode I'd investigate), or the accelerator is touched, or the target disappears. AEB does have a speed limitation, which varies, but it seems like at least some versions work below 85 mph.
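The disengagement conditions described above can be sketched as a simple predicate. This is a toy illustration, not Tesla's implementation; the 25 mph drop comes from the forum reports, while the function and field names are invented:

```python
# Sketch of the AEB release logic described in the post above.
# The 25 mph figure is from the forum discussion; everything else
# (names, signature) is an assumption for illustration only.

SPEED_DROP_LIMIT_MPH = 25.0

def aeb_should_release(speed_at_trigger_mph, current_speed_mph,
                       steering_touched, accel_pressed, target_visible):
    """Return True when automatic emergency braking should disengage."""
    speed_drop = speed_at_trigger_mph - current_speed_mph
    return (speed_drop >= SPEED_DROP_LIMIT_MPH  # enough speed scrubbed off
            or steering_touched                 # driver override: steering
            or accel_pressed                    # driver override: accelerator
            or not target_visible)              # sensors lost the target
```

Note how the steering-touch release is exactly the failure mode flagged above: a startled driver's flinch on the wheel would cancel the braking.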

Cheers

Greg Locock


### RE: Tesla Autopilot

#### Quote (davidbeach)

One perception is damning while the other is encouraging.

The cost savings to insurance companies from writing off fewer fender-benders still makes it worth it? My insurance company will pass the savings on to me, right? :/

STF

### RE: Tesla Autopilot

3DDave: "It seems like the tech is perfectly fine to prevent fender-benders in stop and go traffic and as the delta-V departs from that it becomes increasingly unreliable across the board."

davidbeach: "What that means is that the average crash severity will increase. Not because things are getting worse, but simply because the bottom end of the curve gets cut off."

If drivers become accustomed to, or expect, their car to brake for them, severe crashes could become more common, as well. These systems need to function more than a third or even two-thirds of the time, since people WILL rely on them, whether they should or not.

Edit: "...function more than a third..." not "...function a third..." Sorry for any confusion.

### RE: Tesla Autopilot

It's unclear whether that will actually happen; I would have thought the same about ABS, but people don't seem to be getting into those kinds of crashes. I think we're already close to our normal "chicken" limits.

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! https://www.youtube.com/watch?v=BKorP55Aqvg
FAQ731-376: Eng-Tips.com Forum Policies forum1529: Translation Assistance for Engineers Entire Forum list http://www.eng-tips.com/forumlist.cfm

### RE: Tesla Autopilot

For that matter, will AVs be aware of wet, icy, oil-spill or other low-friction conditions and reduce speed or increase following distance accordingly, as an attentive and prudent human driver would?
Or are they going to operate according to nominal mu conditions at all times?
If they are going to dynamically adjust the assumed mu, I'd be interested in the algorithms for that.
As a human driver, I monitor many inputs to adjust my assumed mu downward from the nominal: ambient temperature, appearance of the road, weather history of the preceding several hours, presence of vehicles in the ditch or otherwise appearing to have lost control due to insufficient traction; not to mention experience with certain patches of road, e.g. near intersections, where mu is noticeably below par.
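The downward-adjustment heuristic described above could be sketched as a crude estimator. All cues, weights, and the nominal value here are invented for illustration; a real system would fuse wheel-slip measurements and sensor data rather than hand-tuned multipliers:

```python
# Toy version of "adjust the assumed mu downward from the nominal"
# using the cues listed in the post. Weights are assumptions.

NOMINAL_MU = 0.9  # dry asphalt baseline (illustrative)

def estimated_mu(ambient_temp_c, road_looks_wet, recent_precip_hours,
                 vehicles_off_road_seen):
    """Scale the assumed friction coefficient down for each adverse cue."""
    mu = NOMINAL_MU
    if road_looks_wet:
        mu *= 0.7
    if recent_precip_hours > 0:
        mu *= 0.85
    if ambient_temp_c <= 0:          # possible ice
        mu *= 0.5
    if vehicles_off_road_seen:       # strong evidence of low traction
        mu = min(mu, 0.2)
    return mu
```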

"Schiefgehen wird, was schiefgehen kann" - das Murphygesetz

### RE: Tesla Autopilot

There was a good television documentary program about highway safety and its history. Although cars are far safer than years ago, fatality rates have stopped going down. One expert speculated that the reason was that roads have been straightened and drivers feel so safe in their cars that they are driving faster. The result is that accidents have become more severe with more deaths. He said that the greatest true safety measure would be to put a 12" spike in the steering wheel pointed at the driver's chest. Drivers would drive a lot more safely.

### RE: Tesla Autopilot

I'm relatively certain that texting and other "distracted driving" situations, including "distracted pedestrians", are the chief cause of increased deaths. The 12" spike would do nothing; they pretty much built cars that way for decades. I really don't think people get into cars and think, "this car is so safe I can drive like an idiot."

----------------------------------------

The Help for this program was created in Windows Help format, which depends on a feature that isn't included in this version of Windows.

### RE: Tesla Autopilot

I had thought I posted this graphic recently, but couldn't find it. It shows that the fatality rate per capita is still on a downward trend, and what's really flattened out is the deaths per Vehicle Mile Traveled (VMT), whose increase is somewhat matching the population rate. Therefore, what's flattening the death/VMT is the increased exposure to driving, i.e., we're driving more and farther, which increases the probability of death per year.


### RE: Tesla Autopilot

On a related note to this discussion, I have a car which has a limited emergency braking system (speeds <19 mph). Above that speed, it will only warn me of a collision. While driving the other day, following at a safe distance, the car in front of me smacked a pothole, which created a spray of water and aggregate. Curiously, the radar system interpreted this as an object and warned me of a collision with it. Now, this tech in my car was in its early days for Mazda (2014), and they have since expanded its capabilities as they have worked some of these issues out. But this "blip" really served to illustrate the challenges that exist in getting to Level 2 and higher automation.

### RE: Tesla Autopilot

IRstuff,

If the graph had continued on the steady path it followed from 1985 to 1995, the US would have started seeing resurrections of the dead about the year 2005. Fortunately this will be avoided for the time being.

### RE: Tesla Autopilot

(OP)
152 ft is poor for a medium-size car, and the difference from Tesla's claim of 133 ft is probably statistically significant (repeatability is about 3 ft, from memory). CU says "The Tesla Model 3's 152 feet is 21 feet longer than the class average of 131 feet for luxury compact sedans". There are plenty of ways of gaming the braking test, and CU's test procedure is not ideal. A much more repeatable test is the AMS one, where braking starts at 66 mph and the distance from 60 to rest (roughly speaking) is the quoted figure. CU's procedure relies on the force buildup.

There's also the old favorite of how grippy your track is (we measure ours).
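For context on these numbers: under constant deceleration, the ideal stopping distance is d = v²/(2μg). A quick sketch (the μ values are illustrative, and real stops also include force build-up time, which this ignores) shows that 133 ft from 60 mph corresponds to an effective friction coefficient of roughly 0.9, while 152 ft implies only about 0.79:

```python
# Ideal constant-deceleration stopping distance: d = v^2 / (2 * mu * g).
# This neglects brake force build-up, which is exactly what the CU
# procedure is said to be sensitive to.

MPH_TO_MPS = 0.44704
M_TO_FT = 3.28084
G = 9.81  # m/s^2

def ideal_stopping_distance_ft(speed_mph, mu):
    """Stopping distance in feet from the given speed at friction mu."""
    v = speed_mph * MPH_TO_MPS
    return (v * v) / (2.0 * mu * G) * M_TO_FT
```

With mu = 0.9, a 60 mph stop works out to about 134 ft, close to Tesla's claimed 133 ft; reaching 152 ft requires an effective mu of only about 0.79, which is a large gap for the same tires on the same surface.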

Cheers

Greg Locock


### RE: Tesla Autopilot

Those Teslas weigh at least 1000 pounds more than an ICE vehicle with the same seating capacity. All those batteries. There's only so much brake disc you can squeeze into a 21" wheel. Does that explain the braking distance?

IRStuff: What is the source of that graph? I'd like to see the graph of "Fatalities per billion VMT" plotted on a logarithmic scale. I don't think that kind of number can actually trend to zero. By definition I think it can only trend to smaller and smaller fractions.

STF

### RE: Tesla Autopilot

(OP)
SparWeb - that may be a little of it, but dry braking is primarily a function of tires. There are effects on subsequent stops, where the brake rotor size is important, but from 60 mph you are well within the thermal limits of the brakes.

CR has a huge effect on new car purchases, at least for ICE cars. I imagine Musk will be blustering away about this on Twitter for a few days, and then they'll work with CR to find out what is going on. I saw some nasty creep from Tesla claiming it didn't matter because it could be solved by a software upgrade. Um, no, almost certainly not.

Cheers

Greg Locock


### RE: Tesla Autopilot

"Regenerative braking (using the electric motors as generators) should help (a bit) with braking distances."

That might actually be what's screwing it up. Longer stopping distances are not the result of inadequate braking power. They can all apply enough force to lock up the brakes. It's either a delay in fully applying the brakes (after the pedal is pushed) or the brakes are being released more than they should be during braking.

### RE: Tesla Autopilot

#### Quote (Spartan5)

On a related note to this discussion, I have a car which has a limited emergency braking system (speeds <19 MPH). Above that speed, it will warn me of a collision. While driving the other day, and following at a safe distance, the car in front of me smacked a pothole which created a spray of water and aggregate. Curiously, the radar system interpreted this as an object and warned me of a collision with it. ...

I have been trying to understand the underlying intelligence for this stuff. If the robot detects an object on the road ahead of it, it must take evasive action, in most cases, stopping. Any other action requires the robot to be extremely reliable at identifying safely hittable objects. Just how good is your radar's resolution of objects?

--
JHG

### RE: Tesla Autopilot

There's a risk of over-thinking this. Perhaps these incidents are the outcome of simple and rudimentary design or programming errors. Mistakes of a nature that we could not imagine being involved with ourselves. Occam's razor.

### RE: Tesla Autopilot

(OP)
That's right. You can't do a 0.7g stop in the middle of traffic just because a plastic bag gets blown across the road (unless everyone is in AVs and they react the same). So the software has to take the world map that has been integrated from the inputs generated by the sensors, and decide which objects are worth ignoring.
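The filtering step described here might look, in toy form, like a classifier gate ahead of the brake request. The object classes, thresholds, and function shape are all invented for illustration; the hard part in practice is the classifier itself, not this gate:

```python
# Toy illustration of "decide which objects are worth ignoring":
# only brake for solid objects on a genuine collision course.
# Classes and thresholds are assumptions, not any vendor's design.

HARMLESS = {"plastic_bag", "leaf", "spray", "small_debris"}

def should_brake_for(obj_class, closing_speed_mps, time_to_impact_s):
    """Return True only for solid objects on a converging course."""
    if obj_class in HARMLESS:
        return False                 # ignore safely hittable clutter
    if closing_speed_mps <= 0:
        return False                 # not actually converging
    return time_to_impact_s < 3.0    # inside the reaction horizon
```

The fire-truck crashes suggest the failure mode is the opposite one: a large stationary object being binned with the clutter rather than clutter triggering a phantom stop.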

Cheers

Greg Locock


### RE: Tesla Autopilot

#### Quote (VE1BLL)

There's a risk of over-thinking this. Perhaps these incidents are the outcome of simple and rudimentary design or programming errors. Mistakes of a nature that we could not imagine being involved with ourselves. Occam's razor.

I think it is being overthought, in the other direction. That isn't to say that there aren't programming errors- the software is written by humans, and there are certainly mistakes.

But...

Current technology being what it is, designing a solution which is capable of everything everyone in this thread wants it to be capable of AND is inexpensive enough for implementation into a mass-production vehicle is not possible.

The primary restrictions are $/GHz of processing power and $/pixel of sensor fidelity. Processing is the bottleneck. Programmers have to make compromises and use simpler routines than they would like to, because the processors aren't fast enough to handle everything in real time.

### RE: Tesla Autopilot

(OP)
In that case the technology should not be used on public roads without adequate safety procedures. Any physical test I do either has to be a standard test or I have to do an FMEA on it and get all sorts of people to sign off on it. I don't see why AV operators should not have to do the same.

Cheers

Greg Locock


### RE: Tesla Autopilot

Greg, "I don't see why AV operators should not have to do the same"

because AV is hip and cool?

### RE: Tesla Autopilot

GregLocock, are you suggesting that the "operator" (read "passenger") in an AV will have to be ready to assume control of the vehicle at any time? If so, then what's the point of the automation?

### RE: Tesla Autopilot

HotRod10:

I think it is important that we be clear to differentiate and be mindful of the different levels of autonomy while we continue this discussion.
https://www.caranddriver.com/features/path-to-auto...

#### Quote:

Because no two automated-driving technologies are exactly alike, SAE International’s standard J3016 defines six levels of automation for automakers, suppliers, and policymakers to use to classify a system’s sophistication. The pivotal change occurs between Levels 2 and 3, when responsibility for monitoring the driving environment shifts from the driver to the system.

Level 0: No Automation
System capability: None. • Driver involvement: The human at the wheel steers, brakes, accelerates, and negotiates traffic. • Examples: A 1967 Porsche 911, a 2018 Kia Rio.

Level 1: Driver Assistance
System capability: Under certain conditions, the car controls either the steering or the vehicle speed, but not both simultaneously. • Driver involvement: The driver performs all other aspects of driving and has full responsibility for monitoring the road and taking over if the assistance system fails to act appropriately. • Example: Adaptive cruise control.

Level 2: Partial Automation
System capability: The car can steer, accelerate, and brake in certain circumstances. • Driver involvement: Tactical maneuvers such as responding to traffic signals or changing lanes largely fall to the driver, as does scanning for hazards. The driver may have to keep a hand on the wheel as a proxy for paying attention. • Examples: Audi Traffic Jam Assist, Cadillac Super Cruise, Mercedes-Benz Driver Assistance Systems, Tesla Autopilot, Volvo Pilot Assist.

Level 3: Conditional Automation
System capability: In the right conditions, the car can manage most aspects of driving, including monitoring the environment. The system prompts the driver to intervene when it encounters a scenario it can't navigate. • Driver involvement: The driver must be available to take over at any time. • Example: Audi Traffic Jam Pilot.

Level 4: High Automation
System capability: The car can operate without human input or oversight but only under select conditions defined by factors such as road type or geographic area. • Driver involvement: In a shared car restricted to a defined area, there may not be any. But in a privately owned Level 4 car, the driver might manage all driving duties on surface streets then become a passenger as the car enters a highway. • Example: Google's now-defunct Firefly pod-car prototype, which had neither pedals nor a steering wheel and was restricted to a top speed of 25 mph.

Level 5: Full Automation
System capability: The driverless car can operate on any road and in any conditions a human driver could negotiate. • Driver involvement: Entering a destination. • Example: None yet, but Waymo (formerly Google's driverless-car project) is now using a fleet of 600 Chrysler Pacifica hybrids to develop its Level 5 tech for production.
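The quoted J3016 summary condenses to a small table in code, where the pivotal hand-off of environment monitoring at Level 3 becomes a single flag. The class layout is mine, not SAE's:

```python
# Compact encoding of the SAE J3016 summary quoted above. The key
# attribute is who monitors the driving environment, which shifts
# from driver to system between Levels 2 and 3.

from dataclasses import dataclass

@dataclass(frozen=True)
class SaeLevel:
    level: int
    name: str
    system_monitors_environment: bool  # True from Level 3 upward

J3016 = [
    SaeLevel(0, "No Automation", False),
    SaeLevel(1, "Driver Assistance", False),
    SaeLevel(2, "Partial Automation", False),
    SaeLevel(3, "Conditional Automation", True),
    SaeLevel(4, "High Automation", True),
    SaeLevel(5, "Full Automation", True),
]
```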

### RE: Tesla Autopilot

"Hotrod-no I didn't say that or think that or mean to imply that."

Ok. Sorry, I apparently misunderstood you. What then were you advocating that "AV operators should...have to do..."?

"L2 and L3 are recipes for disaster in my opinion, as people will treat them as L4s."

I agree completely.

### RE: Tesla Autopilot

"Any physical test i do either has to be a standard test or I have to do an FMEA on it and get all sorts of people to sign off on it. I don't see why AV operators should not have to do the same."


### RE: Tesla Autopilot

#### Quote (Level 2)

"The driver may have to keep a hand on the wheel as a proxy for paying attention."

Hmmm...

The wording as written (and perhaps even as intended) raises all sorts of questions and concerns.

### RE: Tesla Autopilot

Greg, I agree with you about L2 and L3. Hell, even L4 is questionable. Roads change all the time so how can anyone 100% ensure the road is still meeting the required conditions for autonomous operation?

But then I also agree with you in that if the technology is not yet capable then they shouldn't be operating on public roads.

### RE: Tesla Autopilot

The main intermediate step is gaze detection, a helpful system even for Level 0 operation that alerts the inattentive driver to a lack of attention. This would not be as easily defeated via a water bottle wedged into the steering wheel. It could easily be wired to the four-way flashers to alert other drivers to an out-of-control car.

### RE: Tesla Autopilot

"Hotrod-no I didn't say that or think that or mean to imply that. L2 and L3 are recipes for disaster in my opinion, as people will treat them as L4s. "

This already happens now- reference the guy in the UK busted for moving to the passenger side of his Tesla mid-drive.

I think the SAE standard for AV 'levels' is reasonable- but I don't think the general public has any real understanding of what those levels mean, or the safety tradeoffs they represent.

For the general public it comes down to 'do I still have to drive or does the car do everything'.

Tesla did themselves no favor in this regard by using the name 'Autopilot' for their system.

### RE: Tesla Autopilot

jgKRI - By any other name, drivers would abuse it.

### RE: Tesla Autopilot

I agree- but it still wasn't a great choice in my opinion, because of the capability it implies.

### RE: Tesla Autopilot

#### Quote (jgKRI)

I think the SAE standard for AV 'levels' is reasonable- but I don't think the general public has any real understanding of what those levels mean, or the safety tradeoffs they represent.

If they're meant for a consumer product but the consumers don't understand the levels or what they mean, then they're rather useless.

### RE: Tesla Autopilot

"then they're rather useless"

The consumer does not need to know what the levels mean; they only need to know clearly what their purchase can and cannot do.

For example, "autopilot" to a civil pilot means something that maintains course and speed; no pilot would think that it should automatically and reliably dodge buildings and other airplanes. The general public, on the other hand, may, and does, see something different. Clearly, Tesla didn't fully consider the implications of people ignoring the insufficiencies of Autopilot, in addition to not having a sufficiently robust system to start with. Musk's disdain for lidar almost guarantees further failures and collisions.


### RE: Tesla Autopilot

Gaze detection: good idea, but does it go far enough? How about brain wave detection, to see if the driver is going to sleep? Or better, if his/her intelligence is below a standard level deemed necessary to drive (of course, this would prevent the majority of drivers in my area from even starting the car).

### RE: Tesla Autopilot

"Gaze detection: good idea, but does it go far enough? How about brain wave detection, to see if the driver is going to sleep?"

Most systems that can do gaze detection can do blink detection as well; blink detection, particularly the rapidity of eyelid open/close, can reliably detect onset of sleepiness, and even micro-sleeps.
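One widely used blink-derived drowsiness measure is PERCLOS: the fraction of time the eyelids are at least about 80% closed over a rolling window. A minimal sketch follows; the 80%-closed definition is the standard one, but the 15% alert threshold and function names are assumptions:

```python
# Minimal PERCLOS sketch: proportion of samples in which eyelid
# closure exceeds a threshold. Alert level is an assumption.

def perclos(eyelid_closure_samples, closed_threshold=0.8):
    """Fraction of samples where eyelid closure >= closed_threshold."""
    closed = sum(1 for c in eyelid_closure_samples if c >= closed_threshold)
    return closed / len(eyelid_closure_samples)

def driver_is_drowsy(samples, alert_level=0.15):
    """Flag the driver once eyes-closed time exceeds the alert level."""
    return perclos(samples) > alert_level
```

In a real system the samples would come from a camera-based eyelid tracker at tens of hertz, and the window would slide over the last minute or so of driving.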


### RE: Tesla Autopilot

Spartan5,

I have responded to this previously. The SAE automation levels are useful for discussions about engineering and control. For the purposes of a consumer/driver, this is a binary state -- automated, and not automated. Words like "Automatic" and "Robot" should not be anywhere on the controls. Any word stronger than "Assist" should be banned from the controls of anything other than a full robot.

--
JHG

### RE: Tesla Autopilot

#### Quote (IRstuff)

The consumer does not need to know what they mean, they only need to clearly know what their purchase can do, or not do.

I don't get your point. Knowing what their purchase can do is similar to knowing what those levels mean and where their purchase falls on that list. So, are you saying they still need to know the same thing as those levels, but worded differently?

### RE: Tesla Autopilot

#### Quote (LionelHutz)

Knowing what their purchase can do is similar to knowing what those levels mean and where their purchase falls into that list. So, are you saying they still need to know the same thing as those levels, but worded differently????

Go to the middle of nowhere in Idaho and ask a random person at a grocery store what Tesla Autopilot can do, and you might get an answer that approximates the truth.

Ask this same person to explain the difference between level 3 and level 4 operation according to SAE J3016. You'll get a blank stare.

### RE: Tesla Autopilot

Perhaps one or more of those exquisitely-defined Levels are simply very bad ideas.

Perhaps in some future edition of the SAE standard J3016, they'll be forced to add notes such as:
"(This level has been proven dangerous and is not recommended.)", or
"(This level has been banned in all major jurisdictions.)"

Imaginary dialog:
"This seems like a very dangerous design concept."
"No, it's in accordance with SAE J3016 Level 3."
"Well I suppose that if it's been precisely defined, then it must be okay."

### RE: Tesla Autopilot

I'll try again. AFTER you rewrite the "geek speak" level description into consumer language in an attempt to tell the consumer what their product will do, certain levels are still very bad ideas, because the resulting product will not be properly understood or used by the consumer. It doesn't matter much what the "geek" thinks the level should mean if giving the consumer a product built to that level is a very bad idea.

### RE: Tesla Autopilot

That presumes that the "bad" idea is simply tossed out there without safeguards. Automation, in general, isn't a bad idea; the implementations seen so far are simply not adequately safeguarded, mainly because the manufacturers haven't bothered to provide robust safeguards. We can look back at the development of the car itself for some lessons. We get into a car, push a button or turn a key, and the car starts, but the original cars weren't that straightforward; cranks and chokes and priming, etc., are all the things that fell by the wayside over the course of the maturation of cars. We're even eliminating "turn-key" as a benchmark of automation by eliminating the traditional key.

We've got industrial machines that can easily kill their operators, but under most circumstances, the high-school educated operator is able to safely operate the machines.


### RE: Tesla Autopilot

The safety claims by Musk are hypocritical. He starts by stating the limitations that the users must accept, and concludes with spurious statistics.

"Schiefgehen wird, was schiefgehen kann" - das Murphygesetz

### RE: Tesla Autopilot

One point that Tesla makes is that Autopilot was strictly intended to be used on highways.

But, duh, there are lots of databases that tell you that sort of thing, and Autopilot ought not to allow itself to be turned on when it's not anywhere close to a highway.
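That gate is essentially a map lookup before engagement. A sketch, assuming a map-matching service that returns a road class for the current position (the class names follow OpenStreetMap conventions; the function shape is invented):

```python
# Sketch of a road-class gate: refuse to engage highway-only
# assistance unless the map says we're on a limited-access road.
# The lookup result is a stand-in for a real map-matching service.

HIGHWAY_CLASSES = {"motorway", "trunk"}  # e.g. OpenStreetMap classes

def may_engage(road_class_at_position):
    """Allow engagement only on limited-access highway road classes."""
    return road_class_at_position in HIGHWAY_CLASSES
```

A production version would also have to handle map staleness and GPS error, which is presumably part of why manufacturers have been slow to enforce it.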


### RE: Tesla Autopilot

#### Quote (drawoh)

Spartan5,

I have responded to this previously. The SAE automation levels are useful for discussions about engineering and control.

That's why I posted them. I had thought that was what this discussion was.

### RE: Tesla Autopilot

Spartan5,

Indeed it is; however, this is a confusing thing to communicate outside the engineering world. The differences between the levels are minor, and we do not seem to be worrying about them here. No one has discussed, say, Level 3 versus Level 4. There is a lot of confusion in the outside world about how responsible drivers are when they use automated vehicles. The human interface is a critical part of most technologies.

When technology creates a hazard, there cannot be confusion over who is responsible for mitigating it. Either the driver is 100% responsible, or the manufacturer of the car is 100% responsible.

--
JHG

### RE: Tesla Autopilot

I have another thought here. Perhaps there is a maximum level of automation that can be permitted when the driver is responsible for safety. The driver must be kept alert.

--
JHG

### RE: Tesla Autopilot

Machine vision sensors work pretty well to detect drowsy and missing drivers. If needed, they could even detect heartbeat to eliminate fake heads.

I suspect, though, if autonomous cars become the majority, there will be movement to eliminate human driving altogether, except on special roads.


### RE: Tesla Autopilot

IRstuff,

You can ban the human drivers from the road, but what about bicycles, pedestrians, pets, loose shopping carts and Bambi and Bullwinkle? If you build an access limited track occupied entirely by robots, you have all sorts of opportunity to make the robots do cool stuff. If the robots are out in the real world where we are, they have to cope with unpredictable elements.

--
JHG

### RE: Tesla Autopilot

drawoh,
The levels are important for just the reasons you mention. There is extensive need for regulation tied to each of those levels, from functional requirements to how they are marketed. There is no good reason that these lower-level systems can't utilize geofences and other active driver-monitoring systems to minimize their potential for abuse or misuse.

It seems a lot of people are talking past each other in these threads because they're not differentiating between the levels.

I mean, it shouldn't even be newsworthy that Teslas are plowing into things, because they are just a piddly Level 2 system, and they're doing exactly what a Level 2 system is going to do when the driver isn't 100% paying attention. Tesla markets it like Level 4, though, and has extremely limited safeguards in place to keep it from being used that way.

### RE: Tesla Autopilot

Spartan5,

Let's look carefully at those levels.

Level 5: The automated car is in control, responding only to destination instruction from the passenger. If the car causes an accident, the manufacturer is legally and morally responsible. No problems.

Level 4: The car is highly automated. As the driver/passenger, I get in and indicate my destination, and I do something other than drive. I watch a movie. I read a book or magazine. I make out with my passenger. I do work on my laptop. I watch out the side window, perhaps taking pictures of interesting buildings as they go by. How rapidly can I transition to control of the vehicle in an emergency? An alert teenager waiting for an instruction to brake needs three quarters of a second. How much time do I need to go "What the f&&k?" "Holy f&&k!" and hit the brakes or change lanes? If I am not focused on driving, I don't want to be responsible. Okay, you the manufacturer are responsible for any accidents, including the ones that occur when I take control of the vehicle. I need to impress my female passenger somehow! Do you really want to be responsible?

Level 3: Conditional Automation. "The driver must be available to take over at any time." To me, that means the driver is gripping the steering wheel and watching out the front window. The best way to keep focused on driving is to steer the car. Perhaps the robot can help parallel park or back in. The instrument panel can ring buzzers and remind the driver that they are responsible if the car hits something. I think this is where we are having accidents.

Level 2: Partial automation. Same as level 3. I have no problems with the car watching out for and signalling hazards.

Level 1: Driver assistance only. The driver is responsible.

For Levels 1 to 3, the driver is responsible for safety, and must do nothing other than drive the car. There is minimal opportunity to take advantage of automation. For level 5, the manufacturer of the car is responsible for safety. Level 4 is not a functional concept.

--
JHG

### RE: Tesla Autopilot

I think Tesla's system is Level 2 (and seemingly a poor implementation, at that), but it presents itself to the end user (who knows no better) as Level 3, and it's marketed as having the hardware for Level 4 or 5.

### RE: Tesla Autopilot

#### Quote (DrawOh)

Level 4 is not a functional concept.

On the contrary: Level 4 is, in my opinion, what most manufacturers are likely ever to attain, and may be the limit of what is actually possible in the physical universe we currently occupy.

Level 4 is Level 5 with restrictions; for example, a Level 4 vehicle operates as Level 5 when it is sunny and beautiful out, but cannot operate at Level 5 in a blizzard.

Think about the simply massive range of conditions under which it is possible for a human being to successfully navigate a vehicle; operation under that COMPLETE set of conditions, with ZERO exceptions for weather (no matter how severe), obstacles, inconsistent surface conditions (think off-road applications) etc etc. That is not just a tall order- it's massive. Gigantic. It is not only impossible right now, regardless of cost, it might never be possible at all.

Level 5, truly, is not just a car with no steering wheel- it's a car with no steering wheel that still gets you there in a winter storm in Buffalo. That's a long way off and maybe never attainable.

### RE: Tesla Autopilot

I agree with you, but that opinion of ours busts a whole lot of bubbles in tech and political circles.

Who here believes that production Tesla cars actually have "hardware for self-driving capability"? I think it will be found that they don't have sufficient sensors, and they don't have sufficient computing power, and it will never handle exceptions well enough. The list of such "exceptions" is long ... in fact, it is indeterminate - and that's only the things that we know we don't know.

### RE: Tesla Autopilot

(OP)
#### Quote (drawoh)

Level 4: The car is highly automated. As the driver/passenger, I get in and indicate my destination, and I do something other than drive. I watch a movie. I read a book or magazine. I make out with my passenger. I do work on my laptop. I watch out the side window, perhaps taking pictures of interesting buildings as they go by. How rapidly can I transition to control of the vehicle in an emergency? An alert teenager waiting for an instruction to brake needs three quarters of a second. How much time do I need to go "What the f&&k?" "Holy f&&k!" and hit the brakes or change lanes? If I am not focused on driving, I don't want to be responsible. Okay, you the manufacturer are responsible for any accidents, including the ones that occur when I take control of the vehicle. I need to impress my female passenger somehow! Do you really want to be responsible?

You have misunderstood L4. You are describing L3.

Cheers

Greg Locock

New here? Try reading these, they might help FAQ731-376: Eng-Tips.com Forum Policies http://eng-tips.com/market.cfm?

### RE: Tesla Autopilot

Not according to SAE. Nothing on the streets comes even close to this. btw, this version is free to download, after creating an account

#### Quote (J3016_201609 Taxonomy and Definitions for Terms Related to Driving Automation Systems for On-Road Motor Vehicles)

NOTE 1: The user does not need to supervise a level 4 ADS feature or be receptive to a request to intervene while the ADS is engaged. A level 4 ADS is capable of automatically performing DDT fallback, as well as achieving a minimal risk condition if a user does not resume performance of the DDT. This automated DDT fallback and minimal risk condition achievement capability is the primary difference between level 4 and level 3 ADS features. This means that the user of an engaged level 4 ADS feature is a passenger who need not respond to requests to intervene or to DDT performance-relevant system failures.
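The quoted J3016 distinction (who performs the DDT fallback at each level) can be sketched roughly as follows. This is my informal reading of the standard, not its normative text, and the names here are my own:

```python
# Rough sketch of SAE J3016's fallback-responsibility split by level.
# Informal reading only; consult the standard for the normative text.

FALLBACK_RESPONSIBLE = {
    0: "driver",  # no automation
    1: "driver",  # driver assistance
    2: "driver",  # partial automation: driver supervises at all times
    3: "driver",  # conditional: driver must respond to requests to intervene
    4: "system",  # high: system itself achieves a minimal risk condition
    5: "system",  # full: no human fallback needed under any conditions
}

def must_supervise(level: int) -> bool:
    """True if a human must remain receptive to intervention requests."""
    return FALLBACK_RESPONSIBLE[level] == "driver"

# The L3/L4 boundary is exactly where supervision stops being required:
assert must_supervise(3)
assert not must_supervise(4)
```

The whole argument in this thread about whether Level 4 is "a functional concept" comes down to that one-line difference: at L4 the system, not the human, is the fallback.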

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! https://www.youtube.com/watch?v=BKorP55Aqvg
FAQ731-376: Eng-Tips.com Forum Policies forum1529: Translation Assistance for Engineers Entire Forum list http://www.eng-tips.com/forumlist.cfm

### RE: Tesla Autopilot

BrianPeterson gets it!

### RE: Tesla Autopilot

GregLocock,

The situation is that you build a car. I get in it and head for a crowd of school children. Someone is responsible for ensuring no one gets run over. At L5, there is no question that you are responsible. I have no access to controls. At L3, I am responsible, and my full attention must be on my driving.

If you are responsible for an L4 car, why have controls? A good design approach would be for the steering wheel, accelerator and brakes to be connected to the computer, not the car. This gives the robot the option of ignoring driver inputs. If the car is L5, it would be a good idea to have a plug-in control pack, allowing an authorized human to drive the vehicle if necessary.

I am sitting in your L4 car and I say to my passenger "Hold my beer and watch this." In court afterwards, my lawyer points out that all the evidence against me is from logs generated by software that you wrote. You are not a disinterested party.

If you are building an L4 car, what is your attitude towards the passenger/driver? Somewhere in these levels, in L4 in my opinion, responsibility is transferred from the driver to the manufacturer. This is very much more important than fine technical details.

--
JHG

### RE: Tesla Autopilot

The field of AI has had repeated 'Great Disappointments' over the decades. This is likely to be another example.

Famously, "A.I. is hard." - where "hard" is 'Comp Sci-speak' for exceedingly difficult.

They'll eventually figure out that, "A.I. outdoors is even harder."

### RE: Tesla Autopilot

"If you are responsible for an L4 car, why have controls"

Per SAE's definition, L4 is not 100% capable, AND, at all levels, including L5, the driver still has the right to drive the vehicle themselves:

#### Quote (J3016_201609)

EXAMPLE 1: The person seated in the driver’s seat of a vehicle equipped with a level 4 ADS feature designed to automate high-speed vehicle operation on controlled-access freeways is a passenger while this level 4 feature is engaged. This same person, however, is a driver before engaging this level 4 ADS feature and again after disengaging the feature in order to exit the controlled access freeway.

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! https://www.youtube.com/watch?v=BKorP55Aqvg
FAQ731-376: Eng-Tips.com Forum Policies forum1529: Translation Assistance for Engineers Entire Forum list http://www.eng-tips.com/forumlist.cfm

### RE: Tesla Autopilot

(OP)
From the definition "Driver involvement: In a shared car restricted to a defined area, there may not be any."

Outside of that area, yes, it will be the driver's job to avoid emergencies. Inside that area, it is not. Major OEMs have already said that for L4, while the robot is in control, they will take responsibility for the outcomes. That's why you won't see L3s from major OEMs.

Cheers

Greg Locock

New here? Try reading these, they might help FAQ731-376: Eng-Tips.com Forum Policies http://eng-tips.com/market.cfm?

### RE: Tesla Autopilot

betrueblood
Gaze Detection...I like that. How does it work (do you need a clean windshield, etc.)?

Something much easier: marijuana smoke detection. How about that one? Or a periodic breathalyzer?

### RE: Tesla Autopilot

"Driver involvement: In a shared car restricted to a defined area, there may not be any."

This isn't saying the driver can't be involved inside the defined area.

### RE: Tesla Autopilot

"[Walter Huang's] Tesla with Autopilot engaged accelerated toward a barrier in the final seconds before a deadly crash, an official report into the crash has revealed." The Register

### RE: Tesla Autopilot

A deeper look into the actual NTSB report reveals another point that Tesla will be sure to harp on: the driver set the cruise speed to 75 mph, which is why Autopilot accelerated into the barrier when it perceived that the slower-moving car in front of it was no longer there, because that car had correctly followed the lane.

So far, there have been lots of Tesla accidents that are simply explainable by a functional mode that was completely oversold, misunderstood by its owners, and completely inappropriate as any serious means of aided driving.
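The acceleration in that report is consistent with how a basic adaptive cruise controller picks its target speed: follow the slower lead vehicle while one is perceived, and resume the driver-set speed the moment the lead vehicle disappears, regardless of why it disappeared. A minimal sketch (illustrative only, not Tesla's actual algorithm):

```python
# Illustrative sketch of adaptive-cruise target-speed selection.
# Shows why losing track of a lead vehicle makes the car accelerate
# back toward the driver-set speed. Not Tesla's implementation.

def target_speed(set_speed_mph, lead_vehicle_speed_mph=None):
    """Return the speed the controller aims for.

    If no lead vehicle is perceived (lead_vehicle_speed_mph is None),
    the controller reverts to the driver-set cruise speed, even if the
    "lead vehicle" merely left a gore area the car should not be in.
    """
    if lead_vehicle_speed_mph is None:
        return set_speed_mph  # nothing ahead: resume set speed
    return min(set_speed_mph, lead_vehicle_speed_mph)  # follow slower car

# Following a slower car, then the lead car exits along the lane:
assert target_speed(75, lead_vehicle_speed_mph=60) == 60
assert target_speed(75, lead_vehicle_speed_mph=None) == 75
```

With the set speed at 75 mph, "lead vehicle gone" and "clear road ahead" are indistinguishable to a controller this simple; the barrier was a perception failure layered on top of that.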

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! https://www.youtube.com/watch?v=BKorP55Aqvg
FAQ731-376: Eng-Tips.com Forum Policies forum1529: Translation Assistance for Engineers Entire Forum list http://www.eng-tips.com/forumlist.cfm

### RE: Tesla Autopilot

https://electrek.co/2018/06/10/tesla-version-9-sof...

Maybe this will make it all better.

I think not. The term "self driving features" bugs me. What's the list of "features" that a good human driver has, which permits them to drive (mostly) without crashing? If software is able to implement that list of "features", does that make them a good driver? What happens if one "feature" is missing or isn't fully developed - what are the repercussions?

Is this going to continue to encourage drivers to switch their brains off?

### RE: Tesla Autopilot

Seems to me it's extremely premature to call anything Tesla sells as having "self-driving" features. What Tesla has demonstrated to date is barely Level 2, but marketed as Level 3 or higher.

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! https://www.youtube.com/watch?v=BKorP55Aqvg
FAQ731-376: Eng-Tips.com Forum Policies forum1529: Translation Assistance for Engineers Entire Forum list http://www.eng-tips.com/forumlist.cfm

### RE: Tesla Autopilot

#### Quote:

A deeper look into the actual NTSB report reveals another point that Tesla will be sure to harp on: the driver set the cruise speed to 75 mph, which is why Autopilot accelerated into the barrier when it perceived that the slower-moving car in front of it was no longer there, because that car had correctly followed the lane.

So far, there have been lots of Tesla accidents that are simply explainable by a functional mode that was completely oversold, misunderstood by its owners, and completely inappropriate as any serious means of aided driving.

Obviously, the car wasn't correctly following the lane. It was following the wrong side of a line.

As for your other point: funny how you crapped on me for pointing out that consumers don't understand the levels, or the consumer language converting the levels into a description of what was sold to them, and then you basically bring up my same point again.

### RE: Tesla Autopilot

"Is this going to continue to encourage drivers to switch their brains off?"

Of course it will. In particular, once the car is doing the steering, the human behind the wheel becomes a passenger and cannot be expected to be alert and aware, ready to take over driving on split-second notice.

### RE: Tesla Autopilot

Tesla does not sell their car based on a technical SAE "level;" they sell an "Autopilot." My previous comments refer to the SAE definitions as not being useful for normal consumers to understand. Consumers obviously understand "Autopilot" to be something that allows them to watch videos or nap while driving, which it isn't; again confirming that anything between Level 1 and Level 4 cannot be marketed to an audience that can't understand the technical differences, at least not without a significant increase in education. Levels 2 and 3 are basically TL;DR for the consumer, and they hear only what they want to hear.

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! https://www.youtube.com/watch?v=BKorP55Aqvg
FAQ731-376: Eng-Tips.com Forum Policies forum1529: Translation Assistance for Engineers Entire Forum list http://www.eng-tips.com/forumlist.cfm

### RE: Tesla Autopilot

So you do agree with my previous point then?

### RE: Tesla Autopilot

can someone give a review on the eng-tips smart car white paper advertisement that just hit my inbox.
looks interesting.

### RE: Tesla Autopilot

(OP)
Review: blah blah keyword blah blah meme blah blah tenuous link blah big ad.

I don't know enough about PLM to decide whether it is an important part of AV deployment. This paper didn't convince me that I was missing out on anything. I should add that I only do tiny amounts of work for the Driver Assistance Technology people; most of my interest comes from trajectory estimation, world building, and computer vision. Those are very interesting to me but not even slightly part of my work. I do write software, but it never gets near a production vehicle, so I have no idea how the production code process works for safety-critical items. I have worked on code similar to what would end up in production vehicles, but it wasn't safety-related, and it was a long time ago in dog, or internet, years.

Cheers

Greg Locock

New here? Try reading these, they might help FAQ731-376: Eng-Tips.com Forum Policies http://eng-tips.com/market.cfm?
