Here’s a driving situation that I’m guessing most of us have experienced at one time or another. You are driving along on a highway or freeway, moving at a relatively fast clip (say 60 miles per hour, the prevailing speed, matched by other nearby cars), surrounded by a mild amount of traffic, but nothing so onerous as to bog down the overall flow of vehicles.
A car to your left unexpectedly decides to dart into your lane, cutting down on the distance you had from the car directly ahead of you. It’s one of those situations wherein the driver seems to want to swiftly go across multiple lanes of traffic, perhaps belatedly realizing that there is an off-ramp coming soon that they want to reach, and they had not been astute enough to gradually make their way over to the rightmost lane. The driver then makes another lane change, doing so into the lane to your right, now clearing the path ahead of you. Keep in mind that all of this is happening in a matter of a handful of seconds, everyone moving at 60+ mph during the course of this series of eye-blinkingly brief events.
Upon the driver having shifted into the lane to your right, you can now more readily see what’s ahead of you in your particular lane. Here’s the rub. It turns out that there is a car in your lane, up ahead of you, which has either come to a crawl or might even be entirely halted, possibly stalled on the highway or freeway. Because the other car had momentarily blocked your view, you had not been able to see that this stalled car was a menace-in-waiting to you and your car.
You don’t know whether the driver that had cut in front of you might have seen the stalled car and decided to make a quick escape, or whether they were merely completing their effort to get over into the rightmost lane for purposes of exiting at the next off-ramp. Either way, that driver has left you now holding the bag (a dangerous one, for sure!).
Essentially, you’ve been handed a hot potato. As a recap: you were zooming along in your lane, and there was a car sitting in that lane, motionless, waiting to be smashed into by you, which you had not detected until the last moment, partially due to an interloper.
This is the moment that many drivers hope will never arise, yet it likely happens to many drivers, possibly with some substantial frequency, particularly if you are a daily driver who puts ample mileage on your car while commuting to work (I encounter these kinds of situations about once every two weeks, during my hour-and-a-half commute each way).
Typically, you maneuver out of the situation, often barely, by the skin of your teeth, though with about 6,300,000 car accidents occurring annually in the United States, some proportion of those crashes are undoubtedly due to this kind of inadvertent setup.
In a reported recent incident, a Tesla on Autopilot (according to the driver) rammed into a stalled vehicle on a highway, doing so while the Tesla was moving along at a speed of around 60 mph, and the crash occurred in a manner akin to what I’ve just described as a driving scenario.
According to the driver of the Tesla, another car cut in front of him, staying there fleetingly, then moved rapidly over to the next lane, and within moments it became apparent that a car was stalled up ahead, and he and his Tesla were going to ram right into it, full force. He and his Tesla did so, and luckily he lived to tell the tale.
It might be instructive to consider how this kind of a crash occurred and what it portends for Tesla drivers using Autopilot, along with ramifications for autonomous self-driving driverless cars in general.
Diagnosing What Happens In Stalled Car Crashes
Let’s take out our Sherlock Holmes magnifying glass and try to ferret out the salient characteristics of these kinds of automobile-based death-defying (though sometimes death-resulting) incidents.
As I earlier suggested, sometimes you can maneuver out of the situation, while other times there is not any viable recourse and you get pinned into ramming into the stalled car.
Consider these two key elements:
• The specific context at the moment. The context of the specific driving predicament is a big factor in what will transpire since it determines what options might be viable and which ones are not.
• Driver mindset and actions. The thinking processes and actions of the driver are another crucial consideration for how the circumstance will play out.
If you hit your brakes, the question arises as to whether you can come to a stop in time. Even if you cannot halt soon enough to avoid ramming the stalled car, ratcheting down the speed of your car will reduce the likely danger and resultant damage when you rear-end the other vehicle.
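The value of braking even when a full stop is impossible can be seen with a rough back-of-the-envelope calculation. The figures below are illustrative assumptions, not measured values: roughly 1.5 seconds of human perception-reaction time and hard braking at about 7 m/s² on dry pavement.

```python
# Rough stopping-distance estimate; reaction time and deceleration
# are illustrative assumptions, not measured vehicle data.

MPH_TO_MPS = 0.44704  # miles per hour to meters per second

def stopping_distance_m(speed_mph, reaction_time_s=1.5, decel_mps2=7.0):
    """Distance covered during driver reaction plus hard braking."""
    v = speed_mph * MPH_TO_MPS
    reaction_distance = v * reaction_time_s       # still at full speed
    braking_distance = v ** 2 / (2 * decel_mps2)  # from v^2 = 2*a*d
    return reaction_distance + braking_distance

def impact_speed_mph(speed_mph, gap_m, reaction_time_s=1.5, decel_mps2=7.0):
    """Speed remaining at impact if the gap is too short to stop fully."""
    v = speed_mph * MPH_TO_MPS
    gap_after_reaction = gap_m - v * reaction_time_s
    if gap_after_reaction <= 0:
        return speed_mph  # impact occurs before braking even begins
    v_sq = v ** 2 - 2 * decel_mps2 * gap_after_reaction
    return max(v_sq, 0.0) ** 0.5 / MPH_TO_MPS

print(f"Stopping distance from 60 mph: {stopping_distance_m(60):.0f} m")
print(f"Impact speed with a 60 m gap:  {impact_speed_mph(60, 60):.0f} mph")
```

Under these assumptions, a full stop from 60 mph takes over 90 meters, yet braking within a shorter gap still sheds a meaningful fraction of the impact speed, which is exactly why partial braking matters.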
Beyond dealing with the speed of your car, you might perhaps swerve into another lane, either to your left or to your right, allowing you to avoid the stalled car entirely or maybe only sideswipe it, rather than plowing into it head-on.
Of course, the swerving action might be blocked by other cars that are to your left or right. Or, you might be able to do the swerve, yet other cars in the left or right lanes will then be disrupted by your movement into their lanes, possibly getting them directly involved in the pending crash. This often results in a domino-like cascade of cars hitting each other, as other drivers react to the sudden swerve that you made.
From the driver’s perspective, it is important to consider how much time they had to take a potential avoidance action and whether they were cognitively attuned enough to use that time as best as possible.
In other words, a human driver can be caught off-guard, and even if there was sufficient time to do something, the person might either become mentally confounded or be shocked into a state of being frozen, not sure of what to do, and potentially wasting those precious few seconds when an action might have made a significant difference to the outcome.
Now, let’s add into this scenario the use of automation, focusing on Advanced Driver-Assistance Systems (ADAS), and see how that changes the picture.
Semi-Autonomous Cars And The Co-Shared Driving Task
When reviewing an incident involving ADAS, especially for those cars considered semi-autonomous, meaning they are not yet truly autonomous (not at Level 5; the Tesla Autopilot is considered Level 2), you need to imagine that there are essentially two drivers of the car: the human driver and the semi-autonomous automation.
What did your co-shared semi-autonomous driving “partner” do?
In theory, the Tesla Autopilot should not have been susceptible to the mind-freeze that a human being might have, and would have unemotionally and calmly calculated the matter. This would involve detecting the object ahead, interpreting that the object was not moving, ascertaining that the Tesla was moving toward the stalled object and would intersect (badly) with it, and then trying to figure out what action to take.
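The detect-interpret-decide sequence just described can be sketched in simplified form. To be clear, this is a hypothetical illustration and not Tesla’s actual Autopilot logic; the threshold value and the lane-clearance inputs are assumptions made for the sake of the example.

```python
# Hypothetical sketch of the detect-interpret-decide sequence; this is
# NOT Tesla's actual Autopilot logic, merely an illustrative simplification.

from dataclasses import dataclass

@dataclass
class TrackedObject:
    distance_m: float  # gap to the object ahead
    speed_mps: float   # the object's own speed (0 means stalled)

def choose_action(ego_speed_mps, obj, lane_left_clear, lane_right_clear,
                  ttc_threshold_s=3.0):
    """Pick an avoidance action based on time-to-collision (TTC).

    ttc_threshold_s is an assumed tuning value for this sketch.
    """
    closing_speed = ego_speed_mps - obj.speed_mps
    if closing_speed <= 0:
        return "maintain"                 # not closing in; no threat
    ttc = obj.distance_m / closing_speed  # seconds until impact
    if ttc > ttc_threshold_s:
        return "maintain"
    # Threat detected: prefer an open lane, otherwise brake hard (AEB-style).
    if lane_left_clear:
        return "swerve_left"
    if lane_right_clear:
        return "swerve_right"
    return "emergency_brake"

# A stalled car 50 m ahead while traveling ~27 m/s (about 60 mph),
# with only the right lane clear:
print(choose_action(27.0, TrackedObject(50.0, 0.0), False, True))
```

Even this toy version shows why the questions raised below matter: once the stalled object is detected and the time-to-collision falls under the threshold, the automation has to commit to braking or swerving, and failing to do either is the outcome that needs explaining.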
According to the reported incident, the Autopilot did not engage the brakes. If so, we should be asking: why not? At a minimum, the Automatic Emergency Braking (AEB) should presumably have engaged.
Also, the Autopilot apparently did not try to swerve the car to avoid or reduce the head-on impact, which once again leaves us to ask why it did not do so. Was it because it failed to consider any quick-maneuver options? As a side note, the Tesla manual warns that the Autopilot might not handle these kinds of situations well, though I have more to say about that aspect in a moment.
This incident further raises the question of whether Tesla ought to be using LIDAR (Light Detection and Ranging), a laser-based sensory device used by nearly all other autonomous car makers. Would a LIDAR device have potentially aided in detecting the stalled car? It is possible that a LIDAR unit, especially if positioned on the top of the vehicle, would have had an added chance of detecting the upcoming calamity, providing what I refer to as an essential omnipresence capability for the automation attempting to aid in driving the car.
On another notable facet of the incident, the human driver says that there was insufficient time for him to react.
Let’s make clear that this does not give the Autopilot a freebie by suggesting that it too did not have time to react. The human driver might genuinely believe that he had insufficient time (he might be right, he might be mistaken), but the automation, working at the speed of onboard computers, could potentially have had time to do something. Plus, we’re removing human mental coagulation time from the equation when considering what the Autopilot automation might have had time to do.
There is also a chance that the human driver might have assumed that the Autopilot was going to aid in the driving, doing so at this key or decisive moment, and thus the human driver might have instinctively delayed their own actions, spurred consciously or subconsciously under the assumption that their co-shared automation-based “driver” would come to their rescue.
Tesla typically points out in these kinds of incidents that it is the human driver that is responsible for the car, no matter what the Autopilot automation does or doesn’t do.
That’s a seemingly easy means to swipe away any possible limitations of the automation.
Plus, for human drivers, it is difficult to continually keep the mindset that you are presumably the captain of the ship, retaining ultimate responsibility, which is somewhat mentally undermined when you know that you have your second-in-command, the Autopilot, running things for you, and then, all of a sudden, bam, it turns out that you were supposed to be the one handling the controls (a Catch-22, as it were).
Furthermore, this kind of Human-Machine Interaction (HMI) setup ignores the fact that the human driver might believe that the automation is going to do something, in spite of the human driver having been informed, possibly long ago or via the owner’s manual, that they cannot rely upon the automation. Human nature cannot be so readily overturned by merely telling someone to ignore their instincts or overcome what might have become an ingrained habit.
I have repeatedly forewarned that as we encounter the emergence of Level 2 with ADAS and Level 3 semi-autonomous cars coming into the marketplace, there will be a lot more of these kinds of incidents involving a co-shared human-machine driving effort that inevitably falters or fails to take what might have been suitable action to avoid or reduce a car crash.
Regrettably, get ready for more of this and brace yourself accordingly.