Tuesday, April 6, 2021

Elon Musk: Threat or Menace?

Although both Tesla and SpaceX are major engineering achievements, Elon Musk seems completely unable to understand the concept of externalities, unaccounted-for costs that society bears as a result of these achievements.

First, in Tesla: carbon offsetting, but in reverse, Jaime Powell reacted to Tesla taking the $1.6B in carbon offsets that provided the only profit Tesla has ever made and putting it into Bitcoin:
Looked at differently, a single Bitcoin purchase at a price of ~$50,000 has a carbon footprint of 270 tons, the equivalent of 60 ICE cars.

Tesla’s average selling price in the fourth quarter of 2020? $49,333.

We’re not sure about you, but FT Alphaville is struggling to square the circle of “buy a Tesla with a bitcoin and create the carbon output of 60 internal combustion engine cars” with its legendary environmental ambitions.

Unless, of course, that was never the point in the first place.
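FT's arithmetic is easy to sanity-check. Here is a minimal back-of-the-envelope sketch in Python; the 270-tonne-per-coin footprint and the $49,333 average selling price are from the quote above, while the roughly 4.6 tonnes of CO2 a typical ICE car emits per year is my assumption (the EPA's estimate for an average passenger vehicle), so the "60 ICE cars" comparison is read here as 60 car-years of driving.

# Back-of-the-envelope check of FT Alphaville's comparison.
# The per-coin footprint and Tesla ASP are from the quote; the per-car
# annual emissions figure is an assumed EPA-style estimate, not a number
# from the article.
BITCOIN_FOOTPRINT_TONNES = 270      # CO2 attributed to one ~$50,000 Bitcoin purchase
ICE_CAR_TONNES_PER_YEAR = 4.6       # assumed annual CO2 of a typical ICE car
TESLA_Q4_2020_ASP = 49_333          # Tesla's average selling price, Q4 2020

equivalent_car_years = BITCOIN_FOOTPRINT_TONNES / ICE_CAR_TONNES_PER_YEAR
print(f"One Bitcoin at ~$50,000 buys roughly one Tesla (ASP ${TESLA_Q4_2020_ASP:,})")
print(f"but carries the footprint of ~{equivalent_car_years:.0f} ICE car-years of driving")

Run as written, this prints roughly 59 car-years, which matches the "60 ICE cars" in the quote.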
Below the fold, more externalities Musk is ignoring.

Second, there is Musk's obsession with establishing a colony on Mars. Even assuming SpaceX can stop their Starship second stage from exploding on landing, and do the same with the much bigger first stage, the Mars colony scheme would have massive environmental impacts. Musk envisages a huge fleet of Starships ferrying people and supplies to Mars for between 40 and 100 years. The climate effects of dumping this much rocket exhaust into the upper atmosphere over such a long period would be significant. The idea that a world suffering the catastrophic effects of climate change could sustain such an expensive program over many decades simply for the benefit of a minuscule fraction of the population is laughable.

These externalities lie in the future. But there is a more immediate set of externalities.

Back in 2017 I expressed my skepticism about "Level 5" self-driving cars in Techno-hype part 1, stressing that to get to Level 5, or as Musk calls it "Full Self-Driving", you need to pass through the levels where the software has to hand off to the human. And the closer you get to Level 5, the harder this problem becomes:
Suppose, for the sake of argument, that self-driving cars three times as good as Waymo's are in wide use by normal people. A normal person would encounter a hand-off once in 15,000 miles of driving, or less than once a year. Driving would be something they'd be asked to do maybe 50 times in their life.

Even if, when the hand-off happened, the human was not "climbing into the back seat, climbing out of an open car window, and even smooching" and had full "situational awareness", they would be faced with a situation too complex for the car's software. How likely is it that they would have the skills needed to cope, when the last time they did any driving was over a year ago, and on average they've only driven 25 times in their life? Current testing of self-driving cars hands off to drivers with more than a decade of driving experience, well over 100,000 miles of it. It bears no relationship to the hand-off problem with a mass deployment of self-driving technology.
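The arithmetic behind those numbers is simple enough to lay out. A minimal sketch, assuming an average US driver covers about 13,500 miles a year and drives for about 55 years; both of those figures are illustrative assumptions, and only the 15,000-mile hand-off interval comes from the quote above.

# Hand-off frequency arithmetic. Only MILES_PER_HANDOFF is from the post;
# the annual mileage and driving-lifetime figures are illustrative assumptions.
MILES_PER_HANDOFF = 15_000   # "three times as good as Waymo" in the post
MILES_PER_YEAR = 13_500      # assumed average annual mileage of a US driver
DRIVING_YEARS = 55           # assumed length of a driving lifetime

handoffs_per_year = MILES_PER_YEAR / MILES_PER_HANDOFF
lifetime_handoffs = handoffs_per_year * DRIVING_YEARS
print(f"Hand-offs per year: {handoffs_per_year:.1f}")               # ~0.9, less than one
print(f"Hand-offs in a driving lifetime: {lifetime_handoffs:.0f}")  # ~50

That is, under these assumptions a driver would be asked to take over about 0.9 times a year, or about 50 times in a driving lifetime, which is where the figures in the quote come from.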
Mack Hogan's Tesla's "Full Self Driving" Beta Is Just Laughably Bad and Potentially Dangerous starts:
A beta version of Tesla's "Full Self Driving" Autopilot update has begun rolling out to certain users. And man, if you thought "Full Self Driving" was even close to a reality, this video of the system in action will certainly relieve you of that notion. It is perhaps the best comprehensive video at illustrating just how morally dubious, technologically limited, and potentially dangerous Autopilot's "Full Self Driving" beta program is.
Hogan sums up the lesson of the video:
Tesla's software clearly does a decent job of identifying cars, stop signs, pedestrians, bikes, traffic lights, and other basic obstacles. Yet to think this constitutes anything close to "full self-driving" is ludicrous. There's nothing wrong with having limited capabilities, but Tesla stands alone in its inability to acknowledge its own shortcomings.
Hogan goes on to point out the externalities:
When technology is immature, the natural reaction is to continue working on it until it's ironed out. Tesla has opted against that strategy here, instead choosing to sell software it knows is incomplete, charging a substantial premium, and hoping that those who buy it have the nuanced, advanced understanding of its limitations—and the ability and responsibility to jump in and save it when it inevitably gets baffled. In short, every Tesla owner who purchases "Full Self-Driving" is serving as an unpaid safety supervisor, conducting research on Tesla's behalf. Perhaps more damning, the company takes no responsibility for its actions and leaves it up to driver discretion to decide when and where to test it out.

That leads to videos like this, where early adopters carry out uncontrolled tests on city streets, with pedestrians, cyclists, and other drivers unaware that they're part of the experiment. If even one of those Tesla drivers slips up, the consequences can be deadly.
Of course, the drivers are only human, so they do slip up:
the Tesla arrives at an intersection where it has a stop sign and cross traffic doesn't. It proceeds with two cars incoming, the first car narrowly passing the car's front bumper and the trailing car braking to avoid T-boning the Model 3. It is absolutely unbelievable and indefensible that the driver, who is supposed to be monitoring the car to ensure safe operation, did not intervene there.
An example of the kinds of problems that can be caused by autonomous vehicles behaving in ways that humans don't expect is reported by Timothy B. Lee in Fender bender in Arizona illustrates Waymo’s commercialization challenge:
A white Waymo minivan was traveling westbound in the middle of three westbound lanes on Chandler Boulevard, in autonomous mode, when it unexpectedly braked for no reason. A Waymo backup driver behind the wheel at the time told Chandler police that "all of a sudden the vehicle began to stop and gave a code to the effect of 'stop recommended' and came to a sudden stop without warning."

A red Chevrolet Silverado pickup behind the vehicle swerved to the right but clipped its back panel, causing minor damage. Nobody was hurt.
The Tesla in the video made a similar unexpected stop. Lee stresses that Waymo's responsible test program, unlike Tesla's, has resulted in a generally safe product, but not one that is demonstrably safe enough:
Waymo has racked up more than 20 million testing miles in Arizona, California, and other states. This is far more than any human being will drive in a lifetime. Waymo's vehicles have been involved in a relatively small number of crashes. These crashes have been overwhelmingly minor with no fatalities and few if any serious injuries. Waymo says that a large majority of those crashes have been the fault of the other driver. So it's very possible that Waymo's self-driving software is significantly safer than a human driver.
...
The more serious problem for Waymo is that the company can't be sure that the idiosyncrasies of its self-driving software won't contribute to a more serious crash in the future. Human drivers cause a fatality about once every 100 million miles of driving—far more miles than Waymo has tested so far. If Waymo scaled up rapidly, it would be taking a risk that an unnoticed flaw in Waymo's programming could lead to someone getting killed.
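Lee's caution about what 20 million miles can and cannot show is worth making concrete. A minimal sketch, assuming fatal crashes follow a Poisson process at the human baseline rate; the Poisson model is my assumption for illustration, while the one-per-100-million-miles rate and the 20 million test miles are from the quote above.

import math

# How informative are 20 million fatality-free miles? Assume fatal crashes
# follow a Poisson process at the human baseline rate (the model is an
# illustrative assumption; the rates are from the quoted article).
HUMAN_FATALITY_RATE = 1 / 100_000_000   # fatalities per mile for human drivers
WAYMO_TEST_MILES = 20_000_000           # Waymo's reported test mileage

expected_fatalities = HUMAN_FATALITY_RATE * WAYMO_TEST_MILES
p_zero_fatalities = math.exp(-expected_fatalities)
print(f"Expected fatalities at human-level risk: {expected_fatalities:.2f}")  # 0.20
print(f"Chance of a clean record anyway: {p_zero_fatalities:.0%}")            # ~82%

In other words, even a fleet no safer than a human driver would have roughly an 82% chance of getting through 20 million miles without a fatality, which is why the test record, however good, cannot yet rule out a rare but deadly flaw.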
I'm a pedestrian, cyclist and driver in an area infested with Teslas owned, but potentially not actually being driven, by fanatical early adopters and members of the cult of Musk. I'm personally at risk from these people believing that what they paid good money for was "Full Self Driving". When SpaceX tests Starship at their Boca Chica site they take precautions, including road closures, to ensure innocent bystanders aren't at risk from the rain of debris when things go wrong. Tesla, not so much.

Of course, Tesla doesn't tell the regulators that what the cult members paid for was "Full Self Driving"; that might cause legal problems. As Timothy B. Lee reports in Tesla: “Full self-driving beta” isn’t designed for full self-driving:
"Despite the "full self-driving" name, Tesla admitted it doesn't consider the current beta software suitable for fully driverless operation. The company said it wouldn't start testing "true autonomous features" until some unspecified point in the future.
...
Tesla added that "we do not expect significant enhancements" that would "shift the responsibility for the entire dynamic driving task to the system." The system "will continue to be an SAE Level 2, advanced driver-assistance feature."

SAE level 2 is industry jargon for driver-assistance systems that perform functions like lane-keeping and adaptive cruise control. By definition, level 2 systems require continual human oversight. Fully driverless systems—like the taxi service Waymo is operating in the Phoenix area—are considered level 4 systems.
There is an urgent need for regulators to step up and stop this dangerous madness:
  • The NHTSA should force Tesla to disable "Full Self Driving" in all its vehicles until the technology has passed an approved test program.
  • Any vehicles taking part in such a test program on public roads should be clearly distinguishable from Teslas being driven by actual humans, for example with orange flashing lights. Self-driving test vehicles from less irresponsible companies such as Waymo are distinguishable in this way; Teslas in which some cult member has turned on "Full Self Driving Beta" are not.
  • The FTC should force Tesla to refund, with interest, every dollar paid by their customers under the false pretense that they were paying for "Full Self Driving".

5 comments:

David. said...

Aaron Gordon's This Is the Most Embarrassing News Clip in American Transportation History is a brutal takedown of yet another of Elon Musk's fantasies:

"Last night, Shepard Smith ran a segment on his CNBC show revealing Elon Musk's Boring Campany's new Las Vegas car tunnel, which was paid for by $50 million in taxpayer dollars. It is one of the most bizarre and embarrassing television segments in American transportation history, a perfect cap for one of the most bizarre and embarrassing transportation projects in American history."

David. said...

Eric Berger's A new documentary highlights the visionary behind space settlement reviews The High Frontier: The Untold Story of Gerard K. O'Neill:

"O'Neill popularized the idea of not just settling space, but of doing so in free space rather than on the surface of other planets or moons. His ideas spread through the space-enthusiast community at a time when NASA was about to debut its space shuttle, which first flew in 1981. NASA had sold the vehicle as offering frequent, low-cost access to space. It was the kind of transportation system that allowed visionaries like O'Neill to think about what humans could do in space if getting there were cheaper.

The concept of "O'Neill cylinders" began with a question he posed to his physics classes at Princeton: "Is a planetary surface the right place for an expanding industrial civilization?" As it turned out, following their analysis, the answer was no. Eventually, O'Neill and his students came to the idea of free-floating, rotating, cylindrical space colonies that could have access to ample solar energy."

However attractive the concept is in the far future, I need to point out that pursuing it before the climate crisis has been satisfactorily resolved will make the lives of the vast majority of humanity worse for the benefit of a tiny minority.

David. said...

‘No one was driving the car’: 2 men dead after fiery Tesla crash in Spring, officials say:

"Harris County Precinct 4 Constable Mark Herman told KPRC 2 that the investigation showed “no one was driving” the fully-electric 2019 Tesla when the accident happened. There was a person in the passenger seat of the front of the car and in the rear passenger seat of the car."

David. said...

Timothy B. Lee's Consumer Reports shows Tesla Autopilot works with no one in the driver’s seat reports:

"Tesla defenders also insisted that Autopilot couldn't have been active because the technology doesn't operate unless someone is in the driver's seat. Consumer Reports decided to test this latter claim by seeing if it could get Autopilot to activate without anyone in the driver's seat.

It turned out not to be very difficult.

Sitting in the driver's seat, Consumer Reports' Jake Fisher enabled Autopilot and then used the speed dial on the steering wheel to bring the car to a stop. He then placed a weighted chain on the steering wheel (to simulate pressure from a driver's hands) and hopped into the passenger seat. From there, he could reach over and increase the speed using the speed dial.

Autopilot won't function unless the driver's seatbelt is buckled, but it was also easy to defeat this check by threading the seatbelt behind the driver.
...
the investigation makes clear that activating Autopilot without being in the driver's seat requires deliberately disabling safety measures. Fisher had to buckle the seatbelt behind himself, put a weight on the steering wheel, and crawl over to the passenger seat without opening any doors. Anybody who does that knows exactly what they're doing. Tesla fans argue that people who deliberately bypass safety measures like this have only themselves to blame if it leads to a deadly crash."

Well, yes, but Musk's BS has been convincing them to try stunts like this for years. He has to be held responsible, and he has to disable "Full Self Driving" before some innocent bystanders get killed.

David. said...

This Automotive News editorial is right but misses the bigger picture:

"Tesla's years of misleading consumers about its vehicles' "full self-driving" capabilities — or lack thereof — claimed two more lives this month.
...
When critics say the term "autopilot" gives the impression that the car can drive without oversight, Tesla likes to argue that that's based on an erroneous understanding of airplanes' systems. But the company exploits consumers' overconfidence in that label with the way the feature is sold and promoted without correction among Tesla's fanatical online community. Those practices encourage misunderstanding and misuse.

In public, Musk says the company is very close to full SAE Level 5 automated driving. In conversations with regulators, the company admits that Autopilot and Full Self-Driving are Level 2 driver-assist suites, not unlike those sold by many other automakers.

This nation does not have a good track record of holding manufacturers accountable when their products are misused by the public, which is what happened in this case."

It isn't just the Darwin Award winners who are at risk; innocent bystanders are at risk too.