How Data Fails Humans, or the results of a Tesla road test

I see a lot of experts and massive businesses investing in and committing to the concept of “big data”. It’s spoken about in grand terms. We talk as though the solution to all of our problems is simply hiding in plain view, waiting to be recorded and processed to give us “the answer”. For the most part I agree: if we were to gather enough data and had the tools to analyze it all, we could have a near-perfect view of an experience. But I also fear that we’re putting too much trust in data. Recordings of speed, location, time of engagement, rate of response, and so on capture only a narrow slice of what happened, and adding more data points won’t always solve the big issue.

The problem I foresee is twofold. First, data isn’t perfect. Anyone who’s ever gathered data on a subject knows it isn’t always easy to get a clear view of what’s happening from the data you receive. You have to analyze the data to determine the story.

This leads to the second issue: analysis. Humans are impulsive, emotional creatures, which means our interpretation of anything is colored by our own personal biases. There’s no way around it aside from involving many disconnected people and having them interpret the data independently of one another. The beauty of being human is that we fail at things.

Nothing supports this argument more clearly than the recent, very public tussle between Elon Musk (@elonmusk) and New York Times writer John Broder (@nytimeswheels). Mr. Broder recently took the Tesla Model S on a test drive up the East Coast. On his second day of driving he ran out of charge and had to have the car towed to a charging station. The short version is that cold weather (it was the coldest day of the year) drained the car’s charge as it sat idle overnight.

Elon Musk, Tesla Motors’ CEO, responded to the article with his own analysis of Mr. Broder’s drive, reviewing all the data the car gathered along the way: speed, charge, estimated remaining mileage, even the temperature of the heating system. What’s interesting here is that the data seems to show Mr. Broder acting deliberately to make the car stall. Mr. Musk claims:

… the display said “0 miles remaining.” Instead of plugging in the car, he drove in circles for over half a mile in a tiny, 100-space parking lot. When the Model S valiantly refused to die, he eventually plugged it in.

Mr. Musk is using the speed and gear data to “prove” this point. But, in fact, this is a prime example of how data can tell several different stories, leaving our biases to fill in the blanks.

I have to say here that I’m a huge fan of the Model S; I haven’t wanted a car so badly in a long time.

Back to the story. Musk is looking at a set of data that tells him the car has no charge yet continues to operate. His interpretation is to bestow heroism on the car. Mr. Broder responds with a completely plausible and normal human explanation… he couldn’t find the charging station because it was dark.

While Mr. Musk has accused me of doing this to drain the battery, I was in fact driving around the Milford service plaza on Interstate 95, in the dark, trying to find the unlighted and poorly marked Tesla Supercharger. 

I harp on this particular piece of data because, for me, it’s the one that’s been interpreted the most by both Musk and Broder. Musk paints the image of the Trickster Broder hunched over the steering wheel, maniacally laughing as he lurches the car forward and backward, all the while Musk’s poor Tesla struggling to keep a charge under persistent abuse, as though the car were Dustin Hoffman in Marathon Man and Broder the evil doctor/torturer yelling “Is it safe?”.

Broder, meanwhile, paints the image of a stressed, sad, probably cold man simply trying desperately to find a station while the damned machine beeps warnings of a dying battery.

In the end, there’s no way Musk can be certain he’s correct, because he wasn’t there. He also never addresses the loss of charge overnight. Yet there’s little chance Broder was the innocent victim of technology; he has voiced concerns about electric vehicles in the past. This is where data gets us into so many messes.

If we claim that data is the salvation to all our problems, then Tesla should simply start gathering more data. What time of day is it? What’s the temperature? Can we add a function that lets the customer send us GPS tracking? Could we set up cameras inside and outside of the car to record everything that’s happening, then process it to determine driver stress and environmental conditions? Yes, they can do all of that. But this is data tracking a human experience that’s then interpreted by humans. It will always be fallible because that’s what we are.

Data is meant to support skill, knowledge, and experience, not replace them.

In a follow-up article, New York Times public editor Margaret Sullivan contributes a good, measured perspective: Musk’s review isn’t wholly accurate because he wasn’t in the car at the time, and Broder seems to have missed a few key adjustments that would have saved the car from stalling.

She includes a comment from a Tesla driver calling out the settings that Broder failed to use:

But, if he had taken time to read the owner’s manual beforehand (which, at 30-or-so well-written pages, would have taken an hour), he would have known about:

• “The ‘Max Range’ setting, which would have charged the battery beyond the ‘standard’ range and given him 20-30 miles more range;

• “The ‘Range Mode’ setting, which would have conserved battery during the drive;

• “The section entitled ‘Driving Tips for Maximum Range’;

• “And, the concept of plugging the vehicle in (especially during his overnight stop): ‘Tesla strongly recommends leaving Model S plugged in when not in use.’ and ‘The most important way to preserve the Battery is to LEAVE YOUR MODEL S PLUGGED IN when you’re not using it.’

What can we learn from this?

We can learn that the recording of data can be skewed not only by the human experience but by the complexity and “newness” of that experience. I’m certain that, this being his first time in a Model S, Mr. Broder was unaware of the full effect of these settings. I’d also give him the benefit of the doubt about reading the manual, as I wouldn’t have read it either.

It’s likely that his inexperience with a new vehicle, full of unfamiliar options and settings, under frustrating conditions, compounded the poor decisions and caused the car to stall. If you’re a data scientist and you’re reading this (how did you find this post?), then you should start thinking about how to record human data, not just the data from the vehicle.
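To make that idea concrete, here’s a minimal, purely hypothetical sketch in Python. Every field name below is invented for illustration; none of it reflects Tesla’s actual logging. The point is simply that a log entry could pair the machine’s signals with the human context that the machine can’t see:

    from dataclasses import dataclass
    from datetime import datetime
    from typing import Optional

    @dataclass
    class VehicleSignals:
        """The kind of machine data Musk cited: speed, charge, range, temperature."""
        speed_mph: float
        charge_pct: float
        est_range_miles: float
        cabin_temp_f: float

    @dataclass
    class HumanContext:
        """The context the machine can't record on its own (all hypothetical fields)."""
        is_dark: bool                      # was the driver searching in the dark?
        first_drive: bool                  # first time in this vehicle?
        driver_note: Optional[str] = None  # self-reported intent

    @dataclass
    class DriveLogEntry:
        timestamp: datetime
        vehicle: VehicleSignals
        human: HumanContext

    # The vehicle data alone shows half a mile of low-speed circling; only the
    # human context can distinguish "draining the battery on purpose" from
    # "lost in a dark parking lot, looking for an unlit Supercharger."
    entry = DriveLogEntry(
        timestamp=datetime.now(),
        vehicle=VehicleSignals(speed_mph=5.0, charge_pct=1.0,
                               est_range_miles=0.0, cabin_temp_f=64.0),
        human=HumanContext(is_dark=True, first_drive=True,
                           driver_note="circling the plaza, can't find the charger"),
    )

With both halves recorded, the two competing stories above would not have to be filled in by anyone’s biases.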

It’s entirely possible that when using a new piece of technology for the first time, the user will make mistakes. If you’re talking about a car, the potential impact is much higher than if you’re talking about an app. Data, as captured from a device and analyzed by a person, can’t completely record all of the events happening with the user.

Because of this we should not fault users for their poor decisions or for not reading the “manual”. Mr. Broder was presented with a “car”, and he treated it as such. A lot of money has been put into making the Model S look and feel like a car. But it’s not the same vehicle. Cars feel different, and the interface is different as well (most cars don’t have a 17” iPad in the dash).

The data recorded by the “electric vehicle” represents an entirely different experience from the one Mr. Broder had, even though he was there.

In the end, Musk has lost some credibility for his very human analysis of non-human data, while Mr. Broder has lost some for simply being human and not recording his experience better.

It’s time we start accepting human error into data sets, and it’s time we start treating customers as what they really are: people.