On metrics and honesty

You gotta be honest about your metrics

Was bored at work yesterday before a full staff meeting and hopped onto #TChat for the first time ever, which is basically a Twitter chat about people, talent strategy, people analytics, HR metrics, and all that. I’m interested in that stuff, which makes me a weird, fringe-y member of the overall population. Hey — it’s all about chasing down your own personal dream, no? So I’m on TChat, throwing grenades like this puppy:

… and as a result, this whole area is something I’ve been thinking about for much of the last 24 hours … all the way up to therapy this morning.

Let me back up one step. I write about this kind of stuff all the time — basically like, everyone’s chasing the idea of Big Data and analytics, but they’re chasing it wrong. They forget that it means we have to teach it better, we have to hire people with new skill sets, we have to understand the difference between “synthesis” and “analysis,” and we have to get executives to a place where data is accepted and it’s not just about their “gut feel.”  That’s a lot of steps. Work doesn’t always flow in such a narrative fashion — most people are all about those daily deliverables, baby!

So here’s what I’m talking about in therapy, OK … (well, part of it) … you ever sit in a business meeting where someone is totally afraid to admit that an idea or initiative failed — no one likes to discuss failure at work — so instead they couch it and roll out another metric that seems impressive? Here’s an example:

Person: Well, Q3 was a bit below expectations, but … (pause) we did register 4.5 million media impressions off our mention in a Fortune article!

That sentence above means absolutely nothing. It basically means “We didn’t do the main thing we were supposed to, but here’s a thing that sounds good that we did do!” Here’s the way you paint by numbers around it:

  • Fail
  • Replace failure with metric of your choosing

And now we come to a fork in the road, OK?

The only way — like, the only way — that metrics and analytics and Big Data can ever work for a company is if there’s some honesty around them. This goes back to the “gut feel” idea above. Everyone has gut feelings and beliefs. “No one visits that page on our website!” or “That store’s sales are the best!”

So what if you believe that, right? And then … you start getting a little more in bed with data and analytics, and what you believed turns out to be entirely wrong? What happens then?

Are you honest about that, and consider it a first step on a potential new course?

Or do you assume the data is wrong, the data “isn’t clean,” or something else entirely and stick to the core beliefs you had the first time around?
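That fork can be made concrete with a toy check. Everything here is invented for illustration: the page names, the visit counts, and the `belief_holds` helper are all hypothetical, just a sketch of what testing a gut-feel belief against actual numbers looks like.

```python
# Toy sketch: a gut-feel belief ("no one visits that page") checked
# against hypothetical pageview data. All numbers are invented.

monthly_visits = {"/pricing": 4812, "/blog": 1233, "/careers": 87}

def belief_holds(page, visits, nobody_threshold=100):
    """True if the data is consistent with 'no one visits this page'."""
    return visits.get(page, 0) < nobody_threshold

# Gut feel: "No one visits /pricing!"
print(belief_holds("/pricing", monthly_visits))  # False: the data disagrees
```

The interesting part isn’t the code, it’s what you do when it prints `False`: treat it as the first step on a new course, or decide the data “isn’t clean.”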

If you want to “compete on analytics,” you know the first thing you need to do?

  • Scrub and standardize your data

You know the second thing you need to do?

  • Scrub and standardize the biases of your people

Without that 1-2 punch, it will never work. It will devolve into “Well, the data’s wrong!” or “The course is this, not what the info says!”
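The first half of that 1-2 punch — scrubbing and standardizing the data itself — can be sketched in a few lines. This is a minimal, hypothetical example (the store names, record format, and `scrub_records` helper are all invented here), just to show the flavor of the work: normalizing names, coercing messy numbers, dropping duplicates.

```python
# Minimal sketch of "scrub and standardize your data" on invented
# store-sales records. Record shape and values are hypothetical.

def scrub_records(records):
    """Normalize messy sales records: trim and case-fold store names,
    coerce sales figures to floats, and drop duplicate rows."""
    seen = set()
    cleaned = []
    for rec in records:
        store = rec["store"].strip().title()  # "  acme DOWNTOWN " -> "Acme Downtown"
        sales = float(str(rec["sales"]).replace("$", "").replace(",", ""))
        key = (store, rec["quarter"], sales)
        if key in seen:          # same row entered twice in two formats
            continue
        seen.add(key)
        cleaned.append({"store": store, "quarter": rec["quarter"], "sales": sales})
    return cleaned

raw = [
    {"store": "  acme DOWNTOWN ", "quarter": "Q3", "sales": "$1,200.50"},
    {"store": "Acme Downtown",    "quarter": "Q3", "sales": 1200.50},
    {"store": "acme uptown",      "quarter": "Q3", "sales": "980"},
]
print(scrub_records(raw))  # two rows survive; the duplicate is dropped
```

The second half of the punch — the biases of your people — doesn’t fit in a function, which is exactly the point of the post.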

People think this kind of stuff is all about IT and databases. Yep, that’s part of it. But it’s much more about people and beliefs and the ability to change perceptions and be honest with yourself about how things are going. Until we embrace that part, it’s never going to be a huge thing for any company.

Ted Bauer


  1. So true. Testing our assumptions is the only way to drive innovation. If you won’t allow the data to help you decide on a new course, then you are destined for repeated failure. Live, learn, repeat.

    • It’s far easier to silo data than it is to integrate it, which is why a lot of organizations start out on that course and end up integrating later. The problem with that is one of how the people involved view the data … data from one area of the company often doesn’t jibe with data from another part.

      The solution, of course, is to ask and understand why, but often folks simply choose to disagree rather than dive in together to resolve perception problems … this is where understanding is invaluable to the process. We are still a long way from computers coming close to telling you what the data means, and so it’s left to people to interpret and manage it collaboratively.

  2. True, but hard to do. There’s no way, in the real world, to conduct controlled experiments with the kind of big data you’re talking about. Basically you need some kind of framework to look at the data through, be it a mathematical model or a logical one, and base your testing on that model’s predictive power. The problem will be gut feel, though. People are inherently irrational: no matter how much data and actual trial runs and tests you give them showing they are wrong, they will most often fail to admit it or change their viewpoint. In fact, they often dig in deeper.

    Big data will, in my opinion, simply be used by people in charge to cherry-pick from, either to justify their own views or to refute those of others. Until a few firms start using it successfully, following the data as opposed to their biases and gut feel, all it will do is facilitate making the same bad decisions faster and more confidently.
