
Measuring the Effectiveness of Data Analytics

POSTED 11/04/2015
By Susan Hall

Many organizations are struggling to harness all the data they collect. But figuring out how to measure the effectiveness of the data wranglers those companies are hiring presents a greater challenge.

“It’s tough in any knowledge-worker role,” says Greta Roberts, CEO and co-founder of Talent Analytics, a company that applies analytics to predict pre-hire employee performance.

“If you have a sales rep role or a truck driver, you can say performance is based on how many pieces you process, how many calls you take, or how many sales you make. But in a knowledge-worker role, particularly with data science, what do you measure? There is no quantitative measure, which is the ironic thing. It’s the most quantitative role possible.”

Developing KPIs

At the same time, whether tracking employee or departmental performance, it’s vital to tie performance to business goals. That can mean the difference between asking the right questions, the kind that lead to insights that drive business results, and simply seizing on some anomaly and running with it.

It means developing performance measures unique to your own organizational goals. In health care, that might mean reducing 30-day readmission rates or getting more patients with high blood pressure to their treatment goals.

Different industries take different approaches to measurement. The International Association of Crime Analysts, for example, suggests one metric might be requested vs. self-initiated activity.

Consider quality over quantity when deciding what data to collect and measure; just because something can be counted doesn’t mean it should be.

In setting targets for analytics staff, you should measure long-term improvement in key statistics, according to Guy Smith, marketing director for Fauna & Flora International, a conservation charity.

Understanding which key performance indicators (KPIs) matter to your organization takes research and experience, and that means experimentation, he says.
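
To make that concrete, here is a minimal Python sketch of one way to track long-term improvement in a statistic over time: fitting a simple trend line to monthly values of a single KPI. The KPI name and figures are hypothetical, invented purely for illustration.

    # Minimal sketch: estimate the long-term trend of one KPI from monthly
    # values by fitting a least-squares line. All numbers are hypothetical.
    from statistics import mean

    def kpi_trend(monthly_values):
        """Average month-over-month change (slope of a least-squares line)."""
        xs = range(len(monthly_values))
        x_bar, y_bar = mean(xs), mean(monthly_values)
        num = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, monthly_values))
        den = sum((x - x_bar) ** 2 for x in xs)
        return num / den

    # Hypothetical KPI: percentage of analytics reports acted on each month.
    reports_acted_on = [38, 41, 40, 44, 47, 45, 49, 52]
    print(f"Trend: {kpi_trend(reports_acted_on):+.2f} points per month")

Setting targets on the slope rather than on any single month rewards sustained improvement and tolerates the occasional dip, which fits the experimentation Smith describes.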

Two I’s and two E’s

Cardinal Path, which helps companies make the most of their digital assets, has been building up its analytics staff with in-house training. New hires get some formal training on analytics tools but also work side by side with more experienced data pros.

Co-founder and senior partner Alex Langshur considers those efforts a success when the new hires are able to work more independently on projects, contribute rather than just follow, and, in some cases, lead rather than just take direction.

In evaluating its data pros, the company has specific performance categories and indicators. The broad categories, he explains, include two I’s and two E’s:

  • Innovation: Do they show innovative approaches to tough challenges? Are they looking to innovate the work they do? Are they seeking new opportunities based on what they’ve learned?
  • Initiative: Do they have a bias for action? Do they “lean forward” and engage? Do they put their hand up to offer/ask for help?
  • Execution: Can they do the work? Do they have the ability and skills to act in a sustained fashion?
  • Excellence: Do they try to execute to the best of their abilities? Are they committed to delivering excellence?

On a more specific level, Cardinal Path looks at:

Technical skills: Have they been keeping current? Are they using best practices and remaining at the forefront of developments in the field? Are they seeking to close their knowledge gaps? Are they mentoring and sharing their skills with others?

Customer satisfaction: Are clients happy with their performance? Look at the technical quality of the solution, responsiveness to requests, degree of engagement and support, and overall client skills.

Alignment to corporate purpose: Do they adhere to company values and goals?

Communications: How well do they communicate with clients, colleagues, and superiors?

Other factors: Are they comfortable with ambiguity? Do they show leadership potential? Are they seeking a career track or do they want to be individual contributors?

“In a competitive and extremely fast-moving market, service companies are distinguished by the ability of their team to deliver with excellence, often in very challenging environments — high pressure, quick turnarounds, lots of moving parts, fuzzy mandates… So we’ve tried to look at broad metrics that address both the ‘hard’ and ‘soft’ skills, and give equal value to both,” he says.

What to Measure?

Roberts targeted the heart of performance metrics in research she did with the International Institute for Analytics on the characteristics of data professionals. Respondents were asked to rate themselves in areas such as:

  • How often they delivered their projects on time
  • The percentage of time their analytics models were used
  • How they think their customers would rate them in terms of satisfaction, communication, etc.

“Our hypothesis was that everyone would rate themselves highly – and they did – making the data useless,” she said.

But you can still measure things, she says.

“There really isn’t just one data science role or one marketing role. It encompasses everything – there are data-preparation people and programming people. If they’re the data prep people, how long did it take them?

“The other thing: Are their projects exploratory and research-focused or do they need to have a commercial result?”

She suggests possible metrics such as the following, with a brief illustrative calculation after the list:

  • How much money the project saved the business
  • How much money the project made the business
  • How well they stuck to delivery timelines
  • How well they learn on their own
  • How well they help a customer form a problem vs. just solving the problem given
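
As an illustration only, the short Python sketch below shows how the more countable of these metrics, money saved or made and on-time delivery, could be rolled up per analyst from simple project records. The record fields and figures are assumptions, not anything Roberts prescribes.

    # Illustrative sketch: per-analyst rollup of money saved/made and the
    # share of projects delivered on time. Records and fields are invented.
    from dataclasses import dataclass

    @dataclass
    class Project:
        analyst: str
        savings: float   # money the project saved the business
        revenue: float   # money the project made the business
        days_late: int   # <= 0 means delivered on or ahead of schedule

    def scorecard(projects):
        totals = {}
        for p in projects:
            s = totals.setdefault(p.analyst,
                                  {"saved": 0.0, "made": 0.0, "on_time": 0, "n": 0})
            s["saved"] += p.savings
            s["made"] += p.revenue
            s["n"] += 1
            s["on_time"] += p.days_late <= 0
        return {a: {**s, "on_time_rate": s["on_time"] / s["n"]}
                for a, s in totals.items()}

    projects = [
        Project("Ana", savings=120_000, revenue=0, days_late=-2),
        Project("Ana", savings=0, revenue=80_000, days_late=5),
        Project("Ben", savings=40_000, revenue=15_000, days_late=0),
    ]
    for analyst, stats in scorecard(projects).items():
        print(analyst, stats)

The softer items on the list, such as how well someone helps a customer frame a problem, resist this kind of rollup, which is Roberts’ larger point: the metrics have to fit the role and the project.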

“It really depends on the role and the project,” she says. “You don’t want to put too much focus on ‘You’ve got to deliver every time on time,’ because part of being a data scientist is finding something weird and going back and exploring it.

“But people hire data scientists to use the data to deliver results for the business, so it’s a bit of a quandary,” she says. “You don’t want so much of an open field, without any measures, that it’s a lot of fun and it’s interesting but it doesn’t deliver results.”
