To be “enterprising” is to be eager to undertake or to attempt; to show initiative and be resourceful. These are leadership traits, so to be enterprising is to lead. ‘Analytics’ is how we use data to inform decision-making in the context of achieving business objectives. These are management practices, so analytics is about management.
“Enterprising analytics” is about being creative, resourceful and adventurous with decision-making to achieve business objectives. It is about the set of leadership and management practices that need to be in place for an organization to make the most of its analytics investment.
Throwing facts together
There are no facts about the future. There are only facts about the past. The only thing we know about the future is conjecture. Conjecture means “an opinion or conclusion formed on the basis of incomplete information.” The Latin root of conjecture meant to “throw facts together.” When the word entered English it meant “interpret signs and omens”, which isn’t how most business leaders want to make strategy.
But, unfortunately, it’s how most of us do our strategy work in general, let alone our data strategy. We throw facts together and interpret omens in elaborate rituals (i.e. business meetings). Business leaders are often sold on the merits of predictive analytics as mitigation for this problem. But predictive analytics can simply mean moving from reading your daily stars to calculating your natal birth chart. Applying more rigor to a flawed technique doesn’t improve the technique.
Much of what you’ll end up with is an illusion of control: the belief that we can predict or influence events that are, in fact, beyond us. Anthropologists have long studied our tendency to indulge in illusions of control (i.e. superstitions) when confronted with increasing uncertainty. Being sold the panacea of prediction is a serious risk for the modern business executive, especially when it’s seasoned with magical phrases such as ‘AI’ or ‘big data’. The unwary, unskilled or overconfident are best advised not to get too deeply involved in prediction.
The triumph of hope over evidence
If you work in data strategy and you’re using deterministic methods, a large proportion of your work will likely consist of facts thrown together. Or, as the statistics pejorative has it, ‘some data we found lying in the street’. If you’re not using probabilities to model critical elements, your data strategy has a decreasing likelihood of being effective beyond a couple of financial quarters. Instead of a strategy that can be realized, you’re likely to have an elegant story that fits your preconceptions of what the future should look like.
In other words, many of our strategies will be based on wishful thinking. As Howard Wainer wrote in Truth or Truthiness, we are all prone to “the most natural of human tendencies, the triumph of hope over evidence.” This tendency isn’t new, as Shakespeare reminded us in King Henry IV, Part Two: “Thy wish was father, Harry, to that thought.” What we believe is strongly influenced by what we hope for.
Daniel Kahneman and Amos Tversky have helped build an entire cognitive framework around our preference for coherence and avoidance of information that challenges our beliefs. Add in group dynamics like social proof and groupthink, and we quickly get to a point where our strategies become a set of political slogans. Something everyone can agree with quickly finds the lowest common denominator. And once there, your data analytics strategy is dead.
Learn to think probabilistically
So, if you want your data strategy to hold up, you’ll need to start learning how to think probabilistically. A good place to start is by reflecting on the Flaw of Averages: plans based on average inputs are wrong on average. In other words, if you’re building a forecast from five uncertain inputs and you plug in the average of each, your aggregate forecast will itself be wrong on average. Read Sam Savage’s The Flaw of Averages for more on this.
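The Flaw of Averages can be seen in a few lines of simulation. This is a minimal sketch with entirely hypothetical numbers: a business with a fixed supply capacity facing uncertain demand. Planning on the average demand overstates profit, because the upside is capped at capacity while the downside is not.

```python
import random

random.seed(42)

CAPACITY = 100          # units we can supply (hypothetical)
UNIT_PROFIT = 10.0      # profit per unit sold (hypothetical)

def sales(demand):
    # We can never sell more than capacity, however high demand goes.
    return min(demand, CAPACITY)

# Demand is uncertain: uniform between 50 and 150 units, so its
# average is exactly 100 -- right at capacity.
# A plan built on that *average* input predicts a profit of 1000.
plan_profit = sales(100) * UNIT_PROFIT

# Simulate the actual distribution of demand instead.
trials = 100_000
avg_profit = sum(
    sales(random.uniform(50, 150)) * UNIT_PROFIT for _ in range(trials)
) / trials

print(f"profit assuming average demand: {plan_profit:.0f}")   # 1000
print(f"average profit over demand:     {avg_profit:.0f}")    # ~875, noticeably lower
```

The shortfall exists because the profit function is nonlinear (capped): averaging the input before applying the function is not the same as averaging the output, which is Jensen’s inequality at work.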
The good news is that you’re already thinking probabilistically. Any time any expert makes any pronouncement on an uncertain variable, they are making a probabilistic estimation. Remember, there are no facts about the future. The bad news is that, unless you’ve had your knowledge ‘calibrated’, you’re unlikely to be accurate with your estimates. Particularly if you’re working in an industry going through substantial change, or you’re looking further than a couple of quarters. Read Keith Stanovich’s Decision Making and Rationality in the Modern World if you’re interested in learning more about why knowledge calibration is important.
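Calibration can also be measured. One common scoring rule is the Brier score, the mean squared gap between the probabilities you stated and what actually happened. The forecasts below are made up purely for illustration, but they show how an overconfident expert scores worse than a calibrated one on the same outcomes.

```python
def brier_score(forecasts, outcomes):
    """Mean squared error between stated probabilities (0..1) and
    observed outcomes (0 or 1). 0.0 is perfect; 0.25 is what you
    earn by always saying 50%."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# Hypothetical: an expert said "90% likely" five times,
# but only three of the five events occurred.
overconfident = brier_score([0.9] * 5, [1, 1, 1, 0, 0])  # 0.33

# A calibrated forecaster saying "60%" for the same outcomes does better.
calibrated = brier_score([0.6] * 5, [1, 1, 1, 0, 0])     # 0.24

print(overconfident > calibrated)
```

Keeping a running Brier score on your own ‘probably’ and ‘likely to’ pronouncements is a cheap way to start the calibration work Stanovich describes.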
The importance of modelling the future with probabilities has increased significantly as organizations start crafting their data governance strategies. Every business is now a data business and data is a unique organizational asset. We have duties of care and responsibility for ensuring the data asset has sufficient governance. As we’ve seen with high profile stumbles like Equifax and with the impending arrival of GDPR, there are real costs associated with not understanding data.
Find better ideas to think with
Because leaders need to invest in analytics and artificial intelligence capability, their organizations need to start adopting the techniques necessary to manage complex adaptive systems. Or, put another way, if you plan on hiring data scientists or investing in self-service analytics, you’re going to use probabilistic techniques sooner rather than later. So you may as well start now.
This means learning to differentiate between two basic ideas of what probability is. Most of us are stuck in the first idea of probability, and getting to the second isn’t easy. As E. F. Schumacher wrote in A Guide for the Perplexed, “… the human mind, generally speaking, does not just think: it thinks with ideas, most of which it simply adopts and takes over from its surrounding society.”
So to get ourselves thinking probabilistically for our modern age, we need to think with better ideas about what probability is. If you did statistics in your first year of university, you would have done some papers based on classical probability. This is where we know the parameters of a system and can calculate the probabilities for the events the system will generate. Classical probability is still useful in the right context.
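Classical probability in that first-year sense can be sketched in a few lines: when the parameters of the system are fully known (here, a fair six-sided die), event probabilities follow by simply counting outcomes.

```python
from fractions import Fraction
from itertools import product

# All 36 equally likely rolls of two fair dice -- the system's
# parameters are completely known, so we just enumerate and count.
outcomes = list(product(range(1, 7), repeat=2))

# Probability that the two dice sum to seven.
p_seven = Fraction(sum(a + b == 7 for a, b in outcomes), len(outcomes))
print(p_seven)  # 1/6
```

Dice and decks are the right context for this style of reasoning; markets, customers and careers rarely come with known parameters, which is why the classical idea alone runs out of road.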
The problem is that context is slowly contracting. As the World Economic Forum observed in its 2016 report The Future of Jobs, nearly half of the knowledge acquired during the first year of a four-year technical degree is already obsolete by graduation day. The half-life of expertise is steadily decreasing. Learning how to learn is becoming a critical executive skill and learning to think probabilistically will become table stakes.
It’s subjectivist probability that you need to put in your annual professional development plan. Subjective probabilities aren’t obtained from an experiment or a replicable event. They express an opinion or degree of belief about how likely a particular event is to occur. We deal in subjective probabilities all the time: the next time you’re in a meeting count the number of times people say ‘probably’ or ‘likely to’. Sometimes the word ‘should’ also disguises a subjective probability. One useful outcome of having your knowledge calibrated is you become alert to any language that claims future knowledge.
Most of us are used to building “if/then” statements. But few of us are used to framing questions in the form of “given this…what is the likelihood of observing that?” But if you want to make the most of your data strategy, progress your career on the data side of business or get a good return on your analytics investment, you’re going to need to make that shift. And sooner rather than later.
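The shift from “if/then” to “given this… what is the likelihood of observing that?” is, at bottom, a Bayesian update. Here is a minimal sketch with invented numbers: instead of a rule like “if the customer shows the warning signal, then act”, we ask how likely the signal is given churn versus given staying, and let Bayes’ rule combine that with the base rate.

```python
# All numbers below are hypothetical, for illustration only.
p_churn = 0.10                  # prior: 10% of customers churn
p_signal_given_churn = 0.60     # churners show the warning signal 60% of the time
p_signal_given_stay = 0.05      # non-churners show it only 5% of the time

# Total probability of observing the signal at all.
p_signal = (p_signal_given_churn * p_churn
            + p_signal_given_stay * (1 - p_churn))

# Posterior: given that we observed the signal, how likely is churn?
p_churn_given_signal = p_signal_given_churn * p_churn / p_signal
print(f"{p_churn_given_signal:.2f}")  # 0.57
```

Notice the base rate doing the work: a signal that sounds decisive (“60% of churners show it”) still leaves churn at only about 57%, because churn itself is rare. That is the kind of reasoning a deterministic if/then rule never surfaces.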
This article is published as part of the IDG Contributor Network.