Romantic and Rational Approaches to Artificial Intelligence
A gap already exists between companies’ ability to collect data and managers’ skills at putting it to use. Will AI increase the divide?
The use of artificial intelligence in the criminal justice system offers a stark example of the contrast between knowing how to produce results and knowing how to consume them intelligently. Systems recommend bail and sentencing decisions but offer little transparency about the basis for those recommendations, leaving the humans who digest them potentially underinformed.
What if we knew so little about the production processes of the food we eat? We know more about what we put into our mouths than what we put into our minds.
Are Organizations Biting Off More Analytics Than They Can Chew?
In 2015, we observed a growing gap between organizations’ ability to produce analytics and their ability to consume them. The article “Minding the Analytics Gap” describes how organizations struggle to consume the analytics results they produce. Worse, the gap was growing, not shrinking, as organizations got better at analytics.
Yes, organizations were rapidly improving their ability to produce analytical results. They were gathering more and more data. They were building digital infrastructures to process these vast quantities of data. They were developing (or acquiring) the talent required to develop complex models of market behavior. When these pieces all came together, organizations could create sophisticated analytical results.
Unfortunately, managers and executives in those organizations often did not have the expertise to consume the analytics results the organization was able to produce. Just having the results available wasn’t enough. The organizational ability to develop business insight and strategy from those results lagged well behind the ability to produce them.
The difficulty lies in the relative rates of improvement in production and consumption abilities. As organizations matured analytically, they improved their analytics production capabilities more quickly than their consumption abilities. As a result, maturing organizations found that, even though their consumption abilities were improving, they could consume less and less of what they produced. If, for example, production capability doubles while consumption ability grows only modestly, the share of output actually put to use shrinks every year. The analytics gap gets worse as organizations improve — the opposite of what leaders would hope and expect.
And yet this may have just been the tip of the iceberg. When it comes to artificial intelligence in business, the divergence and resulting gap between production and consumption of data analytics may be an even bigger concern.
Artificial Intelligence Widens the Analytics Gap
Artificial intelligence in business builds on an analytics foundation. (Stay tuned — we’ve got much more coming about that in our forthcoming report on artificial intelligence and business strategy this fall.) As a result, organizations will similarly experience a growing gap between artificial intelligence production and artificial intelligence consumption. What’s worse, the AI production-consumption divide stands to grow faster than the one we’ve observed with standard data analytics. Everything hinges on the relative rates of change in the sophistication of AI data production vs. AI data consumption.
AI production sophistication seems poised to grow rapidly. AI builds quickly on what organizations have learned from their analytics work. As new techniques are developed, tools seem to incorporate them quickly — the scarce resource for most AI is data, not algorithms. Algorithms, by definition, are software; they are easily and perfectly copied. At the extreme, complex AI algorithms can be incorporated into AI production processes without data scientists necessarily understanding their details — they just use the library or tool. The result is a rapid increase in the sophistication of AI in an organization.
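To make that concrete, here is a minimal sketch (assuming Python and the widely used scikit-learn library, neither of which this article specifies) of how few lines it now takes to produce a fairly sophisticated model without ever examining the algorithm’s internals:

```python
# A sketch of how little a producer must know: a gradient-boosting model is
# trained and evaluated entirely through library calls.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for an organization's data.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# The algorithm's details stay inside the library; the producer just calls fit().
model = GradientBoostingClassifier().fit(X_train, y_train)
print(f"Holdout accuracy: {model.score(X_test, y_test):.2f}")
```

The point is not the particular library but the pattern: the sophistication lives in the tool, so production ability rises as fast as the tools do.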
Conversely, managers and executives may find that their understanding of AI output improves slowly. As complex as analytical models can be, managers and executives likely have at least a basic statistics background to build from — so they have a starting point. But with artificial intelligence models, managers probably have far less background; machine learning is rarely part of the core business curriculum.
Not to mention that many of the algorithms themselves are “black boxes,” particularly when offered by vendors that want to protect their development investments. Deep learning neural networks can be trained on an organization’s data to yield high predictive accuracy. But unlike many analytical models, which attach coefficients to observable input measures, these approaches typically contain a large number of weights on nodes in hidden layers — not exactly the sort of description that makes AI models accessible for easy consumption.
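A sketch of that contrast (again assuming Python and scikit-learn, which the article does not name): a logistic regression exposes one coefficient per observable input, while even a small neural network exposes only weight matrices on hidden layers.

```python
# Compare what each kind of model exposes after training.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=1000, n_features=5, random_state=0)

# Classical analytical model: one coefficient per observable input measure.
logit = LogisticRegression().fit(X, y)
print("Logistic regression coefficients:", logit.coef_.round(2))  # shape (1, 5)

# Neural network: the learned parameters are weight matrices between layers,
# not quantities tied to inputs a manager can name.
nn = MLPClassifier(hidden_layer_sizes=(50, 50), max_iter=1000, random_state=0).fit(X, y)
print("Neural network weight matrices:", [w.shape for w in nn.coefs_])
```

Both models may predict well, but only the first yields parameters a manager can read back against the business; the second yields thousands of numbers with no direct business interpretation.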
As a result, the divergence between the production and consumption of artificial intelligence in organizations may increase even more quickly than it has for analytics. Managers then may find that their organizations’ AI models work, yet not understand why.
Peeking Inside the Black Box
The 1974 novel Zen and the Art of Motorcycle Maintenance by Robert M. Pirsig is relevant today because it contrasts romantic and rational relationships with technology. On the narrator’s motorcycle road trip, maintenance was inevitable. Treating the machine as a black box — romantically, in other words — led to frustration, breakdowns, and an unhealthy reliance on others; inauthenticity stemmed from a lack of knowledge. A rational approach, one that puts in the effort to understand the machine, led to independence, stability, and even pleasure in working with the technology.
Without understanding how AI works, we lose the ability to think critically about where the results are strong and where they are weak. We lose the ability to understand how changes outside the scope of the model will adversely affect the model. We lose the ability to know where the AI will fail before it fails. We lose the ability to repair it ourselves when it does inevitably fail.
Stopping gains in artificial intelligence isn’t the right approach, even if it were possible. Instead, managers need to work to close the gap by learning more about AI, by opening the black box, by learning enough to be better managers in a future that relies on AI. Success depends on rational problem-solving approaches to AI, not romantic reliance.