Are New Advances in AI Worth the Hype?
Executives can be forgiven their skepticism when they consider the current state of AI — but there are good reasons to take this technological opportunity seriously.
Almost daily, we’re hit with another breathless news report of the (potential) glories of artificial intelligence in business. Rather than excitement, the fervor can instead kindle a Scrooge-like attitude, tempting executives to grumble “bah humbug” and move on to the next news item.
Why exactly is the “bah humbug” temptation so strong? Perhaps because…
- News reports naturally gravitate toward sensational examples of AI. We collectively seem to like it when science fiction becomes science fact. But it may not be clear how humanoids and self-driving cars are actually relevant to most businesses.
- Reports tilt toward stories of extreme success. Managers who have found aspects of AI relevant to their business may be frustrated by the gap between their own experiences and the (purported) experiences of others. They may feel that AI is immature, the equivalent of a young child, far from ready for the workplace.
As a result, managers may perceive AI as yet another in a long line of touted technologies that offer more fervor than substance. Certainly, the information technology sector is far from immune to intoxication with promising “new” technologies. Still feeling the effects of prior technological shifts (digitization, analytics, big data, e-commerce, and so on), managers may struggle to determine what exactly is new about AI that makes it relevant now. After all, AI has been around for decades; it is not, in fact, new.
Why the attention to AI now? Is there anything new in AI worthy of the hype? Is this vintage of AI just “old wine in new bottles”?
When the web began to garner interest, it was hard to argue that distributed computing was new. Computing started with centralized processing: mainframes did the work, while terminals rendered output and collected input. Yes, the web promised prettier output than green characters on cathode-ray screens, but did the newfangled web differ from prior distributed computing in anything other than cosmetics? In retrospect, the question seems silly; it would be hard to argue that the internet didn’t fundamentally change business. Something was new about the web, even if it didn’t look that different at first.
More recently, analytics has also seen its fair share of hype. However, statistical analysis, optimization, regression, machine learning, and the like all existed long before attention coalesced around the term “analytics.”