Are New Advances in AI Worth the Hype?
Executives can be forgiven their scepticism when they consider the current state of AI — but there are good reasons to take this technological opportunity seriously.
Almost daily, we’re hit with another breathless news report of the (potential) glories of artificial intelligence in business. Rather than excitement, the fervor can kindle a Scrooge-like attitude, tempting executives to grumble “bah humbug” and move on to the next news item.
Why exactly is the “bah humbug” temptation so strong? Perhaps because…
- News reports naturally gravitate toward sensational examples of AI. We collectively seem to like it when science fiction becomes science fact. But it may not be clear how humanoids and self-driving cars are actually relevant to most businesses.
- Reports tilt toward stories of extreme cases of success. Those managers who have found some aspects of AI that are relevant to their business may be frustrated with the differences between their experiences and the (purported) experiences of others. They may feel that AI is immature and the equivalent of a young child, far from ready for the workplace.
As a result, managers may perceive AI as yet another in a long list of touted technologies that are more fervor than substance. Certainly, the information technology sector is far from immune to getting intoxicated with promising “new” technologies. Still under the influence of the intensity from prior technological shifts (digitization, analytics, big data, e-commerce, etc.), managers may struggle to determine what exactly is new about AI that may be relevant now. After all, AI has been around for decades and is not, actually, new.
Why the attention to AI now? Is there anything new in AI worthy of the hype? Is this vintage of AI just “old wine in new bottles”?
When the web began to garner interest, it was hard to argue that distributed computing was new. We started with centralized processing with mainframes and terminals rendering output and collecting input. Yes, the web promised prettier output than green characters on cathode ray screens, but did the newfangled web differ from prior distributed computing other than cosmetically? In retrospect, it seems silly to ask; it would be hard to argue that the internet didn’t fundamentally change business. Something was new about the web, even if it didn’t look different at first.
More recently, analytics has also seen its fair share of hype. However, statistical analysis, optimization, regression, machine learning, etc., all existed long before attention coalesced around the term “analytics.” Airlines in particular have long used data for revenue management. Yet something was also new about the potential for analytics, starting about a decade ago, that is now affecting businesses everywhere.
Underappreciating the differences between the old and new periods in each of these examples would have been a mistake. Managers who dismissed either of these shifts are probably no longer managers. So what is different about AI now?
Unlike in earlier incarnations, we now have access to the processing power these AI developments require. What could once be done only in theory can now be done in practice. Furthermore, the required processing power is affordable to most organizations. The leap from fervor to business value can happen — with investment, experimentation, and tolerance for failure.
Read the entire article on MIT Sloan Management Review
SOURCE: MIT Sloan Management Review