IT leaders need to understand some hard truths about artificial intelligence tools in order to shape AI strategy. Consider these key questions to discuss with your developers.
CIOs face painful new round of shiny object syndrome
[Editor's note: Seth Earley, CEO of Earley Information Science, will speak on the "Putting AI to Work" panel at the upcoming MIT Sloan CIO Symposium on May 24 in Cambridge, MA.]
CIOs and technology leaders have it tough these days. Change is happening at an ever-faster rate; agile, born-digital competitors are threatening to upend longstanding business models; and digital capabilities increasingly depend less on the tools themselves and more on rethinking the business process and customer value proposition. When business leaders come to IT with an ask, what they want is not always aligned with what they really need.
Shiny new technologies such as cognitive computing, AI, bots, the Internet of Things (IoT), personalization, and machine learning have their intrinsic conceptual appeal, but novelty should not be confused with business value. Nevertheless, management may read an article about technologies that are not fully baked or that are not cost effective at the present level of industry maturity, and think “we need this.” Or they hear a vendor pitch that, though not actually an outright lie, promotes “aspirational functionality” that is not realistic at present (OK, they are lying).
Perhaps that is a little unfair. Many new products are coming into a marketplace that is continually evolving. Some initiatives that are theoretically possible may not yet be practical. Much of the cognitive computing and virtual assistant technology functionality is still far from being cost effective or scalable. But organizations do need to get outside their comfort zone and try some of these new things – sometimes knowing they are not yet baked. The challenge is in sorting out what is ready and practical from what is in the experimental, science project stage and knowing what to do with each one.
What's real and what's noise?
I attended a presentation for an organization that was investing an enormous amount of money in a phased approach to personalization. A slide in the strategy deck for the leadership team said:
- Phase one – limited personalization implemented
- Phase two – personalization with demographic and interest-specific content
- Phase three – dynamic machine learning based personalization
- Phase four – predictive personalization enabled by machine learning
- Phase five – AI-driven personalization with advanced prescriptive algorithms
First of all, these phases said very little about practical approaches. They were jargon-filled and buzzword-compliant. Leadership always has to ask the question "so what?" What does that mean? What is "limited personalization"? What will it look like? What will it mean for the user? For the departments that have to support the capability?
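To make the "so what?" question concrete, here is one hedged sketch of what a phrase like "limited personalization" often reduces to in practice: a hard-coded rule that swaps a content block based on a known audience segment. Everything here (the segment names, the content strings, the function name) is invented for illustration, not taken from any vendor's actual implementation.

```python
# Hypothetical sketch: "limited personalization" as a simple lookup rule.
# Segment names and content below are invented for illustration only.

HERO_CONTENT = {
    "new_visitor": "Welcome! Here's how to get started.",
    "returning_customer": "Welcome back! See what's new since your last visit.",
    "default": "Learn how our platform serves your business.",
}

def personalize_hero(segment: str) -> str:
    """Return the hero message for a visitor's segment, falling back to a default."""
    return HERO_CONTENT.get(segment, HERO_CONTENT["default"])

print(personalize_hero("returning_customer"))
print(personalize_hero("unknown_segment"))  # falls back to the default message
```

If "phase one" amounts to something like this, leadership can judge its value in plain terms: which segments exist, who writes the variant content, and who maintains the rules. If no one can answer those questions, the phase label is just a buzzword.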
After the vendor describes functionality in tangible, specific terms with actual examples, the next question is “How is this next phase different?” What demographics are going to be meaningful? How will we message these different targets with different information? What interests are useful for personalization?
If those answers make sense, terrific. If they don’t, it’s time to pause and look for what’s real and what’s noise. In the talk that I attended, the team was unable to describe what the rest of the phases meant. What is the difference between dynamic machine learning and predictive machine learning? Is the distinction meaningful? Are the differences academic and technical or are there practical implications?
Does the emperor have clothes?
In this situation, when the design team began to develop a content architecture to support even basic personalization, it became clear that the company had no idea how its various audiences differed from one another or what content could be used to personalize the message. The message was really the same across its different audiences.
The company lacked the underlying infrastructure or strategy to develop personalized content, let alone build “advanced predictive, IoT-enabled smart machine learning dynamic AI-enabled prescriptive…” and so on. It was nonsense. But no one asked the hard questions and no one would say that the emperor (technology vendor) had no clothes.
Focus on the user
At the end of the day, work should come down to use cases and user scenarios – the day in the life of the user. What are the things that they need to do to solve their problem or achieve their goal? It is easy to lose sight of the day-to-day tasks and activities of the people whom our technology serves.
I once worked with a large Medicare administrative contractor – one of the insurance processing organizations that handles Medicare claims. Entire departments of people were churning out hundreds of documents about claims regulations and processes each week. When we asked these people who they were creating the content for and what the purpose was, no one knew. Further investigation revealed that much of the content was never even read or downloaded by anyone!
Meeting true business needs
Business process is not fun or sexy – and it is often taken for granted. Questioning and examining business process does not get people excited. If the process is broken, the objective is to "just fix it" – don't tell me it's broken or how broken it is.
Also, digging deeper into underlying assumptions or core business processes means potential disruption and the risk of impacting short-term profits. The concept of a digital transformation has lost its meaning and become a catch-all for just about any technology initiative.
True business value comes from questioning the obvious and looking for the real business need, instead of becoming fascinated by the shiny new object that is making headlines at the moment. CIOs have an important role to play in shaping these questions.
In my follow-on article, I will address the factors that enable (and hinder) digital agility, another pain point for CIOs.