How to explain AI in plain English
What is artificial intelligence? How does AI work? What are the enterprise use cases? Here’s how to discuss the key issues in plain terms
Computer scientist and Dartmouth professor John McCarthy coined the term artificial intelligence (AI) in 1955 when he began his exploration of whether machines could learn and develop formal reasoning like humans. More than 60 years later, AI is the hottest tech topic of the day, from the boardroom to the breakroom.
Ninety-one percent of technology executives and 84 percent of the general public believe that AI constitutes the next technology revolution, according to Edelman’s 2019 Artificial Intelligence (AI) Survey. PwC has predicted that AI could contribute $15.7 trillion to the global economy by 2030. Of course, it’s already having an impact all around us. AI powers voice-based devices, filters our email, and guides our search results. One in five respondents said their organizations plan to implement AI enterprise-wide this year, according to the PwC 2019 AI Predictions Survey.
While AI has fueled more than a little bit of hype, enterprises are indeed trying to get down to business - before rivals do. As noted in our Harvard Business Review Analytic Services report, An Executive’s Guide to Real-World AI, “Hype in tech is nothing new. What’s different this time is the degree to which reasonable and knowledgeable people believe that there is, indeed, a real urgency to get going with AI now.”
Dan Vesset, group vice president, analytics and information management, at market research firm IDC, warns, “If you’re not starting to invest, there’s the real possibility of being left behind forever.”
That’s because companies need foundational pieces in place to be successful with AI, as the report notes. These include talent, which is in short supply; having the right data infrastructure as well as sufficient quantity and quality of data to train your models; deciding how AI will be governed; and managing change in the organization.
[ Check out our quick-scan primer on 10 key artificial intelligence terms for IT and business leaders: Cheat sheet: AI glossary. ]
However, just because everyone is talking about AI does not mean they understand it. As Red Hat technologist Gordon Haff recently wrote, AI projects in the past have often tried to boil the ocean - to solve multi-faceted problems like self-driving cars. Today’s enterprise AI projects are more practical, often focusing on customer experience pain points. “What’s happening with AI today is exciting in part because it involves practical solutions that address complexity, the need to handle more and more data, and demanding customers,” Haff noted. “McKinsey says 75 percent of online customers expect help now.”
As demand for AI-enabled initiatives continues to increase, CIOs and IT leaders will need to clarify for non-technical constituents what AI is – and what it isn’t.
[ Read also: How to explain machine learning in plain English. ]
Here are a few basic AI definitions that may prove useful when discussing the topic:
“AI is the use of intricate logic or advanced analytical methods to perform simple tasks at greater scale in ways that mean we can do more at large scale with the workers we have, allowing them to focus on what humans are best at, like handling complex exceptions or demonstrating sympathy.” –Whit Andrews, vice president and distinguished analyst with Gartner Research
“Put simply, AI is a technology solution that helps the enterprise get more done with fewer resources by automating mundane, data-heavy tasks. More specifically, AI is a computerized simulation of human intelligence that can be programmed to make decisions, carry out specific tasks, and learn from the results.” –Zachary Jarvinen, head of technology strategy, AI and analytics at OpenText
“AI is a mathematical and algorithmic model that allows computers to learn to do tasks without being explicitly programmed to do those tasks.” –Timothy Havens, the William and Gloria Jackson Associate Professor of Computer Systems in the College of Computing at Michigan Technological University and director of the Institute of Computing and Cybersystems
Bonus: The bicycle analogy
For those who prefer analogies, Havens likens the way AI works to learning to ride a bike: “You don’t tell a child to move their left foot in a circle on the left pedal in the forward direction while moving their right foot in a circle… You give them a push and tell them to keep the bike upright and pointed forward: the overall objective. They fall a few times, honing their skills each time they fail,” Havens says. “That’s AI in a nutshell.”
[ Is RPA a kind of AI? See our related post: Robotic Process Automation (RPA) vs. AI, explained. ]
AI 101: How AI works
So how does it work? “AI matches the data about circumstances in a business or technical process to its outcomes so that developers can make desirable outcomes more common using the people you have,” says Gartner’s Andrews.
The AI on your phone, in your car, or on the Internet ingests those massive volumes of data and produces an implicit model that best optimizes some predetermined objective or reward function, Havens says.
“Usually that objective is represented by some calculated measure of how often the AI is correct in its outputs," Havens says. "AI learns very much the way humans do – except AI can ride, fall off the bike, and readjust millions and millions of times a second.”
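Havens’s point about millions of tiny readjustments can be sketched as a toy training loop. This is purely illustrative, not any real AI system: the single parameter `w`, the target, and the error measure are invented for the example, but the pattern of repeatedly measuring error and nudging the model toward the objective is the basic idea.

```python
# A toy version of "fall off the bike, readjust": the model holds one
# number, w, and repeatedly nudges it to shrink a simple error measure.
# Real systems adjust millions of parameters the same basic way.

def train(target, steps=1000, learning_rate=0.1):
    w = 0.0                          # start with a bad guess
    for _ in range(steps):
        error = w - target           # how wrong is the current guess?
        w -= learning_rate * error   # small correction toward the objective
    return w

w = train(target=4.2)  # after many small corrections, w is close to 4.2
```

No single correction is smart; the intelligence, such as it is, comes from accumulating a huge number of small corrections against a clearly defined objective.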
Typically, AI analyzes historical data to make predictions about the future. Let’s take a real estate app as an example: “By using algorithms, the computer can analyze massive amounts of data on past home sales, school districts, transportation and traffic patterns, and more to estimate a home’s value, cost per square foot, and more,” Jarvinen says. “As the application collects more data from users and public information on sales, it learns, and the predictions become more accurate.”
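Jarvinen’s real estate example can be sketched in a few lines: fit a model to past sales, then use it to estimate the value of a new listing. The sales figures and the single square-footage feature here are invented for illustration; a real application would use many more features and a far richer model.

```python
# Fit a straight line (price = slope * sqft + intercept) to past sales,
# then use it to estimate a new home's value.

def fit_line(xs, ys):
    """Ordinary least-squares fit of y = slope * x + intercept."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Historical sales: (square feet, sale price) -- hypothetical data
sqft = [1200, 1500, 1800, 2100, 2400]
price = [240_000, 295_000, 360_000, 415_000, 480_000]

slope, intercept = fit_line(sqft, price)
estimate = slope * 2000 + intercept  # estimated value of a 2,000 sq ft home
```

As more sales are recorded, the app refits the model on the larger history, which is what “it learns, and the predictions become more accurate” means in practice.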
AI vs. jobs
For some time, there has been fear and uncertainty about the impact of AI on human employment. Will these machines rise up and take all of our jobs?
Unlikely. For one thing, AI isn’t that advanced yet. “The robots aren’t coming to get us. And to go beyond that, AI is not even intelligent in the way humans are yet,” says Havens. “AI can precisely do calculations and perform inferences using billions of data points, but AI systems currently are not generalized intelligence.”
In fact, 40 percent of respondents from Global 2000 organizations say that they are adding more jobs as a result of AI adoption, according to a 2019 Dun & Bradstreet report, while only 8 percent are eliminating jobs as a result of implementing AI capabilities. Around one-third (34 percent) reported that there was no change in HR needs as a result of AI.
In any case, say AI experts, outright automation is the wrong focus for applied AI in the enterprise. AI adds the most value as an augmentation of human decision-making and interactions, complementing human skills for exponentially better outcomes. “The greatest misconception by far is that AI is a replacement of the workforce when it is, in fact, aimed at enhancing the workforce through augmentation,” says Jarvinen. “AI is about helping employees deal with mundane tasks (such as data entry or alert validation) so they can focus on the things that matter (such as speaking with customers or dealing with security threats).”
Gartner predicts that 80 percent of project management tasks that would typically be handled by a person today will be eliminated by AI by 2030, spanning traditional project management functions such as data collection, tracking, and reporting. Within IT and other business functions, such as finance, salaries for AI roles continue to rise.
What about enterprise use cases for AI? Let’s explore: