Artificial intelligence (AI): 5 trends, hype-tested

Are you exploring how best to implement artificial intelligence in your business? Consider these five trends, which are key to a future of practical AI business applications.

If you are considering using artificial intelligence (AI) to mature your foundational IT and data capabilities, how do you separate hype from reality?

Whether you are exploring the promises of AI for your business or still wondering when you will see truly transformative results, here are five industry trends that can help you realize AI's untapped potential. Let's break them down:

1. Black box vs. explainable AI

For most of us, deep learning systems are essentially incomprehensible. They take millions of data points as input and correlate them with outputs, but their internal logic generally cannot be interpreted in plain language.

[ Get our quick-scan primer on 10 key artificial intelligence terms for IT and business leaders: Cheat sheet: AI glossary. ]

However, if automated systems are to assist in making critical decisions – such as which operations and processes to use – and we cannot understand how those decisions are made, how can we identify and address errors? This lack of common sense (a term John McCarthy first brought into AI in the 1950s) has limited the real-world application of AI to date. We need a clearer, less complicated AI system that better relates to the world and to people.

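To make "explainable" concrete, here is a minimal sketch of one common post-hoc technique, permutation importance: the model stays a black box, but shuffling one input feature at a time and measuring how much accuracy drops reveals which inputs actually drive its decisions. The dataset and model below are stand-ins chosen purely for illustration, and this assumes scikit-learn is installed.

```python
# A minimal sketch of post-hoc explainability via permutation importance.
# The model is still a black box; we probe it from the outside by shuffling
# one feature at a time and watching test accuracy drop.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Synthetic stand-in data: 1,000 samples, 5 features.
X, y = make_classification(n_samples=1000, n_features=5, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# Shuffle each feature 10 times; the mean accuracy drop approximates how
# much the model relies on that feature.
result = permutation_importance(model, X_test, y_test, n_repeats=10,
                                random_state=0)
for i, drop in enumerate(result.importances_mean):
    print(f"feature {i}: mean accuracy drop {drop:.3f}")
```

Techniques like this do not open the black box, but they translate its behavior into statements a person can act on – a small step toward the clearer systems the trend points to.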

2. Machine learning vs. machine teaching

According to McKinsey Global Institute, by 2030, work hours spent on physical and manual skills and basic cognitive skills are expected to decrease by 14 percent and 15 percent, respectively. We will instead spend more time using higher cognitive skills, like answering "why" and deciding what to do.

This new way of working will lead to a demand for tools to support it. PARC scientist Mark Stefik's research on mechagogy (machine teaching) describes a future in which people and machines learn from each other's strengths. We can imagine AI systems becoming an essential part of the workplace – our "thinking" partners.

3. von Neumann computing vs. neuromorphic computing

One of the key disruptions in IT during the next decade will be the transition from traditional von Neumann computing architectures to neuromorphic computing. As Moore’s law slows and we encounter the von Neumann bottleneck, what can we learn from the most efficient computer to date — the brain?

Biological brains have memory and compute in the same circuits, whereas traditional von Neumann digital computers separate memory from compute. Biological brains are highly parallelized, whereas digital computers perform computations in a serial fashion. Biological brains are dense and require only a minuscule fraction of the energy used by a digital computer. These architectural differences create the bottlenecks that make it so hard for modern digital computers to run huge AI workloads.
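To see why the memory/compute separation matters, consider a rough timing sketch (the array sizes below are arbitrary illustrations, not a benchmark): a streaming sum does almost no arithmetic per byte it fetches from main memory, while a matrix multiply reuses each value many times, so the first is throttled by memory traffic and the second by the processor itself.

```python
# A rough illustration of the von Neumann bottleneck: moving data between
# memory and the processor often costs more than the arithmetic itself.
import time
import numpy as np

a = np.random.rand(100_000_000)  # ~800 MB of doubles, far larger than cache
start = time.perf_counter()
_ = a.sum()  # ~1e8 additions, but every operand comes from main memory
t_sum = time.perf_counter() - start
print(f"streaming sum:   {1e8 / t_sum / 1e9:.1f} GFLOP/s (memory-bound)")

b = np.random.rand(2000, 2000)  # ~32 MB, small enough to reuse in cache
start = time.perf_counter()
_ = b @ b  # ~1.6e10 floating-point operations with heavy data reuse
t_mm = time.perf_counter() - start
print(f"matrix multiply: {1.6e10 / t_mm / 1e9:.1f} GFLOP/s (compute-bound)")
```

On a typical machine the effective throughput of the memory-bound sum is an order of magnitude or more below the matrix multiply. Neuromorphic designs attack exactly this gap by co-locating memory and compute, as the brain does.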

4. Digital vs. quantum computers

Scaling limitations prevent conventional digital computers from meeting the demands of AI computing. Quantum computers use qubits to represent much larger amounts of data and to explore many candidate solutions in parallel. Incumbents like IBM and Google AI Quantum and startups like Bleximo are working to pair general-purpose processors with application-specific NISQ quantum coprocessors, called quantum accelerators, to build systems for specific business and engineering domains. Early potential industry applications include chemistry (for materials), pharmaceuticals (for drug design), and finance (for optimization).
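For a rough intuition about that scaling, here is a toy state-vector simulation in plain NumPy (an illustration of the underlying math, not any vendor's SDK): n qubits are described by 2**n complex amplitudes, so each added qubit doubles the space the machine works over, and applying a Hadamard gate to every qubit spreads the state across all of those basis states at once.

```python
# Toy state-vector sketch: n qubits = 2**n amplitudes.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

n = 3
state = np.zeros(2**n)
state[0] = 1.0  # start in the |000> basis state

# The full-register operator for "Hadamard on every qubit" is the
# tensor (Kronecker) product of one 2x2 gate per qubit.
gate = H
for _ in range(n - 1):
    gate = np.kron(gate, H)
state = gate @ state

# Equal probability (1/8) across all 8 basis states: a uniform superposition.
print(np.round(state**2, 3))
```

The flip side, which the "all solutions at once" shorthand glosses over, is that measuring the register collapses it to a single outcome; useful quantum algorithms must arrange interference so that the right answers come out with high probability.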

[ Read also: Quantum computing: 4 things CIOs should know. ]

5. Electronic vs. brain-machine interface devices


Current AI applications primarily run on electronics, but we'll eventually see a more intimate integration of electronic and biological systems. For example, Neuralink, one of Elon Musk's latest ventures, announced plans to start clinical trials of its implantable brain-machine interface (BMI) devices in humans by the end of 2020. As AI applications become integrated with our biological systems, the boundary between humans and machines has begun to blur. Scientists are also combining BMIs and AI to control external devices using brain signals and to recreate aspects of cortical function with AI systems.

Most scientists and technologists agree that we have only scratched the surface of AI's potential. CIOs and their organizations increasingly need to track the latest developments in this transformative technology.

[ How can automation free up more staff time for innovation? Get the free eBook: Managing IT with Automation. ] 

Kate Yuan is a startup consultant with a focus on go-to-market strategies and enterprise sales. She has worked with startups on four continents and in more than 30 countries as an investor, advisor, and operator. Most recently, she was the Operating Partner at Hemi Ventures, an early-stage fund investing in the mobility, biotech, and enterprise AI sectors.