Artificial Intelligence: 3 ways the pandemic accelerated its adoption

Artificial intelligence (AI) applications are becoming ubiquitous in organizations across industries. Here are three key pandemic-related trends that hastened AI adoption.

The need for organizations to quickly create new business models and marketing channels has accelerated AI adoption throughout the past couple of years. This is especially true in healthcare, where data analytics accelerated the development of COVID-19 vaccines. In consumer-packaged goods, Harvard Business Review reported that Frito-Lay created an e-commerce platform in just 30 days.

The pandemic also accelerated AI adoption in education, as schools were forced to enable online learning overnight. And wherever possible, the world shifted to “touchless” transactions, completely transforming the banking industry.

Three technology developments during the pandemic accelerated AI adoption:

  • Continued inexpensive computing power and storage
  • New data architectures
  • Availability of new data sources

[ Also read Artificial Intelligence: How to stay competitive. ]

Pros and cons of AI developments

Let’s look at the pros and cons of these developments for IT leaders.

1. Continued inexpensive computing power

More than 60 years after Moore’s Law was formulated, computing power continues to increase, with more powerful machines and more processing power delivered through new chips from companies like NVIDIA. AI Impacts reports that “computing power available per dollar has probably increased by a factor of ten roughly every four years over the last quarter of a century (measured in FLOPS or MIPS).” However, the rate has slowed over the past 6-8 years.
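To put the AI Impacts figure in perspective, "a factor of ten roughly every four years" works out to roughly 78 percent more compute per dollar each year. A quick back-of-the-envelope sketch (the growth figures come from the quote above; the arithmetic itself is ours):

```python
# Arithmetic behind the AI Impacts estimate quoted above:
# "computing power per dollar increased ~10x every four years."
years_per_tenfold = 4
annual_growth = 10 ** (1 / years_per_tenfold)  # per-year multiplier, ~1.78
print(f"~{(annual_growth - 1) * 100:.0f}% more compute per dollar each year")

# Compounded over a quarter of a century:
total_factor = 10 ** (25 / years_per_tenfold)
print(f"~{total_factor:,.0f}x over 25 years")
```

That compounding is why workloads that were unaffordable for most organizations a decade ago are now routine.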

Pros: More for less

Inexpensive computing gives IT leaders more choices, enabling them to do more with less.

Cons: Too many choices can lead to wasted time and money

Consider big data. With inexpensive computing, IT pros want to wield its power, ingesting and analyzing all available data in pursuit of better insights, analysis, and decision-making.

But if you are not careful, you could end up with massive computing power and not enough real-life business applications.

As networking, storage, and computing costs drop, the human inclination is to use them more. But more usage does not necessarily deliver business value everywhere.

2. New data architectures

Before the pandemic, the terms “data warehouses” and “data lakes” were standard – and they remain so today. But new data architectures like “data fabric” and “data mesh” were almost non-existent. Data fabric supports AI adoption by automating data discovery, governance, and consumption, letting enterprises use data to maximize their value chain. Organizations can provide the right data at the right time, regardless of where it resides.

Pros: IT leaders will have the opportunity to rethink data models and data governance

These new architectures provide a chance to buck the trend toward centralized data repositories or data lakes. That might mean more edge computing, with data available where it is most relevant. These advancements result in the appropriate data being automatically available for decisioning – critical to AI operability.

Cons: Not understanding the business need

IT leaders need to understand both the business and AI aspects of new data architectures. If they don’t know what each part of the business needs – including the kind of data and where and how it will be used – they may not build the right data architecture and data-consumption model to support it. IT’s understanding of the business needs, and the business models that go with that data architecture, will be essential.

3. New data sources

Statista research underscores the growth of data: The total amount of data created, captured, copied, and consumed globally was 64.2 zettabytes in 2020 and is projected to reach more than 180 zettabytes in 2025. Statista research from May 2022 reports, “The growth was higher than previously expected, caused by the increased demand due to the COVID-19 pandemic.” Big data sources include media, cloud, IoT, the web, and databases.
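The Statista figures above imply a compound annual growth rate of roughly 23 percent. A quick sketch of that arithmetic (the zettabyte figures come from the paragraph above; the calculation is illustrative, ours):

```python
# Compound annual growth rate (CAGR) implied by the Statista figures above:
# 64.2 ZB created in 2020, projected to reach ~180 ZB in 2025.
data_2020 = 64.2   # zettabytes
data_2025 = 180.0  # zettabytes (projection)
years = 5

cagr = (data_2025 / data_2020) ** (1 / years) - 1
print(f"Implied data growth: ~{cagr * 100:.0f}% per year")  # ~23% per year
```

In other words, the volume of data available to feed AI systems is not just large – it is compounding at nearly a quarter per year.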

Pros: Data is powerful

Every decision and transaction can be traced back to a data source. IT leaders who can use AIOps/MLOps to zero in on the right data sources for analysis and decision-making are empowered. The right data can deliver instant business analysis and provide deep insights for predictive analytics.

Cons: How do you know what data to use?

Besieged by data – from IoT, edge computing, formatted and unformatted, intelligible and unintelligible – IT leaders are dealing with the 80/20 rule: What are the 20 percent of credible data sources that deliver 80 percent of the business value? How do you use AIOps/MLOps to determine which data sources are credible, and which should be used for analysis and decision-making? Every organization needs to find answers to these questions.

Core AI technology is evolving all by itself

AI is becoming ubiquitous, powered by new algorithms and increasingly plentiful and inexpensive computing power. AI technology has been on an evolutionary road for more than 70 years. The pandemic did not accelerate the development of AI; it accelerated its adoption.

Harnessing AI is the challenge ahead.

[ Want best practices for AI workloads? Get the eBook: Top considerations for building a production-ready AI/ML environment. ]

Arun Ramchandran
Arun ‘Rak’ Ramchandran is the Global Head of the Hi-Tech, Platforms & Professional Services (HTPS) Vertical Business Unit, and is also the Global Market Leader for the Digital Core Transformation (DCT) Service Line at Hexaware.