The death of planning: 4 factors fueling IT decision making now

At a time when technologies and market conditions can change on a dime, it doesn’t make sense for companies to craft five-year strategic plans. Here’s what they should do instead

Until around 1900, human knowledge doubled once every century. By the end of World War II, that rate had accelerated to once every 25 years. Today, according to some estimates, human knowledge doubles roughly every year. And with the advent of Internet of Things technologies, it has been projected that our knowledge will soon double every 12 hours.

This is more than just a fun statistic. The exponential growth rate of information has a real impact on business and IT planning. Too many companies are still devoting time and resources to strategic plans that look out five or ten years into the future – even though we know that the business and technology landscape will be completely different by that time. 

The acronym VUCA (which stands for volatility, uncertainty, complexity, and ambiguity) is a specter that has long haunted leaders at startups. These leaders have always needed to be able to quickly pivot as they gather new information about how the marketplace is responding to their products and business models. But increasingly, the need for an “expect-the-unexpected” mindset applies equally – or perhaps even more so – to established enterprises. 

[ Is 2020 the year of the pivot? Read: Now or never: Why CIOs must future-proof IT workforce strategy ]

Large, long-standing companies typically have significant investments in legacy IT, revenue streams centered on products nearing obsolescence, and leaders whose deep experience was earned in a context that may no longer apply to the current business world. As a result, it can be extremely difficult for these organizations to be nimble. Even as they plan five or ten years into the future, they are being outpaced by younger, more adaptable competitors.

Here are four factors that IT and business leaders should carefully consider as they plan for the death of planning:

1. Product development methods

Much has been made in recent years of the importance of moving from a “waterfall” approach to product development to an “agile” one. In theory, agile projects are more iterative, with development teams able to respond quickly to changing needs. But in practice, we’ve noticed that this is only partly true. Waterfall presupposes that stakeholders know both the problem and how they’re going to solve it; agile typically presupposes only that stakeholders know the problem.

But the truth is, sometimes we don’t know the solution or the problem. Organizations need a product development method that will keep up with an ever-changing world. In our opinion, the best fit is a “lean” approach, which emphasizes processes that continually increase value to customers. 

2. IT leadership 

CIOs need to have a deep understanding of technology. Now, that may sound obvious, but it’s really not. No company would employ a chief financial officer who didn’t understand money or a chief marketing officer who was unfamiliar with the latest thinking in advertising and media. And yet, in many organizations, CIOs are more departmental managers than they are tech leaders. This stems from the old view of IT departments as cost centers that “keep the lights on” for the rest of the business. 

In today’s business climate, if your CIO isn’t a tech wizard who helps lead strategic decision making for the entire company, then you have a big problem.  

[ Read also: CIO role 2020: Everything you need to know about today’s Chief Information Officers ]

3. Infrastructure 

In the past, the lifespan of IT infrastructure (typically around five years) aligned nicely with strategic planning cycles. But today, it’s extremely difficult even for experienced IT leaders to predict with any sort of accuracy what sorts of data center hardware and employee endpoints they’ll need a half-decade from now. 

Organizations should move away from a mindset of “buying infrastructure” toward one of “building capabilities.” This means adopting flexible solutions (and financing approaches) that will let organizations tackle their current challenges, rather than fighting today’s war with yesterday’s armor. Cloud resources and as-a-service deployment models are important for enabling this sort of flexibility. 

4. Project scope

At most large organizations, product development spans a number of different groups, each working on its own individual project that contributes to the eventual product. This approach may have made sense when business leaders could predict what the marketplace would want from their company five years in the future. But in 2020, this highly distributed product development process makes it much harder for companies to change course and can inhibit innovation.

We suggest that businesses move away from a “project” mindset, and instead empower small teams to work on their own products. These teams will still have their share of misses; they may, in fact, have more misses than hits. But when they fail, they’ll fail small and fast. 

By emphasizing near-term problem solving instead of long-term planning, organizations can adjust to meet the demands of the market – and of the moment. 

[ Culture change is the hardest part of digital transformation. Get the digital transformation eBook: Teaching an elephant to dance. ]

Dr. Abel Sanchez holds a Ph.D. from the Massachusetts Institute of Technology (MIT). His areas of expertise include the Internet of Things (IoT), radio-frequency identification (RFID), simulation, the engineering of complex software systems, and cyber-physical security. He teaches graduate courses in information engineering, cybersecurity, and software architecture.
John R. Williams is a Professor of Information Engineering and Civil and Environmental Engineering at MIT. Professor Williams holds a BA in physics from Oxford University, an M.Sc. in physics from UCLA, and a Ph.D. from Swansea University. His specialty is large-scale computer analysis applied to both physical systems and information.