3 points where Linux could have played out differently

Let’s examine three key points in the historical timeline of Linux, what might have been, and what we can learn from each

Given different events or individual actions, could Linux and open source more broadly have failed to become the engine for collaboration and innovation that it is today?

Perhaps you believe that great economic and technological forces make it difficult for individuals or chance events to radically alter how events play out. The integrated circuit, Moore’s Law, the internet, the sharing of software especially in academic settings, and other broad trends trump any single action in this view. Even if an open source operating system called “Linux” did not exist today, something much like it would.

[ Read also: 5 open source projects that make Kubernetes even better, by Gordon Haff. ]

Linux has its heroes

Alternatively, you have the idea of a hero driving events. “The history of the world is but the biography of great men,” as the Scottish philosopher and essayist Thomas Carlyle put it during a series of lectures on heroism in 1840.

Both perspectives are useful. The overall tides of history are powerful, but individuals and events can exert significant influence, at least over the short to mid-term.

Consider how specific individuals, companies, and technologies could have perturbed the development of open source at several points along its timeline, and how today’s software landscape might plausibly look different as a result. (The emphasis here is on Linux and open source infrastructure, although we could also single out events related to programming languages, the desktop, and more.)

1. Was Unix inevitable?

Without reprising the whole history of Unix, suffice it to say that its history is both complicated and deeply intertwined with that of Linux and open source more broadly. A world where something like Unix never emerged from a corporate lab (as Unix did from AT&T Bell Labs) or from academia would have been, at minimum, a much different environment for early free software and open source.

There’s an accidental aspect to Unix specifically. After Bell Labs withdrew from the Multics project with MIT and General Electric in 1969, two Bell researchers, Ken Thompson and Dennis Ritchie, decided to apply some of the lessons they’d learned to an operating system for smaller computers. Management wasn’t interested; Multics had soured them on operating system research. Thompson, Ritchie, and, later, others persevered but it certainly doesn’t look like it was a sure thing.

However, Red Hat distinguished engineer William Henry argues that conditions were such that a collaboratively built operating system designed for interconnected networks, multiple users, and smaller computers was bound to arise one way or another: “We understood there’s a network coming, a universal one, something that’s big, where we’re going to be sharing with people that we don’t always collaborate with every day. And that was happening in the educational sort of community. So I think that something was inevitable because there was so much collaboration… It was great minds, looking at this all over the world, people jumping on it. The new network was there, new security was going to be there as an issue once you bring that network into it.”

2. Was Linux inevitable?

Given Unix, offshoots were bound to happen, especially given the unique legal circumstances surrounding AT&T as a regulated utility, which prevented it from pursuing software as a business throughout the 1970s. BSD Unix came out of the University of California at Berkeley. MINIX, a version of Unix written by Andrew Tanenbaum for educational purposes, would itself serve as the inspiration for Linus Torvalds to write Linux starting in the early 1990s, while he was at the University of Helsinki.

No one argues that it was in any way pre-ordained that a Finnish university student would write a Unix derivative and, furthermore, build a sufficient community around it to give it momentum. However, the argument goes that absent Linux, either some other university student would be inspired by MINIX or something else, or one of the existing alternatives, like the aforementioned BSD, would have slipped into the role.

Bryan Cantrill, longtime engineer at Sun Microsystems and now CTO of Oxide Computer Company, makes the case for BSD in the absence of Linux: “If it’s not Linux, it would have been one of the BSD variants which would have been the de facto Unix on x86. It is the rise of the Internet and it is the rise of SMP to a lesser degree, and then the rise of commodity microprocessors as highest performing microprocessors [where] Linux grabbed a ride, drafted on those economic megatrends but did not really contribute to them.”

The complication is that BSD didn’t succeed in that role. It’s a matter of conjecture whether it would eventually have overcome what Brian Proffitt of the Red Hat Open Source Program Office calls “a lack of developer interest and passion,” combined with multiple variants that further splintered developer and user mindshare. BSD was likely also held back by lingering concerns stemming from a lawsuit AT&T filed over it.

In any case, it’s hard to rule out some sort of freely licensed operating system branching off from the Unix tree. But the complicated history of Unix makes it hard to state with any confidence exactly how, or when, that would have played out.

[ Read also: 5 open source tools IT leaders should know about now and Robotic Process Automation (RPA): 6 open source tools. ]

3. Was Free Software inevitable?

In the 1980s, the computer industry was commercializing in ways that increasingly eroded, in many places, the sharing ethos that had pervaded the field since its beginnings. One of those places was the MIT AI Lab, where the formation of two commercial Lisp companies, Symbolics and Lisp Machines Inc., had played out as a messy and acrimonious process that led to much-reduced open collaboration and widespread departures from the Lab.

Richard Stallman was part of the Lab community, and his experience with the effects of proprietary code in this conflict led him to launch the GNU project to develop a free operating system. Although the GNU kernel has never been completed, components like the compiler and utilities would prove instrumental to Linus Torvalds’s ability to assemble a complete operating system around the kernel that he wrote. But equally important were the principles, and perhaps the license, that Stallman wrapped around the code. They established a focus on user freedoms and rules for making source code available whenever software is distributed.

But, given that several permissive licenses – which generally let you redistribute modified software without making the new source code available – were already putting in an appearance at about the same time, how much did Stallman, the GNU General Public License, and the Free Software Foundation that Stallman created ultimately matter?

Richard Fontana, a lawyer at Red Hat who specializes in open source legal issues, argues that absent this free software movement: “There would have been perhaps some sort of political void or cultural void that would have been filled by something else, or perhaps we would have seen a more vibrant community developed in the 1980s and 1990s around software licenses that prohibited commercial use… Because there was a kind of anti-commercial sentiment that Stallman was responding to, and the early enthusiasts around GPL-licensed software were responding to it, despite the fact that Stallman himself made clear that free software was compatible with commercialization. So that kind of positive view of commercialization of free software maybe was not inevitable or would have been limited to a permissive license context.”

Luis Villa, an open source lawyer and co-founder of Tidelift, adds that, more broadly, a lot of the historical focus on licenses also stems from the early days of free software. “For years, we said the licenses were the only acceptable way to legislate behavior. We said that we didn’t like codes of conduct. We said we didn’t like kicking people out of our communities. And so we left ourselves with only licensing as the tool. And I think to some extent, that’s an artifact of Richard [Stallman] and the FSF.”

What have we learned?

Beyond romping through history, asking these types of questions can help to provide insights we can apply more broadly.

An important one: Timing matters. Projects and technologies happen against a backdrop that changes over time in terms of technology, competition, and broader macro trends. As Mike Bursell, Red Hat chief security architect, puts it: “I think there were some lucky breaks. And maybe [open source] could have happened earlier in some ways. But I don’t think the timing is inevitable. I would like to think that human nature is such that we would have got there in the end. But that’s not the same as saying we’d have it now.”

Many aspects of collaboration and sharing pre-date what would become free software or open source. But the rise of open source as a development approach for enterprise software is relatively new. And, as open source makes an ever larger impact on the software world as a whole, it’s worth remembering that some aspects of collaboration can be fragile – especially as new software delivery models like public clouds upend business models and complicate vendor relationships. As Rob Hirschfeld, CEO of RackN, puts it: “Tragedy of the Commons is a very real thing. There’s sharing and collaborating around a common set of shared value components. That’s a very hard thing to maintain, especially with loose governance and loose rules, which is sort of inherent in open source. The idea that we’re gonna have multiple people profiting from a shared code base is very, very hard to sustain in a real way.”

Finally, many aspects of the environment in which these what-ifs played out existed in the form they did because of large incumbent vendors. A number of those vendors had the opportunity to parlay some of the inflection points central to open source’s rise to considerable advantage. Yet they didn’t, for reasons of organizational inertia and existing business models. Had one of them played a different hand and been less hamstrung by its culture, open source could have played out differently.

This article is based on a four-part miniseries that was part of the Innovate @Open podcast: Listen to or read the entire production. The podcast brings in many more guests and considers a number of other scenarios including commercialization, Microsoft’s role, and broader legal questions.

[ What do IT leaders say about how they're using open source? Download the Red Hat 2020 State of Enterprise Open Source Report .] 

Gordon Haff is Technology Evangelist at Red Hat where he works on product strategy, writes about trends and technologies, and is a frequent speaker at customer and industry events on topics including DevOps, IoT, cloud computing, containers, and next-generation application architectures.
