Information technology is no more

(Image source: datwyler.com) 

As many do during the summer, I took a break and did some reading. To my amazement, the term “IT”, or information technology, kept appearing in much of the literature I read. I started to wonder whether we are doing ourselves any favors by still insisting on this term. I tried to answer that question for myself and have come to the following realization: 

The term information technology has its origin in a world where computers were largely absent – it was first mentioned in a 1958 Harvard Business Review article. The article singled out three distinct elements: “technique of processing large amounts of information rapidly”, “statistical and mathematical methods to decision-making problems” and, finally, “the simulation of higher-order thinking through computer programs”. It went on to explain that information technology would have its greatest impact on middle and top management. To be very clear, the article was not wrong; given that its outlook was aimed at the 1980s, it was in fact spot on. 
But it was also during the 1980s and early 90s that we saw a change in the computing paradigm. We moved away from centralized mainframe computers, which were accessible only to a few, toward personal computers that would soon be found on everyone’s desk. In the early 90s, the introduction of corporate computer networks gave rise to corporate email, and business information systems finally became accessible to almost everyone in the company with a personal computer. 
This was also the time when many larger companies started to build their own business information systems and mission-critical software applications. Others adapted existing commercial software where it was available. 

But then something changed in the early 2000s. Coinciding with the advent of the Internet, for the first time ever more than 50% of US households had a personal computer at home. This started the trend of the consumerization of IT – or rather, if I had my way, the consumerization of computing technology – which allowed the use of personal devices for work. The trend was further accelerated by the adoption of laptops and, finally, by the introduction of smartphones and tablets. This computing evolution had been foreshadowed as “ubiquitous computing”, a term coined by Mark Weiser of Xerox PARC in 1988. An important aspect of this new form of computing is that it is accessible to anyone, regardless of skill or know-how. Weiser explained it this way: “we are trying to conceive a new way of thinking about computers in the world, one that takes into account the natural human environment and allows the computers themselves to vanish into the background”. Smartphones have indeed fulfilled many of these promises. Long gone are the days when one had to learn cryptic commands for a command-line interface. With smartphones and tablets, the need to understand the underlying technology is gone – the computers have “vanished into the background”.

The next step in this evolution is “contextual awareness”, or “context-aware computing”, as defined in the 1994 paper by Bill Schilit, Norman Adams, and Roy Want. It stipulates that computers or software applications can “sense” the context in which the human is interacting and adapt or reconfigure themselves accordingly. We have started to see several applications that try to “sense” the right context and provide the user with relevant information; for example, by knowing your location or the time, your device can alert you to changes in your local environment without you even asking. Smartphones can also detect when, for example, you’re driving or shopping (based on changes in geolocation); soon our phones will suggest the proper credit card to use in a particular store. This provides another entry into a “relatively” new computing paradigm: “AI”, or artificial intelligence. The story of AI is one of many failures throughout its history, with little success until recently. It dates back to the famed computer scientist Alan Turing, who in 1950 described the requirements that would need to be met for an artificial intelligence to be indistinguishable from a human being – what would later become known as the “Turing test”. AI has been used in business since the early 90s but has only recently found a much broader footing because of increases in capacity, bandwidth, and computing power. In fact, today several companies have started to create chips specifically for AI applications. 
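To make the idea of context-aware computing a bit more concrete, here is a minimal, purely illustrative sketch in Python. The context fields, the rules, and the suggest_action function are all hypothetical and not taken from the Schilit, Adams, and Want paper; the point is simply that the application senses its context (speed, location category, time of day) and adapts its behavior without the user asking.

```python
from dataclasses import dataclass
from datetime import datetime


@dataclass
class Context:
    """A snapshot of the user's situation, as sensed by the device (hypothetical fields)."""
    location_category: str   # e.g. "grocery_store", "gas_station", "office"
    speed_kmh: float         # from GPS; sustained high speed suggests driving
    timestamp: datetime


def suggest_action(ctx: Context) -> str:
    """Adapt the app's behavior to the sensed context using illustrative rules."""
    if ctx.speed_kmh > 60:
        # User appears to be driving: reduce distractions.
        return "Enable do-not-disturb and read messages aloud."
    if ctx.location_category == "grocery_store":
        # User appears to be shopping: surface the most suitable card.
        return "Suggest the grocery-rewards credit card at checkout."
    if ctx.timestamp.hour >= 22:
        # Late evening: tone the device down.
        return "Dim the screen and silence non-urgent alerts."
    return "No context-specific adaptation."


if __name__ == "__main__":
    ctx = Context("grocery_store", speed_kmh=0.0, timestamp=datetime.now())
    print(suggest_action(ctx))
```

Real context-aware systems infer context from many more signals (sensors, calendars, usage history) and often with learned models rather than hand-written rules, but the principle is the same: sense, then adapt.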

To try to summarize all these changes, one could put it this way: the shifting computing paradigm that we have lived through over the last 50 years implies something else as well. Today’s computing applications that we use for business activities no longer fit (and have not fit for a long time) the definition of “IT” or “information technology” – computing has become much more than that, and much more relevant to every aspect of our lives.

In fact, I would go so far as to say that the term “IT” is now a limiting factor for businesses, because technology is boxed in (literally) and only “brought into” the discussion whenever the topic of computing arises. Thinking in terms of an “IT” organization limits a company’s understanding of computing and muddies the waters when it comes to determining accountability. For example, to what extent does the product manager define the technology used if we refer to the product manager as “the business”, when in fact the product is technology? Or who owns an “API” (application programming interface), which is used like a product but is so technical that it requires someone who understands the technology? Is that a technology product management function or a business product management function? As long as companies distinguish between technology and business, there can never be clarity on ownership and the greater purpose. To drill even further into this point: nothing today happens in our lives or in commercial activities without computing. Everything we do, and I really mean everything, requires some form of computing. This, in fact, has been the largest paradigm shift of the last 50 years. 

And in many industries, what we used to call IT has become the business. Take financial services, where almost everything is now done via technology. We are seeing this trend in other industries as well. A colleague of mine argued that in current car manufacturing, technology or “IT” plays only a part of the value chain. But I would argue that at Tesla and other car startups, the computing part is now significantly larger than at any other car manufacturer. Tesla is creating its own AI chips, writing its own software for the car, and designing its own supercomputer to run simulations – computing is intrinsically linked to its business. It’s often said that Tesla cars are computers on wheels. Even car companies can no longer afford for IT to be IT, because computing is becoming more and more central to their business and their cars. 

Companies, especially in industries being disrupted, would do well to understand this, because it is foundational for topics such as digital transformation and agility, and it may very well decide whether a company survives or stagnates and dies.
