The concept of information technology began when the Egyptians and Sumerians created the world's earliest forms of writing more than 5,000 years ago. Since that time, things have obviously changed quite a bit, including how information is collected, stored, and disseminated. Modern inventions and discoveries – such as electricity, semiconductors, and computers – simplified the process and led to the emergence of information technology, or IT, as we know it today. And in the last two decades even more advances have arrived that are quickly changing the face of IT and how it relates to consumers.
In the future, companies that understand how IT has become “consumerized,” and how to effectively use that trend to their advantage, will be positioned for success in this new paradigm.
Before examining how IT has become consumerized, it is important to define some of the key terms, beginning with “information technology.”
Defining information technology
The term isn't very old: it was first coined by Harold J. Leavitt and Thomas L. Whisler in a 1958 edition of Harvard Business Review to describe the nascent Computer Age and how chips and circuits would eventually replace paper files and notebooks. The definition of IT that Leavitt and Whisler created, which still largely holds, consisted of three categories: techniques for processing information, the application of statistical methods to decision-making, and the simulation of higher-order thinking through computer programs. This definition covers a great deal, but it never considered how consumers, apart from companies or governments, would use IT.
As IT became more ubiquitous in modern society, average consumers eventually became the drivers of technology innovation, which led to the creation of the term "consumerization of IT." The consumerization of IT can be defined as the reorientation of IT to focus on the end user (the average consumer), in contrast to organization-oriented IT. Because this new approach to IT is more "bottom up" than "top down," it has also been described as the "democratization of IT," though it remains to be seen how far this trend will go.
An examination of the history of IT, as well as its trend toward consumerization, can reveal some of the implications companies and consumers can expect in the coming years. Both companies and consumers stand to save money on IT costs and to gain better protection of data. Perhaps most important for the average consumer, control over personal data has grown as IT has become more democratized.
A long way from hieroglyphs and cuneiform
When the Egyptians and Sumerians invented the hieroglyphic and cuneiform scripts, respectively, around 3100 BC, information technology was officially born. For much of the ancient world, storing information meant papyrus or clay tablets, with clumsy counting tables used for math. Eventually, the invention of the abacus made computing quicker and more accurate, and the invention of paper made storing information more practical and efficient, but even by the early 1900s AD the IT world was still a long way from what it is today.
IT as we know it today, and the consumerization that would follow, is truly a product of the Computer Age. Notable landmarks in the IT revolution include the introduction of the first hard disk drive in 1956, database management systems in the 1960s, and the mass adoption of email and the World Wide Web in the 1990s. As those technologies became common, so too did the term "IT," as the underlying technology became available to an ever larger segment of the population.
The consumerization of IT
There's no doubt that IT and the related field of information and communication technologies (ICT) play a large role in the modern global economy and will continue to do so in the coming years. Global IT spending is estimated at more than $5.3 trillion annually, with the United States accounting for nearly 35% of the total. The numbers demonstrate that IT is big business, but the best-positioned businesses realize that since the early 2000s IT has become a consumer-driven field.
As technology became more affordable and accessible in the early 2000s, the drivers of IT trends shifted from companies and governments to individuals and employees. Workers began introducing technologies and devices they were familiar with as consumers into their workplaces, reversing the old pattern in which new technologies were introduced at the enterprise level and then slowly trickled down to consumers. There are many examples of this trend, but perhaps the best one is the iPhone.
Mass consumer adoption of the iPhone in the late 2000s and early 2010s led to corporate policies that allowed the use of personal devices to access business networks. Other, similar examples include the earlier adoption by IT departments of consumer IT services such as email and more recent developments such as the adoption of social media for business. The common denominator in all of these examples is the driving force – the consumer.
By far the biggest driving force behind the consumerization of IT has been employees acting as informed consumers. These employees research new technologies before making purchases, and once they've mastered a technology they introduce it into their workplaces.
Other factors driving the consumerization of IT include the move to remote work, the prevalence of social media, and bring-your-own-device (BYOD) policies at many companies. BYOD is itself a product of consumerization: as companies see that their employees are at the forefront of tech innovation, they realize that BYOD is a cost-efficient way to take advantage of the trend.
Experts agree that the consumerization of IT will continue to play a bigger role in how work is done, bringing with it many benefits for companies, employees, and consumers. The trend toward remote work, which increased during the COVID-19 pandemic, appears to be a permanent part of the business landscape that will continue to influence the consumerization of IT. But it’s important to realize that as consumerization of IT increases in the U.S. and other industrialized countries, it’s also taking place in developing countries. Latin America and less developed parts of Asia are expected to see major pushes in the consumerization of IT in the coming years as tech becomes more affordable and accessible for millions of people in those regions.
As the consumerization of IT expands throughout the world, companies, employees, and consumers can expect to reap many benefits from the trend. The biggest benefit to companies will be the savings that come from doing less training and providing fewer materials for their employees. Many employees use their own devices in this paradigm, and even those who use company devices will be familiar with the technology and will therefore require less training.
The continuing consumerization of IT will also bring a plethora of benefits for employees and consumers, groups which, as you've already seen, are often synonymous. Employees will continue to enjoy better working conditions, as they will use devices and technology that are familiar to them, making their jobs easier. Likewise, consumers can expect lower prices alongside better technology that is easier to use and more accessible. And as IT firms work to improve cybersecurity in this new decentralized paradigm of work, consumers can expect their everyday devices and software to become more secure against hacks and breaches.
Although it's often overlooked, the consumerization of IT has been one of the major tech trends of the last twenty years. Once a top-down industry, IT is now in the hands of the consumer, who can expect a multitude of benefits as consumerization continues to grow globally. Companies that understand this trend, particularly how it relates to their employees and customers, will also benefit and will likely take the lead among the tech firms of the future.