When discussing Human-Computer Interaction and User Experience, we generally focus on the Human/User side. This was not the case in the early days of computing; it is only in recent decades that technologists started focusing on the user. The first users were the same scientists and engineers who built the computers they were using, so the interface was something they could already understand.
On the other hand, one must not forget that how humans interact with computers depends on the form those computers take. In this blog post, I will explore how computers evolved over the past seven decades and how the user's importance grew accordingly. This should help us understand the origins of key technology we use today and help us plan our future technology from a solid perspective.
“Those who do not remember the past are condemned to repeat it.” George Santayana
In our book, The New Digital Natives, we explained how paradigm shifts help us understand how this new generation of users interacts with devices. This helps when planning and predicting the next step while understanding the latest trends. Based on our research, I am highlighting the main paradigm shifts practitioners need for their day-to-day design decisions.
1) Going Electrical, aka 1940s
Before the wartime machines of Bletchley Park, the world had not yet tasted the power of an electronic computer. Until those days during WW2, a computer was either a mechanical device or a human worker carrying out repetitive calculations. The codebreaking machines built there, Turing's electromechanical Bombe and later Colossus, the first programmable electronic computer, designed by Tommy Flowers, were crucial for the Allied forces to win the Second World War. The events that led to this work were recently dramatised in the movie ‘The Imitation Game‘.
These machines were built to decrypt messages transmitted by the German military. That was their purpose and design, and that was the ‘only’ task they could do. I am stating this for us to think about the relationship between the size of a machine and its capability.
When we look at these machines, we should keep in mind one fundamental UX perspective: Who were their users? Naturally, not the general public, but the very people who built them.
2) Computers and Businesses, aka 1950s
After the war, computers caught the imagination of many. In my opinion, the most defining opinion piece of the time came from Vannevar Bush in his article “As We May Think”, published in The Atlantic in July 1945. In this article, Bush gave outstanding insights into how information and computers go hand in hand. He also predicted the information age and how databases would reshape the world.
I strongly urge you to read this 71-year-old article today, in the age of Big Data and Data Science, and ask yourself when the thought revolution really took place.
The paradigm shift at this stage was: “How can this device that helped win a war help my business conquer my market?“. No significant changes were made to the architecture at this point, but the concept of Batch Processing was ushered in.
3) Wider Usage of Computers, aka 1960s
The Cold War was brewing and people were getting used to the idea that computers were here to stay. In the previous decade, businesses such as banks had made use of batch processing to improve their operations.
During this decade, there was a push towards exposing the general public to the use of computers. The most evident move was the launch of the first Automated Teller Machine (ATM).
This decade also saw the invention of the first mouse, by Douglas Engelbart, which made the earlier idea of a pointing device more accessible and easier to use.
Many argue that the most important development of this decade was the Advanced Research Projects Agency Network (ARPANET). This packet-switching network laid the foundation for what we today refer to as the Internet (the TCP/IP protocols it later standardised on were adopted in 1983). The motivation often attributed to this network was the need for an interstate network in the USA that could distribute data and keep it retrievable in the eventuality of a nuclear attack.
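To make the core idea concrete, here is a minimal sketch of my own (purely illustrative, not ARPANET code): in packet switching, a message is split into independently numbered packets that can travel separately and be reassembled in any arrival order.

```python
# Illustrative sketch of packet switching: split a message into numbered
# packets, then reassemble them even if they arrive out of order.

def packetize(message, size=8):
    """Split a message into (sequence_number, payload) packets."""
    count = (len(message) + size - 1) // size
    return [(i, message[i * size:(i + 1) * size]) for i in range(count)]

def reassemble(packets):
    """Rebuild the message by sorting packets on their sequence numbers."""
    return "".join(payload for _, payload in sorted(packets))

if __name__ == "__main__":
    packets = packetize("Packets can take different routes.")
    packets.reverse()  # simulate out-of-order arrival
    print(reassemble(packets))  # prints the original message
```

The sequence numbers are what make the network robust: no single route, and no fixed arrival order, is required for the message to survive.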
From a user’s perspective, there was nonetheless another paradigm shift that shocked many: the idea of Time Sharing, in other words the ability to use a machine while others are still using it. This started to phase batch processing out of many tasks. Batch processing required distinct skills, such as the planning and scheduling of jobs, and the importance of those skills diminished with the introduction of time-sharing capabilities. This did in fact trouble many who were trained in the 50s when they looked at the skills of those being trained in the new decade; the arguments were very similar to what we hear today about young people and mobile devices.
4) Enabling -much- Smaller Computers, aka 1970s
The relatively widespread use of computers ignited investment in technology that yielded smaller and cheaper machines. The first commercial microprocessor, the 4004, was released by Intel in 1971, and the company has been pioneering development in the field ever since.
Another groundbreaking event of this decade was the first handheld mobile phone call, made in 1973 by Martin Cooper of Motorola. This major episode could never have happened without the contributions of other engineers over the previous decades. A comprehensive history of mobile devices can be found in this Wikipedia article.
All this supported the earlier observation of Intel’s co-founder, Gordon Moore, that the number of transistors in a dense integrated circuit doubles approximately every two years. This is widely known as Moore’s Law.
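As a quick illustration of my own (the numbers are a back-of-the-envelope projection, not measured data beyond the 4004's roughly 2,300 transistors), the observation amounts to N(t) = N₀ · 2^((t − t₀)/2):

```python
# Illustrative sketch of Moore's Law: transistor counts doubling
# roughly every two years, starting from the Intel 4004 (1971),
# which had about 2,300 transistors.

def transistors(year, base_year=1971, base_count=2300):
    """Project a transistor count assuming a doubling every two years."""
    return base_count * 2 ** ((year - base_year) / 2)

if __name__ == "__main__":
    for year in (1971, 1981, 1991, 2001):
        print(year, round(transistors(year)))
```

Ten years means five doublings, a factor of 32; it is this compounding, rather than any single invention, that turned room-sized machines into pocket-sized ones.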
This decade provided the building blocks that ushered in the creation of the popular computers in the years to come.
5) Availability of Personal Computers, aka 1980s
The concept of a general-purpose personal computer started with Olivetti’s Programma 101 machine in the 1960s, although its dependency on programming kept its relevance confined to the scientific community. Other efforts, such as the Xerox Star computer, started shaping the user-friendly computer we know today.
With computers being used in different commercial applications and engineering developments making smaller machines a reality, the next step was becoming ever more apparent.
Technology entrepreneurs such as Bill Gates and Steve Jobs seized this emerging opportunity and started building systems and machines specifically designed for less technical users. On separate paths and through different sorts of deals they built their companies, Microsoft and Apple respectively, which would shape the subsequent decades.
6) Web connecting us all, aka 1990s
Following the adoption of the Internet protocols (TCP/IP) in the 1980s, Berners-Lee’s vision of a ‘web of information’ became possible. While working at CERN, in August 1991 he posted instructions on how to access the first web pages, which allowed for primitive navigation.
The computing landscape of the early 1990s was mainly dominated by Microsoft’s Windows operating system, version 3.x, followed by the significant advancement released as Windows 95. The vision of a wide distribution of computers at the personal and commercial level was being achieved.
Meanwhile, Netscape took advantage of the nascent WWW and released its browser, Netscape Navigator, in 1994, building on the earlier Mosaic browser. Netscape experienced astonishing growth in its early days and motivated Microsoft to start its Internet Explorer initiative.
This all resulted in the astonishing Browser Wars that characterised the decade. This war, and the reference is no exaggeration, had its casualties: namely Netscape themselves, but also Microsoft and Bill Gates himself. The antitrust court case dragged on until the end of the decade, and many argue that it knocked Microsoft off the pace in the technological race. I really recommend watching the exciting documentary The True Story of the Internet: Browser Wars, in which the talented journalist John Heilemann walks us through the story by interviewing the main protagonists of this episode.
Meanwhile, while the tech giants of the decade were battling it out in the main arena, the WWW provided vast opportunities for new emerging companies that built online applications, namely Google, Yahoo, Amazon and others. These companies went on to dominate the forthcoming decades.
7) Mobile Information, aka 2000s
Mobile devices became more affordable and available during the early days of this decade. As the graph below demonstrates, this decade witnessed an exponential increase in mobile telephone subscriptions. Phones in those days were only capable of making telephone calls and sending/receiving SMS text messages. By this time, people were used to the notion of sending electronic messages from computers in the form of e-mails. SMS text messages were different, allowing only 160 characters, but they had an outstanding feature…mobility. I personally recall groups of people enjoying themselves texting and chatting with others while at different locations. This was the early kindling of mobile information.
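As an aside I am adding here (not in the original narrative): that 160-character limit was no arbitrary choice. An SMS payload is 140 octets, and the default GSM alphabet packs each character into 7 bits instead of a full byte, so 140 × 8 ÷ 7 = 160.

```python
# Where the SMS 160-character limit comes from:
# the payload is 140 octets, and the default GSM 7-bit alphabet
# packs each character into 7 bits rather than a full byte.

PAYLOAD_OCTETS = 140
BITS_PER_CHAR = 7  # GSM 7-bit default alphabet

max_chars = PAYLOAD_OCTETS * 8 // BITS_PER_CHAR
print(max_chars)  # 160
```

A small encoding trick, in other words, squeezed an extra 20 characters out of every message.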
Smartphones, in the popular form of PDAs, were available in this decade, but they practically needed a stylus for the user to carry out significant tasks. Apple’s first iPhone, released in 2007, was practically the first mobile device with an interface that allowed for multi-touch interaction without the need of a stylus. Google’s Android followed, and together these new operating systems brought to light a new era of mobile application development. The decade closed with more available and usable mobile devices that linked Internet connectivity and mobile information.
Meanwhile, throughout the decade, Internet connectivity became even more available, with dramatic improvements in speed and reliability. The New Digital Natives were born in this decade, and for them everything that sealed off this decade is set as the baseline, the de-facto standard for what is expected of technology…mobile and instant information.
8) We want contextual Information and we want it Now, aka 2010s
This decade kicked off with the birth of a new generation of devices…tablet computers. Apple, once again, managed to outrun its competitors with the release of the first iPad. This was followed by other similar devices that made this newly-shaped technology more widely available. Tablets proved convenient for education and work-related applications such as reading and reviewing documents.
During the early days of this decade we also started witnessing the rise in popularity of Cloud Computing. We now take it for granted, yet widespread cloud storage only gained pace a few years ago. It brought to light a new generation of interaction with devices, ranging from desktop computers to mobile.
Cloud Computing was pioneered by Amazon through its AWS platform and was soon followed by Google’s Cloud and Microsoft’s Azure platforms. By 2015, these were widely available and used by many industries in applications ranging from data processing to image processing.
One of the most revolutionary factors of this decade is undoubtedly the Internet of Things. IoT will probably also shape the coming decades, with byproducts such as the Physical Web and other contextual information applications.
This is probably the only entry whose conclusion is actually an introduction to other posts. Like anything else, computing is the product of an evolutionary process. Next time, before thinking that we are really using high-tech devices, remember that we are still typing on a derivative of the typewriter and pointing with a device that materialised in the 1960s, all packaged in cases designed in the 1970s and 1980s.
Above all, these decades have shown us how they are all interconnected and how one leads to the next. My next posts will cover a variety of topics that can guide you towards contributing to a new paradigm shift.