Rise Of The Machines: A Brief History Of Computers

Information technology has touched almost all parts of life in the 21st century. We use technology to read the news, play music, share photos, get weather updates, book travel, and more. This chapter explores the rapid growth of information technology.

Learning Objectives

You should be able to:

  • Broadly explain major information technology advances
  • Discuss how information technology changes influence life today, including:
    • Mainframes
    • Computer networks
    • Personal computers
    • Smartphones
    • Internet of Things (IoT)
    • Web 1.0
    • Web 2.0
    • Generative Artificial Intelligence

In the Beginning: 1800s

People have been interested in teaching machines to compute for centuries. In the early 1800s, Charles Babbage was fascinated by thinking machines and is credited with creating one of the first mechanical computers. Ada Lovelace became acquainted with Babbage and she had the foresight to see that computers could be used for much more than simple arithmetic. She developed sophisticated algorithms that could be carried out on these mechanical computers. Today, Ada Lovelace is credited as the first computer programmer.

Ada Lovelace, public domain image from wikipedia.org/wiki/Ada_Lovelace

Over the next 200 years, thousands of technological improvements made our modern computing infrastructure possible.

First Modern Computers: 1930s-1950s

The first modern computers occupied entire floors or buildings. They were expensive and difficult to use. Advances in operating systems, programming languages, and interfaces gradually made these machines easier to work with.

  • 1936: Alan Turing proposed a "universal computing machine" upon which all modern computers are based. Advances in hardware and software over the following decades made computers dramatically more powerful and easier to use. Alan Turing was one of the key scientists who helped crack the Nazi Enigma encryption machine (as shown in the movie The Imitation Game, which you should definitely watch).
  • 1939: The first basic calculator was released. It did not do a whole lot, but it helped out the arithmetically challenged. Below is an example of an early mechanical calculator.
    Mechanical Calculator
  • 1947: The computer keyboard was created. Prior to keyboards, computers had to be configured by connecting cables or by feeding in punchcards. Below is a sample punchcard used to program computers.
    Punchcard
  • 1950s: Digital computers and central mainframes were deployed. Prior to digital computers, analog computers were common. But, analog computers were difficult to use and error-prone. Mainframe computers centralize computing on a few large, powerful devices. Client computers connect to the central mainframe.
    Mainframe

Start of Interconnected Computing: 1960s

At this point in history, computer hardware was still large and expensive. Governments and companies invested heavily in research despite not knowing exactly where the inventions in hardware and communication technologies would go.

  • 1960s: ARPANET (precursor to the internet) was developed. Initially, only universities and governmental agencies were connected. The lessons learned in ARPANET led to the internet.
  • 1963: The computer mouse was invented. The mouse is a device we take for granted today, but it was a major milestone in computer usability.
    Mouse
  • 1965: Fiber optic data transmission was first performed. Fiber optic cable would eventually be deployed around the globe. Fiber optic cable is the backbone of the internet. When new long-distance network cable is laid, it is virtually always fiber optic.
    Fiber

The Personal Computer Revolution: 1970s-1980s

Before the 1970s, few individuals owned computers. But manufacturers made hardware smaller, faster, and cheaper. By the early 1980s, a personal computer was in reach for much of the developed world.

  • 1977: Apple released the Apple II, one of the first mass-market personal computers. It shipped fully assembled with a keyboard and color graphics, making computers far easier for novices to adopt.
    Apple II
    Credit: Ruslan Gilmanshin - stock.adobe.com
  • 1982: Rear Admiral Grace Hopper gave a presentation titled, "Future Possibilities: Data, Hardware, Software, and People." You can watch part 1 and part 2 online. She was a pioneer in digital computing. Her perspectives on the history of computing are fascinating, and her predictions of the future of computing proved correct.
  • 1985: Microsoft released Windows 1.0. This was Microsoft's first foray into the graphical operating system segment. Try an emulated version of Windows 1.01 using this link.
  • 1989: The World Wide Web was invented. Use this link to examine the first website. People began using web browsers to access the World Wide Web. At the beginning of the internet, there were many popular protocols that let people interact. But, the World Wide Web caught on so much that "the web" and "the internet" almost became synonymous. Later, this version of the web would be named Web 1.0. Contributing content to Web 1.0 required significant technical skills. Web 1.0 was largely read-only. Experts created the pages, and customers viewed the content.

The Internet Revolution: 1990s-2000s

In this period, the World Wide Web (often used as a synonym for the internet) gained wide adoption. Broadband internet rolled out to many parts of the world. The dot-com bubble burst in 2000, but adoption of the internet and online services continued unabated.

  • 1990: Microsoft released Windows 3.0--its easiest-to-use operating system to date. It achieved widespread adoption in home and business markets. Try an emulated version of Windows 3.1 online using this link.
  • 1991: Linus Torvalds started the Linux project. Billions of Linux systems have been deployed since its initial release.
  • 1995: Microsoft released Windows 95. It was so popular that people lined up around the block to be among the first to buy it. The hype for Windows 95 was so intense that some people bought it without realizing that they needed a computer to run it. Try Windows 95 here.
    Windows 95
    Credit: pbombaert - stock.adobe.com
  • 1997: The first version of Wi-Fi was released. People began to see the value of connecting to a network without dragging around a network cable. Initial Wi-Fi speeds were slow by today's standards, but since 1997, Wi-Fi speeds have increased to match wired connections in many instances.
    WiFi
  • 2000: LG created the first internet-connected refrigerator. Connecting different types of devices to the internet is known as the Internet of Things. The number of "smart" devices would increase drastically after the year 2000.
  • 2004: Facebook launched. Facebook and other Web 2.0 companies made it easier for people to add content instead of just viewing content. Web 2.0 is sometimes referred to as the "read/write" web (a short illustration of the difference follows this list).
    Facebook
    Credit: eskay lim - stock.adobe.com
  • 2005: YouTube launched its video-sharing service. Video content could be uploaded and shared with the world. Google acquired YouTube in 2006.
  • 2006: Amazon launched its Amazon Web Services (AWS) cloud computing service. Google, IBM, Microsoft, Oracle, and other large players would compete with their own cloud computing platforms. With cloud computing, organizations rent computing capacity from service providers rather than buying and hosting expensive equipment in-house.
    Cloud Computing
  • 2007: Apple released the original iPhone. Smartphones were not so smart before the iPhone. The iPhone form factor has been copied endlessly since its release.
    iPhone
    Credit: chathuporn - stock.adobe.com
  • 2009: The Bitcoin network launched, creating the first decentralized digital currency.
    Bitcoin
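
To make the Web 1.0 and Web 2.0 distinction from the 2004 entry above more concrete, below is a minimal sketch in Python. It contrasts reading a published page (the read-only, Web 1.0 style) with submitting user-generated content over the same protocol, HTTP (the read/write, Web 2.0 style). The URLs and the comments endpoint are hypothetical placeholders rather than real services, and the requests library must be installed for the script to run.

    # Minimal sketch: reading a page vs. contributing content over HTTP.
    # The URLs below are hypothetical placeholders, not real services.
    import requests

    # Web 1.0 style: the browser only reads pages that publishers created.
    page = requests.get("https://example.com/index.html")
    print("Read-only request returned status", page.status_code)

    # Web 2.0 style: the same protocol also carries content created by
    # ordinary users, such as a comment submitted to a social platform.
    comment = {"user": "ada", "text": "Great post!"}
    response = requests.post("https://example.com/comments", json=comment)
    print("User-contributed request returned status", response.status_code)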

Ubiquitous Technology: 2010s-2020s

By this point in history, computing touched nearly every aspect of our lives. It became nearly impossible to shop, bank, travel, or learn without information technology.

  • 2013: The website jimmarquardson.com was registered. Literally dozens of people across the world would be impacted, some for the better.
  • 2015: The Ethereum network went live. The Ethereum network demonstrated how blockchain technology and smart contracts could be used not just for cryptocurrency, but to establish a digital trail of ownership.
    Smart Contracts
  • 2016: The Oculus Rift virtual reality headset launched. The dream of virtual environments (the metaverse?) and augmented reality felt closer than ever.
    Oculus Rift
  • 2019: Google announced that it had achieved quantum computing supremacy, meaning that for certain problems, a quantum computer could find a solution faster than any traditional computer. Quantum computers solve certain problems very well, but they will never replace traditional computers. Quantum computing may become an important complement to traditional computers. Below is a quantum computer showcased by IBM at the Consumer Electronics Show (CES) in 2020.
    IBM Quantum
    Credit: AA+W - stock.adobe.com
  • 2019: 5G cellular networks began rolling out, promising higher bandwidth for applications like video streaming. Data caps still suck.
    5G
  • 2019-2020: The COVID-19 pandemic forced many organizations to move operations online. The work-from-home revolution was kicked into high gear. Educators made more content available through online services.
    Work from Home
  • 2021: The first Bitcoin exchange-traded fund (ETF) was approved by the United States Securities and Exchange Commission (SEC). While many regulators lambasted Bitcoin, the approval of the Bitcoin ETF signaled increasing acceptance of digital currencies by the financial sector.
  • 2022: OpenAI released ChatGPT, bringing generative artificial intelligence into mainstream use.
    Chat AI
  • 2023-2024: The artificial intelligence race continues. AI dominates tech news. In annual reports, companies forecast increasing use of AI to generate revenue and cut costs.
    AI

What's next?

Nobody knows for sure what is next. Clearly, artificial intelligence will continue to improve. Researchers continue the quest for artificial general intelligence, which could be the largest disruption in human history. Blockchain technology could become more integrated with things in the physical world. Autonomous driving could achieve widespread adoption. It is easy to see what exists now and predict that we'll have better versions of those same things in the near future, but true disruption is hard to predict.

Exercise

List all of the things you have done today. For each event, describe any information technology and how you used it. Examples could include listening to the radio, checking email, or texting somebody--all of which use information technology in some way.

Reflection

  • Has improved information technology come with any negatives? If so, what?
  • What will be the major information technologies that will shape our society in the next 50 years?

Key Terms

  • Mainframes: Powerful, large-scale computers used primarily by large organizations for critical applications, bulk data processing, and large-scale transaction processing.
  • Computer networks: A collection of interconnected devices that communicate with each other to share resources and information, such as the internet, local area networks (LANs), and wide area networks (WANs).
  • Personal computers: General-purpose computers designed for individual use, typically consisting of a desktop or laptop, used for tasks such as word processing, internet browsing, and gaming.
  • Smartphones: Mobile devices that combine cellular communication capabilities with advanced computing features, including internet access, touchscreens, and a wide range of applications.
  • Internet of Things (IoT): A network of physical objects embedded with sensors, software, and other technologies to connect and exchange data with other devices and systems over the internet.
  • Web 1.0: The first generation of the World Wide Web, characterized by static web pages and limited user interaction, primarily focused on information dissemination.
  • Web 2.0: The second generation of the World Wide Web, emphasizing user-generated content, usability, and interoperability, leading to the rise of social media, blogs, and collaborative platforms.
  • Cryptocurrency: A digital or virtual currency that uses cryptography for security. It operates independently of a central authority or government and is typically based on blockchain technology, which ensures transparency and immutability of transactions. Examples include Bitcoin and Ethereum.
  • Generative Artificial Intelligence: A subset of AI that focuses on creating new content, such as text, images, or music, using machine learning models. Examples include GPT-3 for text generation and GANs (Generative Adversarial Networks) for image creation.