Information Technology: A Brief History
These days, it’s hard to imagine life without computers. Access to the World Wide Web has revolutionized our lives, influencing the way we communicate, bank, shop and work. Do you ever wonder how we got here?
The computer wasn’t initially created for the communication and shopping purposes of today. Instead, it was designed to solve a serious number-crunching dilemma. By 1880, a growing population forced the U.S. government to find a faster way to gather and process census results. This necessity gave rise to punch-card-based computers so large that they took up entire rooms.
Fast forward to 2018. We carry thousands of times more computing power on our smartphones than what was available in those early models. The following timeline details how computers evolved from their simple beginnings to the powerful machines of today. Read on as the experts at FX Technology look back on milestones in the evolution of IT.
Computer Evolution Timeline
1822: Charles Babbage, a mathematician, designed a calculating machine, the Difference Engine, that could compute tables of numbers. Although the project was funded by the English government, it ultimately failed. While this early mechanical machine doesn’t resemble what we’d consider a computer today, it was a major advancement at the time.
1911: The Computing-Tabulating-Recording Company, the forerunner to IBM, was founded. As a result of a thriving U.S. economy, a new enthusiasm for tracking information unfolded.
Several small companies were merged, including time clock businesses and computing scale manufacturers. In 1924, the company changed its name to International Business Machines.
1936: The mathematician Alan Turing presented the idea of a universal computing machine, later called the Turing machine. His ideas formed the central concept of the modern computer.
1941: John Vincent Atanasoff and his graduate student, Clifford Berry, built the first electronic digital computer, a machine that could solve 29 equations simultaneously. It was also the first computer able to store information in its main memory.
1944: Harvard University and IBM completed the Mark I, the first large-scale automatic digital computer in the United States. It was programmed using punched paper tape.
1945: John von Neumann published the first discussion of computer architecture and stored programming, “First Draft of a Report on the EDVAC.”
1971: This was a landmark year in the world of technology. Steve Jobs and Steve Wozniak met, beginning the partnership that would later create Apple Computer. Computer engineer Ray Tomlinson wrote the software that would become email. Sharp Corporation introduced one of the first pocket calculators, and IBM engineers invented the floppy disk, which allowed data to be shared among computers.
1973: Just two years later, Ethernet was born from a memo written by Xerox researcher Bob Metcalfe. It allowed multiple computers and other hardware to be connected over a local network.
1975: A technology timeline is not complete without a mention of Bill Gates. In 1975, he contacted Micro Instrumentation and Telemetry Systems (MITS) about its Altair 8800 computer, offering a version of the BASIC programming language that he and Paul Allen had written for it. Their BASIC interpreter became very popular, and Microsoft was born.
1976: Steve Wozniak and Steve Jobs launched the Apple I, one of the world’s earliest personal computers (PCs) for home use.
1981: Apple’s success prompted IBM to join the personal computer race. The company introduced its first personal computer, code-named “Acorn.” Running Microsoft’s MS-DOS operating system, the Acorn had an Intel chip, two floppy disk drives, and an optional color monitor.
1990: Tim Berners-Lee published “Information Management: A Proposal” at CERN, the European research organization. In this document he proposed a global hypertext system, which led to HyperText Markup Language (HTML), the fundamental structure of the World Wide Web as we know it today.
1993: The World Wide Web was made available free of charge. The public could now log on!
1999: The term Wi-Fi entered the computing vocabulary as users began to connect to the internet wirelessly, opening many possibilities for the future.
2001: Apple released the Mac OS X operating system, which introduced a more reliable and user-friendly platform on which multiple applications could run efficiently at the same time. Meanwhile, Microsoft rolled out Windows XP. The 2000s also saw the birth of smart handheld devices, later called smartphones, that put internet access at everyone’s fingertips. This, once again, changed the trajectory of information technology and computer development.
2007: Apple launched the first iPhone.
To stay atop the wave of technological advances shaping the future of the IT industry, it helps to understand its past. This brief history was brought to you by the highly trained professionals at FX Technology.
Contact FX Technology
We can help you improve your business by updating your technology and finding useful IT solutions. Whether you are interested in IT management, VoIP services, or backup services and cloud storage, our team is here for you. To schedule an IT audit or a consultation, contact us online or give us a call at 417-895-9223!