
The Evolution of Computer Hardware and Software

For a computer to function, it requires some kind of program, or software. The hardware provides the capability to perform computing tasks; the software is the set of instructions that humans create to tell the computer what to do. Hardware and software are two distinct concepts today, but that distinction wasn't always as apparent as it is now. The first mechanical computers, beginning with Charles Babbage's Difference Engine of 1822, were entirely hardware-based, and the earliest electronic machines were programmed by setting switches and plugging in wires. Until the late 1940s, hardware and software were essentially the same thing, and to reprogram a computer meant to rewire it, either partially or entirely. Computers themselves were not always as ubiquitous as they are now. Originally, they were very difficult to use, very large, and too expensive for anyone except governments and universities to own.

Alan Turing laid out the first theory of computer software in his 1936 paper 'On Computable Numbers, with an Application to the Entscheidungsproblem'. The first software algorithm, however, was written for Babbage's Analytical Engine in 1842-43 by Augusta Ada King, Countess of Lovelace, better known as Ada Lovelace. The goal of her algorithm, which existed as a set of notes, was to direct the Analytical Engine to calculate Bernoulli numbers. Lovelace also speculated that the Analytical Engine could perform tasks beyond the intent of its design, stating in her notes that it could carry out more general tasks, such as composing music.
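
Lovelace's notes were written for a machine that was never completed, but the task itself translates readily into a modern language. The sketch below, in C, computes the same numbers using the standard Bernoulli recurrence; it is only an illustrative modern rendering, not Lovelace's original procedure.

    #include <stdio.h>

    /* Binomial coefficient C(n, k), computed multiplicatively in floating point. */
    static double binomial(int n, int k) {
        double result = 1.0;
        for (int i = 1; i <= k; i++) {
            result = result * (n - k + i) / i;
        }
        return result;
    }

    int main(void) {
        enum { N = 10 };     /* how many Bernoulli numbers to compute */
        double B[N + 1];

        /* Recurrence: B(0) = 1, and for m >= 1,
           B(m) = -1/(m+1) * sum over k < m of C(m+1, k) * B(k). */
        B[0] = 1.0;
        for (int m = 1; m <= N; m++) {
            double sum = 0.0;
            for (int k = 0; k < m; k++) {
                sum += binomial(m + 1, k) * B[k];
            }
            B[m] = -sum / (m + 1);
        }

        for (int m = 0; m <= N; m++) {
            printf("B(%d) = %+.6f\n", m, B[m]);
        }
        return 0;
    }

Running it prints B(0) = 1, B(1) = -0.5, B(2) = 0.166667, and so on: the sequence Lovelace set out to have the Analytical Engine produce.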

In 1945, John von Neumann described a digital computer design built around an electronic processor with an arithmetic logic unit, processor and instruction registers, a memory that holds both data and program instructions, a program counter, external storage, and a means of receiving input and producing output. This concept became the blueprint for modern programmable computers. The first computer capable of running software, or computer programs, was the Z3, invented in 1941 by the German civil engineer Konrad Zuse. The Colossus, first designed in 1943, was another programmable computer, which the Allies used to decrypt German military codes. The direct predecessor of the modern computer, however, was the Manchester Baby of 1948, which fulfilled the requirements of the von Neumann architecture: it stored its program code in memory and was a general-purpose machine. The Manchester Baby, in turn, inspired the Ferranti Mark 1, the first general-purpose computer available to the commercial world. Future computers would continue to follow the von Neumann architecture, enabling the growth of the software industry as a distinct entity from the hardware industry.
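
The essential feature of this design is that a single memory holds both the program and the data it operates on, with a program counter stepping through the instructions one at a time. The sketch below, in C, simulates a toy machine in that style; the opcodes and memory layout are invented purely for illustration and do not correspond to any historical instruction set.

    #include <stdio.h>

    /* A toy stored-program machine: one memory array holds both
       instructions and data, a program counter steps through it, and a
       single accumulator plays the role of the arithmetic unit. */
    enum { HALT = 0, LOAD = 1, ADD = 2, STORE = 3 };

    int main(void) {
        /* Program: load mem[10], add mem[11], store the result in mem[12], halt.
           Instructions are encoded as opcode/operand pairs. */
        int memory[16] = {
            LOAD, 10,    /* addresses 0-1 */
            ADD,  11,    /* addresses 2-3 */
            STORE, 12,   /* addresses 4-5 */
            HALT, 0,     /* addresses 6-7 */
            0, 0,        /* addresses 8-9: unused */
            7, 35, 0,    /* addresses 10-12: the data */
            0, 0, 0
        };

        int pc = 0;          /* program counter */
        int accumulator = 0; /* arithmetic register */

        for (;;) {
            int opcode  = memory[pc];
            int operand = memory[pc + 1];
            pc += 2;                          /* fetch, then advance */

            if (opcode == HALT)       break;
            else if (opcode == LOAD)  accumulator = memory[operand];
            else if (opcode == ADD)   accumulator += memory[operand];
            else if (opcode == STORE) memory[operand] = accumulator;
        }

        printf("mem[12] = %d\n", memory[12]);  /* prints 42 */
        return 0;
    }

Running it adds the two values stored at addresses 10 and 11 and leaves the result at address 12, the same fetch-decode-execute cycle that stored-program machines still follow today.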

Vacuum tube-based computers dominated until the mid-1950s, when they began to be replaced by fully transistor-based computers, the first of which was the Harwell CADET, which first ran in 1955. Transistor technology, in turn, evolved into integrated circuits and microprocessors, which shrank computers and, from the 1960s onward, increased their computing power by orders of magnitude. Microcomputers came into existence in the 1970s, ushering in the age of the personal computer and bringing computers into homes. The first generation of personal computers started with the Apple II, Commodore PET, and TRS-80, all introduced in 1977, which were followed in 1981 by IBM's PC desktop system. Other manufacturers soon built machines around the same Intel processors, giving rise to IBM PC 'clones'. These clones typically came with a modular design that enabled consumers to add and change components within the system. This upgradeability enabled IBM-compatible PCs to rapidly achieve prominence in the home and business computer market. At the same time, other desktop systems and workstations arose, such as computers made by DEC, Sun, and SGI, mainly for use by businesses. Mobile computing took off in the 1980s with the Osborne 1 portable computer, the predecessor of the modern laptop, followed by the IBM ThinkPad tablet in 1992. Personal digital assistants (PDAs) and smartphones flourished in the early 2000s, making it possible for consumers to take computers wherever they went.

All computer software depends on, and is created with, some form of programming language. Early programming languages were machine-dependent. These machine languages demanded significant expertise and training, which made software development highly expensive. New languages that were human-readable and easier to manage followed, such as FORTRAN in 1957 and COBOL in 1959. These languages required a compiler to translate them into machine-readable code and were designed for developing software for the military, the scientific community, and businesses. When personal computers entered the market, simpler, interpreted languages such as BASIC made software coding a task that even home computer users could master. Interpreted languages did not need a separate compilation step, which made programs easy to debug and quick to get running, even though they generally executed more slowly than compiled code. Another important language was LOGO, developed in 1967 to help children get involved in programming. Among the most prevalent languages today are C and C++, in which much modern software is written.
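
The difference between the two approaches is easier to see with a small example. A compiled language such as C is translated into machine code ahead of time (for instance with 'cc program.c -o program') and then run, whereas an interpreter reads source text and executes it on the spot. The toy interpreter below, written in C, mimics the immediate, line-at-a-time style of early BASIC; its three commands are invented purely for illustration.

    #include <stdio.h>
    #include <string.h>

    /* A toy interpreter in the spirit of early BASIC: it reads one
       human-readable command per line and executes it immediately,
       with no separate compilation step. */
    int main(void) {
        char line[128];
        double value = 0.0;   /* a single variable, like a tiny accumulator */

        while (fgets(line, sizeof line, stdin) != NULL) {
            double operand;

            if (sscanf(line, "LET %lf", &operand) == 1) {
                value = operand;                 /* LET 5   -> value = 5 */
            } else if (sscanf(line, "ADD %lf", &operand) == 1) {
                value += operand;                /* ADD 2.5 -> value += 2.5 */
            } else if (strncmp(line, "PRINT", 5) == 0) {
                printf("%g\n", value);           /* PRINT   -> show the value */
            } else if (strncmp(line, "END", 3) == 0) {
                break;                           /* END     -> stop */
            } else {
                fprintf(stderr, "unknown command: %s", line);
            }
        }
        return 0;
    }

Typing LET 5, ADD 2.5, and PRINT on separate lines prints 7.5 immediately, with no compile step in between.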

Early forms of software were bundled with the computers they were written for, meaning that to get the software they wanted, customers also had to buy the hardware it ran on. The antitrust ruling in the Digidyne v. Data General case, which the Supreme Court let stand in 1985, effectively put an end to this, while the free software movement arose in the same decade, led by pioneers such as Richard Stallman, founder of the GNU Project. This resulted in an open-source subculture in which programmers freely shared the source code for their software, ensuring its propagation and use across a broad range of emerging and established computer systems. At the same time, commercial software flourished in the form of operating systems, business and scientific software, and games. Some of the most notable software to come out of the 1980s and 1990s includes the free and open-source Linux operating system and the commercial Microsoft Windows family of operating systems. Web browser software emerged in the 1990s, bringing the Internet to the masses, and video games had a renaissance on home computers. In the 2000s, operating systems such as Apple's iOS and Google's Android flourished on mobile devices, and programs known as apps became commonplace.

Computers and software are still evolving. Scientists are currently working on quantum computers, which use quantum bits, or qubits, instead of binary digits. For certain kinds of problems, these computers may be capable of running millions of times faster than the computers of today. Software design for this emerging class of computer is still in its infancy, and it relies on principles of quantum physics that can be difficult to grasp. Companies like IBM and Microsoft, however, are already working to make quantum computer software development possible for mainstream users.
