The history of computers, while relatively brief in terms of years spanned, is incredibly complex and eventful. Technological advances have come at a blinding pace, from the original mainframes that literally weighed tons to the tiny notebooks that weigh less than a gallon of milk today, just a half-century later. Five main generations have delineated the advancement of computers: the mainframe, the minicomputer, the personal computer, the supercomputer, and finally the notebook computer.
The first of these categories is the mainframe, which originated in the 1950s. The UNIVAC, the first commercially produced mainframe, ran on vacuum tubes, cost over a million dollars, and was physically imposing. During these years mainframes filled entire rooms and were within the price range only of huge corporations or the government. In those days, the only computers were mainframes -- there was no other category of computers to compare them against. These central processing systems performed mainly huge data-processing tasks, like tabulating census numbers or monitoring massive volumes of financial transactions. As smaller computers were introduced in the form of the "minicomputer," the functions of mainframes became more and more specialized; since mainframes could run ("host") many different operations at the same time, they became the de facto servers for operations that required simultaneous functions.
However, with the advent of minicomputers (which weren't so small themselves), the demand for mainframes dropped significantly. Most minicomputers could perform the same type of hosting operations for which mainframes were used, at a lower cost and at similar or equal speeds. This drop in demand was so sharp, in fact, that most companies in the mainframe business were forced out, leaving IBM as the giant of the industry. IBM still controls over 90% of the mainframe business today, largely because of this 1970s decline in demand and the failure of other companies to maintain their mainframe businesses.
IBM retained its mainframe platforms, and while mini (and later personal) computers became easier to use as servers, IBM developed mainframes into incredibly reliable processing systems. Today, mainframes are used for very high-volume data-processing tasks that cannot ordinarily be disrupted by routine maintenance or system outages, like online reservation systems that must be accessible 24 hours a day, along with the huge storage tasks tied to those processing duties. Reliability is now the mainframe's most valuable asset: maintenance and repairs can be performed without any service outage by routing operations to another system. Mainframes have evolved from the only type of computer in existence into a specialized, efficient niche of the market.
When mainframes were introduced, they were simply known as "computers." There was no need to differentiate between types of computers, since there was only one type to identify. With the advent of a second type of processing system, the term "mainframe" came into use, while the new, smaller processing units were termed "minicomputers." The first minicomputer debuted in 1964, and the category became a middle ground between the powerful mainframes and the quickly proliferating personal computers, which were better suited to a single user.
Minicomputers could run multi-user operations that did not require all of the computing power of a mainframe, and they were more cost-efficient for smaller businesses wanting to incorporate computing technology into their operations. Minicomputers are rarely used today; they have been rendered nearly obsolete by ever more powerful microcomputers that can perform all of their server functions at a fraction of the cost and space. The increased effectiveness and affordability of mainframes also contributed to the decline in demand for minicomputers, as did networking among microcomputer, or personal computer, systems, which reduced the need for the mid-size server capabilities that minicomputers provided.
The generation of computers that helped render the minicomputer almost obsolete is the personal computer, or microcomputer. This is the unit that almost every college student possesses, that sits on almost every office desk, and that accounts for much of the computing boom among individual users during the 1990s. Microcomputers reached the mass market in the 1970s; the TV Typewriter, introduced in 1973, allowed users to display alphanumeric characters on a television screen and store small amounts of data on cassette tapes. Microcomputers were not simply "scaled-down" versions of mainframes, in the manner of minicomputers, but were instead aimed at individual users and small businesses.
This new format of computing held advantages for these smaller consumers rather than being aimed at large-scale users like the government or huge corporations. While microcomputers did not perform the same tasks as mainframes, they were never intended to do so. Instead, they were aimed at bringing basic computing applications, like information storage, graphics, and mathematical functions, to the individual user. In catering to this market, microcomputers deliberately omitted the capabilities of mainframes and minicomputers, such as large-scale processing, performing several operations simultaneously, or storing huge amounts of data. Instead, they performed the tasks needed by individuals in a more affordable, smaller package.
While microcomputers filled the growing demand from individual users, the large-scale computing device was also developing. Incredibly advanced systems evolved to meet the demands of large organizations and governments for faster processing, more storage, and more complex functions; these processors are known today as supercomputers, and they are constantly changing. A supercomputer, by definition, leads the world in processing speed at the time of its introduction.
Supercomputers serve as high-capacity calculation tools; they are used to model constantly changing phenomena like weather patterns and to handle complex, variable problems like genetic research. The military uses them to simulate nuclear weapons tests and war games, and airplane manufacturers use them to judge the effects of wind on their aircraft, among many other complex applications. They are not generally applicable to the individual user or even a small business; they are cutting-edge technology, and their price, maintenance, and technical support reflect that. They tend to require large amounts of cooling, something that even an average-sized corporation needing large amounts of processing would not have the technical or physical capacity to handle. An example of a specialized supercomputer in recent years is the chess expert Deep Blue, built expressly to play chess at the highest level. Of course, other specialized supercomputers serve more practical functions than entertainment.
Supercomputers are the absolute fastest, highest-performing machines that technology has to offer. However, they are expensive and difficult to maintain, and they cannot be used for general purposes like home computing. Even using a supercomputer for the tasks of a mainframe is, in most cases, technological overkill. Supercomputers occupy a highly specialized niche in the computing market; they serve an important and advanced purpose, but they are of no immediate use to the vast majority of people.
The personal computing market has continued to evolve in the form of notebook computers. Commonly known as laptops, these computers are reduced in size from the desktop models discussed above but not reduced in computing power. They are marketed to individual users for a variety of purposes -- some are…