Retrieved from https://studentshare.org/information-technology/1447740-from-the-history-of-pc-monitor
PC Monitor

Monitors, also referred to as output devices, have come a long way since the early 1970s, when blinking green monitors were used with text-based computer systems. Computer monitors are often packaged separately from other computer components. The monitor offers instantaneous feedback, displaying text or graphic images as one works or plays. Typically, desktop monitors make use of a cathode ray tube (CRT). Laptops, on the other hand, come with the output device firmly affixed to the rest of the unit.
Laptops primarily use flat-panel technologies such as liquid crystal display (LCD), light-emitting diode (LED) and gas plasma. Because of their slimmer design and lower energy consumption, LCD monitors are rapidly replacing the venerable CRT (Allan 124). This paper will examine the history of monitors and explain how video cards have altered the needs for monitors. Advances in monitor technology have facilitated the introduction and adoption of superior monitors, and IBM made some of the most significant modifications.
In the 1970s, monitors relied on text-reading technologies that were essential for deciphering the contents of text-based computers. In the course of a decade, IBM made substantial changes to monitors, and in 1981 the company introduced the first color graphics adapter (CGA). This adapter could display four colors at a maximum resolution of 320 x 200 pixels, that is, 320 pixels on the horizontal axis and 200 pixels on the vertical axis. In 1984, IBM made another change and launched the enhanced graphics adapter (EGA) display, which allowed a maximum of 16 colors and an enhanced resolution of 640 x 350 pixels.
This meant that the new monitor had better display quality and appearance, which allowed for easier reading of text. Later, in 1987, IBM launched the video graphics array (VGA) display model, a display system still used in monitors today. IBM subsequently introduced the extended graphics array (XGA) monitor (Allan 175). This advanced model offered true color (16.8 million colors) at 800 x 600 pixel resolution, plus an additional 1,024 x 768 pixel mode with 65,536 different colors. Currently, most monitors use the ultra extended graphics array (UXGA) standard, which can support 16.8 million colors at resolutions of 1,600 x 1,200 pixels, depending on the video memory contained in the graphics card within the computer. Prior to the inception of DOS, the main operating system for 8-bit computers was the control program for microprocessors (CP/M). It originally utilized various memory-mapped video display devices, along with discrete keyboards plugged into the machines. This technology is akin to that of the video display cards that appeared much later. The most common video display module was the VDM-1.
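The link between resolution, color count, and the video memory mentioned above can be made concrete with a small arithmetic sketch. The display standards and figures come from the text; the helper function itself is my own illustration, not part of any historical API:

```python
import math

def framebuffer_bytes(width, height, colors):
    """Minimum framebuffer size in bytes for a width x height display
    supporting the given number of simultaneous colors."""
    # Each pixel needs enough bits to index one of `colors` values:
    # 4 colors -> 2 bits, 65,536 colors -> 16 bits, 16.8M "true color" -> 24 bits.
    bits_per_pixel = math.ceil(math.log2(colors))
    return width * height * bits_per_pixel // 8

# CGA: 320 x 200 at 4 colors (2 bits/pixel)
print(framebuffer_bytes(320, 200, 4))            # 16,000 bytes
# XGA mode: 1,024 x 768 at 65,536 colors (16 bits/pixel)
print(framebuffer_bytes(1024, 768, 65_536))      # 1,572,864 bytes (~1.5 MB)
# UXGA: 1,600 x 1,200 in true color (24 bits/pixel)
print(framebuffer_bytes(1600, 1200, 16_777_216)) # 5,760,000 bytes (~5.5 MB)
```

The sketch shows why graphics-card memory dictates the available modes: a UXGA true-color screen demands roughly 360 times the framebuffer of a CGA screen.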
Manufacturers realized that this market was untapped and started marketing mini- and mainframe-style terminals to the CP/M community. Sales were astounding, and CP/M computers came to fully incorporate terminals. Apple II computers, as well as early game machines (like those manufactured by Nintendo and Atari), plugged into a monitor rather than a terminal. In the Apple II, the keyboard was manufactured as part of the system; therefore, the only thing missing after plugging in the computer was a monitor.
Unlike terminals, these monitors looked similar to television sets, but without the tuner. In some instances, the monitors were, in fact, television sets. For instance, early computers such as Commodore's VIC-20, 64 and 128 could use an ordinary TV as a display through a special RF adapter, which hooked to the TV antenna. Later, IBM introduced the PC-DOS computers that coined the terms "3-piecers" and "4-piecers," as the computers had three distinct components: the monitor, a CPU box and the keyboard.
Ironically, the monitors in these computers plugged directly into the computer, much like the display-device connections of the earliest personal computers. The new monitors made use of video cards, typically the IBM color graphics adapter (CGA), the IBM monochrome display adapter (MDA), or the Hercules card, the first third-party add-on video card (Ernest 189). When PCs were first introduced, the only purpose of a video card was to display images or text on the screen. The amount of memory on these video cards was quite minimal, as little was needed.
However, today's video cards do more than merely display an image; they also assist the processor in rendering graphics, accelerating the display of images on screen. This became especially essential after the introduction of 3D games. The processing power needed to render the images on the monitor was simply too much for a CPU to manage alone, so games ran for long stretches at extremely slow speeds. Video cards have therefore been built with their own instruction logic, adding features such as textures, fog effects, lighting effects and bump mapping to give a more detailed picture of superior quality.
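A rough back-of-the-envelope calculation suggests why rendering overwhelmed the CPU alone. The resolution and frame-rate figures below are my own assumptions for illustration, not values from the text:

```python
def pixels_per_second(width, height, fps):
    """Pixels that must be drawn every second to sustain a target frame rate."""
    return width * height * fps

# An assumed 3D game at 1,024 x 768 targeting 60 frames per second:
rate = pixels_per_second(1024, 768, 60)
print(f"{rate:,} pixels per second")  # 47,185,920 pixels per second
```

Even at only a handful of operations per pixel, this amounts to hundreds of millions of operations per second for drawing alone, which is why moving texturing and lighting into the video card's own logic raised frame rates so dramatically.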
In addition, the superior speeds of video cards have significantly reduced the problem of frame-rate dropping (Kent 413).

Works Cited

Allan, R. A. A History of the Personal Computer: The People and the Technology. Michigan: Allan Publishing, 2001. Print.

Ernest, D. Measurement Systems. New York: McGraw Hill Professional, 2003. Print.

Kent, S. L. The Ultimate History of Video Games. New York: Three Rivers Press, 2001. Print.