A video card (also referred to as a graphics accelerator card, display adapter, graphics card, and numerous other terms) is an item of personal computer hardware whose function is to generate and output images to a display.
The term is usually used to refer to a separate, dedicated expansion card that is plugged into a slot on the computer’s motherboard, as opposed to a graphics controller integrated into the motherboard chipset.
Some video cards offer added functions, such as video capture, a TV tuner adapter, MPEG-2 and MPEG-4 decoding, or even FireWire, mouse, light pen or joystick connectors.
Video cards are not used exclusively in Intel-based PCs; they have been used in devices such as the Commodore Amiga (connected via the Zorro II and Zorro III slots), Apple II, Apple Macintosh, Atari Mega ST/TT (attached to the MegaBus or VME interface), Spectravideo SVI-328, MSX and, of course, video game consoles.
Graphics processing unit (GPU)
A GPU is a dedicated graphics microprocessor. Its purpose is to offload work from the CPU, and it is therefore optimized for floating-point computation, which is very common in 3D functions. Most of the information provided in a video card specification refers to GPU attributes, which shows the GPU's importance among the components of the video card. The main attributes of the GPU are the core clock rate, which in 2006 ranged from 250 MHz to 650 MHz, and the number of pipelines (vertex and fragment shaders), whose purpose is to translate a 3D image formed from vertices and lines into a 2D image formed from pixels.
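The relationship between the two main attributes above can be sketched as a simple back-of-the-envelope calculation: a GPU's theoretical pixel fill rate is its core clock multiplied by the number of pixel pipelines. The function name and the sample figures below are illustrative, not taken from any specific card, though they fall within the 2006 range quoted above.

```python
# Illustrative sketch: theoretical pixel fill rate from core clock and
# pixel-pipeline count. Real-world throughput is lower due to memory
# bandwidth and shader workload, so treat this as an upper bound.

def pixel_fill_rate(core_clock_mhz: float, pixel_pipelines: int) -> float:
    """Theoretical fill rate in megapixels per second."""
    return core_clock_mhz * pixel_pipelines

# A hypothetical 500 MHz GPU with 16 pixel pipelines:
print(pixel_fill_rate(500, 16))  # 8000.0 megapixels/s (8 gigapixels/s)
```

This is why a card with a modest clock but many pipelines can outperform a higher-clocked card with fewer of them.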
The video BIOS or firmware chip contains the basic program that governs the video card's operations and provides the instructions that allow the computer and software to interface with the card. It holds information on the memory timing, the operating speeds and voltages of the processor and RAM, and other details. It is possible to re-flash the BIOS (for example, to enable factory-locked settings for higher performance), although this is typically only done by video card overclockers and has the potential to irreversibly damage the card.
The RAMDAC (Random Access Memory Digital-to-Analog Converter) is responsible for turning the digital signals produced by the computer processor into an analog signal that the computer display can understand. Depending on the number of bits used and the RAMDAC data transfer rate, the converter will be able to support different display refresh rates. With CRT displays, it is best to work above 75 Hz and never below 60 Hz, in order to minimise flicker. (With LCD displays, flicker is not a problem.) Due to the growing popularity of digital computer displays and the migration of some of the RAMDAC's functions to the motherboard, the RAMDAC is slowly disappearing.
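The link between RAMDAC speed and supported refresh rates can be sketched roughly: the maximum refresh rate is approximately the RAMDAC's pixel clock divided by the number of pixels it must send per frame, including the blanking intervals between lines and frames. The function name and the ~25% blanking overhead below are assumptions for illustration, not figures from the text.

```python
# Rough sketch (assumed model): maximum CRT refresh rate a RAMDAC can
# drive at a given resolution. The 1.25 blanking factor approximates the
# extra time spent outside the visible area; real timings vary by mode.

def max_refresh_hz(ramdac_mhz: float, width: int, height: int,
                   blanking_overhead: float = 1.25) -> float:
    pixels_per_frame = width * height * blanking_overhead
    return (ramdac_mhz * 1e6) / pixels_per_frame

# A hypothetical 400 MHz RAMDAC at 1600x1200 comfortably exceeds
# the 75 Hz target mentioned above:
print(round(max_refresh_hz(400, 1600, 1200)))  # 167
```

This is why high resolutions demand fast RAMDACs: doubling the pixel count halves the achievable refresh rate at a given pixel clock.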
Figure 8. SVGA, S-Video and DVI outputs
The most common connection systems between the video card and the computer display are:
- SVGA: Analog standard from the late 1980s, designed for CRT displays. Some problems of this standard are electrical noise, image distortion and sampling errors when evaluating pixels.
- DVI: Designed for digital displays such as LCD displays and video projectors. It avoids image distortion and electrical noise by mapping each pixel from the computer to a display pixel, using the display's native resolution.
- S-Video: Included to allow connections to DVD players, video recorders and video game consoles.
Other connection systems are:
- Composite video: Analog system with very low resolution. It uses an RCA connector.
- Component video: Uses three cables, each with an RCA connector (YCbCr); it is used in projectors.
- HDMI: Digital technology released in 2003, intended to replace all of the others.