Video cards generate and regulate the images displayed on computer monitors. They can also drive liquid crystal displays (LCDs), high-definition televisions (HDTVs), and projectors. Video cards are differentiated by the combination of display standards they use to render information, including display resolution, measured in pixels; color depth, measured in bits; and refresh rate, measured in hertz (Hz).
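To make these standards concrete, a back-of-the-envelope calculation shows how resolution, color depth, and refresh rate combine to determine the memory and bandwidth a card must provide. This is an illustrative sketch only; real cards add overhead for multiple buffers, textures, and compression.

```python
def framebuffer_bytes(width, height, color_depth_bits):
    """Memory needed to hold one frame at the given resolution and color depth."""
    return width * height * color_depth_bits // 8

def scanout_bandwidth(width, height, color_depth_bits, refresh_hz):
    """Bytes per second needed to redraw the full screen at the refresh rate."""
    return framebuffer_bytes(width, height, color_depth_bits) * refresh_hz

# A 1920x1080 display with 32-bit color needs about 8.3 MB per frame;
# at a 60 Hz refresh rate, roughly 498 MB of pixel data moves each second.
frame = framebuffer_bytes(1920, 1080, 32)
per_second = scanout_bandwidth(1920, 1080, 32, 60)
print(frame)       # 8294400 bytes
print(per_second)  # 497664000 bytes per second
```

Higher resolutions, deeper color, and faster refresh rates each multiply these figures, which is why more demanding displays require more capable video cards.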
Video card functionality is typically integrated into a computer’s motherboard. A dedicated graphics card or video card can also be installed as an expansion card that slides into a specific slot on the motherboard. Video cards come with their own processors, which are designed to handle the graphical calculations necessary to run video, simulations, or animations.
Types of Video Cards
There are many different types of video cards. A graphics accelerator board is configured to render complex two-dimensional (2D) or three-dimensional (3D) graphics. A graphics controller typically features dedicated SDRAM video memory and support for high-resolution displays. Three-dimensional graphics accelerator boards, or 3D video cards, are frequently used in computer-aided design (CAD) applications in architecture, engineering, and medical imaging.
Video cards for more demanding applications such as games, video editing, and animation differ in how they process and deliver image data. Rendering cards used for 3D multimedia or video are usually optimized for precision, while gaming cards are usually optimized for performance. High-end graphics cards and controllers are often referred to as graphics processing units (GPUs) or visual processing units (VPUs). Some specialty multi-display graphics cards are built to render information on more than one monitor. These video cards are useful in military, medical, or scientific imaging applications.
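The precision-versus-performance trade-off can be illustrated in miniature with floating-point arithmetic: faster, narrower number formats such as 32-bit floats accumulate more rounding error than the wider 64-bit formats favored where accuracy matters. This toy sketch runs on the CPU, not on a video card, and uses Python's `struct` module to emulate 32-bit rounding; it is an illustration of the numeric trade-off, not actual GPU code.

```python
import struct

def f32(x):
    """Round a Python float to single (32-bit) precision by packing and
    unpacking it as a 4-byte IEEE 754 float."""
    return struct.unpack('f', struct.pack('f', x))[0]

# Accumulate 0.1 one hundred thousand times in each precision.
total32 = 0.0
for _ in range(100_000):
    total32 = f32(total32 + f32(0.1))  # every step rounded to 32 bits

total64 = 0.0
for _ in range(100_000):
    total64 += 0.1  # Python floats are 64-bit (double precision)

exact = 10_000.0
print(abs(total32 - exact))  # noticeably larger error
print(abs(total64 - exact))  # far smaller error
```

The 32-bit accumulation drifts visibly from the exact answer while the 64-bit version stays very close, which mirrors why precision-sensitive rendering work favors wider formats even at a performance cost.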