DVI and VGA video connections can be found on most TVs and computer monitors these days. The main difference between the two is the method of transmission: DVI carries a digital signal, while VGA carries an analogue one. DVI-D stands for "Digital Visual Interface-Digital," the digital-only variant of the connector, whose signalling is electrically compatible with HDMI.
The older VGA connector is a D-shaped plug (a D-subminiature) with 15 pins, while DVI uses a flatter, wider rectangular shape with up to 24 pins. DVI-I connectors also incorporate four extra contacts beside the main pin field, around the flat blade, which carry an analogue signal for backwards compatibility.
VGA is a purely analogue signal: the red, green and blue channels travel as continuously varying voltages on separate wires. DVI emerged in the late 1990s as one of the main digital video carriers in the computer industry; by digital, we mean the picture is transmitted as discrete bits over multiple paired signal lines. This is the fundamental difference between the two, and it drives the features each can offer.
DVI can carry higher resolutions than VGA: a single link handles up to about 1920 by 1200 at 60Hz, and a dual link roughly twice that bandwidth. VGA's nominal limit of 640 by 480 is something of a misnomer, as it refers only to the original standard; modern VGA cables and ports (often carrying SVGA or XGA signals) routinely reach 1600 by 1200 and beyond.
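The single-versus-dual-link limit above comes down to pixel-clock bandwidth: single-link DVI tops out at a 165MHz pixel clock, dual-link at twice that. A rough sketch of the arithmetic, using an assumed 1.12 blanking-overhead factor to approximate reduced-blanking timings (real values come from the VESA CVT formulas):

```python
# Rough check of whether a display mode fits within DVI's pixel-clock limits.
# The 1.12 blanking factor is an assumption approximating reduced-blanking
# timings; exact figures require the VESA CVT/GTF formulas.
SINGLE_LINK_MHZ = 165
DUAL_LINK_MHZ = 330

def pixel_clock_mhz(width, height, refresh_hz, blanking=1.12):
    """Approximate pixel clock (MHz) needed for a given mode."""
    return width * height * refresh_hz * blanking / 1e6

def fits_single_link(width, height, refresh_hz):
    return pixel_clock_mhz(width, height, refresh_hz) <= SINGLE_LINK_MHZ

for mode in [(1920, 1080, 60), (1920, 1200, 60), (2560, 1600, 60)]:
    mhz = pixel_clock_mhz(*mode)
    link = "single" if mhz <= SINGLE_LINK_MHZ else "dual"
    print(f"{mode[0]}x{mode[1]} @ {mode[2]}Hz ~ {mhz:.0f} MHz -> {link} link")
```

By this estimate 1920 by 1200 at 60Hz lands just under the single-link ceiling, while 2560 by 1600 needs a dual-link cable, which matches how those monitors shipped in practice.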
Most people can't tell the difference between the two standards when viewing video at like resolutions. However, DVI carries the picture digitally from end to end, while VGA requires digital-to-analogue conversion along the way. Refresh rates are a mixed picture: VGA on a CRT commonly runs at 75Hz or more, while single-link DVI at its maximum resolution is limited to 60Hz, though lower resolutions can refresh faster.
In the age of CRT monitors using tube technology, DVI held very little advantage over VGA. With LCD screens this has changed: flat panels are natively digital and typically run at a fixed 60Hz, so DVI suits them well and preserves the detail that VGA's digital-to-analogue-to-digital round trip can smear.
Price also separates the two, as DVI cables still cost roughly twice as much as their VGA counterparts. Since the quality gap, while measurable, isn't night and day for most people, VGA remains worth considering as a budget solution.