DVI vs VGA

DVI - Information travels digitally from your video card to your monitor. The color of each pixel is calculated by the video card and sent to the monitor as digital data, so no conversion is necessary. An LCD monitor simply reads this information and displays it directly.

VGA - Information is converted from digital data to an analog [red,green,blue] signal. Some accuracy and time are lost in this conversion, and how much is lost depends on the monitor's conversion hardware.
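To make that loss concrete, here is a minimal Python sketch of the digital-to-analog-to-digital round trip an 8-bit pixel value takes on its way to an LCD over VGA. The 0.7 V full-scale swing is the standard VGA video level; the 3 mV of noise is purely an assumed stand-in for cable and circuit imperfections.

    import random

    def dac(value, full_scale=0.7):
        # Video card's DAC: 8-bit pixel value -> analog voltage (VGA uses ~0.7 V full scale).
        return value / 255 * full_scale

    def adc(voltage, full_scale=0.7):
        # Monitor's ADC: sample the voltage back into an 8-bit value, clamped to range.
        code = round(voltage / full_scale * 255)
        return max(0, min(255, code))

    random.seed(1)
    errors = sum(
        adc(dac(value) + random.gauss(0, 0.003)) != value  # assumed ~3 mV of noise
        for value in range(256)
    )
    print(f"{errors} of 256 pixel values changed after the D->A->D round trip")

With one 8-bit step being only about 2.7 mV, even that small amount of noise flips a large share of the values; over DVI the same bits arrive untouched.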

Image Quality:

On a CRT monitor, there is no real image quality difference between DVI and VGA. This is because a CRT natively works with the analog [red,green,blue] format to display each pixel.

On an LCD, you will notice a difference between the two connections if you look hard enough. Different LCDs handle the conversion differently. You may start to see dithering, banding, "dancing pixels", and blander or incorrect colors when using VGA on an LCD. The larger the LCD and the higher the resolution, the more you will notice these differences.
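Banding in particular falls straight out of quantization math. The sketch below assumes, purely for illustration, a VGA path whose analog front end only resolves about 6 effective bits: a smooth 256-step gradient collapses into 64 visible bands.

    # 1000 samples across a smooth black-to-white ramp.
    ramp = [x / 999 for x in range(1000)]

    dvi_shades = {round(v * 255) for v in ramp}     # digital passthrough keeps all 8 bits
    vga_shades = {round(v * 63) * 4 for v in ramp}  # assumed 6-bit effective analog path

    print(len(dvi_shades), "distinct shades over DVI")
    print(len(vga_shades), "distinct shades over the assumed VGA path (visible bands)")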

DVI also has a faster data transfer rate, which means that the higher the resolution, the worse the input lag becomes over VGA. This matters a lot if you play fast-paced (twitch) shooter games.
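To put rough numbers on that, here is some back-of-the-envelope pixel-clock math. The 25% blanking overhead is an assumed typical figure; 165 MHz is single-link DVI's maximum TMDS clock.

    # Required pixel clock grows with resolution and refresh rate.
    def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=0.25):
        return width * height * refresh_hz * (1 + blanking_overhead) / 1e6

    for width, height, refresh in [(1280, 1024, 60), (1600, 1200, 60), (2560, 1600, 60)]:
        clk = pixel_clock_mhz(width, height, refresh)
        verdict = "fits" if clk <= 165 else "exceeds"
        print(f"{width}x{height}@{refresh}Hz needs ~{clk:.0f} MHz ({verdict} single-link DVI)")

Resolutions past the 165 MHz line are why dual-link DVI exists; over VGA those same pixels have to ride one increasingly strained analog signal.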

Finally, VGA carries only the color information for your monitor's image; DVI carries more than that. That's why, when you connect using DVI, you don't have to adjust your monitor's image position, phase, and clock to sync up: the signal describes exactly how and what your video card wants to display.

If you hook up your LCD with VGA, you will notice that several monitor adjustments become available where they were not under DVI. That is because DVI carries all the information your monitor needs to configure itself, whereas VGA does not.
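As a toy illustration of why those phase and clock adjustments exist, the simulation below models one scanline of alternating black/white pixels as a band-limited sine wave (an assumed stand-in for cable smoothing). Sampling at the right phase recovers the pixels exactly; sampling a quarter pixel off washes the contrast out into blur.

    import math

    def analog_scanline(t):
        # Alternating 255/0 pixels after band-limiting, modeled (as an
        # assumption) by the fundamental sine of the pattern.
        return 127.5 * (1 + math.sin(math.pi * t))

    def sample(phase_error):
        # The monitor tries to sample at each pixel's center (n + 0.5).
        return [round(analog_scanline(n + 0.5 + phase_error)) for n in range(4)]

    print("correct phase:   ", sample(0.0))   # [255, 0, 255, 0] - exact
    print("off by 1/4 pixel:", sample(0.25))  # [218, 37, 218, 37] - blurred

A DVI link never faces this problem, because pixel boundaries are delimited by the digital protocol itself rather than guessed from an analog waveform.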

And the list goes on...
