Wednesday 12 October 2011

DVI & VGA - What's the Difference?


You've probably come across both of these connections before, on televisions, monitors and home entertainment systems. But what are the differences between DVI and VGA cables that account for the noticeable differences in video quality?

VGA, or Video Graphics Array, has been around since the late 1980s, when it was introduced by IBM to connect monitors to PCs. DVI, or Digital Visual Interface, is newer and is more commonly used with modern flat-panel displays and other higher-quality digital equipment.

The differences between the two are not widely understood. VGA carries an analogue signal: the graphics card converts its digital output to analogue, sends it down the cable, and the display then converts it back into a digital format it can show. The drawback of this double conversion is that the signal no longer maps exactly onto the display's individual pixels, so some quality can be lost depending on screen size, resolution, and how well the signal's timing is synchronised with the display's physical properties.
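To make that pixel problem concrete, here is a minimal Python sketch (purely illustrative, not how any real graphics driver works) of what happens when the display re-samples the analogue waveform slightly out of phase with the original pixels: sharp edges smear into their neighbours.

```python
# Toy model of VGA's digital -> analogue -> digital round trip.
# The "DAC" turns pixel values into a smooth waveform; the "ADC"
# re-samples it, possibly with a small sync (phase) error.

def dac(pixels):
    """Pretend DAC: return the analogue waveform as a function of position x."""
    def waveform(x):
        x = max(0.0, min(x, len(pixels) - 1))   # clamp to the signal
        i = min(int(x), len(pixels) - 2)
        frac = x - i
        # Linear interpolation stands in for analogue smoothing.
        return pixels[i] * (1 - frac) + pixels[i + 1] * frac
    return waveform

def adc(waveform, count, phase_error=0.0):
    """Pretend display ADC: sample once per pixel, with an optional phase error."""
    return [round(waveform(i + phase_error)) for i in range(count)]

original = [0, 0, 255, 255, 0, 0, 255, 0]        # a sharp test pattern
in_sync  = adc(dac(original), len(original), phase_error=0.0)
skewed   = adc(dac(original), len(original), phase_error=0.4)

print("original :", original)
print("in sync  :", in_sync)   # identical to the original
print("0.4px off:", skewed)    # edges bleed into neighbouring pixels
```

With a perfectly synchronised clock the re-sampled values match the originals exactly; with even a 0.4-pixel offset the crisp 0/255 edges turn into intermediate grey values, which is the softness people notice on VGA at higher resolutions.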

DVI provides a faster and simpler path for this data, making it a much more reliable way of driving a display. Although several types of DVI are available, they are designed to be compatible with one another so as to avoid setting different standards for different displays.
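As a quick illustration of that compatibility, the variant names below are the real ones (DVI-D, DVI-A, DVI-I), while the little helper function is just a hypothetical sketch of which signal types two connectors can share.

```python
# DVI-D carries digital only, DVI-A analogue only, and DVI-I carries
# both, which is what keeps the variants interoperable.

SIGNALS = {
    "DVI-D": {"digital"},
    "DVI-A": {"analogue"},
    "DVI-I": {"digital", "analogue"},
}

def common_signal(source, display):
    """Return the signal types both ends can use, if any."""
    return SIGNALS[source] & SIGNALS[display]

print(common_signal("DVI-I", "DVI-D"))  # {'digital'}  - works digitally
print(common_signal("DVI-I", "DVI-A"))  # {'analogue'} - works via analogue
print(common_signal("DVI-D", "DVI-A"))  # set()        - no shared signal
```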

DVI-D is the basic all-digital format. The signal leaves the DVI graphics card in digital form and stays digital all the way to the display, so no analogue conversion takes place at either end. Because that conversion step is skipped, the picture maps directly onto the display's individual pixels, which is why the end result is noticeably better than VGA.
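For a sense of the numbers involved, here is a rough back-of-the-envelope sketch. The 165 MHz single-link pixel clock limit is the figure from the DVI specification; the flat 20% blanking allowance is my own simplification rather than a real timing formula, so treat the results as estimates only.

```python
# Rough check of whether a display mode fits within single-link DVI's
# 165 MHz pixel clock, using an assumed 20% blanking overhead.

SINGLE_LINK_LIMIT_HZ = 165_000_000   # single-link DVI pixel clock ceiling
BLANKING_OVERHEAD = 1.20             # crude allowance for blanking intervals

def approx_pixel_clock(width, height, refresh_hz):
    """Estimate the pixel clock a mode needs, in Hz."""
    return width * height * refresh_hz * BLANKING_OVERHEAD

for width, height, refresh in [(1280, 1024, 60), (1920, 1080, 60), (2560, 1600, 60)]:
    clock = approx_pixel_clock(width, height, refresh)
    verdict = "fits single-link DVI" if clock <= SINGLE_LINK_LIMIT_HZ else "needs dual-link DVI"
    print(f"{width}x{height}@{refresh}Hz ~ {clock / 1e6:.0f} MHz -> {verdict}")
```

Running this shows why everyday resolutions such as 1280x1024 and 1920x1080 at 60 Hz sit comfortably inside a single link, while very high resolutions like 2560x1600 need dual-link DVI.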

