What is a DVI?

R. Kayne

Digital Visual Interface (also expanded as Digital Video Interface), better known as DVI, is a video interface standard. It was established by the Digital Display Working Group (DDWG) to maximize the resolution of digital flat panel displays. Video Graphics Array (VGA), DVI's predecessor, is an analog technology designed for cathode ray tube (CRT) monitors.

Why the switch to DVI? Graphics cards generate digital signals that pass through a Random Access Memory Digital-to-Analog Converter (RAMDAC), where they are converted to analog signals to accommodate CRT monitors. The conversion from digital to analog "slurs" the signal in minute ways, resulting in a picture that is not as sharp as a purely digital path would produce. This explains why, as analog CRT monitors became larger and resolutions increased, text lost its sharpness and very small fonts became blurry. The natural progression was an industry-wide switch to digital displays with digital interfaces.
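To make that "slur" concrete, here is a minimal Python sketch (not from the original article) that models the analog leg of a VGA connection as the ideal pixel value plus a small random perturbation. The noise level and brightness values are illustrative assumptions, not measured RAMDAC behavior.

    import random

    def ramdac_analog_path(pixel_values, noise=0.01):
        """Model the analog leg of a VGA link as the ideal value plus
        a small random perturbation (an illustrative assumption, not
        measured RAMDAC behavior)."""
        return [v + random.gauss(0, noise) for v in pixel_values]

    digital = [0.0, 0.5, 1.0]              # intended brightness levels
    analog = ramdac_analog_path(digital)   # what the CRT actually receives
    for sent, got in zip(digital, analog):
        print(f"sent {sent:.3f} -> monitor sees {got:.4f}")

Every value arrives close to what was sent, but never exactly equal to it, and those small errors are what soften the picture.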

The conversion of signals from digital to analog by graphics cards reduces picture quality.

DVI eliminates the unneeded analog conversion between digital components. Digital flat panel displays map each pixel to a numerical value that sets its brightness each time the frame is painted, which occurs many times per second. This is inherently more exact than CRT technology. With the analog conversion removed, the result is optimal: if the digit "1" is sent, the receiver gets "1" and nothing else. Sent via analog, the "1" might arrive as 0.952 or 1.002.
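The reason the digital path is exact is that the receiver does not treat the incoming voltage as the value itself: it re-quantizes each symbol back to a discrete bit, discarding small analog disturbances along the way. Below is a minimal Python sketch of that idea; the 0.5 threshold and the noise model are illustrative assumptions, not a model of the TMDS signaling that DVI actually uses.

    import random

    def noisy_wire(bits, noise=0.05):
        """Even a digital link is physically analog: transmission adds
        a little noise to each symbol."""
        return [b + random.gauss(0, noise) for b in bits]

    def digital_receiver(levels, threshold=0.5):
        """The receiver re-quantizes each symbol: anything above the
        threshold reads as 1, anything below as 0, so small analog
        disturbances never reach the picture."""
        return [1 if v > threshold else 0 for v in levels]

    sent = [1, 0, 1, 1, 0]
    recovered = digital_receiver(noisy_wire(sent))
    print(recovered == sent)   # True (with overwhelming probability)

Because the noise is tiny compared to the gap between the two symbol levels, the recovered bits match the transmitted ones exactly.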

Intel, IBM, NEC, Compaq, Fujitsu, Hewlett-Packard, and Silicon Image formed the DDWG consortium that developed DVI. The market transitioned from the old VGA standard to DVI with many flat panel displays and graphics cards featuring both VGA and DVI interfaces to accommodate intermixed components. There are several DVI cable formats for varying hardware standards. DVI has since been superseded by two newer digital standards: Unified Display Interface (UDI) and DisplayPort.

Discussion Comments

wander

When you go to buy a DVI cord, make sure you find a good salesperson to talk with. You need to figure out how long your cord should be for your use and what quality of cable you should get.

These DVI cords come in a staggering number of options and you can get overwhelmed without some help. Just be aware that these cords are not all created equal.

Also, if you are having some trouble with your DVI cord once you get it connected, take it back to the store right away and explain your problem. Sometimes they produce corrupted images, and you may need a replacement.

animegal

If you are a computer gamer, getting your graphics to look amazing is probably a top priority. I have found that switching to a DVI LCD monitor really improves game visuals and prevents ghosting and graphics lag. These monitors are a bit more expensive than VGA monitors, but it is only about a $50-$100 price jump, and you will notice a big difference.

If you are serious about keeping things looking good across the board, you should also make sure all of your connecting wires and video cards are DVI. Consistency will make for the best picture.
