
Types of DVI connectors and their specifications. HDMI, DVI, VGA, DisplayPort: all about connection interfaces, and what a DVI connector looks like

Hi all. Here is a new portion of the information you are interested in ;).

From this article you will learn what a DVI connector is, along with its types and features. You will also learn to distinguish this interface from others. That will help you replace cables when they fail, and you will also understand which devices you can connect to one another.

Getting to know the interface

First, let's figure out what DVI is. The abbreviation stands for Digital Visual Interface. Have you guessed its purpose? It carries a digital video signal to display equipment, and is mainly used to connect plasma and LCD TVs.

Technical features

  • The data format used in this interface is based on another one, PanelLink, which uses serial transfer of information.
  • High-speed TMDS (Transition-Minimized Differential Signaling) technology is used: three channels each carry video at up to 1.65 Gbps, for a total single-link data rate of about 3.96 Gbps.
  • The specification sets no maximum cable length, since it depends on the amount of data being sent. For example, a 10.5 m cable can carry an image at 1920 × 1200 pixels, and a 15 m cable at 1280 × 1024.

  • There are two types of cable:

- Single link (single mode) uses 4 twisted pairs: 3 of them carry the RGB signals (red, green, blue) and the 4th carries the clock signal. The link transmits 24 bits per pixel, so the maximum resolution is 1920×1200 (60 Hz) or 1920×1080 (75 Hz).

- Dual link (double mode) doubles these parameters, so through it you can watch video at 2560 × 1600 and 2048 × 1536 pixels.
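These ceilings follow directly from the link's pixel clock. Here is a rough Python sketch of deciding which DVI link type a given mode needs; the 165 MHz single-link clock matches the figures used throughout this article, while the 11% blanking overhead is an assumed value (roughly reduced-blanking timings - real modes vary):

```python
# Estimate which DVI link type a display mode requires.
# Assumptions: 165 MHz max pixel clock per link (per the DVI spec),
# and ~11% blanking overhead (reduced-blanking timings; real modes vary).

SINGLE_LINK_MHZ = 165.0
DUAL_LINK_MHZ = 2 * SINGLE_LINK_MHZ
BLANKING_OVERHEAD = 1.11  # assumed extra clock cycles spent in blanking

def required_pixel_clock_mhz(width, height, refresh_hz):
    """Approximate pixel clock for a mode, including blanking."""
    return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

def dvi_link_needed(width, height, refresh_hz):
    clock = required_pixel_clock_mhz(width, height, refresh_hz)
    if clock <= SINGLE_LINK_MHZ:
        return "single link"
    if clock <= DUAL_LINK_MHZ:
        return "dual link"
    return "beyond DVI"

for mode in [(1920, 1200, 60), (2560, 1600, 60)]:
    print(mode, "->", dvi_link_needed(*mode))
```

Run as-is, the sketch reports single link for 1920×1200 at 60 Hz and dual link for 2560×1600 at 60 Hz, matching the limits quoted above.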

History of appearance

The connector was released in 1999 by the Digital Display Working Group. Before that, only the VGA interface was used, with 18-bit color and analog signal conversion. As digital displays grew in diagonal size and picture-quality requirements rose, VGA naturally became too limited. So the world got DVI, which holds its ground to this day.

DVI vs VGA Differences

How does DVI differ from VGA?

DVI has 17-29 pins while its predecessor had 15.

VGA converts the signal twice; DVI does not convert it at all. How so? The image is produced by the video card, which is itself a digital device. Since the outdated interface is analog, the signal is first converted to analog form for transmission and then converted back on the display side. As you understand, with DVI this is unnecessary.

  • Thanks to the lack of conversion, the newer interface gives a better picture, though on a small monitor you are unlikely to see the difference.
  • DVI performs automatic image correction, leaving only brightness and saturation for you to adjust for viewing comfort, while VGA has to be configured fully by hand.
  • Data transferred through the outdated interface can be degraded by external interference, which cannot be said of the newer connector.

You may also have heard about another, newer digital interface - HDMI - since it is now used perhaps even more often than DVI. So that you do not confuse the two, let's go over the main differences:

  • DVI carries only video, while HDMI also carries 8-channel audio.

  • The former can work with both analog and digital signals; the latter works exclusively with digital.
  • The modern interface includes a built-in Ethernet channel with a speed of 100 Mbps, a bonus that DVI does not offer.

There is also a difference in image quality.

DVI can display an image at most at 1920 × 1200 (single link) or 2560 × 1600 (dual link), while the latest HDMI versions already reach 10K (10240 × 4320).

Types of DVI

You already know how not to confuse this interface with others. Now let's look at how its varieties differ from each other:

  • DVI-I. The extra letter stands for "integrated". This type of connector has both analog and digital channels (in the Single Link version) that function independently; which one is active at any moment depends on the connected equipment. The Dual Link version provides 2 digital channels and 1 analog channel.
  • DVI-D. The last letter stands for "digital", meaning this type of interface has no analog channel at all.

This type of connector is also available in two versions.

- Single Link has only one digital channel, which limits the resolution to 1920x1200 at 60Hz. It also cannot drive an analog monitor or support nVidia 3D Vision technology.

- Dual Link provides 2 digital channels, which raises the ceiling to 2560×1600 at 60Hz. This interface also lets you watch 3D on a monitor.

  • DVI-A. The extra letter stands for "analog". Have you guessed what that means? That's right: this is an analog-only interface in DVI form.

That's all.

Check out my blog more often for more useful information.

Hello, dear readers! Today I would like to talk about the ways of connecting a monitor to a video card - that is, about video card connectors. Modern video cards have not one but several ports at once, so it is possible to connect more than one monitor at the same time. Among these ports are both obsolete, now rarely used ones and modern ones.

VGA

The abbreviation VGA stands for Video Graphics Array (an array of pixels) or video graphics adapter. The connector appeared back in 1987; it has 15 pins, is usually blue, and is designed to output a strictly analog signal. As you know, the quality of an analog signal can be affected by many factors (cable length, for example), including the video card itself, so the picture quality through this port may vary slightly from one video card to another.

Before the ubiquity of LCD monitors, this connector was almost the only possible way to connect a monitor to a computer. It is still used today, but only in budget low-resolution monitors, as well as in projectors and some game consoles, such as Microsoft's latest-generation Xbox. Connecting a Full HD monitor through it is not recommended, as the picture will be blurry and fuzzy. The maximum VGA cable length at a resolution of 1600 x 1200 is 5 meters.

DVI (variations: DVI-I, DVI-A and DVI-D)

Used to transmit a digital signal; it replaced VGA. It connects high-resolution monitors and TVs, as well as modern digital projectors and plasma panels. The maximum cable length is 10 meters.

The higher the image resolution, the shorter distance it can be transmitted without loss of quality (without the use of special equipment).

There are three types of DVI ports: DVI-D (digital), DVI-A (analog) and DVI-I (combined):

Digital data is transferred in either Single-Link or Dual-Link format. Single-Link DVI uses a single TMDS transmitter, while Dual-Link doubles the bandwidth and allows screen resolutions above 1920 x 1200, such as 2560 x 1600. Therefore, for large high-resolution monitors, or monitors designed to output a stereo image, you definitely need at least Dual-Link DVI or HDMI version 1.3 (more on that below).

HDMI

Also a digital output. Its main difference from DVI is that HDMI, in addition to the video signal, can carry a multi-channel digital audio signal: sound and picture travel over one cable at the same time. It was originally developed for television and cinema, and later gained wide popularity among PC users. It is backward compatible with DVI through a special adapter. The maximum length of an ordinary HDMI cable is up to 5 meters.

HDMI is another attempt to standardize a universal connection for digital audio and video applications, so it immediately received strong support from the electronics giants (with contributions from companies such as Sony, Hitachi, Panasonic, Toshiba, Thomson and Philips). As a result, the majority of modern devices that output high-resolution images have at least one HDMI output.

Among other things, HDMI, like DVI, allows copy-protected sound and image to be transmitted in digital form over a single cable using HDCP. True, to make use of this technology you will need a video card and a monitor that - attention! - both support it. Again, there are currently several HDMI versions; here is a little about them:


DisplayPort

It appeared in addition to DVI and HDMI: since Single-Link DVI can carry a signal up to 1920 × 1200, and Dual-Link up to at most 2560 × 1600, a resolution of 3840 × 2400 is out of DVI's reach. DisplayPort's maximum resolution does not differ much from HDMI's - 3840 x 2160 - yet it still has some non-obvious advantages. One of them, for example, is that companies do not have to pay a license fee to use DisplayPort in their devices - a fee which, by the way, is mandatory when it comes to HDMI.

In the photo, the red arrows point to the latches that keep the plug from accidentally falling out of the socket. HDMI, even in version 2.0, provides no such latches.

As you have already understood, DisplayPort's main competitor is HDMI. DisplayPort has an alternative technology for protecting transmitted data from theft; it is just named a little differently - DPCP (DisplayPort Content Protection). Like HDMI, DisplayPort supports 3D images and the transmission of audio content. However, audio transmission over DisplayPort is one-way only, and transmission of Ethernet data over DisplayPort is impossible altogether.

In DisplayPort's favor is the fact that adapters exist from it to all popular outputs: DVI, HDMI and, importantly, VGA. HDMI, for example, has only one adapter - to DVI. That is, with only a DisplayPort connector on the video card, you can still connect an old monitor that has nothing but a VGA input.

And that, by the way, is exactly what is happening: more and more video cards are now produced without a VGA output at all. The maximum length of a conventional DisplayPort cable can be up to 15 meters, but DisplayPort can carry its maximum resolution over no more than 3 meters - often enough to connect a monitor to a video card.

S-Video (TV/OUT)

On older video cards you can sometimes find an S-Video connector, also called S-VHS. It is usually used to output an analog signal to outdated TVs, but in the quality of the transmitted image it is inferior to the more common VGA. With a high-quality cable, S-Video can carry an interference-free image over distances of up to 20 meters. It is now extremely rare on video cards.

We are accustomed to complaining about relatively insignificant differences in the performance of chipsets, motherboards and even processors. In doing so, we lose sight of one of the most important aspects of modern computers: video image quality.

Over the past few years, with the spread of 19" and 21" monitors, more and more users have grown dissatisfied with the image quality produced by their video cards. The image is not sharp, it is excessively blurry, and text set in small print can be impossible to read. Since all these symptoms appeared in standard Windows applications, people began to speak of poor "2D image" quality. We are not without sin either: in the past we ran a series of tests in which we subjectively assessed the 2D image quality of various video cards. However, the term "2D" is misleading, as the poor quality shows in all applications, not just 2D ones.

To understand the reasons for this phenomenon, it is important to remember that the monitor is still connected to the video card via an analog link. What do we mean by "analog"? Although digital circuits are built from analog components, a digital system understands only two discrete values. Digital equipment always works correctly: every time you transmit a one digitally, you get exactly a one, regardless of voltage fluctuations or any interference during transmission. In an analog system, transmitting a one may yield not a one but 0.935 or 1.062. Therefore, what you see on the screen is not necessarily exactly what the video card generated.
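The 0.935-versus-1.0 point can be made concrete with a toy Python sketch. The sample voltages and the 0.5 V decision threshold here are invented for illustration; the idea is only that a digital receiver asks which side of the threshold a sample falls on, so moderate noise changes nothing:

```python
# Toy model: why digital transmission shrugs off noise that analog cannot.
# The sample voltages and the 0.5 V threshold are invented for this demo.

def decode_digital(voltage, threshold=0.5):
    """A digital receiver only asks: is the sample above the threshold?"""
    return 1 if voltage >= threshold else 0

# An ideal '1' (1.0 V) arrives distorted by interference...
received = [0.935, 1.062, 0.871, 1.148]

# ...yet every sample still decodes to exactly 1:
bits = [decode_digital(v) for v in received]
print(bits)  # [1, 1, 1, 1]

# An analog receiver must treat each distorted voltage as the signal
# itself, so the error goes straight through to the screen:
worst_error = round(max(abs(v - 1.0) for v in received), 3)
print(worst_error)  # 0.148
```

The digital side reconstructs the data perfectly; the analog side simply displays the distortion.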

Imagine, for example, an analog connection between a keyboard and a computer. If the computer's analog-to-digital converter misinterpreted the signal coming from the keyboard, then instead of the letter "a" you just typed you might see the letter "b" on the screen. In the same way, the blurring you see at high resolutions is not generated by the graphics chip at all. The data displayed on the screen leaves the video card's frame buffer (memory) in digital form, but before leaving the card the signal passes through the RAMDAC. The RAMDAC (Random Access Memory Digital-to-Analog Converter) converts the digital data into an analog signal, and until recently this was a cause of poor image quality. The bandwidth of modern RAMDACs is much higher and their quality is better, so image-quality losses due to the RAMDAC are now less common.

After the RAMDAC conversion, the analog signal leaves the video card and travels through the VGA cable (another source of quality loss) to the monitor. And if you use a digital panel instead of a traditional analog CRT monitor, the abuse of the signal does not stop there: the already degraded analog signal is converted back to digital. Agree, this last phase makes very little sense - after all, we just said that the signal leaves the frame buffer in completely digital form. This is where DVI comes into play.

In this article we will get acquainted with the Digital Visual Interface (DVI) and see how it solves the problems of signal transmission between computer and monitor. In addition, we will discuss the various DVI implementations in modern video cards, and how to improve the quality of the analog output signal at minimal cost.


What is DVI?

A lot of people think of DVI as "that white connector I never used". But in fact DVI is a very important standard. Behind it stands a whole group of companies led by the Digital Display Working Group (DDWG). Besides the DDWG, Intel and Silicon Image play key roles here; why, we will explain later.

The DDWG came to the same conclusion we stated earlier: there is no point in converting a digital signal to analog only to convert it back to digital at the monitor. The DVI specification was developed with the expectation that in the future most monitors will become digital. And we rarely use DVI precisely because we still use traditional CRT monitors.

The specification is easy enough to understand. Data over a DVI connection is transferred using the TMDS serial encoding protocol developed by Silicon Image, and it is not surprising that, when it came to TMDS transmitters, integrated circuits from this company were used most often. The DVI specification calls for at least one TMDS "link", which consists of three data channels (RGB) and one clock channel.

Two TMDS links - from the DVI 1.0 specification

According to the DVI specification, a TMDS link can operate at up to 165 MHz. A single 10-bit TMDS link can transfer data at 1.65 Gbps - more than enough for a 1920x1080 digital panel with a 60 Hz refresh rate. The achievable resolution depends on the bandwidth that resolution requires, as well as on the efficiency of the device receiving the signal. The purpose of our article is somewhat different, but it should still be noted that digital panels built on different technologies have different maximum resolutions.

To keep the specification as flexible as possible, a second TMDS link can be used. It must operate at the same frequency as the first; that is, to achieve a throughput of 2 Gbps, each link must run at 100 MHz (100 MHz x 2 x 10 bits).
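The figures above are simple arithmetic, which the following Python sketch spells out; the 10-bits-per-clock factor is the TMDS encoding mentioned in the specification:

```python
# Sanity-check the TMDS throughput figures quoted above.
# Each TMDS channel carries 10 bits per clock cycle.

BITS_PER_CLOCK = 10

def link_throughput_gbps(clock_mhz, links=1):
    """Raw per-channel throughput of the given TMDS links, in Gbps."""
    return clock_mhz * 1e6 * BITS_PER_CLOCK * links / 1e9

# One link at the 165 MHz maximum: the 1.65 Gbps cited earlier.
print(link_throughput_gbps(165))           # 1.65

# Two links at 100 MHz each: the 2 Gbps example (100 MHz x 2 x 10 bits).
print(link_throughput_gbps(100, links=2))  # 2.0
```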

It is precisely this high throughput that put the specification ahead of all its competitors.


DVI-I vs DVI-D

Another advantage of the DVI specification, though an unfairly overlooked one, is support for both analog and digital connections on the same interface. Below is an illustration of the DVI connector.

On the left you see three rows of eight pins. These 24 pins are sufficient for three data channels and one clock channel. The cross-shaped region on the right contains the five pins required for transmitting the analog video signal.

And here the specification splits in two: the DVI-D connector contains only the 24 pins needed for digital operation, while DVI-I has five analog pins in addition to the 24 digital ones (the photograph shows a DVI-I connector). Note also that officially the DVI-A connector - a purely analog connector - does not exist, although the designation can be found in various literature. Currently, most graphics cards carry DVI-I connectors.

Behind the versatility of this connector is the idea of replacing the standard 15-pin VGA connector we are so used to. Such a solution is assumed to be much better: after all, both analog and digital monitors are supported.


How about scaling?

The main problem with digital panels (the main application of the DVI specification) is their fixed native resolution. Only at this resolution is a correct image guaranteed. Since the screen consists of a fixed number of pixels, it cannot operate at a resolution higher than its native one.

Far more often, however, the screen runs at a lower resolution. Take, for example, the Apple 22" Cinema Display. Its native resolution is 1600 x 1024. Playing games at this resolution is pure madness, not to mention that no games support such an odd resolution. So you will have to play at either 1024 x 768 or 1280 x 1024. The problem then is that the image must be scaled to display correctly on the screen.

For a long time nobody thought about image scaling - but only until digital panels began to gain popularity, and manufacturers had to think about it. The DVI specification shifts the work of scaling, filtering and positioning the image onto the shoulders of the monitor makers. Therefore, any monitor fully compatible with the DVI specification must be able to scale and filter the image itself. In fact, implementing a reasonably good scaling algorithm is not that hard, so don't expect much difference between monitors in this respect (although we're sure there will be some).
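As an illustration of the scaling work the specification hands to monitor makers, here is a minimal Python sketch of nearest-neighbour resampling, the crudest such algorithm. Real panels use better filters (bilinear or beyond); this only shows the idea of mapping each output pixel back to a source pixel:

```python
# Nearest-neighbour scaling: the simplest way a fixed-resolution panel
# can stretch a lower-resolution image to fill its native pixel grid.

def scale_nearest(pixels, new_w, new_h):
    """Resample a 2-D grid of pixel values to new_w x new_h."""
    old_h, old_w = len(pixels), len(pixels[0])
    return [
        [pixels[y * old_h // new_h][x * old_w // new_w]
         for x in range(new_w)]
        for y in range(new_h)
    ]

# Upscale a 2x2 "image" to 4x4: each source pixel becomes a 2x2 block.
img = [[1, 2],
       [3, 4]]
for row in scale_nearest(img, 4, 4):
    print(row)
```

The blocky result is exactly why a filtering step on top of raw scaling matters.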


DVI support in modern graphics cards

With the introduction of the GeForce2 GTS, NVIDIA began integrating TMDS transmitters into the GPU, and they are built into the modern Titanium line in exactly the same way. The drawback of the built-in TMDS transmitters is that they run at too low a clock frequency to support high resolutions. It appears that the integrated TMDS transmitters have never used, and still do not use, the full bandwidth of the 165 MHz link. As a result, the entire DVI implementation in nVidia cards is relatively useless for high-resolution screens.


If your nVidia card has a DVI connector,
you will most likely find something similar on the board

To overcome these shortcomings, nVidia boards began to be fitted with a second, external TMDS transmitter made by Silicon Image. Depending on the board design, this transmitter may provide a second link in parallel with the on-chip TMDS link, or it may bypass the on-chip transmitter entirely. It is not clear why the built-in transmitter cannot do the job; if that problem were solved, manufacturers would not have to add an external TMDS transmitter to the graphics card, which would save some money. It is thanks to the external TMDS transmitter that the DVI-I connector can drive resolutions up to 1920 x 1440.

You may come across nVidia cards whose DVI connector will not work with a connected DVI monitor. We ran an informal test of several DVI cards in our lab, with these results: all the new Titanium cards worked fine, but the Gainward GeForce3 and the nVidia reference GeForce2 MX did not. If you have one of the latest Titanium cards, it will most likely work fine at almost any high resolution, although the documentation states a maximum of 1280x1024. We tested all of the new DVI Titanium cards on our Apple Cinema Display at 1600x1024.

ATI is a completely different story. All DVI digital outputs on ATI cards are driven by ATI TMDS transmitters built into the GPU. ATI solved the DVI-I connector problem in its own way: some of its video cards ship with DVI outputs and DVI-to-VGA adapters. Such an adapter connects the five analog DVI-I pins to a VGA connector.


The ATI All-in-Wonder Radeon was the first ATI card to
ship with a DVI-to-VGA adapter (shown in the figure)

Matrox appears to be the only PC graphics manufacturer offering a dual-DVI solution on the market. The Matrox G550 comes with a dual DVI cable; however, Matrox states that the maximum DVI monitor resolution is only 1280x1024. Since we could neither confirm nor refute this figure, we advise those planning to work at high resolutions to weigh this choice carefully.


Conclusion: what to do while DVI is not yet everywhere, and how to improve image quality on nVidia cards?

Instead of musing on "how fine everything will be once everyone switches to DVI", let's end the article with a more practical conclusion. Being the best graphics chip maker on earth is not easy. For nVidia, the main problem is the inability to control and track the production of every card bearing the company's name. By allowing third-party companies (such as ASUS, Chaintech, Gainward, Visiontek, etc.) to build cards on nVidia chips, the company leaves quality control to the manufacturers themselves. Because nVidia offers manufacturers a reference design, they rarely run into major problems; however, one of those few problems is precisely this image-quality situation.

To comply with FCC (interference-protection) regulations, a low-pass filter is installed just before the analog video output of every video card. It passes signals below a certain frequency and blocks the remaining high-frequency signals, which do not affect quality.

Problems with nVidia cards begin when third-party low-pass filters block not only the unnecessary frequencies but also some important ones. It is unlikely that the capacitors and inductors in these filters were deliberately chosen to be of the worst quality; likewise, it is unlikely that the component ratings fall outside nVidia's specifications. More probably, when manufacturers bought components for these filters, some batches varied in quality - which would explain the sporadic nature of the image problems. Whatever the reason, you can improve image quality by removing the low-pass filter. Next, we will look at how to perform this operation at minimal cost.

Let us make a reservation: removing the low-pass filter voids your video card's warranty, and we take no responsibility for possible damage. The operation itself is extremely simple. On all nVidia graphics cards since the original GeForce, the low-pass filter can be seen as 3 sets of 3 capacitors connected in parallel with 2 sets of 3 inductors near the VGA connector. Each component of the RGB signal sent to the monitor uses its own set of parts. In addition, most boards have a set of protective diodes, though not all.

On this GeForce2 Pro, the three sets of three capacitors are circled in rectangles. These are the ones to clip off. From left to right in the picture: a column of capacitors, a set of coils, a second set of capacitors, a set of protection diodes, another set of coils, and the last set of capacitors.

On a GeForce3 board with a DVI-I connector, the low-pass filter sits next to the DVI-I connector. If the card has no DVI-I connector, the filter components can be found near the VGA output, or where the DVI connector would have been.

On this Visiontek GeForce3 Ti 500, a number of capacitors have already been removed (in the red box), so it is no surprise that the card produces a high-quality image. The capacitors sit next to the DVI connector. After you clip off the capacitors, all that should remain is what you see above in the red box.

The whole operation of clipping off the 9 capacitors is performed with ordinary wire cutters. With the right approach, you will not damage the board. In the end, everything depends on how bad the signal from your card was before the operation. In some cases we achieved almost no improvement; in others, an already excellent card showed even better results.

To get rid of the low-pass filter completely, you would also have to short-circuit the inductors so that they too have no effect. After the capacitors are removed, however, shorting the coils has a much smaller effect, and the operation itself is far more difficult.

Again, removing this filter creates the possibility of emitting high frequencies that could interfere with other devices - but the likelihood of that is extremely small.

Why is such an upgrade not needed for ATI or Matrox cards? Until recently, both ATI and Matrox built all boards with their own chips, so every component was controlled very carefully. It remains to be seen whether ATI's decision to let third parties manufacture its boards will affect image quality, and whether its users will face the same problems as nVidia users.

It is obvious that soon, with the development and popularization of the DVI standard, end users will no longer have to puzzle over why the image quality is so bad and what is to blame ...

DVI (Digital Visual Interface) is a connector developed by the Digital Display Working Group as the first digital connector for liquid-crystal (LCD) panels. The analog D-Sub connector was designed for CRT monitors, where a change in signal level also changes the brightness - undesirable behavior for an LCD monitor. In addition, D-Sub had begun to approach the bandwidth threshold required for large resolutions, and the extra signal converter at the monitor's input certainly did not improve the picture. The brightness problem with D-Sub was later solved, and the interface is still used in budget monitors, either as the only input or alongside digital inputs for compatibility.

The serial PanelLink format is used for data transmission; it relies on TMDS (Transition-Minimized Differential Signaling). Three data streams are supported, with transfer speeds up to 3.96 Gbps.

To achieve the top transmission speed, the cable length must not exceed 1.5 meters. As the length increases, the signal begins to fade, so for long distances you need special active amplifiers. The ability to carry the signal also depends strongly on the quality of the wires, their resistance, and so on.

Designations and types of DVI connectors:


  • DVI-D - digital transmission only
  • DVI-A - analog transmission only
  • DVI-I - both analog and digital transmission

The connector carries 24-bit color at all resolutions, but with dual-link DVI on certain equipment it is theoretically possible to achieve 48-bit color.

The maximum resolution in single-link mode is 1920 x 1200 at 60 Hz.

In dual-link mode, the maximum resolution is 3,840 × 2,400 at 33 Hz, or 2,560 × 1,600 at the standard 60 Hz.
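These figures can be cross-checked with a line of arithmetic: two 165 MHz TMDS links move 330 megapixels per second, which caps the refresh rate at a given resolution. Ignoring blanking overhead (an idealisation), a Python sketch:

```python
# Idealised refresh-rate ceiling for dual-link DVI: two 165 MHz links,
# one pixel per clock, blanking overhead ignored for simplicity.

DUAL_LINK_PIXELS_PER_S = 2 * 165e6

def max_refresh_hz(width, height):
    """Upper bound on refresh rate at a resolution over dual-link DVI."""
    return DUAL_LINK_PIXELS_PER_S / (width * height)

print(round(max_refresh_hz(3840, 2400)))  # 36 (about 33 Hz once blanking is counted)
print(round(max_refresh_hz(2560, 1600)))  # 81 (so the standard 60 Hz fits easily)
```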

If you have problems displaying an image over DVI, the main causes may be:

  • Pinching or twisting of the cable.
  • Poor contact or dirt on the pins of the plug and socket.
  • Electromagnetic interference from nearby high-voltage cables, or poor shielding of the DVI cable.
  • Too high a resolution and hence insufficient bandwidth.

By 2015, it is planned that DVI will be completely replaced by a new standard -