I really like this opening quote:
"For a list of all the ways technology has failed to improve the quality of life, please press three." — Alice Kahn
Number three is certainly something we would have been pressing a lot had we tried to incorporate HDMI into our projects back when it was first introduced. In fact, what prompted me to write this particular paper was hearing people make statements like "We will not use HDMI in our installations or in our designs; we will only use DVI-D." I thought this was unusual because the two really are the same thing at their core. After conducting a little research, I created this overview to give you a better idea of what is going on with digital video.
In this paper, I will explain what is really going on with digital video: TMDS, DVI, HDMI, EDID, HDCP and CEC.
This little quote from Silicon Image really says it:
"DVI is the accepted standard for transferring serially uncompressed data at high speeds between a PC host and digital display such as an LCD monitor. DVI enables a video signal to be transferred from a PC source to a digital display in its native digital form, simplifying the way PCs communicate with displays and improving display image quality." — Digital Visual Interface and TMDS Extensions White Paper by Silicon Image, Oct. 2004
Joseph D. Cornwall, CTS, ISF-C, DSCE, ROI Technology Evangelist at C2G
Joe Cornwall has worked in the commercial AV industry for more than two decades. He's held management and technical positions with Sony, General Instrument and Motorola Broadband Communications Sector. Cornwall was a founding member of C-Big, the C-Band Industry Group and served as a member of the Board of Directors for the Satellite Broadcasting and Communications Association (SBCA) from 1999 through 2003. Today Joe Cornwall holds the position of Business Development Manager for C2G, where he's responsible for promotion, support and growth of a full complement of connectivity solutions, including the award-winning RapidRun® line of products. Cornwall holds CTS, DSCE, ISF-C and ROI industry certifications and is a graduate of Massasoit State College and the University of Cincinnati.
Serially uncompressed data at high speeds: this is the critical concept of this paper. Video is no longer video; video is now high speed digital data, and it is nearly indistinguishable from anything that happens between two computer devices, except that here we are moving computer-generated video content to a fixed pixel display. That was the real key: the advent of fixed pixel displays such as plasma, LCD and DLP projectors, monitors and laptops that are themselves digital devices. We want to eliminate the complex analog-to-digital and digital-to-analog conversions for as long as possible and leverage the native capabilities of that digital environment.
We know that years ago we were capable of supporting up to 1080p in the analog environment, and there are a number of devices that have done that. But the only solution that allows us to go beyond 1920x1080, including D4K (3440x2400) and beyond, is TMDS. So if we are going to start looking at very high resolution images, such as what you would see in your local movie theater, we need to move into this environment.
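To get a feel for the data rates involved, we can estimate the per-channel TMDS bit rate from the resolution and refresh rate. The 25% blanking allowance below is an illustrative assumption of mine, not a figure from any specification:

```python
def tmds_bit_rate_gbps(h_pixels, v_pixels, refresh_hz, blanking=1.25):
    """Rough per-channel TMDS bit rate in Gbps.

    Each pixel clock tick carries a 10-bit TMDS character per color
    channel (the 8-bit to 10-bit expansion described later in this
    paper). The blanking factor of 1.25 is an assumed allowance for
    horizontal and vertical blanking intervals.
    """
    pixel_clock_hz = h_pixels * v_pixels * refresh_hz * blanking
    return pixel_clock_hz * 10 / 1e9

# 1080p at 60Hz works out to roughly 1.5 Gbps on each of the
# three color pairs; higher resolutions scale up from there.
rate_1080p = tmds_bit_rate_gbps(1920, 1080, 60)
```

Even this rough estimate makes clear why very high resolution images demand a serial digital link rather than an analog one.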
Another byproduct of moving into TMDS is better color. In NTSC and the analog world we had very low color resolution; a lot of people seem to forget that with composite video and S-Video, and even with analog component, color resolution rarely exceeded about 180 lines of actual color. In fact, NTSC was often referred to as "Never The Same Color" because accurate color was so difficult to achieve. In the TMDS environment, our base palette is 16.7 million colors (24-bit "true color"), and we can go even beyond that if the project involves simulators or virtual reality. We can now leverage Deep Color, along with extended gamuts such as xvYCC, and extend this to more than a billion colors, which truly does encompass every color the human eye can perceive.
TMDS stands for Transition Minimized Differential Signaling. It sounds like a mouthful when you first hear it. TMDS was developed by Silicon Image Inc., a member of the Digital Display Working Group, as a method for transmitting high speed digital data. It incorporates a unique and clever algorithm that reduces electromagnetic interference (EMI) and enables clock recovery at prodigious distances, up to 100ft at 1920x1200. It also tolerates high skew on cables that are complex enough that, on paper, they should not be able to carry a video image from one end to the other. It does all of this with a very high level of confidence.
TMDS is a lot like RGBHV, and much like the analog world we live in today, in that it uses four channels: Red, Green, Blue and Clock. So, if someone said to you, "I have four coaxial cables instead of five, so I can't use RGBHV to connect my video source to my projector or my display. What can I use?" you would probably respond that they could use RGBS, where we composite the horizontal and vertical sync and multiplex them on a single cable. That is exactly what is happening in TMDS. So now you can begin to see that we are not in a foreign land. This looks very familiar: Red, Green, Blue and Clock. The encoding is a two stage process: an algorithm converts an 8-bit input video word into a 10-bit video word, and by doing that it does something very counterintuitive; it makes the signal smaller.
That is very unusual. Why would it do that? How does it do that? How did these computer guys get so clever? TMDS signaling uses twisted pair, hence the term "differential" in TMDS. Twisted pair gives us common mode noise reduction, or interference rejection, which means more headroom. TMDS also operates at current mode logic (CML) levels, which tells us we are talking about something operating under 5 volts. In fact, HDMI uses a 5 volt handshake between the transmitter and the receiver, while the actual data transfer takes place at 3.3 volt current mode logic levels.
There are three twisted pairs for Red, Green and Blue, plus a fourth twisted pair for the clock and, once again, that unique 8-bit to 10-bit conversion capability. So, what does that mean? That is the transition minimized part of Transition Minimized Differential Signaling. Here is what happens. Pretend the image below is the image you see on your computer screen right now. You see a black screen image with white letters on it. We know that each section of that screen is described by a digital word. Black, also known in digital as zero IRE, is described by an 8-bit digital word that is 00000000. As we get to the letter "W" in the word "What" in the headline, it changes to white.
The signal goes to 100 IRE, or full saturation, full white, where all three colors are at maximum output. This is indicated by a digital word of 11111111. Every transition from a digital zero to a digital one, from one bit to the next, is described by an electrical square wave. A square wave, as you may remember from your engineering classes, is a fundamental sine wave plus all of its odd harmonics, all of the very high frequencies. This means that when we have to make eight transitions, we have a tremendous amount of high frequency energy going through this cable. It becomes very difficult to encompass all of this bandwidth on any kind of practical transmission.
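The claim that a square wave is a fundamental plus its odd harmonics is easy to verify numerically. A short sketch summing the Fourier series of an ideal square wave:

```python
import math

def square_wave_partial(x, n_harmonics):
    """Partial Fourier sum of a unit square wave: the fundamental
    sin(x) plus odd harmonics sin(3x)/3, sin(5x)/5, ... scaled by 4/pi."""
    return (4 / math.pi) * sum(
        math.sin((2 * k + 1) * x) / (2 * k + 1) for k in range(n_harmonics)
    )

# With enough odd harmonics the sum flattens out at +1 over (0, pi)
# and -1 over (-pi, 0): sharp digital edges are made of lots of
# high frequency energy, which is exactly the cable's problem.
crest = square_wave_partial(math.pi / 2, 500)
```

The more harmonics you keep, the squarer the edge; conversely, a cable that cannot pass the high harmonics rounds the edges off.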
So what our computer brethren did was take a look at this and come up with a better solution. Most of what happens in video happens in shades of gray, and even the colors are described by shades of gray passing through a red, green or blue filter and then combining to make the color. Most of these variations happen in the four middle bits. So the computer guys said, "What if, where the four middle bits are all ones, we inverted them to zeroes and added a one to the end of the word? That way we eliminate a lot of these transitions, allowing us to carry less high frequency material." They then went one step further and said, "What if the least significant bit and the most significant bit are ones? How about we invert those to zeroes as well and add another bit to the end?"

By going from eight bits to ten bits, what they have actually done is minimize the transitions from zero to one, so that the maximum number of transitions is five rather than eight. This eliminates a lot of high frequency material that needs to be processed and transmitted. It is rather counterintuitive to go from eight bits to ten bits, from a smaller word to a larger word, and end up with less to carry, but that is the beauty of a clever encoding algorithm. The algorithm uses a special ten bit sequence to minimize the zero-to-one transitions. Hence: Transition Minimized (fewer zero-to-one transitions) Differential (twisted pair) Signaling. So now we understand TMDS and all of the benefits it provides, including transmission up to the D4K level.
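The description above captures the spirit of the encoding. The actual first stage defined in the DVI specification minimizes transitions by chaining each output bit to the previous one with XOR or XNOR, then appending a ninth bit that records which operation was used (the tenth bit, added in a second stage, handles DC balance and is omitted here). A minimal sketch of that first stage:

```python
def tmds_minimize(byte):
    """Stage 1 of TMDS encoding: 8-bit pixel byte -> 9-bit
    transition-minimized word (list of bits, LSB first)."""
    bits = [(byte >> i) & 1 for i in range(8)]
    ones = sum(bits)
    # The spec picks XNOR chaining when the input is ones-heavy
    use_xnor = ones > 4 or (ones == 4 and bits[0] == 0)
    q = [bits[0]]
    for i in range(1, 8):
        nxt = q[-1] ^ bits[i]
        q.append(1 - nxt if use_xnor else nxt)
    q.append(0 if use_xnor else 1)  # 9th bit tells the decoder which op was used
    return q

def transitions(bits):
    """Count zero-to-one and one-to-zero flips within a word."""
    return sum(1 for a, b in zip(bits, bits[1:]) if a != b)

# The worst-case input 10101010 flips on every bit boundary;
# after encoding, the 9-bit word has far fewer transitions.
raw = [(0xAA >> i) & 1 for i in range(8)]
encoded = tmds_minimize(0xAA)
```

Running every possible byte through this stage confirms the paper's point: no encoded word ever flips more than a handful of times, no matter how pathological the input.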
Let us now take a look at how this hardware communication is played out. The block diagram you see below could represent, for instance, the output of a Blu-ray DVD player and the input of an LCD panel in your living room. It could also represent the output of a codec, a medical imaging device or other such device with an HDMI output, and the HDMI input on something like a projector or LCD panel in a digital signage installation.
What you see here should make you feel pretty good because it is very familiar. In the middle, you see the following TMDS channels: 0 (Blue), 1 (Green) and 2 (Red). This is the same Red, Green and Blue you have been used to seeing all along. The fourth pair is the TMDS Clock Channel; there is your sync. So you see that we are really not in foreign territory here. These things are very familiar to those of us who have been working with analog audio and video for a while. The fifth connection, the Display Data Channel (DDC), the channel that carries EDID information, transports a tremendous amount of data, and this is where things start to get really interesting.
You know that if you connect your computer up to a monitor using a VGA cable you have DDC channel information there. That is what tells your computer to switch from 1280x800 to 1024x768 to present on a particular projector. That process is highly automated.
In the digital world it is a little more sophisticated. We start from the Display Data Channel (DDC), a digital communications protocol between a display and a source that allows these devices to understand at what resolutions they can operate.
This moves us into something a little more sophisticated called the Extended Display Identification Data, or EDID. EDID, now on version 1.3, is a data structure of up to 256 bytes that provides a tremendous amount of information: monitor name, identification number, model number, serial number, display size, aspect ratio and so on.
The screenshot you see to the right is really kind of interesting. It might be a little small, but it shows you that the EDID information on the DDC channel is not simply the resolution the device can operate at. It actually includes the serial number, the build date, the firmware date, the manufacturer identification, the maximum resolution, the color depth, and a table of all the resolutions the device can resolve. There is a lot of information being transmitted, which is what makes these monitors compatible with a number of digital devices. Herein lies the very first problem we experienced.
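To make the structure concrete, here is a minimal reader for a few identity fields of an EDID base block, following the public VESA layout (a fixed 8-byte header, a packed three-letter manufacturer ID, product and serial numbers, manufacture date, and a checksum byte that makes the block sum to zero). This is a sketch, not a complete parser:

```python
def parse_edid_base(edid: bytes) -> dict:
    """Read a few identity fields from a 128-byte EDID base block."""
    header = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])
    if edid[0:8] != header:
        raise ValueError("not an EDID block")
    if sum(edid[:128]) % 256 != 0:
        raise ValueError("EDID checksum failed")
    # Manufacturer ID: three letters packed as 5-bit values ('A' = 1)
    word = (edid[8] << 8) | edid[9]
    vendor = ''.join(chr(((word >> s) & 0x1F) + ord('A') - 1) for s in (10, 5, 0))
    return {
        "vendor": vendor,
        "product_code": edid[10] | (edid[11] << 8),   # little-endian
        "serial": int.from_bytes(edid[12:16], "little"),
        "week": edid[16],
        "year": 1990 + edid[17],                      # year is offset from 1990
        "edid_version": f"{edid[18]}.{edid[19]}",
    }
```

A badly written block fails the header or checksum test immediately, which is exactly the kind of coding error the early "plug fests" kept uncovering.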
Back when HDMI was first released in the market, we already had DVI. However, what we really had was a failure to properly write EDID information on all of these devices. In the early days, manufacturers actually held events called "plug fests," where display manufacturers brought their displays and source manufacturers brought their sources. They would start plugging these devices together and take notes until they found where the incompatibilities existed. What we discovered was that a lot of the incompatibilities came down to improperly coded EDID information that left devices unable to talk to one another. I'm very happy to say that most of these issues now reside only in legacy equipment, so the only time you are really going to experience EDID issues is if you are trying to incorporate devices that are four to six years old into a contemporary installation. Most EDID issues have been resolved and, in fact, almost every digital device, whether it's an LCD panel, plasma panel, DLP projector or Blu-ray player, has firmware that updates the EDID information to keep these devices compatible with all contemporary technologies.
Now, I would like to point out one other thing about the block diagram on the previous page. You will notice there is Red, Green, Blue and Clock as well as DDC data, but what we do not see is a pair for audio. In the digital world, audio is embedded into the Red, Green and Blue digital video signal. So you cannot have a cable with HDMI on one end and DVI plus 3.5mm audio on the other; the audio is electronically combined and electronically separated at the two ends, as part of the video signal. Audio truly is encompassed within the TMDS environment. A lot of people do not realize that even DVI-D is capable of carrying PCM digital audio within this video information. It was just never implemented at the time, and it took HDMI to get us there.
Now that we understand transition minimization, we understand that Transition Minimized Differential Signaling is a way of transporting digital data between a source and a display. We also understand that there is a line of information called DDC that allows us to transmit EDID, the Extended Display Identification Data, so that devices can understand what languages they will speak and at which resolutions they will work.
In the late 1990s and the early part of the 21st century, digital rights management became a factor in all of this. Hollywood, of course, wanted to make sure we could not make a copy of Titanic for our personal use, so they came up with the concept of digital rights management. There is a great quote attributed to G. Gordon Liddy that is very appropriate here: "Obviously crime pays or there would be no crime." The fact is that HDCP encryption was broken very soon after it was adopted. In short, digital rights management has been more successful at causing headaches than it has at protecting digital content.
Let's take a look at what they did when they brought digital rights management into the mix. They implemented HDCP, or High-bandwidth Digital Content Protection, which is designed to prevent the copying of digital video and audio content. Since this is high speed computer data, they could not use something like Macrovision, which is an analog fluctuation of the signal; they had to come up with something different. They turned to Blom's Scheme, a symmetric threshold key exchange protocol from late 20th century cryptography, devised by Rolf Blom. Here is what it does. Part of the reason the EDID structure tells you serial numbers and all of that other information is that every display and every source has an associated set of 56-bit key codes. These keys are built into the devices and are exchanged in order to create an opportunity for the devices to communicate. The source would send its key material to the monitor, the monitor would send its key material to the source, and the two would independently derive a third, momentary session key that allows them to communicate for as long as that exchange is valid. One source is connected to one display, HDCP authentication (key exchange and validation) takes place, and an image is displayed.
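Blom's Scheme can be demonstrated in a few lines. Each device gets a public identity vector and a private row computed from a secret symmetric matrix held by the licensing authority; any two devices can then derive the same session key without ever transmitting their private material. The numbers below are toy-sized for illustration and bear no relation to the real HDCP parameters:

```python
import random

P = 257   # toy prime modulus (illustrative only)
K = 4     # toy key dimension (illustrative only)

def make_secret_matrix(rng):
    """The licensing authority's secret: a symmetric K x K matrix mod P."""
    S = [[0] * K for _ in range(K)]
    for i in range(K):
        for j in range(i, K):
            S[i][j] = S[j][i] = rng.randrange(P)
    return S

def private_row(S, ident):
    """Issued once per device: S @ ident mod P. Never transmitted."""
    return [sum(S[i][j] * ident[j] for j in range(K)) % P for i in range(K)]

def session_key(my_row, their_ident):
    """Dot product of my private row with the other device's public id."""
    return sum(r * t for r, t in zip(my_row, their_ident)) % P

rng = random.Random(42)
S = make_secret_matrix(rng)
source_id = [rng.randrange(P) for _ in range(K)]    # public
display_id = [rng.randrange(P) for _ in range(K)]   # public
source_row = private_row(S, source_id)
display_row = private_row(S, display_id)
```

Because the matrix is symmetric, the source's key for the display equals the display's key for the source, which is the "magic" third key the paper describes.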
This created a tremendous number of problems ten years ago. At that time we really did not understand how the code needed to be written so that this protocol would allow a single source to authenticate more than one display. So taking the output of a codec, a DVD player or a digital signage device on HDMI and feeding multiple televisions was problematic. That is the "one-to-many" problem. Another thing we did not have was something called authenticate forever.
A lot of money has been spent not only on eliminating problems with authenticate forever, but also on eliminating the latency that is a byproduct of Blom's Scheme. You see this when you are on video input number one and change to video input number two, yet it takes a second or two to get an image, much slower than in the analog world. We also see it when we change channels on digital televisions; it oftentimes takes what seems like a very long time to lock in on a channel and resolve an image. What is really happening is authentication of this Blom's Scheme exchange, enabling the devices to talk to one another. In short, it is EDID and Blom's Scheme working together that result in these delays. A lot of money has been spent by companies like Crestron and Extron on products that perform key authentication management or EDID emulation to prevent this constant reauthorization.
It was about eight years ago that they created the code called authenticate forever. Think of it as the electronic equivalent of having kids in the back seat of the car on your family vacation, asking that constant and repetitive question: "Are we there yet?" This is what happens with your home audio/video system when your DVD player is constantly asking your display, "Are you there? Can we exchange this key? Can we talk?" over and over again. If you go from video input one to video input two and back again, it refreshes that Blom's Scheme key exchange between the two devices. This is just a small piece of the picture regarding the latencies and difficulties involved.
Now we understand that we had to create a way to send digital data, and we had to encode that data using transition minimization. We also wanted to place it in a different environment, going over twisted pair, and we had to provide EDID so these devices could talk to each other. Then high definition content had to arrive in order for it to become a commercial reality. That is when an actual technology that would allow us to transmit digital video was created. This only emerged into the market around 1999, with DVI-D arriving in volume around 2001. In fact, as late as 2003 there were only 25 companies manufacturing HDMI enabled devices. Today, it is pervasive. Nearly every device, whether commercial or consumer, is HDMI compliant; and because it is HDMI compliant, it is both HDCP compliant and CEC compliant, which we will cover later.
So what was the first product created once TMDS was perfected? That would be DVI-D. DVI-D is a really neat way of connecting video. First, it has a robust locking connector. It looks a lot like VGA, with those torque screws that provide a secure connection that will not come loose. However, that is a bit of a red herring; cables do not normally fall out all by themselves. We'll discuss this in more detail when we get to HDMI.
There were several different flavors of DVI-D. DVI-D single link is designed to support WUXGA, up to 1920x1200 at up to a 75Hz refresh rate. We also had something called DVI-D dual link, which over the years has become a less popular option; it is doubtful that we will see much dual link technology being used anymore. Dual link was, in essence, two DVI-D links in the same jacket that allowed us to go up to 2560x1600, or WQXGA level resolutions. This was used five to seven years ago primarily by folks involved in creative applications, such as video production and computer graphics production for advertising. At the time, they were using relatively large 30"-35" monitors that required this very high level of resolution. Today, we can get resolutions well beyond this with HDMI or DVI-D single link, which is why I believe DVI-D dual link is going to continue to decline in popularity.
When DVI was first introduced, it caused a little confusion. The very first iteration of DVI was DVI-A. Think about that for a minute: DVI-A, Digital Visual Interface - Analog. Isn't that a contradiction in terms? There was the problem. DVI-A was really intended to be a replacement for VGA. In fact, there had already been an attempt to replace VGA with something called DFP, the Digital Flat Panel connector, which came out in the late 1990s. It never caught on; it was on the market for about 37 seconds and then dropped off the face of the earth.
The problem with DVI-A was this: you had a source with a DVI-D output and a display with a DVI-A input, and a cable connected the two perfectly, yet there was no image. Why not? Because the display wanted analog and the source had digital. With no way to convert between them, the two were inherently incompatible. Then the industry moved to something called DVI-I, or DVI-Integrated. DVI-I has both digital and analog pins, so it can carry either signal. However, it cannot convert digital to analog or analog to digital. Fortunately, DVI-A has, for the most part, disappeared from our industry, and we do not see it anymore. DVI-I exists in very few products. We primarily see DVI-I today in some digital signage devices, where there is a relatively small footprint and we need both analog and digital connections, and on some codecs so we can choose between an analog and a digital connection.
With the analog sunset arriving in 2014, and with software already limiting the resolution we can get from commercial analog sources, even on computers, the days of analog video as a universal solution really are numbered. We will soon be moving toward an all-digital video world. Since DVI-I seems to be fading out, we will probably only see it in the more limited, special applications where you select analog or digital and connect appropriately.
DVI-D is digital, and it is a video standard; a single link can now support up to 2560x1600 at 75Hz. We primarily see DVI-D on medical imaging devices and on some computer devices where we need very high resolution or multiple screen capability. By and large, though, DVI-D is giving way to the technology that will eventually take its place: HDMI, the High Definition Multimedia Interface.
Let us take a look at the evolution of DVI-D to HDMI. With HDMI, we took DVI-D exactly as it was and added Consumer Electronics Control, or CEC, and audio capability. CEC is a one wire bi-directional serial bus that uses the AV.link protocol to perform remote control functions. In other words, I can hit the play button on my DVD player, and it should turn on my display and my other devices while going directly to the correct input. We will discuss this more in depth later, because it's a very interesting feature that you are going to see leveraged more and more as new products come to market.
In addition to CEC, audio capability was also added. HDMI was designed specifically for the consumer electronics industry. The concept behind HDMI was to eliminate video and audio connections, such as component (Red, Green, Blue), audio (left and right), subwoofer (left and right), rear and center channel - and eight or ten other connections needed to hook up one DVD player, one satellite receiver or one cable box to an audio/video receiver. The challenge was finding a way to combine it all into one plug, and that is exactly what HDMI did.
When HDMI was created, several of the features that were developed were really quite amazing. First and foremost, consider the compact disc, which operates on the Red Book standard: 16-bit words sampled at 44,100 samples per second. When creating HDMI, there was a desire to make the sound better than what is on a compact disc. Therefore, one of the first things the committee tasked with designing these capabilities addressed was the decision to move to a 24-bit word length.
In the digital audio environment the word length, 16-bit or 24-bit, describes amplitude. A compact disc can give roughly a 96dB signal to noise ratio. This is very good, but not far beyond what a very high quality analog reel-to-reel recording was capable of. A 24-bit word length gives up to roughly a 144dB signal to noise ratio, or dynamic range, which is significantly greater than anything that can be done in the analog world. It basically provides a wider palette to paint those sound pictures on, resulting in much more clarity and much greater separation between the softest and the loudest sounds. This makes action movies very exciting to watch, with very refined sound.
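The relationship between word length and dynamic range follows the standard rule of thumb for an ideal quantizer, roughly 6dB per bit:

```python
def dynamic_range_db(bits):
    """Theoretical SNR of an ideal quantizer driven by a full-scale
    sine wave: 6.02 * N + 1.76 dB, i.e. roughly 6 dB per bit."""
    return 6.02 * bits + 1.76

cd = dynamic_range_db(16)    # about 98 dB for compact disc
hdmi = dynamic_range_db(24)  # about 146 dB for 24-bit audio
```

Eight extra bits buy roughly 48dB of additional dynamic range, which is the wider palette described above.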
The other thing that happened with the HDMI standard is that sampling increased from 44.1kHz, about 44,000 samples per second, to as much as 192,000 samples per second. Once again, thinking back to our early engineering classes, sampling is described by the Nyquist Theorem (or Nyquist rate). It says that to capture a 20kHz sound, a very high frequency on the edge of what the human ear can hear, we need to sample at more than twice that frequency, which is why the compact disc uses 44.1kHz. When that standard was created for compact discs back in the early 1980s, it brought with it a host of problems.
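Sampling below twice the signal frequency does not merely lose a tone, it folds it back into the audible band as an alias, which is exactly why everything above half the sampling rate must be filtered out before conversion. A small helper (names of my own choosing) shows where an undersampled tone lands:

```python
def alias_frequency(f_hz, fs_hz):
    """Apparent frequency of a pure tone f_hz after sampling at fs_hz:
    the tone folds back around the nearest multiple of the sample rate."""
    n = round(f_hz / fs_hz)
    return abs(f_hz - n * fs_hz)

# A 25 kHz ultrasonic tone sampled at CD rate reappears at 19.1 kHz,
# squarely inside the audible band. Tones below half the sample rate
# pass through unchanged.
folded = alias_frequency(25_000, 44_100)
```

Raising the sampling rate to 192kHz pushes that fold-over point far above the audio band, which is the point made in the next paragraph about filtering.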
The most significant problem was that we had to filter out high frequency information sitting very close to the audio band, which required extremely steep filters; a much higher sampling rate relaxes that filtering and gives us much higher bandwidth. So needless to say, HDMI represents absolute state of the art sound quality, with up to eight channels of uncompressed audio over a single connector. That truly is as good as it gets. When we say what we are listening to sounds like the master tape, we are in fact hearing the master soundtrack. Dolby TrueHD and DTS-HD make this possible. This is the kind of sound used at Skywalker Ranch, the soundstage where movies like The Abyss and the Star Wars series were mixed. The result is master quality sound, and HDMI is the only place you can get it.
HDMI (High Definition Multimedia Interface) used to be called the High Definition Mystical Interface, because there was often a mystery as to whether or not it would work. HDMI has always been fully compatible with DVI-D; in fact, inside an HDMI cable is a fully realized DVI cable. But in the earlier days, commercial gear did not have HDMI inputs. Often it was not HDCP compliant, and its EDID table was not written the same way as consumer gear's, so there were incompatibility issues. Your DVD player was speaking French, your cable or satellite receiver was speaking Italian, and they were unable to communicate with one another.
What needed to happen was the standardization of those EDID tables and the implementation of HDCP protocols, including one-to-many authorization and authenticate forever, in order for HDMI and DVI-D to truly become compatible and, in fact, interchangeable. Fortunately this is where we are today, and it means you will not experience compatibility problems between HDMI and DVI-D on any of your contemporary gear. If you buy a commercial panel that has an HDMI input, it is compliant with all the specifications for HDMI 1.4 that we will discuss now.
HDMI delivers uncompressed digital video, multi-channel audio and CE control. From a practical perspective we are now working at level 1.3b. When we first started, HDMI 1.0 had significant performance issues, but the standard has since undergone many revisions, and in January 2010, HDMI version 1.4 was introduced with several new features. For the most part, HDMI 1.4 works just like HDMI 1.3 and does not require new cables, with one exception we will get to shortly. You do, however, need the right chipsets in order to implement some of the new features. Let's take a look at what happened with HDMI 1.4; but first, let's take a look at CEC.
The idea behind CEC was to provide the ability to press the "play" button on the DVD player which would automatically turn on the receiver as well as the display, then go to the correct input, the correct resolution, and the correct surround sound format and begin playing.
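At the wire level, the "one touch play" scenario described above boils down to two short CEC messages: Image View On (opcode 0x04) sent to the TV, then a broadcast Active Source (opcode 0x82) carrying the sender's physical address. The sketch below builds just the header and opcode bytes; bit-level bus framing (start bits, ACKs, EOM) is omitted for clarity:

```python
def cec_one_touch_play(initiator=4, physical_addr=(1, 0, 0, 0)):
    """Build the two CEC message bodies behind 'one touch play'.

    initiator 4 is the logical address of a playback device; the TV
    is logical address 0 and 0xF is the broadcast address.
    """
    header_to_tv = (initiator << 4) | 0x0
    image_view_on = [header_to_tv, 0x04]          # wake the display

    header_bcast = (initiator << 4) | 0xF
    a, b, c, d = physical_addr
    active_source = [header_bcast, 0x82,          # announce ourselves
                     (a << 4) | b, (c << 4) | d]  # e.g. 1.0.0.0 -> 0x10 0x00
    return image_view_on, active_source

ivo, asrc = cec_one_touch_play()
```

Every device on the bus sees the Active Source broadcast, which is how the display and receiver all switch to the correct input from a single button press.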
That did not work at first because the CEC bus, or CEC language, was not standardized across all platforms, industries and manufacturers, so the problems were similar to those we experienced with HDCP and EDID. However, with the advent of HDMI 1.3b and HDMI 1.4, it is imperative that CEC be correctly implemented, and we can see today that it is. This is very important to us at C2G because of our new TruLink Media Gateway, a device that will be available for shipment starting in September 2011. This solution allows you to bring both analog and digital inputs into any digital-based system; the TruLink Media Gateway becomes a universal access point for all media.
Every consumer panel that has an HDMI input supports CEC. On a commercial panel you have to go into a sub-menu to turn CEC compliance on or off, because you don't want CEC commands interfering with RS-232 or other commands that might be coming from a Crestron, AMX or Control4 type control system. In the consumer world, CEC is by nature always on; again, you can access a service menu and turn it off. There are very few differences today between commercial and consumer monitors, with the sole exception being that some commercial monitors have power supplies designed for 12-hour or 24-hour heavy duty cycle operation. However, the compliance is the same.
So, here is what happened when we added the complexity of HDMI 1.4. First, and this is the only feature that requires a different cable, we added the HDMI Ethernet Channel. This HDMI 1.4 feature allows HDMI devices to share an Ethernet connection. Remember how I said that for EDID to really work we have to constantly update firmware, and that if you integrate older panels or source devices into an installation, you have to update them so those EDID tables stay current and the devices all speak the same language? That would require almost every device to have an Internet connection. When you look at sophisticated home theaters or corporate board room and presentation systems, you will see that the audio/video pre-amp processor, whether it is JBL Synthesis, Toshiba or Sony, has an Ethernet input. The DVD player has an Ethernet input, the codec has an Ethernet input, and even the display panels have an Ethernet input. Part of the reason is so you can leverage services like Hulu, YouTube and Netflix; the other part is to make sure the firmware on these sets stays compatible with new devices.
The HDMI Ethernet Channel solves the issue of getting a network connection to every device when an HDMI cable is already running between them. The specification changes the pin-out of that 19-conductor cable and allows some of those pins to carry up to 100Mbps of Ethernet traffic between devices. This does require that both the source and the display device have HDMI 1.4 level chipsets and that there is an Ethernet-enabled cable between the devices. The networking is, in effect, self-contained within the A/V connection itself.
Second, HDMI added another feature: the audio return channel. This does not require a new cable, but it does require both the source and receiving device to be HDMI 1.4 chipset-enabled. Let's use this example: I have a television in the living room and I'm connected to an off-air antenna. How do I get the sound off of that panel and into my audio/video receiver? In the past I probably would have to go through S/PDIF or TOSlink digital audio, and more than likely I would have some sort of lip-sync offset or time delay because of latency. It was sophisticated; it was complicated; and it did not work really well.
What happens with the audio return channel is that a single HDMI cable connection from the audio/video pre-amp processor to the display allows the audio to travel in the reverse direction, from the display's internal tuner back to the audio system. Where is this important in a commercial installation? Think once again about a corporate board room presentation system. Maybe the CEO of that company wants to be able to turn on MSNBC, FOX News or CNN to watch news stories that may be related to his company. Do we want this to be a complex setup that involves an external tuner and all these other devices, or would we prefer to keep it simple so that it's easy and intuitive? Perhaps we have a scenario where we are trying to take the output of a local tuner and incorporate it into a codec so that it can run through a digital signage product or reach multiple buildings. That's where we'll experience some real challenges. Some codecs have internal software that prevents echoes, and sometimes those echoes can be a result of latency in the electronics. As a result, we may end up eliminating the sound we actually want. Once again, this HDMI bi-directional capability can eliminate some latency and allow a much greater level of interoperability between devices.
Third, HDMI 1.4 delivers 3D. 3D is, of course, very exciting for the digital signage environment. Personally, I think just about everyone in the consumer world and the commercial world has gotten a big yawn out of this technology. I do not expect 3D to necessarily take over corporate board room type projection systems, but I do expect 3D to be very important in the realm of digital signage applications.
Fourth, HDMI 1.4 increases the maximum resolution from 1920x1080 at 60Hz, or 1080p, all the way up to 4K resolution, which is 3840x2160. Don't get too excited and run out and buy Blu-ray players that will play that level of resolution. I seriously doubt we will have an abundance of pre-recorded content at that high of a resolution. However, we might see something like that for gaming and simulation systems where we want the highest resolution possible. Also keep in mind that there are limitations to the level of resolution that the human eye can actually see.
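To put those resolutions in perspective, here is a back-of-the-envelope sketch of the TMDS bandwidth each mode demands. This is an illustration, not a spec calculation: it ignores blanking intervals (so real link rates run somewhat higher), and the function name is my own. The one firm fact it leans on is that TMDS encodes each 8-bit value as 10 bits on the wire across three data channels.

```python
# Rough aggregate TMDS bit-rate estimate. Illustrative only: real HDMI
# timings include blanking intervals, which this sketch ignores.

def tmds_bandwidth_gbps(width, height, refresh_hz, bits_per_channel=8):
    """Approximate bit rate across the three TMDS data channels.

    TMDS carries each 8-bit value as 10 bits on the wire, hence the
    10/8 coding factor.
    """
    pixel_clock = width * height * refresh_hz          # pixels per second
    bits_per_pixel = 3 * bits_per_channel * 10 / 8     # 3 channels, 10/8 coding
    return pixel_clock * bits_per_pixel / 1e9

print(round(tmds_bandwidth_gbps(1920, 1080, 60), 2))   # 1080p at 60 Hz -> 3.73
print(round(tmds_bandwidth_gbps(3840, 2160, 30), 2))   # 4K at 30 Hz   -> 7.46
```

The point of the comparison: even at a modest 30Hz refresh, 4K roughly doubles the data moving down the cable, which is why cable quality matters more as resolution climbs.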
Fifth, HDMI 1.4 has expanded support for color spaces and allows up to 12 bits per channel of Deep Color, which yields well over a billion colors.
Last, but not least, the new HDMI 1.4 standard encompasses new connectors, including the HDMI micro connector (Type D) and an automotive connection system that finally gives HDMI a locking connector. Unfortunately, the locking design is intended for the automotive industry. It is not something we will see being utilized for indoor rack-mounted gear such as players and displays, etc.
I have mentioned that one of the downfalls of HDMI is its friction-fit connector. It is true that the connector can be a real challenge if you are trying to connect HDMI to something on a moving arm, an articulated arm or a swing-out arm; however, proper strain relief can reduce those problems. DVI-D, with its thumbscrew retention, did not present those issues, and HDMI will continue to present them. By and large, though, we do not see a lot of problems with HDMI connectors in rack-mounted applications.
The HDMI 1.4 Ethernet Channel, then, is an advanced bi-directional communication path that lets every connected device reach the Internet through a single LAN port. This allows you to keep the firmware up to date on all of your devices, a very nice feature for certain applications.
Much more exciting than that is the digital audio return channel, which allows us to deliver audio "upstream" from a TV or display device to an A/V receiver or amplifier. Think about how that might be used, for example, in a small-scale digital signage application where I simply want to distribute RF. Perhaps I'm coming out of a ZV-type ATSC encoder, using QAM or 8VSB modulation to transmit my signal to various displays at different locations, and I want a pair of speakers at one display. The audio return channel will allow me to come out of that display into a small amplifier to drive local speakers.
Now, while HDMI was progressing and we saw the industry moving toward much more elegant implementations of digital video, the guys in the computer world were not sitting back ignoring all of this. They were paying attention. A number of years ago they came up with a solution called DisplayPort. DisplayPort was originally created to connect a computer's graphics hardware to a display device. It was of particular importance inside a laptop because DisplayPort is asynchronous. In other words, it does not have the clock pair, even though it has the Red, Green, Blue pairs like HDMI; the clock information is embedded in what would be the Red, Green, Blue data. What this means is that it operates similar to USB and can handle multiple signals going to multiple devices, all over the same cable.
What happened in the laptop is that every laptop computer, up until the advent of DisplayPort, had something called an LVDS driver. The LVDS driver is what drove the internal display in the laptop. The folks building computers realized if they could get rid of the LVDS driver, they could get rid of an entire chipset. This meant that they could design smaller and lighter laptops that were less expensive and that used less battery power because there would be fewer devices to power.
Another advantage of DisplayPort is that it is royalty-free. Every manufacturer that incorporates HDMI into its device must pay a $0.04 HDMI royalty on that HDMI port. The expense can be quite significant when you start thinking in terms of a device that has perhaps eight to ten HDMI inputs, and you are manufacturing a million of them per year. In December of 2009, shortly after HDMI 1.4 came out, DisplayPort version 1.2 was announced. Since DisplayPort was not gaining much traction beyond the computer world, they decided to make it more like HDMI. It does have a locking connector, the cable is slim and it is shaped very much like HDMI. The conventional wisdom said, "Let's give it the ability to handle bi-directional LAN capability, the ability to handle upstream audio capability and the opportunity to do some other things."
Just for clarification, DisplayPort is not by its very nature compatible with HDMI. It is wholly unique. However, because DisplayPort is malleable and can do so many things, it can output an HDMI signal. Any device, for instance, like the Dell laptop I am using right now to write this article, can have a DisplayPort output; you can then buy an adapter from C2G that will allow you to go from DisplayPort to HDMI. So, if you have an HDMI-enabled infrastructure, you are at least to some point DisplayPort enabled.
Here is what DisplayPort did that was really cool. First, it kept the bidirectional capability of HDMI 1.4, so it could distribute commands or update firmware, and version 1.2 raised the total bandwidth to 17.28Gbps. It uses smaller cables based on a micro-packet protocol rather than a serial data stream, which means a DisplayPort device can operate in a star configuration or in a serial, daisy-chained configuration. This is what happened with specification version 1.2, and it was designed once again to support that chip-to-chip communication. So we will be seeing an advancement of DisplayPort driven by the computer manufacturing industry.
DisplayPort is fully compatible with YPbPr and RGB, which means it is fully compatible with things like Blu-ray players. This is why I can go out and get a DisplayPort-enabled computer with a built-in Blu-ray drive that is fully compatible with HDMI and fully compatible with Blu-ray playback. What DisplayPort does that is so incredibly unique, as a result of its asynchronous, packetized design, is multi-streaming.
Imagine coming out of something like a digital signage device or a computer with a single cable going into a single 50" panel. Then from that panel it goes to a second 50" panel, from the second to a third, and from the third to a fourth. With nothing more than four pieces of wire, I can drive four 50" panels and get four entirely separate 1920x1080 images. This requires a lot of processing power, but just think how much that would simplify something like a small-scale digital signage video wall where I'm going to do a 2x2 output. All I need is wire to hook it up, and the way those images work together to create a single larger image, or fracture to create smaller images, is part of the content. This is multi-streaming.
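As a sanity check on that 2x2 scenario, here is a rough calculation of whether four uncompressed 1080p60 streams fit within DisplayPort 1.2's bandwidth. The 17.28Gbps effective figure (four lanes at 5.4Gbps, less 8b/10b coding overhead) and the 24 bits per pixel are stated assumptions; real multi-stream transport adds packet framing and blanking that this sketch ignores.

```python
# Back-of-the-envelope check: do four uncompressed 1080p60 streams fit
# within DisplayPort 1.2's roughly 17.28 Gbps of effective bandwidth?
# Figures are assumptions for illustration; MST framing adds overhead.

DP12_EFFECTIVE_GBPS = 17.28  # 4 lanes x 5.4 Gbps, after 8b/10b coding

def stream_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Raw payload rate of one uncompressed video stream."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

demand = 4 * stream_gbps(1920, 1080, 60)   # the 2x2 video wall
print(f"{demand:.2f} Gbps needed of {DP12_EFFECTIVE_GBPS} available")
print("fits" if demand <= DP12_EFFECTIVE_GBPS else "does not fit")
```

Under these assumptions the four panels need about 11.9Gbps, comfortably inside the link budget, which is why a single DisplayPort 1.2 cable can plausibly feed a small daisy-chained video wall.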
I can also do this in a hub configuration where I come out of a computer into a single hub and from that hub into the four panels simultaneously, without having to worry about latency. Multi-streaming is a really important part of DisplayPort 1.2. If you are in the digital signage business, you are going to get a lot of requests for this technology from those involved primarily in financial services, such as banks and market banking facilities. You are also going to encounter this a lot in the medical industry. DisplayPort seems to be picking up steam when it comes to medical imaging devices. One scenario might be where a laparoscopic camera is being used and the surgeon needs one display that shows the image from the camera and another display that shows the status of the camera: temperature, operating voltages, etc. The benefit of DisplayPort is that it eliminates the need for a ton of cables, which would make such a device very bulky.
The downside to DisplayPort 1.2 is its length limitations. In fact, the longest length over which we have seen DisplayPort work is about seven meters, and at resolutions of 2560x1600 we are limited to something more like six feet. For this reason, there is still a little way to go before this topology works its way into the audio/video world. For those of you doing audio/video design and installation in a corporate board room: if you provide a TMDS channel that supports HDMI and a client brings in a device that is DisplayPort enabled, you can translate from DisplayPort to HDMI in the same way you can translate from HDMI to DVI-D. You lose some features of the higher-level product, but you do get basic compatibility. So, if we have an HDMI-enabled installation, by default we have a DVI-D and DisplayPort enabled installation. This is one reason it is so important to understand transition minimized differential signaling and its relationship to HDMI. Maybe by now you are beginning to understand why I was so perplexed to hear A/V professionals say they do not allow HDMI and prefer DVI-D instead. Essentially HDMI is DVI-D, and functionally, there is no difference in today's environment between these two technologies.
Now let's talk a little bit about the practical concerns of the TMDS environment. This is the part I refer to as "why it didn't work." First of all, if we are installing a source and a display and we get that dreaded black screen of death, chances are we are dealing with EDID ROMs that are coded incorrectly. If the system fails to read these devices, it simply will not give you an image, or it will give you an image that does not work very well. Fortunately, most of those problems have been addressed in the past two years.
So, if you are performing an installation where your client has a 60" plasma panel that was purchased five years ago and only has fifty hours on it, you should be sure to make allowances to update the firmware. Virtually every device has the ability to have its firmware updated from the manufacturer's website. Simply download the update, connect a laptop to the service port on the back of that set and allow it to update the firmware. An incorrectly coded EDID ROM is not something we see as a practical problem if we are using contemporary product.
HDCP authentication failure is a little more common. We still have some devices on the market that are not HDCP compliant. The main problem is that we either have an issue with repeater implementation or we have an "authenticate forever" condition. In some cases the issue may not be related to either of these, but instead to an unacceptable level of latency; in other words, the delay when switching from one input to another, such as when changing channels, takes too long. Using more sophisticated systems that incorporate key authentication management, such as those created by Crestron and Extron, can eliminate some of that latency and provide flexible switching, of course at a price point that goes along with that level of flexibility. But do not fear: simple TMDS installations with single point-to-point connections and very simple switching devices are also highly dependable in today's environment. HDCP authentication failure in these is just not something we see all that often.
Something that is a real problem in the TMDS environment is voltage sag. Fortunately, this is limited almost entirely to the world of cable boxes. Cable boxes are notoriously inexpensive, and they are built with very marginal power supplies, because the power supply is the most expensive part of any piece of electronic gear.
These devices operate on CML, or current-mode logic, which is 3.3 volts high and 1 volt low, with a 5 volt bus. In other words, the receiver that is in your 50" display is being powered by the transmitter in your Blu-ray player, not by the power supply in the TV. The power supply in that Blu-ray player must deliver power along the HDMI cable to create that hardware handshake.
Here is the problem. Almost every switching device, whether a simple Pioneer audio/video receiver, a sophisticated JBL Synthesis Pre-Pro or a very sophisticated DVI Gear 16-port matrix, is completely invisible to the signal. The source talks to the sink, or display, and everything in the middle is completely invisible for the purposes of HDCP and EDID capability. What happens with a cable box that has a marginal power supply is this: I plug the cable box directly into a monitor and get an image. Then I take that 3ft cable from the cable box, plug it into my A/V switcher or maybe an 8-channel matrix to run 30 feet to a projector, and I don't get an image. Next, I plug in my DVD player, and I get an image on the projector, so I know the cabling between the switcher and the projector is working properly. The issue is that the cable box is not only seeing the three feet into the switcher; it is also seeing the additional 30 feet to the display. So it is actually seeing a total of thirty-five or thirty-six feet, sometimes with extremely small wire gauges and a marginal power supply. The voltage sags below 5 volts, it does not power the receiver, and the system fails. We do have a solution for that — a voltage inserter. It is designed to work inside the HDMI environment and sells for less than $20. These problems almost always manifest themselves with cable boxes, occasionally with some inexpensive satellite boxes, but almost never with any other gear. So if you are not dealing with cable boxes or satellite boxes, there is no need to worry.
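To make the voltage sag concrete, here is a hypothetical drop calculation over the +5V line of that combined run. The per-foot wire resistance is an assumed figure for thin-gauge cable, and the 55mA load is a round number on the order of what an HDMI source must supply; neither comes from a specific cable's datasheet.

```python
# Hypothetical voltage-drop estimate for the HDMI +5 V power line.
# The resistance per foot is an assumed figure for thin-gauge copper;
# actual values depend on the cable's construction.

def plus5v_at_sink(length_ft, current_a=0.055, ohms_per_ft=0.065):
    """Voltage remaining at the sink end of the +5 V circuit.

    The current returns on the ground conductor, so the round trip
    doubles the conductor length carrying the drop.
    """
    drop = current_a * (2 * length_ft * ohms_per_ft)
    return 5.0 - drop

for feet in (3, 36):
    print(f"{feet} ft run: {plus5v_at_sink(feet):.2f} V at the sink")
```

Even under these mild assumptions the 36-foot run loses noticeably more voltage than the 3-foot run; start with a marginal supply that is already below 5 volts at the connector, and the receiver can drop out entirely, which is exactly what the voltage inserter corrects.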
Finally, in the "why doesn't it work" category, one of the biggest problems we see in the HDMI world today with cheap cables is impedance matching and reflections that compromise data detection. Obviously, you can go on Monoprice.com and buy a 50' HDMI cable for $12. Do you really want to do that in the context of your own installation? I will tell you that this is the biggest mistake you can make. Cheap cables are made with cheap spiral braids. The spiral shielding is like a Slinky; if I bend it, it opens and I no longer have a shielded cable. It actually changes the characteristic impedance and the very way this high-speed data is transported. Use high quality cables like those provided by C2G, and rest assured that you will not experience these issues.
With regard to practical limitations in the TMDS environment, if you are running under 50' using quality interconnects, you will have no problems achieving 1080p and you do not need any amplification. From roughly 50' to 100' you will need an active buffer or amplifier, and note where it goes: in the analog world the amplifier goes at the rack or head end, because the signal must be amplified before it is lost. In the digital world, the signal must degrade before it can be reconstructed via equalization and re-clocking, so the amplifier often belongs at the monitor, not at the source end. Don't make this mistake; it is a common problem I see, and it is simply the right device in the wrong place. Specifying a system that delivers 1920x1200 at 100' over a point-to-point connection is not such a stretch of the imagination; it is actually quite common. If you need to go beyond 100', and certainly at 150' and beyond, consider using Cat5 or Cat6 transport products. The recommended products use two Cat5 or two Cat6 cables, one for TMDS and the other for DDC, and they provide full 1920x1080 up to 200' with no problems.
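The distance guidance above reduces to a simple lookup. This sketch encodes the rules of thumb from this paper for 1080p-class signals over quality cabling; the function name and the exact thresholds are illustrative, not a formal specification.

```python
# Distance rules of thumb for a 1080p-class TMDS run, per this paper.
# Thresholds are guidance for quality cabling, not spec limits.

def tmds_link_recommendation(run_ft):
    if run_ft <= 50:
        return "passive HDMI cable, no amplification"
    if run_ft <= 100:
        return "active buffer/equalizer at the display end"
    if run_ft <= 200:
        return "paired Cat5/Cat6 extenders (TMDS + DDC)"
    return "HDBaseT or another long-haul transport"

for feet in (40, 80, 150, 400):
    print(f"{feet} ft: {tmds_link_recommendation(feet)}")
```

The key design detail baked into the middle tier is that the equalizer goes at the display end, since a digital signal is reconstructed after it degrades rather than boosted before.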
There is a new solution coming out on the market called HDBaseT. This product allows you to take a TMDS signal over a single Cat6 cable up to 350' at up to 1920x1200 resolutions with audio, CEC and RS232 capabilities. HDBaseT is really going to revolutionize how we approach the TMDS environment. HDBaseT has been introduced in products by Gefen, Extron and several other manufacturers. C2G will introduce its own solution to the market in the coming months.
Finally, the last tip is to not forget the lowest resolution rule. I see this happen time and time again. Someone putting together a sophisticated installation wants to add a preview monitor at the rack so the client can make a better choice about what they are seeing and also have various monitors elsewhere. Because of the nature of TMDS, it uses EDID to establish resolution. It will look at EDID on the primary set and establish an EDID resolution that is the lowest common resolution of all of the sets. This means that if you put a 14" monitor designed for 480p into your rack, understand that you set the resolution for your entire installation to 480p because that is the lowest resolution in the rack. If you want your entire installation to be 1080p, then every single piece of display gear must be 1080p. You can't have one 720p display at the end of the line and assume it will be fine. The lowest resolution rule says that the lowest resolution device sets the resolution for the entire installation.
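The lowest resolution rule can be sketched as a simple negotiation over EDID data. The display names and modes below are invented for illustration; the logic is simply that the distribution system settles on the best mode every sink can accept, which collapses to the least capable display.

```python
# The lowest-resolution rule, sketched with hypothetical EDID data:
# the system negotiates down to the least capable connected display.

# (width, height) each display reports as its preferred mode
edid_preferred_modes = {
    "boardroom projector":  (1920, 1080),
    "lobby panel":          (1920, 1080),
    "rack preview monitor": (720, 480),   # the 480p monitor in the rack
}

# The negotiated mode is capped by the display with the fewest pixels.
negotiated = min(edid_preferred_modes.values(),
                 key=lambda wh: wh[0] * wh[1])
print(f"system resolution: {negotiated[0]}x{negotiated[1]}")
```

One 480p preview monitor drags the whole system down to 720x480, which is exactly why every display in a 1080p installation must itself be 1080p capable.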
In summary, HDMI and DVI-D are variations on the exact same technology. There is no functional difference between the two of them, whether in resolution, performance or dependability. DisplayPort supports HDMI and DVI-D connectivity, but DisplayPort signals are not inherently compatible with HDMI and DVI-D. If I take DisplayPort and adapt it to HDMI or DVI-D, I lose the asynchronous ability of DisplayPort to carry multiple signals over the same wire. Finally, TMDS signals are meant for short distance, point-to-point connectivity. If we are looking at running signals several hundred feet or more, we have to look at other ways of doing it.
This white paper is for informational purposes only and is subject to change without notice. C2G makes no guarantees, either expressed or implied, concerning the accuracy, completeness or reliability of the information found in this document.