Broadcasting Live Events in UHD

This article was originally published in the Society of Motion Picture and Television Engineers (SMPTE) Newsletter.

Much of the conversation about the broadcast industry’s next great leap forward into the world of Ultra High Definition (UHD) has centered on how broadcasters will build or rebuild their wider infrastructures on IT-based foundations capable of handling the high-bandwidth data the UHD broadcast paradigm requires. Less debated are the nuances of the front end of the UHD transition: image capture. That is largely due to the belief that ultra high-resolution cameras have become so common that capture shouldn’t be much of an issue. But that notion is simplistic, according to many broadcast professionals, because it references modern digital cinematography camera systems with high-resolution imaging sensors, none of which are particularly applicable to the broadcast of live events, especially where sports and action are concerned. Figuring out how to shoot and broadcast that kind of content is crucial to broadcasters, because live content (sporting events, concerts, breaking news, and the like) is precisely what modern, IT-based streaming services like Netflix are not addressing. Such content therefore remains the province of major broadcast entities as the UHD era dawns, and they need to shoot these events so that the images translate well on UHD televisions built for watching 4K-resolution movies with a range of other image improvements: greater dynamic range, higher frame rates, and better color, among other things.

Klaus Weber, worldwide product marketing manager for imaging products at Grass Valley and an active participant in the EBU’s Beyond HD initiative, spoke on this subject at last October’s SMPTE Technical Conference and wrote a paper on it, which was published in the April 2015 online edition of the SMPTE Motion Imaging Journal. He suggests that the image-capture needs of live UHD broadcasts “present a complete new challenge to the market.” Despite incredible daily innovation inside the labs of virtually every major technology manufacturer in this space, Weber cautions that, in his view, there will be no quick, easy, or complete camera solution to this challenge in the foreseeable future. Rather, the industry will have to learn to juggle the twin arts of compromise and flexibility when it comes to solving the problems posed by major live events slated for UHD broadcast.

“Remember, we are talking about cameras for live productions, so let’s say the main focus is on cameras that have a cable between the camera head and the camera base station,” Weber explains. “We are not talking about camcorders here, or digital cinematography cameras used for cinema applications; only cameras for live environments. And these cameras are to be used for live productions in UHD. That means we have to consider what UHD means: first of all, a higher pixel count. On the other hand, the total UHD standard is much more than just a higher pixel count. We have other requirements like higher dynamic range, higher frame rate, extended color gamut, possibly higher bit depth, and so on. These things mean we want essentially better pixels. The problem is, the idea of better pixels is completely opposite to the idea of more pixels, and right now, it is not possible to combine them [on a digital camera’s imaging sensor]. You can’t [accommodate] both ideas at the same time in the camera; it doesn’t work that way.”

Weber adds that this dichotomy matters given the kinds of content that broadcasters need to capture and broadcast live, particularly live sports. He elaborates that there is a suite of options and compromises broadcasters have to evaluate in deciding which camera systems or configurations to pursue for such events if they mean to shoot them for UHD broadcast.

“If you want true, native UHD, then you need four times more pixels than what is available in an HD camera: double the amount horizontally and double the amount vertically,” he explains. “So what are your possibilities? You can keep the size of the pixels as they are in an HD camera, and then your imager gets four times larger. Then you basically arrive at what they do with digital cinematography cameras with larger imagers. Having three large imagers with a prism beam splitter is not practically manageable, so that means keeping the large pixels on a true 4K imager, and that means a single-imager camera. We have had this for quite a while with digital cinematography. The problem is, for many live events, this has been proven to require certain compromises, such as needing to use film-style lenses or having limited zoom ranges because of the large PL-mount lenses, which give you a very shallow depth of field. And that kind of depth of field is not usable in many live events.”

In other words, the first potential compromise involves relying on a true 4K imaging sensor exclusively to give the broadcast good sensitivity and dynamic range, but at the cost of significant optical problems on a UHD telecast. Weber calls this “not a preferred solution for most, if not all, live productions.”
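To make the trade-off concrete, here is a minimal sketch in Python of the pixel-count arithmetic behind this first option; the resolutions are the standard HD and UHD-1 grids, and the comments simply restate Weber’s reasoning.

```python
# Illustrative arithmetic for the first compromise: native UHD needs four
# times the pixels of HD, so with HD-sized pixels the sensor must grow 4x.
# Figures are nominal grids; real sensor geometries vary by manufacturer.

HD_RES = (1920, 1080)    # HD pixel grid
UHD_RES = (3840, 2160)   # UHD-1 ("4K") grid: 2x horizontal, 2x vertical

hd_pixels = HD_RES[0] * HD_RES[1]      # ~2.07 million pixels
uhd_pixels = UHD_RES[0] * UHD_RES[1]   # ~8.29 million pixels
print(f"UHD needs {uhd_pixels / hd_pixels:.0f}x the pixels of HD")

# Keeping the HD pixel pitch means the sensor area also grows 4x, which
# pushes you toward a single large imager (digital-cinema style), since a
# 4x-larger three-chip prism block is not practically manageable.
print(f"Same pixel pitch -> sensor area grows {uhd_pixels / hd_pixels:.0f}x")
```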

Alternatively, he continues, “you can make your pixels four times smaller, and squeeze four times as many of them onto the same 2/3-in. imager as we currently use in HD. If we do that, though, our pixel performance gets much lower because you need four times more light to generate the same amount of signal charge. In other words, your sensitivity goes down by about two F-stops, and actually, if you look at it in more detail, you will realize that since some parts of the pixels cannot be made smaller, the area available for collecting light will actually be less than one fourth for each pixel. That means the pixel performance will be even lower. There are some cameras out today that are trying this approach, but they are not usable for many kinds of live productions.”
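The roughly two-stop penalty follows from the definition of an F-stop as a doubling of light; here is a short sketch of that arithmetic, in which the fill-factor figure is a made-up illustration rather than a measured value:

```python
import math

# Sketch of the sensitivity penalty when 4x as many pixels are squeezed
# onto the same 2/3-inch imager: each pixel collects 1/4 the light, and
# one F-stop corresponds to a factor of 2 in light.

pixel_count_scale = 4                      # 4x more pixels, same sensor area
light_per_pixel = 1 / pixel_count_scale    # each pixel gathers 1/4 the light
stops_lost = math.log2(pixel_count_scale)
print(f"Ideal case: {stops_lost:.1f} F-stops of sensitivity lost")  # 2.0

# Per-pixel overhead (wiring, transistors) does not shrink with the
# photodiode, so the light-gathering area is less than one fourth.
fill_factor_penalty = 0.8  # hypothetical: only 80% of the ideal area remains
effective_light = light_per_pixel * fill_factor_penalty
print(f"With overhead: {math.log2(1 / effective_light):.2f} F-stops lost")
```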

Next, Weber says broadcasters can opt “to make your pixels more simple. Instead of having five transistors per pixel, you can take out two of the transistors in every pixel and create more space for your photodiode, giving you simpler CMOS pixels.”

Naturally, he continues, there is a cost to this approach as well: the loss of the ability to use a global shutter. “That means you go to a rolling shutter, and up until now, at least, a rolling shutter has never been accepted for high-end applications, because it introduces a lot of artifacts, as we have seen from consumer cameras and phone cameras, and those artifacts are not acceptable for broadcast.”

Yet another methodology being introduced to the market keeps pixel size and imager size as they are today on HD cameras, but uses three full 2/3-in. HD progressive imagers in an RGB camera. “This approach provides more than an HD image, closer to 3K in the red, green, and blue channels, and then [via software], you do a kind of up-conversion to 4K,” Weber explains. “This does not give you native 4K resolution, but it allows you to have the same sensitivity as a regular HD camera with better resolution, much closer to that of a native 4K imager. In our work [at Grass Valley], we have found this permits a dynamic range close to 15 F-stops, which is the level required to perform HDR operations for the UHD standards.”
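For reference, each F-stop is a doubling of light, so the 15-stop figure maps to a scene contrast ratio of roughly 32,768:1; a one-line conversion:

```python
# Each F-stop doubles the light, so N stops span a contrast ratio of 2**N.
stops = 15
print(f"{stops} F-stops ~= {2 ** stops:,}:1 scene contrast")  # 32,768:1
```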

All of these approaches, by their nature, involve some level of compromise over what it means to capture a “UHD” image for a live event. Doesn’t it seem logical, though, that they are just temporary stops on the way to “true 4K” live broadcast cameras that, given the importance of this type of programming and the industry’s recent pace of technological progress, will eventually solve these problems once and for all? One might expect so, but don’t count on a solution any time soon, Weber suggests. The underlying problem is fundamentally different from those of other technologies, he says, and that limits the pace at which a solution can be invented.

“Maybe we will get that higher sensitivity [in UHD that we have in HD], but it won’t be soon,” he insists. “If you look back over the last 20 years or so at how fast sensitivity has developed in imaging technology, it took us between 5 and 10 years just to double it. And as I’ve said, native 4K images with four times more pixels need four times the light. So you have to figure a time frame of at least 10 to 20 years until we can compensate for that, not one or two years. This is not the same kind of problem as with processing speed, or RAM, or hard-drive capacities, which can double every 12 to 18 months. That is simply not the case with imaging technology. So, unless someone invents an entirely new kind of imaging technology, which I don’t expect, it will take us at least five, if not 10 to 20, years before we can compensate for this sensitivity problem. And that means we will have to live with these compromises for quite a while.”
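Weber’s estimate is simple compounding: recovering a factor of four in sensitivity means two doublings, and each doubling has historically taken 5 to 10 years. A sketch of that back-of-the-envelope calculation:

```python
import math

# Back-of-the-envelope timeline from the quote: sensor sensitivity has
# historically doubled roughly every 5-10 years, and native UHD capture
# needs a 4x recovery to match HD sensitivity.
required_gain = 4                     # 4x more light needed per pixel
doublings = math.log2(required_gain)  # = 2 doublings
years_low, years_high = 5, 10         # historical years per doubling

print(f"Recovering {required_gain}x sensitivity: "
      f"~{doublings * years_low:.0f} to {doublings * years_high:.0f} years")
```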

Long enough to be well into the UHD era, at least, Weber suggests. But if that is the case, what should broadcasters do in the meantime to produce live sporting events in UHD?

