HDMI 2.2 Finalized with 96 GB/s Bandwidth, 16K Resolution Support (tomshardware.com) 70

The HDMI Forum has officially finalized HDMI 2.2, doubling bandwidth from 48 GB/s to 96 GB/s compared to the current HDMI 2.1 standard. The specification enables 16K resolution at 60 Hz and 12K at 120 Hz with chroma subsampling, while supporting uncompressed 4K at 240 Hz with 12-bit color depth and uncompressed 8K at 60 Hz.

The new standard requires "Ultra96" certified cables with clear HDMI Forum branding to achieve full bandwidth capabilities. HDMI 2.2's 96 GB/s throughput surpasses DisplayPort 2.1b UHBR20's 80 GB/s maximum. The specification maintains backwards compatibility with existing devices and cables, operating at the lowest common denominator when mixed with older hardware. HDMI 2.2 introduces a Latency Indication Protocol to improve audio-video synchronization in complex home theater setups.


Comments Filter:
  • gotta keep selling those "new" cables!

  • Sounds excessive (Score:2, Insightful)

    by gweihir ( 88907 )

    In fact, it sounds like the standards committee should have been disbanded about one version ago. This is not going to become useful before major technological changes make it obsolete.

    • Re:Sounds excessive (Score:5, Informative)

      by Guspaz ( 556486 ) on Wednesday June 25, 2025 @12:44PM (#65475362)

      It will become useful yesterday. Monitors that exceed the bandwidth of HDMI 2.1 have been on the market for some time now. They currently either rely on DSC or DisplayPort (or both).

      Current 4K240 monitors require around 129% of the available bandwidth that HDMI currently provides. When operating in 10bpp HDR, they require 161%.

      Considering that lower resolution monitors on the market today go up to 540 Hz, the appetite for increased connection bandwidth is insatiable.
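A rough sketch of the arithmetic behind those percentages. This counts raw pixel bits only, ignoring blanking intervals and FRL link-encoding overhead, so real link requirements run somewhat higher than these figures (HDMI 2.1's usable payload is roughly 42.7 of its nominal 48 Gbps):

```python
def raw_rate_gbps(width, height, hz, bpc):
    """Uncompressed video data rate in Gbps: 3 color channels, bpc bits each."""
    return width * height * hz * 3 * bpc / 1e9

# 4K at 240 Hz
sdr = raw_rate_gbps(3840, 2160, 240, 8)    # ~47.8 Gbps at 8 bits/channel
hdr = raw_rate_gbps(3840, 2160, 240, 10)   # ~59.7 Gbps at 10 bits/channel
print(sdr, hdr)
```

Even the best case already brushes against HDMI 2.1's effective throughput before any signaling overhead is counted, and the HDR case clearly exceeds it.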

      • by gweihir ( 88907 )

        Indeed! And it may be useful on space-rockets as well! Because most people have these! Seriously, do you even realize how stupid you sound?

        • by Guspaz ( 556486 )

          So your argument is that technological advancement is pointless because only high-end products can benefit today?

        • HDMI has been doing a shit job of kicking the can which is why we have so many sub-versions. They absolutely should make a big jump forwards for a change, because they have been creating way too many versions of HDMI and that is already leading to customer confusion.

          • The problem HDMI has is that:

            1) They have to work with a given number of conductors in the HDMI cable, defined back when 1080p60 was a lot of pixels

            2) They want the cables to have standardised Gb/s speeds

            3) They want the cables to be cheap

            So, instead of defining a "big jump forwards" spec that would require a very expensive cable, they'd rather cover the needs of the now and leave future problems for the future.
            • Well, this is a bigger jump than they have been making, so perhaps it will last longer. Nobody expects it to last forever, but they've been failing pretty badly. Besides, as long as the spec of cables most people buy remains cheap, and most people will not need these new cables for years anyway, there is no reason why the fancy new ones need to also be that cheap.

        • There are $700 gaming monitors with those specs already. There will probably be $300 gaming monitors with those specs in a year or two.

      • by Luckyo ( 1726890 )

        This. Of all the marketing wank about 16k, the actually relevant part is this:

        >supporting uncompressed 4K at 240 Hz with 12-bit color depth

        This is what actually exists, and has been supported by DP since 2019 or so.

        • by Guspaz ( 556486 )

          I'd dispute that. First because the first GPU to support full bandwidth DisplayPort 2.X didn't come out until 2025, and second because DisplayPort 2.X can't support uncompressed 4K 240 Hz with a 12-bit colour depth. It can do it with compression, or it can do 10-bit colour depth, but not uncompressed 12bpc. That's not a big loss, I'd bet you that nobody in practice could tell the difference between 10bpc with temporal (or even spatial) dithering (which GPUs do automatically) and 12bpc. And the consumer HDR

          • by Luckyo ( 1726890 )

            You're comparing an apple to a drawing of an apple.

            The spec came out in 2019. Monitors that support the spec came out recently, and so did GPUs. But both are out now. It takes time for a spec to mature into a proper product (because engineering isn't instant creation into being).

            HDMI didn't even have a spec to support this until now. And it will take a while before that spec is implemented too, because engineering still isn't instant creation into being.

      • Considering that lower resolution monitors on the market today go up to 540 Hz, the appetite for increased connection bandwidth is insatiable.

        540Hz probably sounds ridiculous to the average user, and even most gamers still find it ridiculous (good luck getting 540fps).
        But frame generation in modern GPUs will eventually fill 1000Hz (you only need ~120fps native to make it feel good), and ~1000Hz on a sample-and-hold display should yield motion clarity comparable to a CRT.

        Plus, TVs keep getting larger, so 8K@1000Hz with 12bpp + HDR will be very useful.

        • by Guspaz ( 556486 )

          There are benefits to higher refresh rates beyond just reducing latency or reaction times. It improves motion clarity, that is, it reduces motion blur. There are other ways to do that, like black frame insertion or backlight strobing, but those have severe negative impacts on brightness, and framegen is a way to improve motion clarity without reducing brightness. Framegen shouldn't be about improving low framerates. It should be about taking something like 60 FPS and turning it into 240 FPS for better motio

    • by bn-7bc ( 909819 )
      At this point it might be time to drop HDMI and go for 100Gbps Ethernet instead. Reach is not going to be an issue, since HDMI probably doesn't go beyond 10-15m anyway, and Cat6a cables are also way cheaper.
    • Are you confusing this with Displayport or are you completely ignorant? HDMI is bandwidth constrained with current generation monitors, relying on compression to get the data where it needs to go in the high end.

      Maybe let the actual experts do their job rather than calling for them to be disbanded. They clearly know more about this than you do.

  • by groobly ( 6155920 ) on Wednesday June 25, 2025 @12:16PM (#65475272)

    Why do I need 16K resolution? Sitting one foot away from a 180inch screen?

    • Re:why (Score:5, Interesting)

      by karmawarrior ( 311177 ) on Wednesday June 25, 2025 @12:20PM (#65475284) Journal

      I don't know if they've added the relevant features to HDMI but for DisplayPort one of the reasons they upped the speed was so that you could drive multiple monitors from one cable.

      Personally, while I like that this is putting pressure on the DP group not to rest on their laurels (it'd be nice to drive four 4K monitors from one cable...), I don't really care what the HDMI group says. The term is synonymous with royalties and unnecessarily expensive cables. DisplayPort, which is at least free and open, seems to be both the more innovative standard and the freer one.

    • by Rinnon ( 1474161 )
      For all the same reasons you needed 4K resolution!
    • Re:why (Score:5, Insightful)

      by ArchieBunker ( 132337 ) on Wednesday June 25, 2025 @12:50PM (#65475378)

      Well the slashdot crowd has certainly aged out. At one time people would have been excited about incredible resolutions. Now it's just old people complaining.

      • I would imagine the time between the widespread standardization on 1080p and the widespread availability of cheap 4K monitors was, for most Slashdotters (who are Gen X and older), the time about which our eyes all turned to shit (45-50 years old.)

        So you're probably not wrong. But that said, more monitors, larger monitors, and better colour, are all things we should appreciate even with crappy eyes.

      • by Anonymous Coward

        You enjoy this because you're an electronics geek. I enjoy it too. But in all other respects, your posting shows that you are a cantankerous old coot, so you are in good company with the old people complaining.

      • At one time people would have been excited about incredible resolutions. Now it's just old people complaining.

        My doctor doesn't want me getting too excited anymore. Where's my nurse? It's past time for morning meds, and I need my nap...

      • by MeNeXT ( 200840 )

        or maybe we don't fall for the false hype anymore. I'm writing this on a 55" 8K monitor where the OS fails to scale the menus so you have to squint to be able to read them. If we had proper support for 8K, 16K would be great.

      • Most of the time it's the trade-offs that are the problem.

        When I was a kid, it was fun to upgrade my PC every 6 months whether it needed it or not, as any upgrade gave me at least a healthy 25% improvement in performance with no fuss. Today, if I replace a part, my OS starts screaming at me that there's a problem with my license or that something isn't properly authorized or supported, claiming that it's for my security and they're just trying to keep me safe from the bogeyman. Meanwhile, they keep push

    • I just bought a 5K x 2K OLED monitor at 165Hz and 10-bit; HDMI 2.1 and DP don't support it without DSC.
    • by ljw1004 ( 764174 )

      Do you watch soccer? 4k resolution means a player's head is about 14 pixels high, not enough to make out much beyond a blob of color; their jersey is 60 pixels high, enough to make out the number but not much more. Doubling the vertical resolution (i.e. going to 8k) would likely be enough to let you make out similar detail to what you'd see in real life. (Frame rate is another issue: HDMI 2.0 allows 4k at 60hz which is too slow when panning in a soccer game; HDMI 2.1 allows 4k at 120hz which is probably eno

      • Do you watch soccer? 4k resolution means a player's head is about 14 pixels high, not enough to make out much beyond a blob of color; their jersey is 60 pixels high, enough to make out the number but not much more. Doubling the vertical resolution (i.e. going to 8k) would likely be enough to let you make out similar detail to what you'd see in real life.
        (Frame rate is another issue: HDMI 2.0 allows 4k at 60hz which is too slow when panning in a soccer game; HDMI 2.1 allows 4k at 120hz which is probably enough). I think that 16k is probably the right bandwidth to get soccer looking good.

        All good in theory, except that you likely need something like a 200" TV to actually tell the difference between 8k and 16k.
        And no, you are not going to sit right in front of the TV. You'll be at least 1 meter away, more likely 2-3.

        Do you watch the gorgeous film classics like Lawrence of Arabia? One of the (many) things that make it look great is that it was shot on 65mm, equivalent to about 12k resolution.

        Let me guess, you are watching these classics at 1080p, or at best 4k. You couldn't care less if it was shot in 12k or 1000k at this point.

        • by ljw1004 ( 764174 )

          All good in theory, except that you likely need something like a 200" TV so actually tell the difference between 8k and 16k.

          Like I said, I figured 8k would be enough resolution for soccer. As for 16k, I imagine that something with bandwidth for 16k would translate that bandwidth into twice the frequency for 8k, which would be ideal for soccer.

          [Lawrence of Arabia] Let me guess, you are watching these classics at 1080p, or at best 4k.

          I watched Lawrence of Arabia on a Cinerama screen. It was breathtaking. I expect that the higher resolutions described here will help more places (like movie theaters) display higher quality prints. I suspect they'll open up new avenues like fake windows or full-wall screens in residences.

          • 4k 60 fps is more than enough for soccer. Most people don't even sit close enough to fully benefit from the 4k resolution anyway.

          • by AmiMoJo ( 196126 )

            Keep in mind that they specify 8k resolution at 60Hz. 8k video is 120 Hz native, so that's already half the bandwidth of this new HDMI standard used up.

            Dual displays daisy chained, or 240Hz, or some combination... Supporting not just TVs, but computer monitors and VR headsets. Given that it will take a few years for this tech to enter production and consumer devices, it seems like a reasonable projection of where things are headed.

      • Oh please, every single point you make is moot. Soccer players won't be recognizable at 16k if you already can't make them out at 4k. You seem to forget that even with a 70" TV, the size of the TV itself won't let you make it out at 16k, because the pixels are too small, so in reality you would still see the same garble you see between 4k and 6K. VR: have you actually seen a 4K/eye headset with decent lenses? You'll be hard pressed to see the pixels already; 6k will already be (beyond) retina r
    • Why ask why you need it?

      Ever see a Jumbotron in 1080p? It's ridiculous.

      I can totally see a wall-sized screen being useful for many businesses. Walk to one area, read what's there, move to another area to read something else. Analysts, factories, hospitals, military, theme parks, etc.

      They already are doing this with walls of a dozen different screens, with that many video cards, cables, power supplies.

      Or complex video splitters, muxers/demuxers, etc.

      When they scale to 24K there will be customers too.

      I'll

    • by Misagon ( 1135 )

      There is no such thing. 8 * 1920 = 15 * 1024.

      Anyway.... screens are getting bigger. A month ago I was installing multiple 100" screens in conference rooms, programming display processors to show a mosaic of four video sources on the same screen. When a source signal is the same resolution as the screen's video input, it has to be scaled down.
      It might take some time before the equipment exists, but it would still be nice to have the standard ready before then.

      And... 14 years ago, I visited

    • I just built an IMAX screen in my backyard.
    • by Luckyo ( 1726890 )

      You don't. Some massive outdoor displays may need that for making smaller text readable.

      But the thing that is actually relevant for consumers is

      >supporting uncompressed 4K at 240 Hz with 12-bit color depth

      Those monitors already exist, and they use DisplayPort 2.0/2.1 because HDMI doesn't meet their requirements.

    • Why do I need 16K resolution? Sitting one foot away from a 180inch screen?

      You don't. But it's a comparative number people can get behind on their displays all else being equal. But that's not the only thing related to bandwidth. Not all else is equal. You got a 4K 60Hz 8bpp display? Great. I have a 120Hz HDR display which at 4K hits the upper limit of HDMI. But my display is far from the best.

      Around $350 buys you a ROG Strix XG32UQ: A display which says in the manual quite clearly that its max refresh rate is unsupported over HDMI 2.1 and that you need DP 2.0 to actually use the

    • by AmiMoJo ( 196126 )

      High refresh rates, multiple monitors.

      Tests have shown that high refresh rates do offer some small advantages in games. 240Hz is now pretty common for gaming monitors, but often comes with a downgrade in resolution due to HDMI bandwidth limitations.

    • by AmiMoJo ( 196126 )

      Should have clarified about multiple monitors - daisy chaining. One cable for video, or even better one cable for video and USB and power. Some monitors have USB C with Power Delivery now, so one cable to your laptop charges it, gives you a USB hub on the monitor, and carries video. While it's not an HDMI cable, it can carry the HDMI protocol, so there needs to be support from both.

    • Why do I need 16K resolution? Sitting one foot away from a 180inch screen?

      Clearly, you have no use for this... so I definitely think you should advocate for stopping this madness since your standards are the world's standards. 1024x768@60hz (faster than the eye can see) is enough for everyone.

      You know what? Since you are unable to find a use for this, and your standards are a proxy for the world's standards, I will join you in trying to keep this evil technology off the market.

      STOP DOING STUPID STUFF WITH RESOLUTIONS! WE HAVE ENOUGH ALREADY! MORE RESOLUTION IS A COMPLETE WASTE! C

    • The major use I can see is daisy chaining multiple displays. Four 4K displays being fed by one cable may be nice for signage. I can only imagine the lag for the 4th monitor in the chain makes it unsuitable for watching Fast and Furious 27.
  • I imagine the comments will be full of people complaining that 16K has few practical uses and I don't disagree. But being able to drive multiple 4K monitors at high refresh rates over a single cable is very appealing to me. And the fact that this setup would be very future-proof is a huge bonus.
    • by Guspaz ( 556486 )

      Multi-monitor is not a feature that HDMI has ever offered. That's always been a DisplayPort thing, and I don't see anything in this article about adding multi-monitor support. However, we do need more bandwidth for higher refresh rates. Many monitors on the market today exceed the 48 Gbps that HDMI 2.1 provides, and fall back on DSC or DisplayPort to do it.

      • Multi-monitor is not a feature that HDMI has ever offered. That's always been a DisplayPort thing

        Fair enough. In my case, I'm waiting for a 57-inch dual-4K widescreen with QD-OLED and if it supports HDMI 2.2 I certainly won't complain.

    • No. The comments are full of people who don't understand there's more to bandwidth requirements than the number of pixels. HDMI 2.1 cannot drive run-of-the-mill gaming monitors at their native refresh rates. 4K max supported refresh is 144Hz without compression. $300 gets you a 160Hz 4K gaming monitor. $550 gets you either higher-refresh monitors, or monitors supporting HDR10 at 144Hz, which is also beyond the capabilities of HDMI 2.1.

      Forget multiple 4K monitors, HDMI is limiting single 4K monitors.

  • Too bad the best content was filmed in 1080p 24fps (Panavision HD-900F, etc.) or below. Or on film, which has comparable resolution to 4K and below, although film scanning is typically done at 8K so that it can be edited more easily. (Disclaimer: I'm not actually a big cinema tech guy, I just looked up a few things. I'm a layperson and a consumer of films.)

    I'd rather use that level of bandwidth with DisplayPort for Multi Stream Transport. Instead of one 16K@60 or 12K@120 I'd rather have a few 4K displays attached

    • > film which has comparable resolution to 4K and below

      "It's complicated".

      Many of the masterpiece films were filmed on 70mm which is about 4x the size of 35mm, plus better emulsion with a tighter grain.

      So if we take your 4K number for a normal film and 4x it and double that for scanning we're waiting for 32K to master it digitally.

      We're going to need faster storage!

      • Even for standard 35mm, I have seen all kinds of numbers being thrown around regarding its equivalent resolution in pixels, from 2048x1080 to 4096x2160.
      • by jaa101 ( 627731 )

        Many of the masterpiece films were filmed on 70mm which is about 4x the size of 35mm

        Typical 65mm and 70mm frames are 53mm wide whereas 35mm frames are 22mm wide. That's only 2.4x wider (though the factor for area is greater, depending on which exact formats you're comparing). So 4K scales up to 10K and 16K is enough oversampling for a mastering scan. Remember that 4K refers to the number of pixels across the screen, not to a total pixel count.

        The list of 70 mm films [wikipedia.org] isn't that long.

        IMAX is bigger again, though mostly in frame height. Its popularity has taken off as technology makes it
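As a quick sanity check on the scaling arithmetic above (frame widths are the approximate figures quoted in the comment):

```python
frame_70mm = 53.0   # mm, typical 65/70mm frame width
frame_35mm = 22.0   # mm, typical 35mm frame width
scale = frame_70mm / frame_35mm
print(round(scale, 2))       # ~2.41x wider
print(round(4 * scale, 1))   # ~9.6, i.e. roughly "10K" horizontal equivalent
```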

  • Won't fill the void left by the Ultra64.
  • Can't believe nobody complained about the mixup between GB/s (gigabytes/s) and Gb/s (gigabits/s). It's 96 gigabits/s, not 96 gigabytes/s.

  • It's 96 Gbps (Gb/s), not 96 GB/s. 96 gigabits per second is 12 GB/s.

    If it were up to me, I would start by introducing a simple lossless or near-lossless compression scheme inspired by QOI [qoiformat.org] before doubling the data rate yet again. (It would require variable framerates, but there's nothing technically difficult about that, I think? The only thing holding back VFR in the past was NVIDIA making this incredibly basic feature proprietary in order to charge licensing fees, but eventually they did relent.)

    I'm
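The unit conversion at issue, spelled out:

```python
link_gbps = 96.0              # HDMI 2.2 headline rate, in gigabits per second
link_gbytes = link_gbps / 8   # 8 bits per byte
print(link_gbytes)            # 12.0 GB/s, an 8x difference from the headline
```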

    • If it were up to me, I would start by introducing a simple lossless or near-lossless compression scheme inspired by QOI [qoiformat.org] before doubling the data rate yet again.

      Lossless is useless for the case of HDMI because it doesn't guarantee any compression for a given frame or even for a given second of video. In plain English, if you feed a lossless algorithm white noise, there will be no compression and as a result you will exceed the cable's data rate. Also, throwing away frames is generally frown
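The white-noise point is easy to demonstrate with any general-purpose lossless compressor; zlib is used here as a stand-in, and any lossless scheme, QOI included, has the same worst case on random input:

```python
import os
import zlib

noise = os.urandom(1_000_000)            # "white noise": incompressible data
packed = zlib.compress(noise, level=9)   # maximum compression effort
# No guaranteed reduction: output is not meaningfully smaller than input
print(len(packed) >= 0.99 * len(noise))  # True
```

This is why a link standard can only count on lossless compression as an opportunistic bonus, never as guaranteed headroom.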

  • Is it cheaper than 100G Ethernet? If so, can I run IP over it?
