HDMI 2.2 Specs With Increased Bandwidth To Be Announced at CES 2025 (videocardz.com)
HDMI Forum will announce new specifications with increased bandwidth capabilities at CES 2025, ahead of anticipated graphics card launches from AMD and NVIDIA. The announcement, scheduled for January 6, is expected to introduce HDMI 2.2 standard alongside a new cable supporting higher resolutions and refresh rates.
The current HDMI 2.1 specification maxes out at 48 Gbps of bandwidth, enough for 10K resolution at 120 Hz with compression. The upgrade aims to compete with DisplayPort 2.1, which offers up to 80 Gbps and is already supported by recent AMD and Intel GPUs.
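A back-of-the-envelope pixel-rate calculation gives a sense of where those link rates run out. This sketch counts only raw pixel data and ignores blanking intervals and link-layer encoding overhead, so real link requirements are somewhat higher:

```python
def raw_video_gbps(width, height, fps, bits_per_pixel=30):
    """Uncompressed pixel-data rate in Gbit/s.

    Assumes 10-bit-per-channel color (30 bits/pixel) by default; ignores
    blanking and FRL/TMDS encoding overhead, so real link rates are higher.
    """
    return width * height * fps * bits_per_pixel / 1e9

# 4K at 120 Hz, 10-bit color fits comfortably inside HDMI 2.1's 48 Gbps...
print(round(raw_video_gbps(3840, 2160, 120), 1))  # 29.9

# ...but 8K at 60 Hz, 10-bit already exceeds it, hence the need for DSC:
print(round(raw_video_gbps(7680, 4320, 60), 1))   # 59.7
```

This is why HDMI 2.1's 10K120 claim only works "with compression" (DSC), and why DisplayPort 2.1's 80 Gbps still isn't enough for the highest uncompressed formats.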
Not clear to me where this is useful (Score:3)
Re: (Score:2)
Movie theatre digital projectors are still mostly 2K and almost all the rest are 4K. The frame rate is 24fps. Even first-generation HDMI is good enough for both of those. Yes, the screens are big compared to home theatre, but the audience sits proportionately farther back. Even for people with better-than-average vision, like 20/15, you have to be uncomfortably close to the screen to see 4K pixels, i.e., to see any benefit in higher resolution.
Beyond standard 4K, it's better to go to HDR and higher frame rates.
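The "sit close enough to see the pixels" argument above can be sketched as an angular-resolution calculation. The screen width and seating distance below are illustrative assumptions, and ~60 pixels per degree is a common rule of thumb for the limit of 20/20 acuity:

```python
import math

def pixels_per_degree(screen_width_m, horizontal_pixels, distance_m):
    """Angular pixel density as seen by a viewer centered on the screen."""
    fov_deg = 2 * math.degrees(math.atan(screen_width_m / (2 * distance_m)))
    return horizontal_pixels / fov_deg

# Assumed numbers: a 20 m wide 4K cinema screen viewed from 25 m back.
ppd = pixels_per_degree(20.0, 4096, 25.0)
print(round(ppd))  # roughly 94 pixels/degree
```

At ~94 pixels per degree, a 4K cinema image already exceeds what 20/20 (and even 20/15) vision can resolve from a typical seat, which is the commenter's point.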
Re: (Score:2)
Imagine South Park at 20K and 240Hz.
I think that is above the resolution 20/20 vision can resolve on a display, so I might need bionic eyes to see it.
Re: (Score:2)
This only makes sense if you have a 200-inch display or bigger. I mean, 4K is already perfect, more than you need, for 85-inch screens, and 10K means multiplying the number of pixels by a factor of five, so you would need to at least double the screen size again for 20K to be a resolution worth spending money on.
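The "factor of five" claim above can be checked with simple pixel-count arithmetic. Note that "10K" has no single agreed geometry; 10240x4320 (the figure used in HDMI 2.1 marketing) is assumed here:

```python
# Pixel counts for common and hypothetical resolutions.
# "10K" geometry is an assumption (10240x4320, per HDMI 2.1 marketing).
resolutions = {
    "4K UHD": (3840, 2160),
    "8K UHD": (7680, 4320),
    "10K":    (10240, 4320),
}

megapixels = {name: w * h / 1e6 for name, (w, h) in resolutions.items()}
print({k: round(v, 1) for k, v in megapixels.items()})
# {'4K UHD': 8.3, '8K UHD': 33.2, '10K': 44.2}

print(round(megapixels["10K"] / megapixels["4K UHD"], 1))  # 5.3
```

So 10K at that geometry is indeed roughly 5.3x the pixels of 4K, consistent with the comment.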
Re: (Score:2)
Sure, there will be some use from those who want to pay silly money for bragging rights on a physical 8K screen they're unlikely to get the full viewing benefit of due to optimal-viewing-distance requirements (e.g. Sharp's 120" 8K monster), plus some niche advertising or hero displays in
Re:Not clear to me where this is useful (Score:4, Informative)
Considering the article mentions upcoming graphics card launches from AMD and Nvidia, gaming would be the best bet. You need a video card capable of processing all this data, so it makes sense that the two are linked.
Re: (Score:3, Interesting)
At which point it will be time for a new generation of DisplayPort anyway.
All the HDMI group is trying to do is stay relevant so they can keep collecting their license fees.
Re: (Score:2, Informative)
Re: (Score:3)
Once you fall out of the Apple walled garden, everything has HDMI - every TV for one thing. HDMI is still relevant regardless.
1) You do know Apple products have HDMI ports, right? Of all the Apple products that have video out, only the MacBook Air lacks HDMI. 2) Many Apple devices have Thunderbolt, which encompasses HDMI and USB and Ethernet and power.
Re: (Score:2)
Once you fall out of the Apple walled garden, everything has HDMI - every TV for one thing. HDMI is still relevant regardless.
1) You do know Apple products have HDMI ports, right?
Only because people complained a *lot* about how many f**king USB-C to HDMI adapters they had to buy for all the CrapBook "Pro" models built from 2016 through 2020. There was a four year period in which owning a Mac and trying to use it in any hotel anywhere in the world really, really sucked.
I ordered my M1 almost as soon as the 15" version hit the market, precisely because I was so fed up with having to carry around so much external crap to connect to HDMI and SD cards. And even though I don't really ne
Re: (Score:2)
Aside from 3D, which is DOA, where would this be used?
Holodecks.
Unironically.
Re: (Score:2)
I've been watching computer graphics since the Atari 2600.
Every single jump, I remember thinking 'holy crap, this is amazing, it can't get better than this!' Always to be proven wrong.
Re: (Score:2)
Me too, but there are diminishing returns past 4K at 120 FPS aside from niche applications.
The bigger crime is the shitty 24 FPS source material STILL being used for movies/TV (or less for animation [youtube.com]) instead of 60 FPS (or 120 FPS). Sadly, most people don't care, aside from a few game developers / gamers / artists, until you show them [youtube.com] the difference, especially on long pans where you get judder / jank like crazy.
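The pan judder complained about above comes from 3:2 pulldown: 24 fps content on a 60 Hz display must show alternate frames for 3 and then 2 refresh cycles, so motion advances unevenly. A minimal sketch of the cadence:

```python
def pulldown_pattern(source_fps=24, refresh_hz=60, n_frames=8):
    """Number of display refreshes each source frame occupies.

    Accumulates the ideal refresh count per frame (e.g. 60/24 = 2.5) and
    emits whole refreshes, carrying the remainder forward.
    """
    shown, acc = [], 0.0
    for _ in range(n_frames):
        acc += refresh_hz / source_fps
        whole = int(acc)
        shown.append(whole)
        acc -= whole
    return shown

print(pulldown_pattern(24, 60))   # [2, 3, 2, 3, 2, 3, 2, 3] -> uneven, judders
print(pulldown_pattern(24, 120))  # [5, 5, 5, 5, 5, 5, 5, 5] -> even cadence
```

This is also why 120 Hz displays can show 24 fps film without judder: 120 divides evenly by 24, so every frame gets the same number of refreshes.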
Re: (Score:2)
I'll agree that perhaps we've hit the point where it's time to stop worrying about increasing resolution or frame rate, and now move on to more accurate physics and poly density.
24 fps film is still used for the same reason films shot digitally still tend to use the color gradients of analog film: that's what we expect film to look like.
I remember watching Avatar 2, and thinking that the 60 fps scenes looked more unnatural and video-gamey than the 24 fps scenes.
Re: (Score:3)
Virtual Reality. VR needs around 10K per eye (i.e., two 10K displays) to have readable fine print (note, the Apple Vision Pro only has 4K displays, which makes it less suitable as a virtual desktop in spite of what Apple claims), and VR displays must run at 120 fps to prevent nausea in certain action gaming. Of course, in the case of VR the dual 10K images will be highly compressible, since there's very little information delta between the two images.
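To put numbers on that, here is the uncompressed stereo rate for the hypothetical headset described above (assuming 10240x4320 per eye and 10-bit color; both are illustrative assumptions, and as the comment notes, inter-eye redundancy makes the stream very compressible in practice):

```python
def stereo_gbps(width, height, fps, bits_per_pixel=30, eyes=2):
    """Uncompressed pixel rate in Gbit/s for a stereo pair,
    before any compression or inter-eye delta coding."""
    return eyes * width * height * fps * bits_per_pixel / 1e9

# Hypothetical dual 10K (10240x4320) panels at 120 fps, 10-bit color:
print(round(stereo_gbps(10240, 4320, 120)))  # roughly 319 Gbps uncompressed
```

That is several times beyond even DisplayPort 2.1's 80 Gbps, which illustrates why compression (and the small left/right delta) is essential for any such headset.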
Re: (Score:2)
Hopefully 8K at 60 Hz without DSC, and 8K at 120 Hz or higher in some fashion. The monitors may not exist now or might be hard to find, but that is likely a chicken-and-egg problem between sources for such content and the interconnects between the two.
Re: (Score:2)
And 4K at 240 Hz or higher without DSC.
Re: (Score:1)
It is basically tech for people with an excess of money and a lack of sense. 4K is about the maximum that makes a difference, and 120Hz is only for those who do not understand screen tech anyway.
I am sure this will find people wasting money on it, though.
Re: (Score:3)
About 4k is the maximum that makes a difference and 120Hz is only for those that do not understand screen tech anyways.
Congrats. You just made a spec worse than my existing workstation, where I am running 3x 4K 120Hz screens over a single Thunderbolt cable.
As for your comment that 120Hz is only for those who don't understand, I think it's more that 120Hz is for people with functioning eyes. If you can't tell the difference between a 60Hz and a 120Hz screen side by side, then I suggest you get your brain checked. Yes, brain, not eyes. Your eyes are fine; the inability to sense smooth motion points to a neurological issue.
Re: (Score:3)
And a thoughtful technical rebuttal from you, as always.
For the record, we typically take your non-answers as conceding the point, since that's how arguments work. If you have a point, try a bit harder to make it.
Re: (Score:2)
Gaming monitors. 240hz at 4k.
Re: (Score:2)
I'm guessing the extra bandwidth could be used for Dolby Atmos, which is something like 50-channel surround. Not sure how many bps Atmos typically uses, but it's got to be a lot. And they'll offer it in 32-bit depth with a 192 kHz sampling rate, for the guys who like to stare at the spec sheet instead of the movie.
That might be the next frontier for device manufacturers wanting to put larger figures on their spec sheets. A 32/192 DAC for audio is way cheaper to include than a 10K panel. I don't think those even exist
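The audio figures speculated about above are easy to bound with raw PCM arithmetic. Even the extreme uncompressed case is tiny next to the video bandwidth (and real Atmos over HDMI is carried compressed, e.g. inside Dolby TrueHD, so actual rates are lower still):

```python
def pcm_mbps(channels, bit_depth, sample_rate_hz):
    """Raw (uncompressed) PCM audio rate in Mbit/s."""
    return channels * bit_depth * sample_rate_hz / 1e6

# Extreme hypothetical: 32 uncompressed channels, 32-bit, 192 kHz.
print(round(pcm_mbps(32, 32, 192_000), 1))  # 196.6 Mbps
```

About 0.2 Gbps against a 48 Gbps link, i.e. well under 1% of HDMI 2.1's bandwidth, so audio is unlikely to be what drives a new spec.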
how much sound bandwidth is in it? (Score:3)
how much sound bandwidth is in it?
Re: (Score:3)
How big is your display anyway (Score:4, Insightful)
and once the tariffs kick in you won't be able to afford new hardware.
Re: (Score:2)
so it will be another hdmi spec (Score:2)
Re: (Score:3)
The cables were never the problem; that was always a non-issue. Cables were rated for speeds, not features, and you could simply buy a cable certified as "Ultra High Speed". There's a certification program where, if you buy certified cables, you can use a smartphone app to validate the holographic label.
The problem is in the devices themselves. They redefined *everything* to be HDMI 2.1 and made all the features optional, so it was impossible to know what features or bandwidth or resolutions an HDMI 2.1 device actually supported.
Re: so it will be another hdmi spec (Score:2)
Same as with previous versions of HDMI. Just because the spec allows DSD audio or 3D video doesn't mean every device has to support them. You always get the lowest common denominator between devices. Finding that out before purchase has always required reading spec sheets.
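The "lowest common denominator" point above can be modeled as a set intersection: in practice an HDMI source reads the sink's advertised capabilities (via EDID) and uses only what both ends support. The feature names below are illustrative stand-ins, not actual EDID fields:

```python
# Toy model of capability negotiation between an HDMI source and sink.
# Feature names are illustrative only; real devices exchange this
# information via EDID/CTA-861 extension blocks.
source_caps = {"4K120", "8K60", "VRR", "eARC", "DSC"}
sink_caps = {"4K120", "VRR", "eARC"}

# Only features advertised by BOTH ends are usable.
usable = source_caps & sink_caps
print(sorted(usable))  # ['4K120', 'VRR', 'eARC']
```

Which is exactly why two "HDMI 2.1" badges on the boxes tell you nothing until you intersect the two spec sheets yourself.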
Meh (Score:2)
10k120 sounds great. (Score:2)
Connectors and cables (Score:1)
Optics? (Score:2)
Higher bandwidth requires thicker wires with more insulation, which leads to heavier and stiffer cables, which causes more strain on the connectors.