

China Launches GPMI, a Powerful Alternative To HDMI and DisplayPort (tomshardware.com) 136
AmiMoJo writes: The Shenzhen 8K UHD Video Industry Cooperation Alliance, a group made up of more than 50 Chinese companies, just released a new wired media communication standard called the General Purpose Media Interface or GPMI. This standard was developed to support 8K and reduce the number of cables required to stream data and power from one device to another. According to HKEPC, the GPMI cable comes in two flavors -- a Type-B that seems to have a proprietary connector and a Type-C that is compatible with the USB-C standard.
Because 8K has four times the pixels of 4K and sixteen times the pixels of 1080p, GPMI is built to carry far more data than other current standards. Other variables affect the required bandwidth as well, such as color depth and refresh rate. The GPMI Type-C connector is specified for a maximum bandwidth of 96 Gbps and 240 watts of power delivery. That is more than double the 40 Gbps data limit of USB4 and Thunderbolt 4, though its power limit matches that of the latest USB Type-C connectors using the Extended Power Range (EPR) standard. GPMI Type-B beats all other cables, though, with a maximum bandwidth of 192 Gbps and power delivery of up to 480 watts.
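For a rough sense of the numbers being thrown around, here is a back-of-the-envelope Python sketch (function names are just illustrative; it ignores blanking intervals and line-coding overhead, so real requirements run a few percent higher, and the quoted link rates are approximate payload figures):

    # Back-of-the-envelope uncompressed video bandwidth (ignores blanking/encoding overhead).
    def raw_gbps(width, height, hz, bits_per_channel, channels=3):
        return width * height * hz * bits_per_channel * channels / 1e9

    print(raw_gbps(3840, 2160, 120, 10))   # 4K @ 120 Hz, 10-bit RGB  -> ~29.9 Gbps
    print(raw_gbps(7680, 4320, 60, 10))    # 8K @ 60 Hz,  10-bit RGB  -> ~59.7 Gbps
    print(raw_gbps(7680, 4320, 120, 12))   # 8K @ 120 Hz, 12-bit RGB  -> ~143 Gbps

    # Approximate usable payload rates for comparison:
    # HDMI 2.1 FRL ~42.7 Gbps, DisplayPort 2.1 UHBR20 ~77.4 Gbps,
    # GPMI Type-C 96 Gbps, GPMI Type-B 192 Gbps (per the summary above).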
Obligatory XKCD (Score:5, Insightful)
Standards [xkcd.com].
Re: (Score:2)
Usually this cartoon is overused, but in this case I agree: there are too many "standards" for high-def video ports. Barring some necessary feature, use EXISTING SHIT, Xi!
Re: (Score:2)
... there are too many "standards" for high-def video ports. Barring some necessary feature, use EXISTING SHIT, Xi!
From TFS:
Because 8K has four times the pixels of 4K and sixteen times the pixels of 1080p, GPMI is built to carry far more data than other current standards.
Re: (Score:2)
Wake me up when you can buy 8K content.
Re:Obligatory XKCD (Score:5, Funny)
Re: (Score:3)
Thunderbolt 5 is already shipping, supports 120Gbps (more than 96), and supports 8k video...
Re: (Score:3)
This isn't an attempt at replacing DisplayPort, it's frustration with the lack of HDMI2.2 to give us a media-focused transport that can accommodate what DisplayPort has been able to do for a while, now.
Re: (Score:2)
The DP signal itself is limited to 80Gbps.
However, it can send two SST DP signals, meaning you can get 80Gbps and 40Gbps to separate displays. But the real intention is to be able to do 80Gbps unidirectional DP over CIO framing while still maintaining 40Gbps synchronous for other CIO traffic, i.e., you can have 80Gbps DP on a hub as well as other TB links, rather than an 80Gbps DP connection rendering a port unusable for TB/USB3+ traffic.
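A quick sketch of the lane arithmetic behind that (approximate; the actual split is negotiated by the connection manager):

    # Thunderbolt 5: four ~40 Gbps lanes, normally two per direction.
    per_lane = 40  # Gbps
    print("symmetric :", 2 * per_lane, "down /", 2 * per_lane, "up")   # 80 / 80
    print("asymmetric:", 3 * per_lane, "down /", 1 * per_lane, "up")   # 120 / 40
    # In asymmetric mode, an 80 Gbps DP stream can be tunneled over CIO while
    # ~40 Gbps of the downstream budget stays free for other hub/USB traffic.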
Re: (Score:3)
Only with DisplayPort framing, and DisplayPort doesn't support ARC or CEC, hence GPMI. This isn't an attempt at replacing DisplayPort, it's frustration with the lack of HDMI2.2 to give us a media-focused transport that can accommodate what DisplayPort has been able to do for a while, now.
USB-C is rapidly replacing DisplayPort, so within maybe a decade, that standard will be pretty much defunct, and arguably, it almost is already, given that you can always adapt, and most people don't go around plugging into random monitors all the time.
And although it is taking longer, we're starting to see TVs with USB-C ports now, too. That's what I expect to replace HDMI — not yet another new standard, but something that unifies the TV and computer display markets once and for all, the way HDMI should have, but didn't for various reasons (inability to get the bandwidth high enough quickly enough, licensing fees, CEC requirements, the computer industry's general reluctance to deal with HDCP, etc.).
Re:Obligatory XKCD (Score:5, Informative)
USB-C is rapidly replacing DisplayPort, so within maybe a decade, that standard will be pretty much defunct, and arguably, it almost is already, given that you can always adapt, and most people don't go around plugging into random monitors all the time.
USB-C has not, will not, and can not replace DisplayPort.
DisplayPort is far more than a connector. USB-C is only a connector.
USB-C transports DisplayPort in two different ways: either with the high-bandwidth lines driven directly using the DisplayPort signaling protocol (DP Alt Mode, which is actually how every USB-C monitor you've ever used works), or via CIO, packetized for switching with a hub.
And although it is taking longer, we're starting to see TVs with USB-C ports now, too. That's what I expect to replace HDMI — not yet another new standard, but something that unifies the TV and computer display markets once and for all, the way HDMI should have, but didn't for various reasons (inability to get the bandwidth high enough quickly enough, licensing fees, CEC requirements, the computer industry's general reluctance to deal with HDCP, etc.).
HDMI-Alt is a thing, though it's very rare. That's HDMI over USB-C. The TVs you are using are using either HDMI-Alt, or DP-Alt.
USB-C is just a connector.
Based on that, my suspicion is that HDMI 2.2 is likely to be a DOA standard. By the time your average consumer cares about 8K video (if they ever do), most TVs on the market will probably have USB-C ports, and everybody will just use those.
It won't. HDMI 2.1 is woefully deficient. HDMI 2.2 is a much needed breath of fresh air for it to catch up with DisplayPort.
Remember, USB-C is just a port. It transmits either HDMI-Alt, DP-Alt, or CIO-encapsulated-DP.
So in much the same way, I fully expect HDMI to slowly fade away. It will just take a lot longer — probably more like twenty years before every TV in every hotel room has a USB-C port, and that's really the point at which people stop caring. :-)
The port may (FSM willing) but the signaling protocol will not, at least until DP subsumes the HDMI-specific functionality that's mainstream in home entertainment systems.
Re: (Score:3)
USB-C is rapidly replacing DisplayPort, so within maybe a decade, that standard will be pretty much defunct, and arguably, it almost is already, given that you can always adapt, and most people don't go around plugging into random monitors all the time.
USB-C has not, will not, and can not replace DisplayPort.
DisplayPort is far more than a connector. USB-C is only a connector.
USB-C transports DisplayPort in two different ways: either with the high-bandwidth lines driven directly using the DisplayPort signaling protocol (DP Alt Mode, which is actually how every USB-C monitor you've ever used works), or via CIO, packetized for switching with a hub.
What I'm talking about are the cables and connectors, not the signaling protocols. Pedantically, yes, DisplayPort will likely live on as a wire protocol, because there's really no point in using anything else. But when you're using DP signaling over USB-C, you're bounded by the speed of USB-C's hardware. You aren't using dedicated hardware designed by the DisplayPort standards committee. From DP 2.1 onwards, they're using the USB-C PHY hardware when passing over a USB-C connection. So in the short to medium term, DP as a wire standard is likely dead. Long live DP, the protocol-only standard.
Re:Obligatory XKCD (Score:4, Informative)
What I'm talking about are the cables and connectors, not the signaling protocols. Pedantically, yes, DisplayPort will likely live on as a wire protocol, because there's really no point in using anything else. But when you're using DP signaling over USB-C, you're bounded by the speed of USB-C's hardware. You aren't using dedicated hardware designed by the DisplayPort standards committee. From DP 2.1 onwards, they're using the USB-C PHY hardware when passing over a USB-C connection. So in the short to medium term, DP as a wire standard is likely dead. Long live DP, the protocol-only standard.
USB-C is just a port.
Whatever Alt-mode protocol is running over that port is just that- that Alt mode's protocol.
Of course the Alt mode uses the USB-C PHY to drive the lines, but the wire protocol is still that of DisplayPort, HDMI, USB4, or Thunderbolt, and they all have their own speed limitations. The PHY is merely a multiplexer/redriver.
Some educational material for you. [ti.com]
Same comment as above.
Just as wrong as above.
No, USB-C is a connector plus hardware for sending bits down the wire and retiming the bits on the other end. It's way more than a connector. The connector is the easy part.
No.
There is no such thing as a USB-C retimer. USB-C does not have a signaling protocol.
There is a USB 3.2 USB-C retimer. There is a Thunderbolt USB-C retimer. There is a DisplayPort USB-C retimer. There are even combo units. There is no USB-C retimer. It's a nonsensical statement.
In what universe is HDMI 2.1 already woefully deficient when it comes to realistic home video use? For computers, DisplayPort is already the de facto standard. For home video, nobody is going to replace their 4K set with an 8K set. Most people can't even see the difference between 4K and 1080p.
In any universe where you want to watch 8K HDR content.
It's not woefully deficient for you, and for me- for sure. I'm flatly in the "Is 8K even really a thing?" camp. But the fact is, the industry says it is.
Whether people are buying it or not, they're making 8K hardware, and HDMI 2.1 isn't up to the task.
Your "640K is enough for anyone" argument falls pretty flat.
Don't get me wrong, 8K video is great for acquisition, because you can crop in post and still get 4K output, but you really don't need faster HDMI for that. But as a consumer standard, there's really no point. For this reason, 8K content is practically nonexistent. So this is a standard whose only real purpose is to provide capabilities that almost nobody cares about, and/or to provide a more expensive, license-encumbered tech to compete with DP on the computer side, which *exactly* nobody cares about other than the HDMI standards folks.
8K TVs are, however, a thing, and content does exist. 4K content was limited in the beginning too, as was 1080P. That's how industry works.
At this point, we would all be better off if the HDMI folks would spend their time developing a standard for CEC over USB-C that can work in concert with DP-Alt mode on USB-C instead of trying to make the HDMI protocol support faster data rates for no obvious benefit.
CEC works fine over HDMI-Alt.
CEC over USB-C with DP-Alt mode is a nonsensical statement.
The HDMI folks don't get to decide how the DP guys twiddle their high-speed lines.
That's fair. But the point is that the signaling is a big part of the reason HDMI 2.2 took so long. Having three different standards bodies each building their own transceiver hardware is, IMO, a waste of time and energy that could be better spent doing any number of other things.
The signaling is why it took so long. That's not wrong. But it's not like everyone else didn't have that signaling a long time ago.
Thunderbolt and US
Re: Obligatory XKCD (Score:2)
"USB-C is only a connector."
Type C is a connector. USB is a protocol. USB-C is the use of USB with the Type C connector.
Re: (Score:2)
Re: (Score:2)
Type C is as non-existent as USB-C.
It's a USB Type C connector (I can link you to the spec, if you like)
And there is no spec specifically for a USB "C" protocol.
Re: (Score:3)
You did read the title, right? The one that said "China launches HDMI and DisplayPort alternative — GPMI boasts up to 192 Gbps bandwidth, 480W power delivery"? That's a tad more than 120Gbps.
Re: (Score:2)
You did read the title, right? The one that said "China launches HDMI and DisplayPort alternative — GPMI boasts up to 192 Gbps bandwidth, 480W power delivery"? That's a tad more than 120Gbps.
The Slashdot story only says 240W and 96Gbps, and the title says nothing. I didn’t bother reading TFA because honestly I could care less about 8k.
Re: (Score:2)
The Chinese manufacturers presumably don't want to pay Intel whatever they charge for their technology. They want something that is unencumbered by foreign patents, especially given the current direction that the US is heading in.
I expect it will also be considerably cheaper.
Re: (Score:3)
From TFS:
Because 8K has four times the pixels of 4K and sixteen times the pixels of 1080p, GPMI is built to carry far more data than other current standards.
Don't care. The whole point of standards is compatibility. We already have two standards for no obvious reason. DisplayPort should never have been made. It has no real reason for existing other than perhaps HDMI being too slow to adapt to the needs of the computer market. And arguably, USB-C is a third standard.
A fourth standard just means even more devices with different connectors and extra cables and dongles to convert between incompatible standards. No. F**k that s**t.
You want to make a new stand
Re:Obligatory XKCD (Score:5, Insightful)
DisplayPort should never have been made.
I would counter just the opposite: HDMI, a fee-encumbered industry standard, should not exist when an open VESA standard exists and has consistently been ahead of HDMI in bandwidth (and is physically a better connector as well). DP2.1 beats HDMI2.1 on just about every front.
The only reason HDMI exists, as I recall, is that the industry wanted DRM built into the spec via HDCP; they were in full anti-piracy mode and freaked out about people "stealing content" via cable capture. Fat lot of good that did them: nobody pirates content with HDMI capture, and HDCP was summarily cracked.
Re:Obligatory XKCD (Score:4, Informative)
No, HDCP works on many interfaces - DVI, HDMI, DisplayPort, MiraCast, etc all support HDCP.
The reason we have HDMI is because HDMI pre-dates DisplayPort - it was based on DVI. But DVI didn't support everything consumer electronics companies wanted, like repeater support allowing for video processors and such. HDMI 1.0 was released around 2002 or so. In those days, there was considerable talk about prohibiting HDMI ports on PCs and their associated peripherals. It took until around HDMI 1.2 or 1.3 before this rule was relaxed and PCs and monitors could have HDMI ports.
But at the time, DVI was the display interface.
DisplayPort came much later to provide a higher bandwidth interface.
HDMI however still allows for longer cable lengths - DisplayPort at the higher speeds is limited to around 8 feet or so, while HDMI can go much further. This is because consumer electronics devices might be far apart from the next device in line - e.g., long runs from a rack to the projector. This was also the problem with DVI - HDMI could go the distance while DVI could not (despite HDMI being based on DVI).
For DisplayPort, the PC and monitor are typically well within 8 feet of each other.
Re: (Score:2)
As someone who has been in the ProAV field for 20 years: HDMI over 15 ft is spotty without active cables, while 15 ft of DP can work fine.
DP today imo is superior in almost every way except for availability of gear. In my entire career HDMI has been anywhere from great to a nightmare, especially in the early days when HDCP was new and fucking with everything so maybe I am jaded.
Re: (Score:2)
You lack things that people expect like ARC and CEC.
I think in a perfect world, DisplayPort would make it their focus to replace HDMI since they're already the technically superior transport, but they are not doing so, and so DP is not at feature-parity with HDMI for the concerned applications.
Re: (Score:2)
True, it would be easy for DP to integrate that stuff but I suppose they have accepted it's too late to break into the non-PC based side of things. Especially in my line of work there is a lot more HDMI gear for baluns and matrices than DP.
Re: (Score:2)
Having a separate standard just for our TVs is stupid, and frankly, HDMI moves way too slow.
It's taken 8 years to give us what DP gave us 6 years ago in terms of bandwidth, and the next DP standard (UHBR20x6?) is undoubtedly going to give us 120Gbps signaling whenever it comes out, to match what TB ports can do via CIO now.
Re: (Score:3)
There are apps/content that will ONLY play if you are using an HDMI cable; DisplayPort will fail. The worst example is some laptops that can't display a movie/TV show on the built-in screen, but it works if you connect an HDMI external monitor (and only on that monitor: drag the content back to the internal display and it stops working!)
Yes, it's a stupid idea, but some DRM makers did build that and some stupid clients enable it.
Is it less expensive? (Score:3)
Information-poor article; anyone surprised?
What are the length restrictions for the advertised speed/power specifications?
Are there licensing requirements?
Whatever.
Re: (Score:2)
Before or after the first or fifteenth round of retaliatory tariffs?
More info required (Score:2)
Is it adequate to the task? Is it license-free?
If it's free, it could be a successor to DisplayPort.
Instant failure (Score:4, Interesting)
The biggest immediate problem they will face is cables, because of a USB-C connector that isn't actually USB-C. The second issue is going to be the video encoding, because no chips support it. I can understand being against HDMI because it's proprietary, but snubbing DisplayPort is just stupid. Hell, if they had just taken DisplayPort and then added dedicated pins for power, it still would have been better than this. The only way this will get any traction is if China mandates it.
Re: (Score:2)
WTF do you need power on the same cable as the data?
What's wrong with a separate power cord for a monitor?
Re: Instant failure (Score:5, Insightful)
You could as easily ask, why do you need a separate cable for power if you can support power and data with a single cable?
Re: (Score:2)
You could as easily ask, why do you need a separate cable for power if you can support power and data with a single cable?
Safer, more resilient, more efficient.
Re: (Score:3)
WTF do you need power on the same cable as the data?
What's wrong with a separate power cord for a monitor?
WTF? Less cabling is ALWAYS preferred.
Re: (Score:2)
Re: (Score:2)
Yes, we call those people "stupid" in the engineering field given their devices already contain power conditioning systems. Though we colloquially call them "power supplies".
Also, nothing is stopping you from conditioning the power, and to date precisely zero shared power/data standards have precluded splitting them up. This will be essential, because even if this connector does become a reality, there are virtually zero devices out there these days that only provide one connection standard.
Re: (Score:2)
Yes, we call those people "stupid" in the engineering field given their devices already contain power conditioning systems. Though we colloquially call them "power supplies".
Your car also has wheels, but they are not all the same either. There is a huge audible difference between good power supplies and shitty ones; it is a very important component. Feeding them power that is consistent and in spec is also useful, no matter how much you may not believe it.
Re: (Score:2)
The people who create the standards for these interfaces would probably be (rightfully) insulted if you said they weren't writing a spec against which devices can be "consistent and in spec". The whole point of standards bodies like USB-IF is to make sure different companies agree on what "consistent and in spec" should mean, particularly in terms of balancing the interests of different implementers.
Re: (Score:2)
The people who create the standards for these interfaces would probably be (rightfully) insulted if you said they weren't writing a spec against which devices can be "consistent and in spec". The whole point of standards bodies like USB-IF is to make sure different companies agree on what "consistent and in spec" should mean, particularly in terms of balancing the interests of different implementers.
This is probably true, but it is also true that USB is the least desirable transport option for digital audio with some rare exceptions, and especially so if you are also powering anything with it at the same time, so yeah. Noted and filed.
Re: Instant failure (Score:3)
Re: (Score:2)
"Yes, we call those people "stupid" in the engineering field given their devices already contain power conditioning systems. Though we colloquially call them "power supplies"."
And those of us with real electrical experience will tell you shit incoming power will kill those power conditioning systems because they aren't made for conditioning. They're nasty noisy fuckers from the get-go.
Oh, and then you have power fucking with data integrity.
In my profession we call people like you RF-ignorant.
Re: (Score:2)
Re: (Score:2)
No it's not, and supporting power on a data cable doesn't preclude a separate power cable anyway.
Doing your best SuperKendall impression again.
Re: (Score:2)
supporting power on a data cable doesn't preclude a separate power cable anyway.
This is true. I suspect that is what people will continue to do like with HDMI or DisplayPort.
Re: (Score:2)
Then those "certain people" can plug the optional power cable in and use that, while the rest of us can use the single cable for both power and data.
Re: (Score:2)
Not Always (Score:2)
WTF? Less cabling is ALWAYS preferred.
Not if you work for a cable manufacturer, especially one trying to sell over-priced gold-tipped cables to people who don't understand how digital signal transmission works.
Re: (Score:2)
Re: (Score:2)
Gold is a good conductor even on cheap cables, and cables absolutely make a difference to analog audio.
Gold is a mediocre conductor compared with copper or silver, providing half again more resistance per foot. It is almost as bad as aluminum, which they have pretty much stopped using for wiring in houses because of elevated fire risk (though to be fair, part of that risk comes from aluminum's higher rate of thermal expansion; it's not just the higher resistance).
The higher resistance is largely moot when you're talking about a few microns of thickness, however. The purpose of the gold coating is to ensure the contact surfaces don't corrode.
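For reference, the handbook resistivity values behind that comparison (a sketch, not a cable model):

    # Electrical resistivity at ~20 C, ohm-meters (standard handbook values).
    rho = {"silver": 1.59e-8, "copper": 1.68e-8, "gold": 2.44e-8, "aluminum": 2.65e-8}
    for metal, r in rho.items():
        print(f"{metal:9s} {r / rho['copper']:.2f}x copper")
    # gold ~1.45x copper, aluminum ~1.58x copper -- hence "half again more resistance"
    # and "almost as bad as aluminum". A few microns of gold plating on a contact is
    # about corrosion resistance, not bulk conduction.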
Re: (Score:2)
Re: (Score:2)
which they have pretty much stopped using for wiring in houses because of elevated fire risk.
Only for 12AWG and 10AWG (15A and 25A) Aluminum. The fire risk is from poor terminations (screw terminals).
8AWG (40A) and up is still used on most homes, because the terminations are better made (lug terminals).
Aluminum is also cheaper--which is very important in residential.
Re: Not Always (Score:2)
" It is almost as bad as aluminum, which they have pretty much stopped using for wiring in houses because of elevated fire risk (though to be fair, part of that risk comes from aluminum's higher rate of thermal expansion; it's not just the higher resistance)."
These days we have switches and outlets compatible with aluminum wire, so you no longer have to do goofy stuff like specially terminating it with copper ends. But you still have to use a torque driver at the panel to make sure you don't connect it incorrectly.
Re: (Score:2)
It has a hilarious word-salad description (emphasis mine):
"The Valhalla 2 Reference Speaker Cable consists of twenty-eight conductors divided into four groups of seven. Each conductor is made from solid core 99.999999% oxygen free copper and plated with 85 microns of silver. A Dual Mono-Filament wrap is applied and helically wound around each conductor before a precision FEP jacket is extruded over the outside. The transmission speed of the cable is extremely fast, at 97% the speed of light. V2 exemplifies
Re: (Score:2)
It has a hilarious word-salad description (emphasis mine):
It's not word salad. I understand every bit of it. It is, however, lying, unless they've found a way to violate the laws of physics.
Each conductor is made from solid core 99.999999% oxygen free copper and plated with 85 microns of silver.
Because copper oxidation is so much more fun when you're coating it with silver. Gold, sure, but silver? Is that a joke? ROFL.
The transmission speed of the cable is extremely fast, at 97% the speed of light.
Maybe they are dyslexic. 67% is entirely plausible in copper. :-)
Re: (Score:2)
Plenty of things. From the convenience of it, to the management of cables, to in some cases the looks of it, the entire industry is moving to single cable powered solutions. Also why do you presume it's to power the monitor?
I have a single cable display + power solution here, and the monitor is *providing* the power and already has a separate power cord.
Re: (Score:2)
Laptop users. One cable from the display to the laptop, with the display powering the laptop. Probably also for most desktops, removing the need for a dedicated power supply in the desktop, except for gaming PCs with dedicated GPUs.
Apple's new super mega ultra computer only uses 200 W; it will run off most high-end monitors that support 240 W out. Needing a dedicated power cable for your PC is going to look really silly in a couple of years as power demands continue to shrink. I can't imagine USB-C will
Re: (Score:2)
You could power devices such as media hubs (ie, Fire Stick), cable boxes, game consoles, etc...
Depending on which direction(s) the power goes, you could also power monitors. One less cable. (eyes the trio of cables running into my monitor)
Re: (Score:2)
WTF do you need power on the same cable as the data?
I routinely plug my laptop into my monitor with a single USB-C cable. That carries video to the monitor, power to the laptop, and also connects the laptop to a USB-C microphone and SSD backup drive that are plugged into other ports on the monitor. One cable is better than two (or four).
Re: (Score:3)
The only way this will get any traction is if China mandates it.
It will get traction if the industry body is large and influential enough. Consumers don't decide this shit, manufacturers do. The question is if they can convince the Samsungs and LGs of this world to adopt it. As it stands, HDMI is deficient in this area and can't provide uncompressed 8K at 60 Hz, and you just need to look at consumer TVs in general to see how DisplayPort simply isn't a thing outside of computer monitors.
Re: (Score:2)
It's an alt mode for USB C. The cables are real USB cables.
It will gain traction because the ports will be compatible with DP over USB and standard USB Power Delivery, but offer more features with compatible hardware.
Consumers are already used to that, where official/certified cables and devices offer better performance. Apple started doing it years ago with chargers.
Re: (Score:2)
I see this stupid shit pasted all over this article. Have none of you wondered why your TV doesn't have a DisplayPort?
It is completely true that DP is flatly better for the sole purpose of moving video frames over a cable, but the fact of the matter is, it's not at feature-parity with HDMI.
Re: (Score:2)
May be too late for the pro-video world (Score:3)
Which is transitioning off baseband to IP video workflows with SMPTE 2110, IPMX, or SDVoE. Maybe it has a place for gaming and general TV use, but we have only just gotten DP2.1 support in the past 24 months.
It's still quite cool to push that much bandwidth down a cable but as others have said here it's kind of a dead end unless they submit this to VESA as maybe DP3.0?
Re: (Score:2)
China alone is a huge market. This is the start of their big push for next gen displays and laptops. Instead of waiting for Western standards orgs to catch up and then releasing along with Western manufacturers, they will just move past us. Again.
Re: (Score:2)
I suppose that could be the case, but even though we all kinda thought HD would be more than anyone needed, will we start needing more than 4K in most applications over the next 10 years? 8K displays are still pretty rare, are only for media viewing, and DP2.1 can push a reasonable amount of data.
And at least in my world on the broadcast and AV side IP video *is* the future, it's just a matter of adoption and product maturity at this point. ST2110 can push a 32k x 32k lossless and then your network switch becomes y
Re: (Score:2)
I have 4k monitors, but they aren't perfect. 8k is definitely an improvement. For 8k content, it depends where you live. Japan has been broadcasting in 8k for years. It's looking like Chinese companies are going to really push 8k displays, with home-grown technology and panels. It's the same strategy that they used for cars, wireless comms, high end audio... Target the next generation and get there ahead of everyone else.
Video over IP is interesting, but how is the latency? For gaming it matters.
Re: (Score:2)
Right now 2110 is sub 1ms latency, maybe not fast enough for gaming but there are more and more hardware solutions coming down the pipeline. It's good enough for virtual and live production at least so we will see if it shakes out.
I know 8K is out there on the pro side (lots of 8K cameras and editing monitors), but for general use it's still pretty rare and expensive, and for gaming it's well off in the future; most of us will still take 1440p at 120 Hz+ over UHD at 60 Hz unless we can all get some 5090s
Re: (Score:2)
1ms is fine for gaming. Thanks, that's interesting.
Re: May be too late for the pro-video world (Score:2)
There were in fact displays with only SCART. That's why there were so many adapters.
Practicality of 8k for most uses? (Score:3)
Is there a practical home-use for an 8k monitor/TV? I'm all for the more is more philosophy, but I think at home sizes there's a point of diminishing returns that, if we haven't crossed it, we've got to be approaching. I have a 65" 4k TV that I sit only about 8 feet away from, and I honestly don't know how much more crisp that image needs to get. And I'm not sure most homes have room for something bigger. I suppose I *COULD* go up to 75" if I wanted to start soaking up the walkway around the living room, but how far are we gonna push the "more pixels" thing?
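One common rule of thumb for that question: a pixel stops being individually resolvable once it subtends less than about one arcminute (roughly 20/20 acuity). A rough Python sketch under that assumption (the helper name is just illustrative):

    import math

    # Distance (in feet) beyond which one pixel subtends less than 1 arcminute.
    def pixel_limit_distance_ft(diagonal_in, horizontal_px, aspect=16 / 9):
        width_in = diagonal_in * aspect / math.hypot(aspect, 1)
        pitch_in = width_in / horizontal_px
        return pitch_in / math.tan(math.radians(1 / 60)) / 12

    print(pixel_limit_distance_ft(65, 3840))  # ~4.2 ft for a 65" 4K panel
    print(pixel_limit_distance_ft(65, 7680))  # ~2.1 ft for a 65" 8K panel
    # By this (crude) criterion, from ~8 feet away a 65" 8K set buys nothing visible
    # over 4K; sharper-eyed viewers and high-contrast test patterns can do somewhat better.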
Re: Practicality of 8k for most uses? (Score:3)
My family and I can't tell the difference between 720p, 1080p, and 4K when actually watching a movie on a 50-inch TV. I'm not talking about a comparison where people are not watching the movie but looking for differences.
With cartoons and animated anime stuff, I can almost get away with 480p without people noticing.
Re: (Score:2)
I can see a difference when going from 480 to 1080. The difference from 1080 to 4K isn't as stark, but it's noticeable. I'm just thinking there's a point where the eye won't really be able to pick up the difference, and we have to be approaching it at this point.
Re: (Score:2)
Re: (Score:2)
8K may be pushing it but objectively resolutions larger than 4K are already common both in single devices, and in situations where you may want to daisy chain devices (I'm pushing 2x 4k displays here over a single USB connector already). If these were gaming monitors it already wouldn't work due to bandwidth limits.
Re: (Score:2)
Is there a practical home-use for an 8k monitor/TV?
I think there is for sports. Watch soccer on a 4k TV. The camera is usually pulled back far enough to see a lot of the field, so each individual player on a 4k screen (3840x2160) is about 150 pixels tall, and the number of their jersey is about 30 pixels tall. That's usually not enough for me to make out what's happening. I can make it out better live in person. An 8k screen I think would be enough to make it out. I'd sit closer to it than your 8' if I wanted to watch. (Likewise, at IMAX I like to sit about
Re: Practicality of 8k for most uses? (Score:2)
I had to move my sitting position from 24 ft to 12 ft to tell the difference between 1080p and 4K on my 106 in home projection screen. I do not expect I could tell the difference between 4K and 8K at that same distance.
When sitting much closer to a computer, I will likely see the difference. Most likely, many applications will have problems with fonts being too small to read.
Unfortunately, scaling issues vary from one app to another, and this is not easily correctable. It is already an issue on my two 32in 4K
Re: (Score:2)
Re: Practicality of 8k for most uses? (Score:2)
3840x2160 is just *barely* higher-density at 28" than your eye can pick out individual pixels from arm's length away, so it's basically "print-density". Achieving comparable pixel density on a 65" TV viewed from ~7 feet away requires ~16k resolution.
6-point text at 1920x1080@27" with subpixel-smoothing is kind of like printed text with 1 diopter of astigmatism, viewed through high-index lenses with some nasty abbe-error on top.
If you *really* want to appreciate the difference, look at lowercase letters with
Re: (Score:2)
I use a couple of 27" 4k monitors, and I'd like to upgrade to 8k. 4k is good, but I can still see aliasing, still see pixels. Text could still be a little sharper, and thus easier to read.
16k HDR? (Score:2)
At what point are we going to need a magnifier to see pixels on our TV? or press our noses up to the screen?
Will we need to use our hands to block blinding brightness preventing our iris from closing enough to see in the shadows of extreme contrast?
Re: (Score:2)
Barney Stinson's 800 inch TV.
"It hurts my eyes."
"Heh, yeah. That doesn't go away."
Telemetry? (Score:2)
Does this new standard include telemetry to report to the Chinese government how it is being used and what data is being transmitted? Does it allow for the display to be remotely disabled if disfavored content is viewed?
Re: (Score:2)
Yes. They use the same technology that allows Chinese solar-powered desk calculators to communicate with the mothership by modulating differential Earth albedo, but they do it with the EM emissions leaking off the cables. It is a nano-chip they just weld into the CE mark on the thing.
Comparison chart needs updating (Score:2)
HDMI 2.2 - 96 Gbps
Thunderbolt 5 - 80 Gbps
USB4 v2 - 80 or 120/40 Gbps
Double the power? It's still only 48V, so that means double the current. I guess that's one reason the Type B connector is twice as big - double the power wires and double the signal.
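The implied currents, assuming the stated 48 V rail (a simple I = P / V sketch):

    # Current required at a fixed 48 V bus for the quoted power levels.
    for watts in (240, 480):
        print(f"{watts} W / 48 V = {watts / 48:.0f} A")
    # 240 W -> 5 A (the USB PD EPR ceiling); 480 W -> 10 A, which presumably needs
    # heavier or duplicated power conductors -- one reason the Type-B plug is larger.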
Re: (Score:2)
Additionally, apparently the 192Gbps mode is single direction, which makes sense for a display.
GPMI has up to 8 24G channels.
USB4 v2 has 4 lanes, which are 40G each. Giving 80/80 symmetric or 120/40 asymmetric bandwidth.
It's not more advanced technology, just more wires in the cables, and in the case of Type-C, it can go 4x24Gbps in a single direction, instead of 3x40+1x40
Power delivery is a physical current capability limitation.
48V is as high as you can go without getting dangerous.
Push the current too hi
Re: (Score:2)
I always thought SERIAL meant something. I get bidirectional serial makes sense... but when you have many channels is it really serial anymore? Just because the physical layer separates into discrete channels it's still serial? but at the data layer it's aggregated parallel so it's still serial?
I just wanted a simple USB 2 without a complex buffered microcontroller and a decent plug like firewire... no, more like a headphone jack that can rotate. It's all anybody needed!
Seriously, if you want more get a di
Re: (Score:2)
TB5/USB4 also aren't really comparisons. They only support those speeds in CIO mode.
DP-Alt is still limited to 80Gbps.
Standards Bodies and China sanctions (Score:2)
Great, Something Else I Don't Need (Score:2)
USB-C cables are flawed (Score:2)
The USB-C connection to my iPhone seems to have grown weaker over the past 2 years, such that when my car hits a bump in the road, I lose my CarPlay. I'm not convinced a standard USB-C connection is as sturdy as an HDMI or DisplayPort connection.
Re: USB-C cables are flawed (Score:2)
Not sure other standard connectors are better in reality.
Re: (Score:2)
Old headphone jacks are better; new ones are amazingly cheaply designed. Plus when you break a solder joint you could fix it yourself without the risk of a heat gun.
What is crazy is USB C having such tiny pins to wear down...and they push way too high a wattage and at a low voltage as well! I'm surprised my cables are not getting hot actually... It's like all that basic physics about cross sectional area and building codes about wire gauges somehow disappeared.
When are they going to run a 1200 watt toaster
Re: USB-C cables are flawed (Score:2)
They are NOT flexing (Score:5, Informative)
I fixed the subject for you. USB4 version 2.0 (yes, I agree it's a dumb numbering scheme) has an 80 Gbps symmetric mode and 120/40 Gbps asymmetric mode. So they've announced a sometimes-proprietary cable that has a standard connector, but the standard connector runs slower than what the rest of the world decided on 2+ years ago. Being late to the party with a slow cable is not a flex.
Re:They are NOT flexing (Score:5, Interesting)
The target here is HDMI. Everyone knows that DP can handle 8K over USB-C just fine.
The problem is that HDMI2.2 still isn't here, and HDMI2.1 is way, way, behind.
DisplayPort does not handle media-center requirements like ARC and CEC.
Could they have just pumped HDMI framing over the 120Gbps channels? Yes. They could have. But they wouldn't be able to call it HDMI.
Could they have made this "GPMI" higher bandwidth? Yes, they could have, but there wasn't a need, and the lowest bandwidth that gets the job done is always better, because the higher the bandwidth, the more sensitive the cable is to length.
In short, this is someone getting annoyed waiting for HDMI 2.2, and making their own. Using a USB-C port was just sound reasoning.
Re: (Score:2)
HDMI 2.1 is extremely common now, and supports 8K up to 120 Hz. What exactly do you think they currently need that's beyond that, such that they decided to make a new, incompatible format instead of using HDMI 2.2, which was announced before this? Your "annoyed waiting" hypothesis makes no sense at all.
Re:They are NOT flexing (Score:5, Informative)
HDMI 2.1 is extremely common now, and supports 8K up to 120 Hz.
Only with pretty extreme color space compression.
8K HDR over HDMI2.1 is a joke.
What exactly do you think they currently need that's beyond that, such that they decided to make a new, incompatible format instead of using HDMI 2.2, which was announced before this?
Better than 50Hz SDR uncompressed? Or the ability to do 60Hz 8K at something better than 4:2:2 chroma subsampling?
They obviously started work on this long before HDMI2.2 was announced, as this is a fully fleshed-out electrical standard.
HDMI2.1 is hideously outdated.
Your "annoyed waiting" hypothesis makes no sense at all.
It does, you just didn't think it through.
Pretend like HDMI 2.1 was released in 2017.
Pretend like GPMI took more than the 3 months to create between HDMI2.2's spec release and now.
Pretend like DP hasn't been able to do 8K/60Hz/10bpc RGB since 2019.
HDMI2.1 has been stretched to hit a few technical benchmarks, but that taffy is real thin in the middle now.
All of this ignoring the argument of whether or not 8K is even really a thing or not.
Re: (Score:3)
HDMI 2.1's visual quality is rubbish at 8K 120 Hz. Even at 8K 60 Hz there are limitations, quite the opposite of what display companies want when they are pushing hyper-fancy displays capable of 16-bit Rec. 2020 colour reproduction.
Need patent free connectors (Score:3)
We really need 100% patent-free device interconnect standards. That is, 100% patent-free, and not a patent pool where all the patent owners "let" the patents be used for free.
Re: (Score:2)