HDMI Forum Rejects Open-Source HDMI 2.1 Driver Support Sought By AMD - Phoronix

submitted by ruffsl (edited)

www.phoronix.com/news/HDMI-2.1-OSS-Rejected

For three years there has been a bug report around 4K@120Hz being unavailable via HDMI 2.1 on the AMD Linux driver.

The wait continues...


66 Comments

f00f/eris

This really bothers me. Closed standards locked behind a licensing fee may as well not be standards at all, in my opinion.

TurboWafflz

I don't understand why any hardware uses HDMI anymore anyway. What does it have that DisplayPort doesn't?

Dudewitbow (edited)

The HDMI Forum is made up of companies that own the home theatre environment (mainly movie companies and TV makers) and put DRM on HDMI to make it harder to illegally copy content like movies, so they will always want to be anti-open-source, because that's what the streaming services and movie businesses demand. It's why, for example, mobile devices have Widevine levels: those levels basically determine how "unlocked" the device is, and services will refuse to offer full functionality to unlocked devices because of it, be it audio or video.

Members of VESA, which controls the DisplayPort standard, are generally computer companies that are mostly not in the media business, so they value specs over DRM when making changes. One example use case is that DisplayPort allows for daisy-chaining displays.

nivenkos

The DRM is so stupid - now in the era of streaming you can get literally anything web-ripped on day one.

DRM is obsolete (and it always was, tbh).

smileyhead

DRM is not to stop pirates, but to show investors and licence holders you are trying to stop pirates.

Dudewitbow

It's the attempt that matters more to investors than the pirates. It's why a shit ton of games have Denuvo, even if the version of Denuvo they use has already been cracked. It's not there for the end user; it's there for the investors, to show they are at least attempting to fight off piracy.

leopold

Denuvo is actually very effective, relatively speaking. Several popular games that use it have never been cracked. They haven't made it impossible, just sufficiently difficult and tedious that no one wants to bother.

Auli

Isn't DRM in games working, though? With Denuvo only being cracked by one person, it sounds like a win for the corporations to me.

n3m37h

I don't know a single person who has ever used HDMI to steal copyrighted content. Seriously? Who would rip a 2-hour movie by recording it in real time versus the 10 minutes it takes to rip a movie digitally?

Like shit, ya got CAM, WEBRip, BRRip and scene releases. I doubt HDMI was used in any of these scenarios.

Dudewitbow

Technically speaking, every gamer who uses a capture card to get around the PlayStation's built-in recording being disabled when a cutscene is active is an example.

Flaky

There's probably a lot more hardware using HDMI than DisplayPort? Just throwing out a guess, tbh.

That being said, I might consider looking towards DisplayPort when I can get a new monitor...

narc0tic_bird

Feature-wise probably next to nothing, and it's usually behind one or two generations in terms of bandwidth. HDMI is often the only port available on TVs though, so GPU makers likely can't afford to just leave it out.

Grass (edited)

They should anyway. New TVs are all smart these days, and the dumb ones are built like they're from two decades ago. At this point we are better off with a PC monitor and separate speakers. Built-in speakers are shit, seemingly as a requirement. I use a video port switch for extra inputs so I don't have to dig through on-screen menus or run out of built-in ports.

Auli

Why not? If you need it, get a converter.

virr

CEC (technically I think DisplayPort could support it, but it generally isn't implemented) and Ethernet up to 100 Mbps.
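To give a concrete idea of what CEC buys you (a rough sketch, not from the article; it assumes libcec's cec-client is installed and that your HDMI hardware actually exposes CEC - most desktop GPUs don't route it, but a Raspberry Pi or a Pulse-Eight USB-CEC adapter does):

```python
import subprocess

def cec(command: str) -> str:
    """Send one command to the TV through libcec's cec-client."""
    # -s: single-command mode (read one command from stdin, then exit)
    # -d 1: only log errors
    result = subprocess.run(
        ["cec-client", "-s", "-d", "1"],
        input=command,
        capture_output=True,
        text=True,
        check=True,
    )
    return result.stdout

cec("on 0")       # power on the TV (logical address 0)
cec("standby 0")  # put the TV into standby
```

The same channel is what lets a console or streaming stick switch the TV's input and control volume without a second remote.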

anyhow2503

Almost nothing uses Ethernet over HDMI, to my knowledge.

Baut auf. she/her

This is the first time I've heard of Ethernet over HDMI, and I can't tell if you're joking.

catloaf

I think they mean HDMI over Ethernet, which is a real thing, but not something I've ever seen in real life.

MiltownClowns

Decades of being the standard in A/V. That's like asking, why don't we get rid of gas stations and just install electric chargers? Well, everybody's got gas-powered cars.

TurboWafflz

AV things, sure, since they stick around longer, but computers? When was the last time you saw a high-end GPU with VGA or DVI? And they already mostly have DisplayPort, with just one or two HDMI ports.

MiltownClowns (edited)

Well, I wasn't referring to that ecosystem. That ecosystem is already on DisplayPort. The reason HDMI is so prevalent is that it's the standard in audio-visual equipment. Why would I talk about computer equipment when it's not the standard there?

The point still stands. Everybody has equipment that has HDMI, and to phase out that standard in equipment going forward is phasing out equipment people already own.

MonkderZweite (edited)

and to phase out that standard in equipment going forward is phasing out equipment people already own.

And where's the problem in that? My parents still use a nearly 20-year-old plasma TV. But they're getting old too.

krolden

Computers are AV things.

Dog

Today. Every time I go downstairs.

n3m37h

eARC and 12 Gbps more bandwidth (4K@185Hz vs 4K@120Hz)

Otherwise the same

SuperIce

Your info is outdated. DP 2.0 is 80 Gbps and can do 4K@240Hz without Display Stream Compression. It can do up to 16K@60Hz using DSC.
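Rough back-of-the-envelope check (my own arithmetic, not figures from the article): the raw pixel rate for 4K@240Hz at 10-bit RGB is

\[
3840 \times 2160 \times 240\,\mathrm{Hz} \times 30\,\mathrm{bit/pixel} \approx 59.7\ \mathrm{Gbit/s},
\]

which, even after adding blanking and protocol overhead, fits within DP 2.0's roughly 77 Gbit/s of usable payload (80 Gbit/s raw, 128b/132b encoded), whereas HDMI 2.1's 48 Gbit/s link would need DSC for the same mode.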

SchmidtGenetics

Can hook up to TVs…

Skull giver

Licensing is literally the only way the people who make HDMI can make money. They have a monetary incentive to sell as many licenses as possible. That's why they make new versions for minor features, because pasting the sticker with the new number on the box will pay their paychecks.

What I don't get, though, is why the open source approach would be a problem. I don't think the HDMI people have that many business secrets in software form; it's all patents and licensing.

Luckily, there's DisplayPort.

Catsrules

My guess is it has something to do with the DRM protection in the HDMI spec. I have no proof, but it seems like it is always DRM that screws over open source.

Skull giver

From what I've gathered (and I take this with a grain of salt), the AMD code relies on some proprietary code from the HDMI standards people that is under strict NDA. I don't think the implementation is entirely open source, and I don't think any existing implementation is, either.

In this case, Nvidia's approach of open sourcing the driver but doing most of the work in signed firmware blobs running on the card may be the best solution for now. I don't know how many distros are shipping Nvidia's open source driver, though.

Zucca

Based on the upvotes, it's not only your opinion. 👍

parens

Alright AMD, just remove HDMI from your graphics cards and be done with it 🤷. Fuck the HDMI Forum.

itsralC

As much as I want them to give HDMI the middle finger, I don't think they have enough leverage in the GPU market to pull off such a bold move.

UnpledgedCatnapTipper

Could they just include a DP-to-HDMI adapter in the box and have no HDMI ports on the GPU, maybe?

fuckwit_mcbumcrumble

They're currently what, 15% of the market? Nvidia would happily swoop in and pick up some more market share.

Auli

Ehh Nvidia doesn’t care about the graphics card market anymore. Look at the cash AI is raking in for them.

MigratingtoLemmy

Alright, DisplayPort, here we come.

n3m37h

I've been on the DP bandwagon since using my GTX 660Ti

fuckwit_mcbumcrumble

I don't think I've ever used HDMI by choice. It's always been VGA > DVI > DisplayPort. The only times I use HDMI are for consoles or stupid monitors with only one DP port and a bunch of HDMI.

flashgnash

Why VGA and DVI? I had thought they were limited in their max resolution/refresh rate

fuckwit_mcbumcrumble

DVI and VGA were mostly on older devices. HDMI was inferior to dual-link DVI right up until I switched to DisplayPort.

Sentau (edited)

So I see people on the Phoronix forums complaining that this is a bad thing because they have TVs which are HDMI-only. From what I read, the HDMI 2.1+ spec is only needed to support extreme cases like 4K@120Hz and above. So my question is: how many people have a TV old enough to have no DisplayPort but still capable of that outrageous specification?

Edit: it seems I was mistaken in thinking that new TVs have DisplayPort.

Catsrules

So my question is how many people are there who have a TV old enough to have no display ports but be of that outrageous specification

As far as I know, no consumer TV has DisplayPort.

I bought a TV maybe 2-3 years ago that supports 4K@120 and it doesn't have DisplayPort, only HDMI.

ruffsl [OP]

I'm using a recent 42" LG OLED TV as a large, affordable PC monitor in order to get 4K@120Hz + HDR@10bit, which is great for gaming or content creation that can take advantage of the screen real estate. Anything in the proper PC monitor market that is similarly sized, or even slightly smaller, costs way more for the same screen area and feature parity.

Unfortunately, such TVs rarely include anything other than HDMI for digital video input, despite the growing trend of connecting gaming PCs in the living room, e.g. with fiber-optic HDMI cables. I actually went with a GPU with more than one HDMI output so I could display to both TVs in the house simultaneously.

Also, having an API as well as a remote to control my monitor is kind of nice. Enough folks are using LG TVs as monitors in this midsize range that there are even open source projects to entirely mimic conventional display behaviors:

I also kind of like using the TV as a simple KVM with fewer cables. For example, with audio I can independently control volume and mux output to either speakers or multiple Bluetooth devices from the TV, without having to fiddle around with re-pairing Bluetooth peripherals to each PC or gaming console. That's particularly nice when swapping from playing games on the PC to watching movies on a Chromecast with a friend over two pairs of headphones, while still keeping the house quiet for the family. That kind of KVM functionality and connectivity is still something of a premium feature on modestly priced PC monitors. Of course, others find their own use cases for hacking the TV remote APIs:

cobra89

TVs don't have DisplayPort. I just bought a new TV, and none of the options I looked at had DisplayPort.

Dehydrated

These guys can go fuck themselves

nivenkos

This destroys any chance of Valve making an Xbox-competitive home console with SteamOS :(

velox_vulnus

This is an HDMI-only problem. Stick with USB-C or DisplayPort and you'll do just fine.

zelifcam (edited)

You're not going to bring new users to a platform if they can't use an HDMI cable that utilizes the full capabilities of their TV. Not everyone has a USB-C option, let alone DP, on their already-purchased televisions.

Why do people ignore this? This isn't about me or you understanding, sacrificing, and adapting in order to use the Linux desktop. We're talking about real issues hindering adoption by the common user who doesn't care that their Steam device is running Linux.

Auli

You get a cable that is USB-C on one end and HDMI on the other. Not that hard.

SomeBoyo

There are adapters.

zelifcam (edited)

When I built my all-AMD system 1.5 years ago, I tried ALL the adapters. Did I get video? Sure. But I wasn't getting all the features of HDMI 2.1. Has that been addressed? Because even looking at reviews today, it's mixed.

brbposting

Could they ship USB-C to HDMI as a workaround?

nivenkos

But I don't think that can manage the 4K@60Hz HDR + Dolby Atmos, etc., that modern game consoles have?

zelifcam (edited)

My understanding is that on Linux you will not get VRR on an AMD graphics card when using a USB-C adapter, which is why this update from the article we are all commenting on is so disheartening.

nivenkos

Tell Samsung that for my TV...

smileyhead

DisplayPort is super easy to convert to HDMI and the adapters are cheap. The other way around is not so easy.

SuperIce (edited)

Is anyone doing 4K above 60Hz with the Steam Deck? I highly doubt it.

nivenkos

This stops them from releasing a more powerful home version though, as SteamOS/Linux will not be able to support modern HDMI 2.1 features.