HDMI requires a license cost, DisplayPort is free.
What advantage does HDMI hold over DisplayPort?
It makes them money.
The HDCP encryption is why it’s used.
No real technical advantage; it’s just owned by the same shitbags that dominate the TV market, so it’s the only way to connect to a lot of consumer living-room displays
This is the problem. I would switch to DP instantly but my TV only has HDMI ports.
There are DisplayPort to HDMI converters available
Pretty sure DRMed content refuses to play on those.
🏴☠️ Well 🏴☠️ I 🏴☠️ don’t 🏴☠️ care 🏴☠️
Random clips on the web are DRMed these days, like news articles with an embedded video. Many CMSes just DRM all clips. Totally BS but I’ve seen the video frame staying black on a bunch of sites now.
How many embedded DRM-controlled news article videos are you watching on your living room tv though? PC monitors usually have native display port nowadays, no converters or HDMI necessary.
Then I won’t watch those, simple as. Plus, I’ve been running LibreWolf for like a year, which blocks all DRM by default, and never in my life did I have issues playing a video. Even live videos from TV channels work 50% of the time.
Plex TV is the only one that seemingly requires DRM, from when I looked into it. I decided not to use it specifically for that reason.
Doesn’t change facts for millions of others.
TBH you should be playing DRM content through smart TV/TV box apps anyway. Desktop Windows playback is more technically limited (for instance, no auto resolution/refresh rate switching) and aside from that you usually get a worse bitrate stream on a stuttery player.
I don’t even know about DRM playback on Linux.
People who connect TVs to the Internet only invite malware. They usually don’t receive big fixes after a few years and tend to spy on all watched content.
Then watch on a plug-in Android TV box. Or take to the high seas.
I’m just saying, if you’re going to stream from an internet service anyway, video/audio on every HTPC streaming app I’ve tried looks bad. Netflix is the best, and it’s still heavily compromised. And (at least on my Sony), the local Android apps tend to have the best system integration for rescaling, HDR, setting the correct refresh rate, per app IQ settings and so on.
But that obviously doesn’t apply if you’re hosting it locally through Kodi, Jellyfin, Plex or whatever.
You should literally never use the apps built in to your TV. Unless you just really like letting the TV manufacturer know exactly what you are watching and when.
On Linux you check the box in Firefox that says Allow DRM Content and then yes, as far as I know, you need to be using a laptop or an HDMI display.
Fine. then use a Roku or Apple TV or whatever. He literally included those.
Rokus have the same problem regardless of form factor. But this thread is about people who want to use the Steam Machine for streaming.
Latency, desync, probably can’t do full 4k/120… just because something exists doesn’t mean it’s a viable solution.
deleted by creator
Display port to HDMI cables are pretty good
Active ones aren’t cheap, though.
Yes, the HDMI Forum are shitbags, but there are definitely technical advantages to HDMI. Just off the top of my head: DisplayPort doesn’t have ARC (audio return for sound systems) or CEC (device can turn on TV/display, TV remote can pause a movie playing on a console, etc), and the max length for a DisplayPort cable is about 3 meters before you have to go to expensive active cables. Most of these are easy to work around for most PC setups, but if Valve wants the gabecube to easily fit into living room/TV setups, it does present a challenge.
All of these supposed advantages are solved by USB-C though. Even the length is higher (5m, I believe). I’d be fine if the DisplayPort connector is gone, but the actual standard is just better for most purposes.
Cec over USBC?
I’ve never actually used CEC, but everything I’ve seen says it’s just like a USB HID, correct? According to wikipedia, there already exist USB to CEC adapters.
I don’t know what HID is, but CEC lets you control Kodi with the TV remote.
HID means a human interface device, so most commonly a keyboard, but remote controls can and do use the same protocol just fine.
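For the curious: on Linux those CEC adapter button presses typically surface as ordinary input events, just like a keyboard. A minimal stdlib-only sketch of decoding one raw event (the `/dev/input/event5` path is hypothetical; `KEY_PLAYPAUSE = 164` is the standard kernel key code a CEC play/pause press usually maps to):

```python
import struct

# struct input_event on 64-bit Linux: timeval (2 longs), type, code, value
EVENT_FORMAT = "llHHi"
EVENT_SIZE = struct.calcsize(EVENT_FORMAT)

EV_KEY = 0x01          # key/button event type
KEY_PLAYPAUSE = 164    # from linux/input-event-codes.h

def parse_event(buf: bytes):
    """Decode one raw input_event into (type, code, value)."""
    _sec, _usec, etype, code, value = struct.unpack(EVENT_FORMAT, buf)
    return etype, code, value

if __name__ == "__main__":
    # Reading the real device needs hardware, e.g. (path is hypothetical):
    #   with open("/dev/input/event5", "rb") as dev:
    #       print(parse_event(dev.read(EVENT_SIZE)))
    # Here we just decode a synthetic "play/pause pressed" event:
    raw = struct.pack(EVENT_FORMAT, 0, 0, EV_KEY, KEY_PLAYPAUSE, 1)
    print(parse_event(raw))  # (1, 164, 1)
```

Because the remote ends up looking like any other input device, Kodi and friends don’t need CEC-specific code to react to it.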
HDMI has always sucked. I used DVI for the longest time, because HDMI couldn’t push enough pixels to a 1920x1200 display (topped out at 1080p for the longest time). Then jumped straight to display port when I finally got a 4k monitor.
HDMI was always 4-5 years behind other contemporary protocols, and for your trouble, you also got a stack of proprietary bullshit to go with it.
My understanding is it’s not even a licensing cost issue. The HDMI consortium won’t let you include features from 2.1 and 2.2 in an open source driver. It sounds like Valve would be willing to pay, but they’d have to include a closed source driver for the video card.
That’s still a licensing issue: you’re not allowed to license from the HDMI consortium and then freely sublicense to all your users, which is what open source requires. Hopefully this eventually concludes in the end of relevance for HDMI and we can have a freer, and just better ecosystem in general.
Valve should ship it as displayport internally and bundle a free HDMI adapter that they sell in the store, that way it’s all open source and the HDMI issue is taken care of in the most flippant way possible.
I think that’s actually what Intel did on their A-series graphics cards. They only had DisplayPort output signals, but had a DisplayPort-to-HDMI adapter built into the board.
Yes, but that adds more cost. I don’t have any hard data on this, but it feels like their current solution works fine, since anyone pushing more data than 2160p60, who also won’t accept chroma subsampling, is probably already using DP. Maybe this is a direction to pressure the HDMI Forum, since unlike AMD, Valve’s drivers are actually open source on the majority of their users’ machines. And if things change in the future, external or proprietary adapters are both solutions.
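For reference, the rough math behind that 2160p60 cutoff. A back-of-the-envelope sketch, assuming the standard 4400x2250 total frame (with blanking) for 4K60 and HDMI 2.0’s 18 Gbit/s TMDS ceiling with 8b/10b coding:

```python
# Does uncompressed 4K60 RGB fit in HDMI 2.0? Back-of-the-envelope check.
# Assumptions: 4K60 uses a 4400x2250 total frame (594 MHz pixel clock),
# TMDS adds 8b/10b coding overhead, and HDMI 2.0 tops out at 18 Gbit/s.

def tmds_gbps(h_total, v_total, fps, bits_per_channel):
    pixel_clock = h_total * v_total * fps            # pixels per second
    data_rate = pixel_clock * 3 * bits_per_channel   # RGB bits per second
    return data_rate * 10 / 8 / 1e9                  # 8b/10b overhead, in Gbps

HDMI_20_LIMIT = 18.0

r8 = tmds_gbps(4400, 2250, 60, 8)    # 4K60, 8-bit RGB
r10 = tmds_gbps(4400, 2250, 60, 10)  # 4K60, 10-bit RGB (HDR)
print(f"4K60 8-bit:  {r8:.2f} Gbps, fits: {r8 <= HDMI_20_LIMIT}")
print(f"4K60 10-bit: {r10:.2f} Gbps, fits: {r10 <= HDMI_20_LIMIT}")
```

So 4K60 8-bit RGB squeaks in at about 17.8 Gbps, while 10-bit HDR overshoots the limit, which is exactly where chroma subsampling or a faster link starts mattering.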
I don’t see “relevance for HDMI” ending anytime soon. Tell me how easy it is to find a TV with DP inputs. Nearly 99% of consumer gear uses HDMI.
It’s easy to find a TV with USB-C input, though not universal. That still uses the DP protocol, and cables with different connectors on opposite ends are both cheaper and more common than those with HDMI as a result. Also, this is only even an issue if HDMI 2.0 isn’t fast enough for you, so old devices aren’t a concern.
My guess is TV compatibility. The steam machine is intended as a living room PC, connected to your TV. Most TVs only have HDMI, no DP.
I’m not sure where I got this idea, but I thought it was because Display Port doesn’t carry audio, and a single-cable solution was more appealing.
But apparently Display Port also supports audio, just none of my devices seem to recognize it…?
Apparently the only advantage of HDMI is ARC (Audio Return Channel), allowing devices to send audio back to the video source, which might be useful in some home theater setups.
And CEC. Idk what ARC is but I use CEC daily.
I’m pretty sure Display Port supports CEC (pin 14)
I ruined my audio ports on my computer and now run my speakers through my monitor using DP, it works great!
Wow I always assumed the same thing lol
Ohh TIL, thanks! I could count the times I needed the TV to send audio back to the home theater, like if I want to watch open channels or something.
I think we could live without it, just plug an audio cable or something, fuck hdmi.
Omg is this seriously it??
Yeah pretty much. Display port is just as good but there aren’t really a lot of TVs on the market with display port because the people who own the HDMI standard are in that industry.
HDCP?
I don’t think you can send audio over DisplayPort
Funny, I have done it for years at home, I guess I am just confused
Somebody replied to another comment, but it seems like HDMI allows audio to be sent back, like, if you wanted your screen to send audio to the computer… which would be weird in most PC scenarios, but not so much on TVs.