If your TV vendor decides to only put 100Mb cards in their TV then unfortunately spikey boy wins and you lose unless you’re willing to downrez your AV catalog.
What the hell are you watching that has a bitrate of >100 Mbps? Because unless you have a 16K television, I suspect the answer is nothing.
I have plenty with higher-bitrate audio that can hit 80. And with the overhead of the rest of the connections, plus possibly some chipset limits on handling TCP etc., it starts stuttering around that 80 Mbps mark.
I have a 4K Blu-ray remux of Misery that has a 104 Mbps bitrate. But there are only a couple of movies in my collection that break 100. Most of my remuxes are around 50 to 70.
Anyhoo, it’s all moot in terms of network speed since I just use an HTPC to play all of them.
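For anyone curious about the port math, here’s a napkin sketch (Python, assuming a standard 1500-byte MTU, plain TCP/IPv4 with no options, and nothing else on the link; the numbers are illustrative, not from anyone’s actual setup):

    # Best-case goodput over a 100 Mb/s port with standard Ethernet + TCP/IPv4
    LINK_MBPS = 100
    MTU = 1500                          # IP packet size
    ETH_OVERHEAD = 7 + 1 + 14 + 4 + 12  # preamble + SFD + header + FCS + inter-frame gap
    IP_TCP = 20 + 20                    # IPv4 + TCP headers, no options

    wire_bytes = MTU + ETH_OVERHEAD     # 1538 bytes on the wire per frame
    payload_bytes = MTU - IP_TCP        # 1460 bytes of actual video data

    print(f"{LINK_MBPS * payload_bytes / wire_bytes:.1f} Mb/s")  # ~94.9 Mb/s

So a 104 Mbps remux doesn’t fit even in theory, and once retransmits, other traffic, or a weak TV network chipset eat into the link, the ~80 Mbps stutter point above isn’t surprising.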
Venn diagram of people who understand this specific technicality and people who don’t want to deal with the shitty TV software is almost a circle though.
I’d rather get an Android box at the very least, or just an HTPC.
I’m in that Venn diagram but I’m married with kids and the UX of anything but the TV remote and Plex software is a bit much for me to convince the family to learn. And potentially relearn when I find the next great app like Jellyfin 😅
I think there’s another circle, with at least significant overlap with those two, of family techies who just can’t convince the rest of the family to care.
My wife and kids found Jellyfin easier to use because it more closely resembles Netflix. Your mileage may vary but I get it, and it’s why I even use a media server over just plugging in a laptop with Kodi.
Sometimes the best solution is whatever you can get the users to actually use.
I set up an HDMI-over-Ethernet converter and run Ethernet between my TV and main desktop. It solves problems.
Discovered this on a laptop after running the cable. Wi-Fi was getting 250 Mbps vs the wired port’s 10/100 speeds.
A TV, I mean, why not, but on a laptop? Is it from the nineties? O_o
My Canon ink-tank-type printer from the mid-COVID era is the same. I didn’t realise it was only 10/100 on the wired port until I was looking at the switch one day and wondered why I had a yellow light instead of green. I was about to run a new network cable until I checked the printer.
I guess you have to have a very particular workload, and printer, to need a gigabit line…
Right?
Casually printing highway ad posters at home, nothing special
Could be something wrong with a cable? A damaged cable can downgrade your connection from gigabit to 100 Mb.
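If you’d rather not stare at switch LEDs, on Linux you can read the negotiated speed straight from sysfs (quick sketch; "eth0" is a placeholder, use whatever "ip link" shows on your box):

    # Read the auto-negotiated link speed (in Mb/s) from sysfs - Linux only.
    from pathlib import Path

    def link_speed_mbps(iface: str = "eth0") -> int:
        # May return -1 (or raise) if the link is down or the driver doesn't report it.
        return int(Path(f"/sys/class/net/{iface}/speed").read_text().strip())

    speed = link_speed_mbps("eth0")
    if speed == 100:
        print("Stuck at 100 Mb/s - check the cable, the port, or a 10/100-only device.")
    else:
        print(f"Negotiated {speed} Mb/s")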
They do that shit on purpose. Use a Shield or an HTPC. The only input your TV should be getting is HDMI.
Hell no! Only DVI or DisplayPort. No money to patent trolls!
Is that why my shit keeps buffering any time I try to stream a movie larger than 50-60 GB, despite the fact that I have a gigabit connection and a 2.5Gb router? TIL. BRB, running some speed tests on my TV…
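Quick napkin math on why that happens (hypothetical 55 GB file and roughly a 2h10m runtime, not anyone’s actual rip): the bottleneck is usually the TV’s own 100 Mb port, not the gigabit uplink or the router.

    # Average bitrate of a movie from its file size and runtime
    size_gb = 55                      # hypothetical remux size
    runtime_s = 2 * 3600 + 10 * 60    # ~2h10m

    avg_mbps = size_gb * 8 * 1000 / runtime_s   # GB -> gigabits -> megabits per second
    print(f"~{avg_mbps:.0f} Mb/s average")      # ~56 Mb/s

    # Remux peaks can run well past the average, so a TV port capped at
    # 100 Mb/s (really ~80-95 Mb/s usable) buffers on those scenes no matter
    # how fast the router or internet connection is.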