It’s plausible but unlikely, I think. That’s putting a lot of faith in shitty pinhole cameras being able to spot twenty-two 4K pixels that are one hex value lighter or darker, when most cameras have atrocious definition/sharpness and get blown out by bright light or blinded in the dark. I dunno, this reminds me of the screaming around Microsoft Kinect in 2013. They had bad and shitty plans for Kinect, but in the end it was just cheap hardware everyone hated. Idk.
I guess it could work if the TV itself were doing the DRM recognition? Idk though, I’ve seen alarmist posting like this before… it seems to me evil tech shit usually gets done in more mundane ways.
In every frame, easily identifiable by a shitty pinhole camera though?
I updated my comment with more details
I feel like if you just turn up the compression ratio slightly, all that nuance is lost, making the watermark nonexistent or unusable.
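For what it’s worth, here’s a quick toy check in Python (just numpy + Pillow, nothing to do with whatever scheme these services actually use, and JPEG is only a crude stand-in for a real video codec): bump 22 pixels in a synthetic frame up by one value, re-encode at a middling quality, and count how many of the nudges come back intact. On noisy content you’d expect most of them not to.

```python
# Toy check, not any vendor's actual scheme: nudge a handful of pixels by one
# value, round-trip through JPEG, and see how many nudges survive.
import io

import numpy as np
from PIL import Image

rng = np.random.default_rng(0)

# Synthetic grayscale "frame" (2160x3840, i.e. 4K-ish), values capped at 254
frame = rng.integers(0, 255, size=(2160, 3840), dtype=np.uint8)

# Hypothetical watermark: bump 22 fixed pixel positions up by exactly 1
locs = rng.choice(frame.size, size=22, replace=False)
marked = frame.copy()
marked.ravel()[locs] += 1  # safe: values were capped at 254 above

# Re-encode at a middling JPEG quality as a stand-in for streaming compression
buf = io.BytesIO()
Image.fromarray(marked).save(buf, format="JPEG", quality=75)
decoded = np.asarray(Image.open(io.BytesIO(buf.getvalue())))

# Count how many of the 22 positions still read exactly +1 vs. the original
diff = decoded.ravel()[locs].astype(int) - frame.ravel()[locs].astype(int)
print(f"{np.sum(diff == 1)} of 22 single-value nudges survived")
```

Real watermarking schemes spread the signal over way more than 22 pixels precisely so it survives re-encoding, but a literal one-value nudge in a few spots is the kind of thing compression eats for breakfast.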
Yes, especially since Netflix in particular has atrocious compression.