- cross-posted to:
- pcgaming@lemmy.world
cross-posted from: https://lemmy.world/post/11840660
TAA is a crucial tool for developers - but is the impact to image quality too great?
For good or bad, temporal anti-aliasing - or TAA - has become a defining element of image quality in today’s games, but is it a blessing, a curse, or both? Whichever way you slice it, it’s here to stay, so what is it, why do so many games use it and what’s with all the blur? At one point, TAA did not exist at all, so what methods of anti-aliasing were used and why aren’t they used any more?
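At its core, TAA works by blending each new frame with an accumulated history of previous frames, which averages away shimmering edges but also smears fine detail. This isn't code from the video, just a toy sketch of that accumulation idea (real TAA also jitters the camera and reprojects history with motion vectors, which is omitted here):

```python
# Toy sketch of TAA's core trick: exponentially blend the current frame
# into a history buffer. Real implementations add sub-pixel jitter and
# motion-vector reprojection on top of this.

def taa_resolve(history, current, alpha=0.1):
    """Exponential history blend: mostly history, a little current frame."""
    return [(1 - alpha) * h + alpha * c for h, c in zip(history, current)]

# An aliased edge pixel that flickers between 1.0 and 0.0 every frame
history = [0.0]
for frame in range(60):
    sample = [1.0 if frame % 2 == 0 else 0.0]
    history = taa_resolve(history, sample)

# history settles near 0.5: the flicker is gone, but so is per-frame
# detail, which is where TAA's characteristic blur comes from
```

The same averaging that kills flicker is what softens the image, which is the trade-off the thread below argues about.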
The need for anti-aliasing is a byproduct of moving away from CRT display technology. The natural image softening of CRTs is not replicated by LCD and LED displays.
TAA is one of the better options, but at the end of the day it's difficult to build a true AA solution that doesn't have artifacts without resorting to supersampling.
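For reference, supersampling sidesteps artifacts by brute force: shade several sub-pixel positions per pixel and average them, so edge pixels get fractional coverage instead of a hard stair-step. A toy sketch (the `scene` function here is a made-up stand-in for a renderer):

```python
# Hedged sketch of supersampling (SSAA): evaluate the "scene" at a grid
# of sub-pixel positions and average them into one pixel value.

def ssaa_pixel(scene, x, y, factor=4):
    """Average a factor x factor grid of subsamples inside pixel (x, y)."""
    total = 0.0
    for sy in range(factor):
        for sx in range(factor):
            # sample at the centre of each sub-pixel cell
            total += scene(x + (sx + 0.5) / factor, y + (sy + 0.5) / factor)
    return total / (factor * factor)

# Toy scene: a hard diagonal edge, 1.0 on one side, 0.0 on the other
edge = lambda x, y: 1.0 if y < x else 0.0

# A pixel the edge cuts through resolves to partial coverage
print(ssaa_pixel(edge, 0.0, 0.0))  # 0.375 -- between black and white
```

The catch, of course, is cost: 4x4 supersampling shades sixteen times as many samples per pixel, which is why TAA's cheap reuse of past frames won out.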
We used AA on our CRTs back in the day. Of course, we were all running resolutions like 1024x768, so it was needed a lot more. The higher your resolution, the less you need it.
Yes, that's true. AA was helpful at what I call "medium resolutions", the range between 480 and 768 pixels. But CRTs still had a softer image simply as a byproduct of the way the technology worked, and worked better at lower resolutions like 240p (AFAIK, any signal less than 480 vertical pixel resolution was automatically progressive scan). Game developers of the time exploited this, famously using dithering for transparency effects on platforms that didn't fully support it, such as the SEGA Saturn (it only supported transparent 2D sprites, not textured polygons like the PSX did). The softer image smoothed the dithered effects out, giving the appearance of a bigger available color palette and special effects. Flickering sprites every other field was also a common technique, owing to CRTs' high image persistence. This is why games like Streets of Rage look awful on modern displays but display correctly on CRTs.
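The dithering trick described above is easy to sketch: draw the "transparent" surface on only every other pixel in a checkerboard, and let the CRT's softness blend it 50/50 with the background. A toy illustration (names and the 4x4 buffers are made up for the example):

```python
# Sketch of checkerboard dithering for fake transparency, as used on
# hardware like the Saturn that couldn't blend textured polygons.

def dither_transparency(background, overlay):
    """Draw overlay over background on a checkerboard pixel mask."""
    h, w = len(background), len(background[0])
    out = [row[:] for row in background]
    for y in range(h):
        for x in range(w):
            if (x + y) % 2 == 0:  # checkerboard: every other pixel
                out[y][x] = overlay[y][x]
    return out

bg = [[0.0] * 4 for _ in range(4)]  # black background
fx = [[1.0] * 4 for _ in range(4)]  # white "transparent" effect

mixed = dither_transparency(bg, fx)
# On a sharp LCD this reads as a visible mesh; a CRT's softness averages
# adjacent pixels toward 0.5, which looks like 50% transparency.
```

On a modern pixel-sharp display the mask stays visible, which is exactly why these effects break on flat panels but hold up on CRTs.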
But regardless, AA will probably be phased out eventually; it's just a tool to mitigate the growing pains of new display technology.
DLAA comes to mind
Interesting take. Do you think that natural image softening would come back in newer technologies?
I’m not that guy, but I don’t think so. The trend will likely be that we get to the point where we render and display in such a high resolution that you can’t even see pixels anymore. We’re getting there already with smaller 4k displays where turning on AA doesn’t have an appreciable difference in 4k native rendering.
I agree with this. Outside of some media that may release with special effects designed to mimic the softer image of a CRT, I think display technology will just progress to the point where nothing uses AA at all, because the resolution is too high to tell the difference. I mean, it's already like that with 4k TVs: you sit far enough away that you usually can't tell the difference between 4k and 1080p.