cross-posted from: https://lemmy.world/post/11840660

TAA is a crucial tool for developers - but is the impact on image quality too great?

For better or worse, temporal anti-aliasing - or TAA - has become a defining element of image quality in today’s games. Is it a blessing, a curse, or both? Whichever way you slice it, it’s here to stay - so what is it, why do so many games use it, and what’s with all the blur? And before TAA existed at all, what methods of anti-aliasing were used, and why aren’t they used any more?

  • FluffyPotato@lemm.ee · 10 months ago

    The first things I always turn off are motion blur, anti-aliasing and ray tracing.

    Motion blur just makes it look like you’re drunk, anti-aliasing makes everything look like it’s smeared with Vaseline, and ray tracing tanks your FPS for not much added quality.

    • Encrypt-Keeper@lemmy.world · 10 months ago

      I don’t think I could stomach a game without AA. It’s on par with playing at an unstable 30fps; it’s just nauseating.

    • trailblazer911@lemmy.world · 10 months ago

      Try playing Forza without AA. Ray tracing tanks your performance, but it brings great visual enhancements; once you experience it, there’s no going back.

      • FluffyPotato@lemm.ee · 10 months ago

        I don’t really play racing games or Forza, so maybe it’s unique to Forza or racing in general, but every RPG, action, adventure, strategy, survival, shooter, and sim game I have played looks worse with AA, and ray tracing is not worth cutting your FPS in half for.

        • AngryMob@lemmy.one · 10 months ago

          You must not notice aliasing and shimmering, then? Most people find it very distracting to see everything flicker, shimmer, and stair-step with the slightest motion.

          And ray tracing really depends on the game, implementation, and hardware. Ray-traced global illumination alone fixes the classic video-game look that stems from rasterized lighting errors (light leaking, default ambient light, etc.). It is the future for high-quality games, even non-photorealistic ones. Its expense is offset by both reconstruction techniques and improved hardware. You won’t be able to avoid it forever, even if you want to.
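
          To put that in miniature: here is a toy sketch (not any engine’s actual code) of why a flat ambient term looks “gamey” next to sampled indirect light. The `trace_occlusion` callback is a hypothetical stand-in for a real ray cast into the scene.

          ```python
          import numpy as np

          rng = np.random.default_rng(0)

          def shade_raster(albedo, n, light_dir, ambient=0.2):
              # Classic rasterized shading: a direct Lambert term plus a flat
              # ambient constant. The constant is why fully enclosed corners
              # still glow ("default ambient light") and light seems to leak.
              return albedo * (max(np.dot(n, light_dir), 0.0) + ambient)

          def shade_gi(albedo, n, light_dir, trace_occlusion, samples=64):
              # Ray-traced GI sketch: replace the constant with averaged
              # visibility rays over the hemisphere around the normal, so
              # occluded corners actually darken.
              indirect = 0.0
              for _ in range(samples):
                  d = rng.normal(size=3)
                  d /= np.linalg.norm(d)
                  if np.dot(d, n) < 0.0:
                      d = -d                      # flip into the upper hemisphere
                  indirect += trace_occlusion(d)  # hypothetical: 1.0 if the ray escapes
              return albedo * (max(np.dot(n, light_dir), 0.0)
                               + 0.2 * indirect / samples)
          ```

          A point in the open gets nearly the full indirect contribution; a point in a closed corner gets almost none, which is exactly what a flat constant cannot express.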

          • optissima@lemmy.world · 10 months ago

            It has gotten much better in the last 7 years. I will say that I usually try 1.5× or 2× my resolution if possible, which can be less taxing depending on the engine, as I’m always trying to eke out a little extra on my 970.
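
            For anyone unfamiliar: rendering above your display resolution and scaling back down is just supersampling. A minimal numpy sketch of the 2× case (real engines use better filters than this box average):

            ```python
            import numpy as np

            def downsample_2x(frame):
                # Average each 2x2 block of the supersampled frame into one
                # output pixel. Every output pixel is then backed by four
                # shaded samples, so edges get intermediate coverage values
                # instead of hard stair steps.
                h = frame.shape[0] // 2 * 2
                w = frame.shape[1] // 2 * 2
                f = frame[:h, :w].astype(np.float64)
                return (f[0::2, 0::2] + f[0::2, 1::2] +
                        f[1::2, 0::2] + f[1::2, 1::2]) / 4.0
            ```

            Non-integer factors like 1.5× need a proper resampling filter rather than plain block averaging, which is part of why cost and quality vary by engine.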

            • AngryMob@lemmy.one · 10 months ago

              2x on a 970? I struggled with my 970 at 1440p on low-medium settings until I got the 3080, and often had to drop scaling to 1080p. And that was on “last gen” titles; I can’t imagine still trying to limp that thing along nowadays, as much as I loved it.

              • optissima@lemmy.world · 10 months ago

                Depends on the game, but I don’t usually pick up current-gen titles for a while. Unless you count Switch emulation?

    • Alawami@lemmy.ml · 10 months ago

      > Motion blur just makes it look like you’re drunk

      Someone hasn’t tried motion blur since 2004 GTA

      • DumbAceDragon@sh.itjust.works · 10 months ago

        I always have film grain enabled. It provides some half-decent dithering that helps mask color banding, which is especially noticeable on my low-end monitor.
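
        That effect is easy to demonstrate. A small numpy sketch, assuming a [0, 1] image quantized to 32 levels and grain of roughly one quantization step (the standard dithering recipe):

        ```python
        import numpy as np

        def quantize(x, levels=32):
            # Snap a [0, 1] image to a small number of output levels; on a
            # smooth gradient these steps show up as visible bands.
            return np.round(x * (levels - 1)) / (levels - 1)

        # A smooth horizontal gradient, like a sky or a fog volume.
        gradient = np.tile(np.linspace(0.0, 1.0, 1024), (256, 1))

        banded = quantize(gradient)  # hard, clearly visible steps

        # Grain of about one quantization step (1/31 for 32 levels) breaks
        # the bands up into fine noise, which the eye tolerates far better
        # than hard edges.
        rng = np.random.default_rng(0)
        grain = rng.uniform(-0.5, 0.5, gradient.shape) / 31
        dithered = quantize(gradient + grain)
        ```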

    • M137@lemmy.world · 10 months ago

      It’s like you’ve used each of these things once, in some specific game where it was badly implemented, and decided that’s how it looks in all games.

      There is no objective “it looks like this”; every game does things slightly or very differently. I’m certain you’re either unusually blind to detail, have serious vision problems, or are just very good at convincing yourself of your own bad ideas.

      • FluffyPotato@lemm.ee · 10 months ago

        There are actually a few Unreal Engine games where you can’t disable AA in the settings. I have tried playing with it on, but I just end up disabling it in the ini files anyway, because it looks bad. I have not encountered AA that doesn’t make the game look blurry.
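
        For Unreal Engine 4 games, the usual ini tweak looks like the following, added to the game’s Engine.ini (per-game cvar support varies, so treat this as a starting point rather than a guarantee):

        ```ini
        [SystemSettings]
        ; 0 disables post-process AA entirely; 2 forces FXAA, 4 forces TAA (UE4)
        r.PostProcessAAQuality=0
        ```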

        I have never met anyone who doesn’t disable motion blur outright, so I didn’t think anyone would ever defend it.

  • RightHandOfIkaros@lemmy.world · 10 months ago

    Antialiasing is a byproduct of moving away from CRT display technology. The natural image softening in CRT tech is not replicated in LCD and LED displays.

    TAA is one of the better options, but at the end of the day it will be difficult to create a true AA solution that doesn’t produce artifacts without resorting to supersampling.
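
    For reference, the core of TAA is just an exponential blend of each new, sub-pixel-jittered frame into a history buffer. A heavily simplified sketch (real implementations also reproject the history along per-pixel motion vectors and clamp it against the current frame’s color neighborhood):

    ```python
    import numpy as np

    def taa_resolve(history, current, alpha=0.1):
        # Blend the newly rendered, jittered frame into the accumulated
        # history. For static pixels this converges toward a supersampled
        # result over time; failures of the reprojection and clamping
        # steps (omitted here) are what show up as ghosting and blur.
        return (1.0 - alpha) * history + alpha * current
    ```

    That one blend is where both the benefit and the complaints come from: still pixels sharpen over time, while anything the history can’t track smears.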

    • falsem@kbin.social · 10 months ago

      We used AA on our CRTs back in the day. Of course, we were all running at something like 1024x768, so it was needed a lot more. The higher your resolution, the less you need it.

      • RightHandOfIkaros@lemmy.world · 10 months ago

        Yes, that’s true. AA was helpful at what I call “medium resolutions”, the range between 480 and 768 vertical pixels. But CRTs still had a softer image simply as a byproduct of how the technology worked, and they worked better at lower resolutions like 240p (AFAIK, any signal with fewer than 480 lines of vertical resolution was automatically progressive scan).

        Game developers of the time exploited this, famously using dithering for transparency effects on platforms that didn’t fully support them, such as the SEGA Saturn (it supported transparent 2D sprites, but not transparent textured polygons like the PSX did). The softer image smoothed the dithering out, giving the appearance of a bigger available color palette and extra special effects. Flickering sprites every other field was also a common technique, relying on CRTs’ high image persistence. This is why games like Streets of Rage look awful on modern displays but display correctly on CRTs.
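
        To illustrate the dithering trick (a numpy sketch, not Saturn code): draw the “transparent” layer only on a checkerboard of pixels and let the CRT’s soft beam average it with the background.

        ```python
        import numpy as np

        def mesh_transparency(background, layer):
            # Write the overlay only where a checkerboard mask is set,
            # leaving the background visible in the gaps. A sharp LCD
            # resolves the checkerboard as an obvious mesh; a CRT's soft
            # spot blurs adjacent pixels together, so the eye sees an even
            # ~50% blend the hardware couldn't otherwise produce.
            h, w = background.shape[:2]
            yy, xx = np.mgrid[0:h, 0:w]
            mask = (xx + yy) % 2 == 0
            out = background.copy()
            out[mask] = layer[mask]
            return out
        ```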

        But regardless, AA will probably be phased out eventually; it’s just a tool for mitigating the growing pains of new display technology.

      • Kolgeirr@sh.itjust.works · 10 months ago

        I’m not that guy, but I don’t think so. The trend will likely be that we reach the point where we render and display at such a high resolution that you can’t see pixels anymore. We’re getting there already with smaller 4K displays, where turning on AA doesn’t make an appreciable difference over native 4K rendering.

        • RightHandOfIkaros@lemmy.world · 10 months ago (edited)

          I agree with this. Outside of some media that may release with special effects designed to mimic the softer image of a CRT, I think display technology will just progress to the point where nothing uses AA at all, because the resolution is too high to tell the difference. I mean, it’s already like that with 4K TVs: you sit far enough away that you usually can’t tell the difference between 4K and 1080p.
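
          That point can be put in rough numbers. 20/20 acuity is commonly taken as about 60 pixels per degree; a quick sketch under that assumption (the figures here are illustrative, not from the thread):

          ```python
          import math

          def pixels_per_degree(diagonal_in, res_w, res_h, distance_m):
              # Angular pixel density of a flat panel with square pixels,
              # viewed head-on.
              width_m = diagonal_in * 0.0254 * res_w / math.hypot(res_w, res_h)
              pixel_m = width_m / res_w
              pixel_deg = 2 * math.degrees(math.atan(pixel_m / (2 * distance_m)))
              return 1.0 / pixel_deg

          # ~60 px/deg is the usual 20/20 threshold; past it, extra
          # resolution (and leftover aliasing) becomes hard to see.
          print(pixels_per_degree(55, 1920, 1080, 3.0))  # ~83 px/deg
          print(pixels_per_degree(55, 3840, 2160, 3.0))  # ~165 px/deg
          ```

          By that measure, a 55-inch 1080p panel already exceeds the acuity threshold at around 3 m, which is why the two resolutions blur together at couch distance.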

  • Evil_Shrubbery@lemm.ee · 10 months ago (edited)

    > At one point

    I was there, 3000 years ago, when the first consumer AA was almost usable on my Voodoo 1.

  • umbrella@lemmy.ml · 10 months ago

    TAA just makes beautiful graphics look crappy and blurry.

    I’d rather not have AA at all.