• 2 Posts
• 475 Comments
Joined 1 year ago
Cake day: July 11th, 2023

  • Ech@lemm.ee to Technology@lemmy.world · *Permanently Deleted* · +132 / −1 · 13 days ago

    Instagram has its fair share of blame for the trend, but I don’t think they were the progenitor, as it were. Snapchat was far more heavy-handed with face-altering filters from the get-go, as I remember it. Instagram mostly stuck to the “old-school” sepia-tone and black-and-white type filters until that trend picked up.

  • > The game is rendered at a lower resolution, which saves a lot of resources.
    >
    > Then dedicated AI cores or even special AI scaler chips get used to upscale the image back to the requested resolution.

    I get that much. Or at least, I get that’s the intention.

    > This is a fixed cost and can be done with little power since the components are designed to do this task.

    This is the part I struggle to believe/understand. I’m roughly aware of how resource-intensive upscaling is on locally hosted models. The tech/resources needed to do that at 4k+ in real time (120+ fps) seem at least as expensive as just rendering at that resolution in the first place, if not more so. Are these “scaler chips” really that much more advanced/efficient? (Rough pixel-count math below.)

    Further questions aside, I appreciate the explanation. Thanks!
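
    For what it’s worth, the savings argument is mostly pixel-count arithmetic. Below is a minimal sketch of that reasoning; the internal resolution, the per-pixel shading cost, and the fixed per-frame upscale cost are all made-up illustrative numbers, not figures for any real GPU or scaler chip.

```python
# Back-of-envelope comparison: native 4K rendering vs. rendering at a lower
# internal resolution and letting a dedicated upscaler fill in the rest.
# All cost numbers are made up for illustration; they are not measurements.

NATIVE = (3840, 2160)     # 4K output resolution
INTERNAL = (2560, 1440)   # hypothetical internal render resolution ("quality" preset)

SHADE_COST_PER_PIXEL = 1.0       # arbitrary units of GPU work per shaded pixel
UPSCALE_COST_PER_FRAME = 1.5e6   # assumed fixed per-frame cost of the upscaling hardware


def pixel_count(resolution):
    """Total pixels for a (width, height) pair."""
    width, height = resolution
    return width * height


native_cost = pixel_count(NATIVE) * SHADE_COST_PER_PIXEL
upscaled_cost = pixel_count(INTERNAL) * SHADE_COST_PER_PIXEL + UPSCALE_COST_PER_FRAME

print(f"Native 4K pixels shaded:      {pixel_count(NATIVE):,}")    # 8,294,400
print(f"Internal 1440p pixels shaded: {pixel_count(INTERNAL):,}")  # 3,686,400
print(f"Native cost:   {native_cost:,.0f}")
print(f"Upscaled cost: {upscaled_cost:,.0f}")
print(f"Upscaled path costs {upscaled_cost / native_cost:.2f}x the native path")
```

    The takeaway, under these assumptions, is that the upscaler’s cost is fixed per frame regardless of scene complexity, so the heavier the per-pixel shading work, the bigger the win from shading roughly 2.25x fewer pixels.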