Frame Generation is a tool that can artificially double the framerate of your game by creating fake frames to make it appear smoother.
I posted the link to the SteamDeckHQ article that will do a better job at explaining what it is and what it does, but the official release video is here: https://www.youtube.com/watch?v=dmnEOZg7bKE
You need to purchase the Lossless Scaling tool on Steam, and install Decky Loader* on your Steam Deck. The announcement video explains the process pretty well.
In practice, this is a great way to go from e.g. 45fps to 60, or 60 to 90 if you own an OLED model. And since some games allow you to cap the framerate, it also has the nice side-effects of reducing fan noise and battery use. If you really don't care about input latency (e.g. turn-based games), you can multiply the framerate up to 4x.
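To make the multiplier arithmetic concrete, here's a rough sketch in Python (my own function and numbers, nothing from the plugin itself); the fractional 45-to-60 case assumes an adaptive mode that targets the display refresh rather than a fixed integer multiple:

```python
# Rough sketch: what multiplier takes a capped base framerate to the
# display refresh. Function name and structure are mine, not LSFG's.
def framegen_plan(base_fps: float, refresh_hz: float) -> str:
    multiplier = refresh_hz / base_fps
    if multiplier <= 1:
        return f"{base_fps}fps already meets {refresh_hz}Hz; no generation needed"
    # Fixed modes are integer (2x/3x/4x); a fractional result assumes an
    # adaptive mode that interpolates to the refresh rate directly.
    return f"cap at {base_fps}fps, generate {multiplier:.2f}x to hit {refresh_hz}Hz"

print(framegen_plan(30, 60))  # LCD Deck at 60Hz: 2.00x
print(framegen_plan(45, 90))  # OLED Deck at 90Hz: 2.00x
print(framegen_plan(45, 60))  # fractional 1.33x (adaptive-mode assumption)
```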
I've tested it on these games, from a base 40-45fps:
The Division 2: runs great, though input latency can be noticeable when using a sniper rifle. Some visual artifacts at the edges of the screen when running.
Diablo IV (through battle.net): visual artifacts around the character and small UI elements.
Deep Rock Galactic Survivor: allowed me to disable image scaling, no noticeable visual artifacts. Overall excellent.
ARC Raiders: adds a terrible input latency, even the menus are barely usable.
Path of Exile 2 (from a capped 30 to 60): works with DirectX 12, seems buggy with Vulkan. Significant input latency.
It's not a miracle solution, but it's technically impressive with the right games.
*: Decky Loader is only usable with Steam Deck stable updates, not beta.
Having followed this plugin from the beginning, I can offer a short list of games that to my eye work especially well with it.
All of these should be set to run at a locked 30fps, which you then 2x with the plugin to reach 60. Use the Steam menu to disable the frame limiter and set your refresh rate to match. If you see a lot of shimmering, try upping the refresh rate to 90; it won't look quite as smooth, but still smoother than a locked 30. If you're having trouble reaching 60, try setting a 1600MHz limit on your GPU. Lowering Flow Scale in the plugin can get you a little more performance too.
Slightly shrinking the deadzones on your sticks might help as well if you're particularly bothered by latency. It doesn't do anything about the latency itself, but the faster stick response might help you adjust.
IMO, 3x and 4x look and play like shit with everything, I don't recommend those at all.
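If it helps, here's that starting point condensed into one spot (values from the text above; the key names are descriptive labels of my own, not the plugin's or Steam's actual setting identifiers):

```python
# Baseline config from the comment above, as a plain dict for easy eyeballing.
# Key names are my own labels, not real plugin/Steam setting keys.
baseline = {
    "in_game_fps_cap": 30,        # lock 30fps, ideally in the game itself
    "plugin_multiplier": 2,       # 2x to 60; 3x/4x not recommended
    "steam_frame_limit": None,    # disable the Steam menu frame limiter
    "refresh_rate_hz": 60,        # bump to 90 if shimmering bothers you
    "gpu_clock_limit_mhz": 1600,  # only if you're struggling to hold 60
    "flow_scale": "lowered",      # trades a little quality for headroom
}
for setting, value in baseline.items():
    print(f"{setting:>22}: {value}")
```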
Here's what I got:
Armored Core VI
Monster Hunter World
Monster Hunter Rise
Mount & Blade 2: Bannerlord
Diablo 2 Resurrected
Mechwarrior 5: Mercenaries
STALKER Anomaly
Cyberpunk 2077
These games in particular I think look great with the plugin. By targeting a locked 30fps, you can increase graphical fidelity/resolution a bit and come out with a smooth experience that looks nicer. Where possible, enforce the fps limit in the game itself instead of through the plugin or the Steam menu; that helps keep latency manageable. There will always be a degree of shimmering, but with things set up the right way it's so minimal it shouldn't bother you, if you notice it at all. Occasionally you might catch some weirdness when you swing the camera around quickly, but it's so quick you probably won't care. You can use the plugin on pretty much everything, including non-Steam games and emulators. Emulators can be a bit tricky and might take some extra fiddling.
You can also combine it with Decky Framegen, a plugin that swaps OptiScaler in place of DLSS (I think that's how to sum it up). I wouldn't recommend layering frame generation, because latency will go wild most of the time, but OptiScaler gives you more options for upscaling, with (usually) better results than what's available in the game.
Overall I've been super impressed with how good the plugin is. How it runs and whether it looks OK depends a bit on the game, but for the most part it legit feels like free performance with no major downside. If you're already accustomed to playing games at 30 and 40 fps, you're probably gonna be a-ok.
If you're already accustomed to playing games at 30 and 40 fps, you're probably gonna be a-ok.
I think therein lies the rub. After much trial and error, I've learned that for any latency-sensitive game, frame generation feels significantly worse to me below 45fps. I forget the exact switching point, but I think it's when the frame time goes above 22.5ms that it starts to feel noticeably 'mushy'.
Probably because FPS latency is multiplicative in a way input latency is not.
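For reference, the frame-time arithmetic behind that threshold (simple math on my part, nothing from the plugin):

```python
# frame time in milliseconds = 1000 / fps
for fps in (30, 40, 45, 60):
    print(f"{fps:>3} fps -> {1000 / fps:5.1f} ms per frame")
# 45 fps is ~22.2 ms, so a "mushy above 22.5 ms" threshold lines up
# with "worse below 45fps".
```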
I wish I had your eyes. I have been gaming for decades and 40fps is about the spot where I can't tell if anything is better higher than that.
Then again, that's been my standard for years, but lately I'm noticing I have other old-people issues... so maybe I could go down to 30fps. :(
It's less the eyes and more the latency. For either, there are definitely diminishing returns. I can definitely see the difference even in desktop usage between 60 and 75, though, especially at higher native resolutions.
Part of it is that humans don't really see in terms of fps. There's some great research on that out there.
But anyhow, my reaction time is typically less than 400ms, and I feel my input latency budget is on the order of 100ms. At 40fps, frame generation consumes about half of that, leaving only 50ms for stuff like the wireless controller and the display. At 30, there's only 20ms left. I remember hearing at some point that a wireless controller adds 35ms of latency, though I'm not sure how true that is.
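Writing that budget out (the 100ms budget and 35ms controller figure are from the comment above; the assumption that frame generation costs roughly two frames of delay is mine, and it reproduces the 40fps figure but leaves ~33ms at 30fps rather than 20, so the real pipeline presumably adds extra overhead):

```python
BUDGET_MS = 100      # subjective input-latency budget from the comment
CONTROLLER_MS = 35   # quoted wireless-controller latency (unverified)

for fps in (40, 30):
    frame_ms = 1000 / fps
    framegen_ms = 2 * frame_ms  # assumption: hold one frame, display one frame late
    remaining = BUDGET_MS - framegen_ms
    verdict = "tight" if remaining < CONTROLLER_MS else "ok"
    print(f"{fps} fps: framegen ~{framegen_ms:.0f} ms, "
          f"{remaining:.0f} ms left for controller/display ({verdict})")
```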
Your reply had me thinking to put a number to it, so I did.
30fps has always been my baseline, coming from a history of portables and crappy hardware. I didn't get into PC gaming in a dedicated way until about halfway through college, and started on an extraordinarily crappy laptop. I played the first STALKER game on that machine, at like 24 fps with a trackpad, all the way to the end. It would not surprise me if that permanently altered my brain, because it felt like it did at the time. I just had to see what was in that fucking power plant, I can't explain it, so what if the grey matter gets dented.
The plugin impresses me most in Armored Core VI and Monster Hunter World; in those two it feels the most like free performance, and that being a thing at all just dazzles me, if I'm honest. The feeling of "I just downloaded more RAM and it worked" gets me a bit.
What about power draw? Some people who choose to lock the FPS to 30 or 40 (myself included) typically aim to increase battery life more so than to cap the framerate for stability.
I didn't notice a particular difference, but I also wasn't trying to measure. My goal was to save folks time messing with all the options in the plugin and get them straight to seeing what it can do, so they can evaluate quickly. There are a lot of toggles and options that simply don't matter if it's intolerable in the configuration I wrote out.
That's typically my use case. I would love to have plugged/unplugged power profiles per game. Even better would be to switch the game configs themselves when switching profiles (presuming the game is not running).
That last one would take a serious crowdsourcing database, though. I had done that for Fortnite, automatically subbing out config files based on which monitor I was connected to: 1080p/60 at mega-ultra, ultrawide 1080p/75 at ultra, and 4K/60 at ultra with a few key settings lowered.
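For anyone curious, a minimal sketch of that kind of monitor-based config swap (Windows-only; the preset file names and config path are hypothetical placeholders, though GetSystemMetrics is the real Win32 call):

```python
# Swap a game config based on the primary display's resolution.
# Paths and filenames below are hypothetical placeholders.
import ctypes
import os
import shutil

PRESETS = {
    (1920, 1080): "GameUserSettings.1080p60_megaultra.ini",
    (2560, 1080): "GameUserSettings.uw1080p75_ultra.ini",
    (3840, 2160): "GameUserSettings.4k60_ultra.ini",
}
CONFIG_DIR = r"C:\path\to\FortniteGame\Saved\Config\WindowsClient"  # hypothetical

width = ctypes.windll.user32.GetSystemMetrics(0)   # SM_CXSCREEN
height = ctypes.windll.user32.GetSystemMetrics(1)  # SM_CYSCREEN
preset = PRESETS.get((width, height))
if preset:
    shutil.copy(os.path.join(CONFIG_DIR, preset),
                os.path.join(CONFIG_DIR, "GameUserSettings.ini"))
    print(f"Applied {preset} for {width}x{height}")
else:
    print(f"No preset for {width}x{height}; leaving config alone")
```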
Right now I'm playing V Rising, and my game config is drastically different based on whether I'm trying to max pretty or battery life.
STALKER Anomaly
Either something improved massively or this must be highly subjective. I tried Lossless Scaling about a year and a half ago on PC with STALKER Anomaly: GAMMA to get a smoother 60+ fps (I normally get about 50), and I don't even think I tried framegen at that time, just FSR scaling to keep the latency down. Yet the input lag made it unplayable. I felt like I was playing Skyrim with triple buffering on a controller, not a fast first-person shooter with a mouse.
Generally I think the manual recommends using scaling/framegen to get from 60 fps to 120+ on fast displays, because anything that gets you from 30 to 60 brings too much input lag. That's not an issue with slow games, but I don't think it works for first-person shooters.
My emphasis is on keeping a stable image, because the plugin doesn't take to every game in the same way.
With Elden Ring, for example, the shimmering is just intolerable to me. It feels OK to play, but it's distracting to look at because of the camera's position relative to what's on screen - you see stuff shimmering at the edges pretty much all the time while traveling, and the player character will flit out of existence as you turn the camera side to side. In Anomaly such distortion is much less present, so if you don't have an issue with the input latency, that one is more of a success story. Hopefully that makes some sense.
I can confirm Baldur's Gate 3 on a dual-GPU LS setup at high settings for 3440x1440 goes from 10 FPS on a 2070 alone to 45, doubled to 90 FPS, with the following settings: ~75% scale on the 2070, 2x frame generation, preferred/plugged-in GPU set to the 1050 Ti, upscaled to native resolution...and it looks good.
Dumb question... is that GPU setting just a config preset in BG3, or are you using an eGPU with your Steam Deck?
This setup might be cool for using my deck docked with the TV.
Not a dumb question, and neither! I'm running dual GPU with Lossless Scaling on a 2019 mid-tier pre-built Alienware PC.
My motherboard has an extra PCIe 4.0 x16 (physical) x8 (electrical) slot, and I bought a GTX 1050 Ti to add to the RTX 2070. Lossless Scaling tells BG3 to render frames on the 2070 at ~75% of the desired resolution, then passes those frames to the 1050 Ti. This second GPU (only 35% of the 2070's Geekbench 4 score) then does two things: it interpolates a frame between each pair of frames from the 2070, doubling the apparent frame rate, and it upscales the frames to the full desired resolution. Neither of these requires much VRAM or compute, so even the much older 1050 Ti can manage it. I then plug my monitor into the 1050 Ti, set the Lossless Scaling "Preferred GPU" to the same card, and it runs beautifully.
It appears that telling the 2070 to render at 75% of my ideal resolution improved the frame rate from 10 to 45, and then the 1050 Ti upscaled and doubled that to 90 FPS with steady frame times.
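The numbers written out (assuming the 75% scale applies per axis; if Lossless Scaling scales by area instead, the render resolution would differ):

```python
# Dual-GPU pipeline math from the setup above.
native = (3440, 1440)
scale = 0.75                                  # per-axis assumption
render = (round(native[0] * scale), round(native[1] * scale))  # 2580 x 1080
print(f"RTX 2070: renders {render[0]}x{render[1]} -> ~45 fps (vs 10 at native)")
print(f"GTX 1050 Ti: interpolates 2x and upscales to {native[0]}x{native[1]} -> 90 fps")
```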
I have an eGPU enclosure that I wanted to test this with, but regrettably I don't have any motherboards with Thunderbolt 3 or 4 right now. I thought I did, but I was mistaken; I was hoping to mess around with this. I bet if you had an eGPU that could work with your Steam Deck, you could set the eGPU as the "preferred GPU" and plug the TV into it. I don't know of any USB-C eGPUs without Thunderbolt, though, and that's pretty important. Give it a shot, what's the harm?
Edit for clarity: You can definitely run LS on a single GPU, but there's a small performance hit to the frame rate. Based on estimates with a different game, I'd expect 10->40 doubled to 80 with just my 2070.
This is fascinating! I didn't know this type of setup was even possible. I haven't heard about dual-GPU setups in any context beyond two of the same model (crossfire, etc.), which seemed to be less common and not very beneficial as years went on.
I'll have to look into this when my current GPU eventually ages enough to struggle with playing new games.
Yep! SLI and CrossFire scale sublinearly because of the overhead of coordinating the cards, and have all those asterisks attached. Lossless Scaling is $7 and can even run on an integrated GPU or similar, though only newer ones are recommended. It's honestly pretty miraculous software, and the fact that it's only $7 to nearly double frame rate (on a single GPU) or more than double it (with a mixed setup) is pretty bonkers.
A YouTube link I found helpful: https://youtu.be/JyfKYU_mTLA
A community spreadsheet with Lossless Scaling info from various sources and GPUs: https://docs.google.com/spreadsheets/d/17MIWgCOcvIbezflIzTVX0yfMiPA_nQtHroeXB1eXEfI/edit?usp=drivesdk
So, does this end up looking more or less like what TV interpolation does? If so, I hope this doesn't get baked in as part of the default experience that I'll have to find registry edits in games (or equivalent) to disable.
I literally can't watch TVs with it on.
I generally agree for shows, but it's a lifesaver with some video games. My family has used interpolation to great effect with Pokémon games due to their poor performance.