You know, in every showcase of bleeding-edge graphics tech, I think it's really unfortunate that 90% of the games at the vanguard of these rollouts are just uninteresting. I don't really see myself ever playing Forspoken or Immortals of Aveum, and I feel like this happens a lot. Whenever something new like DLSS, RT, or FSR comes out, it's exciting! I get to researching whether I should upgrade my computer parts to stuff that can actually support it. I get kinda caught up in it, before realizing that at best there are like two games I have any interest in playing that support the tech, and at that point it's back to square one: waiting for the tech to trickle down to just being background performance optimizations years and years down the line. (In short, still holding onto my Vega 64 lol)
That's fair. The unfortunate part of some new technologies is that they can't always be retrofitted to the existing/older generation, or it's not in the company's interest to do so. As you mentioned, adoption trickles down in those instances.
One advantage Nvidia has is the money to drop on studios who sign with them to use their Hot New Tech. Nvidia has been giving studios grants, reportedly worth up to $1 million (or more, by some accounts), in the form of technical expertise and perhaps the hardware necessary to implement the technology. They offer education and research grants as well.
But what Nvidia is really doing is buying adoption, and with it either developer loyalty or developer entrenchment in that technology now that they've learned it. That doesn't help customers much at all, and it doesn't speed up adoption among existing customers, only new ones.
It's not a new strategy, and I think one of the only ways around it is to make the technology FOSS and to have enough big players competing that developers aren't constrained to supporting just one of them. AMD has done some great work on that front: making FreeSync royalty-free, donating Mantle as the foundation for Vulkan, open-sourcing ROCm, and supporting OpenCL and other open initiatives. But for the FidelityFX Super Resolution upscaling tech to succeed, it needs to be adopted by studios on their own initiative, because I don't think AMD is working with small studios to the same degree. Then those small studios need to be financially successful enough to make more games using that FOSS tech, but perhaps not so big that they sell out to a bigger studio/publisher that works for the other team. It's a big ask, and it takes a long time to pan out.
AMD initially had this same challenge in getting FreeSync adopted as well. Companies were split between throwing their support behind Nvidia or AMD, and often not both. FreeSync was a much lower-cost implementation, and some of that savings was passed on to the consumer when they purchased a monitor, but AMD wasn't necessarily making money on each sale. I've heard that Nvidia was making something like $100 on each G-Sync monitor. (Since 2019 Nvidia has offered a "G-Sync Compatible" certification that lets their cards drive FreeSync monitors, but of course it only works with 10-series GPUs and later.)
Isn't that just all tech? It's a slow, expensive crawl until it's everywhere and you don't even notice it.