46 Comments
Makaveli - Thursday, March 17, 2022 - link
Installing these drivers now.
linuxgeex - Friday, March 18, 2022 - link
If you have the capability, I'd appreciate it if you could report the input lag difference between Native and FSR 2.0. I spotted a 1-frame lag for FSR 2.0 in the YouTube video. That may just be an editing error. It would be good to know.
cigar3tte - Thursday, March 17, 2022 - link
What lineup of cards will support FSR 2.0? And does it require support from games as well?
NextGen_Gamer - Thursday, March 17, 2022 - link
It explicitly requires support from each individual game, as the article says. For cards, since AMD is not relying on ML hardware (which none of their cards support), a good guess would be full compatibility with RDNA 1 & 2, and most likely Vega as well.
Makaveli - Thursday, March 17, 2022 - link
It's also possible this is RDNA 1/2/3 only, since Vega is going to be 5 years old this August. If anyone is still on Vega or Polaris, with prices finally starting to drop and RDNA 3 coming out this year (which will also force prices down), it's time to upgrade.
vlad42 - Thursday, March 17, 2022 - link
But you are ignoring the fact that Vega was used as the iGPU in all Zen APUs up until Rembrandt/Van Gogh. These are prime candidates for an FSR 2.0 performance uplift, especially in laptops where you cannot simply add a GPU.
Oxford Guy - Thursday, March 17, 2022 - link
OMG. Five whole years? Someone call Methuselah for a reaction vid.
mode_13h - Monday, March 28, 2022 - link
5 years *would* be a long time for GPUs, if not for the sorry state of the GPU market for much of that time.
emn13 - Thursday, March 17, 2022 - link
It's conceivable they add Vega support not because Vega is particularly relevant, but because they used Vega-based tech in a bunch of still far more recent APUs, and this _is_ the kind of tech an APU could benefit from. But who knows; they might not.
Spunjji - Monday, March 21, 2022 - link
They've got RDNA 2 APUs coming up for sale, though. They might decide to go for the upgrade incentive over backwards compatibility.

I hope they don't do things that way, but I can't use RSR on an RDNA notebook dGPU because of the Vega iGPU, so they do seem to be moving in that direction.
Zoolook - Thursday, April 7, 2022 - link
Nvidia usually pulls shit like that, AMD usually doesn't, but we'll see.
emn13 - Thursday, March 17, 2022 - link
Since this is a TAA-esque tech, just like DLSS, the issue may well be software support rather than hardware. Even DLSS probably could have been ported to other hardware; I suspect NVIDIA was trying to differentiate their flagship products rather than being intrinsically limited. After all, AI inference algorithms run just fine on all kinds of GPUs, not just DLSS-supporting ones.
Yojimbo - Thursday, March 17, 2022 - link
DLSS 1.9 was the 2.0 DLSS running without the tensor cores. It was not nearly as good as the 2.0 that came out soon afterwards. I don't know how much better it could have gotten if NVIDIA had continued to develop it, but the fact that tensor cores are much better for deep learning applications is not a marketing gimmick; it's a reality. With the tensor cores, the GPU can run inference on more complex networks within the allotted amount of time between frames, and so get more accurate results and thus higher quality from a lower input resolution, resulting in higher performance as well.
mode_13h - Monday, March 28, 2022 - link
Tensor cores are also more power-efficient, assuming the complexity of your network is constant. And burning lots of power on DLSS could trigger/worsen clock throttling.
rmfx - Thursday, March 17, 2022 - link
The improvement is mind-blowing. V1 was a total joke.
V2 (if as good as shown here) is a true DLSS contender that will make me buy AMD.
Makaveli - Thursday, March 17, 2022 - link
Agreed. I think RDNA 3 is going to surprise a lot of people.
haukionkannel - Thursday, March 17, 2022 - link
It definitely looks better now. As it should! Native resolution is still king, but this looks very usable, just like recent DLSS versions.

Do you remember how bad first-generation DLSS was? It was really bad. It looks like this is AMD's DLSS 2.0 moment, when things start to click into place.
Makaveli - Thursday, March 17, 2022 - link
You are correct, and I already tried to remind the NV fanboys that they were comparing DLSS 2.x to FSR 1.0 and to give AMD time to work on it. Now that time is here.
Wereweeb - Thursday, March 17, 2022 - link
V1 had no artifacts, since it didn't use a temporal algorithm.
Zizy - Thursday, March 17, 2022 - link
It did have shimmering. Avoided ghosting though.
Spunjji - Monday, March 21, 2022 - link
V1 was/is pretty decent at high resolutions and quality modes, which is useful for some classes of GPU. Sadly it's lacking at 1080p where it's arguably most needed. It does also have the advantage of being able to be applied globally, though, like on Steam Deck or with RSR.
Oxford Guy - Thursday, March 17, 2022 - link
Good thing AMD dropped driver support for Fiji in July and gave us the downgrade of the PCIe x4 6500 XT as the obviously superior option to having things like working fan control in Windows 10.
vlad42 - Friday, March 18, 2022 - link
AMD moved the 7-year-old Fiji GPUs to the legacy driver stack. That means they will still get updates on an as-needed basis. The GCN gen 3 architecture that Fiji was based on, first released in 2014, has not been sold or particularly relevant in a long time (it might still be in the embedded space, but that is a completely different situation). The last time GCN gen 3 was used in a consumer product was with the Bristol Ridge & Stoney Ridge APUs in 2016!

The 6500 XT is perfectly fine for modern PCIe 4.0-enabled systems without integrated graphics that need a cheap GPU for basic office work, web browsing, coding, light gaming (think easy-to-run indie games and older games), etc. While I would like the price to be lower, with the current market that just is not viable.
Oxford Guy - Monday, March 21, 2022 - link
'The 6500XT is a perfectly fine for modern PCIe 4.0 enabled systems'

'Modern' is the equivalent of 'natural' on food labels.
'That means they will still get updates on an as-needed basis.'
What it actually means is that AMD won't bother to update the drivers when updates are needed. Fan control is already broken in Windows 10 and some game-breaking incompatibilities have been known for quite some time. Windows 11 also hit the market shortly after AMD dropped Fiji support.
Bottom line is that AMD prematurely discontinued driver support for Fiji and none of the excuses wash.
Oxford Guy - Monday, March 21, 2022 - link
'Modern' PCI-e 4x!

'Modern' 4 GB of VRAM!
Many 'modern' excuses to go with this, given the discontinuation of Fiji driver support in the midst of history's worst GPU shortage.
Oxford Guy - Monday, March 21, 2022 - link
Fiji GPUs were being sold in 2017 and AMD dropped Fiji driver support halfway through 2021.

Pathetically short support period, especially in history's worst GPU shortage + the high cost of Fiji cards + releasing an inferior product in 2022 + dropping driver support shortly before Windows 11.
Just pathetic, including the unwillingness to hold AMD accountable.
Spunjji - Monday, March 21, 2022 - link
I don't know why you insist on comparing Fiji to the 6500 XT. It's a deeply silly comparison.

What is "holding AMD accountable" meant to be? Fuming in a comment section? Come off it. You just thrashed and hissed at someone making the valid point that the 6500 XT is a flawed product that nonetheless has a place in the current absurd GPU market. You're not here to hold anyone accountable, you're just having a rant.
Oxford Guy - Wednesday, March 23, 2022 - link
'It's a deeply silly comparison.'

Stopped reading there. No point in going further when the post is clearly illogical.
mode_13h - Monday, March 28, 2022 - link
Ooh. Certainly seems like *someone* doesn't like being held to account!
Abort-Retry-Fail - Saturday, April 23, 2022 - link
Oxie Strikes Again (rolling eyes)
I got a TR Pro workstation with a 'Fiji' dual-BIOS Sapphire R9 Fury Tri-X OC, purchased new for $250 years ago. I am running "Radeon Pro Software for Enterprise" doing 'content creation' primarily in Sony Vegas 18. I replaced the original GPU (Quadro P2200 5GB) as it cannot keep up in general compute with that 'ancient' Pirate Islands Franken-Card.
It says more about you when you always dump on AMD ... just sayin. ...
Spunjji - Monday, March 21, 2022 - link
In spite of its deficiencies it is objectively one of the best value-for-money GPUs out there right now - if you have the right system to put it in. RSR will probably help it out a fair bit, too.

Bit weird to talk of the 6500 XT right after Fiji though. Not exactly comparable classes of GPU...
mode_13h - Monday, March 28, 2022 - link
> Not exactly comparable classes of GPU...

And not even comparably priced! The 6500 XT has been pretty consistently going for ~$250 since launch. The cheapest Fiji (aka Fury) card launched at about double the price.
brucethemoose - Thursday, March 17, 2022 - link
"There are multiple temporal-style methods that don’t rely on ML"

Not for upscaling as far as I know... for AA, perhaps, but the benefit of that so far seems to be questionable as well.
"So at this point I would be more surprised if AMD didn’t use one."
I have mixed feelings on this as well. Any kind of shader-based neural net, even a simple/primitive one, just seems like it would have a huge performance hit, especially on weaker GPUs that really need it.
I hope I'm wrong though, as a really fast, shader based network that uses multiple frames + motion vectors would be cool.
The good results in that screenshot are not impossible to get without a network, especially if the native output was already blurred by the AA.
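[Editor's note: the non-ML temporal approach discussed above — multiple frames plus motion vectors, no neural network — can be sketched roughly as below. This is a simplified illustration of the general TAA-style accumulation idea, not AMD's or anyone's actual algorithm; the function name, blend factor, and nearest-neighbour reprojection are all made up for the example.]

```python
import numpy as np

def temporal_accumulate(current, history, motion, alpha=0.1):
    """One step of a (very) simplified temporal accumulation pass.

    current: (H, W) current frame, already upsampled to output resolution
    history: (H, W) accumulated result from previous frames
    motion:  (H, W, 2) per-pixel motion vectors in output pixels (dy, dx)
    alpha:   blend weight for the new frame (low alpha = heavy history reuse)
    """
    h, w = current.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Reproject: fetch each pixel's history from where it was last frame.
    src_y = np.clip((ys - motion[..., 0]).round().astype(int), 0, h - 1)
    src_x = np.clip((xs - motion[..., 1]).round().astype(int), 0, w - 1)
    reprojected = history[src_y, src_x]
    # Exponential blend of reprojected history with the new frame.
    return alpha * current + (1.0 - alpha) * reprojected
```

A real implementation would also rectify the history (e.g. clamp it to the neighbourhood of the current frame) to suppress the ghosting that commenters above associate with temporal techniques.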
Yojimbo - Thursday, March 17, 2022 - link
From my recollection of the things said in the run-up to the release of FSR (1.0), I think FSR 2.0 is what AMD hoped it could release back then, but they just didn't get it ready in time. Despite what media sites claimed when the original FSR was released, AMD did indicate that FSR was a DLSS competitor, both by the things they said in interviews and by the fact that they never pushed back against the rampant media reports treating FSR as a DLSS competitor. Of course their PR people were aware of the impression the upcoming release had made out in the wild, and they allowed that impression to persist even though it was completely inaccurate for FSR (1.0).

Regarding FSR 2.0, it's not going to be as easy to add to games as FSR 1.0. We'll have to see how much support it gets, in addition to how good it is, of course.
Khanan - Friday, March 18, 2022 - link
It will probably be as hard to add to games as DLSS, but with universal GPU support aside from ancient ones, it should be well adopted.
Yojimbo - Friday, March 18, 2022 - link
I'm not so sure. Will they add it just to support 10 series owners who will most likely soon upgrade? After that there will be a certain number of 16 series owners that won't upgrade in the next year. Thirdly you have 400, 500, 5000, and 6000 series owners, but all that adds up to about 6% of the Steam survey, less than the 2060 + 2060 SUPER share on Steam. (As a comparison, right now the 16 series accounts for between 15% and 16% of the survey, and many of those can be expected to upgrade in the next year.)

Also, I'd imagine that the 400 and 500 series cards, which account for over 2/3 (>4%) of that ~6%, are not good candidates for upscaling newer games. These technologies don't upscale to 1080p as well as they upscale to higher resolutions. I don't think 400 or 500 series cards are going to be upscaling new games to 4K. The 6000 series has had extremely small market penetration.
If they add FSR 2.0, it could end up being because it's from AMD and they don't want to give the impression that they are leaving AMD owners out in the cold. If they are already adding DLSS, it might not be that hard to also add FSR 2.0. But look at the actual market share they address with FSR 2.0: they don't address a whole lot by adding it on top of DLSS. It's likely they know that in the future AMD will have matrix engines on their GPUs and that AMD will support XeSS on those cards. I don't know what sort of shape XeSS is in or how much the big developers know about its shape, but supporting FSR 2.0 on top of DLSS seems like it addresses a currently small and very-soon-to-be-dwindling market.
Oxford Guy - Friday, March 18, 2022 - link
‘AMD and they don't want to give the impression that are leaving AMD owners out in the cold.’

AMD had the nerve to dump Fiji owners by the side of the road way back in July. It is safe to say that the current management isn’t very concerned about mindshare when it comes to support.
Of course, the tech press barely covered it then and has completely buried the story since, despite the glaring connections to the 6500 XT.
vlad42 - Friday, March 18, 2022 - link
Yeah, how dare they move a 7-year-old GPU to the legacy driver stack... \s
Oxford Guy - Monday, March 21, 2022 - link
How dare they release a card with worse performance whilst dropping driver support for a better one in the midst of history's worst GPU shortage.

If your car is older than 5 years, send it to the junkyard and buy an inferior new one. That's 'modern'.
Oxford Guy - Monday, March 21, 2022 - link
Fiji was being sold in 2017. AMD dropped Fiji driver support about halfway through 2021.

So much for your 7-year-period excuse.
Spunjji - Monday, March 21, 2022 - link
Nvidia were still actively selling Fermi when they cancelled support for it.

Most people who don't have an axe to (very publicly) grind count from when the architecture is new.
Spunjji - Monday, March 21, 2022 - link
How dare an entry-level GPU designed for tiny notebooks perform worse than a former heavyweight GPU built back when everything was way cheaper to manufacture, ship and sell!
Oxford Guy - Wednesday, March 23, 2022 - link
Your posts on this subject are illogical and frivolous.
Yojimbo - Saturday, March 19, 2022 - link
I was saying that the developers don't want to give the impression that they are leaving AMD owners out in the cold.
Spunjji - Monday, March 21, 2022 - link
Good grief, you really did just go ahead and territorially mark the whole comments section with this stuff. Oof.

"despite the glaring connections to the 6500 XT"
🤡
mode_13h - Monday, March 28, 2022 - link
I don't see it now, but I remember getting the impression XeSS used jitter to get a diversity of sample positions. I wonder if FSR 2.0 is doing anything similar. Over multiple frames, that could provide some sub-pixel precision that one can use with conventional super-resolution techniques, with the basic idea that you can project enough samples into a higher-resolution grid to cancel out much of the aliasing. The motion vectors would have to be very precise, however.
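[Editor's note: the jitter idea described above — projecting sub-pixel-offset samples from several frames into a higher-resolution grid — can be sketched as below. This is a toy illustration assuming a static scene and nearest-neighbour scatter; it is not how XeSS or FSR 2.0 actually work, and the function name and parameters are invented for the example.]

```python
import numpy as np

def accumulate_jittered(frames, jitters, scale):
    """Scatter jittered low-res samples into a higher-res grid.

    frames:  list of (h, w) low-res frames of a *static* scene
    jitters: list of (jy, jx) sub-pixel offsets (in low-res pixels) applied
             to the camera when each frame was rendered
    scale:   integer upscaling factor
    """
    h, w = frames[0].shape
    H, W = h * scale, w * scale
    acc = np.zeros((H, W))
    cnt = np.zeros((H, W))
    ys, xs = np.mgrid[0:h, 0:w]
    for frame, (jy, jx) in zip(frames, jitters):
        # Each sample lands at (pixel centre + jitter), mapped to the fine grid.
        fy = np.clip(((ys + 0.5 + jy) * scale).astype(int), 0, H - 1)
        fx = np.clip(((xs + 0.5 + jx) * scale).astype(int), 0, W - 1)
        np.add.at(acc, (fy, fx), frame)
        np.add.at(cnt, (fy, fx), 1)
    cnt[cnt == 0] = 1  # uncovered fine pixels stay at zero
    return acc / cnt
```

With a 2x scale and four frames whose jitters hit the four sub-pixel quadrants, every fine-grid pixel receives exactly one sample — which is the sense in which jitter can recover genuine sub-pixel detail, provided motion vectors are precise enough to make this work for moving scenes.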