
Digital Foundry: Doom Eternal's Ray Tracing Upgrade Analyzed - Best PC Settings + PS5/Xbox Series X Comparisons

Andodalf

Banned
Nvidia were really ahead of their time when they dropped all of this stuff in 2018. Imagine if Sony and Microsoft weren't cheap and partnered with Nvidia instead. The other reconstruction techniques are only considered good when not compared to DLSS. It's like me saying I'm good at basketball if you don't include NBA or collegiate players.
Consoles need to be APU based these days and Nvidia doesn’t make those
 

OverHeat

« generous god »
Man, that PC DLSS secret sauce:



Meanwhile, that PS5 SSD secret sauce:

You win the thread!
 

CrustyBritches

Gold Member
Possible example of a New Nintendo Switch "next-gen" upgrade for Eternal? 720p handheld/1080p docked, DLSS(Performance), DRS(60fps target), Med/High settings mix, Ultra texture filtering. 2.8GB VRAM usage.
Handheld Mode(720p DLSS(Performance)):

Docked Mode(1080p DLSS(Performance)):
Doesn't look too bad for what's probably an internal res of something like 240-360p with 720p DLSS(Perf), and 360-540p with 1080p DLSS(Perf).
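For reference, those internal-res ranges follow from DLSS's fixed per-axis scale factor: Performance mode renders at 0.5x the output resolution per axis, and DRS can push the input lower still. A quick sketch - the ~0.67 DRS floor here is my assumption, chosen just to match the 240p/360p lower bounds above:

```python
# DLSS Performance mode renders at 0.5x the output resolution per axis;
# DRS can drop the input further (a ~0.67 floor is assumed here to
# match the 240p/360p lower bounds mentioned above).
def internal_height(output_h, dlss_scale=0.5, drs_floor=0.67):
    top = int(output_h * dlss_scale)
    bottom = int(output_h * dlss_scale * drs_floor)
    return bottom, top

print(internal_height(720))   # -> (241, 360)
print(internal_height(1080))  # -> (361, 540)
```

So a hypothetical Switch successor in docked mode would be reconstructing 1080p from somewhere between roughly 360p and 540p.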
 

skit_data

Member
Possible example of a New Nintendo Switch "next-gen" upgrade for Eternal? 720p handheld/1080p docked, DLSS(Performance), DRS(60fps target), Med/High settings mix, Ultra texture filtering. 2.8GB VRAM usage.

Doesn't look too bad for what's probably an internal res of something like 240-360p with 720p DLSS(Perf), and 360-540p with 1080p DLSS(Perf).
I actually think it would be pretty funny if Nintendo came out of left field and just smoked both the PS5 and XSX by offering a solid DLSS implementation on their next console.

It would truly make them look ridiculous. I don’t think that’s gonna happen, but it would be funny.
 
VRS seems to be the easiest of the hardware-supported RDNA2 features to use, and the one that gives the smallest performance benefits. Once we get SFS, Mesh Shaders, and then VRS applied to what's actually displayed, we'll really find out how much the hardware support helps.

It's pretty clear to me that the biggest beneficiary on Series X will be Sampler Feedback Streaming, because it's quite obvious the Series X GPU is designed to operate with it in use. The massive video-memory savings will be a big deal, and in turn will also bring crucial memory bandwidth and SSD bandwidth savings.

If on top of this Microsoft ends up delivering on their claims that Machine Learning accelerated Super Resolution will actually become a thing on Xbox Series X, then who knows just how good it might become.
 

Lysandros

Member
As for the res difference in the 120fps mode, I've noticed games generally tend to be more dependent on compute, but DOOM Eternal - when rendering at 100+fps - is one of the heaviest hitters on memory bandwidth and scales with BW increases. So the combination of XSX's +25% BW and VRS being enabled likely explains why there's a 29% res gap (1584p vs 1800p) in this mode.
Do you really think that XSX's split-bandwidth solution offers a clear-cut +25% RAM bandwidth advantage over PS5 without any compromise (as if it were a full 16GB/560 GB/s setup)? Additionally, if the resolution difference in the 120 FPS mode turns out to be true, I would be surprised if PS5 doesn't perform better in it. We had plenty of titles slightly favoring PS5 in 120 FPS modes 'at the same resolution' in the past.
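As an aside, the 29% figure in the quoted post does check out from the raw pixel counts (a quick check, assuming 16:9 frames):

```python
# Pixel counts for the two DRS ceilings, assuming 16:9 frames.
def pixels(height):
    width = height * 16 // 9
    return width * height

ps5 = pixels(1584)   # 2816 x 1584 = 4,460,544
xsx = pixels(1800)   # 3200 x 1800 = 5,760,000
print(round(100 * (xsx / ps5 - 1)))  # -> 29
```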
 
Last edited:

Md Ray

Member
Do you really think that XSX's split-bandwidth solution offers a clear-cut +25% RAM bandwidth advantage over PS5 without any compromise (as if it were a full 16GB/560 GB/s setup)? Additionally, if the resolution difference in the 120 FPS mode turns out to be true, I would be surprised if PS5 doesn't perform better in it. We had plenty of titles slightly favoring PS5 in 120 FPS modes 'at the same resolution' in the past.
Well, considering how bandwidth-intensive the game is on PC, the fact that they've chosen a 1584p max target res as opposed to 1800p indicates to me that bandwidth was likely the reason for targeting a lower res in the 120fps mode. The game really does behave very differently from other games like, say, WD:L, AC Valhalla, and Death Stranding, which lean heavily towards compute and don't really rely much on BW, even when rendering at 4K.

Still, I have a feeling the XSX will probably spend most of its time under 1800p to keep the frame rate at 120fps while PS5 will probably stick to its target 1584p more often while also having consistent 120fps. We'll see.
 
Last edited:

Truespeed

Member
Nvidia boned Microsoft and then boned Sony. It’s not about being cheap but more about how Nvidia handled their respective partnerships in the past.

I was curious about this and started looking into the specifics. Apparently, Microsoft tried to claw back $13 million in payments because it wanted a reduction in chipset pricing. The part I didn't know about was how Microsoft boned Nvidia by rendering a large amount of their inventory obsolete when Microsoft changed the security keys - which were already baked into the chips.

In a regulatory filing on Tuesday, Nvidia said that an interim ruling by a third-party arbitrator required Nvidia to supply Microsoft's "reasonable requirements" for Xbox chipsets while the case is pending. Nvidia and Microsoft entered into a binding arbitration process in April over how many Xbox chips Nvidia is required to produce, and what price Microsoft should pay.

The arrangement to supply graphics hardware for the Xbox, Microsoft's gaming console, has been in place since the spring of 2000 when Microsoft paid Nvidia an advance of $200m (about £128m) against future orders. The deal has not, however, proved lucrative for Nvidia so far. In July, Nvidia warned that it has been forced to write off a large amount of inventory, including Xbox chips that became obsolete when Microsoft changed security codes for the console to thwart hackers.

Microsoft is currently paying Nvidia in advance for chips it orders, but is asking arbitrators to reduce the chips' prices and to award damages. (Nvidia is also asking for price relief and for damages.) As a result, Nvidia has been forced to set aside part of the income from the Xbox chips, amounting to the difference between what Microsoft is paying and what it says it should be paying. This was about $46.2m as of 28 July, according to Nvidia. The arbitration is not expected to conclude until next June.

If it loses the arbitration, Nvidia has warned in regulatory filings that it could be forced to produce Xbox chips at a loss, and to reduce production of other products to fulfil Xbox demand. The company could also be compelled to licence its intellectual property as part of the settlement, Nvidia said.

"Even if the Company does prevail, there can be no assurance that its business will not be materially harmed," Nvidia stated in the filing.

Microsoft has a motive to reduce component prices, since it is estimated to be losing as much as $150 for each Xbox sold. The company recently introduced a new video processor into the console as a way of reducing prices. Flextronics International, which manufactures the Xbox, is in the process of transferring some Xbox production from Hungary to China as a means of further reducing costs.
 
Do you really think that XSX's split-bandwidth solution offers a clear-cut +25% RAM bandwidth advantage over PS5 without any compromise (as if it were a full 16GB/560 GB/s setup)? Additionally, if the resolution difference in the 120 FPS mode turns out to be true, I would be surprised if PS5 doesn't perform better in it. We had plenty of titles slightly favoring PS5 in 120 FPS modes 'at the same resolution' in the past.
All modes use DRS, including the 120Hz mode. The RT mode already looks like it's running at a higher resolution on PS5, even though both target 2160p.
 
Nvidia were really ahead of their time when they dropped all of this stuff in 2018. Imagine if Sony and Microsoft weren't cheap and partnered with Nvidia instead. The other reconstruction techniques are only considered good when not compared to DLSS. It's like me saying I'm good at basketball if you don't include NBA or collegiate players.
That's what I love about Nvidia stuff so much: even though it's locked to their hardware, stuff like G-Sync, DLSS, and being first to push RTX effects makes paying that premium worth it. I like AMD too, but it just feels like they're always a day late and a dollar short at innovating on graphical features for their GPUs, just playing catch-up to Nvidia with some sort of more open solution that's almost always not as good as the Nvidia one.

The RTX and DLSS additions to Doom Eternal got me back into the game though, for sure. The game looked and ran great on my 2080 Ti last year when I played a bunch of it, and honestly, the RTX effects aren't super noticeable in action, only because you're flying through the game so fast amid all of the action.
 

Hoddi

Member
Well, considering how bandwidth-intensive the game is on PC, the fact that they've chosen a 1584p max target res as opposed to 1800p indicates to me that bandwidth was likely the reason for targeting a lower res in the 120fps mode. The game really does behave very differently from other games like, say, WD:L, AC Valhalla, and Death Stranding, which lean heavily towards compute and don't really rely much on BW, even when rendering at 4K.

Still, I have a feeling the XSX will probably spend most of its time under 1800p to keep the frame rate at 120fps while PS5 will probably stick to its target 1584p more often while also having consistent 120fps. We'll see.

Which GPU did you test it on? I just tried downclocking my own 2080 Ti and found very little difference between 13Gbps and 15Gbps memory speeds at 4K. It also didn't seem to matter whether DLSS was enabled or not.
 

MrFunSocks

Banned
I recently joined the "true" PC Master Race - got an Aurora 10 Ryzen Edition with the Ryzen 7 5800 and an RTX 3070. My "gaming" machine was previously my Hades Canyon NUC, which is incredible for its size, but was basically a medium-high-settings 1080p gaming machine (fine, since it was my work machine) - and that didn't really cut it anymore now that I'm using a 1440p ultrawide monitor with my PC.

Downloaded Doom Eternal after seeing this video and hot damn, I'm getting an essentially locked 100fps (my monitor is 100Hz VRR/G-Sync) with ray tracing and everything on Ultra at 3440x1440. Next time I play it I'll start putting things up to Nightmare to see how far I can push it before the framerate is affected too much. Might hook it up to my 4K LG OLED and use DLSS to compare it to the Series X version too. The ray tracing is just incredible; being able to see reflections of enemies/projectiles/etc. that are behind you is amazing.
 
Last edited:

MrFunSocks

Banned
Soooo for consoles which is the version to get: PS5 or XsX ??
One factor to think about is whether you want Eternal on the same platform as all future Doom games - Doom is Xbox-exclusive from now on, so that might be something you care about.

We don't have any DF etc. comparisons yet, but they'll likely be 99.9% the same graphically and performance-wise, just with the Series X having a higher max resolution in the 120fps mode.
 
Last edited:

SkylineRKR

Member
Fuck RT. I'm playing at 120Hz and this is the first game that made me think I can't go back to 60. Coming from 120Hz, 60fps has quite a bit of judder when you turn.

I'm playing on the S btw, so I have no RT, but it doesn't matter. The game looks good but plays even better. Didn't want to rebuy the PS version. My 120Hz experience has been a mixed bag on PS5 so far, perhaps due to the lack of VRR.
 
Last edited:

Inviusx

Member
Fuck RT. I'm playing at 120Hz and this is the first game that made me think I can't go back to 60. Coming from 120Hz, 60fps has quite a bit of judder when you turn.

I'm playing on the S btw, so I have no RT, but it doesn't matter. The game looks good but plays even better. Didn't want to rebuy the PS version. My 120Hz experience has been a mixed bag on PS5 so far, perhaps due to the lack of VRR.

Agreed, 120Hz feels phenomenal in Eternal. I didn't like 120Hz in Dirt 5 because the resolution hit was insane, but Eternal just looks like the regular game at a crazy-fast frame rate.

I never thought I would turn my nose up at 60fps but here we are.
 

DenchDeckard

Moderated wildly
Well, I had the craziest thing with Doom Eternal on my PC. Never had issues with it before; the PC plays all other games fine. Did the update on the Windows Store and tried running it, and Doom Eternal actually fully crashed my PC and made it restart. I hope my PSU isn't on the way out - the PC is running all other games fine.
I've deleted Doom and I'm trying a reinstall, but I'm not sure what's happening there. It just plays the splash screens, then bam, the PC restarts :/
 

Md Ray

Member
Which GPU did you test it on? I just tried downclocking my own 2080 Ti and found very little difference between 13Gbps and 15Gbps memory speeds at 4K. It also didn't seem to matter whether DLSS was enabled or not.
My assessment is based on HW Unboxed's 3070 Ti vs 3070 benchmarks. If you look at their specs, the only major difference is in memory bandwidth; the compute power difference is like 4%, more or less. So any large performance uplift for the 3070 Ti must come from that 608 GB/s upgrade.

Out of the 12 games tested, only 2 saw an uplift that big. One was DOOM Eternal with an 18% higher avg framerate at 4K (followed by Borderlands 3, which also saw an 18% uplift); most of the games were in the 6-7% range, with even smaller gains at 1440p. This indicates the game favors BW more.

Moreover, if you look at the 3070 vs the 3060 Ti, which have identical 448 GB/s bandwidth but a big gulf between them in computational power, DOOM Eternal at 1440p sees the smallest uplift of just 5% for the 3070, while all the other games were generally 15-16% faster. This tells you how little Eternal cares about compute power compared to other games.
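Putting the numbers from that comparison side by side makes the argument easy to see (the spec deltas are public; the 18% uplift is HWU's reported result):

```python
# 3070 Ti vs 3070: spec deltas vs HWU's observed DOOM Eternal uplift.
bw_delta      = 608 / 448 - 1   # ~35.7% more memory bandwidth
compute_delta = 0.04            # ~4% more compute, per the post above
observed_4k   = 0.18            # HWU's 18% avg-fps uplift at 4K

# An 18% gain sits far above the compute delta, pointing at memory
# bandwidth rather than shader throughput as the limiting factor.
assert compute_delta < observed_4k < bw_delta
print(f"bandwidth delta: {bw_delta:.1%}")  # -> bandwidth delta: 35.7%
```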
 
Last edited:
Do you really think that XSX's split-bandwidth solution offers a clear-cut +25% RAM bandwidth advantage over PS5 without any compromise (as if it were a full 16GB/560 GB/s setup)? Additionally, if the resolution difference in the 120 FPS mode turns out to be true, I would be surprised if PS5 doesn't perform better in it. We had plenty of titles slightly favoring PS5 in 120 FPS modes 'at the same resolution' in the past.
Come on man, it's a big resolution difference, not a minor one. id wouldn't have gone for such a significant gap unless they were getting stable performance on Series X at that resolution. The two will be in the same ballpark in terms of performance, but with a 29% resolution advantage for Series X, it's expected that PS5 will have only a minor advantage at best.

We know earlier titles were held back by the tools on Series X, and probably still are in some cases; expecting PS5 to continue to have an advantage is very optimistic IMO.
 

Shmunter

Member
Come on man, it's a big resolution difference, not a minor one. id wouldn't have gone for such a significant gap unless they were getting stable performance on Series X at that resolution. The two will be in the same ballpark in terms of performance, but with a 29% resolution advantage for Series X, it's expected that PS5 will have only a minor advantage at best.

We know earlier titles were held back by the tools on Series X, and probably still are in some cases; expecting PS5 to continue to have an advantage is very optimistic IMO.
That’s the max DRS threshold; it could be when looking at your foot for all we know. We don’t know which console performs better on average at 120.

Certainly the 60fps modes have better visuals on PS5, due to the XSX VRS feature.
 
Last edited:
This channel isn't a reliable source.

The uploader themselves had this to say:
"I told you that I am currently not in a position to create content. This video is made by external help following my instructions. I am sorry for any errors that may be in the comparison. I hope I can return the channel soon. A hug!"

So he's blaming the external help instead of himself for those errors. That's just sad.
 

Bogroll

Likes moldy games
That’s the max DRS threshold; it could be when looking at your foot for all we know. We don’t know which console performs better on average at 120.

Certainly the 60fps modes have better visuals on PS5, due to the XSX VRS feature.
Looking at the 120 mode on the banned channel, you can see the X is sharper, and it's not zoomed in that much at all (and not looking at your foot). No one will notice in real life, but for these threads, in side-by-sides, it's sharper on X, and it seems to be 120-locked on both.
 

Mr Moose

Member
Looking at the 120 mode on the banned channel, you can see the X is sharper, and it's not zoomed in that much at all (and not looking at your foot). No one will notice in real life, but for these threads, in side-by-sides, it's sharper on X, and it seems to be 120-locked on both.
Worse AF in every mode, also.
I'll wait for other users to present their findings though (DF/NX/VG), as this user's videos have been very wrong in the past.
 

Hoddi

Member
My assessment is based on HW Unboxed's 3070 Ti vs 3070 benchmarks. If you look at their specs, the only major difference is in memory bandwidth; the compute power difference is like 4%, more or less. So any large performance uplift for the 3070 Ti must come from that 608 GB/s upgrade.

Out of the 12 games tested, only 2 saw an uplift that big. One was DOOM Eternal with an 18% higher avg framerate at 4K (followed by Borderlands 3, which also saw an 18% uplift); most of the games were in the 6-7% range, with even smaller gains at 1440p. This indicates the game favors BW more.

Moreover, if you look at the 3070 vs the 3060 Ti, which have identical 448 GB/s bandwidth but a big gulf between them in computational power, DOOM Eternal at 1440p sees the smallest uplift of just 5% for the 3070, while all the other games were generally 15-16% faster. This tells you how little Eternal cares about compute power compared to other games.

I don't know how reliable that is. Doom Eternal ideally needs more than 8GB at 4K, so it's possible that there was some memory swapping muddying their results. Other sites also don't show as big a difference between the two.

In any case, I don't see much bandwidth scaling on my own system. Performance scales almost linearly with core clock rate, however.
 

Md Ray

Member
I don't know how reliable that is. Doom Eternal ideally needs more than 8GB at 4K, so it's possible that there was some memory swapping muddying their results. Other sites also don't show as big a difference between the two.

In any case, I don't see much bandwidth scaling on my own system. Performance scales almost linearly with core clock rate, however.
Well, the 3070 Ti also has an 8GB frame buffer, so I don't think that's the case. I just booted the game up on my 3070 and the estimated VRAM counter shown at the top doesn't cross 7.5GB when using Ultra Nightmare settings at 4K.

At 1440p, where the 3070 saw just a 5% uplift, it consumes even less memory, so I think it's pretty reliable.
 
Last edited:

Hoddi

Member
Well, the 3070 Ti also has an 8GB frame buffer, so I don't think that's the case. I just booted the game up on my 3070 and the estimated VRAM counter shown at the top doesn't cross 7.5GB when using Ultra Nightmare settings at 4K.

At 1440p, where the 3070 saw just a 5% uplift, it consumes even less memory, so I think it's pretty reliable.
Keep in mind that it's only an estimate and VRAM in use fluctuates quite a bit as you play the game. There was a Reddit thread that covered it nicely and is worth reading through.

The fact that the difference is just 5% at 1440p rather suggests that HWU's 4K results are an anomaly caused by a lack of VRAM in whatever level they were testing. Adding bandwidth doesn't help in such a scenario, while other websites have shown the same 5% diff. at 4K.
 
I noticed that in a lot of the shots in the video, objects in the Xbox Series X version seemed smaller. I couldn't tell if it was running at a higher resolution or if the FOV was different. All platforms look great btw.
 

Md Ray

Member
The fact that the difference is just 5% at 1440p rather suggests that HWU's 4K results are an anomaly caused by a lack of VRAM in whatever level they were testing. Adding bandwidth doesn't help in such a scenario, while other websites have shown the same 5% diff. at 4K.
That 5% difference is between 3070 and 3060 Ti. On the other hand, the 18% difference in 4K is between 3070 Ti and 3070.
 

Armorian

Banned
That 5% difference is between 3070 and 3060 Ti. On the other hand, the 18% difference in 4K is between 3070 Ti and 3070.

This game has fucked-up performance when it runs out of VRAM, and the texture pool setting doesn't change texture quality at all. The 3070 Ti with GDDR6X probably swaps memory faster, so it gets higher performance than 8GB cards with regular G6. And HWU benches with all settings maxed, no?

I play with everything on Ultra Nightmare except textures (High), with DSR x4 (5120x2160) and Balanced DLSS. Ultra-stable 73fps with vanilla settings, and some drops here and there to ~60 with full-res reflections.
 
Last edited:
I'm really an RT proponent, but in this video the reflections look really strange with Doom's art style.


Maybe it's just Alex's super-shiny B-roll footage... everything looks rather "greasy".
 

Hoddi

Member
That 5% difference is between 3070 and 3060 Ti. On the other hand, the 18% difference in 4K is between 3070 Ti and 3070.

I think you misunderstood my comment. Other websites like TechPowerUp and Eurogamer report much smaller differences between the 3070 and 3070 Ti in DE, including at 4K. The most likely reason is that they aren't running out of VRAM in their tests like HWU.

Either way, I don't see much bandwidth scaling on my own system. On the contrary, performance scales almost linearly with core clock rate, which shouldn't happen if bandwidth were a primary bottleneck.
 

Md Ray

Member
I think you misunderstood my comment. Other websites like TechPowerUp and Eurogamer report much smaller differences between the 3070 and 3070 Ti in DE, including at 4K. The most likely reason is that they aren't running out of VRAM in their tests like HWU.

Either way, I don't see much bandwidth scaling on my own system. On the contrary, performance scales almost linearly with core clock rate, which shouldn't happen if bandwidth were a primary bottleneck.
Huh, TPU and EG's numbers (% increase) line up with one another and make HWU's results look definitely strange. DOOM Eternal is weird due to its stupidly high VRAM requirement at higher texture settings for no perceptible difference.
 
For comparison's sake:
On my 3060 Ti, DLSS off, RT on, everything Ultra Nightmare: around 70fps at 3620x2036 (not sure why, but it set that as my native res).
Same settings at 1440p (actual native res): around 140fps.
Textures at Ultra.
 
Last edited:

CrustyBritches

Gold Member
I played through the game again and my favorite level to play with RT is DOOM Hunter Base, in particular the first major arena. With the shiny floors you can see reflections for enemy projectiles, your own rounds, and even the loot drops are reflected. You can see this same stuff on your weapons, too.




Urdak has some nice reflective surfaces, too. It's cool to see the Slayer in the window reflections and such, but for the most part RT isn't very noticeable in the rest of the game. Still looks great and it runs well enough with DLSS that you can still use it and hit high frames on almost any RTX GPU.
 
Last edited:

JackMcGunns

Member
Maybe it's only noticeable if the developers use it wrong. That's all I can think of.


VRS is not supposed to reconstruct anything like DLSS, so having a 2.0 version might just mean that it's more effective at doing what it does, which is to shade at a variable rate, in other words, save performance by shading only areas where the player will notice.

Taking a screenshot where VRS has been applied and zooming in 400% to compare it to another version that's not using VRS defeats the purpose of what VRS is being used for. It has nothing to do with increasing quality; it's about saving performance by shading at a variable rate instead of just shading everything unnecessarily.
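To make the "shade only where the player will notice" idea concrete, here's a toy per-tile rate selection. This is purely illustrative - the thresholds and the contrast metric are made up, and id Tech's actual heuristic isn't public:

```python
# Toy content-adaptive VRS rate selection (illustrative only): flat,
# low-contrast tiles get coarser shading; detailed tiles keep full rate.
def pick_rate(tile_contrast, t_4x4=0.01, t_2x2=0.05):
    if tile_contrast < t_4x4:
        return "4x4"   # one shading sample per 16 pixels
    if tile_contrast < t_2x2:
        return "2x2"   # one shading sample per 4 pixels
    return "1x1"       # full-rate shading

print(pick_rate(0.20))   # high-contrast edge  -> 1x1
print(pick_rate(0.03))   # soft gradient       -> 2x2
print(pick_rate(0.005))  # flat wall/sky       -> 4x4
```

The point is that the coarse rates only land on tiles where the eye is unlikely to see the difference at normal viewing distance, which is exactly why a 400% zoom misrepresents the technique.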
 
Last edited:

Armorian

Banned
VRS is not supposed to reconstruct anything like DLSS, so having a 2.0 version might just mean that it's more effective at doing what it does, which is to shade at a variable rate, in other words, save performance by shading only areas where the player will notice.

Taking a screenshot where VRS has been applied and zooming in 400% to compare it to another version that's not using VRS defeats the purpose of what VRS is being used for. It has nothing to do with increasing quality; it's about saving performance by shading at a variable rate instead of just shading everything unnecessarily.

So far VRS only degrades IQ while giving such a small performance increase in return that it's not worth using at all.

What is criminal in 2021 is low/nonexistent AF on consoles... this shit has been almost free since ~2006 and yet devs use it like it's MSAA or something :messenger_grinning_smiling:
 

Bogroll

Likes moldy games
Worse AF in every mode, also.
I'll wait for other users to present their findings though (DF/NX/VG), as this user's videos have been very wrong in the past.
Yes of course, I was only using it as a rough guide.

It will be more interesting to me how the Series S stacks up against the Pro and One X. I know it has a lower resolution, but how will performance and other graphical features compare?
 

Md Ray

Member
I played through the game again and my favorite level to play with RT is DOOM Hunter Base, in particular the first major arena. With the shiny floors you can see reflections for enemy projectiles, your own rounds, and even the loot drops are reflected. You can see this same stuff on your weapons, too.


Urdak has some nice reflective surfaces, too. It's cool to see the Slayer in the window reflections and such, but for the most part RT isn't very noticeable in the rest of the game. Still looks great and it runs well enough with DLSS that you can still use it and hit high frames on almost any RTX GPU.

You've got some balls to upload PC footage of DOOM played with a controller. :messenger_tongue:
 
Last edited:

FireFly

Member
So far VRS only degrades IQ while giving such a small performance increase in return that it's not worth using at all.

What is criminal in 2021 is low/nonexistent AF on consoles... this shit has been almost free since ~2006 and yet devs use it like it's MSAA or something :messenger_grinning_smiling:
On Gears 5, VRS saved 5-12% of performance according to DF. Anyway, if the boost in resolution that the extra performance enables improves image quality over the non-VRS image, it is worth using. For a fair comparison you need to look at the whole image, since the point of VRS is to focus shading resolution on the areas where it will be most noticed.
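As a rough sanity check on that trade-off: if frame time scales roughly linearly with pixel count (an assumption, not always true), a given VRS saving buys a corresponding per-axis resolution bump:

```python
import math

# If frame time scales ~linearly with pixel count, a fractional perf
# saving s allows a per-axis resolution scale of sqrt(1 / (1 - s)).
def axis_scale(saving):
    return math.sqrt(1 / (1 - saving))

for s in (0.05, 0.12):  # DF's reported Gears 5 range
    print(f"{s:.0%} saved -> {axis_scale(s):.3f}x per axis")
```

So even the top of that range only buys roughly 6-7% more resolution per axis, which is why the gains are easy to miss without side-by-side comparisons.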
 

MrFunSocks

Banned
The attacks on VRS are quite funny, since all I've been reading on here lately is how native resolution doesn't matter and image-reconstruction upscaling to increase frame rate is the future - when image reconstruction techniques also affect the quality of the image, just like VRS does.

On a TV, while actually playing, you will not notice the VRS this video showed. I know because I played that exact area after watching this video (it's like 10 mins into the game) and couldn't even see the detail on the rebar on my computer monitor at 1440p on Ultra settings from a foot away. You literally will not see a difference unless you're taking screenshots and then zooming in 400%.

I really wish they would stop with the 400% zooms, because it's pointless. No one plays games like that. Back when Digital Foundry made their name, you didn't even need to zoom in to see the difference between 600p and 720p. It was a noticeable difference. Now we're zooming in 400% on a 200x200-pixel section of a 4K image to try and find differences lol.
 
Last edited:
The attacks on VRS are quite funny, since all I've been reading on here lately is how native resolution doesn't matter and image-reconstruction upscaling to increase frame rate is the future - when image reconstruction techniques also affect the quality of the image, just like VRS does.

On a TV, while actually playing, you will not notice the VRS this video showed. I know because I played that exact area after watching this video (it's like 10 mins into the game) and couldn't even see the detail on the rebar on my computer monitor at 1440p on Ultra settings from a foot away. You literally will not see a difference unless you're taking screenshots and then zooming in 400%.

I really wish they would stop with the 400% zooms, because it's pointless. No one plays games like that. Back when Digital Foundry made their name, you didn't even need to zoom in to see the difference between 600p and 720p. It was a noticeable difference. Now we're zooming in 400% on a 200x200-pixel section of a 4K image to try and find differences lol.
Missed your voice on here Socks. Glad to see you back.

On topic: when you have to zoom in 400% to see some slight blur, yet you can run the game at 120fps at 1800p, that is the whole point of VRS. It will be interesting to see how it looks as more RDNA 2 features are implemented on XSX|S.
 