
The Last of Us Part 1 on PC; another casualty added to the list of bad ports?

SlimySnake

Flashless at the Golden Globes
Imagine a world in which you work with Iron Galaxy on a bad port and they're not the culprit.

Beginning to think ND saw the state of things and brought them in as a patsy last minute.
Yep. They even name dropped them in the latest twitter update. lol

They know exactly what they are doing
 

Elios83

Member
Jim: We have a platform for those who actually want to play the game without issues, please buy that and the game again :messenger_tears_of_joy:

But seriously they should have delayed the PC version until June.
 

Marlenus

Member
The game does not seem as bad as some make out.

The issues are that it is quite CPU heavy, because the PS5 has dedicated decompression hardware and PCs don't, so that work falls on the CPU cores, and it uses a lot of VRAM because the port does not use DirectStorage.
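
(To make the CPU point concrete: the PS5's I/O block inflates compressed assets in dedicated silicon, while a PC port without GPU decompression has to burn worker threads on it. Below is a minimal sketch of what that looks like, assuming zlib-style compressed chunks and a plain thread-pool job queue; it's purely illustrative, not Naughty Dog's actual streaming code, which isn't public.)

Code:
// Sketch of what "do the decompression on the CPU" means for a PC port.
// Assumes zlib-compressed asset chunks and a simple blocking job queue;
// the real game's codec and scheduler are not public, this is illustrative.
#include <zlib.h>
#include <condition_variable>
#include <cstddef>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

struct Chunk {
    std::vector<unsigned char> compressed;  // read straight off the SSD
    std::size_t uncompressedSize = 0;       // stored in the asset header
};

class DecompressPool {
public:
    explicit DecompressPool(unsigned threads) {
        for (unsigned i = 0; i < threads; ++i)
            workers_.emplace_back([this] { Run(); });
    }
    ~DecompressPool() {
        { std::lock_guard<std::mutex> lk(m_); done_ = true; }
        cv_.notify_all();
        for (auto& t : workers_) t.join();
    }
    void Submit(Chunk c) {
        { std::lock_guard<std::mutex> lk(m_); jobs_.push(std::move(c)); }
        cv_.notify_one();
    }
private:
    void Run() {
        for (;;) {
            Chunk job;
            {
                std::unique_lock<std::mutex> lk(m_);
                cv_.wait(lk, [this] { return done_ || !jobs_.empty(); });
                if (done_ && jobs_.empty()) return;
                job = std::move(jobs_.front());
                jobs_.pop();
            }
            // This is the part the PS5 does in dedicated silicon. On PC it
            // occupies a CPU core for the duration of every chunk.
            std::vector<unsigned char> out(job.uncompressedSize);
            uLongf outLen = static_cast<uLongf>(out.size());
            uncompress(out.data(), &outLen,
                       job.compressed.data(),
                       static_cast<uLong>(job.compressed.size()));
            // ...hand 'out' to the texture/mesh uploader from here.
        }
    }
    std::queue<Chunk> jobs_;
    std::mutex m_;
    std::condition_variable cv_;
    bool done_ = false;
    std::vector<std::thread> workers_;
};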

Bottom line is that 8GB is not enough VRAM for a solid mid-range card anymore. This was obvious once the console specs were known; it just so happens that there was a longer-than-usual cross-gen period, so the jump has taken longer to arrive, which lulled some people into a false sense of security.

Going forward I would say 12GB is entry level, 16GB will be mid-range, 24GB high end and 32GB enthusiast tier.

8GB will be relegated to e-sports and ideally sub-$200 products.

We had the exact same situation play out when the PS4 and Xbox One launched: after their cross-gen period, 6GB cards started to struggle and 8GB became the new minimum requirement.
 

Gaiff

SBI’s Resident Gaslighter
The game does not seem as bad as some make out.
Those who actually have it beg to differ. You're spouting so much nonsense with so much confidence.
The issues are that it is quite CPU heavy, because the PS5 has dedicated decompression hardware and PCs don't, so that work falls on the CPU cores, and it uses a lot of VRAM because the port does not use DirectStorage.
And you know this how? Show us what role the hardware-accelerated decompression plays. Show us raw data about the VRAM usage on PS5. As it stands you're just going "trust me bro."
Bottom line is that 8GB is not enough VRAM for a solid mid-range card anymore. This was obvious once the console specs were known; it just so happens that there was a longer-than-usual cross-gen period, so the jump has taken longer to arrive, which lulled some people into a false sense of security.

Going forward I would say 12GB is entry level, 16GB will be mid-range, 24GB high end and 32GB enthusiast tier.

8GB will be relegated to e-sports and ideally sub-$200 products.

We had the exact same situation play out when the PS4 and Xbox One launched: after their cross-gen period, 6GB cards started to struggle and 8GB became the new minimum requirement.
So much wrong with this. The X1/PS4 came out in November 2013. The paltry GTX 670 with its 2GB frame buffer was faster than them and the little 7850 with 3GB from 2012 could run The Witcher 3 at 1080p/high/30fps whereas the consoles struggled with a bunch of medium/low settings and even 1-2 settings at lower than low. Then we had the 970 with 4GB that utterly dunked on the consoles and the 980 Ti that completely trounced them. The first mainstream 8GB card was the GTX 1080 and when the fuck was it ever the minimum lmao? When was a 980 Ti ever not enough for gaming? Then 32GB of VRAM for the future lol?

Console trolls should seriously STFU.
 

Buggy Loop

Member
Those who actually have it beg to differ. You're spouting so much nonsense with so much confidence.

And you know this how? Show us what role the hardware-accelerated decompression plays. Show us raw data about the VRAM usage on PS5. As it stands you're just going "trust me bro."

So much wrong with this. The X1/PS4 came out in November 2013. The paltry GTX 670 with its 2GB frame buffer was faster than them and the little 7850 with 3GB from 2012 could run The Witcher 3 at 1080p/high/30fps whereas the consoles struggled with a bunch of medium/low settings and even 1-2 settings at lower than low. Then we had the 970 with 4GB that utterly dunked on the consoles and the 980 Ti that completely trounced them. The first mainstream 8GB card was the GTX 1080 and when the fuck was it ever the minimum lmao? When was a 980 Ti ever not enough for gaming? Then 32GB of VRAM for the future lol?

Console trolls should seriously STFU.

B-b-but VRAM! Uh, DirectStorage, special Cerny I/O sauce, and uh... please ignore games that already look better than this and run flawlessly on PC, like A Plague Tale: Requiem... the devs have nothing on their to-do list, perfect port, they certainly did not acknowledge the fuckup.

Also ignore the Velocity Architecture that made Microsoft Flight Simulator possible on consoles, and the many other examples that run well on PC on ~$800 hardware, as documented by Digital Foundry.



Also please ignore that most console games pushing graphics are not running at 4K, so that 8GB of VRAM you dumbly selected for 4K is clearly outdated compared to console power. (Who the fuck recommended 8GB cards for 4K two years ago? Nobody.)
 

ChiefDada

Gold Member
Marlenus

The DF Direct dropped for Early Access and they spoke for 40 minutes on TLOU PC performance, and the overarching topic of memory management and VRAM requirements for the current gen. Richard brought up that the PS5 decompression hardware seems to be a key factor as to why PC CPU demands are relatively high. Of course Alex scoffs and obnoxiously shakes his head as Rich is speaking. To be expected from this clown.
 

IFireflyl

Gold Member
People follow popular internet trends without a second thought, but I am an idiot with bad taste for thinking for myself.
I had almost 30 years to work on my "PC taste", whatever that means, man. I can't believe the shit I am reading here sometimes.

At no point did I call you an idiot. Don't put words in my mouth. Everyone has certain preferences. I, and others, have preferences that run counter to yours. My issue with you is that you have this idea that people who disagree with you are objectively wrong, or uneducated, or they are "sheep". The arrogance, condescension, and narcissism emanating from your replies is unreal. Your opinions are not the gold standard for PC gamers. You're a nobody on the internet, just like the rest of us.
 

Gaiff

SBI’s Resident Gaslighter


Hardware unboxed really looks bad in all this, already telling its viewers it’s about VRAM and making it about GPU vendors rather than inherent problems with the port.

It remains to be seen what happens next, but we did have RE4R as well, with undeniably high VRAM requirements. Naughty Dog can likely mitigate the issue but not make it go away completely.
 

rofif

Can’t Git Gud
At no point did I call you an idiot. Don't put words in my mouth. Everyone has certain preferences. I, and others, have preferences that run counter to yours. My issue with you is that you have this idea that people who disagree with you are objectively wrong, or uneducated, or they are "sheep". The arrogance, condescension, and narcissism emanating from your replies is unreal. Your opinions are not the gold standard for PC gamers. You're a nobody on the internet, just like the rest of us.
I am not directly saying that you are wrong.
But I am sure that 90% of people who disable motion blur don't know why or what it is. They just do it. Wouldn't you agree, even a bit?
 

Marlenus

Member
Those who actually have it beg to differ. You're spouting so much nonsense with so much confidence.

And you know this how? Show us what role the hardware-accelerated decompression plays. Show us raw data about the VRAM usage on PS5. As it stands you're just going "trust me bro."

So much wrong with this. The X1/PS4 came out in November 2013. The paltry GTX 670 with its 2GB frame buffer was faster than them and the little 7850 with 3GB from 2012 could run The Witcher 3 at 1080p/high/30fps whereas the consoles struggled with a bunch of medium/low settings and even 1-2 settings at lower than low. Then we had the 970 with 4GB that utterly dunked on the consoles and the 980 Ti that completely trounced them. The first mainstream 8GB card was the GTX 1080 and when the fuck was it ever the minimum lmao? When was a 980 Ti ever not enough for gaming? Then 32GB of VRAM for the future lol?

Console trolls should seriously STFU.
Streaming assets from the SSD with on-the-fly decompression is one of the main things Cerny worked on with the console. It allows for higher-quality assets and more variety without blowing out the available RAM.

There were 8GB variants of the 290X, and the 390X (the 980 competitor) came with 8GB by default. The 1080 was not the first mainstream 8GB card at all. Maybe you ought to stop being confidently incorrect and instead try simply being correct.

Do you remember the Fury X 4GB? At launch it was quite a bit faster than the R9 390X. A couple of years later, once more games were built exclusively around the PS4 and Xbox One, the 390X would often be as fast with better minimums.

Also, the 390X had longer legs than the 980 even though at launch the performance was about the same. The 970 also dropped off quite hard by mid-gen due to only having 3.5GB of RAM running at full speed and 0.5GB at reduced speed, hence the 970 3.5GB memes.
 

IFireflyl

Gold Member
I am not directly saying that you are wrong.
But I am sure that 90% of people who disable motion blur don't know why or what it is. They just do it. Wouldn't you agree, even a bit?

That has nothing to do with any reply that I have made to you. I replied to your condescending post because you just assume that the person you're replying to (who doesn't want to use motion blur) is a "sheep". You didn't reply to someone who said, "Well my friend told me blah-blah-blah..." Replying to someone like that would have at least made sense. You replied to someone who said they still weren't going to use motion blur. Even though you had no idea WHY they weren't going to use motion blur, you called them a sheep. That's the arrogance and condescension that I'm talking about.
 

Hoddi

Member


Hardware unboxed really looks bad in all this, already telling its viewers it’s about VRAM and making it about GPU vendors rather than inherent problems with the port.

VRAM capacity absolutely seems to be the core issue. The game runs very well on my 2080 Ti at 1440p but it's also clearly bumping into the memory ceiling. Performance then also scales fairly linearly to lower resolutions until my 9900k becomes the bottleneck at 720p.

I even need to disable my secondary monitor to save on VRAM. Keeping it enabled makes the game go over the 11GB budget and start swapping over PCIe.
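
(If anyone wants to watch that budget on their own machine, Windows exposes the per-adapter VRAM budget and current usage through DXGI's QueryVideoMemoryInfo. A minimal sketch, assuming adapter 0 is the discrete GPU and you link against dxgi.lib.)

Code:
// Sketch: query how much VRAM the OS is actually budgeting for your process.
// Assumes adapter 0 is the discrete GPU; link against dxgi.lib.
#include <windows.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter))) return 1;

    ComPtr<IDXGIAdapter3> adapter3;
    if (FAILED(adapter.As(&adapter3))) return 1;

    DXGI_QUERY_VIDEO_MEMORY_INFO local = {};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &local);

    // Budget is what the OS will let this process use before it starts paging
    // resources out over PCIe; a second monitor, browser, overlays, etc. all
    // eat into it, which is why disabling them frees up "VRAM".
    const double gb = 1024.0 * 1024.0 * 1024.0;
    std::printf("VRAM budget:   %.2f GB\n", local.Budget / gb);
    std::printf("Current usage: %.2f GB\n", local.CurrentUsage / gb);
    return 0;
}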
 

Gaiff

SBI’s Resident Gaslighter
Streaming assets from the SSD with on the fly decompression is one of the main things Cerny worked on with the console. It allows for higher quality assets and more variety without blowing out the available ram.
Cool. Now prove to me that it's the problem here.
There were 8GB variants of the 290X and the 390X (980 competitor) came with 8GB by default. The 1080 was not the 1st mainstream 8GB card at all. Maybe you ought to stop being confidently incorrect and instead try simply being correct.
Hence why I said "mainstream". The 8GB R9 290X was much rarer and the 390X was a rebadged R9 290X with higher clocks and memory. Otherwise, I would have named the Titan X from March 2015.
Do you remember the Fury X 4GB. At launch it was quite a bit faster than the R9 390X. A couple of years later once more games were built exclusively around PS4 and Xbox One the 390X would often be as fast with better minimums.
Again, where is that because that's a bunch of bullshit. You said that 8GB became the "minimum" somewhere in the 8th gen when those consoles have like 5.5GB of available RAM. When was 8GB ever the minimum during the 8th generation?
Also the 390X had longer legs than the 980 even though at launch the performance was about the same. The 970 also dropped off quite hard by mid gen due to only having 3.5GB of ram running at full speed at 0.5GB at reduced speed hence the 970 3.5GB memes.
Means shit and is a strawman. You made the claim that 8GB was the minimum when the 1060 to this day is chugging along with 6GB.

Now, let's see.

Here is the 970 running Dark Souls III at max settings 1080p.



Here it is in Watch_Dogs 2 at 1080p.



Here is the 2GB GTX 670 running Sekiro.



Your claim that 8GB was ever the minimum is completely bogus because even now for 1080p it's plenty. Hell, it's still enough for 1440p the majority of the time and even 4K. Show us those "later" 8th generation games maxing out a 6GB frame buffer.
 

rofif

Can’t Git Gud
That has nothing to do with any reply that I have made to you. I replied to your condescending post because you just assume that the person you're replying to (who doesn't want to use motion blur) is a "sheep". You didn't reply to someone who said, "Well my friend told me blah-blah-blah..." Replying to someone like that would have at least made sense. You replied to someone who said they still weren't going to use motion blur. Even though you had no idea WHY they weren't going to use motion blur, you called them a sheep. That's the arrogance and condescension that I'm talking about.
Dude I don't even remember who I replied to and what. Probably just a sheep anyway....
lol let it go who gives a shit
 

SHA

Member
That sole reason makes me recommend AMD more these days. My card is meant for 1440p, yet I can easily play many games at 4K at or above 60fps thanks to the VRAM and the Infinity Cache compensating for the bandwidth. Nvidia is no doubt better at the very high end and at RT, but that's it.
Yeah yeah, the tier Jensen Huang claimed you don't need. It doesn't change the fact that the 9800 GTX+ or whatever they had was a piece of garbage compared to its counterpart from AMD.
 

Marlenus

Member
Cool. Now prove to me that it's the problem here.

Hence why I said "mainstream". The 8GB R9 290X was much rarer and the 390X was a rebadged R9 290X with higher clocks and memory. Otherwise, I would have named the Titan X from March 2015.

Again, where is that because that's a bunch of bullshit. You said that 8GB became the "minimum" somewhere in the 8th gen when those consoles have like 5.5GB of available RAM. When was 8GB ever the minimum during the 8th generation?

Means shit and is a strawman. You made the claim that 8GB was the minimum when the 1060 to this day is chugging along with 6GB.

Now, let's see.

Here is the 970 running Dark Souls III at max settings 1080p.



Here it is in Watch_Dogs 2 at 1080p.



Here is the 2GB GTX 670 running Sekiro.



Your claim that 8GB was ever the minimum is completely bogus because even now for 1080p it's plenty. Hell, it's still enough for 1440p the majority of the time and even 4K. Show us those "later" 8th generation games maxing out a 6GB frame buffer.


Cool so 'mainstream' does not mean widely available and easy to buy but 'whatever I say it does to fit my prior incorrect statement'. Awesome.

As for games: RDR2, The Outer Worlds, Doom Eternal and Borderlands 3 all show the 2060 6GB starting to fall back relative to the performance tier it was at on launch, or as resolution increases. Then stuff like Rise of the Tomb Raider required tuning to get running smoothly on the 970.

The other issue here is that it is not just vram on its own that is the problem but the fact that the 3060 12GB in some cases can offer a better gaming experience than the 3060Ti, 3070, 3070Ti and rarely the 3080 10GB. That should never happen. In addition the 10 series was actually a pretty big leap forward because you went from a 6GB 980Ti to a 6GB 1060, 8GB 1070 and 1080 and an 11GB 1080Ti. NV were not stingy on VRAM with the 10 series at all and even the 20 series was pretty decent from a VRAM standpoint.

If Ampere had followed the 10 series' pattern, then the 3060 would have had the 12GB of VRAM it shipped with, the 3070 would have had 16GB, the 3080 would have had 20GB, and the 3090 would have had the 24GB it shipped with.

I can see the 5000 series offering another large jump in VRAM amounts, and 4060/3070 buyers are going to be in a similar spot to the one 970 buyers were in.

It is also worth mentioning that the 1060 6GB was $300. Given the $800 price of the 4070Ti and the rumoured $600 of the 4070, do you really think the 4060 8GB will be less than $400? I don't, and it will get beaten by the 3060 in some games, which is just nonsense.
 

Gaiff

SBI’s Resident Gaslighter
Cool so 'mainstream' does not mean widely available and easy to buy but 'whatever I say it does to fit my prior incorrect statement'. Awesome.
The Titan X was the first "widely available" card with an 8GB frame buffer or above. I explained very clearly why I named the 8GB 1080, and frankly, the 8GB on the 390X or 290X is utterly useless.
As for games RDR2, The Outer World's, Doom Eternal, Borderlands 3 all show the 2060 6GB starting to fall back relative to the performance tier it was at on launch or as resolution increases. Then stuff like Rise of the Tomb Raider required tuning to get running smoothly on the 970.
I'm sorry, but what? These games are simply more demanding because they're newer. It has nothing to do with 6GB suddenly becoming not enough. RDR2 on consoles has like half of its settings running on low and a bunch of them at lower than low.

This is RDR2 running on a 970 with much higher settings and fps than on consoles.



And lmao, DOOM Eternal? The game that can run on your coffee maker at 100fps?

[chart: DOOM Eternal 1080p Ultra benchmark]


98fps average at 1080p Ultra.

And of course cards will drop in performance as resolution increases, what the heck are you talking about? Resolution applies more pressure to everything, not just the frame buffer.

Rise of the Tomb Raider?

[chart: Rise of the Tomb Raider 1080p benchmark]

The other issue here is that it is not just vram on its own that is the problem but the fact that the 3060 12GB in some cases can offer a better gaming experience than the 3060Ti, 3070, 3070Ti and rarely the 3080 10GB. That should never happen. In addition the 10 series was actually a pretty big leap forward because you went from a 6GB 980Ti to a 6GB 1060, 8GB 1070 and 1080 and an 11GB 1080Ti. NV were not stingy on VRAM with the 10 series at all and even the 20 series was pretty decent from a VRAM standpoint.

If Ampere had done like the 10 series then the 3060 would have had 12GB of Vram like it shipped with while the 3070 would have had 16GB, the 3080 would have had 20GB and the 3090 would have had the 24GB it shipped with.

I can see 5000 series offering another large jump in vram amounts and 4060/3070 buyers are going to be in a similar spot to 970 buyers were.

It is also worth mentioning that the 1060 6GB was $300. Given the $800 price of the 4070Ti and the rumoured $600 of the 4070 do you really think the 4060 8GB will be less than $400? I don't and it will get beaten by the 3060 in some games which is just a nonsense.
Your claim that 8GB was becoming the "minimum" during the 8th gen is completely false. The freakin' 1060 6GB was one of the most popular cards, was never not enough, and demolished the consoles (except for the X1X) by a wide margin in every game. So far, you keep making claims yet haven't provided an iota of proof. The fact that 8GB is just now starting to become questionable (and in like 2-3 games) puts your arguments to rest. 8GB was more than enough the moment it became available and remained so for years. Hell, even 2GB remained viable pretty damn far into the 8th generation, so claiming that 8 was ever the least you could get away with is nothing short of a farce.

All the data I've shown also has the GPUs run the games at much higher settings than the consoles that tended to settle for a mix of medium and low with the odd high setting (such as texture quality). They were almost never high across the board, let alone max settings.
 

Calverz

Member
Well it doesn't mean much for those with issues, but my 3090Ti runs this like butter all maxed out @ 4K with DLSS Quality. In this game I actually prefer DLSS, it resolves some things better than native. Had no crashes; the only issues were the 18-minute shader cache build and the fucking noise my GPU made.
I found DLSS, even on Quality, was displaying weird bright spots on out-of-focus textures in cutscenes. Like white dots glimmering in the distance.
 

Thebonehead

Banned
I found dlss even on quality, was displaying weird bright spots on textures out of focus in cutscenes. Like white dots glimmering in the distance.

That happens without DLSS as well. It doesn't look good at all. Like a lens flare.

I haven't found a solution for it; I've tried turning off all the settings.

See below. Just above the green bin you can see a smudge moving when I pan around. There are actually several all over the screen, but they get lost in the upload to YouTube. In the top left it also comes across as very blurred (due to YouTube compression), but it is actually a series of lens-like flares that are very opaque.

Reminds me of the dirty screen effect

 

kittoo

Cretinously credulous
Can anyone answer two questions for me-
  • Why do I not see DLSS option (only FSR)? Running on a 3080
  • How do I turn off the dirt/specks on screen option? These are terrible and distracting.
Thank you.
 

Thebonehead

Banned
Can anyone answer two questions for me-
  • Why do I not see DLSS option (only FSR)? Running on a 3080
  • How do I turn off the dirt/specks on screen option? These are terrible and distracting.
Thank you.

I see the dlss option, although running on a 4090. Must be down to naughty bug striking again.

See my post above. Seems like this dirty lens effect is an intended "look".

Like you, I find it distracting and haven't found a way to turn it off.
 

SlimySnake

Flashless at the Golden Globes
Cool. Now prove to me that it's the problem here.

Hence why I said "mainstream". The 8GB R9 290X was much rarer and the 390X was a rebadged R9 290X with higher clocks and memory. Otherwise, I would have named the Titan X from March 2015.

Again, where is that because that's a bunch of bullshit. You said that 8GB became the "minimum" somewhere in the 8th gen when those consoles have like 5.5GB of available RAM. When was 8GB ever the minimum during the 8th generation?

Means shit and is a strawman. You made the claim that 8GB was the minimum when the 1060 to this day is chugging along with 6GB.

Now, let's see.

Here is the 970 running Dark Souls III at max settings 1080p.



Here it is in Watch_Dogs 2 at 1080p.



Here is the 2GB GTX 670 running Sekiro.



Your claim that 8GB was ever the minimum is completely bogus because even now for 1080p it's plenty. Hell, it's still enough for 1440p the majority of the time and even 4K. Show us those "later" 8th generation games maxing out a 6GB frame buffer.

Yeah, my GTX 570 with its 1.2GB of VRAM simply refused to play CoD: Advanced Warfare on anything but the lowest settings. They literally had everything greyed out. It was shocking because that card had no problems playing last-gen games like Tomb Raider, Far Cry 3 and others at a locked 1080p/60fps on high settings.

And yes, the GTX 970 was arguably the most popular card of that generation. And even with its split RAM pool, it ran everything from those early PS4 days at 1080p/60fps on high or better settings than the PS4. That was because, despite the 8GB RAM pool of the PS4, only 5GB was available to devs for games, and most devs used just 3GB of that as VRAM and the rest for the CPU and I/O.

GG released a breakdown of it at GDC.

[slides: Guerrilla Games GDC memory budget breakdown]


So if the 970 ran these games at double the fps and wasn't bottlenecked by the 5GB PS4, then the 10GB 3080 shouldn't be bottlenecked by the 12.5/13.5GB PS5. Not all of that memory is used for the GPU.
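
(Rough back-of-the-envelope version of that argument, reusing the split from the GDC breakdown above; every number below is an assumption for illustration, not a measurement.)

Code:
// Back-of-envelope check: if the PS5 splits its game-visible memory roughly
// the way the PS4 did (per the GDC breakdown above), how much of it is "VRAM"?
// Every number here is a rough assumption for illustration, not a measurement.
#include <cstdio>

int main() {
    const double ps4_game_ram  = 5.0;   // GB available to PS4 games
    const double ps4_gpu_share = 3.0;   // GB typically used as VRAM
    const double gpu_fraction  = ps4_gpu_share / ps4_game_ram;   // ~0.6

    const double ps5_game_ram  = 13.0;  // GB, midpoint of the 12.5-13.5 figure
    const double ps5_gpu_est   = ps5_game_ram * gpu_fraction;    // ~7.8 GB

    std::printf("Estimated PS5 GPU-side budget: ~%.1f GB\n", ps5_gpu_est);
    // => a 10-12GB card clears that estimate, *if* the port streams and packs
    //    memory as efficiently as the console does. The CPU cost of doing the
    //    decompression in software is a separate problem.
    return 0;
}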

Anyway, I played the game for a few hours and got to Bill's town. I stopped fucking around with settings, and while the game still crashes when I quit to the main menu, if I keep playing without going into the menus or changing settings on the fly, I'm not getting any memory leaks or stutters. So I sat down and played through a couple of hours with only slight stutters when going into new areas. 4K DLSS Quality at 60fps. Only the Bill's town forest area that Hardware Unboxed tested has brought it down to 55-58fps. It makes no sense because it is a tiny area that doesn't even look as good as Uncharted 4's forests, let alone TLOU2's gorgeous areas that are much larger with far more variety. This game should be running at native 4K/60fps on the PS5 on the TLOU2 engine.

I'd highly recommend people play this without going into the menus. Just change the settings in the main menu and stick with them. See if that works.
 

SlimySnake

Flashless at the Golden Globes


Hardware unboxed really looks bad in all this, already telling its viewers it’s about VRAM and making it about GPU vendors rather than inherent problems with the port.

His thesis is well supported by his benchmarks. He never said that VRAM is the only problem; he pointed out that the game is unoptimized. His point is that unoptimized PC ports are going to become the norm going forward as we begin to see more and more next-gen-only games. TLOU is barely even a last-gen game, but because it released on a single SKU, I bet ND said fuck it to optimization even on the PS5 and turned the settings up to 11. This kind of stuff is going to happen more and more, and having more VRAM seems to be the key.

We saw this with RE4 just last week. It literally crashes if you go over VRAM, so everyone who bought these Nvidia cards because they have better RT performance now has to turn off RT anyway, just like AMD GPU owners. So we paid a premium for what? DLSS?

And again, this and RE4 are not the only games. I simply could not run Gotham Knights with RT on despite having just 70% GPU utilization at 4K DLSS 60fps. It would all of a sudden drop to 4-5fps and I simply couldn't make sense of it. I had no idea it was a VRAM bottleneck because I've been trained to just look at GPU usage. But now it makes perfect sense. I ended up turning off RT and had no issues running the game.

Hogwarts is also a very memory-hungry game. I went out and bought 32GB of RAM, and while it helped with the regular modes, having 25GB sitting in RAM in RT mode did little to stop the stutters. Once again, it's obvious now that VRAM, not just system RAM, was the issue.

It's disheartening to play these games at 45-50% GPU utilization knowing that if I play at native 4K or with RT on, they will crash or stutter, not because I don't have a powerful enough GPU but because it doesn't have enough VRAM.
 

Buggy Loop

Member
It's disheartening to play these games at 45-50% GPU utilization knowing that if I play at native 4K or with RT on, they will crash or stutter, not because I don't have a powerful enough GPU but because it doesn't have enough VRAM.

Without going into just how bloated these games' VRAM usage is for not much to show for it, there are also RT memory leaks. On top of that, devs are being stupid enough to crash the game on VRAM overflow rather than just lowering performance. I can't recall a period in my past 30 years of PC gaming where this was happening. Their memory management is trash. You spill over VRAM? You transfer to RAM, like every other goddamn game. It's basic stuff. It'll slow down, it will stutter, but it shouldn't crash.
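
(That fallback is something the PC APIs let you do explicitly: if a device-local allocation fails, retry the resource in system memory that the GPU can still read over PCIe. A minimal D3D12-flavoured sketch for a streaming buffer, assuming committed resources for simplicity; real engines use pooled or placed allocations plus eviction, and textures carry extra layout restrictions not shown here.)

Code:
// Sketch: spill a streaming buffer to system memory instead of crashing when
// the VRAM allocation fails. Illustrative only; link against d3d12.lib.
#include <windows.h>
#include <d3d12.h>

static D3D12_RESOURCE_DESC BufferDesc(UINT64 bytes)
{
    D3D12_RESOURCE_DESC d = {};
    d.Dimension          = D3D12_RESOURCE_DIMENSION_BUFFER;
    d.Width              = bytes;
    d.Height             = 1;
    d.DepthOrArraySize   = 1;
    d.MipLevels          = 1;
    d.Format             = DXGI_FORMAT_UNKNOWN;
    d.SampleDesc.Count   = 1;
    d.SampleDesc.Quality = 0;
    d.Layout             = D3D12_TEXTURE_LAYOUT_ROW_MAJOR;
    return d;
}

HRESULT CreateStreamingBufferWithSpill(ID3D12Device* dev, UINT64 bytes,
                                       ID3D12Resource** out)
{
    const D3D12_RESOURCE_DESC desc = BufferDesc(bytes);

    // First choice: device-local VRAM (a default heap).
    D3D12_HEAP_PROPERTIES vram = {};
    vram.Type = D3D12_HEAP_TYPE_DEFAULT;
    HRESULT hr = dev->CreateCommittedResource(
        &vram, D3D12_HEAP_FLAG_NONE, &desc,
        D3D12_RESOURCE_STATE_COMMON, nullptr, IID_PPV_ARGS(out));
    if (SUCCEEDED(hr))
        return hr;

    // Over budget (typically E_OUTOFMEMORY): fall back to system memory that
    // the GPU reads over PCIe. Slower and stuttery, but the game keeps running.
    D3D12_HEAP_PROPERTIES sysmem = {};
    sysmem.Type = D3D12_HEAP_TYPE_UPLOAD;
    return dev->CreateCommittedResource(
        &sysmem, D3D12_HEAP_FLAG_NONE, &desc,
        D3D12_RESOURCE_STATE_GENERIC_READ, nullptr, IID_PPV_ARGS(out));
}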

I'm not worried for The Last of Us because Sony and Naughty Dog won't leave things as they are, but Capcom... I doubt we'll see a patch.
 

ChiefDada

Gold Member
Hardware unboxed really looks bad in all this, already telling its viewers it’s about VRAM and making it about GPU vendors rather than inherent problems with the port.

No, they clearly acknowledge there are negative aspects of the port that ND/Sony deserve flak for, such as the bugs. But to deny that VRAM is a significant bottleneck with this game is nothing short of delusional.

 


Hardware unboxed really looks bad in all this, already telling its viewers it’s about VRAM and making it about GPU vendors rather than inherent problems with the port.

So when a game doesn't play well on Nvidia hardware for several reasons (lack of VRAM, useless RT, optimized for RDNA2), this is his answer: the game is unoptimized? Soon he is going to talk about the Nvidia tools that aren't ready, isn't he?
 

Thebonehead

Banned
His thesis is well supported by his benchmarks. he never said that vram is the only problem. He pointed out that game is unoptimized. His point is that PC ports that are unoptimized are going to become the norm going forward as we begin to see more and more next gen only games. TLOU is barely even a last gen game but because it released on a single SKU i bet ND said fuck it to optimization even on the PS5 and turned up the settings to 11. This kind of stuff is going to happen more and more and having more vram seems to be the key.

We saw this with RE4 just last week. Literally crashing if you go over VRAM so everyone who bought these nvidia cards because they have better RT performance now have to turn off RT anyway just like AMD GPU owners. So we paid premium for what? DLSS?

And again, this and RE4 are not the only games. I simply could not run Gotham Knights with RT on despite having just 70% GPU utilization at 4k dlss 60 fps and RT on. It would all of a sudden drop to 4-5 fps and i simply couldnt make sense of it. I had no idea it was a VRAM bottleneck because ive been trained to just look at GPU usage. But now it makes perfect sense. I ended up turning off RT and had no issues running the game.

Hogwarts is also a very memory hungry game. i went out and bought 32 GB of RAM and while it helped with regular modes, having 25 GB sitting on RAM in RT mode did little to stop the stutters. Once again, its obvious now that vram, not just system ram was the issue.

Its disheartening to play these games at 45-50% GPU utilization knowing that if i play this at native 4k or with RT on, it will crash or stutter not because i dont have a powerful enough GPU but because it didnt have enough vram.
4090, 12900K, 32GB RAM, 6,600MB/s SSD here.

Gotham Knights had the same drops for me, and Hogwarts was also an initial stutter-fest until one of the recent patches.

No lack of VRAM for me causing it to stutter, just poor initial optimisation from the developers.

TLOU is unplayable using a mouse as it has horrible camera-panning stutter. Use a controller and it's smooth as butter. It has random drops in areas, like just walking up some stairs in the warehouse, where the frame rate will crash and VRAM utilisation shoots up, and you can repeat it. Something is way off with the engine.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
The game does not seem as bad as some make out.

The issues are that it is quite CPU heavy, because the PS5 has dedicated decompression hardware and PCs don't, so that work falls on the CPU cores, and it uses a lot of VRAM because the port does not use DirectStorage.

Bottom line is that 8GB is not enough VRAM for a solid mid-range card anymore. This was obvious once the console specs were known; it just so happens that there was a longer-than-usual cross-gen period, so the jump has taken longer to arrive, which lulled some people into a false sense of security.

Going forward I would say 12GB is entry level, 16GB will be mid-range, 24GB high end and 32GB enthusiast tier.

8GB will be relegated to e-sports and ideally sub-$200 products.

We had the exact same situation play out when the PS4 and Xbox One launched: after their cross-gen period, 6GB cards started to struggle and 8GB became the new minimum requirement.
If you've got a midrange card... don't play with Ultra settings.
 

Marlenus

Member
The Titan X was the first "widely available" card with an 8GB frame buffer or above. I explained very clearly why I named the 8GB 1080, and frankly, the 8GB on the 390X or 290X is utterly useless.

I'm sorry but what? These games are simply more demanding because they're newer. Has nothing to do with 6GB suddenly becoming not enough. RDR2 on consoles has like half of its settings running on low and a bunch of them at lower than low.

This is RDR2 running on a 970 with much higher settings and fps than on consoles.



And lmao, DOOM Eternal? The game that can run on your coffee maker at 100fps?

[chart: DOOM Eternal 1080p Ultra benchmark]


98fps average at 1080p Ultra.

And of course cards will drop in performance as resolution increases, what the heck are you talking about? Resolution applies pressure more pressure to everything, not just the frame buffer.

Rise of the Tomb Raider?

[chart: Rise of the Tomb Raider 1080p benchmark]


Your claim that 8GB was becoming the "minimum" during the 8th gen is completely false. The freakin' 1060 6GB was one of the most popular cards and was never not enough and demolished the consoles except for the X1X by a wide margin in every game. So far, you keep making claims yet haven't provided a iota of proof. The fact that 8GB is just now starting to become questionable (and in like 2-3 games) puts your arguments to rest. 8GB was more than enough the moment it became available and remained so for years. Hell, even 2GB remained viable pretty damn far into the 8th generation so claiming that 8 was ever the least you could get away with is nothing short of a farce.

All the data I've shown also has the GPUs run the games at much higher settings than the consoles that tended to settle for a mix of medium and low with the odd high setting (such as texture quality). They were almost never high across the board, let alone max settings.


The 1060 6GB with lower minimums than 8GB cards (shame the forum won't let me load the image). TPU have only recently started showing minimum frames, and the average does not tell the entire story, as you should be fully aware.

RE2 shows the limits of 4GB pretty well, with the 0.1% lows tanking even though the 1% lows and average look okay. This will cause small stutters. 6GB does hang on in this title, though.

[chart: Resident Evil 2 1080p benchmark]


If you look at around the time the 580 and 1060 launched, they were pretty neck and neck.

[chart: 1060 vs 580 average performance at launch]


Then in the early-2020 retrospective we had this, with some big wins in titles that were more recent at the time.

[chart: early-2020 retrospective, 1440p]


And it even showed up at 1080p.

[chart: early-2020 retrospective, 1080p]


If you've got a midrange card... don't play with Ultra settings.

I would not call a 3070 or 3070Ti at MSRP 'mid range' when a 1070 was $380.
 

Lysandros

Member


Hardware unboxed really looks bad in all this, already telling its viewers it’s about VRAM and making it about GPU vendors rather than inherent problems with the port.

Take your usual crude condescension and stick it to your Crysis shrine, Ô all-knowing Dick-tator.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I would not call a 3070 or 3070Ti at MSRP 'mid range' when a 1070 was $380.
Then what cards are you talking about?


And perchance, do they fall in the mid range of most performance charts?
 

Gaiff

SBI’s Resident Gaslighter
1060 6GB with lower minimums than 8GB cards - shame the forum won't let me load the image. TPU have only recently started showing minimum frames and the average does not tell the entire story as you should be fully aware.
Where? Even the 4GB 970 has higher minimums than the 8GB 390X.

[charts: minimum-fps comparisons, 970 vs 390X]


Or are you talking about this, where the 6GB 980 Ti still has higher minimums than other 8GB cards (including the 1070)? And if you think the 1060 6GB is suffering because of its VRAM, it's not; otherwise the 980 Ti would suffer too. The problem is the paltry 192GB/s of bandwidth compared to the 256GB/s of the 580.

[chart: Kitguru minimum-fps comparison with 2x SSAA]


And do keep in mind, Kitguru tested the second set with 2x SSAA, which is extremely punishing on the GPU. With just SMAA, as shown above, you get way, way higher fps.
RE 2 shows the limits of 4GB pretty well with the 0.1% lows tanking even though 1% and average looks okay. This will cause small stutters. 6GB does hang on in this title though
Yeah so your claim that "even 6GB started to struggle" is bogus.
If you look at when around the time the 580 and 1060 they were pretty neck and neck.

Then in the retro in early 2020 we had this and some big wins in more recent at the time titles.

And it even showed up at 1080p.
It had nothing to do with VRAM but with AMD getting their shit together on drivers, with their older cards benefiting. Even their 4GB models saw performance improvements relative to NVIDIA. It's cute that you automatically assume that VRAM is the bottleneck when powerful cards such as the 980 Ti don't buckle like other, weaker cards. There are factors other than resolution, such as bandwidth, that will play a bigger role than VRAM most of the time, because performance will suffer regardless of being above the minimum specs.

Last but not least, these benchmarks are all on Very High, which is comically higher than what the consoles run this game at.

Shadows: Medium
Dynamic Foliage: Medium
LOD: High
Textures: High
Sun Soft Shadows: Off



A little 2GB GTX 960 is embarrassing the Xbox One.

Your argument doesn't hold up to scrutiny. You claim that because of the consoles, as the generation wound down, the VRAM requirements increased, but that's blatantly false. It has nothing to do with the consoles, which kept pushing much lower settings than the High presets on PC. As shown above, the 970 with console settings has no problem whatsoever maintaining 60fps at 1080p and higher settings than the X1. Then you show Very High benchmarks and grossly misinterpret minimum frame rates as automatically meaning VRAM bottlenecks, yet blatantly ignore the fact that the presets used are maxed out. Newsflash: technology advances. ROTTR looks better than its predecessor, has higher-quality assets and is more demanding across the board. The same trend continues with Shadow of the Tomb Raider. Consoles aren't responsible for this; tech simply moves forward. And no, pushing 2x SSAA to crush the bandwidth of a 1060 doesn't prove that 6GB is being exceeded. The last-gen consoles have 5.5GB of usable VRAM, not 8.
 

Kenpachii

Member
Streaming assets from the SSD with on the fly decompression is one of the main things Cerny worked on with the console. It allows for higher quality assets and more variety without blowing out the available ram.

There were 8GB variants of the 290X and the 390X (980 competitor) came with 8GB by default. The 1080 was not the 1st mainstream 8GB card at all. Maybe you ought to stop being confidently incorrect and instead try simply being correct.

Do you remember the Fury X 4GB. At launch it was quite a bit faster than the R9 390X. A couple of years later once more games were built exclusively around PS4 and Xbox One the 390X would often be as fast with better minimums.

Also the 390X had longer legs than the 980 even though at launch the performance was about the same. The 970 also dropped off quite hard by mid gen due to only having 3.5GB of ram running at full speed at 0.5GB at reduced speed hence the 970 3.5GB memes.

The 980 competed with the 290, not the 390. I had 2x 290s, ditched them for a 970 and never looked back; the 290s were garbage. Hell, the 980 is basically a 1060 6GB, which was the most-sold card and stayed in use for years. The 980 aged like fine wine.
The 970 aged like fine wine too, and memory was never an issue; I had one for 7 years.

The Fury X was a dogshit card even at launch.
 

The_Mike

I cry about SonyGaf from my chair in Redmond, WA
I find it crazy that people in here are defending Sony by saying they are still trying to make PC games, and people should give them a free pass.

If I pay 70 euro for a game - which I don't, because no game is worth it imo - I expect the damn game to work. If you sell a product, then it is expected that the game works.
 

rodrigolfp

Haptic Gamepads 4 Life
I find it crazy that people in here are defending Sony by saying they are still trying to make PC games, and people should give them a free pass.
It's a new platform that they don't also use for game development. Give them time. /s

Also they didn't have games like Planetside 2 and DCUO.
 

Fake

Member
I find it crazy that people in here are defending Sony by saying they are still trying to make PC games, and people should give them a free pass.

If I pay 70 euro for a game - which I dont, because no games is worth it imo - I expect the damn game to work. If you sell a product, then it is expected that the game works.

Fanboys plus ND fanboys. A bad port is a bad port no matter what.
 