
The Last of Us Part 1 on PC; another casualty added to the list of bad ports?

winjer

Gold Member

The-Last-of-Us-Part-I-CPU-benchmarks.png


The game scales with many threads. The sweet spot is 6C/12T. But using 8C/16T still brings a good bump in performance.
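A quick way to eyeball that kind of scaling yourself is to time a fixed CPU-bound workload at different worker counts and see where the gains flatten out. Here is a minimal, generic Python sketch of the idea (nothing to do with the benchmark's actual methodology above):

```python
# Minimal sketch: time a fixed CPU-bound workload at different worker counts
# to see where extra cores stop helping. Purely illustrative, not the
# methodology behind the chart above.
import os
import time
from concurrent.futures import ProcessPoolExecutor

def burn(n: int) -> int:
    # Arbitrary CPU-bound task standing in for game work (physics, decompression, ...)
    total = 0
    for i in range(n):
        total += i * i
    return total

def measure(workers: int, jobs: int = 64, size: int = 200_000) -> float:
    start = time.perf_counter()
    with ProcessPoolExecutor(max_workers=workers) as pool:
        list(pool.map(burn, [size] * jobs))
    return time.perf_counter() - start

if __name__ == "__main__":
    for workers in (2, 4, 6, 8, 12, 16):
        if workers <= (os.cpu_count() or 1):
            print(f"{workers:2d} workers: {measure(workers):.2f}s")
```

On most CPUs the timings stop improving once the pool saturates the physical cores, which lines up with the 6C/12T sweet spot in the chart.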
 

Marlenus

Member
The 980 competed with the 290, not the 390. I had 2x 290s, ditched them for a 970 and never looked back. 290s were garbage. Hell, the 980 is basically a 1060 6GB, which was the most sold card and used for years. The 980 aged like fine wine.
The 970 aged like fine wine and memory was never an issue; I had one for 7 years.

The Fury X was a dogshit card even at launch.

The 290X came out about the same time as the 780Ti and they traded blows. The 970 launched 10 months after the 290 and 9 months before the 390, so it was bang in the middle.

Still, in 2019 TechSpot did a revisit and found:

Closing Remarks
There you have it, the GTX 970 went from ~10-15% faster four years ago to a percent faster in 2019 against the Radeon R9 290 based on our 33 game test sample that includes many newer titles.

Hitman 2 and RE2 also show the 4GB-and-below cards losing hard vs the 6GB-and-above cards. Something that would have been more visible if a 390 had been included in that revisit, but comparing the 1060 3GB vs 6GB tells the story.
 

M1chl

Currently Gif and Meme Champion
Not all devs can be id Software :goog_neutral: Still, it is surprising, given Naughty Dog is well known for their optimization work on consoles. Also given what Guerrilla was able to do with Decima on PC (+ released a few open-source repos), where Death Stranding still looks incredible yet can run on a toaster.

The CPU usage especially is kind of a lot for a corridor game, but my personal opinion is that they are still using some form of ACE programming (GPU compute doing CPU stuff) and they just didn't want to bother with OpenCL or CUDA, so they just emulate it on the CPU.
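That's pure speculation, of course, but the general idea of taking a GPU-style data-parallel kernel and fanning it out across CPU worker threads would look roughly like this toy Python sketch (illustrative only, not Naughty Dog's code; the kernel is a made-up stand-in):

```python
# Toy illustration of the speculation above: a data-parallel "kernel" that a
# GPU's async compute queues would normally chew through, fanned out across
# CPU worker threads instead. NOT the game's actual code.
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def skinning_kernel(chunk: np.ndarray) -> np.ndarray:
    # Stand-in for per-element GPU work (e.g. vertex skinning, particle updates)
    return chunk * 0.5 + np.sin(chunk)

def dispatch_on_cpu(data: np.ndarray, workers: int = 8) -> np.ndarray:
    # Split the buffer into chunks and process them in parallel, the way a
    # compute dispatch would be split into thread groups on the GPU.
    chunks = np.array_split(data, workers)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(skinning_kernel, chunks))
    return np.concatenate(results)

vertices = np.random.rand(1_000_000).astype(np.float32)
out = dispatch_on_cpu(vertices)
```

If something like this were happening every frame for work the PS5 would push to its async compute queues, unusually high CPU load wouldn't be surprising.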
 

Buggy Loop

Member
So when a game doesn't play well on Nvidia hardware for several reasons (lack of VRAM, useless RT, optimized for RDNA2), this is his answer: the game is unoptimized? Soon he is going to talk about the Nvidia tools that aren't ready, isn't he?

So you’re saying the game is optimized?

The Office Smile GIF
 

Marlenus

Member
Then what cards are you talking about?


And perchance do they fall in the mid range of most performance charts?

3070 when not vram limited matches or exceeds the 2080Ti. I would class that and the 3080 as high end and then the 3090 as enthusiast.

3060 is mid range to me in terms of price and performance and with 12GB of ram it will have better legs than competing parts like the 6600XT which you can already see in some games.

3060Ti is upper mid and is starting to bridge the gap between mid and high end.

So with inflation I could see mid range creeping up to around $370 and high end starting at $500.

That would put a 12GB 4070 as a high end part if $600 is true and I just don't think 12GB is enough for a high end part.

In a similar fashion a $400+ 4060 is just too much for 8GB of vram especially when the 3060 will beat it in some titles for that very reason.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
3070 when not vram limited matches or exceeds the 2080Ti. I would class that and the 3080 as high end and then the 3090 as enthusiast.

3060 is mid range to me in terms of price and performance and with 12GB of ram it will have better legs than competing parts like the 6600XT which you can already see in some games.

3060Ti is upper mid and is starting to bridge the gap between mid and high end.

So with inflation I could see mid range creeping up to around $370 and high end starting at $500.

That would put a 12GB 4070 as a high end part if $600 is true and I just don't think 12GB is enough for a high end part.

In a similar fashion a $400+ 4060 is just too much for 8GB of vram especially when the 3060 will beat it in some titles for that very reason.

When the 3070, 3080 and 3090 dropped, the 2080Ti became a midrange card.
You can't keep calling a top-range card top range generations later.

So today the midrange is still the 3070 cuz in performance charts it's in the mid range of said chart.
The 2080Ti as well.
The range-topping 980Ti can't be called an enthusiast or high-end card anymore when it gets decked by GTX 1070s and low-end RTX 2060 Supers handily.
 
So you’re saying the game is optimized?

The Office Smile GIF
Barring some first-party games on consoles, how many PC games (or even console games) are actually optimized sufficiently? That port is no worse than many other multiplats that released in a disastrous state on PC (or consoles) and then got patched. The game seems to run OK on machines with a strong CPU and enough VRAM. And just look at RE4 on PS5: do you think it was optimized enough compared to the PC or even Xbox versions?

What they (PCMR) can't stand is that the PS5 is here comparatively outperforming their $3000 plastic box, and they'll never acknowledge the fact that this console is a much better machine at managing data because it has been designed from the start for that. They (specifically Dictator) have been constantly downplaying the PS5's focus on I/O ever since the first Cerny talk, even though the latter understood that the main bottleneck this gen would be I/O and neither the CPU nor the GPU.
 

simpatico

Member
Atomic Heart was made by 6 people in an abandoned warehouse and they managed to make a great-looking game that ran great on day 1. I think after its release, situations like this look even worse. I wonder if there was a small exodus during TLOU2 development whose effects haven't been fully felt yet.
 

M1987

Member
Atomic Heart was made by 6 people in an abandoned warehouse and they managed to make a great-looking game that ran great on day 1. I think after its release, situations like this look even worse. I wonder if there was a small exodus during TLOU2 development whose effects haven't been fully felt yet.
Was it really made by 6 people? No matter how many people it was made by, it looks and runs smooth af on PC, so Naughty Dog have no excuse releasing this shit. Some of the bugs are completely embarrassing.
 

Spyxos

Gold Member
Was it really made by 6 people? No matter how many people it was made by, it looks and runs smooth af on PC, so Naughty Dog have no excuse releasing this shit. Some of the bugs are completely embarrassing.
6 people sounds very unbelievable. Maybe they started with 6 people and got bigger later.
 

IFireflyl

Gold Member
Dude I don't even remember who I replied to and what. Probably just a sheep anyway....
lol let it go who gives a shit

So your response is just to double-down and continue calling people who disagree with you "sheep". You are the reason PC gamers have a bad reputation. Get over yourself.
 

rofif

Can’t Git Gud
So your response is just to double-down and continue calling people who disagree with you "sheep". You are the reason PC gamers have a bad reputation. Get over yourself.
Now that's a new one, someone calling me a PC gamer lol.
In fact, PC gamers turning off motion blur outright after launching the game is a BAD REP
 

Buggy Loop

Member
Barring some first-party games on consoles, how many PC games (or even console games) are actually optimized sufficiently? That port is no worse than many other multiplats that released in a disastrous state on PC (or consoles) and then got patched. The game seems to run OK on machines with a strong CPU and enough VRAM. And just look at RE4 on PS5: do you think it was optimized enough compared to the PC or even Xbox versions?

What they (PCMR) can't stand is that the PS5 is here comparatively outperforming their $3000 plastic box, and they'll never acknowledge the fact that this console is a much better machine at managing data because it has been designed from the start for that. They (specifically Dictator) have been constantly downplaying the PS5's focus on I/O ever since the first Cerny talk, even though the latter understood that the main bottleneck this gen would be I/O and neither the CPU nor the GPU.

This is why we need the 🤡 reaction

You're regurgitating PR. 🤮

There's nothing in The Last of Us Part 1 that even requires the PS5. It's basically based on the PS4's The Last of Us Part II, and even that looks better. They didn't need to fundamentally change the engine for streaming; it's really standard-fare level design.

Even more cringe than people calling themselves PCMR (because that IS cringe) is console fanboys labelling any critique of their pony as PCMR with $3k PCs.




Try ~$800 for console equivalence.

Returnal, which was made from the ground up for PS5 and ran at 1080p internally?




2060 Super

A Plague Tale: Requiem, which is much prettier?




2070 Super, before the huge optimization patch on PC.

In fact, the I/O/streaming secret sauce is such bullshit when talking about The Last of Us Part 1 that you can literally use the 60 fps patch with the game installed on an external HDD. It's literally a PS4 game at a higher resolution.

The game that really pushes I/O streaming hasn't been released yet; maybe the Ratchet and Clank port will, but then again, it's all speculation. It still runs fine even on the slowest compatible SSD, and switching environments on the fly isn't something new.
 

IFireflyl

Gold Member
Now that's a new one, someone calling me a PC gamer lol.
In fact, PC gamers turning off motion blur outright after launching the game is a BAD REP

Also you:

I have been into PC gaming since the 90s. Don't school me on how to use my PC.

I am a PC gamer myself. I just don't see why limit yourself ... the stubborn PC gamer is the only one who loses here

Are you talking to me?
I have been a PC gamer for at least 25 years now.
I had the first G-Sync monitors, 240hz, 144hz, everything.
I play 4K 120fps OLED on an RTX 3080 daily.
I play exclusively at 120fps on PC... what is your case again?

The thread is not about me.
It's about how badly devs represent 30fps now. It can be 75ms faster ffs

I see. I am also a PC gamer, but over the years I found that I mostly buy and collect Steam games rather than play them :p

Exactly this… this elite PC gamer thinking.
You have everything to lose and you are losing a war with yourself.
Extreme prices. More and more stores, only digital.
And don't get me wrong. I've got a 3080 PC. I have been a PC gamer since 1997… And I am hoping PC will not win.

I have been a PC gamer my whole life. I had my first PC in 1997. I still value consoles. I had a 360, PS4 and now a PS5, all alongside a good PC. Now even a 3080.
And yet I still find console gaming way more carefree. You go and play. Even on a good PC there is always tweaking, and it costs much more.

Anyway - I play on both. Right now you can get a 4K console with no loading times (still not possible on PC) for 400 USD. You cannot even get a 4K GPU for that on PC.
But like you say - I value single-player games the most. I always did. Even in 1997 on PC.

And most games are on consoles. It's the other way around, with PC not having some games.

PC gaming is a pain in the ass and not worth the hassle over just playing games (that said, I am a PC gamer lol). It's not even about the price but just nothing ever working right.
Forza 3, 4, 5 all suck the same way. The dick-sucking cringe festival and random race structure.
Doom Eternal is much worse than 2016.

I am not anti-PC!!! I love playing at the highest spec. I love PC gaming.
Like above, I am just less and less tolerant of its value, stores, bad ports and annoyances.

Why are you people confusing not being blind to flaws with trolling and hating the platform?

I am maybe jaded as fuck and I really enjoy the value and plug-and-play nature of console gaming… but I am still a PC gamer at my core. I want DLSS, 4K, 120 fps and G-Sync.

Stupidity Are You Stupid GIF
 

SlimySnake

Flashless at the Golden Globes
Great video. Shows just how poor the CPU optimization is even in areas where there is nothing going on. VRAM usage on 1080p low is 6 GB. This is a setting where the textures literally don't load. 6GB.

Naughty Gods have been humbled.
 

Buggy Loop

Member
Great video. Shows just how poor the CPU optimization is even in areas where there is nothing going on. VRAM usage on 1080p low is 6 GB. This is a setting where the textures literally don't load. 6GB.

Naughty Gods have been humbled.


This is an aberration in VRAM usage. If the Oodle version they are using is bugged like early reports suggest, it's probably bloated.
 

MikeM

Member
This is why we need the 🤡 reaction

You're regurgitating PR. 🤮

There's nothing in The Last of Us Part 1 that even requires the PS5. It's basically based on the PS4's The Last of Us Part II, and even that looks better. They didn't need to fundamentally change the engine for streaming; it's really standard-fare level design.

Even more cringe than people calling themselves PCMR (because that IS cringe) is console fanboys labelling any critique of their pony as PCMR with $3k PCs.




Try ~$800 for console equivalence.

Returnal, which was made from the ground up for PS5 and ran at 1080p internally?




2060 Super

A Plague Tale: Requiem, which is much prettier?




2070 Super, before the huge optimization patch on PC.

In fact, the I/O/streaming secret sauce is such bullshit when talking about The Last of Us Part 1 that you can literally use the 60 fps patch with the game installed on an external HDD. It's literally a PS4 game at a higher resolution.

The game that really pushes I/O streaming hasn't been released yet; maybe the Ratchet and Clank port will, but then again, it's all speculation. It still runs fine even on the slowest compatible SSD, and switching environments on the fly isn't something new.

PCMR is fucking irritating and I say that as a 7900xt PC owner.

“OMGZ PC RULZ CONSOLE SUKZ”

“I NEED 200FPS OR ELSE EVERYTHING IS TRASH”

Why are console players looked down upon? Video games are video games whether played at 30fps or 200fps. Let people enjoy their games the way they want.

I also find it hypocritical when PCMR hates on current gen consoles but then praises the Steamdeck even though it runs at performance and resolution levels far lower than current gen consoles.

I still play my console heavily even though I have a high end PC. Because games are fucking games. The sooner this insecure PCMR (and really any masterrace stuff) circle jerk dies (probably never, or when they move out of mom’s basement/touch grass), the better.
 

SlimySnake

Flashless at the Golden Globes

DF noticed that the stutters and poor performance are tied to loading behind the scenes. The game is essentially streaming in data for every single environment you enter, and if you just wait for a minute, the performance goes up by 25%. lol.

What's hilarious is that these are still PS3-sized levels using PS4-quality textures. For them to be taking THAT long to stream in data for tiny levels while utilizing 100% of the Ryzen 3600 is nuts.

They must have utilized Cerny's I/O to do a lot of this heavy loading and couldn't figure out an optimal way to do it on PC.
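If that read is right, the usual fix is to move loading and decompression onto a background thread and only hand finished results to the render loop, so a frame never blocks on the disk. A generic, hedged sketch of that pattern (not the port's actual streaming code):

```python
# Illustrative asset-streaming sketch: a background thread loads/decompresses
# data while the main loop keeps rendering, so the frame never blocks on I/O.
# This is the generic pattern, not the game's implementation.
import queue
import threading
import time

pending = queue.Queue()   # asset names requested by the game
ready = queue.Queue()     # decoded assets ready for the renderer

def streamer() -> None:
    while True:
        name = pending.get()
        if name is None:          # shutdown signal
            break
        time.sleep(0.05)          # stand-in for disk read + decompression
        ready.put((name, b"decoded-data"))

threading.Thread(target=streamer, daemon=True).start()

for frame in range(120):
    if frame == 0:
        for asset in ("env_geometry", "textures_hi", "audio_bank"):
            pending.put(asset)
    # Drain whatever finished this frame without blocking the frame itself.
    while not ready.empty():
        name, _data = ready.get_nowait()
        print(f"frame {frame}: {name} streamed in")
    time.sleep(1 / 60)            # stand-in for rendering one frame

pending.put(None)
```

When the streaming work instead competes with the frame on the same threads, you get exactly the behaviour described above: hitches while data comes in, then performance recovering once the level has finished loading.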
 

Thebonehead

Banned
Barring some first-party games on consoles, how many PC games (or even console games) are actually optimized sufficiently? That port is no worse than many other multiplats that released in a disastrous state on PC (or consoles) and then got patched. The game seems to run OK on machines with a strong CPU and enough VRAM. And just look at RE4 on PS5: do you think it was optimized enough compared to the PC or even Xbox versions?

What they (PCMR) can't stand is that the PS5 is here comparatively outperforming their $3000 plastic box, and they'll never acknowledge the fact that this console is a much better machine at managing data because it has been designed from the start for that. They (specifically Dictator) have been constantly downplaying the PS5's focus on I/O ever since the first Cerny talk, even though the latter understood that the main bottleneck this gen would be I/O and neither the CPU nor the GPU.

PS5 only owners sure have such a weird hate boner for Alex.

tumblr_m7tt5uCBwk1r4gei2o3_400.gifv



tumblr_m7tt5uCBwk1r4gei2o5_400.gif



tumblr_m7tt5uCBwk1r4gei2o6_400.gif
 

Marlenus

Member
When the 3070, 3080 and 3090 dropped, the 2080Ti became a midrange card.
You can't keep calling a top-range card top range generations later.

So today the midrange is still the 3070 cuz in performance charts it's in the mid range of said chart.
The 2080Ti as well.
The range-topping 980Ti can't be called an enthusiast or high-end card anymore when it gets decked by GTX 1070s and low-end RTX 2060 Supers handily.

If the 3070 had dropped in price, perhaps, but at £500 I still consider that the start of the high-end segment, and it should provide a very good 1440p and decent 4K experience. I don't think the performance matches the price tbh.

Also, the 3060 is showing that in some titles at certain settings it can offer a better gameplay experience. That should not happen, and it is more than just this one title.

Where? Even the 4GB 970 has higher minimums than the 8GB 390X.

KgKOBBa.png

11muTSN.png


Or are you talking about this, where the 6GB 980 Ti still has higher minimums than other 8GB cards (including the 1070)? And if you think the 1060 6GB is suffering because of its VRAM, it's not; otherwise, the 980 Ti would suffer too. The problem is the paltry 192GB/s bandwidth compared to the 256GB/s of the 580.

gtjjt6i.png


And do keep in mind, KitGuru tested the second set with 2x SSAA, which is extremely punishing on the GPU. With just SMAA as shown above, you get way, way higher fps.

Yeah so your claim that "even 6GB started to struggle" is bogus.

Had nothing to do with VRAM but AMD getting their shit together with their drivers and their older cards benefitting. Even their 4GB models saw performance improvements compared to NVIDIA. It's cute that you automatically assume that VRAM is the bottleneck, when powerful cards such as the 980 Ti don't buckle like other weaker cards. There are factors other than resolution, such as bandwidth, that will play a bigger role than VRAM most of the time, because performance will suffer regardless of being above the minimum specs.

Bus width is linked to VRAM allocation. It is not like we have a 1060 12GB to determine whether slowdown is due to bus width or due to memory amount. It's probably more the former in the majority of cases, because the entire 10 series were very well-balanced parts with an appropriate amount of VRAM for the amount of performance on offer.
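For reference, the coupling works roughly like this: a card gets one 32-bit memory channel per chip, so capacity is (bus width / 32) chips times the chip density, and bandwidth is bus width times the per-pin data rate. A quick back-of-the-envelope sketch using the cards' published memory specs (treat the figures as approximate):

```python
# Back-of-the-envelope: how bus width ties together both VRAM capacity and
# bandwidth. Specs below are the cards' published memory configurations.
def bandwidth_gbps(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits * gbps_per_pin / 8          # bits/s -> bytes/s

def capacity_gb(bus_bits: int, gb_per_chip: int) -> int:
    return (bus_bits // 32) * gb_per_chip       # one 32-bit channel per chip

cards = {
    "GTX 1060 6GB (GDDR5 @ 8 Gbps)": (192, 8.0, 1),
    "RX 580 8GB (GDDR5 @ 8 Gbps)":   (256, 8.0, 1),
    "RTX 2060 6GB (GDDR6 @ 14 Gbps)": (192, 14.0, 1),
}
for name, (bus, rate, density) in cards.items():
    print(f"{name}: {capacity_gb(bus, density)} GB, "
          f"{bandwidth_gbps(bus, rate):.0f} GB/s")
```

Which is also why a hypothetical 1060 12GB would have needed 2GB chips on the same 192-bit bus: the two variables move together by design, so it is genuinely hard to untangle them after the fact.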

You can start to see the limits of 6GB though here.

WD_1440p.png


Even though the 2060 has more bandwidth than the 1070 and 1080, and better colour compression as well, the 1% lows are falling quite a long way behind those of the 1080.

borderlands-3-1920-1080.png


Here the 2060 is falling behind the normally slower 8GB 10 series parts

Same thing in Anno.

anno-1800-1920-1080.png


And as we saw with Watch Dogs Legion the averages don't tell the whole story at all so who knows what other games TPU tested where the average looked okay but the 1% lows were worse.

Last but not least, these benchmarks are all on Very High, which is comically higher than what the consoles run this game at.

Shadows: Medium
Dynamic Foliage: Medium
LOD: High
Textures: High
Sun Soft Shadows: Off



A little 2GB GTX 960 is embarrassing the Xbox One.

Your argument doesn't hold up to scrutiny. You claim that because of the consoles, as the generation wound down, the VRAM requirements increased, but that's blatantly false. It has nothing to do with the consoles, which kept pushing much lower settings than the High presets on PC. As shown above, the 970 with console settings has no problem whatsoever maintaining 60fps at 1080p and higher settings than the X1. Then you show Very High benchmarks, grossly misinterpret minimum frame rates as automatically meaning VRAM bottlenecks, yet blatantly ignore the fact that the presets used are maxed out. Newsflash: technology advances. ROTTR looks better than its predecessor, has higher-quality assets and is more demanding across the board. The same trend continues with Shadow of the Tomb Raider. Consoles aren't responsible for this. Tech simply moves forward. And no, pushing 2x SSAA to crush the bandwidth of a 1060 doesn't prove that 6GB is being exceeded. The last-gen consoles have 5.5GB of usable VRAM, not 8.


I made no comment about console settings. I simply said that 6GB started to struggle, and we can see that it clearly did in some titles, either due to lack of bandwidth or due to VRAM constraints, which go hand in hand most of the time when a chip is designed for a given performance target.

Yes, a 3070 with 16GB of VRAM would have far more VRAM than it can truly take advantage of, but it would not be losing to the 3060 12GB in some games.

Hogsmeade_RT_1080p-color-p.webp


This is something that should not happen. The 3060 is providing a solid 30 fps 1080p ultra + RT gaming experience here. The 3080 and 3070 are getting absolutely crushed. That is a clear example of the 3070 and 3080 being VRAM limited. Even the 12GB+ RDNA2 cards are offering a far better gaming experience here. Anybody who purchased the 3070 over the 6800 or the 3080 over the 6800 XT for the RT capabilities is not going to be happy, yet those who went 3060 12GB are absolutely fine if they are happy with 30 FPS.

Or here in RE4 where the 3070 can't even play the game but the 3060 can handle 45+ fps.

min-fps-rt-2560-1440.png


You can try and paint this as TLOU being an outlier but there are more titles where this is becoming an issue with the 3060Ti, 3070, 3070Ti and on occasion 3080 as well.

NV went cheap on VRAM for the 3070 and 3080. I mean, having 8GB on the 1070 was great, having 8GB on the 2070 was perfectly fine, but in 2020 the 3070 also having 8GB was bound to cause problems long term. For those who upgrade every few years it's not a problem, but for those who like to keep their card for 5 or so years it is something that will come back to bite you. Also, with 4070Ti pricing, many people who might have upgraded to a 40-series part have decided to hold on, and even the 4070Ti at $800 damn dollars is showing signs of being limited at 4K, which for that much money I just don't see as acceptable.
 

Gaiff

SBI’s Resident Gaslighter
I made no comment about console settings. I simply said that 6GB started to struggle, and we can see that it clearly did in some titles, either due to lack of bandwidth or due to VRAM constraints, which go hand in hand most of the time when a chip is designed for a given performance target.
Come on man. This is you.

We had the exact same situation play out when PS4 and Xbox One were launched after their cross gen period and 6GB cards started to struggle and 8GB became the new minimum requirement.

That's what you claimed. "After their cross-gen period ended." We stopped seeing cross-gen AAA games as early as 2014, and you have to go all the way up to 2020 and now even post RT benchmarks to prove a point. Hogwarts Legacy with RT, really man? That's where we're at now? 2020 isn't just the cross-gen period ending, it's the new generation starting.

The implication was very clear; you were drawing a parallel between the situation back then and the current one. That because of the new consoles, the memory requirements dramatically increased once the cross-gen period ended. The 8th generation's cross-gen period ending had fuck-all to do with 6GB cards running into their limits, because we saw the consoles being routinely outperformed by cards with 3.5GB of VRAM. Now, the reason that 8GB is starting to show its age is that the consoles, which are typically the target, have 13.5GB or so of available RAM. Back then, it was only 5.5GB. So no, the situations aren't the same or even remotely close, because at no point during the 8th gen did 6GB become too little because of the consoles.

Do I agree that 8GB on a 3070 was a joke? Sure. Do I think the same of, say, a 980? Fuck no. 4GB remained enough for like 5 years following the launch of the 980. 8GB was sometimes not enough the moment the 3070 came out.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
If the 3070 had dropped in price, perhaps, but at £500 I still consider that the start of the high-end segment, and it should provide a very good 1440p and decent 4K experience. I don't think the performance matches the price tbh.
That's not how that works... that's not how anything works.

The "current" price doesn't dictate the segment a PC component is in, especially if a component is EoL.
Its performance does.
The 2080Ti performed like a 3070 when the 3070 dropped... yet still cost ~$1,000.
That meant the 2080Ti had to drop from the enthusiast/high-end segment it once ruled to whatever segment it now takes up on performance charts.
And today the 2080Ti is going to get walked by the likes of the RTX 4060, which will be this generation's low tier (there's a rumored RTX 4050 which will be entry level).
 

Marlenus

Member
That's not how that works... that's not how anything works.

The "current" price doesn't dictate the segment a PC component is in, especially if a component is EoL.
Its performance does.
The 2080Ti performed like a 3070 when the 3070 dropped... yet still cost ~$1,000.
That meant the 2080Ti had to drop from the enthusiast/high-end segment it once ruled to whatever segment it now takes up on performance charts.
And today the 2080Ti is going to get walked by the likes of the RTX 4060, which will be this generation's low tier (there's a rumored RTX 4050 which will be entry level).

When the 1080Ti released, the 1080 dropped in price to reflect its new lower performance tier. That has not happened with the 3070 or the 3070Ti.

So performance-wise, sure, mid range; shame the price does not reflect that reality.
 

SlimySnake

Flashless at the Golden Globes
I have this really frustrating bug where the game just won't drop any handgun ammo. All I get is revolver ammo and rifle and shotgun ammo. What's frustrating is that coming off of RE4, I decided to put everything into upgrading my starting pistol, thinking I'd be using it the most. I just got my first bullets in like 4 hours, and they were in a shiv-locked room, so guaranteed handgun ammo drops.

I wonder if the bug will end up helping me and drop more shotgun and sniper ammo now that the handgun ammo has been removed from the loot pool.

P.S. Pittsburgh's open areas are brutal on the GPU. Massive drops to 40 fps from what has been an otherwise locked 60 fps at 4K DLSS on Quality.
 

scalman

Member
Never before have I seen such an ugly game on a GTX 1060 6GB... I mean, all new games could still have at least high textures on this VRAM, just look at CP2077 with FSR... I mean, what even is this?
The PS4 Pro version looked way better than this... even when I tried this on high just for looks it still looked bad... And FSR 2 doesn't do anything, it looks worse too.
 

SlimySnake

Flashless at the Golden Globes


I did not notice that flickering in my DLSS run. It could just be a problem with that particular fire escape. It looks fine and clears up jaggies in foliage compared to native 4K.
 

J3nga

Member
I spent 10 hours in the game and have yet to see a bug or a crash. My only problem is the lengthy loading times when you first boot up the game; otherwise my experience has been that this is a definitive edition of TLOU. It runs better than the Dead Space remake, which is a stutter fest, not to mention The Callisto Protocol; I'm not getting any stutters here. I'm running it on a 4090 and I do believe people are having issues on lower-end PCs, but also a lot of it is blown out of proportion with fake videos and whatnot.
 

NinjaBoiX

Member
I'm more shocked by Steam's definition of similar games when I check it on the store.

Screenshot-20230328-202003.jpg


I mean, I think I only made it about halfway through on PS4, but I wouldn't say there are similarities between these two games gameplay-wise.
Crafting? Looting? Shooting? Post-apocalyptic?

Come on bro, they’re hardly chalk and cheese…
 

hinch7

Member
Playing the game on a 1070 at 1440p with FSR 2.2 set to Quality, and it's wild. Mostly on medium settings, with a mixture of high. The game looks and runs good and smooth, much better than an old Pascal GPU should be expected to - thank the gods for FSR! No crashes or bugs to note in the few hours I've played it.

Not sure why or where people are experiencing such issues with this. I have a feeling a lot of people are setting things to max without any thought on lower-tier GPUs with their limited VRAM capacity, and overloading the memory. And those having bad performance are using older-generation six-core CPUs, as the game is very CPU and memory intensive.
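That theory is easy to picture with a toy budget check like the one below; the preset costs and headroom figure are made-up placeholders (not the game's real numbers), but they show how quickly "just max it" overshoots an 8GB card:

```python
# Purely illustrative settings heuristic: pick the highest texture preset whose
# assumed VRAM cost fits the card, leaving headroom for the framebuffer and OS.
# The cost figures are made-up placeholders, not the game's real numbers.
TEXTURE_COST_GB = {"low": 3.0, "medium": 4.5, "high": 6.0, "ultra": 8.5}
HEADROOM_GB = 1.5  # framebuffer, OS/compositor, other buffers

def pick_texture_preset(vram_gb: float) -> str:
    budget = vram_gb - HEADROOM_GB
    for preset in ("ultra", "high", "medium", "low"):
        if TEXTURE_COST_GB[preset] <= budget:
            return preset
    return "low"

print(pick_texture_preset(8.0))   # e.g. a 1070/3070-class card -> "high"
print(pick_texture_preset(12.0))  # e.g. a 3060 12GB -> "ultra"
```

The point isn't the exact numbers, just that leaving a fixed amount of headroom is what separates a smooth preset from an overcommitted one.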
 

MMaRsu

Banned
If the 3070 had dropped in price, perhaps, but at £500 I still consider that the start of the high-end segment, and it should provide a very good 1440p and decent 4K experience. I don't think the performance matches the price tbh.

Also, the 3060 is showing that in some titles at certain settings it can offer a better gameplay experience. That should not happen, and it is more than just this one title.



Yeah so your claim that "even 6GB started to struggle" is bogus.

Had nothing to do with VRAM but AMD getting their shit together with their drivers and their older cards benefitting. Even their 4GB models saw performance improvements compared to NVIDIA. It's cute that you automatically assume that VRAM is the bottleneck, when powerful cards such as the 980 Ti don't buckle like other weaker cards. There are factors other than resolution, such as bandwidth, that will play a bigger role than VRAM most of the time, because performance will suffer regardless of being above the minimum specs.
Bus width is linked to VRAM allocation. It is not like we have a 1060 12GB to determine whether slowdown is due to bus width or due to memory amount. It's probably more the former in the majority of cases, because the entire 10 series were very well-balanced parts with an appropriate amount of VRAM for the amount of performance on offer.

You can start to see the limits of 6GB though here.

WD_1440p.png


Even though the 2060 has more bandwidth than the 1070 and 1080, and better colour compression as well, the 1% lows are falling quite a long way behind those of the 1080.

borderlands-3-1920-1080.png


Here the 2060 is falling behind the normally slower 8GB 10 series parts

Same thing in Anno.

anno-1800-1920-1080.png


And as we saw with Watch Dogs Legion the averages don't tell the whole story at all so who knows what other games TPU tested where the average looked okay but the 1% lows were worse.



I made no comment about console settings. I simply said that 6GB started to struggle, and we can see that it clearly did in some titles, either due to lack of bandwidth or due to VRAM constraints, which go hand in hand most of the time when a chip is designed for a given performance target.

Yes, a 3070 with 16GB of VRAM would have far more VRAM than it can truly take advantage of, but it would not be losing to the 3060 12GB in some games.

Hogsmeade_RT_1080p-color-p.webp


This is something that should not happen. The 3060 is providing a solid 30 fps 1080p ultra + RT gaming experience here. The 3080 and 3070 are getting absolutely crushed. That is a clear example of the 3070 and 3080 being VRAM limited. Even the 12GB+ RDNA2 cards are offering a far better gaming experience here. Anybody who purchased the 3070 over the 6800 or the 3080 over the 6800 XT for the RT capabilities is not going to be happy, yet those who went 3060 12GB are absolutely fine if they are happy with 30 FPS.

Or here in RE4 where the 3070 can't even play the game but the 3060 can handle 45+ fps.

min-fps-rt-2560-1440.png


You can try and paint this as TLOU being an outlier but there are more titles where this is becoming an issue with the 3060Ti, 3070, 3070Ti and on occasion 3080 as well.

NV went cheap on VRAM for the 3070 and 3080. I mean, having 8GB on the 1070 was great, having 8GB on the 2070 was perfectly fine, but in 2020 the 3070 also having 8GB was bound to cause problems long term. For those who upgrade every few years it's not a problem, but for those who like to keep their card for 5 or so years it is something that will come back to bite you. Also, with 4070Ti pricing, many people who might have upgraded to a 40-series part have decided to hold on, and even the 4070Ti at $800 damn dollars is showing signs of being limited at 4K, which for that much money I just don't see as acceptable.


Or don't use RT with a 3070 and get 100+ fps at 4K in Resident Evil 4.

Jeesh, whatever shall I do?
 

Gaiff

SBI’s Resident Gaslighter


Performance has improved as well and VRAM usage has been reduced.

Seriously disappointed in Sony for delivering such a shoddy port that needed another month of work. Now the nonsensical "PCs need 64GB of RAM to run this" can die.
 

Kataploom

Gold Member


Performance has improved as well and VRAM usage has been reduced.

Seriously disappointed in Sony for delivering such a shoddy port that needed another month of work. Now the nonsensical "PCs need 64GB of RAM to run this" can die.

Is it currently playable with only 16 GB of RAM? The rest of the system is good, btw; I just don't feel like upgrading right now due to other expenses, and that's the sole reason I haven't picked up the game yet despite the patches.
 

Buggy Loop

Member


Performance has improved as well and VRAM usage has been reduced.

Seriously disappointed in Sony for delivering such a shoddy port that needed another month of work. Now the nonsensical "PCs need 64GB of RAM to run this" can die.


Well There It Is Jurassic Park GIF


It will probably be improved even further in the coming weeks.
 

Gaiff

SBI’s Resident Gaslighter
Another big patch. 1.05

Reduced shader building times
Optimized code to improve global CPU performance
Optimized content to improve performance across several levels
Improved level loading to help reduce the amount of 'Please Wait' and loading screens
Added a new Effects Density setting, which adjusts the density and number of non-critical visual effects (Options > Graphics)
Increased crowd sizes on Low and Medium Ambient Character Density settings and added a Very Low option (Options > Graphics)
Implemented additional scalability tuning for Low and Medium in-game Graphics settings
Reduced the VRAM impact of texture quality settings, allowing most players to increase their texture quality settings or experience improved performance with their current settings
Fixed a crash that would occur on boot on Windows 11
Fixed a crash that could occur on Intel Arc
Fixed a crash that may occur when starting a New Game in Left Behind prior to the completion of shader building
Corrected an issue where pointing the camera at the floor while aiming would cause the player and camera to visually stutter
 

Buggy Loop

Member
So, huge improvements with that patch?

Will probably buy the game when it goes on sale then.

I really don't understand people jumping in day 1.
 