The Last of Us Part 1 on PC; another casualty added to the list of bad ports?


SlimySnake

Flashless at the Golden Globes
I still believe that we cannot use shit ports to define whether X or Y GB of VRAM was required, especially with games that have memory leaks like The Last of Us, but I'm still puzzled by people who deliberately picked 8GB cards for fucking 4K. We knew before Ampere even launched that the 3080's 10GB should be OK at 4K if devs aren't derps, and no review site ever put these 8GB cards forward as 4K choices.

Puzzling

Some people really just need to buy a console, plug it in without thinking, and never go beyond 2 graphics settings, that much I realize now. :messenger_tears_of_joy:
The problem isn't just 4K though. As I showed above, even at low settings at 1080p, which makes the game look like a PS2 game, it reserves 6GB of VRAM. A 3070 user will likely want to play this at 1440p high PS5-equivalent settings, and that requires 11GB.
 

Buggy Loop

Member
The problem isn't just 4K though. As I showed above, even at low settings at 1080p, which makes the game look like a PS2 game, it reserves 6GB of VRAM. A 3070 user will likely want to play this at 1440p high PS5-equivalent settings, and that requires 11GB.

Yeah, the port's memory management is completely borked. I think in a week (or a few) we'll have a better idea after a big patch.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
I don't have a single problem with the game. Getting 90 fps at native 4K with the 4090.

Let it compile shaders before you start the game. It took me 20 minutes to do so.

Not a single crash either. I don't know why people like drama.

I am already 3 hours in.

Using a 12600K. So if you are crashing and you are using an AMD CPU then you deserve it. No one told you to buy a shit CPU heh lol.

I did encounter one small bug where the clicker vanished while I was beating him; I did a restart encounter and everything was fine.
lol, I am using an Intel CPU. I let the shaders compile not once but twice, because apparently the first time didn't take. No difference. It is virtually impossible to play this game while it is compiling shaders, because the CPU is pegged at 100% the whole time.

I'm glad your $1,600 GPU is running the game fine..... my $850 GPU isn't. It drops to 4 fps. I can't turn the camera around without a stutter. I literally can't quit to the main menu without crashing. My performance at 4K DLSS Quality is 60 fps locked with just 80% utilization one second, then it drops to 30 fps after I change a setting and switch back. There is a massive memory leak somewhere, because literally running the same benchmark twice with the same settings returns wildly different results.

It is definitely possible that ND optimized this game on the 4090, because even the 4080 has stutter issues, as some YouTubers have shown.
 

SlimySnake

Flashless at the Golden Globes
Yeah, the port's memory management is completely borked. I think in a week (or a few) we'll have a better idea after a big patch.
The first thing I noticed with the VRAM is that the game thinks the OS is reserving 2GB. To me, that's a red flag: the OS does not do that. RE4 has a similar menu and it shows 9.1 GB available. I do not have Chrome open, so it's just the Steam, MSI Afterburner and GeForce Now overlays. Why does this game think 2GB of VRAM is reserved by the OS?

I have also noticed that VRAM usage goes UP, not down, after you go from native 4K to DLSS Quality 4K, which renders at 1440p. Their in-game menu shows it should go down, but MSI Afterburner clearly shows it going up from 7.5-8 GB to 8.5 GB. Either the settings didn't apply or the game didn't flush VRAM correctly.

Let's see what this latest patch does.
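
If anyone wants to sanity-check the in-game menu's claim, you can ask the driver directly what's actually allocated before launching the game. A minimal sketch, assuming an NVIDIA card at index 0 and the nvidia-ml-py package installed (both my assumptions, nothing the game ships):

```python
# Minimal sketch: query real VRAM usage via NVML (pip install nvidia-ml-py)
# to compare against what the game's menu claims is "OS reserved".
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumes GPU 0
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)

gib = 1024 ** 3
print(f"total: {mem.total / gib:.2f} GiB")
print(f"used : {mem.used / gib:.2f} GiB")  # desktop + overlays + everything else
print(f"free : {mem.free / gib:.2f} GiB")

pynvml.nvmlShutdown()
```

On a typical idle desktop the "used" figure is well under 1 GiB, which is what makes a 2GB "OS reserved" number look suspicious.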
 

//DEVIL//

Member
lol, I am using an Intel CPU. I let the shaders compile not once but twice, because apparently the first time didn't take. No difference. It is virtually impossible to play this game while it is compiling shaders, because the CPU is pegged at 100% the whole time.

I'm glad your $1,600 GPU is running the game fine..... my $850 GPU isn't. It drops to 4 fps. I can't turn the camera around without a stutter. I literally can't quit to the main menu without crashing. My performance at 4K DLSS Quality is 60 fps locked with just 80% utilization one second, then it drops to 30 fps after I change a setting and switch back. There is a massive memory leak somewhere, because literally running the same benchmark twice with the same settings returns wildly different results.

It is definitely possible that ND optimized this game on the 4090, because even the 4080 has stutter issues, as some YouTubers have shown.
I do not know what your $850 GPU is. Is it a 16GB card?

My point is, if you are playing at 4K where you are very close to the max GPU memory allocation, then you will end up with massive stuttering. 2GB of spare VRAM is not really enough, as usage can spike in places.

If it's not a VRAM issue, is the game on an NVMe drive or a regular old hard drive? There are really many factors behind why it's stuttering.

I find it hard to believe that ND released a game based on 4090 optimization only. It doesn't work like that.

Maybe if people share more details about their specs, we can narrow down what can help improve the situation until a proper fix is out (one hotfix is already out anyway).
 

SlimySnake

Flashless at the Golden Globes
I do not know what your $850 GPU is. Is it a 16GB card?

My point is, if you are playing at 4K where you are very close to the max GPU memory allocation, then you will end up with massive stuttering. 2GB of spare VRAM is not really enough, as usage can spike in places.

If it's not a VRAM issue, is the game on an NVMe drive or a regular old hard drive? There are really many factors behind why it's stuttering.

I find it hard to believe that ND released a game based on 4090 optimization only. It doesn't work like that.

Maybe if people share more details about their specs, we can narrow down what can help improve the situation until a proper fix is out (one hotfix is already out anyway).
A 7 GB/s SSD. A 10GB card, but I'm not trying to run it at ultra settings or even at 4K.

Even at high settings with 4K DLSS Quality, which renders at 1440p, it is a mess. No two runs are the same. The game randomly drops frames and stays there. I'm within the VRAM budget after turning down settings and switching to 1440p. Doesn't matter. I have repeatedly stated that the game is a mess on all settings, including the lowest settings, which look worse than the game on PS3.

I don't know why you find it hard to believe something that has literally become a meme. Steam reviews are literally 67% negative. I am glad you want to help, but you're not helping by dismissing the real fucking problems people are having.
 
Last edited:

Kataploom

Gold Member
This is a misleading comparison, because TLOU character models look way better in the cutscenes anyway. Joel has some strange artifacts on his face in the 4th screenshot, but otherwise the PS5 version looks similar to the Steam Deck screenshot (that's how flat the lighting looks on character models during gameplay). As for the PS3 screenshot, the cutscenes on PS3 weren't even real-time.

TLOU1 isn't optimized very well, for sure, and given that Uncharted 4's PC optimization was already extremely bad, this was to be expected. On PC, a Radeon 290X (5.6 TF) was required for 720p 30fps, while the PS4 (a 7850/7870-class GPU at 1.8 TF) ran the same game at 1080p on roughly 3x slower hardware, so the game wasn't optimized at all, and unfortunately that's also the case with the TLOU1 remake.
Calling it unoptimized is giving it a pass; it's a total disaster of a port... At least I haven't heard of any gameplay bugs, but maybe that's because the base game itself is very well done.

Edit: nvm, graphical bugs are rampant lol
 
Last edited:

//DEVIL//

Member
A 7 GB/s SSD. A 10GB card, but I'm not trying to run it at ultra settings or even at 4K.

Even at high settings with 4K DLSS Quality, which renders at 1440p, it is a mess. No two runs are the same. The game randomly drops frames and stays there. I'm within the VRAM budget after turning down settings and switching to 1440p. Doesn't matter. I have repeatedly stated that the game is a mess on all settings, including the lowest settings, which look worse than the game on PS3.

I don't know why you find it hard to believe something that has literally become a meme. Steam reviews are literally 67% negative. I am glad you want to help, but you're not helping by dismissing the real fucking problems people are having.
No, I am not finding it hard to believe you have a problem. I am sure you do. I just find it hard to believe they did testing only on the highest-end single-SKU card lol.

Get a refund, fuck 'em. If Sony just wants to release a shit port, people will get their money back, and there's nothing Sony hates more than refunds. So do your part and let them get fucked, really.


These shitty companies with bad PC ports need to lose money and not make enough, really. Might teach them to make a proper port before releasing it to the public.
 

SlimySnake

Flashless at the Golden Globes
No, I am not finding it hard to believe you have a problem. I am sure you do. I just find it hard to believe they did testing only on the highest-end single-SKU card lol.

Get a refund, fuck 'em. If Sony just wants to release a shit port, people will get their money back, and there's nothing Sony hates more than refunds. So do your part and let them get fucked, really.


These shitty companies with bad PC ports need to lose money and not make enough, really. Might teach them to make a proper port before releasing it to the public.
The problem with refunds is that the shaders took 2 hours to compile on some CPUs. I mean, your CPU is roughly 15% better than my 11700K, which can hit 5 GHz, and it still took 30 minutes. Mine glitched out or something and rebuilt the shaders from scratch a second time, so it had been more than an hour before I could even get in.

I think the shader compilation taking that long and consuming 100% of the CPU should've been their first clue. I've never seen my CPU get hit that hard unless I was running Cinebench benchmarks. It hit 200 watts and stayed at 190 watts the entire time, and hit 85 degrees at one point. Even Cinebench would top out at 80 degrees, because I never bothered to run benchmarks for longer than 5 minutes.

There is also a memory leak, because the game ran at 4K 45 fps, going up to 55 fps in some spots during the intro, but kept getting progressively worse as I kept playing. I don't expect uniform framerates in every area of a game, but quitting out of the game and coming back to the same area would give me completely different performance. I actually captured some footage of myself going through native 4K, 1440p, DLSS Quality and DLSS Performance, sometimes with no difference in framerate, and at one point it literally went down to 5 fps because the game completely freaked out at me changing settings. This happens at ultra, high, or normal settings.
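
One way to put a number on that "progressively worse" feeling is to sample VRAM usage over time and check for steady drift while standing in an unchanging scene. A rough sketch, again assuming an NVIDIA card and nvidia-ml-py (my assumptions, illustrative only, not a real profiler):

```python
# Rough sketch: sample total VRAM once a second, then fit a slope.
# A steadily positive slope in a static scene is the classic leak signature.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumes GPU 0

samples = []
for second in range(300):  # watch for 5 minutes
    used_mib = pynvml.nvmlDeviceGetMemoryInfo(handle).used / 2**20
    samples.append((second, used_mib))
    time.sleep(1)
pynvml.nvmlShutdown()

# Least-squares slope in MiB per second.
n = len(samples)
sx = sum(x for x, _ in samples)
sy = sum(y for _, y in samples)
sxx = sum(x * x for x, _ in samples)
sxy = sum(x * y for x, y in samples)
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
print(f"VRAM drift: {slope * 60:.1f} MiB/min")  # ~0 is healthy in a static scene
```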
 
The review score has been very, very slowly creeping back up since the first patch, so they might be improving things already.

Too early to tell, but if they can keep the patches coming quickly then they might do OK.

If it's fixed quickly and ND's next PC games release polished, then in the end it won't be a big deal. Only time will tell.
 

//DEVIL//

Member
If you bought it through Steam, you can still get a refund even past the 2 hours. Just explain that it took more than an hour to compile shaders and it ran like shit, and show a couple of links. You will get your refund, no questions asked.

Also, do not take my 30 minutes as an accurate measure. It's just a guess. I knew it took me a long time, so I left for a while and came back. I figured it was about 30 minutes, but it could also have been close to an hour. I didn't time it.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
If you bought it through Steam, you can still get a refund even past the 2 hours. Just explain that it took more than an hour to compile shaders and it ran like shit, and show a couple of links. You will get your refund, no questions asked.

Also, do not take my 30 minutes as an accurate measure. It's just a guess. I knew it took me a long time, so I left for a while and came back. I figured it was about 30 minutes, but it could also have been close to an hour. I didn't time it.
I bought it on CDKeys because I'm cheap lol. Whatever, I can't wait. I don't refund stuff; my wife does enough of that for the both of us.

30 minutes is probably how long it took for you. The 4 GHz AMD Zen 2 CPUs are likely the ones taking the longest. The game pegs both the CPU AND the GPU at different times during the shader compile process, so it could also be related to how fast your GPU is.
 

Woggleman

Member
I am sure that it will eventually be fixed and improved, but after the highs of the TV show and the increased interest in the franchise, why would ND even put something out in this condition? You would figure they would spare some extra expense to put out a quality PC version while the iron was hot, instead of trusting the port to some company known for bad ports.
 

SlimySnake

Flashless at the Golden Globes
I am sure that it will eventually be fixed and improved, but after the highs of the TV show and the increased interest in the franchise, why would ND even put something out in this condition? You would figure they would spare some extra expense to put out a quality PC version while the iron was hot, instead of trusting the port to some company known for bad ports.
ND ported this themselves.
 

Corndog

Banned
Yes, VRAM has nothing to do with resolution... it just seems you don't understand how memory works. Just because you run a game at 1080p doesn't mean it uses less VRAM. The Last of Us remake is designed for the PS5; it isn't cross-gen, which means it requires 12GB+ of VRAM minimum, and anything less causes problems. This happens every generation. It's nothing to do with optimization; it's not some switch you just dial down to support 8GB. If that were possible there'd be no reason for PS5-only games; everything would be cross-gen all the way back to PS3, since you could just dial stuff down.

In any new generation the bottleneck is always memory, which is why it increases with every iteration. You can virtually make any PS5 game work on PS4, but the bottleneck will be memory budgets, which is why they skip cross-gen and go straight to making a PS5 port. The same effect applies on PC: you have a game designed around a 12GB VRAM requirement, and therefore a problem on 8GB cards; it's like trying to fit a 9-inch d in the wrong hole. You would have to lower so many settings for it to run under 12GB that the game becomes worthless.
Tell me you don’t understand even the basics of how a game works. I guess you have never heard of a frame buffer.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
Just tried the new patch. Instant difference! I was getting 50-55 fps in the Boston QZ scene I was testing last night, where I had been getting sub-40 fps at 4K High. I was blown away. CPU usage went down from roughly 75% to 45-50%. Figured this was it, they cracked the code!

Then I figured I'd restart the checkpoint and enable DLSS, because I noticed some shimmering on some fences at native 4K, and as soon as I restarted the checkpoint, the framerate dropped below 40, all the way down to the mid-30s again. It never went back above 40. I quit out to the main menu and crashed, just like it was doing all night last night.

I don't know what's going on with this game, tbh. It was running just fine, even hitting 60 fps in corridors. I don't mind playing games hovering around 50-60 fps, so I was really excited there for a second. What's clear is that my GPU can handle this game at native 4K high settings at 50-60 fps, which is in line with its performance delta with the PS5.
 
Last edited:

Corndog

Banned
A 1080p frame buffer and a 4K frame buffer don't use that much memory. Can you show me how different the two are?
4K is four times the pixel count of 1080p (3840x2160 = 8,294,400 pixels vs 1920x1080 = 2,073,600), so it's four times the size of your 1080p buffer. Plus, you don't just have a single frame in memory; you have at least 2 or more.
I can try to find actual examples.
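
For a rough sense of scale, here's the back-of-envelope math (my numbers, assuming plain 4-bytes-per-pixel RGBA8 render targets, not anything measured from a real game):

```python
# Back-of-envelope framebuffer math: a single RGBA8 color target
# is width * height * 4 bytes.
def target_mib(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    return width * height * bytes_per_pixel / 2**20

print(f"1080p color target: {target_mib(1920, 1080):.1f} MiB")  # ~7.9 MiB
print(f"4K color target:    {target_mib(3840, 2160):.1f} MiB")  # ~31.6 MiB, exactly 4x

# A real renderer holds many resolution-dependent targets at once:
# a double- or triple-buffered swap chain, depth, G-buffer layers, HDR
# intermediates... so that 4x ratio applies to a much bigger base.
```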
 

kingyala

Banned
4K is four times the pixel count of 1080p (3840x2160 = 8,294,400 pixels vs 1920x1080 = 2,073,600), so it's four times the size of your 1080p buffer. Plus, you don't just have a single frame in memory; you have at least 2 or more.
I can try to find actual examples.
Show me how much memory a 4K buffer takes compared to a 1080p buffer... don't chat about how many times the size it is, show me the data or stop chatting shit.
 

Corndog

Banned
Show me how much memory a 4K buffer takes compared to a 1080p buffer... don't chat about how many times the size it is, show me the data or stop chatting shit.
I am updating Red Dead 2 right now. When it is done I will compare VRAM usage at 4K vs 1080p. Hold your horses.

Edit:
Everything else in VRAM stays the same at both resolutions, so the 4K-minus-1080p difference is 3x the 1080p buffer total. Take the difference and divide by 3 to get the 1080p frame buffer total size, then multiply by 4 to get the 4K total buffer size.
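
A minimal sketch of that back-out (the `buffer_estimate` name is mine, purely illustrative):

```python
# If the resolution-dependent buffers are the only thing that scales, then
#   used_4k   = base + 4 * buf_1080p
#   used_1080 = base + 1 * buf_1080p
# so (used_4k - used_1080) = 3 * buf_1080p.
def buffer_estimate(used_1080_mb: float, used_4k_mb: float):
    """Estimate resolution-dependent buffer totals from two VRAM readings."""
    buf_1080p = (used_4k_mb - used_1080_mb) / 3
    return buf_1080p, 4 * buf_1080p
```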
 
Last edited:

Gaiff

SBI’s Resident Gaslighter
Just tried the new patch. Instant difference! I was getting 50-55 fps in the Boston QZ scene I was testing last night, where I had been getting sub-40 fps at 4K High. I was blown away. CPU usage went down from roughly 75% to 45-50%. Figured this was it, they cracked the code!

Then I figured I'd restart the checkpoint and enable DLSS, because I noticed some shimmering on some fences at native 4K, and as soon as I restarted the checkpoint, the framerate dropped below 40, all the way down to the mid-30s again. It never went back above 40. I quit out to the main menu and crashed, just like it was doing all night last night.

I don't know what's going on with this game, tbh. It was running just fine, even hitting 60 fps in corridors. I don't mind playing games hovering around 50-60 fps, so I was really excited there for a second. What's clear is that my GPU can handle this game at native 4K high settings at 50-60 fps, which is in line with its performance delta with the PS5.
I get 90-100fps at 4K High and 70-80 at 4K Ultra. Overall, not terrible at all. Given what the PS5 achieves, it should get close to 120fps, but meh, it's probably significantly better optimized there, so I won't be seeing the 4090 outperform it by the usual 3.5-3.7x.
 

Corndog

Banned
Show me how much memory a 4K buffer takes compared to a 1080p buffer... don't chat about how many times the size it is, show me the data or stop chatting shit.
On RDR2, VRAM usage at 1080p is 4140 MB.
At 4K it's 5279 MB.
So the 1080p buffers come to about 380 MB, and the 4K buffers to about 1519 MB.
That's a pretty big chunk of RAM. Of course DLSS will get you memory savings.

Edit: these numbers could change a bit depending on what features are on. I don't have screen space ambient occlusion enabled, which seems to add another 45 MB at 4K.
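
Plugging those readings into the hypothetical `buffer_estimate` helper from the earlier sketch reproduces the arithmetic:

```python
# Difference is 5279 - 4140 = 1139 MB, so 1139 / 3 ~= 380 MB at 1080p
# and 4 * 380 ~= 1519 MB at 4K.
buf_1080p, buf_4k = buffer_estimate(4140, 5279)
print(f"{buf_1080p:.0f} MB at 1080p, {buf_4k:.0f} MB at 4K")
```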
 
Last edited:

IFireflyl

Gold Member
I beat the game today (prior to the patch being released). I had no crashes, and I played at 1440p using Ultra settings with no DLSS. I also locked my frames to 60fps. Having said that, the weakest part of my PC is my RTX 3090, so I'm not the average Steam player.
 

Guilty_AI

Member
On RDR2, VRAM usage at 1080p is 4140 MB.
At 4K it's 5279 MB.
So the 1080p buffers come to about 380 MB, and the 4K buffers to about 1519 MB.
That's a pretty big chunk of RAM. Of course DLSS will get you memory savings.
On TLoU it's +2GB, according to a pic posted here, at least pre-patch.
 