"people never blame their hardware" sounds to me like the game is perfect but okNobody said that. People ITT said don't expect 4K Ultra settings on an 8GB card on a demanding 2022/2023 game.
"people never blame their hardware" sounds to me like the game is perfect but okNobody said that. People ITT said don't expect 4K Ultra settings on an 8GB card on a demanding 2022/2023 game.
"people never blame their hardware" sounds to me like the game is perfect but ok
Is not a controversial statement, nor does it detract from the port having issues. But ok.
> don't expect 4K Ultra settings on an 8GB card on a demanding 2022/2023 game.
Nah, is that real?
Damn, the game will be fixed tomorrow!
> lol did you watch the video?
See how quick this got pushed out after they've been told to fix it or they hate lesbians the other day?
> Craig Lives !!!!!!!!!!
Rise and shine, Dr. Freeman...
Craig Lives !!!!!!!!!!
The memes will be rolling for years from this
The problem isn't just 4K though. As I showed above, even at low settings at 1080p, which makes the game look like a PS2 game, it reserves 6GB of VRAM. A 3070 user will likely want to play this at 1440p high PS5-equivalent settings, and that requires 11GB.
> I still believe that we cannot use shit ports to define whether X or Y GB of VRAM is required, especially with games that have memory leaks like The Last of Us, but I'm still puzzled by people who deliberately picked 8GB cards for fucking 4K. We knew before the Ampere launch that the 3080's 10GB should be OK at 4K if devs aren't derps, and no review site ever put these 8GB cards forward as 4K choices.
> Puzzling
> Some people really just need to buy a console, plug it in without thinking, and not go beyond 2 graphics settings, that much I realize now.
> lol I am using an intel CPU, I let the shaders compile not once but twice because apparently the first time didn't take. No difference. It is virtually impossible to play this game while the game compiles shaders because
I don't have a single problem with the game. Getting 90 fps native 4K with the 4090.
Let it do shaders before you start the game. It took me 20 mins to do so.
Not a single crash either. I don't know why people like drama.
I am already 3 hours in.
Using a 12600k. So if you are crashing and you are using an AMD CPU then you deserve it. No one told you to buy a shit CPU heh lol
I did encounter one small bug where the clicker vanished while I was beating him; I did a restart encounter and everything was fine.
> The first thing I noticed with the VRAM is that they think the OS is reserving 2GB. To me, that's a red flag: the OS does not do that. RE4 has a similar menu and it shows 9.1 GB available. I do not have Chrome open, so it's just Steam, MSI Afterburner, and GeForce NOW overlays. Why does this game think 2GB of VRAM is reserved by the OS?
Yeah, the port's memory management is completely borked. I think in a week (or weeks?) we'll have a better idea, after a big patch.
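One possible explanation for that menu reading (pure speculation on my part, with invented numbers) is that the game derives its budget by subtracting a fixed OS reservation from total VRAM instead of querying actual usage, the way RE4's menu apparently does:

```python
def menu_vram_budget_gb(total_gb, os_reserve_gb=2.0):
    """What an in-game menu might report as 'available' VRAM if it
    assumes a fixed OS reservation instead of measuring real usage."""
    return total_gb - os_reserve_gb

# On a hypothetical 10 GB card, a hardcoded 2 GB reservation reports
# only 8 GB to the game, even if the desktop actually uses far less.
print(menu_vram_budget_gb(10.0))  # 8.0
```

If that is what's happening, the "reserved" figure would never change no matter what you close, which matches the complaint above.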
> $850 GPU
I do not know what your $850 GPU is. Is it a 16 gig card?
> lol I am using an intel CPU, I let the shaders compile not once but twice because apparently the first time didn't take. No difference. It is virtually impossible to play this game while the game compiles shaders because
I'm glad your $1,600 GPU is running the game fine... my $850 GPU isn't. It drops to 4 fps. I can't turn the camera around without a stutter. I literally can't quit to the main menu without crashing. My performance at 4K DLSS Quality is 60 fps locked with just 80% utilization one second, then drops to 30 fps after I change a setting and switch back. There is a massive memory leak somewhere, because literally running the same benchmark twice with the same settings returns wildly different results.
It is definitely possible that ND optimized this game on the 4090, because even the 4080 has stutter issues, as some YouTubers have shown.
7 GBps SSD. 10 GB card, but I'm not trying to run it at ultra settings or even at 4K.
> I do not know what your $850 GPU is. Is it a 16 gig card?
My point is, if you are playing at 4K where you are very close to the max GPU memory allocation, then you will end up with massive stuttering. 2 gigs of spare VRAM is not really enough, as usage can spike in places.
If it's not a VRAM issue: is the game on an NVMe drive, or a regular old drive? There are really many factors in why it's stuttering.
I find it hard to believe that ND released a game based on 4090 optimization only. It doesn't work like that.
Maybe if people share more details about their specs, we can narrow down what can help improve the situation till a proper fix is out (one hotfix is already out anyway).
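The "spare VRAM can spike" point above can be sketched as a simple headroom check (all numbers are illustrative assumptions, not measurements):

```python
def stutter_risk(vram_total_gb, baseline_use_gb, spike_gb=1.0):
    """True if a transient allocation spike would overflow VRAM and
    force resources into system RAM, the classic hitching scenario."""
    return baseline_use_gb + spike_gb > vram_total_gb

# A 10 GB card running near its limit vs. with comfortable headroom:
print(stutter_risk(10.0, 9.5))  # True  -> expect stutter on spikes
print(stutter_risk(10.0, 7.0))  # False -> spikes absorbed
```

The 1 GB spike figure is invented; the point is only that a budget that looks fine on average can still overflow transiently.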
> Calling it unoptimized is giving it a pass, it's a total disaster of a port... At least I haven't heard of any bugs, but that's maybe because the base game is very well done at least.
This is a misleading comparison, because TLOU character models look way better in the cutscenes anyway. Joel has some strange artifacts on his face in the 4th screenshot, but otherwise the PS5 version looks similar to the Steam Deck screenshot (that's how flat the lighting looks on character models during gameplay). As for the PS3 screenshot, the cutscenes on PS3 weren't even real-time.
TLOU1 isn't optimized very well for sure, and given that Uncharted 4's optimization was already extremely bad, this was to be expected. On PC, a Radeon 290X (5.6 TF) was required for 720p 30fps, while the PS4 (7850/7870-class, 1.8 TF) ran the same game at 1080p on a roughly 3x slower GPU, so the port wasn't optimized at all, and unfortunately that's also the case with TLOU1.
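Using that post's own figures (290X ≈ 5.6 TF at 720p30 vs PS4 ≈ 1.8 TF at 1080p30), the optimization gap can be put in one number. This is back-of-envelope arithmetic, not a benchmark:

```python
def pixels_per_sec_per_tflop(width, height, fps, tflops):
    """Crude efficiency proxy: pixels pushed per second per TFLOP."""
    return width * height * fps / (tflops * 1e12)

pc  = pixels_per_sec_per_tflop(1280, 720, 30, 5.6)   # 290X on the PC port
ps4 = pixels_per_sec_per_tflop(1920, 1080, 30, 1.8)  # PS4, same game

print(f"PS4 extracted {ps4 / pc:.1f}x more pixels per TFLOP")  # 7.0x
```

The ~7x figure is consistent with the post: a ~3x slower GPU pushing 2.25x the pixels (1080p vs 720p) at the same framerate.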
No, I am not finding it hard to believe you have a problem. I am sure you do. I just find it hard to believe they did testing only on the highest-end single-SKU card lool.
> 7 GBps SSD. 10 GB card, but I'm not trying to run it at ultra settings or even at 4K.
Even at high settings with 4K DLSS Quality, which is 1440p internal, it is a mess. No two runs are the same. The game randomly drops frames and stays there. I'm within the VRAM budget after turning down settings and switching to 1440p. Doesn't matter. I have repeatedly stated that the game is a mess on all settings, including the lowest settings, which look worse than the game on PS3.
I don't know why you find it hard to believe something that has literally become a meme. Steam reviews are literally 67% negative. I am glad you want to help, but you're not helping by dismissing problems real fucking people are having.
The problem with refunds is that the shaders took 2 hours to compile on some CPUs. I mean, your CPU is roughly 15% better than my 11700k, which can hit 5 GHz, and it still took 30 minutes. Mine glitched out or something and rebuilt the shaders from scratch a second time, so it had been more than an hour before I could even get in.
> No, I am not finding it hard to believe you have a problem. I am sure you do. I just find it hard to believe they did testing only on the highest-end single-SKU card lool.
Get a refund, fuck em. If Sony just wants to release a shit port, people will get their money back, and there's nothing Sony hates more than refunds. So do your part and let them get fucked, really.
These shitty companies with bad PC ports need to lose money and not make enough, really. Might teach them to get a port into proper shape before releasing it to the public.
I bought it on cdkeys because I'm cheap lol. Whatever, I can't wait. I don't refund stuff; my wife does enough of that for the both of us.
> If you bought it through Steam you can still do a refund even past the 2 hours. Just explain that it took more than an hour to do shaders and it ran like shit, and show a couple of links. You will get your refund, no questions asked.
Also, do not take my 30 mins as an accurate measure; it's just a guess. I knew it took a long time, so I left for a while and came back. Figured it would be 30 mins, but it could also be close to an hour. I didn't time it.
ND ported this themselves.
> I am sure that it will eventually be fixed and improved, but after the highs of the TV show and increased interest in the franchise, why would ND even put something out in this condition? You would figure they would spare some extra expense to put out a quality PC version while the iron was hot, instead of trusting the port to some company known for bad ports.
Tell me you don't understand even the basics of how a game works. I guess you have never heard of a frame buffer.
> Yes, VRAM has nothing to do with resolution... it just seems you don't understand how memory works. Just because you run a game at 1080p doesn't mean it uses less VRAM. The Last of Us remake is designed for a PS5; it isn't cross-gen, which means it requires 12GB+ of VRAM minimum, and anything less causes problems. This happens every generation. It's nothing to do with optimization; it's not some switch you just dial down to support 8GB. If that were possible then there'd be no reason for PS5-only games; everything would be cross-gen all the way back to PS3, since you could just dial stuff down.
> In any new generation the bottleneck is always memory, which is why it increases with every iteration. You can virtually make any PS5 game work on PS4, but the bottleneck will be memory budgets, which is why they skip cross-gen and go straight to making a PS5 port. The same effect applies on PC: you have a game designed for 12GB VRAM requirements and therefore a problem on 8GB cards, and it's like trying to fit a 9 inch d in the wrong hole. You will have to lower a lot of settings, making the game worthless, for it to run under 12GB.
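The "you can't just dial it down" argument above amounts to a hard memory floor. A toy budget makes it concrete: assume a fixed footprint for assets/streaming that does not scale with resolution, plus render targets that do (every number here is invented for illustration):

```python
def vram_needed_gb(fixed_assets_gb, targets_at_4k_gb, resolution_scale):
    """Toy VRAM model: a fixed asset/streaming footprint plus render
    targets that scale linearly with pixel count."""
    return fixed_assets_gb + targets_at_4k_gb * resolution_scale

# Invented split for a PS5-era game: 9 GB of assets that do not scale
# with resolution, plus 3 GB of render targets at 4K. 1080p has 1/4
# the pixels of 4K:
print(vram_needed_gb(9.0, 3.0, 1.0))   # 12.0 GB at 4K
print(vram_needed_gb(9.0, 3.0, 0.25))  # 9.75 GB at 1080p, still > 8 GB
```

If the fixed part really is that large, dropping resolution alone cannot get under an 8 GB card; only asset/texture quality reductions can.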
Didn't some company called Iron Galaxy, who are known for poor work, do it?
> ND ported this themselves.
I like calling it Ride to Hell syndrome...
> Why does this make me want to play through it even more?
> I've played through TLOU numerous times, I'm in this for this weird fever dream version.
A 1080p frame buffer and a 4K frame buffer don't use that much memory. Can you show me how different both are?
> Tell me you don't understand even the basics of how a game works. I guess you have never heard of a frame buffer.
4K is 4 times the pixel count of 1080p, so it's 4 times the size of your 1080p buffer. Plus you don't just have a single frame in memory; you have at least 2 or more.
> A 1080p frame buffer and a 4K frame buffer don't use that much memory. Can you show me how different both are?
Show me how much memory a 4K buffer takes compared to a 1080p buffer. Don't chat about how many x the size is... show me the data or stop chatting shit.
> 4K is 4 times the pixel count of 1080p, so it's 4 times the size of your 1080p buffer. Plus you don't just have a single frame in memory; you have at least 2 or more.
I can try to find actual examples.
I am updating Red Dead 2 right now. When it is done I will compare VRAM usage at 4K vs 1080p. Hold your horses.
> Show me how much memory a 4K buffer takes compared to a 1080p buffer. Don't chat about how many x the size is... show me the data or stop chatting shit.
I get 90-100 fps at 4K High and 70-80 at 4K Ultra. Overall, not terrible at all. Given what the PS5 reaches, it should get close to 120 fps, but meh, it's probably significantly better optimized there, so I won't be seeing the 4090 outperform it by 3.5-3.7x.
> Just tried the new patch. Instant difference! I was getting 50-55 fps in the Boston QZ scene I was testing last night, after getting sub-40 fps on High 4K. I was blown away. CPU usage went down from roughly 75% to 45-50%. Figured this was it, they cracked the code!
> Then I figured I'd restart the checkpoint and enable DLSS, because I noticed some shimmering in native 4K on some fences. As soon as I restarted the checkpoint, the framerate dropped below 40, all the way down to the mid 30s again, and never went back above 40. I quit out to the main menu and crashed, just like it was doing all night last night.
> I don't know what's going on with this game tbh. It was running just fine, even hitting 60 fps in corridors. I don't mind playing games hovering around 50-60 fps, so I was really excited there for a second. What's clear is that my GPU can handle this game at native 4K high settings at 50-60 fps, which is in line with its performance delta with the PS5.
On RDR2 at 1080p, VRAM usage is 4140 MB.
> Show me how much memory a 4K buffer takes compared to a 1080p buffer. Don't chat about how many x the size is... show me the data or stop chatting shit.
On TLoU it's +2GB according to a pic posted here, at least pre-patch.
> On RDR2 at 1080p, VRAM usage is 4140 MB.
At 4K it's 5279 MB. The delta is 1139 MB, and if the 4K frame data costs 4x the 1080p frame data, that puts the 1080p share at about 380 MB and the 4K share at about 1519 MB.
That's a pretty big chunk of RAM. Of course DLSS will get you memory savings.
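For reference, the raw display-buffer arithmetic behind the "4x" claim can be checked directly (assuming 4 bytes per pixel, i.e. RGBA8, and triple buffering):

```python
def buffer_mb(width, height, bytes_per_pixel=4):
    """Size of one full-resolution RGBA8 buffer in MiB."""
    return width * height * bytes_per_pixel / 1024**2

fhd = buffer_mb(1920, 1080)   # one 1080p buffer
uhd = buffer_mb(3840, 2160)   # one 4K buffer

print(f"one 1080p buffer: {fhd:.1f} MiB")            # ~7.9 MiB
print(f"one 4K buffer:    {uhd:.1f} MiB")            # ~31.6 MiB
print(f"triple-buffered 4K swapchain: {3 * uhd:.1f} MiB")  # ~94.9 MiB
```

The final display buffers alone are only tens of MiB, so a measured ~1.1 GB delta between 1080p and 4K would be dominated by all the other resolution-scaled render targets (G-buffer, depth, post-processing and TAA history, etc.), not the swapchain itself.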
Every game will be different depending on what they are storing for each frame. But I'm not qualified to break down all that that entails.
> On TLoU it's +2GB according to a pic posted here, at least pre-patch.