No real issues here. Bought it from my buddy for $20 since he never redeemed his code from CDKeys.
Running great on my 6800 XT @ 2600MHz @ 1440p ULTRA, native. Hitting between 75-95fps in the most demanding spots. Hits 100+ quite often in less demanding areas & cutscenes. Funny enough, it's probably the first FSR2 game (2.2) where I feel Quality mode looks better than native in most cases, outside of a few motion issues. Hitting between 95-120fps in that case. If high settings are used, add 25-35%+ to either, & at that point it becomes a CPU bottleneck. Ultra to high is a HUGE fps gain.
Heavily undervolted (via AC/DC loadline) 13700K @ 5.4P/4.3E/4.8 cache (4.9-liter build), 32GB 4200 CL16 Gear 1 w/ heavily tightened primary, secondary & tertiary timings (43-44ns via MLC, 42 via AIDA64). Latest Win11 beta build.
Shader compilation took about 10 minutes, which is still way too long. It beats FH5 in length, which in my experience had the longest shader comp prior to this game. Other than that, it's been smooth sailing.
I feel bad for a lot of Nvidia buyers who are stuck w/ 8-10GB cards that were once touted as higher end, especially recent purchasers. I wouldn't buy any higher-end card going forward with less than 16GB, and I've held that same sentiment for the past year (if you want to max settings, at least). Just unfortunate, but it's always good to have a nice buffer of VRAM in case of shit ports like this one. Regardless, the game clearly has a lot of issues not related to hardware, & the outrageous VRAM usage is only one of many things ND is going to have to try & patch up.
With that said, for something that was touted as built from the ground up for PS5, the visuals are extremely disappointing. Some cutscenes look great maxed out, but the actual gameplay graphics look average, & a lot of the time below that, compared to cross-gen AAA games that weren't "built from the ground up". Since the PS5 version clearly doesn't use ultra/max settings, I'm unsure why certain people were touting it as a graphical showcase for PS5. The fact that the PS5 can only manage 4K30/1440p60 on high settings is not a good sign for future ND games, let alone for trusting them w/ porting to PC, which has thousands of configs. In general, this game clearly has a lot of widespread issues & most of that blame goes to ND & Sony for allowing it to release as such.
Still feel like it's worth it @ the $20 I paid. Way too much @ $40 & embarrassing @ $60. Luckily Steam has a great refund policy for the people who bought it directly from there. Sony has a long way to go if they want to get into PC gamers' good graces, but at least they're trying.
You'll have to phone Naughty Dog and tell them how you're getting that sort of performance.
Barely above 60fps in a fairly quiet section at 1440p/Ultra.
57-66fps.
Hence why I posted a video, because it lines up with their data.
Hate to rain on your parade buuuut, you realize GameGPU doesn't test half the GPUs & CPUs they list, right? The majority of them are estimates based on past performance differences. I wouldn't trust them w/ a 10-foot pole these days.
No amount of OCing will make your 6800 XT beat a stock one by 40%. Post proof of your average 75-95fps. Either that patch did miracles or you got the most powerful 6800 XT on the planet.
Regarding Janson, his GPU is stock, w/ the highest clocks touching around 2370MHz. I'm constantly between 2600-2700MHz w/ 2120 mem FT, SOC @ 1300MHz, FCLK 2100, VCLK 1450 via MPT. If you actually knew any of this stuff, you would realize what a fool you look like right now comparing the two. I easily beat a stock 6950 XT. So you might want to brush up on your metrics before coming at me w/ rookie conjecture.
Your avatar and name are familiar. Were you a member of overclock.net? Perhaps GameSpot? I think I've seen them somewhere but can't quite remember where.
Sure, when I get home. I'll go through the same area as Janson, along w/ the beginning sequence & maybe some other random outside areas. No point in doing inside areas, as it's 100+ fps 99% of the time.
And FYI, I do currently have the 3rd fastest 6800 XT on the planet. There's a reason why there's "OC" in my name.
Edit:
This will be fun
Thought so, but it's been so long that I only have vague recollections. Too bad the site is pretty much dead now.
I was formerly a prominent poster on overclock.net w/ tons of past OC records via HWBOT. I rarely visit the site anymore since the forum software change (the switch was made because Google SEO didn't play nicely w/ the old board). I no longer compete for records; it's mainly for fun now, & if a run ends up being one, so be it. That's not what I aim for anymore; I just try to extract as much performance as possible out of whatever daily-driven hardware I'm using.
In that case, I'm looking forward to seeing your results.
The entire point of playing on PC is the better performance, which the platform absolutely provides given competent developers. Naughty Dog is already scrambling to patch TLOU per their Twitter account. That shows it’s a software problem, not a platform problem.
if you threw it on your gigantic backlog pile, by the time you get to it it'd probably have been patched 12 times over and the game would run about as good as God of War
if it was a good port, and just throw it on top of my backlog pile.
"It doesn't count because of reasons"
a plague tale was a multiplat designed for pc ps5 and series x, the medium was also a multiplat and that game doesnt require any big memory budgets, dead space is a multiplat, returnal is a ps5 exclusive though it doesnt require much more graphics memory than last of us 1... this is the first nextgen only port that pushed ps5 to the limit that weve seen ported to pc yet.. i dont consider forspoken since the graphics and the memory footprint doesnt correlate
yes reasons, what do u think magic or something.... theres reasons why some game ports perform better than others and ive outlined them.. what else where u looking for fairy farts or what!
"It doesn't count because of reasons"
i have switched goalposts ur simply switching ur understanding levels... nobody said resolution doesnt require vram... i said it doesnt require more vram than the actuall graphics meaning lighting, assets, textures physics and all that goes on a screen... heres a read on how they made killzone shadow fall you can see the profiler and memory budgets of each data set on video ram ps5 https://www.eurogamer.net/digitalfoundry-inside-killzone-shadow-fall
I'm sorry but fucking what? Then you have the nerve to tell him that "it seems that you don't know how memory works".
Nice goalpost shifting. You realized how utterly stupid this claim was and now try to toe a line but this is just as idiotic. Yes, VRAM has a lot to do with resolution. It's clear as day you don't have the faintest idea what the hell you're talking about.
so in ur head u think a 1080p game on ps4 uses the same memory as the same game at 1080p on a ps5
You need to take the L and move on bubba
That’s not the same thing as saying resolution has nothing to do with VRAM usage, which is what you said, and it’s absolutely false.
so in ur head u think a 1080p game on ps4 uses the same memory as the same game at 1080p on a ps5
Your logic is nonsensical. Rather than doing research and building a narrative around it, you do your research around the narrative you have built. When you were given Returnal as an example, you dismissed it because reasons. It doesn't fit your narrative so it doesn't count.
yes reasons, what do u think magic or something.... theres reasons why some game ports perform better than others and ive outlined them.. what else where u looking for fairy farts or what!
No, that's what you said you filthy liar: yes vram has nothing to do with resolution.
i have switched goalposts ur simply switching ur understanding levels... nobody said resolution doesnt require vram... i said it doesnt require more vram than the actuall graphics
And no way is PS5 running everything at ultra, like the PC benchmarks.
That's the thing these console trolls would rather ignore though... they blame us for buying the expensive, powerful hardware that billion-dollar companies can't seem to properly optimize for. It's like, WTF, how is it our fault when they're the ones making the ports? We end up fixing them most of the time lmao
NO, I will not use a console and give up all the advantages and freedom of playing on PC. No matter how many unoptimized VRAM-munching ports you throw at me.
a 4k frame is 24 mb mate.. what hogs vram is the graphical budget of that frame as ive shown here this is the memory budget of killzone shadow falls vram on ps4
That’s not the same thing as saying resolution has nothing to do with VRAM usage, which is what you said, and it’s absolutely false.
i said that because its insignificant a 4k frame is 24mb and a normal midrange gpu has 8gb... so its negligible... what hogs the memory isnt the resolution.. resolution mostly affects perfromance its not much of a vram bottleneck
Your logic is nonsensical. Rather than doing research and building a narrative around it, you do your research around the narrative you have built. When you were given Returnal as an example, you dismissed it because reasons. It doesn't fit your narrative so it doesn't count.
No, that's what you said you filthy liar: yes vram has nothing to do with resolution.
"VRAM has nothing to do with resolution" is so hilariously false that it's comical that you're still here spouting your nonsense. To top it all off, you can't write, have horrible syntax, and can't punctuate for shit. GTFO.
Bro, if you eliminate all other variables that you’re injecting to try to make your argument true and simply do an apples to apples test, you’d be comparing the same PS4 game at 1080p to 4K, or the same PS5 game at 1080p to the same PS5 game at 4K.
a 4k frame is 24 mb mate.. what hogs vram is the graphical budget of that frame as ive shown here this is the memory budget of killzone shadow falls vram on ps4
because it does.... otherwise why spend the money on an RTX 4090 when a GTX 1050 Ti would run the exact same game? oh wait, it's because one gets significantly higher framerates and more features than the other. If it doesn't, that's not a hardware problem, you clearly fucked up when designing the software
No one gets on PC gamers for choosing PC, we roll our eyes at the selective few who believe purchasing a high end PC entitles them to perfect performance for all cross platform games. That's ignorant thinking that continues to be proven wrong over and over again this generation.
uh....
No one gets on PC gamers for choosing PC,
PC gamers are funny. Constantly credit their hardware when they can run a game as they desire, but blame everything except their hardware when they can't.
PC Gaming is dead in the water, just buy a damn PS5 and be done with it...
PCMR strikes again!!! Pay thousands to play old console games poorly
Too much of Masterrrace
yes ive just said a 4k frame is 24mb and a 1080p frame is about 6mb but both numbers are negligible.. and im not injecting some fairy variables if you understood how computers work this discussion would have been easier but ur simply thick... reolutions affect perfromance not vram a similar 1080p game on ps5 uses more vram than the same game on ps4 because ps5 uses a higher graphical preset, textures, physics and so on.. a game can use even 100gb of vram at 1080p because of whatever its trying to render this has nothing to do with resolution... Problem is average people have been sold the lie that 8gb was enough for 1080p its as if its some magic number..
Bro, if you eliminate all other variables that you’re injecting to try to make your argument true and simply do an apples to apples test, you’d be comparing the same PS4 game at 1080p to 4K, or the same PS5 game at 1080p to the same PS5 game at 4K.
Guess what? Rendering at 4K will use substantially more VRAM than 1080p, in any like for like scenario. You’re wrong.
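The single-frame arithmetic both sides keep citing is easy to check. A quick sketch (assuming 3 bytes per pixel for a plain RGB buffer, which is what the "24 mb" and "about 6mb" figures in the thread imply; real swap-chain formats are usually 4+ bytes per pixel):

```python
# Raw size of a single rendered frame at a given resolution.
# 3 bytes/pixel (plain RGB) reproduces the figures quoted in the thread.
def frame_bytes(width: int, height: int, bytes_per_pixel: int = 3) -> int:
    return width * height * bytes_per_pixel

MB = 1000 * 1000  # decimal megabytes, as used in the thread

print(frame_bytes(1920, 1080) / MB)  # ~6.2 MB ("about 6mb")
print(frame_bytes(3840, 2160) / MB)  # ~24.9 MB ("24 mb")
```

So the single-frame numbers check out; the actual disagreement is about everything else in a renderer that also scales with resolution.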
Yeah, good choice. Don't play this game on medium textures. They are PS3 quality lol
Played the game for 2hrs on Steam (expecting the worst) to see how it would run on my PC - 13600K, RTX 2080, 32GB DDR5.
Ran the game with mostly high settings (medium environment textures), 1440p, DLSS Quality, at ~60-70fps consistently. Image quality is good with DLSS on. Medium environment textures stand out as noticeably worse than The Last of Us Part 2, but that's somewhat expected given the lower VRAM of the 2080. In motion you probably wouldn't notice a difference, but when the camera zooms in, some of the textures look muddy.
Overall I was pleasantly surprised given the reports of crashes (had one crash while alt-tabbing while the prologue loaded) and performance issues. That being said I will be refunding this game to play it on PS5 at a later date or PC with a better GPU.
You talk about resolution as if it's only putting pixels on screen, effects escalate with resolution, it's not only about painting more pixels on the screen.
i said that because its insignificant a 4k frame is 24mb and a normal midrange gpu has 8gb... so its negligible... what hogs the memory isnt the resolution.. resolution mostly affects perfromance its not much of a vram bottleneck
P.S. Does anyone know if GeForce Now's capture uses VRAM? Or does it write straight to system RAM and then dump it onto the SSD? I'm wondering if my benchmarks are going to be affected if I capture video in the background.
which effects! the vram footprint that resolution takes is negligible motion blur, aa and some other effects scale up with resolution but its negligible and
You talk about resolution as if it's only putting pixels on screen, effects escalate with resolution, it's not only about painting more pixels on the screen.
Also, following what you replied to me about the game being memory intensive for being PS5-only: you're dismissing games like A Plague Tale: Requiem, which literally can't run above 1440p on PS5 (a lower resolution than this game) and doesn't even have a 60fps mode on that console.
See the Joel graphical defects? Can you keep saying it's because of a "lack of VRAM because it's optimized for PS5"? Come on, the port is broken; there's no way a game doing way less on screen than APT Requiem runs better than it on PS5 and worse on PC because "PC lacks power" lmao.
You talk like the PS5 is so unique and has such magic hardware that a PC would need to be at least 5 times more powerful to run ports of its exclusives. WTF?
And you don't need 12GB of VRAM to run a PS5 game on PC unless its memory management is a total mess, because you don't only store GPU data there; that's absurd. The game is memory intensive on PC because of that.
They should have just ported the PS4 remaster instead of this fuckup.
Just took some screenshots of what ND says will happen to VRAM usage when you go from 1080p to 4K. Almost 2GB more VRAM is required to go from 1080p to 4K.
Bro, if you eliminate all other variables that you’re injecting to try to make your argument true and simply do an apples to apples test, you’d be comparing the same PS4 game at 1080p to 4K, or the same PS5 game at 1080p to the same PS5 game at 4K.
Guess what? Rendering at 4K will use substantially more VRAM than 1080p, in any like for like scenario. You’re wrong.
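Both claims can be true at once: a single back buffer is small, but a renderer allocates many resolution-dependent targets, and texture streaming typically requests higher-resolution mips at 4K. A purely illustrative sketch (the buffer list and per-pixel byte counts below are assumptions for illustration, not Naughty Dog's actual pipeline):

```python
# Illustrative, assumed per-pixel costs for resolution-scaled render targets
# in a generic deferred renderer. These are NOT Naughty Dog's real numbers.
BUFFERS = {
    "gbuffer (4 targets)": 4 * 8,    # e.g. four 8-byte-per-pixel targets
    "depth + stencil": 4,
    "HDR scene color": 8,
    "TAA history (2 copies)": 2 * 8,
    "post-process chain": 16,
}

def render_target_bytes(width: int, height: int) -> int:
    # Total bytes for all resolution-dependent targets at this resolution.
    return width * height * sum(BUFFERS.values())

GB = 1024 ** 3
delta = render_target_bytes(3840, 2160) - render_target_bytes(1920, 1080)
print(delta / GB)  # ~0.44 GB more at 4K, from render targets alone
```

Render targets alone only explain a fraction of the observed ~2GB delta; in most engines the rest comes from the streaming system selecting higher texture mips when the output resolution rises.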
The PS3 version via RPCS3 holds up quite well.
The remaster for PS4 and Pro holds up supremely well: 4K60, and if you don't download the patch you can downsample to 1080p and it looks fantastic.
Joel has seen better days.