Show me VRAM usage of the same settings in 1080p vs 4K.
I am updating Red Dead 2 right now. When it is done I will compare VRAM usage in 4K vs 1080p. Hold your horses.
Edit:
You should be able to take the difference and divide by 3 to get the total 1080p frame buffer size, then multiply that by 4 to get the 4K total, since a 4K frame has exactly four times the pixels of a 1080p frame.
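A quick sketch of that arithmetic in Python (the variable names are mine; the readings are the RDR2 measurements from the next post):

```python
# Back-of-the-envelope: resolution-dependent buffers scale with pixel
# count, and 4K (3840x2160) has exactly 4x the pixels of 1080p (1920x1080).
vram_1080p_mb = 4140   # measured at 1080p
vram_4k_mb = 5279      # measured at 4K, same settings

diff = vram_4k_mb - vram_1080p_mb  # only the resolution-dependent part grows
buffers_1080p = diff / 3           # 4B - B = 3B  ->  B = diff / 3
buffers_4k = buffers_1080p * 4     # 4x the pixels -> 4x the bytes

print(f"1080p buffers ~ {buffers_1080p:.0f} MB")  # ~380 MB
print(f"4K buffers    ~ {buffers_4k:.0f} MB")     # ~1519 MB
```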
Show me the data, same graphics settings, at 1080p and 4K. Video or screenshots...
On RDR2 at 1080p, VRAM usage is 4,140 MB.
At 4K it's 5,279 MB.
So the 1080p frame buffers total about 380 MB; the 4K buffers about 1,519 MB.
That's a pretty big chunk of RAM. Of course DLSS will get you memory savings.
Edit: these numbers could change a bit depending on which features are on. I don't have screen-space ambient occlusion enabled, which seems to add another 45 MB at 4K.
And exactly, some settings scale with resolution, like motion blur, AA, or, like you say, ambient occlusion, since they are screen-space effects. But this isn't down to how a game has to be designed: developers can target a 4K frame but stop some graphical settings from resolving at the screen resolution, for instance half-res effects. So even though a 4K frame itself doesn't take substantial VRAM, on PC some effects automatically resolve at the output resolution, wasting memory. It isn't the resolution itself eating memory, it's the effects. If you have a profiler to check what's happening, you'll see the culprit.
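As a rough illustration of the half-res point (the buffer format is an assumption, not RDR2's actual one), an effect rendered at half resolution per axis needs only a quarter of the memory:

```python
# Hypothetical screen-space effect buffer; 2 bytes/pixel is an assumed
# format (e.g. a 16-bit AO term), not pulled from any real game.
def buffer_mb(width: int, height: int, bytes_per_pixel: int) -> float:
    return width * height * bytes_per_pixel / (1024 * 1024)

W, H = 3840, 2160  # 4K output

full_res = buffer_mb(W, H, 2)            # resolves at screen resolution
half_res = buffer_mb(W // 2, H // 2, 2)  # half res per axis = 1/4 the memory

print(f"effect at full res: {full_res:.1f} MB")  # ~15.8 MB
print(f"effect at half res: {half_res:.1f} MB")  # ~4.0 MB
```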
One thing I am impressed by is that Naughty Dog has released two patches in two days. This is the kind of behavior I expect from developers (especially non-indie developers) when they first release a game. They didn't just release the game and then go on vacation for a few weeks. The patches have rolled in far quicker than Guerrilla Games' patches for Horizon Zero Dawn.
I just gave you the numbers you wanted. Those effects are part of the buffers; the higher the resolution, the larger those buffers are.
More like they knew the game was in a bad state and are actively working on it.
PC gamers are funny. Constantly credit their hardware when they can run a game as they desire, but blame everything except their hardware when they can't.
That's not the resolution's fault. You can render those effects at 1080p instead of 4K. The problem is that most PC games, if not all, just automatically scale those effects to resolve at the output resolution, and that's why you get a beefed-up waste of VRAM when in actuality the resolution isn't what's eating the VRAM.
And as I showed, they take up a substantial portion of VRAM. And that is what you challenged. Own up to it and let's move on. Everyone gets things wrong sometimes.
Post an alligator next time, Mr. 'Attenboring'.
Again, you're changing the argument, which was whether the frame buffer uses a lot of VRAM. It does, especially at 4K.
Plus, you are talking about lowering the quality of effects. In the numbers I gave you, I didn't change the effects; they were exactly the same. So your point is moot.
I'm saying the resolution isn't at fault. The VRAM usage isn't high because of the resolution; it's because some of the effects scale up automatically with the resolution, and that's something console developers mostly have control over. This is what profilers are for: they show you the memory budget and the frame budget. What happens on PC is irregular, with no optimization or control of what the frame is doing. In reality a 4K frame is about 24 MB; if you decide to bloat that up by wasting VRAM, making every effect resolve at 4K, that's irrelevant to how much a 4K frame costs.
Those effects have nothing to do with resolution; they simply scale up automatically, which is stupid for a game designer and has nothing to do with resolution. Why would you want ambient occlusion to resolve at 4K res when you can lower it? A 4K frame is simply under 32 MB; it's the unoptimized or uncontrolled effects that scale up to it. It's not the resolution's fault, again.
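For what it's worth, both the "24 MB" and "under 32 MB" figures fall out of simple pixel math; what makes a real 4K frame cost far more is that a renderer keeps many resolution-sized targets alive at once. A sketch with assumed formats (the counts and formats below are illustrative, not any specific game's):

```python
W, H = 3840, 2160      # 4K
px = W * H             # 8,294,400 pixels

def mb(total_bytes: int) -> float:
    return total_bytes / (1024 * 1024)

print(f"RGB8 target:  {mb(px * 3):.1f} MB")  # ~23.7 MB -> the '24 MB' figure
print(f"RGBA8 target: {mb(px * 4):.1f} MB")  # ~31.6 MB -> 'under 32 MB'

# A modern renderer holds many such targets at once; these counts and
# formats are assumptions for illustration only.
gbuffer = 4 * px * 4   # four RGBA8 G-buffer layers
depth   = px * 4       # 32-bit depth
hdr     = px * 8       # RGBA16F lighting target
history = 2 * px * 8   # TAA / motion history buffers
print(f"Illustrative 4K target stack: {mb(gbuffer + depth + hdr + history):.0f} MB")
```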
I remember you now.
Didn't I buy this used GPU off you a while back? ;)
In all seriousness, looking forward to the results, as I do love to see a rig used to its maximum potential.
Thought so but it's been so long that I only have vague recollections. Too bad the site is pretty much dead now.
In that case, I'm looking forward to seeing your results.
As do I. It hurts when I see systems not being maximized to their potential, but I get it. A lot of people just want simple.
Couple vids.
"test test" is a like for like w/ Jansn. The rest are just random.
I take a measured, averaged 5.5% performance hit throughout all these videos with AMD ReLive, as I'm GPU-bound. So if I'm showing 72, I'm actually at 76-77, and so on.
On average I am 24% faster than him, and at most 28% faster on the same run. He uses a capture card, as noted in his description, so there is no performance hit on his side. His lowest was 58 and mine was 76 (72 with AMD ReLive capturing) on the same run.
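A small sketch of how the corrected numbers can be derived, assuming the capture costs a fixed fraction of GPU throughput while GPU-bound (the 5.5% figure is from the post above; the formula is my own approximation):

```python
# Estimate uncaptured FPS from a recorded run under a fixed-fraction
# capture overhead. 0.055 is the measured average ReLive hit quoted above.
CAPTURE_OVERHEAD = 0.055

def uncaptured_fps(recorded_fps: float, overhead: float = CAPTURE_OVERHEAD) -> float:
    return recorded_fps / (1.0 - overhead)

print(f"{uncaptured_fps(72):.0f} fps")  # 72 recorded -> ~76 fps uncaptured
print(f"{uncaptured_fps(90):.0f} fps")  # 90 recorded -> ~95 fps uncaptured
```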
Some links are still processing at higher quality. They're unlisted, but the links should work. Posted them via a burner account.
As I said, I average between 75-95 at 1440p max/ultra, native. It peaks at 100+ quite often in cut scenes, indoors, and random areas, but I didn't waste time recording those. Add around 25-30% if using FSR2 Quality, or 30% if using high @ native.
When someone knows the ins and outs of OC'ing, the OS, and in general any hardware they're using, it's a completely different ball game. I've been at this for over 20 years, so I guess you could consider me the exception.
But anyway, this was done in a 4.9-liter case. I could easily push the 6800 XT further if it weren't. It is currently the most powerful build under 5 liters known on the internet, although that will change when I finish building the dual-slot 4090 4.9-liter system.
Nice - Showed all the settings as well.
Looks like you have the same camera-panning stutter that has stopped me playing through it. Hard to discern on YouTube with its 60 fps limit.
I'm presuming the first temp on RTSS is the CPU in those videos, which I think you said is undervolted. What cooling are you using for it in such a small case?
This has reminded me to create a performance profile for my 12900k and see what I can wring out of it.
I'm not getting any stutter on my end. Not sure why it came out that way, but it's super smooth. I do, however, sometimes have that weird issue when using the mouse; it doesn't happen with a controller, which is what I've been using.
Delidded 13700K running bare die (no IHS) on an AXP-90 X36 with a 92mm Noctua fan (from an L9i) and a 3mm foam fan duct to minimize turbulence.
6800 XT (Dell OEM 2-slot version, the only true 2-slot there is for the 6800 XT) has been repasted with liquid metal and repadded with 15 W/mK thermal pads.
I only experience it using the mouse as well
My friend cracked his 12900K die after delidding, being too rough removing the solder (I think you can get some chemicals now that help dissolve it). Put me off doing mine.
Ah, well, 13th gen doesn't have any SMDs that are at risk with the delidding tool (Rockit Cool), unlike 12th gen, so it's a piece of cake in comparison. It seems odd that rubbing the solder off is what killed his CPU; my guess is he chipped one of the SMDs. But yeah, some quicksilver takes it off in a jiffy.
He's 6ft 11" and has sausage fingers. Not the kind of hands for delicate work
It seems that this game is using D3D11on12.dll, which could be the main reason for the high CPU usage, RAM usage, and stutter, like The Witcher 3's RTX update.
No need to quote this nonsense about "it uses DX11on12". There is no such thing as DX11on12; there is D3D11on12. And the game process doesn't use it at all. For a person who has actually programmed something in both D3D11 and D3D12, it takes just a few seconds to peek inside the executable and see that it contains a native D3D12 renderer.
Fun thing: the person who launched the "it uses D3D11on12" rumor on Twitter also claims that The Witcher 3 initially used it (which is simply a lie), which makes clear that he has no idea what he is talking about. He noticed D3D11on12.dll loaded in the process context and used it as proof of his claim, and then gamers started spreading this tweet with false info like a virus.
Poor guy had no idea that D3D11on12 is loaded into the context of ANY native D3D12 Steam game simply because the Steam overlay uses it.
Summarizing: never use Twitter or Reddit to get technical info.
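A rough illustration of why the DLL sighting proves nothing (this is not Unwinder's actual method; the process name is a placeholder and psutil may need admin rights to read another process's maps):

```python
# Merely seeing d3d11on12.dll mapped into a game process proves nothing,
# because the Steam overlay injects it into any native D3D12 title.
import psutil

GAME_EXE = "game.exe"  # hypothetical process name

for proc in psutil.process_iter(["name"]):
    if (proc.info["name"] or "").lower() == GAME_EXE:
        mapped = [m.path.lower() for m in proc.memory_maps(grouped=True)]
        has_11on12 = any(p.endswith("d3d11on12.dll") for p in mapped)
        has_d3d12 = any(p.endswith("d3d12.dll") for p in mapped)
        print(f"d3d11on12 mapped: {has_11on12}, d3d12 mapped: {has_d3d12}")
        # Both can be True at once: the overlay pulls in d3d11on12 even
        # when the game's own renderer is native D3D12.
```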
That is bullshit. Here is the analysis of a programmer:
The Last of Us Part I PC Port Receives 77% negative ratings on Steam, due to poor optimization
Also this port IS terrible, it uses DX11on12, AVX512, and batters a 13900K or 7950X for a game that looks no better than any game released in the last... (forums.guru3d.com)
Yeah, that shit about Witcher 3 was debunked too
People with no knowledge are looking into files and starting drama.
Not just "a programmer", Unwinder is the developer of MSI Afterburner and RivaTuner, i.e. an extremely knowledgable programmer when it comes to PC gfx.That is bullshit. Here is the analysis of a programmer:
2nd update has improved things for me, massively. There's still slight stutter in new areas but it's nowhere near as distracting as before. Keep 'em coming.
That moment when a 3060 beats a 3070.
The Last of Us Part I Benchmark Test & Performance Analysis Review
The Last of Us is finally available for PC. This was the reason many people bought a PlayStation; it's a masterpiece that you can't miss. In our performance review, we're taking a closer look at image quality, VRAM usage, and performance on a selection of modern graphics cards. (www.techpowerup.com)
Ice cold lol.
So proud of my GPU *wipes tear from eye*
And yet people are going to tell you VRAM doesn’t matter.
Edit: Cripes, the 6800XT is leaving the 3070 in the dust.
Which NVIDIA GPU would you get if you want to do 1440p high settings with RT in future games? Cause it's looking pretty clear that 8 GB cards are becoming obsolete lol.
When you fail to defuse the booby trap in time.
Yeah, Hardware Unboxed starting off with "I think the reason for this is quite obvious, it does appear to be a VRAM issue".
Wow
The game has a memory leak, maybe start off with that?
"as of now, a few days after this terrible port with memory leak was launched on PC, if you have 8GB you're fucked"
Proceeds to make a 15-minute video for a version that shouldn't even have launched in this state.
Well, the TechPowerUp slides confirm his theory.
No, because the game has documented memory issues.
I actually confirmed that this game has memory leak issues a couple of days ago, when I ran the same benchmark multiple times and the performance degraded every time. I agree that they should be taking the potential memory leak problems into account, but PCs have always brute-forced stuff. That's why I bought a GPU 2x more powerful than the PS5's: so I can brute-force anything.
It's a snapshot in time: right "now" there's a problem that the devs are investigating. I don't care that you analyze version 1.0 and say it like it is, but journalists should at least fucking mention that there are ongoing issues the devs are investigating. What happened to basic journalism?
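A sketch of the kind of check described above: sample VRAM after each identical benchmark pass and look for steady growth. It uses NVIDIA's pynvml bindings; the run_benchmark step is a placeholder, not a real hook into the game.

```python
# Detect a VRAM leak by repeating an identical benchmark and watching
# usage climb. pynvml is NVIDIA's Python binding for NVML.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

def vram_used_mb() -> float:
    return pynvml.nvmlDeviceGetMemoryInfo(gpu).used / (1024 ** 2)

samples = []
for run in range(5):
    # run_benchmark()  # placeholder: trigger the same in-game benchmark
    time.sleep(60)      # stand-in for the benchmark's duration
    samples.append(vram_used_mb())
    print(f"run {run + 1}: {samples[-1]:.0f} MB used")

# Steadily rising usage across identical runs points at a leak.
if all(b > a for a, b in zip(samples, samples[1:])):
    print("VRAM usage grew every run: likely leaking")
```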
VERSION 1.0.1.6 PATCH NOTES FOR PC
- Decreased PSO cache size to reduce memory requirements and minimize Out of Memory crashes
- Added additional diagnostics for developer tracking purposes
- Increased animation streaming memory to improve performance during gameplay and cinematics
- Fix for crash on first boot
KNOWN ISSUES BEING INVESTIGATED
- Loading shaders takes longer than expected
- Performance and stability is degraded while shaders are loading in the background
- Older graphics drivers lead to instability and/or graphical problems
- Game may be unable to boot despite meeting the minimum system requirements
- A potential memory leak
- Mouse and camera jitter for some players, depending on hardware and display settings
The tEcHTuBerS already making a dozen videos to trash VRAM is just quick clickbait drama. If we still see that pattern after the patches, then yes.
But at 1440p, 8 GB should be plenty fine. All the games with memory leaks lately have been insanely damaging to these VRAM talks. These devs don't know wtf they are doing. What happens is that more VRAM lets you brute-force your way out of incompetence.
If A Plague Tale: Requiem, looking like it does, manages its memory usage so that the 3070 is fine even at 4K, devs have no excuses.
Of course, even the non-XT is better than a 3070/Ti.
Well, I keep getting the bug where everyone's suddenly dripping wet for no reason lol.
It's because the CPU is running at 100% and 95 degrees, causing everyone to sweat profusely.