
NXGamer: Resident Evil 4: REmake - PS5, Xbox Series X, PC, Steam Deck Technical Review

01011001

Banned
lol, the PC version is trash too. Turned on RT and hair strands to play at 1440p, and it crashed as soon as I went into the village and started recording. I had 2.5 GB of VRAM left and 32 GB of RAM available, but whatever.

Then I decided not to record, and saw that the performance went from 120 fps all the way down to 74 fps during gameplay. Fuck that.

The last cutscene was insane. The game went all the way down to 55 fps with the burning dude as the titles appeared. With RT and hair off, it's a locked 60 fps at native 4K and a locked 120 fps at 1440p in that cutscene, with 10-15% of the GPU to spare. In the RT mode, the GPU was maxed out at 100% in that cutscene.

The game is clearly not optimized around RT and hair strands. Turn those off.

RT works fine as long as the VRAM bar in the options isn't any worse than yellow, I think, but ideally it shouldn't even be that.
Interlacing helps, as does reducing texture quality to 2 GB High (or 1 GB if your card needs it).

But no, it doesn't run great for the hardware used.
My card struggles to even remotely keep up with the consoles, and that's usually not the case when a game runs well on PC: I'm usually slightly behind the consoles without RT and ahead of them with RT.
That isn't the case here.
 
Last edited:

kingyala

Banned
There is no need for a PS5 Pro. They wouldn't be able to replace that GPU or CPU with much better stuff anyway.
There is a need for better development.
People don't realise it's an optimization problem and not a hardware problem... The Last of Us Part II ran on a PS4; people forget quickly. Unoptimized games make it look like the hardware is bad, but it's the other way around. It's why we have Arkham Knight looking that good on a PS4, while Gotham Knights not only looks bad but is also locked at 30 fps on current consoles.
 

kingyala

Banned
lol, the PC version is trash too. Turned on RT and hair strands to play at 1440p, and it crashed as soon as I went into the village and started recording. I had 2.5 GB of VRAM left and 32 GB of RAM available, but whatever.

Then I decided not to record, and saw that the performance went from 120 fps all the way down to 74 fps during gameplay. Fuck that.

The last cutscene was insane. The game went all the way down to 55 fps with the burning dude as the titles appeared. With RT and hair off, it's a locked 60 fps at native 4K and a locked 120 fps at 1440p in that cutscene, with 10-15% of the GPU to spare. In the RT mode, the GPU was maxed out at 100% in that cutscene.

The game is clearly not optimized around RT and hair strands. Turn those off.
This is why I said this game isn't really taxing the hardware; it's just a badly optimized mess. There's just no way a game like The Last of Us Part II ran on a PS4 while this game drops into the 30s on a Series X, crashes on a PC with 32 GB of RAM, and has all those IQ problems on PS5, yet people naively jump to saying it's a hardware limitation... without context.
 

Moses85

Member
That's a lot of fucking hope, my friend.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Traversal stutter, high VRAM usage, and crashing for lots of people when RT is enabled (mentioned in the video).

We've become so accustomed to being treated like trash that this is viewed as a good release somehow.
Crashing for lots of people when RT is enabled?
Who is "lots of people"? Because there's very little about that on Steam.
If it were actually an issue, Steam would be blowing up about it.


High VRAM usage?
What constitutes high VRAM usage?
And when does using 10 GB of VRAM at 4K become just normal VRAM usage?
How much VRAM would you suppose is "normal" VRAM usage at 4K?

Games were using ~8 GB of VRAM at 4K three or four years ago... are you expecting them to stay at ~8 GB forever?
[Attached images: vram.png ×4]
 
Looking at some of the posts here is making me want to sell off my 3080 and go spend a stupid amount of money on a 4090. The game looks good, but not so good that it should struggle to run at max settings at 1440p above 60 fps.

That VRAM usage is just insane.
 

emivita

Member
It's already too sunny and warm out there for me to be playing RE4R and 'feeling' it. The smartest thing to do is wait until next October-November, when it will be appropriately patched and discounted and the weather is atmospheric. The game was obviously rushed for release, and there's really no urgency to play such a remake unless you've never played the original.
 

winjer

Gold Member
Looking at some of the posts here is making me want to sell off my 3080 and go spend a stupid amount of money on a 4090. The game looks good, but not so good that it should struggle to run at max settings at 1440p above 60 fps.

That VRAM usage is just insane.

VRAM usage is not insane in these modern games. Consoles have 16 GB, and they are the lowest common denominator.
For a couple of years we got mostly cross-gen games, which had limited graphics settings. But now things are changing, and games are requiring more VRAM.
The issue is that Nvidia insists on offering the least amount of VRAM it can, even on expensive GPUs.
The PS5 and Series X specs were presented many months before the RTX 3000 series launched. But still, Nvidia kept the 3080 at 10 GB of VRAM.
The 3080 is still a powerful GPU, much more so than any console, but that 10 GB of VRAM is limiting it already.
And the reason is simple: it will force gamers to upgrade sooner.
 

paulyboy81

Neo Member
I actually disagree with the video about which version feels better to play on a VRR display.

I've been playing the demo on both this morning in Resolution/RT mode, and despite the performance deficit, the Xbox has much cleaner motion in practice. The PS5 lurches out of its VRR range far too often for my liking, causing some fairly jarring stutter. Even when the Xbox version dips to its lowest, it still feels pretty consistent for the most part.

That said, without VRR I wouldn't touch any mode except Performance/No RT with a barge pole.

Interested to see DF's coverage at the weekend. Given that RE Engine has favoured Series X previously, the performance discrepancy here is a little odd. I still wouldn't be surprised if there's something going on resolution-wise, slightly different checkerboarding base resolutions or something. Even the VGTech video mentions that horizontal resolution is halved on PS5 with CA switched on; it's all a little odd.
 
Last edited:

MikeM

Member
VRAM usage is not insane in these modern games. Consoles have 16 GB, and they are the lowest common denominator.
For a couple of years we got mostly cross-gen games, which had limited graphics settings. But now things are changing, and games are requiring more VRAM.
The issue is that Nvidia insists on offering the least amount of VRAM it can, even on expensive GPUs.
The PS5 and Series X specs were presented many months before the RTX 3000 series launched. But still, Nvidia kept the 3080 at 10 GB of VRAM.
The 3080 is still a powerful GPU, much more so than any console, but that 10 GB of VRAM is limiting it already.
And the reason is simple: it will force gamers to upgrade sooner.
It's why I bought a 7900 XT over the 4070 Ti.
 

DenchDeckard

Moderated wildly
I actually disagree with the video about which version feels better to play on a VRR display.

I've been playing the demo on both this morning in Resolution/RT mode, and despite the performance deficit, the Xbox has much cleaner motion in practice. The PS5 lurches out of its VRR range far too often for my liking, causing some fairly jarring stutter. Even when the Xbox version dips to its lowest, it still feels pretty consistent for the most part.

That said, without VRR I wouldn't touch any mode except Performance/No RT with a barge pole.

Interested to see DF's coverage at the weekend. Given that RE Engine has favoured Series X previously, the performance discrepancy here is a little odd. I still wouldn't be surprised if there's something going on resolution-wise, slightly different checkerboarding base resolutions or something. Even the VGTech video mentions that horizontal resolution is halved on PS5 with CA switched on; it's all a little odd.

My thoughts exactly, from playing both versions a ton over the last week.
 

The Cockatrice

Gold Member
- The game is very demanding on GPU (and CPU). Don't expect 8GB cards to run this at 4K
Wrong. It uses the same texture memory budget system as the previous RE Engine games: you can choose how much memory the textures use, down to as little as 1.5 GB. Everyone will be fine, even at 4K.
 

The Cockatrice

Gold Member
You keep denying reality.

Haven't played any games with demanding VRAM requirements. Even next-gen games like A Plague Tale: Requiem barely reach 8 GB of actual VRAM use. Every single misinformed person like you takes allocated VRAM as actual VRAM usage. RE4R allocates a lot of VRAM, up to 10 GB or more the more VRAM your GPU has; that's there to reduce stutters and other issues. Most games do this, but the actual usage figure, which hardly anyone checks, has to be enabled in MSI Afterburner separately from the plain VRAM counter. It's called per-process VRAM.
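
For anyone who wants to check this themselves outside Afterburner, here's a minimal sketch (mine, not from the video or the game) of how a Windows tool can read the current process's actual video-memory usage versus the budget the OS grants it, via DXGI's QueryVideoMemoryInfo; that usage-vs-budget split is roughly the per-process-versus-allocated distinction being made here.

Code:
// Minimal sketch: per-process VRAM usage vs. OS budget on Windows via DXGI.
// Assumes Windows 10+; link with dxgi.lib.
#include <windows.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        ComPtr<IDXGIAdapter3> adapter3;
        if (FAILED(adapter.As(&adapter3)))
            continue;

        DXGI_QUERY_VIDEO_MEMORY_INFO info{};
        if (SUCCEEDED(adapter3->QueryVideoMemoryInfo(
                0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info)))
        {
            // CurrentUsage = what this process actually has resident in VRAM;
            // Budget       = how much the OS is currently willing to give it.
            printf("Adapter %u: usage %.2f GB / budget %.2f GB\n", i,
                   info.CurrentUsage / (1024.0 * 1024.0 * 1024.0),
                   info.Budget / (1024.0 * 1024.0 * 1024.0));
        }
    }
    return 0;
}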
 
Last edited:

winjer

Gold Member
Haven't played any games with demanding VRAM requirements. Even next-gen games like A Plague Tale: Requiem barely reach 8 GB of actual VRAM use. Every single misinformed person like you takes allocated VRAM as actual VRAM usage. RE4R allocates a lot of VRAM, up to 10 GB or more the more VRAM your GPU has; that's there to reduce stutters and other issues. Most games do this, but the actual usage figure, which hardly anyone checks, has to be enabled in MSI Afterburner separately from the plain VRAM counter. It's called per-process VRAM.

We have already seen a few games using a lot of VRAM, and the trend is for that to keep increasing, as it always has.
You can whine all you want and call others misinformed, but the reality is that games' VRAM usage is going up.
 

The Cockatrice

Gold Member
We have already seen a few games using a lot of VRAM, and the trend is for that to keep increasing, as it always has.
You can whine all you want and call others misinformed, but the reality is that games' VRAM usage is going up.
[Attached screenshot: jvg9zy7yt7na1.png]


Dedicated vs. allocated are terms the average gamer has no idea about, and apparently neither do most tech YouTubers. That screenshot is at max settings at 4K with textures set to 8 GB, which is pointless. I swear, every single time, you oblivious doom-bringers drop into topics like these to scream OMG NOT ENOUGH VRAM. Like, Jesus. You have not seen any game use that much VRAM. There is none, unless you play at some absurdly extreme texture resolution above 4K textures, or the game is trash-optimized like Forspoken.
 
Last edited:

GymWolf

Member
RT works fine as long as the VRAM bar in the options isn't any worse than yellow, I think, but ideally it shouldn't even be that.
Interlacing helps, as does reducing texture quality to 2 GB High (or 1 GB if your card needs it).

But no, it doesn't run great for the hardware used.
My card struggles to even remotely keep up with the consoles, and that's usually not the case when a game runs well on PC: I'm usually slightly behind the consoles without RT and ahead of them with RT.
That isn't the case here.
The problem is that the game doesn't look nearly good enough to be this heavy...

Atomic Heart looks as good, if not better, and it runs MUCH, MUCH better (and Mundfish is not fucking Capcom).
 

winjer

Gold Member
Dedicated vs. allocated are terms the average gamer has no idea about, and apparently neither do most tech YouTubers. That screenshot is at max settings at 4K with textures set to 8 GB, which is pointless. I swear, every single time, you oblivious doom-bringers drop into topics like these to scream OMG NOT ENOUGH VRAM. Like, Jesus. You have not seen any game use that much VRAM. There is none, unless you play at some absurdly extreme texture resolution above 4K textures, or the game is trash-optimized like Forspoken.

LOL, you discovered that allocated VRAM and used VRAM are not the same. Congratulations.
That does not change the fact that games are using more and more VRAM.
And when a GPU doesn't have enough VRAM, it has to go over the PCIe bus and fetch data from system RAM or even the SSD. And that causes stutters and performance loss.
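
To put rough numbers on why that spill hurts (my own back-of-envelope figures, not from the video): PCIe 4.0 x16 moves maybe 25-30 GB/s in practice, while a 3080's GDDR6X is around 760 GB/s, so a texture that has to come across the bus mid-frame costs an order of magnitude more time.

Code:
// Back-of-envelope sketch: why spilling textures over PCIe stutters.
// All figures below are rough assumptions, not measurements from RE4R.
#include <cstdio>

int main()
{
    const double textureGB = 64.0 / 1024.0; // one ~64 MiB texture/mip chain
    const double pcieGBs   = 28.0;          // effective PCIe 4.0 x16
    const double vramGBs   = 760.0;         // RTX 3080 GDDR6X bandwidth

    double overPcieMs = textureGB / pcieGBs * 1000.0; // ~2.2 ms
    double inVramMs   = textureGB / vramGBs * 1000.0; // ~0.08 ms

    // At 60 fps a whole frame is only ~16.7 ms, so a handful of textures
    // pulled across the bus in one frame is already a visible hitch.
    printf("over PCIe: %.2f ms, in VRAM: %.3f ms\n", overPcieMs, inVramMs);
    return 0;
}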
 

Mr Reasonable

Completely Unreasonable
I think the controls are way worse on Series; a lot of people compared it with the PS5 on ResetEra.

I think the deadzone is something like 12% on PS5 and 37% on Series, which is a huge difference.

Pray for a day-one patch miracle.
If the controller deadzone is the problem, won't the devs merely adjust a setting?

Sounds like an easy fix, not something that needs a miracle, right? Or do Capcom not issue patches?
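
It usually is just a setting. A stick deadzone is typically little more than a constant in the input-remapping code; here's a minimal sketch (my own illustration, only borrowing the 12%/37% figures quoted above as example values, not Capcom's code) of a scaled radial deadzone:

Code:
// Minimal sketch of a scaled radial deadzone. The 0.12f / 0.37f values
// below are just the percentages quoted in the thread, used as examples.
#include <cmath>
#include <cstdio>

struct Stick { float x, y; };

// Ignores input inside `deadzone` and rescales the rest to 0..1,
// preserving the stick's direction.
Stick ApplyRadialDeadzone(Stick raw, float deadzone)
{
    float mag = std::sqrt(raw.x * raw.x + raw.y * raw.y);
    if (mag <= deadzone)
        return {0.0f, 0.0f};

    float scaled = (mag - deadzone) / (1.0f - deadzone);
    if (scaled > 1.0f) scaled = 1.0f;
    return {raw.x / mag * scaled, raw.y / mag * scaled};
}

int main()
{
    Stick nudge{0.25f, 0.0f};                       // a gentle 25% tilt
    Stick ps5 = ApplyRadialDeadzone(nudge, 0.12f);  // ~0.15 -> aim moves
    Stick xsx = ApplyRadialDeadzone(nudge, 0.37f);  // 0.0  -> input ignored
    printf("12%% deadzone: %.2f, 37%% deadzone: %.2f\n", ps5.x, xsx.x);
    return 0;
}

Shrinking that constant (or exposing it as a slider) is the kind of change a patch can make easily.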
 
Last edited:

The Cockatrice

Gold Member
LOL, you discovered that allocated VRAM and used VRAM are not the same. Congratulations.
That does not change the fact that games are using more and more VRAM.
And when a GPU doesn't have enough VRAM, it has to go over the PCIe bus and fetch data from system RAM or even the SSD. And that causes stutters and performance loss.

I "discovered" something that you clearly had no idea. "LOL!". It changes everything you said, it proves you have no idea what the fuck youre talking about. Stop scaring people with lies and bs.
 

winjer

Gold Member
I "discovered" something that you clearly had no idea. "LOL!". It changes everything you said, it proves you have no idea what the fuck youre talking about. Stop scaring people with lies and bs.

What is a new discovery to you is common knowledge for people like me.
I've been following hardware developments for over two decades.
Your argument rests on nothing but insults, which shows how much you really understand about tech.
 

The Cockatrice

Gold Member
What is a new discovery to you is common knowledge for people like me.
I've been following hardware developments for over two decades.
Your argument rests on nothing but insults, which shows how much you really understand about tech.

Well, I didn't know you were a complete idiot, but this post pretty much seals the deal. I provided actual proof against your false VRAM claims, and you immediately came back with me being "new" and having "just discovered this" as an excuse for your false claims, and now you reply with more childish shit like "I have over two decades of experience". Lmao. OK. Ignored.
 

winjer

Gold Member
Well, I didn't know you were a complete idiot, but this post pretty much seals the deal. I provided actual proof against your false VRAM claims, and you immediately came back with me being "new" and having "just discovered this" as an excuse for your false claims, and now you reply with more childish shit like "I have over two decades of experience". Lmao. OK. Ignored.

Your proof amounts to insulting other people and giving one game as an example, while ignoring games that already use more VRAM, and ignoring the constant trend of VRAM requirements increasing that has been going on since the first graphics cards.
 

Lysandros

Member
Interested to see DF's coverage at the weekend. Given that RE Engine has favoured Series X previously, the performance discrepancy here is a little odd. I still wouldn't be surprised if there's something going on resolution-wise, slightly different checkerboarding base resolutions or something. Even the VGTech video mentions that horizontal resolution is halved on PS5 with CA switched on; it's all a little odd.
There isn't such a universal thing. Monster Hunter Rise, using the same RE Engine, favoured PS5 by a fair margin. DMC 5's 120 fps mode is another example. It depends on the game.
 
Last edited:

Ev1L AuRoN

Member
Playing the demo on PC, I find it very hard to aim; maybe it's this high-latency issue, man, and it's an AMD-sponsored game, so no Reflex. This game is so unresponsive, it's just embarrassing.
 

M1chl

Currently Gif and Meme Champion
[Attached screenshot: re4demo2023032022365.jpg]

Runs like poo at 4K interlaced on my 3060 (32 fps at this part). 1440p runs fine without RT and the hair crap.
But my brother in Christ, does it even look good?! This sort of screenshot looks like something with 360-era graphical complexity, just rendered out at screenshot resolution. Does it even use tessellation (look at those rocks)? Am I going crazy or something? Where the fuck is the GPU power going in this game?
 

lh032

I cry about Xbox and hate PlayStation.
If the controller deadzone is the problem, won't the devs merely adjust a setting?

Sounds like an easy fix, not something that needs a miracle, right? Or do Capcom not issue patches?
Some people said that the deadzone issue on Xbox was never fixed for RE Village and previous RE titles; not sure how true that is.

I do hope Capcom can fix it for both platforms.
 
Last edited:

Mr Reasonable

Completely Unreasonable
Some people said that the deadzone issue on Xbox was never fixed for RE Village and previous RE titles; not sure how true that is.

I do hope Capcom can fix it for both platforms.
Ah right, I'll take that as encouraging then. I played through Village and RE7 and didn't think there was a problem.
 

MarkMe2525

Gold Member
VGTech and NXGamer give the same 1944p resolution figure for PS5 and Series X without DRS in performance mode (DF said 1440p on PS5, so they would have been wrong) and 2160p in resolution mode, plus much better performance for PS5 in RT mode and slightly better in performance mode (also contradicting DF xDD).
To be fair, all DF has released is a "preview". They are waiting on the day-one patch to release their full technical breakdown, in order to have a more complete picture of what we will actually be playing. The discrepancy could simply come down to the limited number of samples taken to pixel-peep. Or......... I could just be wrong 😃 and they are just really bad at pixel-peeping (I don't believe this to be true).
 
Some people said that the deadzone issue on Xbox was never fixed for RE Village and previous RE titles; not sure how true that is.

I do hope Capcom can fix it for both platforms.

lh032, to be fair, Capcom DID release a fix/patch for RE8's stick deadzone/square-aim controls on the Xbox consoles. I don't remember exactly how long it took them, but it did get fixed ;) (finished it on my XSX).

I'm PS5-only now, so I've got no horse in this race, friend. Maybe the PS versions of RE8 played better than the Xbox ones in the beginning, but Capcom did fix the deadzone/stick issues. And speaking of PlayStation (and issues)... I think I'm gonna wait this one out. Like a couple of users already mentioned, RE4R has some of the worst image quality I've ever encountered; something's wrong with it, and I don't even know if it can be fixed/patched. It's a damn shame.
 

rodrigolfp

Haptic Gamepads 4 Life
Playing the demo on PC, I find it very hard to aim; maybe it's this high-latency issue, man, and it's an AMD-sponsored game, so no Reflex. This game is so unresponsive, it's just embarrassing.
Using a controller? There is a deadzone on the sticks. M+KB doesn't have this issue.
 

Skyfox

Member
Sorry if this has already been posted, but I hope Capcom are aware.



I had major controller issues on my Series X with the demo. Because it has a work-in-progress message at the start, I figured the problem wouldn't be in the retail version, but apparently it is 🤯
 
Tempted to cancel my Xbox pre-order and just go PC. I prefer to play on my OLED rather than my monitor, but as of right now my PC is hooked up to my OLED anyway.
 

Umbasaborne

Banned
People saying there's no need for a PS5 Pro.

The thing is, though, that this seems like poor optimization instead. There's no reason the current consoles should be struggling with these games at this point. If you're gonna be buying a new Pro console every 2-3 years, you may as well just buy a PC. Consoles will always be 30 fps machines, maybe 40 or 60 if a game is really well optimized like Horizon Forbidden West. As long as companies target that, things like IQ should hold up just fine.
 

SlimySnake

Flashless at the Golden Globes
That's why I got a 3080 12GB, as consoles can use something like 10GB for VRAM. Gotta stay above them.
Same, but my 12 GB card was drawing 400+ watts and crashing all the time, so I just ended up returning it for a 10 GB model that tops out at 300 watts. But man, with RT on I can definitely see it becoming a bottleneck in some games.

I basically turn off RT in all games nowadays, because either the RT is poorly optimized or, in the case of RE Engine games, it comes at the expense of high-res textures.
 

Mr Moose

Member
But my brother in Christ, does it even look good?! This sort of screenshot looks like something with 360-era graphical complexity, just rendered out at screenshot resolution. Does it even use tessellation (look at those rocks)? Am I going crazy or something? Where the fuck is the GPU power going in this game?
Nice hair, though lol. There's a wall on the PS4 version that has PS2 textures.
 