
Horizon Forbidden West - Digital Foundry Tech Review - A PS5 Graphics Masterclass

22:22:22

NO PAIN TRANCE CONTINUE
2k posts regarding an analysis regarding graphics... you do you boo smh

Regarding regarding
 
Last edited:

Elios83

Member
Game is just jaw dropping in resolution mode :messenger_sunglasses::messenger_face_screaming:
The characters look and are animated like real people.
Some beautiful landscapes :messenger_smiling_hearts:
I had to tweak HDR, the image looked too bright and sharp with the default settings.
Lowering overall brightness and lowering overall contrast between light and dark zones makes everything look much better in motion.

I tried 60fps but it's clearly not the way the game was meant to be played.
There is so much dense vegetation and alpha details on screen in this game that the resolution in that mode isn't enough to resolve all these details without obvious shimmering in motion.
 

thatJohann

Member
Was there another shadow patch?
Seems a lot smoother in 30fps today…

I was also sitting closer yesterday… if your nose is up against the monitor, any issue is probably more obvious.

Very smooth.

Smoother how? Maybe they tweaked the motion blur. I still find it too jarring in 30fps mode, feels like slow motion.

Also if you’re playing in 60fps mode, turn off motion blur, makes a huge difference.
 
Last edited:

kikii

Member
i wonder how these complainers' eyes were back when they played >>>

YIalCjW.jpg
 

Hobbygaming

has been asked to post in 'Grounded' mode.
Game is just jaw dropping in resolution mode :messenger_sunglasses::messenger_face_screaming:
The characters look and are animated like real people.
Some beautiful landscapes :messenger_smiling_hearts:
I had to tweak HDR, the image looked too bright and sharp with the default settings.
Lowering overall brightness and lowering overall contrast between light and dark zones makes everything look much better in motion.

I tried 60fps but it's clearly not the way the game was meant to be played.
There is so much dense vegetation and alpha details on screen in this game that the resolution in that mode isn't enough to resolve all these details without obvious shimmering in motion.
I did the same thing last night. I lowered the brightness and now the game looks drastically better, and it already looked good.
 

MikeM

Member
Looks absolutely phenomenal on my ZD9, just amazing work, so much detail it takes time for my brain to take it all in lol. Man, I wish there was a way to record video or take photos of your TV playing content and have it retain anywhere near the dynamic range, depth, motion quality and clarity of what you see in real life. I swear, as soon as I can do that I will sell TVs through the internet in 90 seconds from start of sale to close. What I show people on amazing quality TVs in my job is so fucking crap compared to how it could look, and they still think it's mindblowing, because it is still amazing, just not as good as it could be.

The colour fading/smearing issues seen in this thread are probably caused by VA panels to an extent, but that's not the main issue. I have a TV with a freaking 31ms 100% response time and I don't have this issue, so I'd say it's more to do with shitty panels with colour issues caused by rubbish image processing, and maybe also having too many "image enhancement" settings turned on.

If I turn motion blur off in Fidelity mode I can see the individual frames of her running animation without smearing or ghosting, so it shouldn't be happening on all these VA TVs with much, much shorter pixel response times (99.99% of them anyway, given how much of an outlier my set is).

Time for new TVs!

PM me for advice, I love tellys.
Why does my 11ms 100% response time X900H look like garbage in 30fps?

 

Elios83

Member
I did the same thing last night. I lowered the brightness and now the game looks drastically better, and it already looked good.
There is probably a bit of oversharpening that, coupled with the huge amount of vegetation, micro geometry and transparencies, results in a noisy image in motion with the default settings.
It would be ideal if Guerrilla made more graphics settings available in future patches, but indeed I've found that tweaking HDR (reducing brightness and the dynamic range between dark and light zones) results in a much more comfortable image.
 

Bogeyman

Banned


I've seen that video, but it doesn't really contradict what I perceived.

As for water flow, it's nice that they apply different textures in rougher regions of water etc. Regardless, there's no actual flow around individual obstacles, anywhere really, e.g. around 11:03:

HJPsOBX.jpg

As for the behaviour of the vegetation: the DF video contains some very cool examples which I am definitely impressed by (e.g. that animal leaving permanent grass deformation).
Then again, see for example the scene at ca. 28:57 into the video. The plants barely move, and certainly nothing like they would in reality. See e.g. the screenshot below, where the ferns just clip right through Aloy rather than bending away:

1bTe7qB.jpg

Now just to reiterate: NO game does these things (much) better than Horizon. The game does very much look to be cutting edge, or at least close to it, in all of these areas.
It's just that because it looks so good, these flaws stand out to me much more than they used to in uglier looking games.
 
Last edited:

Kuranghi

Member
I really wonder why nobody has invented a tech to display games at 30 fps but have controls register at 120.
You know, a bit like Oculus' Carmack time warp.

Some engines do process/update input more times during the frame than others, so they feel more responsive. But you can actually do this on PC: you run the game at the higher framerate and effectively discard most of the frames. You can use Rivatuner Statistics Server (RTSS) to do it with its Scanline-Sync feature.

So for instance I was playing Horizon Chase Turbo, which obviously isn't a demanding game, so I could run it at 1440p120Hz on my TV but do 2x refresh rate S-Sync in RTSS so it feels like 1440p@240Hz. It felt amazing.
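To make that concrete, here's a minimal sketch of the idea in C++ (this is not how RTSS or any real engine hooks a game, just the general shape; poll_input/simulate/render are made-up stubs):

```cpp
#include <chrono>
#include <cstdio>
#include <thread>

// Made-up stubs -- a real engine supplies these.
void poll_input() {}                   // read controller state here
void simulate(double /*dt*/) {}        // advance game logic here
void render() { std::puts("frame"); }  // draw one frame here

int main() {
    using clock = std::chrono::steady_clock;
    constexpr int input_hz = 120, render_hz = 30;
    const auto tick = std::chrono::microseconds(1'000'000 / input_hz);
    auto next = clock::now();
    for (int t = 0;; ++t) {
        poll_input();                         // input sampled at 120 Hz
        simulate(1.0 / input_hz);             // logic stepped at 120 Hz
        if (t % (input_hz / render_hz) == 0)
            render();                         // only every 4th tick is drawn
        next += tick;
        std::this_thread::sleep_until(next);  // pace the loop
    }
}
```

You still only see 30 frames a second, but each drawn frame is built from input sampled at most 1/120s earlier, instead of a whole 33ms frame earlier.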
 

Kuranghi

Member
Why does my 11ms 100% response time X900H look like garbage in 30fps?


At lower framerates like 24 and 30, the faster the response time, the more perceived stutter you will see. With the way human vision works, if the pixels appear in one place and then almost instantly in another without any blur, your brain assumes they travelled between those two points and "adds blur" to explain the movement.

30fps is never going to look great in gaming, even in an "ideal" situation like with my TV (its high response time is there to try and make 24hz content look its best without black frame insertion); it's just the nature of modern TV motion, you need more frames to avoid this issue. Count yourself lucky you don't have an OLED with 0.2ms and 1.8ms response times, where high-contrast 24hz content can be tough to watch and 30hz games look like a slideshow without motion blur.
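Quick back-of-the-envelope on the jump sizes involved (the 960 px/s panning speed is a made-up illustrative number):

```cpp
#include <cstdio>

int main() {
    const double speed_px_s = 960.0;  // made-up panning speed, pixels/second
    const int rates[] = {24, 30, 60, 120};
    for (int fps : rates) {
        double jump = speed_px_s / fps;  // gap between consecutive frame positions
        std::printf("%3d fps -> object jumps %4.0f px per frame\n", fps, jump);
    }
    // With near-instant pixel response those 32-40 px jumps at 24/30 fps read
    // as discrete steps (judder); a slow panel smears across them instead.
}
```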
 

rofif

Can’t Git Gud
Some engines do process/update input more times during the frame than others, so they feel more responsive. But you can actually do this on PC: you run the game at the higher framerate and effectively discard most of the frames. You can use Rivatuner Statistics Server (RTSS) to do it with its Scanline-Sync feature.

So for instance I was playing Horizon Chase Turbo, which obviously isn't a demanding game, so I could run it at 1440p120Hz on my TV but do 2x refresh rate S-Sync in RTSS so it feels like 1440p@240Hz. It felt amazing.
I always felt Bloodborne did something like that. The game is only 30 fps but the controller responsiveness is top tier.
 
I really enjoy Digital Foundry, but I feel like they honestly dropped the ball with their HFW coverage. Rather than being impartial and pointing out legitimate flaws like they normally do, John just gushed the entire time and swept a lot of stuff under the rug that people are now starting to discover for themselves.
He does this a LOT!
 

ethomaz

Banned
I always felt Bloodborne did something like that. The game is only 30 fps but the controller responsiveness is top tier.
I kinda disagree.

I played both 30fps Bloodborne and Destiny at the same time… going from Destiny to Bloodborne feels like slow motion… Bloodborne to Destiny feels like fast forward.

Of course just for a few minutes, after that it became normal… enjoyed both.

It is just to show how Bloodborne's framerate compares with a game that renders at 30fps but runs its game logic at 60fps… and not forgetting Bloodborne has framepacing issues that added to the difference in feel.
 
Last edited:

ethomaz

Banned


Might have been nice to mention this in the video..

Weird, because the article clearly says it.
They even went deeper and said that even with the better response time, they preferred to play the game in 4K30fps because it was the only mode that gave the next-gen feel.

They were clear that only the 4K30fps mode delivered a next-gen experience.

But people choose to ignore DF due to the "moah 60fps master race" lol
 
Last edited:
That's not "SSR problems", that's exactly how SRR work, and a known limitation of the technique. If something is occluded from view it dissapears, that's why rat traced reflections are better.
And it happens the same with screen space shadows. Have a look at Elden Ring and you'll see the main character always have a weird halo effect around him. That's because shadows occluded from view by the character dissappear too, another limitation of the technique.
Same problem in BF 2042…
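For anyone wondering what "occluded from view" means in practice, here's a toy 1-D sketch of an SSR-style depth-buffer march (nothing like a production shader, purely to show where the data runs out):

```cpp
#include <array>
#include <cmath>
#include <cstdio>

// Toy 1-D depth buffer: one depth per screen column (smaller = closer).
// Columns 2-3 hold a close object that hides everything behind it.
constexpr int W = 8;
constexpr std::array<float, W> depth = {5, 5, 1, 1, 5, 5, 5, 5};

// March a "reflection ray" across the screen at a constant depth and see
// what the on-screen data can tell us about it.
const char* trace_ssr(int start_col, float ray_depth) {
    for (int c = start_col; c < W; ++c) {
        if (std::fabs(depth[c] - ray_depth) < 0.01f)
            return "hit: reuse the colour already shaded at this column";
        if (depth[c] < ray_depth)
            return "lost: the ray passed behind a closer object, and what's "
                   "behind it was never rendered, so the reflection drops out";
    }
    return "off-screen: no data there either";
}

int main() {
    std::printf("%s\n", trace_ssr(2, 5.0f));  // runs straight into the occluder
}
```

Ray traced reflections don't have this problem because they query the actual scene, not just what the camera already rendered.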
 
Last edited:

rofif

Can’t Git Gud
I kinda disagree.

I played both 30fps Bloodborne and Destiny at the same time… going from Destiny to Bloodborne feels like slow motion… Bloodborne to Destiny feels like fast forward.

Of course just for a few minutes, after that it became normal… enjoyed both.

It is just to show how Bloodborne's framerate compares with a game that renders at 30fps but runs its game logic at 60fps… and not forgetting Bloodborne has framepacing issues that added to the difference in feel.
I don’t give a crap about bb frame pacing issues. Overrated issue. Maybe that’s the reason why it’s controls are far more responsive compared to other 30 fps games. Maybe they don’t vsync or something.
 

ethomaz

Banned
I don’t give a crap about bb frame pacing issues. Overrated issue. Maybe that’s the reason why it’s controls are far more responsive compared to other 30 fps games. Maybe they don’t vsync or something.
Well, it makes a bit of sense… BB runs a little over 30fps and the framepacing makes it look like it is running below 30fps.

In any case, Destiny's 30fps is better than BB's 30+fps.
 
Last edited:

rofif

Can’t Git Gud
Well, it makes a bit of sense… BB runs a little over 30fps and the framepacing makes it look like it is running below 30fps.

In any case, Destiny's 30fps is better than BB's 30+fps.
BB is locked 30 afaik. The frame pacing I never felt much.
Never played Destiny. But I believe you :p
 

ethomaz

Banned
BB is locked 30 afaik. The frame pacing I never felt much.
Never played Destiny. But I believe you :p
It is not.

It runs over 30fps most of the time and drops to 20fps in some cases.

Edit - I’m talking about PS4… I never read anything about PS4 Pro.
 
Last edited:

Kuranghi

Member
Vincent won't like Resolution mode because it looks sharpened (well, it probably is, not just looks) and Performance has the poor image quality, plus both have the SSR issues and the artifacts on the edge of Aloy during camera movement, the latter of which look like MEMC artifacts. These are all things that he argues against using on TVs, so it's not surprising.

Also, motion blur's default setting is unpleasant imo (for Res mode I'm talking about, I've no firsthand experience of Perf mode really) and Low is way better. I kind of wish there was a setting between Low and Medium, I think that would be the sweet spot, but it's far from the best example of MB in games. On the default setting, the minimum velocity that triggers the maximum blur is far too low, so it's just smearing the image whenever they move even sort of quickly in gameplay and cutscenes; it looks very artificial imo. It's like really old-school depth of field effects, like in RE4, but back then the res was so low you didn't see the hard/awkward edges.

The motion artifacts are lessened by turning MB down, I believe, but the usual SSR problems are way worse in this game, being far too many frames behind and creating insane amounts of ghosting. Would be good to fix that, and to add a sharpening slider and more granular MB control at least. The last two are easy things to add, I would guess.
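Rough sketch of the kind of velocity-to-blur mapping I mean (the clamp shape and all the numbers are my guesses for illustration, not Guerrilla's actual tuning):

```cpp
#include <algorithm>
#include <cstdio>

// Map per-pixel velocity (px/frame) to a blur strength in [0, 1].
// Blur ramps up from v_min and saturates at v_max.
double blur_strength(double v, double v_min, double v_max) {
    return std::clamp((v - v_min) / (v_max - v_min), 0.0, 1.0);
}

int main() {
    // With a low saturation velocity (v_max = 5) even modest motion already
    // gets the full smear; a higher v_max keeps blur proportional for longer.
    const double speeds[] = {2, 5, 10, 20, 40};
    for (double v : speeds)
        std::printf("v = %4.1f px/frame   v_max=5: %.2f   v_max=30: %.2f\n",
                    v, blur_strength(v, 0.0, 5.0), blur_strength(v, 0.0, 30.0));
}
```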
 

Inviusx

Member


Might have been nice to mention this in the video..

And for people blaming badly calibrated TVs: this guy is a professional calibrator and TV reviewer.



People are obviously annoyed at having made a purchasing decision based on DF's video. Understandable.

Vincent should do his own video, that would be great.
 

SlimySnake

Flashless at the Golden Globes
People are obviously annoyed at having made a purchasing decision based on DF's video. Understandable.

Vincent should do his own video, that would be great.
Eh. I don't think anyone made any purchasing decisions based on DF. Maybe they chose perf or fidelity mode based on their video, but if people base their purchase on something, it's the Metacritic score or their favorite reviewer.

Where DF faltered is in properly critiquing visual shortcomings, which might have resulted in a better day one patch. They have had this game for weeks. John plays on his CX too, which should've made these 30 fps problems even worse due to the OLED's near-instant pixel response. I have no idea how he could recommend the 30 fps mode in this state.

They already did with a Jedi hand wave, “Restart twice and it’s fixed”. Seems they were prepared for this complaint right off the bat.
Pretty sure that fixed a different issue, and several posters here confirmed that it worked for them.
 

SlimySnake

Flashless at the Golden Globes
This twitter user shows just why I absolutely abhor the performance mode lmao.



This guy only has a 48 inch OLED. Mine is 65 inches and, blown up, it makes everything so much worse. I have no idea how this passed QA. Delay the damn game or ship without a 60 fps mode if this is the best you can do. The problem is exacerbated by the insane levels of foliage in the open world. I don't think I have ever seen a game with more dense foliage everywhere. Usually I would be begging for more foliage, but in this game it is an eyesore in both the 30 fps mode, due to excessive sharpness, and the 60 fps mode, due to the shimmering.

I am absolutely baffled by the fact that these talented devs have struggled just as much with their 60 fps mode as B-tier devs like the Dying Light 2, Metro Exodus, and Guardians of the Galaxy teams did with theirs. Cerny literally took them from a 1.6 GHz netbook CPU to a state of the art 8 core/16 thread 3.5 GHz Zen 2 processor and this is the best they can do? Did these guys even bother changing their engines to support multithreading? The GOW PC port maxes out one thread, which is why its performance is so bad, but the PS5 has been out for over a year and a badass CPU should have made 60 fps pretty easy to implement without having to significantly downgrade the game's visuals like we are seeing in games that are native 4K 30 fps.

I bet they didn't upgrade the engine to account for better CPU threading and/or caching.
 
Last edited:

Haggard

Banned
This twitter user shows just why I absolutely abhor the performance mode lmao.



This guy only has a 48 inch OLED. Mine is 65 inches and, blown up, it makes everything so much worse. I have no idea how this passed QA. Delay the damn game or ship without a 60 fps mode if this is the best you can do. The problem is exacerbated by the insane levels of foliage in the open world. I don't think I have ever seen a game with more dense foliage everywhere. Usually I would be begging for more foliage, but in this game it is an eyesore in both the 30 fps mode, due to excessive sharpness, and the 60 fps mode, due to the shimmering.

I am absolutely baffled by the fact that these talented devs have struggled just as much with their 60 fps mode as B-tier devs like the Dying Light 2, Metro Exodus, and Guardians of the Galaxy teams did with theirs. Cerny literally took them from a 1.6 GHz netbook CPU to a state of the art 8 core/16 thread 3.5 GHz Zen 2 processor and this is the best they can do? Did these guys even bother changing their engines to support multithreading? The GOW PC port maxes out one thread, which is why its performance is so bad, but the PS5 has been out for over a year and a badass CPU should have made 60 fps pretty easy to implement without having to significantly downgrade the game's visuals like we are seeing in games that are native 4K 30 fps.

I bet they didn't upgrade the engine to account for better CPU threading and/or caching.

Why do you go rambling about CPU utilisation when we are looking at a GPU-limited scenario..... Makes no sense.
 

SlimySnake

Flashless at the Golden Globes
Why do you go rambling about CPU utilisation when we are looking at a GPU-limited scenario..... Makes no sense.
Because my PC doesn't have this issue if I want to increase my framerate. I don't have to cut my resolution down to 1080p to double my FPS. If a GPU can render something at native 4K 30 fps, it should be able to render it at double the framerate at 1440p, since it's pretty much the same pixel budget, if not less. There is something else going on behind the scenes here, and I suspect it is the same issue we see in Horizon and GOW's PC ports, where they are CPU-bound.
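The raw arithmetic behind the "same pixel budget" point (the caveat in the final comment is mine, to be fair to the other side):

```cpp
#include <cstdio>

int main() {
    const double px_4k    = 3840.0 * 2160.0;  // ~8.29 Mpx per frame
    const double px_1440p = 2560.0 * 1440.0;  // ~3.69 Mpx per frame
    std::printf("4K    @ 30fps: %.0f Mpx/s\n", px_4k * 30 / 1e6);     // ~249
    std::printf("1440p @ 60fps: %.0f Mpx/s\n", px_1440p * 60 / 1e6);  // ~221
    // Shaded-pixel throughput is in the same ballpark -- though per-frame
    // work that doesn't scale with resolution (geometry passes, CPU-side
    // logic) still has to run twice as often at 60fps.
}
```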
 

Haggard

Banned
Because my PC doesn't have this issue if I want to increase my framerate. I don't have to cut my resolution down to 1080p to double my FPS. If a GPU can render something at native 4K 30 fps, it should be able to render it at double the framerate at 1440p, since it's pretty much the same pixel budget, if not less. There is something else going on behind the scenes here, and I suspect it is the same issue we see in Horizon and GOW's PC ports, where they are CPU-bound.
What are you even talking about....
This is quite literally the most common GPU issue there is, namely not enough power, paired with a "less than ideal" solution on GG's side with that shitty CBR, which obviously simply doesn't work with these amounts of microgeometry.
It has absolutely nothing to do with the CPU at all.... We're not looking at draw call issues; this is about sheer GPU bandwidth, or the lack thereof.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
What are you even talking about....
This is quite literally the most common GPU issue there is, namely not enough power, paired with a "less than ideal" solution on GG's side with that shitty CBR, which obviously simply doesn't work with these amounts of microgeometry.
It has absolutely nothing to do with the CPU at all.... We're not looking at draw call issues; this is about sheer GPU bandwidth, or the lack thereof.
Nah, these console-exclusive games were not designed to take advantage of PC CPUs in the ways third-party games have been for over a decade now. There is a reason why Guardians on PC has no issues scaling up and down with resolution, but on consoles it's a mess that goes all the way down to 1080p and can't even maintain that at 50 fps.

Again, the pixel budget of a 1440p 60 fps title is even less than the pixel budget of a native 4K 30 fps title. So what gives? Why is this game running at 1800p CB and not 2160p CB, which is exactly half of the native 4K pixel budget? And there is no way this game is running at 1800p CB; it is definitely going down to 1080p, because other 1800p games like Days Gone and Ghost of Tsushima on the Pro did not have this issue.
 

Haggard

Banned
Nah, these console-exclusive games were not designed to take advantage of PC CPUs in the ways third-party games have been for over a decade now. There is a reason why Guardians on PC has no issues scaling up and down with resolution, but on consoles it's a mess that goes all the way down to 1080p and can't even maintain that at 50 fps.

Again, the pixel budget of a 1440p 60 fps title is even less than the pixel budget of a native 4K 30 fps title. So what gives? Why is this game running at 1800p CB and not 2160p CB, which is exactly half of the native 4K pixel budget? And there is no way this game is running at 1800p CB; it is definitely going down to 1080p, because other 1800p games like Days Gone and Ghost of Tsushima on the Pro did not have this issue.
The shitty IQ is due to the CBR not being able to handle the fine geometry in this game.... There is zero indication of any CPU issues.
No one looks at the CPU when you have resolution or reconstruction issues... because the CPU has nothing to do with that.
They should've simply disabled the CBR in performance mode; native 1440p would've been fine.
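For the curious, a wildly simplified 1-D toy of why checkerboarding struggles with sub-pixel detail (a real CBR resolve reprojects with motion vectors and ID buffers, which this skips entirely, but the failure mode is the same flavour):

```cpp
#include <array>
#include <cstdio>

// Toy 1-D "checkerboard": each frame shades only every other pixel,
// alternating which half, and keeps stale data in the rest.
constexpr int W = 8;

// Scene: a one-pixel-wide bright feature (thin branch, wire, grass blade)
// panning one pixel per frame.
int shade(int x, int f) { return x == (3 + f) % W ? 9 : 0; }

int main() {
    std::array<int, W> frame{};  // reconstructed output, starts black
    for (int f = 0; f < 4; ++f) {
        for (int x = 0; x < W; ++x)
            if (x % 2 == f % 2)          // shade only this frame's half...
                frame[x] = shade(x, f);  // ...the other half stays stale
        std::printf("frame %d: ", f);
        for (int v : frame) std::printf("%d", v);
        std::printf("   (feature should be at x=%d)\n", (3 + f) % W);
    }
    // The feature moves 1 px per frame and keeps landing on the unshaded
    // half, so it's never sampled at all: thin moving detail shimmers or
    // vanishes unless the resolve is clever about it.
}
```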
 
Last edited:

ethomaz

Banned
The shitty IQ is due to the CBR not being able to handle the fine geometry in this game.... There is zero indication of any CPU issues.
No one looks at the CPU when you have resolution or reconstruction issues... because the CPU has nothing to do with that.
They should've simply disabled the CBR in performance mode; native 1440p would've been fine.
Imagine if native 1440p ends up being worse than 1800p CBR 🤔

I want to see what you will ask next.
 

Shmunter

Member
The shitty IQ is due to the CBR not being able to handle the fine geometry in this game.... There is zero indication of any CPU issues.
No one looks at the CPU when you have resolution or reconstruction issues... because the CPU has nothing to do with that.
They should've simply disabled the CBR in performance mode; native 1440p would've been fine.
CBR has been used far and wide without intrusive issues. On balance, it has been a benefit. It is a combination of things going on here, not just the reconstruction, I would suggest.

Either way, the feedback is wide and swift. We shall see if the devs take it on board.
 
Last edited:

ethomaz

Banned
Found the pleb without a gaming PC....
I want to see how much more nonsensical your comments can become. Must be those stuttery 30fps messing with your head.
Thank you.
Consoles forever.

Anyway, 1800p CBR is probably better than 1440p and that is why I don't believe native 1440p will change the IQ… if anything, it will downgrade it.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
The shitty IQ is due to the CBR not being able to handle the fine geometry in this game.... There is zero indication of any CPU issues.
No one looks at the CPU when you have resolution or reconstruction issues... because the CPU has nothing to do with that.
They should've simply disabled the CBR in performance mode; native 1440p would've been fine.
But their CBR solution in the first game is excellent. It was fine in Days Gone and GoT too. I am convinced it isn't 1800p. It is way, way lower.

Also, 1800p CB is around 2.9 million pixels (half of the 5.8 million of native 1800p), which is way lower than native 1440p (3.7 million pixels); it sits right between 1080p (2.1 million) and 1440p. For whatever reason, they are unable to get their native 4K (8.3 million) 30 fps game running at 60 fps despite cutting the pixel count by almost 3x.
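The numbers, for anyone who wants to check them:

```cpp
#include <cstdio>

int main() {
    auto mpx = [](double w, double h) { return w * h / 1e6; };
    std::printf("1080p    : %.1f Mpx\n", mpx(1920, 1080));      // 2.1
    std::printf("1800p CB : %.1f Mpx\n", mpx(3200, 1800) / 2);  // 2.9 (half shaded per frame)
    std::printf("1440p    : %.1f Mpx\n", mpx(2560, 1440));      // 3.7
    std::printf("1800p    : %.1f Mpx\n", mpx(3200, 1800));      // 5.8
    std::printf("4K       : %.1f Mpx\n", mpx(3840, 2160));      // 8.3
}
```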
 