Was there another shadow patch?
Seems a lot smoother in 30fps today…
Was also sitting closer yesterday …if your nose is up against a monitor it probably has more of an obvious issue.
Very smooth.
> Whoa, this thread needs a 56k warning in the title.

Who in the love of the internet still has a 56k modem?
> Is the shimmering a common issue, or does it only affect a certain group of people?

Only people who, instead of gaming, go looking for problems so they can make a forum post.
> Game is just jaw dropping in resolution mode

I did the same thing last night. I lowered the brightness; now the game looks drastically better, and it already looked good.
The characters look and are animated like real people.
Some beautiful landscapes
I had to tweak HDR, image looked too bright and sharp with default settings.
Lowering overall brightness and lowering overall contrast between light and dark zones make everything look much better in motion.
I tried 60fps but it's clearly not the way the game was meant to be played.
There is so much dense vegetation and so many alpha details on screen in this game that the resolution in that mode isn't enough to resolve them all without obvious shimmering in motion.
> I wonder how these complainers' eyes were back then when they played >>>

IKR. It's funny how today some games get scrutinized to death while other games, with many more graphical issues, get passes.
> I wonder how these complainers' eyes were back then when they played >>>

For some, a narrative over two decades of videogame history is becoming a virus.
> Why does my 100% response 11ms X900H look like garbage in 30fps?

Looks absolutely phenomenal on my ZD9, just amazing work; there's so much detail it takes time for my brain to take it all in, lol. Man, I wish there was a way to record video or take photos of your TV playing content and have it come anywhere near the dynamic range, depth, motion quality and clarity of what you see in real life. I swear, as soon as I can do that, I will sell TVs through the internet in 90 seconds from start of sale to close. What I show people on amazing-quality TVs in my job is so fucking crap compared to how it could look, and they still think it's mind-blowing, because it is still amazing, just not as good as it could be.
The colour fading/smearing issues seen in this thread are probably caused by VA panels to an extent, but that's not the main issue. I have a TV with a freaking 31ms 100% response time and I don't have this issue, so I'd say it's more to do with shitty panels whose colour issues come from rubbish image processing, and maybe also from having too many "image enhancement" settings turned on.
If I turn motion blur off in Fidelity mode, I can see the individual frames of her running animation without smearing or ghosting; it shouldn't be happening on all these VA TVs with much, much shorter pixel response times (99.99% of them anyway, given how much of an outlier my set is).
Time for new TVs!
PM me for advice, I love tellys.
> I did the same thing last night. I lowered the brightness, now the game looks drastically better and it already looked good

There is probably a bit of oversharpening that, coupled with the huge amount of vegetation, micro-geometry and transparencies, results in a noisy image in motion with the default settings.
nice try
I really wonder why nobody invented a tech to display games at 30 fps but have controls register at 120.
You know, a bit like Oculus' Carmack time warp.
Why does my 100% response 11ms X900H look like garbage in 30fps?
Link: Sony X900H Review (XBR55X900H, XBR65X900H, XBR75X900H, XBR85X900H) (www.rtings.com)
I wonder how these complainers' eyes were back then when they played >>>
> I always felt bloodborne did something like that. The game is only 30 fps but the controller responsiveness is top tier

Some engines do process/update input more times during the frame than others, so they feel more responsive. You can actually do this on PC: you run the game at the higher framerate and effectively discard most of the frames. You can use Rivatuner Statistics Server (RTSS) to do it with its Scanline-Sync feature.
So for instance, I was playing Horizon Chase Turbo, which obviously isn't a demanding game, so I could run it at 1440p 120Hz on my TV but do 2x refresh-rate S-Sync in RTSS so it feels like 1440p@240Hz. It felt amazing.
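The idea of sampling input (or running game logic) more often than you present frames can be sketched as a toy loop. This is purely my own illustration under assumed numbers (120 Hz sim, 30 fps presentation), not any engine's or RTSS's actual code:

```python
# Toy sketch of decoupling input sampling from presentation:
# the loop ticks at 120 Hz, but only every 4th tick is "presented",
# so controls are read 4x as often as frames are shown.
# (Illustrative only; a real engine would pace this with clocks.)

SIM_HZ = 120      # how often input is sampled / logic runs (assumed)
PRESENT_HZ = 30   # how often a frame is actually displayed (assumed)

def run(seconds):
    input_samples = 0
    frames_presented = 0
    ticks_per_frame = SIM_HZ // PRESENT_HZ  # 4 sim ticks per displayed frame
    for tick in range(SIM_HZ * seconds):
        input_samples += 1            # poll the controller every tick
        if tick % ticks_per_frame == 0:
            frames_presented += 1     # present only every 4th tick
    return input_samples, frames_presented

samples, frames = run(seconds=1)
print(samples, frames)  # 120 input samples for 30 presented frames
```

The point is that perceived responsiveness tracks the sampling rate, not the display rate, which is why a 30 fps game can still feel snappy.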
Might have been nice to mention this in the video..
> I really enjoy Digital Foundry but I feel like they honestly dropped the ball with their HFW coverage. Rather than being impartial and pointing out legitimate flaws like they normally do, John just gushed the entire time and swept a lot of stuff under the rug that people are starting to now discover for themselves.

He does this a LOT!
> I always felt bloodborne did something like that. The game is only 30 fps but the controller responsiveness is top tier

I kinda disagree.
> Might have been nice to mention this in the video..
That's not "SSR problems", that's exactly how SSR works, and a known limitation of the technique. If something is occluded from view it disappears; that's why ray-traced reflections are better.
> Same problem in BF 2042…

And it happens the same with screen-space shadows. Have a look at Elden Ring and you'll see the main character always has a weird halo effect around him. That's because shadows occluded from view by the character disappear too, another limitation of the technique.
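The screen-space limitation described above can be illustrated with a toy 1D ray march against a depth buffer. This is a simplified sketch of the general SSR idea, not real shader code (the buffer values and step size are made up): once the reflected ray leaves the screen, there is simply no rasterized data left to sample, which is exactly the disocclusion artifact being discussed.

```python
def ssr_trace(depth, start_x, step, max_steps=16):
    """March a reflected ray across a 1D depth buffer.
    Returns the hit column, or None when the ray exits the screen,
    the case where SSR has no information and the reflection vanishes."""
    ray_depth = depth[start_x]
    x = start_x
    for _ in range(max_steps):
        x += step
        if not 0 <= x < len(depth):
            return None       # off-screen: nothing was rasterized there
        ray_depth += 0.1      # ray recedes from the camera each step
        if depth[x] <= ray_depth:
            return x          # ray passed behind visible geometry: hit

depth = [0.2, 0.9, 0.9, 0.5]       # toy depth buffer (smaller = closer)
print(ssr_trace(depth, 0, +1))     # 3: the ray hits on-screen geometry
print(ssr_trace(depth, 3, +1))     # None: ray leaves the screen, reflection disappears
```

A ray-traced reflection queries actual scene geometry instead of this buffer, so the "None" case never happens; that is the difference the posts above are pointing at.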
> I wonder how these complainers' eyes were back then when they played >>>

Lmaooo horrible lol!!!
> I kinda disagree.

I don't give a crap about BB frame pacing issues. Overrated issue. Maybe that's the reason why its controls are far more responsive compared to other 30 fps games. Maybe they don't vsync or something.
I played both 30fps Bloodborne and Destiny at the same time… going from Destiny to Bloodborne feels like slow motion… Bloodborne to Destiny feels like fast forward.
Of course, just for a few minutes; after that it becomes normal… I enjoyed both.
It is just to say how Bloodborne's framerate compares with a game that renders at 30fps but runs its game logic at 60fps… and not forgetting that Bloodborne has frame-pacing issues that added to the difference in feel.
> I don't give a crap about BB frame pacing issues. Overrated issue. Maybe that's the reason why its controls are far more responsive compared to other 30 fps games. Maybe they don't vsync or something.

Well, it makes a bit of sense… BB runs a little over 30fps, and the frame pacing makes it look like it is running below 30fps.
> Yeah, Vincent Teoh not feeling it. I'm sure GG will look at it.

They already did, with a Jedi hand wave: "Restart twice and it's fixed." Seems they were prepared for this complaint right off the bat.
> Well, it makes a bit of sense… BB runs a little over 30fps and the frame pacing makes it look like it is running below 30fps.

BB is locked 30 afaik. The frame pacing I never felt too much.
In any case, Destiny's 30fps is better than BB's 30+fps.
> BB is locked 30 afaik. The frame pacing I never felt too much.

It is not.
Never played destiny. But I believe you
> Might have been nice to mention this in the video..

And for people blaming badly calibrated TVs: this guy is a professional calibrator and TV reviewer.
> People are obviously annoyed for making a purchasing decision based on DF's video. Understandable.

Eh, I don't think anyone made any purchasing decisions based on DF. Maybe they chose perf or fidelity mode based on their video, but if people base their purchase on something, it's the Metacritic score or their favorite reviewer.
Vincent should do his own video; that would be great.
> They already did with a Jedi hand wave, "Restart twice and it's fixed". Seems they were prepared for this complaint right off the bat.

Pretty sure that fixed a different issue, and several posters here confirmed that it worked for them.
This twitter user shows just why I absolutely abhor the performance mode lmao.
This guy only has a 48 inch OLED. Mine is 65 inches, and blown up, it makes everything so much worse. I have no idea how this passed QA. Delay the damn game or ship without a 60 fps mode if this is the best you can do. The problem is exacerbated by the insane levels of foliage in the open world. I don't think I have ever seen a game with more dense foliage everywhere. Usually I would be begging for more foliage, but in this game it is an eyesore in both the 30 fps mode, due to excessive sharpness, and the 60 fps mode, due to the shimmering.
I am absolutely baffled that these talented devs have struggled just as much with their 60 fps mode as the B-tier devs behind Dying Light 2, Metro Exodus, and Guardians of the Galaxy did with theirs. Cerny literally took them from a 1.6 GHz netbook CPU to a state-of-the-art 8-core, 16-thread 3.5 GHz Zen 2 processor, and this is the best they can do? Did these guys even bother changing their engines to support multithreading? The GoW PC port maxes out one thread, which is why its performance is so bad, but the PS5 has been out for over a year, and a badass CPU should have made 60 fps pretty easy to implement without having to significantly downgrade the game's visuals like we are seeing in games that are native 4k 30 fps.
I bet they didn't upgrade the engine to account for better CPU threading or caching.
> Why do you go rambling about cpu utilisation when we are looking at a GPU limited scenario..... Makes no sense.

Because my PC doesn't have this issue if I want to increase my framerate. I don't have to cut my resolution down to 1080p to double my FPS. If a GPU can render something at native 4k 30 fps, it should be able to render it at double the framerate at 1440p, since that's pretty much the same pixel budget, if not less. There is something else going on behind the scenes here, and I suspect it is the same issue we see in Horizon and GoW's PC ports, where they are CPU-bound.
The part where thinking that being completely ignorant and acting indifferent looks cool should usually end sometime in middle school. You seem to be stuck....
> Because my PC doesnt have this issue if I want to increase my framerate. I dont have to cut down my resolution to 1080p to double my FPS. If a GPU can render something at native 4k 30 fps, it should be able to render it at double the framerate at 1440p since its pretty much the same pixel budget. If not less. There is something else going on behind the scenes here and I suspect it is the same issues we see in Horizon and GOW's PC ports where it is CPU bound.

What are you even talking about....
> The only ignorance here is coming from you guys.

Go play with your cumsock, boy. Someone who doesn't even understand the meaning of principles and can't think of a better argument than "no, you!" isn't good for much else.
> What are you even talking about....

Nah, these console-exclusive games were not designed to take advantage of PC CPUs in the ways third-party games have been for over a decade now. There is a reason why Guardians on PC has no issues scaling up and down with resolution, but on consoles it's a mess that goes all the way down to 1080p and can't even maintain 50 fps there.
This is quite literally the most common GPU issue there is, namely not enough power, paired with a "less than ideal" solution on GG's side with that shitty CBR, which obviously just doesn't work with these amounts of micro-geometry.
It has absolutely nothing to do with the CPU at all.... We're not looking at draw-call issues; this is about sheer GPU bandwidth, or the lack thereof.
> Nah, these console exclusive games were not designed to take advantage of PC CPUs in ways third party games have been for over a decade now. There is a reason why Guardians on PC has no issues scaling up and down with resolution but on consoles its a mess that goes all the way down to 1080p and cant even maintain that at 50 fps.

The shitty IQ is due to the CBR not being able to handle the fine geometry in this game.... There is zero indication of any CPU issues.
Again, the pixel budget of a 1440p 60 fps title is even less than that of a native 4k 30 fps title. So what gives? Why is this game running at 1800p CB and not 2160p CB, which is exactly half the native 4k pixel budget? And there is no way this game is running at 1800p CB; it is definitely going down to 1080p, because other 1800p games like Days Gone and Ghost of Tsushima on the Pro did not have this issue.
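The pixel-budget comparison in this post can be checked with quick arithmetic. These are just pixel counts per second for the modes named in the thread (real GPU cost does not scale purely with pixel count, and the 1440p60 mode is the poster's hypothetical, not a shipped mode):

```python
# Pixels shaded per second for the modes discussed in the thread.
# Checkerboard rendering (CB) shades roughly half the samples per frame.
native_4k_30   = 3840 * 2160 * 30           # native 4K fidelity mode
native_1440_60 = 2560 * 1440 * 60           # hypothetical native 1440p60
cb_1800_60     = (3200 * 1800 // 2) * 60    # 1800p checkerboard at 60
cb_2160_60     = (3840 * 2160 // 2) * 60    # 2160p checkerboard at 60

print(native_4k_30)    # 248832000
print(native_1440_60)  # 221184000 (less than native 4K at 30, as the post says)
print(cb_1800_60)      # 172800000
print(cb_2160_60)      # 248832000 (exactly the native-4K/30 budget)
```

So the raw numbers do support the claim that 1440p60 is a smaller pixel budget than native 4K at 30, and that 2160p CB at 60 matches it exactly; the open question in the thread is whether pixel count is the actual bottleneck.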
> The shitty IQ is due to the CBR not being able to handle fine geometry in this game.... There is zero indication for any CPU issues.
> No one looks at the CPU when you have resolution or reconstruction issues... Because the CPU has nothing to do with that.
> They should've simply disabled the CBR in performance mode, native 1440p would've been fine.

Imagine if native 1440p ends up being worse than 1800p CBR.
> Imagine if 1440p natives ends being worst than 1800CBR

Found the pleb without a gaming PC....
I want to see what you will ask next.
> The shitty IQ is due to the CBR not being able to handle fine geometry in this game.... There is zero indication for any CPU issues.
> No one looks at the CPU when you have resolution or reconstruction issues... Because the CPU has nothing to do with that.
> They should've simply disabled the CBR in performance mode, native 1440p would've been fine.

CBR has been used far and wide without intrusive issues. On balance, it has been a benefit. It is a combination of things going on here, not just the reconstruction, I would suggest.
> Found the pleb without a gaming PC....

Thank you. I want to see how much more nonsensical your comments can become. Must be those stuttery 30fps messing with your head.
> The shitty IQ is due to the CBR not being able to handle fine geometry in this game.... There is zero indication for any CPU issues.
> No one looks at the CPU when you have resolution or reconstruction issues... Because the CPU has nothing to do with that.
> They should've simply disabled the CBR in performance mode, native 1440p would've been fine.

But their CBR solution in the first game is excellent. It was fine in Days Gone and GoT too. I am convinced it isn't 1800p. It is way, way lower.