Bothered? I played this game 10 yrs ago.
It doesn't drop, thus is consistent.
> Of course it is, it doesn't drop. So it is consistent. Unless my understanding of the word consistent is different than yours?
> Anyway, whatever. I am not even going to play this again with gamepass.

I'm no English teacher, but I'm telling you: how can it be consistently higher when sometimes they both do 4k? Constantly means all the time.
> Exactly
> "Pixel counts at 2688x2160 seem to be rare"
> So it's that or framerates dropping to the thirties, pick your poison

Who cares? You can't control the character during that time anyway. During actual gameplay they are all pretty solid.

> Who cares? You can't control the character during that time anyway. During actual gameplay they are all pretty solid.

Things are really hitting rock bottom for console warriors when performance is the same on all platforms, with differences of a one-second frame rate drop in a non-interactive scene in a 10-minute video.
> I'm no English teacher but I'm telling you how can it be consistently higher when sometimes they both do 4k. Constantly means all the time.
> What you are saying is PS5 is constantly 4k, which I agree with.

You are right. Plus, VGTech have only tested a small portion of the game. Who knows how it performs on both systems in the most taxing environments/areas.
> PS5 flipping the tables on the rez war is interesting. MS studio no less.

Old engine prefers raster over compute.

> Old engine prefers raster over compute.

Don't tell them this .. haven't you heard .. parallelism in programming is so 2020 zzzz
> Exactly
> "Pixel counts at 2688x2160 seem to be rare"
> So it's that or framerates dropping to the thirties, pick your poison

You seem to be blindly ignoring the Series consoles getting similar hits to the 40s with DRS. Even using BC and being native 4K, the Series consoles experience the exact same hit as the native PS5 version at that same point in the game. So pick your poison, because you're losing either way.
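The posts above lean on DRS (dynamic resolution scaling): dropping pixel counts per frame to hold 60 fps. A minimal sketch of how such a controller can work; purely illustrative, not the algorithm this game actually uses, and the scale bounds and step size are assumptions:

```python
# Illustrative dynamic-resolution-scaling (DRS) controller, NOT the actual
# algorithm any of these games use: each frame, compare GPU frame time to
# the 60 fps budget and nudge the render scale up or down within bounds.

TARGET_MS = 1000.0 / 60.0          # 16.67 ms budget for 60 fps
MIN_SCALE, MAX_SCALE = 0.7, 1.0    # e.g. 2688/3840 = 0.7 floor, native 4K ceiling

def drs_step(scale, gpu_ms):
    """Return the render scale to use for the next frame."""
    if gpu_ms > TARGET_MS:           # over budget: drop resolution
        scale -= 0.05
    elif gpu_ms < TARGET_MS * 0.85:  # comfortably under budget: raise it back
        scale += 0.05
    return max(MIN_SCALE, min(MAX_SCALE, scale))

# A run of expensive frames pushes the scale toward the floor; light
# frames would pull it back toward native.
scale = 1.0
for gpu_ms in [18.0, 18.0, 18.0]:   # three frames that blow the 16.67 ms budget
    scale = drs_step(scale, gpu_ms)
print(round(scale, 2))
```

This is why "native 4K" Series X captures and DRS PS5 captures can both dip at the same spot: the scaler only reacts after frames go over budget.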
Yes, framerate is basically identical and nearly locked 60 on both, 59.87 vs 59.93. PS5 is locked 60 fps 99.79 % of the time vs 99.89 % on Series X...
Talk about grasping at straws... Some fans are really scraping the bottom of the barrel to find a positive for Series X here.
At this point, they should just accept it's a bit inferior on this game due to PS5 native resolution advantage and move on.
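For perspective, those locked-60 percentages translate into very little absolute time spent below 60 fps; a rough back-of-the-envelope calculation, assuming a 10-minute capture like the video discussed above:

```python
# Rough arithmetic on the quoted VGTech-style numbers: what does being
# "locked at 60 fps 99.79% of the time vs 99.89%" mean over a 10-minute
# capture? Illustrative math only; the capture length is an assumption.

capture_s = 10 * 60                 # assume a 10-minute video, in seconds

def seconds_below_60(locked_pct):
    """Seconds of the capture spent below a locked 60 fps."""
    return capture_s * (1 - locked_pct / 100)

ps5 = seconds_below_60(99.79)       # ~1.26 s below 60 fps
xsx = seconds_below_60(99.89)       # ~0.66 s below 60 fps
print(round(ps5, 2), round(xsx, 2), round(ps5 - xsx, 2))
```

So the gap between the two consoles amounts to well under a second of imperfect frame delivery across the whole capture.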
All these PS5 games we're seeing frame drops in could really be cleaned up with VRR. It's really helped on those XSX games with drops.
How is it "shady"? They have updated the game in some way; there are plenty of other games that are exactly the same.
> I don't know about PS5, but on Xbox it's like Shadow Of The Tomb Raider and Sniper Elite 4: it's badged up as a Series app but still runs off an external as before. Looks like the most basic upgrade; I couldn't see any visual differences when I played it last night. It looked the same as the X1X version running on Series X.

Nuff said

> It isn't "shady" though, the patch doesn't do much, just like Sniper Elite 4 etc

So it's totally cool and not shady for MS to call that patch optimized for Series X?
> But isn't VRR primarily for eliminating screen tearing but also alleviating the potential for input lag that accompanies vsync? I see folks throwing out VRR like any and all frame drops will be fixed with this tech but not sure why. Anyone want to explain this?

By constantly changing the refresh rate to match the fps, it eliminates the stutter, or the perception of stutter, that you get when frame rates fluctuate, to the point that you won't notice frame drops that you'd otherwise notice without VRR enabled.

Basically you get better latency, no tearing, and smoother gameplay by using VRR on any game that isn't 100% locked at a single framerate.
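That explanation can be made concrete with a toy frame-pacing model: fixed 60 Hz vsync versus a VRR panel with an assumed 40-60 Hz window. A simplified sketch, not how any actual console pipeline is implemented:

```python
# Sketch of why VRR smooths fluctuating frame rates: with fixed-refresh
# vsync at 60 Hz a finished frame must wait for the next 16.7 ms tick, so a
# 20 ms frame stays on screen for 33.3 ms (a visible hitch); with VRR the
# display refreshes when the frame is ready. Toy model with an assumed
# 40-60 Hz VRR window, not any console's real pipeline.

import math

REFRESH_MS = 1000.0 / 60.0          # 16.67 ms per 60 Hz scanout

def vsync_display_time(render_ms):
    """Frame persistence with fixed refresh: rounded up to a whole tick."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

def vrr_display_time(render_ms, min_ms=REFRESH_MS, max_ms=1000.0 / 40.0):
    """With VRR the panel refreshes when the frame is done, clamped to the
    window the panel supports (assumed 40-60 Hz here)."""
    return min(max(render_ms, min_ms), max_ms)

renders = [15.0, 20.0, 15.0]        # one slow frame among fast ones
fixed = [vsync_display_time(t) for t in renders]   # [16.7, 33.3, 16.7]: hitch
vrr = [vrr_display_time(t) for t in renders]       # [16.7, 20.0, 16.7]: smooth
print([round(t, 1) for t in fixed])
print([round(t, 1) for t in vrr])
```

The fixed-refresh run shows the frame-time doubling that reads as a stutter; the VRR run absorbs the same slow frame as a barely longer refresh.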
> I can't get past that Matt guy saying Xbox favours last-gen engines etc, but it's always these last-gen games that show advantages to PS5. It's weird.

Yup. I assume more CUs will be better over time. Just like with CPUs. It's actually quite obvious if you look at the CU/SM progression vs clock progression over the last 10 years.
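The CUs-versus-clocks trade-off being debated comes down to simple arithmetic. A quick check using the two consoles' publicly stated GPU configurations (36 CUs at up to 2.23 GHz for PS5, 52 CUs at 1.825 GHz for Series X):

```python
# Compute-throughput arithmetic behind the CU-count vs clock-speed debate.
# Peak FP32 TFLOPS = CUs x 64 shader lanes x 2 ops per clock (FMA) x GHz / 1000.
# CU counts and clocks are the publicly stated specs for each console.

def peak_tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000

ps5 = peak_tflops(36, 2.23)    # ~10.3 TFLOPS: fewer CUs, higher clock
xsx = peak_tflops(52, 1.825)   # ~12.1 TFLOPS: more CUs, lower clock
print(round(ps5, 2), round(xsx, 2))
```

On paper the wide-and-slow configuration wins, which is the CU/SM progression the post points to; higher clocks still speed up parts of the GPU (front end, caches) that CU count alone does not.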
Can I say it's a win for PS5 without getting banned?
> Except it’s not when the PS5 drops well below the XSX in frame rate. It’s even in the spreadsheet the OP linked to…
> …and it’s something the OP *conveniently* left out of his overview because it makes the PS5 version look poor.
> At best this is a tie, if not a win for the XSX.

Alrighty then, let's wait for the Digital Foundry analysis, as that is a very respected source among NeoGAFers.
> People arguing over a frame here or a minimal resolution difference when the ACTUAL difference is the pathetic mod support on PlayStation systems.
> Xbox has 5 GB available vs 1 GB on PlayStation. Xbox allows new scripts, sounds, textures, quests, gameplay, characters, armors, weapons, overhauls, etc. PlayStation only allows in-game assets to be reused. It's basically two completely different games.

This shit isn't resolved on new systems? Besides, they could have upped the available space even on Xbox. Embarrassing.
> Yes, framerate is basically identical and nearly locked 60 on both, 59.87 vs 59.93. PS5 is locked 60 fps 99.79 % of the time vs 99.89 % on Series X...
> Talk about grasping at straws... Some fans are really scraping the bottom of the barrel to find a positive for Series X here.

The lack of self-awareness in this post makes it my favorite of the day.
> But isn't VRR primarily for eliminating screen tearing but also alleviating the potential for input lag that accompanies vsync? I see folks throwing out VRR like any and all frame drops will be fixed with this tech but not sure why. Anyone want to explain this?

It helps with screen tearing, the perception of stuttering, and input lag. Vsync only deals with the first problem.
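The input-lag part of that claim can be illustrated with the usual double-buffered model: under vsync a finished frame waits for the next refresh tick, while VRR scans it out as soon as it is ready. A simplified sketch; real pipelines have more stages:

```python
# Toy model of the vsync input-lag point: with double-buffered vsync at
# 60 Hz, a frame isn't visible until it has rendered AND the next refresh
# tick arrives, so input-to-photon time quantizes to whole 16.7 ms ticks.
# Simplified two-stage model; real pipelines add more buffering.

import math

TICK_MS = 1000.0 / 60.0

def vsync_latency(render_ms):
    """Time until the frame reaches the screen when it must wait for a tick."""
    return math.ceil(render_ms / TICK_MS) * TICK_MS

def vrr_latency(render_ms, min_ms=TICK_MS):
    """With VRR, scanout starts once the frame is ready (floor = max Hz)."""
    return max(render_ms, min_ms)

# An 18 ms frame: vsync holds it until the second tick (33.3 ms), while
# VRR shows it at 18 ms.
print(round(vsync_latency(18.0), 1), round(vrr_latency(18.0), 1))
```

That ~15 ms penalty for just missing a tick is the vsync latency cost VRR removes; tearing and stutter are separate effects on top of it.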
> That's not great, thank god for VRR again.

How many frames do you think VRR makes up for? Lol

Edit: just saw your tag, nevermind...

> On Series X you can run this game on an external HDD, don't even have to run it off the SSD, which is nice.

Which points to them not utilizing the XSX.
> And frame rate drops into the 40s are still going to be noticed.

If the drops are sudden and steep, yes. Luckily, this doesn't happen in most games.
Except it’s not when the PS5 drops well below the XSX in frame rate. It’s even in the spreadsheet the OP linked to…
…and it’s something the OP *conveniently* left out of his overview because it makes the PS5 version look poor.
At best this is a tie, if not a win for the XSX.
> What's up with PS5's shitty clouds?

I love how dynamic weather translates into shitty clouds. Praise the console wars!

Edit: I think it's because they used the same account on the Xbox consoles, and save sync made them the exact same weather / time of day .. etc.
So 'at best it's a tie, if not a win for XSX' just because PS5 version dropped lower than XSX for a sec in that scene while pushing 42% more pixels? What do you make of the moments where the FPS is the same and PS5 is still has 40% higher resolution then? XSX' FPS 'advantage' is 0.0015% in average despite dropping its resolution by 40% at some points. I have a hard time seeing a technical draw there. PS5 can outperform XSX in some situations, it happened before, we can accept and move on, this is not the end of the world.Except it’s not when the PS5 drops well below the XSX in frame rate. It’s even in the spreadsheet the OP linked to…
…and it’s something the OP *conveniently* left out of his overview because it makes the PS5 version look poor.
At best this is a tie, if not a win for the XSX.
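The resolution percentages above are easy to check from the pixel counts quoted in the thread (3840x2160 native versus the 2688x2160 DRS low):

```python
# Checking the resolution percentages thrown around in the thread, using
# the two pixel counts quoted above: native 4K vs the 2688x2160 DRS low.

native = 3840 * 2160     # 8,294,400 pixels
drs_low = 2688 * 2160    # 5,806,080 pixels

more = native / drs_low - 1      # native renders ~42.9% more pixels
fewer = 1 - drs_low / native     # the DRS low is 30% fewer pixels
print(round(more * 100, 1), round(fewer * 100, 1))
```

"42.9% more pixels" one way and "30% fewer" the other way describe the same gap from opposite directions, which is likely where the thread's mix of 42% and 40% figures comes from.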
> I love how dynamic weather translates into shitty clouds. Praise the console wars!

Oh yes, I'm really out to get that confounded PS5! Yes sir-ree!
> But at the same time, if games start to use more memory, the Xbox Series X memory setup will bite it in the ass, but Sampler Feedback will help keep memory usage lower with newer games.

Yeah it could, but I was only referring to CU/SM count. In GPUs, frequency has not doubled in the past 10 years, but CU/SM count has tripled.
> I don't know why people think it creates frames, it stops your display causing judder and tearing due to dropped frames.
> That's better than it not doing that, simple.

I never said it "creates" frames, I said it makes up for them. Just like vsync, it only works within a certain range. If I want to play a game at 60fps and I'm using vsync but it keeps dipping into the 45 range, I can't go "well, thank god I'm using vsync"; that makes absolutely no sense.
Next gen update indeed..
> But isn't VRR primarily for eliminating screen tearing but also alleviating the potential for input lag that accompanies vsync? I see folks throwing out VRR like any and all frame drops will be fixed with this tech but not sure why. Anyone want to explain this?

Sadly, VRR is not used on consoles in an opportunistic way to remove vsync input lag.