
Marvel's Spider-Man Remastered is coming to PC on August 12, 2022 (Miles Morales also coming Fall 2022)

DukeNukem00

Banned
Mouse and keyboard are not for gaming. The movements are jerky on the mouse and walking is digital on the keyboard.
It's pure coincidence that it so happened that you can get fast, precise controls in a shooter. Some games are created for it, like RTS, but you can't say that universally all games play better with a mouse. That's just wrong.

While I often game with mouse and keyboard when I can on PC, I think a controller is usually a much more comfortable option. You can lean back, and all the controls sit ergonomically in your hand. And when it comes to MP games, everyone is on the same playing field.
Not everyone needs to be super competitive when playing.
And then you get analog walking and analog triggers, for throttle for example.
On top of that you can get very immersive haptics and other features.
What do I gain from slouching over Elden Ring with a keyboard? And a controller is priced like a mid-range to budget wireless gaming mouse :p

I didn't say it's 100% universal, I said it's near universal. Which it is. It's not pure coincidence that it "happened" to give more precision in a shooter; the FPS genre was built around it. You see this statement now and then, like you just made: mouse and keyboard are not for gaming. But they are. Most genres we play today (shooters, RPGs, adventures, strategy, MMOs, MOBAs, sims, etc.) were actually created around what mouse and keyboard offer. It wasn't an accident, it was intentional. I mean, adventure games are even nicknamed point-and-click games.

While the dual-stick controller... Sony just put the second stick there initially; it didn't have a specific purpose or goal. It took years for games to figure out what to do with it. I think we all know the infamous Alien gamepad pic at this point, right?

I don't even think the near-universal superiority of mouse and keyboard needs to be explained very much, frankly. Every console game needs assistance with everything from movement to interactions to camera control to aiming to shooting. Dual-stick pads are so hilariously shit and inept that game developers need to fucking make games play themselves behind the scenes in order for us to have the illusion that games function on dual-stick pads. Haptics and immersion and whatever other nonsense like that is beside the point and entirely subjective. I don't care if a game is not competitive; I don't want good and precise controls only in Counter-Strike. If I play a single-player game for 50 hours, I want them there too.
 

Thebonehead

Banned
Why not? PC is open. You can do whatever you want.

I kind of want to swing around the city as Bonesaw, Macho Man Randy Savage, making stupid one-liners while beating the shit out of bank robbers.
You forgot to add "at 144Hz".

randy savage GIF
 

64bitmodels

Reverse groomer.
I didn't say it's 100% universal, I said it's near universal. Which it is. It's not pure coincidence that it "happened" to give more precision in a shooter; the FPS genre was built around it. You see this statement now and then, like you just made: mouse and keyboard are not for gaming. But they are. Most genres we play today (shooters, RPGs, adventures, strategy, MMOs, MOBAs, sims, etc.) were actually created around what mouse and keyboard offer. It wasn't an accident, it was intentional. I mean, adventure games are even nicknamed point-and-click games.

While the dual-stick controller... Sony just put the second stick there initially; it didn't have a specific purpose or goal. It took years for games to figure out what to do with it. I think we all know the infamous Alien gamepad pic at this point, right?

I don't even think the near-universal superiority of mouse and keyboard needs to be explained very much, frankly. Every console game needs assistance with everything from movement to interactions to camera control to aiming to shooting. Dual-stick pads are so hilariously shit and inept that game developers need to fucking make games play themselves behind the scenes in order for us to have the illusion that games function on dual-stick pads. Haptics and immersion and whatever other nonsense like that is beside the point and entirely subjective. I don't care if a game is not competitive; I don't want good and precise controls only in Counter-Strike. If I play a single-player game for 50 hours, I want them there too.
Also, with mouse and keyboard, while it's more awkward to play in any scenario that isn't at a desk, it's more comfortable overall because your hands are just left to make the inputs rather than actually hold the controller, which can sometimes be kind of heavy (remember the Duke?).
Plus, your hands rest more naturally in a position made for typing, rather than a position for... holding a controller.
 

64bitmodels

Reverse groomer.
About a controller with the second stick being replaced by a touchpad, isn't that what the Steam Controller is?
It is, but the problem is that it didn't really catch on, and most games aren't really configured for it. Most games end up frequently switching to mouse-and-keyboard controls whenever you tap the touchpad, which can be a bitch to deal with and map properly.
If they tried it again in a more familiar form factor (as in, just copy the Xbox layout but without the right thumbstick) and made an input driver more suited to that kind of extra control, it would sell way better.
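Tangent for anyone curious what "an input driver more suited for that kind of extra control" could look like in practice: below is a minimal sketch using SDL2's game-controller touchpad API (available since SDL 2.0.14, and only for pads whose touchpad SDL actually exposes, e.g. a DualShock 4/DualSense; the Steam Controller's pads go through Steam Input instead). The camera mapping and the sensitivity value are purely hypothetical, not anything a real game uses.

```cpp
// touchpad_look.cpp - read a controller touchpad via SDL2 and turn finger motion
// into a camera-look delta. Build: g++ touchpad_look.cpp $(sdl2-config --cflags --libs)
#include <SDL.h>
#include <cstdio>

int main() {
    if (SDL_Init(SDL_INIT_GAMECONTROLLER) != 0) {
        std::printf("SDL_Init failed: %s\n", SDL_GetError());
        return 1;
    }
    SDL_GameController* pad = nullptr;
    for (int i = 0; i < SDL_NumJoysticks(); ++i)
        if (SDL_IsGameController(i)) { pad = SDL_GameControllerOpen(i); break; }
    if (!pad || SDL_GameControllerGetNumTouchpads(pad) == 0) {
        std::printf("No controller with an SDL-exposed touchpad found.\n");
        return 1;
    }

    float prev_x = 0.0f, prev_y = 0.0f;
    bool had_finger = false;
    const float sensitivity = 600.0f; // hypothetical scale: pad units -> degrees of yaw/pitch

    for (;;) {
        SDL_PumpEvents(); // refresh controller state
        Uint8 state; float x, y, pressure;
        // Finger 0 on touchpad 0; x/y come back normalized 0..1 across the pad surface.
        if (SDL_GameControllerGetTouchpadFinger(pad, 0, 0, &state, &x, &y, &pressure) == 0 && state) {
            if (had_finger) {
                float yaw   = (x - prev_x) * sensitivity;
                float pitch = (y - prev_y) * sensitivity;
                std::printf("look delta: yaw %+.2f pitch %+.2f\n", yaw, pitch);
            }
            prev_x = x; prev_y = y; had_finger = true;
        } else {
            had_finger = false; // finger lifted: no delta, no drift
        }
        SDL_Delay(16);
    }
}
```

A real driver would feed those deltas into the game's camera code (or synthesize mouse motion), but the point is the raw touchpad data is already there to build on.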
 

yamaci17

Member
Yes, definitely invest some time into playing with mods, especially with games that give you a lot of freedom in what to do with them, like the Elder Scrolls games or Darkest Dungeon.

Yes, you will fail. Yes, you will break your save to the point it is unrecoverable and you will have to try again. Yes, it will take a while.

But the rewards are there, and they are glorious.

When you can make Skyrim go from looking like this



to this


I think it's worth it.
Thank you, I do mods from time to time, but my question wasn't about modding :D I asked whether I should have given Darksiders 3 a try to see if the gamepad would work right out of the gate for me, and it did, because I set it up correctly 1.5 years ago; it works all the time.
 

rofif

Can’t Git Gud
I didn't say it's 100% universal, I said it's near universal. Which it is. It's not pure coincidence that it "happened" to give more precision in a shooter; the FPS genre was built around it. You see this statement now and then, like you just made: mouse and keyboard are not for gaming. But they are. Most genres we play today (shooters, RPGs, adventures, strategy, MMOs, MOBAs, sims, etc.) were actually created around what mouse and keyboard offer. It wasn't an accident, it was intentional. I mean, adventure games are even nicknamed point-and-click games.

While the dual-stick controller... Sony just put the second stick there initially; it didn't have a specific purpose or goal. It took years for games to figure out what to do with it. I think we all know the infamous Alien gamepad pic at this point, right?

I don't even think the near-universal superiority of mouse and keyboard needs to be explained very much, frankly. Every console game needs assistance with everything from movement to interactions to camera control to aiming to shooting. Dual-stick pads are so hilariously shit and inept that game developers need to fucking make games play themselves behind the scenes in order for us to have the illusion that games function on dual-stick pads. Haptics and immersion and whatever other nonsense like that is beside the point and entirely subjective. I don't care if a game is not competitive; I don't want good and precise controls only in Counter-Strike. If I play a single-player game for 50 hours, I want them there too.
Man. You didn't consider a single thing I said. Go back to slouching over your spreadsheet input device.
You absolutely disregarded everything.
Comfort, ergonomics, dev-designed layouts, haptics, analog movement, everything.

I understand you don't even want to consider a controller, or maybe you only discovered PC recently, but that's not even remotely fair.

IMO the only argument there is, is about fast mouse precision, which, as I've said, is not needed, and I happily trade it for comfort of use and smooth camera movement. But that's only the right stick. The movement stick, analog triggers and buttons are IMO better than a keyboard.
You said the controller needs assistance with every input, movement and aiming. Bullshit!!!
The only aid there is, is auto-aim of some type in some games, and it's optional. Some of the biggest games don't even have auto-aim enabled by default: UC4, Horizon, Returnal, Gears.

Every part of the controller offers more. You can walk in all 360 directions with analog movement speed. You can put throttle or other functions on the analog sticks. Doing throttle with a mouse is not an argument; I can do that with both the left and right stick.

And another point: if a game's most-used mechanic relies on super fast headshot aiming and nothing else, that's not very deep gameplay. And if a game requires 50 buttons? Wow, such versatility. OK, simulators are fair, but that's a different story.

So, to wrap up: it's perfectly fine that you require super fast and fine movement, and I get it. But disregarding everything else is not fair. Gaming is not about competitive fast aiming for me.
 

Nickolaidas

Member
Thank you, I do mods from time to time, but my question wasn't about modding :D I asked whether I should have given Darksiders 3 a try to see if the gamepad would work right out of the gate for me, and it did, because I set it up correctly 1.5 years ago; it works all the time.
LMAO - Sorry man!
 
There's no way to use Steam Input for Xbox/EGS games, yes. That's why I completely ditched Steam Input (disabled it entirely) and use DS4Windows + HidHide.


The GTX 700 series was gimped from the start; it won't be able to perform properly in any modern game. The Kepler architecture needed special drivers to be performant, and on top of that, those cards also lack DX12/low-level API instructions. They're literally limited to DX 11.0 and unable to run some games that the PS4/Xbox One can run (Halo Infinite, AC Valhalla from 2021), even if you have a 780 Ti. Compared to Kepler, however, Maxwell and Pascal aged gracefully.

Also, 2 GB of VRAM greatly limits the potential performance of such GPU chips (the likes of the GTX 760, 960, 750 Ti and so on); they will be limited by their buffer even if the chip itself is more capable. The PS4 can allocate approximately 4 GB of memory for GPU operations, so the comparison would be moot at this point; a comparison should be made with an RX 460 4 GB instead. A Pentium Gold + 8 GB can make do with console settings + no background apps + a 30 fps limit, provided you give the system a cheap but more capable GPU (something like a GTX 1050 Ti, 970 or so).

Those videos really conveyed a bad picture; I wish they'd never made them. Instead, draw a comparison with a midrange PC from 2014, which would have an i5 4460 + GTX 970. That rig still plays games well above PS4 level (1080p 45-60 fps depending on the scene), not quite the perfect 1080p 60 fps it managed back then. Still, factors like async compute, ACE and low-level API benefits cannot be overlooked, as I distinctly remember. The current-day situation is much better, with current-gen GPUs being full of features, to the point where they actually have more features than the console GPUs (Infinity Cache, more performant ray tracing capabilities, DLSS, and so on).

(Actually, the GTX 700 series cannot run God of War either. God of War requires a minimum of DX11.1, which Kepler also... does not support. Horrible architecture, horrible GPUs.)
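For anyone who wants to check their own card, this is roughly how a game or tool can probe the maximum Direct3D feature level at startup. It's a minimal sketch against the standard D3D11 API; it only reports what the driver exposes, and the per-game requirements mentioned above are the poster's claims, not something this code verifies.

```cpp
// fl_check.cpp - print the highest Direct3D feature level the default GPU/driver exposes.
// Build (MSVC): cl /EHsc fl_check.cpp d3d11.lib
#include <d3d11.h>
#include <cstdio>

int main() {
    // Ask for 11_1 first; the runtime hands back the best level it can actually create.
    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_1, D3D_FEATURE_LEVEL_11_0,
        D3D_FEATURE_LEVEL_10_1, D3D_FEATURE_LEVEL_10_0,
    };
    D3D_FEATURE_LEVEL got = D3D_FEATURE_LEVEL_10_0;
    ID3D11Device* device = nullptr;

    HRESULT hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                   requested, 4, D3D11_SDK_VERSION,
                                   &device, &got, nullptr);
    // Pre-11.1 runtimes reject a list containing 11_1: retry without it.
    if (hr == E_INVALIDARG)
        hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                               requested + 1, 3, D3D11_SDK_VERSION,
                               &device, &got, nullptr);
    if (FAILED(hr)) {
        std::printf("No D3D11 hardware device available.\n");
        return 1;
    }
    std::printf("Max feature level: 0x%x (11_1 = 0xb100, 11_0 = 0xb000)\n", (unsigned)got);
    if (got < D3D_FEATURE_LEVEL_11_1)
        std::printf("This GPU/driver tops out at FL 11_0 (the Kepler situation described above).\n");
    device->Release();
    return 0;
}
```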



Take this rig for example; it would be a better comparison. 2.4 tflops of Maxwell greatness, plus a simple, cheap, budget i5. It matches PS4 performance at times (sometimes drops below 30, sometimes goes above 30, a mixed bag). Could it have been better? The GTX 960 is, on paper, 33% faster than the PS4 (2.4 vs 1.8 tflops). It has less bandwidth (112 GB/s vs the PS4's 176 GB/s, though the PS4 probably uses something like 140-150 GB/s in practice). Factor in async and other potential API benefits, and I'd say this PC is doing a decent job, though you and I can infer different results from it (I look at it from the positive side). A GTX 970 puts you into 40-55 fps territory, of course. What I'm trying to say is that 1 year of difference (Kepler 2013, Maxwell 2014) made a huge difference. On paper, the GTX 960 was only 9-10% faster than the GTX 760, yet the GTX 760 cannot officially run God of War. I mean, look at this,



This is some horrific stuff. 2 GB of VRAM + not having proper API support = death sentence. Compare this 760's performance to the 4 GB 960 above. You would not believe both cards, on average, used to differ by a simple 10% margin. Not anymore, as you can see with the latest titles.
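Quick back-of-the-envelope check on those TFLOPS figures, using the usual peak-FP32 formula (shader cores x clock x 2 ops for a fused multiply-add). The core counts and boost clocks below are from public spec sheets, so treat the exact percentages as approximate.

```cpp
#include <cstdio>

// Peak FP32 TFLOPS = shader cores * clock (GHz) * 2 flops (one FMA) / 1000.
static double tflops(int cores, double clock_ghz) {
    return cores * clock_ghz * 2.0 / 1000.0;
}

int main() {
    double gtx960 = tflops(1024, 1.178); // GTX 960: 1024 CUDA cores, ~1178 MHz boost
    double ps4    = tflops(1152, 0.800); // PS4: 18 CUs = 1152 shaders at 800 MHz
    std::printf("GTX 960 ~ %.2f TF, PS4 ~ %.2f TF: about %.0f%% more on paper\n",
                gtx960, ps4, (gtx960 / ps4 - 1.0) * 100.0);
    return 0;
}
```

That lands around 2.4 TF vs 1.8 TF, roughly a 30% paper advantage, in the same ballpark as the 33% quoted above; as the results here show, paper TFLOPS clearly stopped telling the whole story once VRAM and API support entered the picture.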




I think it has to do with expectations, I have no idea. But the game surely got a lot of hate from lots of RTX GPU owners (my observation from the comments and forums). Suggesting a "low" RT setting for a 3080 raises some people's eyebrows; they expect more. Personally, I would say those settings are meant for future CPUs and GPUs, and I'm glad they exist. Some people ridiculed me, saying even an RTX 3050 can get high framerates with the "RT low" setting. Well, it does not. As a matter of fact, DF themselves suggested RT low reflections for 3070/2080 Ti tier cards, which is rightfully appropriate.


You say Kepler was gimped from the start (I agree with you, because that's a fact), but at the same time I was still happy with my 2012 build (3770K + GTX 680 2GB). Until 2017 (I bought a 1080 Ti in 2017, and later on I sold it and replaced it with a GTX 1080) I could run all PS4 ports with better results than the consoles, even despite being VRAM limited in many games. For example, GTA5 ran at a locked 60fps on my GTX 680 at 1080p while the PS4 version ran at 30fps. GTA5 with high texture settings can allocate around 3GB, while my GPU only had 2GB, but I had no stutters because the hybrid transfer (GPU to system RAM, and back to GPU) was fast enough.

Now what's crazy: I still use the 3770K from 2012 with a GTX 1080 at 2GHz (10TF), and I can still game at 1440p 60fps except for a very few extremely demanding games like Cyberpunk, The Medium and MSFS (but I feel like The Medium and MSFS are fine even at 30fps, and in Cyberpunk a 40fps lock on my G-Sync monitor still feels responsive and smooth, especially on a gamepad).
 

yamaci17

Member
You say Kepler was gimped from the start (I agree with you, because that's a fact), but at the same time I was still happy with my 2012 build (3770K + GTX 680 2GB). Until 2017 (I bought a 1080 Ti in 2017, and later on I sold it and replaced it with a GTX 1080) I could run all PS4 ports with better results than the consoles, even despite being VRAM limited in many games. For example, GTA5 ran at a locked 60fps on my GTX 680 at 1080p while the PS4 version ran at 30fps. GTA5 with high texture settings can allocate around 3GB, while my GPU only had 2GB, but I had no stutters because the hybrid transfer (GPU to system RAM, and back to GPU) was fast enough.

Now what's crazy: I still use the 3770K from 2012 with a GTX 1080 at 2GHz (10TF), and I can still game at 1440p 60fps except for a very few extremely demanding games like Cyberpunk, The Medium and MSFS (but I feel like The Medium and MSFS are fine even at 30fps, and in Cyberpunk a 40fps lock on my G-Sync monitor still feels responsive and smooth, especially on a gamepad).
Yeah, the divergence started around 2017 with AC Origins, Destiny 2, etc., when games started to really require 3-4 GB of VRAM and began to use more DX 11.1 instructions, and of course Kepler was bound by its driver due to its anemic architecture.

Even then, there are still lots of games that manage to run respectably on those GPUs, but it became a fifty-fifty situation.

Haha mate, I played Cyberpunk with a 40 fps lock on my 1080 as well, but I played with mouse/keyboard instead. I agree that 40 fps is a great compromise, and now some console devs are starting to realize its potential. My mindset was to have high/ultra graphics in that game, it being a gateway to next-gen. At low/medium settings I was able to get 60 FPS, but I said f**k it, pushed the settings and settled on a 40 FPS lock. I had some GPU headroom, around 10-15%, which helped me alleviate GPU-bound input lag as well.
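Side note on why 40 fps is such a good compromise on VRR or 120 Hz-capable displays: in frame-time terms it sits exactly halfway between 30 and 60 (and it divides evenly into a 120 Hz refresh). Trivial arithmetic, but spelled out:

```cpp
#include <cstdio>

int main() {
    // Frame time in milliseconds for common caps.
    const double caps[] = {30.0, 40.0, 60.0};
    for (double fps : caps)
        std::printf("%2.0f fps -> %.2f ms per frame\n", fps, 1000.0 / fps);
    // 30 fps = 33.33 ms and 60 fps = 16.67 ms; their midpoint is 25.00 ms, i.e. exactly 40 fps.
    std::printf("midpoint of 33.33 ms and 16.67 ms = %.2f ms (= 40 fps)\n",
                (1000.0 / 30.0 + 1000.0 / 60.0) / 2.0);
    return 0;
}
```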

That CPU is still relevant even today; I'm sure there's no game it cannot run smoothly as long as you pair it with a decent GPU. That's why I laugh at people saying the 5600X will go obsolete because consoles have 8 cores... yeah, they have 8 cores, but they're slow 3.5 GHz Zen 2 cores, while the 5600X has 6 fast Zen 3 cores clocked at 4.5-4.6 GHz. And an i5 from that era (2012) can still run all the games fine as long as you don't overburden it with a lot of background activity (yeah, that's a sacrifice or compromise you have to make, I have to admit).



Lmao, look at this dude with his 2500K + 3060 Ti rig: a solid 60 fps the majority of the time. If you target something like 4K/DLSS Quality you can still get away with older CPUs, like me.



Of course, the 2600K/3770K aged even better... but these i5s are still capable. Of course, their time is coming.
 

Orta

Banned
Really looking forward to these. That pre-release video of Spidey chasing Rhino through a shopping mall on the PS5 was the first time I've ever looked upon a console with envy as a PC nerd.
 

Topher

Gold Member
I didn't say it's 100% universal, I said it's near universal. Which it is. It's not pure coincidence that it "happened" to give more precision in a shooter; the FPS genre was built around it. You see this statement now and then, like you just made: mouse and keyboard are not for gaming. But they are. Most genres we play today (shooters, RPGs, adventures, strategy, MMOs, MOBAs, sims, etc.) were actually created around what mouse and keyboard offer. It wasn't an accident, it was intentional. I mean, adventure games are even nicknamed point-and-click games.

While the dual-stick controller... Sony just put the second stick there initially; it didn't have a specific purpose or goal. It took years for games to figure out what to do with it. I think we all know the infamous Alien gamepad pic at this point, right?

I don't even think the near-universal superiority of mouse and keyboard needs to be explained very much, frankly. Every console game needs assistance with everything from movement to interactions to camera control to aiming to shooting. Dual-stick pads are so hilariously shit and inept that game developers need to fucking make games play themselves behind the scenes in order for us to have the illusion that games function on dual-stick pads. Haptics and immersion and whatever other nonsense like that is beside the point and entirely subjective. I don't care if a game is not competitive; I don't want good and precise controls only in Counter-Strike. If I play a single-player game for 50 hours, I want them there too.

And then there is the matter of personal preference. Personally, I don't like keyboard and mouse when it comes to racing/driving or games played in third person; they simply don't feel right. First person, I'm only using keyboard and mouse. For example, when I'm playing Cyberpunk 2077 on my PC, I have my Elite Series 2 controller nearby at all times because I hate driving vehicles with the keyboard, so I swap them out. Other genres, like top-down games such as Divinity OS2, I have to have keyboard. Forza... controller. And so on.

The point is that the way we play games with our own hands is very subjective. For that reason, I think console versions of games should include the same keyboard and mouse support as their PC versions. This omission is mind-boggling to me, especially for first-person shooters. But ultimately, I think arguing over which is the better control scheme is as futile as arguing over favorite colors.

Didn't Insomniac tweet at one point that it would never come to Xbox and PC?

Well, that aged like fine wine.



It was probably a true statement at the time, but never say never.
 

Shmunter

Member
Really looking forward to these. That pre-release video of Spidey chasing Rhino through a shopping mall on the PS5 was the first time I've ever looked upon a console with envy as a PC nerd.
The rhino bit from the beginning…

 

Guilty_AI

Gold Member
How many people on PC use m/kb for sports, racers and fighting games?
I prefer m/kb for racers; I can make finer adjustments to throttle/steering that way. Trying to give precise steering inputs with analogue sticks is basically useless anyway, so quick digital inputs feel better. The only real advantage of a controller is being able to give partial throttle for very fast vehicles in some sims or simcades, assuming the controller has analogue triggers.

...

I also play driving sims with mouse steering and acceleration when it's possible.
kaos-elmo.gif
 
Yeah, the divergence started around 2017 with AC Origins, Destiny 2, etc., when games started to really require 3-4 GB of VRAM and began to use more DX 11.1 instructions, and of course Kepler was bound by its driver due to its anemic architecture.

Even then, there are still lots of games that manage to run respectably on those GPUs, but it became a fifty-fifty situation.

Haha mate, I played Cyberpunk with a 40 fps lock on my 1080 as well, but I played with mouse/keyboard instead. I agree that 40 fps is a great compromise, and now some console devs are starting to realize its potential. My mindset was to have high/ultra graphics in that game, it being a gateway to next-gen. At low/medium settings I was able to get 60 FPS, but I said f**k it, pushed the settings and settled on a 40 FPS lock. I had some GPU headroom, around 10-15%, which helped me alleviate GPU-bound input lag as well.

That CPU is still relevant even today; I'm sure there's no game it cannot run smoothly as long as you pair it with a decent GPU. That's why I laugh at people saying the 5600X will go obsolete because consoles have 8 cores... yeah, they have 8 cores, but they're slow 3.5 GHz Zen 2 cores, while the 5600X has 6 fast Zen 3 cores clocked at 4.5-4.6 GHz. And an i5 from that era (2012) can still run all the games fine as long as you don't overburden it with a lot of background activity (yeah, that's a sacrifice or compromise you have to make, I have to admit).



Lmao, look at this dude with his 2500K + 3060 Ti rig: a solid 60 fps the majority of the time. If you target something like 4K/DLSS Quality you can still get away with older CPUs, like me.



Of course, the 2600K/3770K aged even better... but these i5s are still capable. Of course, their time is coming.


Around 2012, i5s were extremely popular because people were saying Hyper-Threading was just a useless feature for gaming. With time, however, HT proved to be a savior for my 3770K. I can see a big difference now in CPU-intensive games (for example Shadow of the Tomb Raider, or AC Origins) between 4c/4t and 4c/8t: when an i5 3570 dips into 45fps territory, an i7 3770 gets 55-65fps. There are still some games that will dip below 60fps on my 3770K, but only on rare occasions, and on my G-Sync monitor 50fps is still very playable.

Here's the list of games I have played lately with my performance impressions. Some of these games are also available on PS5/XSX, so maybe people will find my results interesting.

-God of War, this is a very demanding game at Ultra settings and there's no way I can get close to 60fps at 1440p, but with PS4 settings I get around 70fps. For comparison, a GTX 1070 gets 50fps and a 1060 35fps with these settings, so it's not like my GPU power is being wasted by the 3770K.

-RE Village, max settings, 1440p, I get around 80fps average with dips to 52-55fps on rare occasions during some more intensive combat. With high settings I get a locked 60fps even during combat, but on my G-Sync monitor this game is perfectly smooth even with 50fps dips, so I played with max settings.

-RE3 Remake, max settings, 1440p, locked 60fps. I tried playing this game at 1600p (thanks to DSR) for a brief time and still had 60fps. I think playing with DSR really improves picture quality if the game is using TAA. At 1800p I could already see some dips below 60fps, especially near volumetric light shafts.

-Gears 5, ultra settings, 1440p (with min fps set to 30fps, so dynamic res wasn't changing anything), and I had a locked 60fps in all levels except one (I had a 45-55fps dip on one occasion in Act 4, and that was indeed because of a CPU bottleneck).

-Tiny Tina's Wonderlands, high settings, 1440p, locked 60fps for 99% of the time with rare dips to around 50fps in a couple of places because of a CPU bottleneck.

-Metro Exodus, ultra settings, 1440p, 60fps for 99% of the time, but in the Taiga level I get 50-55fps dips on rare occasions.

-Guardians of The Galaxy, max settings, 60fps. I wonder why this game runs only at 1080p 60fps on PS5/XSX.

-RDR 2, XOX settings (medium-low + max texture settings), 1440p, 55-60fps, perfectly playable on a G-Sync monitor.

-Forza Horizon 5, ultra settings, 1440p, locked 60fps.

-Forza Horizon 4, ultra settings, 1440p + MSAAx4 with 2x DSR (4K), locked 60fps

-Mafia Remake, high settings, 1440p, 55-70fps.

-Horizon Zero Dawn, high settings, 1440p, locked 60fps.

-Days Gone, high / max settings mix with screen space GI turned on, 1440p, locked 60fps

-Kena, max settings, 1440p, 60fps locked

-Hellblade, max settings, 1440p, 60fps

-Halo Infinite, this is a very demanding game at 1440p maxed out, so I get only around 45-50fps, or 60fps with low/med settings. This game, however, has awesome dynamic resolution scaling (a rough sketch of the general technique follows after this list), so I get 60fps at 1440p with max settings and the picture is perfectly sharp to my eyes despite the upscaling.

-Watch Dogs 2, high settings, 1440p, 60fps with small dips from time to time

-Watch Dogs Legion, high settings, 1440p with a 45fps lock. This game is CPU intensive and lower settings don't improve CPU performance that much, so dips below 60fps are more frequent; therefore a 45fps lock is the best choice.

-Far Cry 6, similar situation to Watch Dogs Legion (CPU-related dips), but in this game I'm using a 50fps lock.

- Cold War and Vanguard, high settings, 1440p, 60fps locked

-Assassin's Creed Origins, I played this game lately because I wanted to compare my results with the XSX/PS5 patch update. At 1440p high settings I get around 55-60fps in the city because of a CPU bottleneck (I know it's a CPU bottleneck because GPU usage drops to 65-70%), but outside the cities I get 80-100fps (and that's when GPU usage hits 99%). Even at 1600p I can play this game at 60fps+.

-Shadow of the Tomb Raider, max settings, 1440p. I get 60fps around 99% of the time, but in areas with a lot of people (Paititi) I sometimes see 50-55fps dips because of the CPU bottleneck. I was really impressed with the graphics in this game, and I think some later areas look more impressive than even Uncharted 4's jungle.
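On the dynamic resolution point from the Halo Infinite entry above: the general idea (a generic sketch, not Infinite's actual implementation, which isn't public here) is just a feedback loop that measures GPU frame time and nudges the render scale toward a target budget. All constants below are illustrative.

```cpp
#include <algorithm>
#include <cstdio>

// Generic dynamic-resolution controller: nudge the render scale toward whatever
// keeps GPU frame time on budget. Requires C++17 for std::clamp.
struct DynResController {
    double scale = 1.0;        // fraction of native resolution per axis
    double min_scale = 0.6;    // never drop below ~60% of native
    double target_ms = 16.6;   // 60 fps budget
    double smoothing = 0.15;   // how aggressively to react each frame

    double update(double gpu_frame_ms) {
        double error = target_ms / gpu_frame_ms;  // >1 means headroom, <1 means over budget
        double desired = scale * error;
        scale += (desired - scale) * smoothing;   // smooth to avoid visible resolution pumping
        scale = std::clamp(scale, min_scale, 1.0);
        return scale;
    }
};

int main() {
    DynResController drs;
    const double measured_ms[] = {15.0, 18.5, 21.0, 19.0, 16.0, 14.5};
    for (double ms : measured_ms)
        std::printf("gpu %.1f ms -> render scale %.2f\n", ms, drs.update(ms));
    return 0;
}
```

Paired with a decent TAA or other temporal upscaler, small dips in render scale are hard to spot, which matches the "perfectly sharp to my eyes" impression above.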

My GTX 1080 model is a Palit GameRock Premium (2GHz OC out of the box, but I'm running it at 2100MHz) and with such a massive OC it has 10.7TF (for comparison, a stock 1080 Ti has 11.3TF). I had a 1080 Ti Asus Strix OC in the past, but I wasn't happy with it because it was way too loud. The sound was literally like a jet engine, which was very annoying (especially when my little niece slept in my room and I wanted to play games), and on top of that the temps inside my PC case rose way too much (my 3770K was running at 85°C, and that's too hot). Now with the GTX 1080 I can barely hear anything even at 99% GPU usage (I have to put my ear near the PC case just to hear the fans), and the CPU temp is much lower as well (62-72°C now versus 85°C before), so for me the GTX 1080 was a much better choice and I have no regrets about selling the 1080 Ti.

With the current situation (this whole pandemic, a potential world war, economic crisis) the future is uncertain, but if the world doesn't end soon, I will upgrade my PC in September. I'm planning to replace the mobo/CPU because I want to build a Win11-compatible system, and I will also upgrade my GPU when UE5 games finally start coming out. Cards like the 4080/4090 will be too expensive for me, but I can afford something within the $700 range, maybe a 4060 Ti (that card should offer RTX 3080 performance, and that's an awesome GPU with a big performance boost compared to my GTX 1080). Till then I'm perfectly happy with my current PC, and who knows, maybe even Spider-Man will run with good results on this 3770K.
 

Danknugz

Member
Why not both? Use the mouse in your right hand and the controller in your left; that takes care of the analog movement you lose with kb/m and the imprecise aiming you have with the controller. I've never done it, but it sounds like a good idea 👍
 

DeepEnigma

Gold Member
I wonder if Microsoft are going to do free copies of a special Windows 11 on XSX/XSS to let Xbox players run Spider-Man on their console too. They already allow all the old consoles except the PS3 to be emulated on Series with an unofficial app, so it seems like a natural progression to let PlayStation exclusives on PC run on Series consoles too, IMO.
youtube fashion GIF
 

PaintTinJr

Member
I find it really amusing that you believe Sony would allow their games to run on XSS/XSX. 🤣
Hypothetically, if it happened, what options would PlayStation have? IMO they'd either have to wear it, knowing that this backdoor solution (which Xbox would probably use to upsell their proprietary SSDs) was going to damage the perception, in the eyes of PS5 exclusive gamers, of any first-party exclusive they wanted to port to PC, or they would need to retreat from the free PC gamer money. Neither option, IMO, helps Sony eventually migrate their core console business and profits to a PlayStation PC launcher in the coming decade.

So hypothetically, MS would have them over a barrel should they backdoor PC games, at lower performance, onto Series consoles, like they have with emulation.
 

SmokedMeat

Gamer™
For me the PS3 has slightly better 1st party games than the PS2. ICO, GT, Shadow of the Colossus, God of War and Jak and Daxter were all brilliant, but the PS3 had
Uncharted 1-3, TLOU, God of War 3, Infamous 1-2, Killzone 2-3, Resistance 1-3, Sly Cooper, Ratchet and Clank, Demon's Souls, etc.

Maybe so. Mark of Kri, War of the Monsters, and Downhill Domination were also awesome PS2 titles. Wish they were still around.
 

rofif

Can’t Git Gud
https://thegamespoof.com/gaming-new...-on-pc-than-on-ps5-first-graphics-comparison/

Looking great. Better textures, lighting, ambient occlusion, shadows, etc.
Are we looking at the same video?
It looks the same. If there are any differences, it's only subjective to say it looks better. Maybe the light hits differently in a scene or two. You make it sound like it's leagues better.
You mention better textures... what about worse textures?
It's probably a bug or a different costume. They show it again in another scene and it's also this low-res.

ICBHmj9.jpg
 

Topher

Gold Member
Are we looking at the same video?
It looks the same. If there are any differences, it's only subjective to say it looks better. Maybe the light hits differently in a scene or two. You make it sound like it's leagues better.
You mention better textures... what about worse textures?
It's probably a bug or a different costume. They show it again in another scene and it's also this low-res.

ICBHmj9.jpg

Those are two totally different costumes. Notice the big white spider.
 
I think the real question is how much better.

We already know there's better resolution, shadows, AO and lighting. Seems Nixxes is working their magic well.
I'm curious to see whether asset streaming, beyond all the improvements you mentioned, gets better or worse.
 

Swift_Star

Banned
Hypothetically, if it happened, what options would PlayStation have? IMO they'd either have to wear it, knowing that this backdoor solution (which Xbox would probably use to upsell their proprietary SSDs) was going to damage the perception, in the eyes of PS5 exclusive gamers, of any first-party exclusive they wanted to port to PC, or they would need to retreat from the free PC gamer money. Neither option, IMO, helps Sony eventually migrate their core console business and profits to a PlayStation PC launcher in the coming decade.

So hypothetically, MS would have them over a barrel should they backdoor PC games, at lower performance, onto Series consoles, like they have with emulation.
They would sue MS.
 

Panajev2001a

GAF's Pleasant Genius
Are we looking at the same video?
It looks the same. If there are any differences, it's only subjective to say it looks better. Maybe the light hits differently in a scene or two. You make it sound like it's leagues better.
You mention better textures... what about worse textures?
It's probably a bug or a different costume. They show it again in another scene and it's also this low-res.

ICBHmj9.jpg
Looks like a bit of a gamma issue/difference; not sure if it's the footage or how it was processed.
 

BennyBlanco

aka IMurRIVAL69
Reminder that Sony pulled GOW from running on GeForce Now through the Xbox web browser. There's no chance they would let this run on an Xbox.
 

PaintTinJr

Member
they would sue MS.
I asked before and you didn't reply, but what for, exactly? I used to read Groklaw from time to time, so I'm quite familiar with when big tech companies have sued and whether they were successful, and even under the US legal system I'm failing to see an angle for them to litigate such a move. Here in the UK (or the EU) they would have even less chance of taking issue with a Personal Computer (PC) game being published and run on a PC operating system, irrespective of the hardware used.

Sony many years ago lost heavily when suing the Virtual PC company that made a PlayStation emulator for PC, and were only able to remove it from the market after the failed legal action by buying out the company. They then sold the company to Microsoft (after removing the emulator from the company's assets), who in turn phased out Virtual PC on the Mac, IIRC.
 

Relique

Member
Yeah I suspected that
Sorry, but if you couldn't see that those are two different costumes based on the giant white spider taking up half the screen, then maybe you should realize that you aren't any kind of reliable judge of the differences between the two versions. I barely clicked through that video for 10 seconds and I could immediately tell the big differences between them. Just compare any two sets of shadows, or look at how much more defined the hair is in the PC version.
 

rofif

Can’t Git Gud
Sorry, but if you couldn't see that those are two different costumes based on the giant white spider taking up half the screen, then maybe you should realize that you aren't any kind of reliable judge of the differences between the two versions. I barely clicked through that video for 10 seconds and I could immediately tell the big differences between them. Just compare any two sets of shadows, or look at how much more defined the hair is in the PC version.
I did see it. I've just not played the game, so I don't know if you can change whole costumes or only the logo and so on.
I said that in my original post... maybe read it?
Why is everyone so fucking ass recently? Chill, people...
My posts get deleted left and right by the mods because assholes don't have any counterarguments and just report me.

There is no "BIG" difference. Even base PS4 to the remaster on PC would not be "BIG". Just a bit better and different.
 

Relique

Member
I did see it. I've just not played the game, so I don't know if you can change whole costumes or only the logo and so on.
I said that in my original post... maybe read it?
Why is everyone so fucking ass recently? Chill, people...
My posts get deleted left and right by the mods because assholes don't have any counterarguments and just report me.

There is no "BIG" difference. Even base PS4 to the remaster on PC would not be "BIG". Just a bit better and different.
Really? The logo is the only difference between the two costumes? Like I said, you need to get your eyes checked.

Also, telling people to chill while calling others "ass" and "assholes"? No wonder your posts get deleted.
 

Dream-Knife

Banned
Nice. Especially nice because it doesn't seem like new games come out much anymore these days.
Then play some PC games and stop treating it as a super console. Great new games come out all the time.
Very true. Everything is a remake of a remake or port of a port. Eh.
See above.
OK, I get tired of the lies from PC gamers on enthusiast forums. They have a vested interest in keeping people buying into consoles, as they get a lot more money per player there: they get money from third-party sales and from online subscriptions, which PC players still shy away from.
Yes, we shy away from getting cucked out of our own internet connection.
Think about it: are you really going to dump your PS and buy an RTX 3080 PC?
Why does everyone think you need a 3080 to be a PC gamer?
No please, no Sony launcher on PC...

That would be a way to ask for a fee in order to play online on PC, and I really really don't want to see that can of worms being opened.
No way that happens. Plus, no one is going to use a Sony launcher anyway. If they made their games exclusive to a launcher, everyone would just pirate them.
 