
Why don't console games have more graphics options?

Guilty_AI

Member
There's literally nothing stopping you from playing with default settings; this would just be advanced settings for those of us who like more options.
It bothers people with OCD who don't have enough self-control to stop themselves from tweaking and comparing graphics settings for three hours straight.
It's extremely dumb, but I'm not the one deciding how the market works.
 
That's the worst idea I've heard in a long time.

Consoles have been the superior gaming experience for a long time, specifically (among other aspects) because you don't have or need "options": you're getting the best, most optimised version of the game possible on that specific console (unless it's CP2077) thanks to it being standard. It's playable in its best state out of the box (or download).

In fact, it's a huge step back that PS4/PS5 have different options, which is a testimony to how many unoptimised compromises their games have to target, and an indicator that you're not playing the best version of the game in any case.

If anything, it's PC that should have fewer options; in fact, that's exactly what the best PC games or ports have. The better optimised and smoother they are, the fewer options they have, because the devs did their work. I've often said that the fact that CP2077 has so many options on PC is a testimony to what a crap mess it is.



If making a handful of adjustments for a couple of minutes before you play your 10+ hour PC game is too much, there are easy buttons too. For starters, most new games optimize for your setup with the click of a button in the options menu, or often while starting up for the first time. When booted, they read your hardware configuration and know which GPU/CPU/RAM you have available. A lot of games also let you select simplified options at the top of the options page. If you have an Nvidia GPU, you can also use GeForce Experience and let it decide how your games will be set.
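That click-of-a-button optimization is, at heart, a hardware-to-preset lookup. A minimal sketch in Python — the VRAM thresholds, core counts, and preset names here are invented for illustration; real games ship per-GPU lookup tables rather than simple cutoffs:

```python
def pick_default_preset(vram_gb: float, cpu_cores: int) -> str:
    """Map detected hardware to a starting graphics preset.

    Thresholds are illustrative only, not taken from any real engine."""
    if vram_gb >= 10 and cpu_cores >= 8:
        return "ultra"
    if vram_gb >= 6 and cpu_cores >= 6:
        return "high"
    if vram_gb >= 4:
        return "medium"
    return "low"

# A 3090-class machine (24 GB VRAM, 16 cores) lands on "ultra";
# an aging 4-core box with a 2 GB card drops to "low".
print(pick_default_preset(24, 16))
print(pick_default_preset(2, 4))
```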

The worst PC ports have reduced options, not the other way around. The absolute worst are the ones that dump all the individual features into a single "post-process" option that lumps together everything from chromatic aberration to volumetric lighting. Games like this always end up with a mod to split it up. Take The Outer Worlds, for instance: the CA in that game is incredibly heavy-handed out of the box, with no option to fix it. I had to file-dive to fix it. Now there are about a dozen mods on Nexus that add options to adjust the individual components of "post-process" that weren't available beforehand.

Thanks to PC options:

- a player with a 144+Hz screen can make adjustments to get higher framerates.
- a player with a 4K/60Hz screen can make adjustments to get the best IQ possible at 60Hz
- a player with an older GPU can still play a new game at 144Hz
- a player with a brand new 3090 can play games with more effects than a player with last year's GPU
- a player can disable distracting shitty effects like motion blur or CA
 

OmegaSupreme

advanced basic bitch
That's the worst idea I've heard in a long time.

Consoles have been the superior gaming experience for a long time, specifically (among other aspects) because you don't have or need "options": you're getting the best, most optimised version of the game possible on that specific console (unless it's CP2077) thanks to it being standard. It's playable in its best state out of the box (or download).

In fact, it's a huge step back that PS4/PS5 have different options, which is a testimony to how many unoptimised compromises their games have to target, and an indicator that you're not playing the best version of the game in any case.

If anything, it's PC that should have fewer options; in fact, that's exactly what the best PC games or ports have. The better optimised and smoother they are, the fewer options they have, because the devs did their work. I've often said that the fact that CP2077 has so many options on PC is a testimony to what a crap mess it is.
Garbage tier take and you should be embarrassed.
 

01011001

Banned
here is what developers should do:

offer the usual console options in the normal options menu.
but either have a cheat code or a visible menu option for advanced settings... maybe even have a warning there so even the most casual plebs get that they shouldn't change these if they don't know what they are doing.

but then have that menu and let me change the framerate lock, the vsync settings and maybe give different graphics settings on top of that.

because the choices some developers make are baffling and I really would prefer to have a way around that.

for example, Forza Horizon 5... I find it ridiculous that they chose to have a 4K performance mode with reduced settings, but not have an option for a 1440p performance mode using the quality mode settings.
 

crozier

Member
High-end PCs have historically run console games at a higher resolution, with better fidelity and a faster frame rate. A big deal. This is the first generation, however, where IMO we have a real game changer: ray tracing.
 

Fredrik

Member
Thanks to PC options:

- a player with a 144+Hz screen can make adjustments to get higher framerates.
- a player with a 4K/60Hz screen can make adjustments to get the best IQ possible at 60Hz
- a player with an older GPU can still play a new game at 144Hz
- a player with a brand new 3090 can play games with more effects than a player with last year's GPU
- a player can disable distracting shitty effects like motion blur or CA
- a player doesn’t have to waste performance on 4K resolution

I still play in 1080p on PC, it’s awesome, most games are 100+ fps at Ultra.
On console I’m forced to play in native/dynamic 4K, often with fps drops below 60 fps.
 
- a player doesn’t have to waste performance on 4K resolution

I still play in 1080p on PC, it’s awesome, most games are 100+ fps at Ultra.
On console I’m forced to play in native/dynamic 4K, often with fps drops below 60 fps.

"Don't have to" is the key wording you used that makes PC gaming so open for just about anyone regardless of hardware.

Players don't have to do anything on PC when it comes to visuals unless they simply can't hit a target due to hardware age. 1080p/240Hz monitor? Players can run that with the right hardware and settings. 1440p/144Hz? Players can run that with the right hardware and settings. 4K/60Hz? Players can run that if they have the right hardware. 4K/120Hz? Players can run that if they have the right hardware. You might have to turn some effects down or disable others based on your hardware, but you can typically get games running at 120+ fps if that's what you want to do.

If the options in a PC game are missing, someone knows what changes to make to the .ini files to fix it yourself. I actually love file-diving before I play a game. Fallout 4 is a great example of file-diving to get more out of the game. The game loads five cells in vanilla form, but I can make it load as many as 13 by changing a number. This means that in my version of the game, NPCs and assets are loaded in much further ahead than for someone playing vanilla. It also destroys performance, so I typically run the game at 9 cells to keep performance in check. I can change the shadow draw distance, reflection parameters, even things like foliage density in some games.
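For the curious, the Fallout 4 tweak described above is the `uGridsToLoad` value in the game's ini file, which the engine expects to be an odd number (5, 7, 9, ...). A sketch of automating the edit with Python's configparser — the path in the example is a placeholder, and you should back the file up before touching it:

```python
import configparser

def set_ugrids(ini_path: str, grids: int) -> None:
    """Set uGridsToLoad under [General] in a Bethesda-style ini file."""
    if grids % 2 == 0 or grids < 5:
        raise ValueError("uGridsToLoad should be an odd number >= 5")
    cfg = configparser.ConfigParser()
    cfg.optionxform = str  # preserve the key's camelCase on write
    cfg.read(ini_path)
    if not cfg.has_section("General"):
        cfg.add_section("General")
    cfg["General"]["uGridsToLoad"] = str(grids)
    with open(ini_path, "w") as f:
        cfg.write(f)

# Example (path is hypothetical):
# set_ugrids(r"C:\Users\me\Documents\My Games\Fallout4\Fallout4.ini", 9)
```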

But I don't have to do that either. I can also just go into settings, choose a "Low/Med/High/Ultra" preset, and just play the game. It's up to me.

offer the usual console options in the normal options menu.
but either have a cheat code or a visible menu option for advanced settings... maybe even have a warning there so even the most casual plebs get that they shouldn't change these if they don't know what they are doing.

Forza Horizon 3/4 have exactly this in the PC versions. The graphics options are preset to low/mid/high/ultra/extreme dynamic targets, but you can opt to slide over to advanced settings, where a warning text bubble pops up letting you know that the options could produce performance issues if you have no idea what it all means. Then you set everything to Extreme anyway and dial it back from there if needed to hit IQ/framerate targets, because that's the best way to squeeze everything out of the GPU.
 

cireza

Banned
As a console player, I want 60fps. That's it. I don't want to deal with parameters that affect performance and have me wondering "what the fuck has changed now that I've set the shadows to the Not-So-Ultra setting?".

Give me 60fps. If it has to be an option, give me the option. And that's it. I don't want to have other parameters. That's why I play on consoles.

People who want better fine tuning should move to PC.
 

ANDS

King of Gaslighting
Because most console gamers get confused with graphics options.

You only have to explain it once. Or do like UBISOFT does and just put an image and description of what the thing is doing.

. . . graphical options aren't some dense subject and developers shouldn't be hiding them away from consumers because they don't think console users will get it (which I don't think is a thing).
 

rofif

Banned
- a player don’t have to waste performance on 4K resolution

I still play in 1080p on PC, it’s awesome, most games are 100+ fps at Ultra.
On console I’m forced to play in native/dynamic 4K, often with fps drops below 60 fps.
you don't know what you're missing out on playing on a 1080p monitor. probably 24" too...
4k, oled, huge screen, amazing hdr colors with local dimming. It's like 2 next gens above that 1080p.

I mean, it is perfectly fine but I can't even watch 1080p sdr movies anymore...
 

winjer

Gold Member
You only have to explain it once. Or do like UBISOFT does and just put an image and description of what the thing is doing.

. . . graphical options aren't some dense subject and developers shouldn't be hiding them away from consumers because they don't think console users will get it (which I don't think is a thing).

You are right. But try to explain that to some console gamers and they'll still say it's complicated.
 

rofif

Banned
You are right. But try to explain that to some console gamers and they'll still say it's complicated.
it's not only about it being complicated.
You need a lot of other things to judge settings changes, like an fps counter and gpu/cpu utilization readouts.
And why should there be options if the hardware is fixed? Why take that comfort away from the developers?
I bet games that strain the hardware, like tlou2, could not look this good if some stuff had to be dynamic in order to be adjustable. There would probably have to be some limitations.
The one setting we need is a global fps-cap-off switch in the console system menu, just so these performance modes can go as high as VRR will allow. Quality mode should be dev-adjusted.
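An uncapped performance mode riding a VRR window is usually paired with dynamic resolution scaling. A toy version of that feedback loop — the thresholds and step sizes are made up for illustration, and 8.33 ms corresponds to 120 fps, the top of a typical VRR window:

```python
def adjust_render_scale(scale: float, frame_ms: float, target_ms: float = 8.33) -> float:
    """Nudge resolution scale down when a frame runs long, back up
    when there's headroom; clamp between 50% and 100% of native."""
    if frame_ms > target_ms * 1.05:        # missed the target: shed pixels
        scale -= 0.05
    elif frame_ms < target_ms * 0.85:      # comfortable headroom: restore quality
        scale += 0.02
    return max(0.5, min(1.0, scale))
```

Real engines react to a rolling average of GPU frame times rather than a single frame, but the shape of the loop is the same.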
 

Fredrik

Member
you don't know what you're missing out on playing on a 1080p monitor. probably 24" too...
4k, oled, huge screen, amazing hdr colors with local dimming. It's like 2 next gens above that 1080p.

I mean, it is perfectly fine but I can't even watch 1080p sdr movies anymore...
I know exactly what I’m missing out on. Worse performance.
I’m a gray man and my eyes are crap; I can’t see 4K even if I wanted to, so I’ve focused on what I can see instead: a higher framerate. I used to play on three screens as well. 4K is possibly the worst waste of system resources I can think of, and on console that’s what devs are going for, with no way for me to opt out and get a better framerate instead.
 

rofif

Banned
I know exactly what I’m missing out on. Worse performance.
I’m a gray man and my eyes are crap; I can’t see 4K even if I wanted to, so I’ve focused on what I can see instead: a higher framerate. I used to play on three screens as well. 4K is possibly the worst waste of system resources I can think of, and on console that’s what devs are going for, with no way for me to opt out and get a better framerate instead.
I am not attacking. I understand.
That's the reason why I got a 48" screen for a monitor. 4k is easily beneficial now and I can see clearly :p my eyes are crap too
at 27", 4k looked good but really wasn't showing its nature
 

Fredrik

Member
I am not attacking. I understand.
That's the reason why I got a 48" screen for a monitor. 4k is easily beneficial now and I can see clearly :p my eyes are crap too
at 27", 4k looked good but really wasn't showing its nature
Fair enough, I haven’t tried such a big screen. If I sit at my living room table, about 1.5 meters from the 65” TV, I can start spotting small differences between 4K and 1080p on console. The only difference I see from back on the couch is whether the logo in the corner of the TV says 4K or 1080p 😅
 

poodaddy

Gold Member
That's the worst idea I've heard in a long time.

Consoles have been the superior gaming experience for a long time, specifically (among other aspects) because you don't have or need "options": you're getting the best, most optimised version of the game possible on that specific console (unless it's CP2077) thanks to it being standard. It's playable in its best state out of the box (or download).

In fact, it's a huge step back that PS4/PS5 have different options, which is a testimony to how many unoptimised compromises their games have to target, and an indicator that you're not playing the best version of the game in any case.

If anything, it's PC that should have fewer options; in fact, that's exactly what the best PC games or ports have. The better optimised and smoother they are, the fewer options they have, because the devs did their work. I've often said that the fact that CP2077 has so many options on PC is a testimony to what a crap mess it is.
This is the dumbest fucking comment I've ever seen on any forum, full stop. I'm not saying this forum, I'm saying all of them.

Really, just sit and think about how fucking dumb that is.
 

StreetsofBeige

Gold Member
If a game maker can make a PC game covering a million configs across low- and high-end specs and offer endless sliders, which all affect performance, it can't be that hard for them to do the same for consoles. It should actually be easier, because everyone has the same system.

On the plus side, at least now most big budget games have a performance/quality toggle so that's good. And given just about all games on next gen so far have a 60 fps option, it's pretty good. I don't have a 120fps TV, so I don't have any urge to have settings dialed down more to bump up frames.

Before recent games with perf/quality modes, none of the toggles in console games seemed to make any difference to performance.

I remember playing Triple Play 98 on PS1. Terrible game, with the choppiest frame rate I've ever seen in a baseball game. That was around the time console sports games got multiple camera angles, so I'm thinking hey, maybe it's like playing games on PC and I'll adjust some settings and cam angles and find a smooth one. Nope. All were just as sloppy and choppy. If this were PC, I could mess with sliders and cam angles and find something playable and smooth.
 

Brofist

Member
Some people just don't do well with too many settings. I think keeping it simple for console gamers, with a performance mode and a quality mode, is probably enough.
 

RoadHazard

Gold Member
Graphics options are there so that people with different hardware configurations can make the game run well on their particular setup. On console everyone has the same configuration, so the settings should simply be optimized for that (with quality/performance settings when that makes sense). There's really no reason to have more granular options.
 

yamaci17

Member
Graphics options are there so that people with different hardware configurations can make the game run well on their particular setup. On console everyone has the same configuration, so the settings should simply be optimized for that (with quality/performance settings when that makes sense). There's really no reason to have more granular options.
why not simply give a console preset for PC games then? they already did the obvious work of tweaking settings for optimal perf/graphics. I'm getting tired of following Hardware Unboxed/DF videos to get optimized settings. even then, they still use aggressively high settings so that they don't anger the PCMR fellows... I just want a console preset... be it low, med or high, or a mix of them. whenever I say this, the occasional PCMR fellow jumps in and says "why not buy a console then!". well, I pick PC to freely choose what res I play at, what framerate I play at, and what I actually play, be it old or new. I just want optimized settings, and that's it. that way I can push more resolution as I like, or get more FPS as I like.

I can clearly see how transformative supersampling is. I can also clearly see how transformative it is to play at 80-120 fps instead of a locked 60 fps target. but even with a relatively upper-midrange to lower-high-end GPU such as the RTX 3070, I need optimal, optimized settings to achieve the weird requests I have XD

take Kena: Bridge of Spirits, for example. I happily used Alex's optimized settings, which included some low/med settings. before them, I got 50-60 fps at 1800p ultra. cool, but it could be better. high preset? 60-65. still nice. the optimized settings Alex thinks are equal to the consoles? 75-85 frames. it's simply smoother, performance is more consistent, and it still looks the same. I just want more pixels to throw at TAA so that I can see the game clearly, as it's meant to be seen. I cannot stand blurry baseline images anymore. but that's me; some other person will probably scorn that I use that performance headroom for resolution. well, I can clearly see how transformative supersampling is for blurry TAA. TAA is not meant for native resolutions; it is meant to be used alongside a supersampling component. yet devs got away with using it at native resolutions and managed to make people accept the current native TAA image quality as the norm.

I enjoy console-like optimized settings. I don't have the resources and time to compare and contrast each setting across tons of contexts. these guys seem to know a thing or two, so I respect them and use their settings. I've been a happy user ever since... some settings waste resources on meaningless, minimal stuff.
 

RoadHazard

Gold Member
why not simply give a console preset for PC games then? they already did the obvious work of tweaking settings for optimal perf/graphics. I'm getting tired of following Hardware Unboxed/DF videos to get optimized settings. even then, they still use aggressively high settings so that they don't anger the PCMR fellows... I just want a console preset... be it low, med or high, or a mix of them. whenever I say this, the occasional PCMR fellow jumps in and says "why not buy a console then!". well, I pick PC to freely choose what res I play at, what framerate I play at, and what I actually play, be it old or new. I just want optimized settings, and that's it. that way I can push more resolution as I like, or get more FPS as I like.

I can clearly see how transformative supersampling is. I can also clearly see how transformative it is to play at 80-120 fps instead of a locked 60 fps target. but even with a relatively upper-midrange to lower-high-end GPU such as the RTX 3070, I need optimal, optimized settings to achieve the weird requests I have XD

take Kena: Bridge of Spirits, for example. I happily used Alex's optimized settings, which included some low/med settings. before them, I got 50-60 fps at 1800p ultra. cool, but it could be better. high preset? 60-65. still nice. the optimized settings Alex thinks are equal to the consoles? 75-85 frames. it's simply smoother, performance is more consistent, and it still looks the same. I just want more pixels to throw at TAA so that I can see the game clearly, as it's meant to be seen. I cannot stand blurry baseline images anymore. but that's me; some other person will probably scorn that I use that performance headroom for resolution. well, I can clearly see how transformative supersampling is for blurry TAA. TAA is not meant for native resolutions; it is meant to be used alongside a supersampling component. yet devs got away with using it at native resolutions and managed to make people accept the current native TAA image quality as the norm.

I enjoy console-like optimized settings. I don't have the resources and time to compare and contrast each setting across tons of contexts. these guys seem to know a thing or two, so I respect them and use their settings. I've been a happy user ever since... some settings waste resources on meaningless, minimal stuff.

Because there are thousands of different possible PC configurations, with different strengths and weaknesses. It's completely different.

I mean, PC games DO usually have low/medium/high/ultra presets. But those are more like starting points that you can then tweak to suit your particular setup.
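The "preset as a starting point" model is essentially defaults plus per-setting overrides. A minimal sketch — the setting names and tier values are invented for illustration:

```python
PRESETS = {
    "low":    {"shadows": 0, "textures": 0, "draw_distance": 0},
    "medium": {"shadows": 1, "textures": 1, "draw_distance": 1},
    "high":   {"shadows": 2, "textures": 2, "draw_distance": 2},
    "ultra":  {"shadows": 3, "textures": 3, "draw_distance": 3},
}

def resolve_settings(preset: str, overrides: dict) -> dict:
    """Start from the chosen preset, then layer user tweaks on top."""
    settings = dict(PRESETS[preset])  # copy, so the preset stays pristine
    settings.update(overrides)
    return settings

# "high" everywhere, but shadows dialed down to claw back framerate:
print(resolve_settings("high", {"shadows": 0}))
```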
 

Dream-Knife

Banned
I am not attacking. I understand.
That's the reason why I got a 48" screen for a monitor. 4k is easily beneficial now and I can see clearly :p my eyes are crap too
at 27", 4k looked good but really wasn't showing its nature
Why not just go UW? I'd rather have that than a huge 4K TV 2 feet from my face.
 

Panajev2001a

GAF's Pleasant Genius
Because there are thousands of different possible PC configurations, with different strengths and weaknesses. It's completely different.

I mean, PC games DO usually have low/medium/high/ultra presets. But those are more like starting points that you can then tweak to suit your particular setup.
Which has its pros and cons… you can upgrade the HW and push the settings further up; on the other hand, it's a way for the devs to cope with the variety of PC configurations and crowdsource (i.e. make your customers do) part of the optimisation pass.
 

buenoblue

Member
I mean, they are. Pretty much every game I play on PS5 has performance/quality toggles, plus motion blur adjustment and the like.

A guy at my work just got a PS5 and played through most of Ratchet. I asked if he was impressed, and he said the game was jerky on his TV. He hadn't even looked at the options and was playing in the 4K/30 mode. After I mentioned the option to change it, he flipped it to the 60fps mode and was like, wow, it's so smooth.

I have another friend who plays every PC game at max settings in 4K on a 1080 Ti and just suffers the constant frame dips.

Most people don't actually care about settings.
 

Kev Kev

Member
I just don’t care that much about graphics, which is why the bit of extra effort it takes to build a PC is unappealing to me.

The console is fine, the graphics are fine… everything is fine.
 

winjer

Gold Member
I have another friend who plays every PC game at max settings in 4K on a 1080 Ti and just suffers the constant frame dips.

Most people don't actually care about settings.

That's just dumb. Even if he doesn't care or know how to choose settings, he could at least use GeForce Experience to set up his games.
 
There are some terrible takes in this thread. PCs will always offer more options because developers have to keep in mind a minimum settings baseline and a recommended one, with everything above that being optional depending on what parts you have in your machine. You can't restrict things to only the high-end crowd unless you want fewer sales and you "want to push the limits of consumer-level hardware", so the more options, the better. The wide range of hardware makes more options a necessity on PC because you never know what the consumer will have in their rig.

The same can't be said about consoles; they are closed platforms where the developers can focus all their resources on optimizing their product to the max of their capabilities. That said, I agree that the current consoles are basically mid-range PCs, and not everyone has made the jump to 4K gaming or 60+fps yet, which makes it viable to give people more options based on their preferences and what equipment they have (not everyone has a 4K TV with Dolby 5.1/7.1/Atmos). Hell, some gamers don't even care about going above 1080p yet, and some would sacrifice resolution in favor of performance, which again validates the decision to give gamers more options.

Like it or not, the more advanced consoles get, the more options they will offer, and I more than welcome that thinking for the console crowd. I can't believe only now are some games making it industry-standard to be able to disable options like chromatic aberration (I hated it in Bloodborne) and motion blur, or adjust shadow quality, AA, FoV, separate camera options for the X and Y axes, etc.

TL;DR: More options are good, even for consoles, which are, generally, closed platforms.
 

MiguelItUp

Member
Because consoles were always meant to be a more streamlined and simplistic experience. Plug in and play, with no reason to toggle anything.
 
I only want 1 setting on every console game. VSYNC=ON

I'm so tired of console games tearing.

They could hide it in an advanced menu or something.
 

Rest

All these years later I still chuckle at what a fucking moron that guy is.
As long as I can turn off ugly filters like bloom, motion blur and film grain, that's all I need. All that other stuff is meaningless bullshit.
 

Brofist

Member
That's the worst idea I've heard in a long time.

Consoles have been the superior gaming experience for a long time, specifically (among other aspects) because you don't have or need "options": you're getting the best, most optimised version of the game possible on that specific console (unless it's CP2077) thanks to it being standard. It's playable in its best state out of the box (or download).

In fact, it's a huge step back that PS4/PS5 have different options, which is a testimony to how many unoptimised compromises their games have to target, and an indicator that you're not playing the best version of the game in any case.

If anything, it's PC that should have fewer options; in fact, that's exactly what the best PC games or ports have. The better optimised and smoother they are, the fewer options they have, because the devs did their work. I've often said that the fact that CP2077 has so many options on PC is a testimony to what a crap mess it is.
I don't know how I missed this the first time reading through the topic, but this is easily one of the dumbest things I've ever read on these forums, and you should be ashamed.
 

ethomaz

Banned
And here I’m asking for fewer graphical options 🤷‍♂️

Console experience should be direct, the same for everybody… a single profile without options to change anything.

For a PC experience, you have, well, a PC.
Console should stay console.
 

base

Banned
I know PC games are always going to have more options than console games; that's a given, considering the many possible hardware combinations. Console graphics settings are often limited to simple toggles like "film grain" on or off.

However, why not let them have more? On PC, if you have a 10-year-old card you can still play most new games at 60fps by taking a massive hit in graphical settings and resolution. Consoles now tend to offer a bit of a choice between 30 and 60, but that will probably stop once the cross-gen games disappear. Why not let a console user adjust the resolution and at least a preset graphics detail setting so they can hit that 60fps?

Is there a certification test from Sony etc. that prevents them from putting options like 30fps or 60fps in each game, in case a user breaks the game by accident?

I mean, if there's a "revert to default settings" button clearly placed on the screen, then it should be fine, right?
Because they need to be tested first. These days beta testers work for free; in the past, devs would hire people to do that work and pay them for it.

Unlocking graphics options isn't that simple. Unlocking the framerate in a game can cause unexpected issues.
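A concrete example of why simply unlocking the framerate can break things: older engines advance gameplay per rendered frame, so a higher fps literally speeds the game up. The contrast below is a toy illustration of that bug versus a time-based update (the numbers are arbitrary):

```python
def frame_locked_distance(frames: int, units_per_frame: float = 0.5) -> float:
    """Buggy pattern: movement is per frame, so doubling fps doubles speed."""
    return frames * units_per_frame

def time_based_distance(seconds: float, units_per_second: float = 30.0) -> float:
    """Safe pattern: movement is per second, independent of framerate."""
    return seconds * units_per_second

# One second of gameplay:
print(frame_locked_distance(60))    # 30.0 at 60 fps
print(frame_locked_distance(120))   # 60.0 at 120 fps: twice as fast
print(time_based_distance(1.0))     # 30.0 at any framerate
```

This is why raising a console game's cap needs a real test pass: physics, animations, and even loading logic may be tied to the frame counter.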
 