
new 4K monitors @ CES: Will it become the next PC standard?

No, using a custom res on my 30" of 3840x2400 for some games - I'd like a screen doing it natively.

Why would you run a game at a resolution that doesn't "fit" the pixel matrix of the monitor? It makes no sense. The thing can't even display those extra pixels. Can it?
 

emag

Member
I would need to see a 4k monitor next to a traditional 1080p one just to judge it. I'd love to see the actual difference that is discernable.

It's pretty much the difference between a retina iPad and a non-retina iPad (with a 20" screen).

We need to go to 8K for larger monitors and 4K for smaller ones.
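
Pixel density is just the diagonal resolution divided by the diagonal size, so the same resolution looks far sharper on a smaller panel. A quick back-of-the-envelope check (plain Python; the sizes are illustrative examples, not specific products):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a given resolution and panel diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

# Illustrative sizes only, not specific products:
print(f'20" 1080p:  {ppi(1920, 1080, 20):.0f} ppi')  # ~110 ppi
print(f'20" 4K UHD: {ppi(3840, 2160, 20):.0f} ppi')  # ~220 ppi
print(f'32" 4K UHD: {ppi(3840, 2160, 32):.0f} ppi')  # ~138 ppi
```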
 

Zaptruder

Banned
I say keep pushing that resolution envelope.

It's needed for the end game - full field of vision, full frame rate, 'retina' VR/AR displays.

That's maybe... 15 years out?
 
AA is way more efficient than brute force pixel increases.
No, actually, it isn't. With many developers now switching to deferred rendering, anti-aliasing often requires custom work just to get working, and comes with large performance trade-offs.

Downsampling gets around these problems entirely. You just raise the rendering resolution and downsample to the display resolution, no extra work required. It has the added benefit of increasing the clarity of the image as well.
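
For anyone wondering what that looks like mechanically, here's a minimal sketch in plain Python/NumPy of a box-filter downsample, assuming an integer scale factor; real drivers and engines use fancier resolve filters, this is just the idea:

```python
import numpy as np

def downsample(image, factor):
    """Box-filter an oversampled image of shape (H*f, W*f, C) down to (H, W, C)."""
    h, w, c = image.shape
    assert h % factor == 0 and w % factor == 0
    # Group each factor x factor block of samples and average it to one output pixel.
    blocks = image.reshape(h // factor, factor, w // factor, factor, c)
    return blocks.mean(axis=(1, 3))

# e.g. render internally at 3840x2160, display at 1920x1080 (2x2 samples per pixel).
internal = np.random.rand(2160, 3840, 3)   # stand-in for the rendered frame
frame = downsample(internal, 2)            # 1080x1920x3 output
```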
 

DarkoMaledictus

Tier Whore
Worthless... unless you upgrade your PC hardware, you are not going to run over 1080p for very long. It takes massive hardware to get games running at those resolutions... after not even a year you'll have to decrease it if you want to play the newer games, unless you get another 500+ graphics card...
 

Rolf NB

Member
No, actually, it isn't. With many developers now switching to deferred rendering, anti-aliasing often requires custom work just to get working, and comes with large performance trade-offs.

Downsampling gets around these problems entirely. You just raise the rendering resolution and downsample to the display resolution, no extra work required. It has the added benefit of increasing the clarity of the image as well.
Downsampling uses ordered grids. Hardware-supported AA can use rotated and otherwise sparse grids, and gives you bigger quality gains per sample.
Resolving deferred rendering buffers on multiple samples per pixel is completely trivial, and not in any way slower than doing it on a brute-force oversampled buffer. Actually, the same rule applies, in that you have fewer total samples in your buffer, and your resolve pass will be faster accordingly.

There may be some backwater engines that just can't do this properly, and of course it's a valid strategy to use brute force to compensate for engineering incompetence. But you should always check first to see if it works the proper way. You throw way less computing resources out the window this way.
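
To illustrate the grid point with 4 samples per pixel: an ordered grid (what plain downsampling effectively gives you) lines the samples up in two columns and two rows, while a rotated grid spreads the same four samples so that every sample has a distinct x and y, covering near-vertical and near-horizontal edges much better per sample. A toy comparison in Python (the rotated offsets are the commonly cited RGSS-style pattern, not any particular GPU's actual layout):

```python
# Sample offsets within one pixel, in [0, 1) pixel units.

# Ordered grid (what a 2x2 downsample effectively uses):
# all samples line up in 2 columns and 2 rows.
ogss_2x2 = [(0.25, 0.25), (0.75, 0.25),
            (0.25, 0.75), (0.75, 0.75)]

# Rotated grid (RGSS-style 4x pattern): the same 4 samples, but every
# sample has a distinct x and a distinct y coordinate, so a near-vertical
# edge crosses 4 different columns instead of 2.
rgss_4x = [(0.125, 0.625), (0.375, 0.125),
           (0.625, 0.875), (0.875, 0.375)]

print("distinct x columns, ordered:", len({x for x, _ in ogss_2x2}))  # 2
print("distinct x columns, rotated:", len({x for x, _ in rgss_4x}))   # 4
```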
 

Paertan

Member
Finally some high resolutions! But I don't want a 30 inch display at my desk =/
24 inch 16:10 with a high-res IPS would rock. 1080p is enough for me for gaming, and I do that on my TV. But for everything else I want higher res.
 
I always welcome more resolution, but I think the time is right to increase pixel density, not just screen real estate. A 4K 22" monitor sounds more appealing to me than a 32" monster sitting on my desk. At that point I might as well just use my TV.
 
Diminishing returns... I feel like these jumps are going to matter less and less, and horsepower will be (and should be) used on better, cooler-looking things than hairline aliasing.
 

whitehawk

Banned
No, actually, it isn't. With many developers now switching to deferred rendering, anti-aliasing often requires custom work just to get working, and comes with large performance trade-offs.

Downsampling gets around these problems entirely. You just raise the rendering resolution and downsample to the display resolution, no extra work required. It has the added benefit of increasing the clarity of the image as well.
Is this true? It would make sense if so. When I run games on the Dolphin emulator, I can run games at 1080p and beyond fine for the most part. But as soon as I add even 2x AA, the framerate takes a huge hit.
 
No, actually, it isn't. With many developers now switching to deferred rendering, anti-aliasing often requires custom work just to get working, and comes with large performance trade-offs.

Downsampling gets around these problems entirely. You just raise the rendering resolution and downsample to the display resolution, no extra work required. It has the added benefit of increasing the clarity of the image as well.

Most developers are moving back to forward rendering solutions. Just take a look at any SIGGRAPH paper from 2013; most deal with lighting solutions for forward rendering. Deferred rendering has too many weaknesses, especially with alpha, bandwidth, shadows, and AA issues. Not to mention it's much harder to implement SSC and such.

Downsampling is extremely inefficient and produces artifacts.

Here's an example of tile-based forward rendering providing comparable results to deferred rendering without the drawbacks of the alpha, bandwidth and AA issues:
http://www.cse.chalmers.se/~olaolss...s=publication&id=tiled_clustered_forward_talk

http://www.youtube.com/watch?v=6DyTk7917ZI&feature=youtu.be (so many light sources)
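
Very roughly, the tiled forward approach in those links works like this: split the screen into small tiles, build a per-tile list of the lights that can affect it, then shade each pixel with an ordinary forward shader that only loops over its tile's list. A simplified CPU-side sketch of the binning step in Python (real implementations do this in a compute shader and project actual light volumes; screen-space circles are a stand-in here):

```python
TILE = 16  # tile size in pixels

def bin_lights(lights, width, height):
    """lights: list of (screen_x, screen_y, screen_radius) tuples."""
    tiles_x = (width + TILE - 1) // TILE
    tiles_y = (height + TILE - 1) // TILE
    bins = [[[] for _ in range(tiles_x)] for _ in range(tiles_y)]
    for idx, (lx, ly, r) in enumerate(lights):
        # Conservative range of tiles touched by this light's screen-space circle.
        x0 = max(int((lx - r) // TILE), 0)
        x1 = min(int((lx + r) // TILE), tiles_x - 1)
        y0 = max(int((ly - r) // TILE), 0)
        y1 = min(int((ly + r) // TILE), tiles_y - 1)
        for ty in range(y0, y1 + 1):
            for tx in range(x0, x1 + 1):
                bins[ty][tx].append(idx)
    return bins

# Three example lights on a 1920x1080 target.
bins = bin_lights([(100, 100, 50), (960, 540, 300), (1800, 1000, 80)], 1920, 1080)
print(bins[540 // TILE][960 // TILE])  # [1] -- only the big centre light reaches this tile
```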
 
It won't be standard for a while. Needs market penetration, price reductions, etc.

It will be one day, though, since it looks like that's the road TVs will go down too.
 
Finally some high resolutions! But I don't want a 30 inch display at my desk =/
24 inch 16:10 with a high-res IPS would rock. 1080p is enough for me for gaming, and I do that on my TV. But for everything else I want higher res.

You know, a 30 inch 16:10 monitor takes maybe two extra inches on your computer desk, just saying.

I got Jr'ed for a thread about this a few months ago. Is it OK to talk about 4K now?


It's OK to talk about it. Just never you. You didn't know what you were talking about.
 
Is this true? It would make sense if so. When I run games on the Dolphin emulator, I can run games at 1080p and beyond fine for the most part. But as soon as I add even 2x AA, the framerate takes a huge hit.

Yeah, it doesn't really make any sense, especially considering that there are way too many types of AA with a wide range of performance hits; something like SSAA (super sampling) will absolutely kill the frame rate more than anything.
 

Jinko

Member
You know, 4K is very misleading. Why did they switch the dot count from vertical to horizontal?

Surely these sets should be called 2160p?

4K is a cinema format anyway, which is 21:9 (4096 × 1714).
 
The graphics card required to push 4K is magnitudes higher than 1080p. So right now? No. A few years? Probably.

A magnitude is x10.

3840x2160 vs 1920x1080 is 4 times more pixels to push, so any card capable of doing Eyefinity with 3 monitors can do it.
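
Spelled out with quick Python arithmetic (nothing engine-specific):

```python
fhd = 1920 * 1080            # 2,073,600 pixels
uhd = 3840 * 2160            # 8,294,400 pixels
eyefinity_3x1080p = 3 * fhd  # 6,220,800 pixels across three 1080p panels

print(uhd / fhd)               # 4.0   -> four times the pixels, not an order of magnitude
print(uhd / eyefinity_3x1080p) # ~1.33 -> only a third more than a triple-1080p Eyefinity setup
```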
 

xenist

Member
I'm already playing maxed at above 1080p. 1080p looks merely passable to me now. 720p looks like a 3DS screen. I'm ready for 4k.
 

KKRT00

Member
Most developers are moving back to forward rendering solutions.

"Most developers", and then you paste one paper :) I actually haven't heard about any dev that is moving away from deferred renderers; I would rather say it's the other way around.

Maybe in the long run it will be the better way to render, but for now it's not, and it will probably take 3-4 years before it shows up properly in actual development.


PS: Downsampling also eliminates shader aliasing, increases anisotropic filtering on textures and works better on subpixel aliasing than proper MSAA, so it has other benefits too.
 

Durante

Member
PS: Downsampling also eliminates shader aliasing, increases anisotropic filtering on textures and works better on subpixel aliasing than proper MSAA, so it has other benefits too.
But all of these are also true for SGSSAA, and at the same sample count that will always be much better at reducing aliasing than downsampling (which is equivalent to an ordered grid sampling pattern).
 

KKRT00

Member
But all of these are also true for SGSSAA, and at the same sample count that will always be much better at reducing aliasing than downsampling (which is equivalent to an ordered grid sampling pattern).

Yeah, but SGSSAA is limited to Nvidia cards and doesn't work in all games :) and from my experience OGSSAA gave me better image quality than SGSSAA, but it could be just my perception :)
 

Midou

Member
By the time a pair of graphics cards that could run a game that benefits from it is cheap, these monitors will also be cheap.

I'm just excited because 1080p seems to be standard in too many monitors now. Need a new standard resolution, even if it takes another 5+ years to be standard.
 

Somnid

Member
Until we get to a nice retina resolution, keep it coming. I'm glad the bar is moving again; we've been stuck at 2560x1600 forever. Really though, I'd like to see more 1080p+ monitors running at 120 Hz for 3D and extra smoothness. I do like that Sharp is integrating touch; all monitors moving forward need it, whether or not it's immediately appropriate for current software/setups. Unfortunately I feel like these things coming out of CES are going to be too expensive. They need to be sub-$1000. We especially need them to develop software for high-res devices. High-res monitors are already expensive enough, which is sad because tablets that cost less than a monitor have started outpacing them in resolution. Same with laptops, and tablets at half the price; it's just depressing.
 
Downsampling uses ordered grids. Hardware-supported AA can use rotated and otherwise sparse grids, and gives you bigger quality gains per sample.
Resolving deferred rendering buffers on multiple samples per pixel is completely trivial, and not in any way slower than doing it on a brute-force oversampled buffer. Actually, the same rule applies, in that you have fewer total samples in your buffer, and your resolve pass will be faster accordingly.

There may be some backwater engines that just can't do this properly, and of course it's a valid strategy to use brute force to compensate for engineering incompetence. But you should always check first to see if it works the proper way. You throw way less computing resources out the window this way.
This is the ideal scenario.

Unfortunately, many developers don't even bother making the attempt. For a recent example, neither Sleeping Dogs nor Guild Wars 2 supports any multisampling whatsoever (the only options in either are post-AA, OGSSAA, or both combined). Far Cry 3 has abysmal performance with multisampling as well. I don't know what StarCraft 2 has now, but it also launched with no AA support whatsoever.

I don't know why this is the case, but the only alternative is pretty much to use post-AA or some form of supersampling. The performance may be worse than proper multisampling, but there ARE quality gains as well, so as far as I'm concerned the trade-off is worth it.
 

Izayoi

Banned
It will go well with the 780! Finally some progress in monitors. I might end up waiting on 4K 120 Hz, though.
 

onQ123

Member
You know, a 30 inch 16:10 monitor takes maybe two extra inches on your computer desk, just saying.




It's OK to talk about it. Just never you. You didn't know what you were talking about.

I knew exactly what I was talking about; it was the people posting in my threads who didn't know or understand what I was talking about.
 

1-D_FTW

Member
Asus showed their 144 Hz 24 inch monitor (3D Vision LightBoost compatible). But that's only going to appeal to people who sit in a fixed spot and like their gaming to have awesome motion (i.e. TN).

I came into CES expecting to hear about TVs incorporating the 6 dollar 300 MHz HDMI chip that Silicon Image has available now. But I guess that was too fucking much to ask. TVs are designed by 60 year old engineers hoping to appeal to 50 year olds, and being game-friendly with either better resolution support or lower input lag is apparently asking for the moon.
 

xemumanic

Member
I'd pay $1,500 for the 32" Sharp 4K I saw. I paid $1,024 for a 37" 1080p Westinghouse in 2006. Something like that Sharp would be a nice replacement.
 

Durante

Member
I came into CES expecting to hear about TVs incorporating the 6 dollar 300 MHz HDMI chip that Silicon Image has available now. But I guess that was too fucking much to ask. TVs are designed by 60 year old engineers hoping to appeal to 50 year olds, and being game-friendly with either better resolution support or lower input lag is apparently asking for the moon.
I'd rather see them in a projector. Knowing that $1k 1080p DLP projectors are $5 away from 120 Hz and you can do nothing about it is ANNOYING AS HELL. See, it makes me use caps and italics in the same sentence!
 