
PC GPU shipments drop 35% Year over Year

DaGwaphics

Member
I debunked this shit at least 5 times on this forum and at this point I'm quite tired of it. THOSE GPUs only have a tiny 2 GB VRAM buffer that causes huge performance drops even at the lowest settings. It has nothing to do with "MuH cONSOlE oPTimizATiON"

You've debunked nothing, because there is nothing to debunk. LOL

Just go and grab a PC with 512MB of total RAM that can hold the settings, resolution and framerate of a PS360 game. You can't find one, LOL, because it never existed. Console optimization is a thing; to deny that is just being blind. Developers can streamline a lot more when they know they have X CPU threads at X speed with X amount of cache, and the same goes for the GPU. It is downright foolish to deny that.

Here is the 4GB 760 struggling as well (and this card could match the HD7950 that was around 1.5x as powerful as the PS4):


Edit: Okay this video is actually a 770, that's even worse. LOL


And if you want to see the HD 7950 falling below 30 at original settings:


This generation of consoles will get more out of their GPU arch than the comparable RDNA2 PC parts. It has happened with every console from the first one, it will happen again.
 

yamaci17

Member
You've debunked nothing, because there is nothing to debunk. LOL

Just go and grab a PC with 512MB of total RAM that can hold the settings, resolution and framerate of a PS360 game. You can't find one, LOL, because it never existed. Console optimization is a thing; to deny that is just being blind. Developers can streamline a lot more when they know they have X CPU threads at X speed with X amount of cache, and the same goes for the GPU. It is downright foolish to deny that.

reason you can't do that is that YOU CANNOT even scale those games down to ps360 levels in most cases, because at that point they are already running at the bare minimum. you cannot scale GTA 5 on PC down to the graphics level of the X360/PS3. by default, at baseline, even at the lowest settings, GTA 5 on PC provides better graphics than the PS3.

gtx 1060 has been pushing 1.8x to 2.5x the PS4's performance, as it is supposed to (depending on how reliant the game is on async compute), for 7 years. it never has changed, never will change. if it had async capabilities similar to the PS4's, it would forever push 2x+ performance over it. there are still games in 2022 where the GTX 1060 pushes 2x the PS4's performance, the latest example being marvel's spiderman. but denial is strong, keep on with it. even the gtx 970 pushes nearly 1.8x the PS4's performance, as it should, despite bordering on 9 years old at this point

these gpus have enough buffer. if they didn't, you would see them falter compared to the PS4. but of course that does not fit your "mUh ConsOLe OptiMizATion" narrative.

times have changed. it's been 9 years and the gtx 970 still pushes nearly 1.5x-2x the PS4's performance like it did day one. the only reason there are slight 10% to 40% performance deviations is NVIDIA's gimped async implementation. and now that that is fixed with Turing and Ampere, it has been 5 YEARS and the RTX 2070 Super still offers near-matched performance compared to a PS5 despite being superseded by 2 different architectures. the only thing that will change that fact is when the 2070 runs out of memory buffer. (and vultures like you will prey on such instances, despite the 3060 proving otherwise. the 2070 ran out of memory buffer in spiderman with ray tracing at 4K and dropped to a 20 fps avg whereas the 3060 keeps on with a 40 FPS avg. but that didn't stop console vultures such as yourself from pinning it on how the ps5 is magical and how it now beats the 2070, despite the 3060 being the literal equal of the 2070 bar the memory buffer)



the nearly 10 year old gtx 970 pushing 1.7x-2x the PS4's performance in a game that was ported to PC in 2022 (gtx 970: 3.9 tflops, ps4: 1.8 tflops).
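to put rough numbers on that (quick python sketch; the TFLOPS figures are the ones quoted above, the fps values are just placeholders for what you'd read off a video like that):

```python
# Rough sketch: paper TFLOPS ratio vs. an observed fps ratio.
gtx_970_tflops = 3.9
ps4_tflops = 1.8

raw_ratio = gtx_970_tflops / ps4_tflops
print(f"paper TFLOPS ratio: {raw_ratio:.2f}x")       # ~2.17x on paper

# Placeholder fps values, NOT from a real benchmark run:
observed_970_fps, observed_ps4_fps = 51, 30
observed_ratio = observed_970_fps / observed_ps4_fps
print(f"observed fps ratio: {observed_ratio:.2f}x")  # ~1.7x in practice

# The gap between the two is the 20-40% "console optimization" headroom
# being argued about in this thread.
print(f"efficiency gap: {(1 - observed_ratio / raw_ratio) * 100:.0f}%")
```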

end of the story.

and yes, I debunked it. look at how the gtx 960 and 3070 TANK compared to their HIGHER MEMORY BUFFER counterparts.

you LITERALLY can't run games such as GOD OF WAR on a tiny 2 GB buffer without it TANKING the entire performance. YOU can't. these games are designed and optimized for a 4 GB memory buffer as a BASELINE. turning settings down WON'T help, as developers do not optimize / tune their games for the tiny 2 GB buffer, since 2 GB CARD owners are a niche userbase that no dev would care about at this point. 4 GB of VRAM has been the baseline on PC since 2014 with the GTX 970.

there exists no 4 GB hd 7850 or 7870, so you will keep abusing that specific point and staying in denial, claiming I couldn't refute anything.
 
If anything, this shows that PC gamers for the most part have been ready for next gen for at least 3 to 4 years.
also
Primary Display Resolution
1920 x 1080 64.83%
2560 x 1440 11.06%
Multi-Monitor Desktop Resolution
3840 x 1080 66.14%
4480 x 1440 19.92%
I am not 100% sure what this is supposed to mean, but I assume the first (and often only) monitor is regular FHD, while those that have multi-monitor setups mostly have "only" two FHD displays, i.e. still way less than 4K (and I doubt multi-monitor is even used for the majority of games).
So they practically have around the same power, but don't "waste" it on resolution.
 

ACESHIGH

Banned
You've debunked nothing, because there is nothing to debunk. LOL

Just go and grab a PC with 512MB of total RAM that can hold the settings, resolution and framerate of a PS360 game. You can't find one, LOL, because it never existed. Console optimization is a thing; to deny that is just being blind. Developers can streamline a lot more when they know they have X CPU threads at X speed with X amount of cache, and the same goes for the GPU. It is downright foolish to deny that.

Console optimization is a thing. But it's way overhyped by console players. The PS5 GPU keeps evolving in these comparisons for some reason. First it was equal to a 2070, then a 2080, then a 3070, and so on and so forth.

Kepler and Southern Islands GPUs were not meant to be next gen; those were released around late 2011/early 2012. They were basically meant to max out 7th gen console ports at 1080p, or even downsample if you had room to spare.

Consoles always have future-proof specs, tech-wise, that enable them to last longer vs. older and more powerful GPUs.

Kepler in particular was gimped by the 2 GB VRAM buffer you had in most models. I had a GTX 760 and could run most games at console settings and frame rates, lowering texture settings in some games starting with Batman: Arkham Knight. Had issues with Vulkan and compute-heavy games like the id Tech 6 ones; the consoles were way better there. And then there were some shoddy ports like NieR: Automata.
 

yamaci17

Member
here's additional proof of how much a 2 GB BUFFER can hurt a GCN card's performance in games that are HEAVILY VRAM bound (like god of war. I can't find a god of war test; quite frankly, not many people are left with these cards. but of course, some people will cling to them)



cod warzone, from a 50 fps avg. to a 33 fps avg., with extreme stutters and stalls (like his 'it ain't pretty' god of war videos)

metro exodus, from 50+ fps avg. to a stuttery 24 fps avg. this is no joke. running out of VRAM has serious ramifications.

division 2 from 55 fps avg. to 27 fps avg.
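the same drops expressed as percentages (quick sketch using the averages quoted above):

```python
# average fps before/after the 2 GB card overflows its VRAM, as quoted in this post
results = {
    "cod warzone":  (50, 33),
    "metro exodus": (50, 24),
    "division 2":   (55, 27),
}

for game, (ok_fps, starved_fps) in results.items():
    drop = (1 - starved_fps / ok_fps) * 100
    print(f"{game}: {ok_fps} -> {starved_fps} fps ({drop:.0f}% drop)")
```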

the ps4, at its best case, would get a 25-40% avg. performance advantage over these GPUs, because ASYNC sucks on PC, if the game is heavy on async calculations.

god of war most likely abuses the hell out of async on PS4, and naturally like-for-like cards will perform 30-35% worse compared to it. the PC port is a dx11 title



the rx 560 4 gig, a gcn4 card (gcn4 only has minuscule 4-5% ipc gains over gcn1-3, which the ps4 has most features of), averages around 1080p/30-35 fps at PS4 settings. it's a 2.6 tflops gcn4 card, but lacking async optimizations, it is very normal for a 2.6 tflops PC gcn gpu to only match the PS4's 1.8 tflops gcn gpu (it is also being fed by only 110 gb/s of bandwidth, whereas the PS4 has access to 176 gb/s of total bandwidth with less overhead. afaik only 25 gb/s is used for CPU operations, so that gives the GPU a whopping ~150 gb/s)
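putting the compute and bandwidth gaps side by side (rough sketch; the figures are the ones quoted in this post, not spec-sheet values):

```python
# Compute and bandwidth comparison between the RX 560 4GB and the PS4,
# using the numbers quoted above.
rx560_tflops, rx560_bw = 2.6, 110   # TFLOPS, GB/s
ps4_tflops = 1.8
ps4_gpu_bw = 176 - 25               # ~150 GB/s left for the GPU after CPU traffic

print(f"compute ratio:   {rx560_tflops / ps4_tflops:.2f}x in the rx 560's favour")
print(f"bandwidth ratio: {ps4_gpu_bw / rx560_bw:.2f}x in the ps4's favour")
```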



and you can see here that in games where the VRAM is enough, the 7870 OUTPERFORMS the rx 560, as long as the game does not OVERFLOW its VRAM

so you see, the rx 560 OUTPERFORMS the hd 7870 in god of war (hd 7870: 720p, low settings, in shambles, stuttery; rx 560: 1080p, rock solid 30 at medium PS4 settings) BECAUSE OF THE VRAM BUFFER.

if a hypothetical 4 GB HD 7870 existed, it would outperform the PS4 or at least match it, rather than being in shambles compared to it. the only reason these cards fall apart is... REPEAT after me... RUNNING OUT OF VRAM BUFFER.
 

DaGwaphics

Member
@yamaci17 Spiderman at medium on PC is a lot uglier than the PS4 release.

Let's try that with God of War with the original PS4 settings:



ehh, not much of an uplift for such a considerably more powerful GPU (nvidia cards were considerably more powerful per TF at this point - the 970 could beat the 290x, a 5.5TF part).

Here is that 290x


It could do better than the PS4 at original settings, but not more than 2x better.

I never argued that more powerful PC GPUs could not be released years after the console. I simply said that the consoles will outpunch their PC equivalents, making it hard to make spec-based comparisons.
 

ACESHIGH

Banned
The god of war PC port was/is notoriously unoptimized. Days Gone runs much better. The devs couldn't care less and went with DX11 for a PC port in 2022, to give you an idea...

Also it was a tremendous CPU hog. One of the few games I could not run at console frame rates with a stock FX 6300.

But then you have the hacks at digital foundry calling it superb because it ran fine on a 6c 12t CPU with a 2080...
 

yamaci17

Member
@yamaci17 Spiderman at medium on PC is a lot uglier than the PS4 release.

Let's try that with God of War with the original PS4 settings:



ehh, not much of an uplift for such a considerably more powerful GPU (nvidia cards were considerably more powerful per TF at this point - the 970 could beat the 290x, a 5.5TF part).

oh, the narrative changed now, hasn't it? now it has to perform like for like. no, I never said that. I SPECIFICALLY said beforehand (I knew that you would use this as a counterattack, yet you still do, proving that you do not even read what I said properly) that due to the async implementations on consoles, consoles will always have a 20% to 40% performance advantage. I never denied this. this does not make the game unplayable on a GTX 970, as it provides a much smoother and better experience than the PS4 despite being 9 years old. a 9 year old GPU being 30-40% behind is acceptable. what you gave as an example is a PS4 equivalent GPU being literally 2-2.5 TIMES slower than the PS4 (720p low, barely 30 FPS due to VRAM overflow. this is what I'm DISCUSSING.). if you're going to move the goalposts, I won't bother you any further, and good luck on your own.

"Spiderman at medium on PC is a lot uglier than the PS4 release. "

You're free to disprove it with actual solid evidence. I do not care. Considering how horizon zero dawn and god of war pin ps4 settings at medium, I have no reason to believe spiderman is any different. (you're free to prove otherwise)

the gtx 970 did beat the 290x "5.5tf part" because the 290x was not an actual 5.5 TF card. it never performed to its potential, whereas the gtx 970 performed to its maximum potential.

most of the polaris cards also had gimmicky TFLOPS, like the RX 580 and Vega 56/64. the Vega 64 on paper is a 12.6 TFLOPS behemoth; in reality it battles an 8.8 TFLOPS gtx 1080 instead.

as I said, your narrative was that a PS4 equivalent GPU was like 2x slower than the PS4. if you're going to change that, I'm out.

720p to 1080p often means a 2x difference in performance overhead. if a ps4 equivalent GPU only gets ps4-like performance at 720p instead of 1080p, WITH lower settings, then you're claiming it performs nearly 2-2.5 times worse than the PS4.
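to put a number on that 720p-to-1080p overhead (simple pixel-count sketch):

```python
# 720p -> 1080p is roughly a 2x jump simply because of the pixel count.
pixels_720p = 1280 * 720      # 921,600 pixels
pixels_1080p = 1920 * 1080    # 2,073,600 pixels

print(f"1080p pushes {pixels_1080p / pixels_720p:.2f}x the pixels of 720p")  # 2.25x
```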

the gtx 970/1060 and the like prove otherwise. if that were the case, the 1060/970 would BARELY push 1080p/30 FPS, if the console really had magical 2x performance sauce.
 

BlackTron

Member
I've gone backwards in GPU power and use my Steamdeck as my main PC.
Anyone want a used GTX 1080Ti? Still works. It's in a huge HTPC case that looks like an audio receiver.

Sounds like one badass PC case.

And an efficient lifestyle choice, swapping DEATH STAR POWER for a portable PC.
 

DaGwaphics

Member
oh, the narrative changed now, hasn't it? now it has to perform like for like. no, I never said that. I SPECIFICALLY said beforehand (I knew that you would use this as a counterattack, yet you still do, proving that you do not even read what I said properly) that due to the async implementations on consoles, consoles will always have a 20% to 40% performance advantage. I never denied this. this does not make the game unplayable on a GTX 970, as it provides a much smoother and better experience than the PS4 despite being 9 years old. a 9 year old GPU being 30-40% behind is acceptable. what you gave as an example is a PS4 equivalent GPU being literally 2-2.5 TIMES slower than the PS4 (720p low, barely 30 FPS due to VRAM overflow. this is what I'm DISCUSSING.). if you're going to move the goalposts, I won't bother you any further, and good luck on your own.

"Spiderman at medium on PC is a lot uglier than the PS4 release. "

You're free to disprove it with actual solid evidence. I do not care. Considering how horizon zero dawn and god of war pin ps4 settings at medium, I have no reason to believe spiderman is any different. (you're free to prove otherwise)

the gtx 970 did beat the 290x "5.5tf part" because the 290x was not an actual 5.5 TF card. most AMD gpus back then had overstated TFLOPS metrics that were for marketing, similar to how the rtx 3070 is a "20 tflops card" but is equal to the 12.5 TFLOPS 2080ti.

it is the 3070 that has a gimmicky TFLOPS calculation. most of the polaris cards also had gimmicky TFLOPS, like the RX 580 and Vega 56/64. the Vega 64 on paper is a 12.6 TFLOPS behemoth; in reality it battles an 8.8 TFLOPS gtx 1080 instead.

as I said, your narrative was that a PS4 equivalent GPU was like 2x slower than the PS4. if you're going to change that, I'm out.

Doesn't matter how AMD marketed their TF, the PS4's TF calculation would have been the same. If the AMD TFs of the day were overstated, so too are the TF numbers for the PS4. At the end of the day, the PS4 wildly surpassed or at worst matched 3 and 4TF (even those with 3 and 4GB memory) AMD parts thanks to the developers building games specifically around the bottlenecks of that specific piece of old hardware, a luxury the equivalent PC parts didn't have. That's all I've said, and it's impossible for you to disprove that statement since it is a factual statement that is already proven out.
 

yamaci17

Member
The god of war PC port was/is notoriously unoptimized. Days Gone runs much better. The devs couldn't care less and went with DX11 for a PC port in 2022, to give you an idea...

Also it was a tremendous CPU hog. One of the few games I could not run at console frame rates with a stock FX 6300.

But then you have the hacks at digital foundry calling it superb because it ran fine on a 6c 12t CPU with a 2080...
even god of war, the worst port, proves my point, but they instead like to cling to the ancient 2 GB cards as evidence that the PS4 is 2x more performant with special sauce compared to an equivalent GPU. exact same narrative with the 750ti 2 gb/960 2 gb

it is near impossible to make them understand this. like, quite impossible from the looks of it. I've tried at least 5 times by now, but nah, they seem incapable of understanding it. most likely it's because they never had an actual 2 GB GPU and lived with its ramifications. I had an R7 265 (nearly a ps4 equivalent) and I know exactly what the problem is there. the R7 265 always performed nearly equal to the PS4 AS LONG as the games did not overflow its tiny 2 GB buffer. everything changed with ac origins in 2017, and going forward most games required a minimum 3 GB buffer at 900p/1080p for the card not to tank
 

yamaci17

Member
At the end of the day, the PS4 wildly surpassed or at worst matched 3 and 4TF AMD parts

no it didn't. you're looking at cards that ran out of VRAM.

if a card runs out of VRAM, it does not perform like it should. nearly half the chipset stalls.

a true PS4 counterpart GPU would have at least 3.5-4 GB of memory. end of the discussion.
 

DaGwaphics

Member
no it didn't. you're looking at cards that ran out of VRAM.

if a card runs out of VRAM, it does not perform like it should. nearly half the chipset stalls.

a true PS4 counterpart GPU would have at least 3.5-4 GB of memory. end of the discussion.
Most of the 3 and 4TF AMD GPUs had 3 and 4GB of video ram and ran either the same or only slightly faster than the PS4 (in spite of the fact that most were later GCN revisions with performance improvements). Maybe actually try looking up the card specs next time. This holds for most third-party releases as well, so it does not hinge on the specific quality of the GoW PC port (inefficient ports will be a forever-ongoing thing that has to be accounted for on the PC side).
 

yamaci17

Member
Most of the 3 and 4TF AMD GPUs had 3 and 4GB of video ram and ran either the same or only slightly faster than the PS4. Maybe actually try looking up the card specs next time.
you linked the video of an hd 7870

and said hd 7870 only has a tiny 2 gb buffer

proof;

[screenshot: VRAM usage pegged at 2 GB]

it caps at 2 gb usage and never budges above that. it is literally strangled at its maximum.

the game LITERALLY strangles the 2 GB VRAM buffer AT 720p low:

[screenshot: VRAM usage maxed out at 720p low]


and you expect a card with such a tiny buffer, one that FILLS entirely to its breaking point at 720P LOW, to perform well/equal to the PS4???

quite literally, the card performs about the same at all resolutions due to the enormous VRAM bottleneck.

where's the 4 tflops 4 GB card that matches the PS4? I don't see any.
 

DaGwaphics

Member
I also linked a video of the 4GB 290x, the 770 4GB, and the 3.5GB 970, but yeah I guess you missed those.
 

yamaci17

Member
@yamaci17 Spiderman at medium on PC is a lot uglier than the PS4 release.

Let's try that with God of War with the original PS4 settings:



ehh, not much of an uplift for such a considerably more powerful GPU (nvidia cards were considerably more powerful per TF at this point - the 970 could beat the 290x, a 5.5TF part).

Here is that 290x


It could do better than the PS4 at original settings, but not more than 2x better.

I never argued that more powerful PC GPUs could not be released years after the console. I simply said that the consoles will outpunch their PC equivalents, making it hard to make spec-based comparisons.

no, you literally imply that the console equivalent card (7850) performs like shit, at 720p, MORE THAN 2 TIMES slower.

" PS4 wildly surpassed or at worst matched 3 and 4TF"

the r9 290x you linked is not being surpassed or matched by PS4.

[screenshot from the linked 290x video, running at 1440p]

the damn video literally tests 1440p at the start. even with FSR, there is still the FSR overhead of being upscaled to 1440p.

this is the best you've got? jeez, how desperate do you have to be?

again, I WON'T LET you move the goalposts. YOU implied that special ps4 magical console sauce makes it 2-2.5x faster than its counterpart, and you don't have any examples to back this implication other than tiny 2 GB buffer cards. now you move the goalposts by claiming the 290/970 are not outperforming the PS4 like they should. no, we're discussing whether they perform 2 TIMES worse than they should, comparatively

if you ONCE more try to move the goalposts by saying "but it performs 30% worse than the ps4 here!!1" I will stop taking you seriously and stop the discourse on my end. really, I do not have the patience to argue with people who do not want to learn something from a person who actually OWNED a ps4 equivalent, tiny 2 GB buffer GPU back in 2014.

if you had said, at the start, that the gtx 970 only performs 1.5x over the ps4 whereas it should perform 2x, and that that is due to console optimization, I would have nothing against it.

the fact that you fire up the discussion with a case where the 7850 performs 2.5x worse than the ps4 (the ps4 being 1080p/30/med and the 7850 being 720p/low/25 FPS) is damning. either understand this or stop replying to me. insinuating or implying that ps4 magical sauce made it perform 2.5x over a 7850 means you're simply using an extreme case as your example of console optimization.

as long as you give sensible and meaningful examples, no one can or will deny console optimization. but this is not it. the 7850/7870 and their 2 gb buffer are not it.
 

hlm666

Member
That’s more Nvidia holding back on putting them out there to push more 4080 sales, then anything else. They absolutely play that bullshit.
So does AMD, but it never gets any heat.

"We undershipped in Q3, we undershipped in Q4, Su told investors. "We will undership, to a lesser extent, in Q1 [sic]."

 

rofif

Banned
I've got a 3080fe 10gb, bought on release for $700. yes, I was one of the lucky few.
And it would still be great if not for the stupid 10gb... in A XX80 card. wtf. It really is already a problem at 4k in a few modern games.
I still don't think that card was a great deal at $700... and a $700 XX80 would be looked at as a godsend nowadays.
GPUs are just too expensive.
If you can get a ps5, a deck/switch and some games... for the price of a gpu alone? that's a problem.
The money is better spent on a psvr2 or an oled tv or any other gadget
 

Kenpachii

Member
I've got a 3080fe 10gb, bought on release for $700. yes, I was one of the lucky few.
And it would still be great if not for the stupid 10gb... in A XX80 card. wtf. It really is already a problem at 4k in a few modern games.
I still don't think that card was a great deal at $700... and a $700 XX80 would be looked at as a godsend nowadays.
GPUs are just too expensive.
If you can get a ps5, a deck/switch and some games... for the price of a gpu alone? that's a problem.
The money is better spent on a psvr2 or an oled tv or any other gadget

That's the stupid part about those GPUs: 2k for a 4090. You can buy a switch/deck/ps5/xbox series X for that and still have money left over.

Even popular youtubers took the piss out of the GPU by making benchmarks with 0 fps in them, because it was their entire budget for the PC and the GPU can't function alone.
 

SmokedMeat

Gamer™
That's the stupid part about those GPUs: 2k for a 4090. You can buy a switch/deck/ps5/xbox series X for that and still have money left over.

Even popular youtubers took the piss out of the GPU by making benchmarks with 0 fps in them, because it was their entire budget for the PC and the GPU can't function alone.

To make matters worse, prices don’t fall. While AMD is now dropping prices on their older GPUs, it’s amazing that Nvidia holds on and still rocks people for full price, years later.
The sad part is people must still be buying these cards on the Nvidia brand alone, when there’s clearly better value going AMD in some cases.
 

nkarafo

Member
no it didn't. you're looking at cards that ran out of VRAM.

if a card runs out of VRAM, it does not perform like it should. nearly half the chipset stalls.

a true PS4 counterpart GPU would have at least 3.5-4 GB of memory. end of the discussion.
True.

I had a 960 2GB card.

The 960 core should be MUCH more powerful than the base PS4 GPU. Yet, playing Resident Evil 7 at 1080p/60fps (same as the PS4), the performance would tank because the VRAM was maxed out. I had to play the game at a lower resolution to get stable performance, with a card that was, what, twice as powerful as the PS4's GPU normally? Basically, the VRAM was a huge bottleneck for the GPU core, which wouldn't even come close to maxing out.

That's why I laugh at the rumor that the 4060 will come with 8GB. If that's true, the card will become the 960 2GB equivalent of 2023. It will be dead before the year ends.
 

yamaci17

Member
True.

I had a 960 2GB card.

The 960 core should be MUCH more powerful than the base PS4 GPU. Yet, playing Resident Evil 7 at 1080p/60fps (same as the PS4), the performance would tank because the VRAM was maxed out. I had to play the game at a lower resolution to get stable performance, with a card that was, what, twice as powerful as the PS4's GPU normally?

That's why I laugh at the rumor that the 4060 will come with 8GB. If that's true, the card will become the 960 2GB equivalent of 2023. It will be dead before the year ends.
the likes of the 2070 will be abused to no end in comparisons to the PS5, even if such GPUs end up at 0.0001% share in the steam survey



this is literally what happens when you run out of vram. one second you match it, the next second you're 2x slower than it. the exact same thing happened universally on all 2 GB cards, yet people keep denying it. this issue does not happen with the 3060, and it matches the PS5 in the ray tracing department. the 2070 super and even the 3060ti will TANK to the point that they perform worse than the 3060/PS5 at 4K/ray tracing due to VRAM buffer overflow.

instead of acknowledging how the properly memory-equipped 3060 performs like it should, people cling onto the 2070/3060ti underperforming due to poor memory configurations.

this poorly made nx gamer video was used in many console/PC wars as precious fuel. despite it being proven time and time again that a 3060 also OUTPERFORMS the 3070 in this specific case of 4K + ray tracing, THEY kept clinging to the downfall of the 8 GB buffer. they cannot understand that it is caused by MEMORY overflow.

a 4 gb RTX 4090 would stall to the point that it would perform worse than a Series X/PS5. does that make the 4090 "AS A CHIP" slower than the ps5/series x?? NO.

the problem here is that when people see such situations, the likes of "DaGwaphics" think that the PS5 EXTRACTS 2x more performance compared to a 2070 due to console magic sauce optimizations.

it is the OTHER way around. the 2070 tanks by 2x.

the theoretical 12 gb 2070 (which is the 3060) performs like the PS5.

you can say that the 2070 is a gimped product and bad value compared to the PS5. but once you state or claim that PS5 devs extracted 2x MORE PERFORMANCE despite having a 2070-like chip, YOU. ARE. WRONG. for that to be true, the PS5 should also OUTPERFORM the 3060 by 2 times. does that happen? NO.

the 3060 also outperforms the 2070 by 2 TIMES in those scenes. DOES THAT MEAN INSOMNIAC EXTRACTED 2X MORE PERFORMANCE FROM THE 3060 COMPARED TO THE 2070???????? NO. A BIG NO.


the 2080ti renders a 78 FPS avg. AT 4k/ray tracing whereas the 3070 TANKS TO 43 FPS AND the 3060 AVERAGES 52 FPS.
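the same averages as ratios (quick sketch, numbers as quoted above):

```python
# 4K + ray tracing averages quoted above, normalised against the VRAM-starved 3070.
avg_fps = {"2080ti": 78, "3060": 52, "3070": 43}

baseline = avg_fps["3070"]
for card, fps in avg_fps.items():
    print(f"{card}: {fps} fps ({fps / baseline:.2f}x the 3070)")
```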

DOES THAT MEAN DEVS EXTRACTED MORE PERFORMANCE WITH MAGICAL SAUCE FROM THE 3060 AND 2080TI?

NO. A BIG FREAKING NO.

IN THAT RESPECT, STOP USING CARDS THAT HAVE LIMITED/TINY BUFFERS RELATIVE TO THE CONSOLES. THEIR PERFORMANCE TANKS TO THE POINT THAT THE COMPARISON BECOMES MOOT.


if your answers to these questions are NO, THEN you have to accept that devs did not EXTRACT 2X MORE MAGICAL SAUCE performance from the PS4 compared to the 7850 or 7870, or any GPU for that matter. at best, it is around 20-40% (I DO not deny this, never did. a PS4 will always outperform its chip equivalents by 20-40% due to GCN technologies missing from most actual GCN PC GPUs and the lack of async implementations in most ports). BUT 20-40% is NOTHING compared to the 100-125% some of you IMPLY.

using a card that tanks its performance by nearly 100-125% due to vram overflow as your example of console optimization making the ps4 superior to an hd 7850 is dishonesty at its peak, if you're actually doing this on purpose despite learning why the hd 7850 performs as poorly as it does. then trying to move the goalposts to a 970 or 290x is irrelevant. just accept that the hd 7850 performs extremely poorly because of VRAM overflow. nothing else is being discussed here. see the examples above.

the 3060/PS5 outperforming a 2070 by a large margin when VRAM is stressed should be MORE THAN enough to refute anything DaGwaphics said. I rest my case.
 
also
Primary Display Resolution
1920 x 1080 64.83%
2560 x 1440 11.06%
Multi-Monitor Desktop Resolution
3840 x 1080 66.14%
4480 x 1440 19.92%
I am not 100% sure what this is supposed to mean, but I assume the first (and often only) monitor is regular FHD, while those that have multi-monitor setups mostly have "only" two FHD displays, i.e. still way less than 4K (and I doubt multi-monitor is even used for the majority of games).
So they practically have around the same power, but don't "waste" it on resolution.
Let's not pretend the PS5 is native 4K; consoles all do upscaling.
 

DaGwaphics

Member
no, you literally imply that the console equivalent card (7850) performs like shit, at 720p, MORE THAN 2 TIMES slower.

" PS4 wildly surpassed or at worst matched 3 and 4TF"

the r9 290x you linked is not being surpassed or matched by PS4.

[screenshot from the linked 290x video, running at 1440p]

the damn video literally tests 1440p at the start. even with FSR, there is still the FSR overhead of being upscaled to 1440p.

this is the best you've got? jeez, how desperate do you have to be?

again, I WON'T LET you move the goalposts. YOU implied that special ps4 magical console sauce makes it 2-2.5x faster than its counterpart, and you don't have any examples to back this implication other than tiny 2 GB buffer cards. now you move the goalposts by claiming the 290/970 are not outperforming the PS4 like they should. no, we're discussing whether they perform 2 TIMES worse than they should, comparatively

if you ONCE more try to move the goalposts by saying "but it performs 30% worse than the ps4 here!!1" I will stop taking you seriously and stop the discourse on my end. really, I do not have the patience to argue with people who do not want to learn something from a person who actually OWNED a ps4 equivalent, tiny 2 GB buffer GPU back in 2014.

if you had said, at the start, that the gtx 970 only performs 1.5x over the ps4 whereas it should perform 2x, and that that is due to console optimization, I would have nothing against it.

the fact that you fire up the discussion with a case where the 7850 performs 2.5x worse than the ps4 (the ps4 being 1080p/30/med and the 7850 being 720p/low/25 FPS) is damning. either understand this or stop replying to me. insinuating or implying that ps4 magical sauce made it perform 2.5x over a 7850 means you're simply using an extreme case as your example of console optimization.

as long as you give sensible and meaningful examples, no one can or will deny console optimization. but this is not it. the 7850/7870 and their 2 gb buffer are not it.

If you think the 290x video I linked doesn't show increased efficiency for the PS4 (a 1.8tf part on an older GCN revision vs. a 5.5tf GPU on a newer GCN), I don't know what to tell you. I moved no goalposts. I used the cards that are available for comparison; I can't help the fact that a 4GB HD8570 was never released. LOL.

Also, I never said the PS4 was beating or matching the 5.5TF 290x. I said it was beating or matching the 3 to 4TF GCN 1 and 2 cards, which it does, so I would not talk about reading comprehension troubles. Though even the 290x comparison is a nice showing of how the fixed platform lifts that 1.8TF part beyond where it would sit on the less optimized PC platform. I love the made-up additions of the magic secret sauce though, a nice touch. Consoles punch above their weight thanks to intensive micromanagement of available resources, not magic, and I never claimed anything different.

If the simplistic point I was making is completely lost on you, then clearly we can't have a debate. I am surprised though, because generally you don't pick such indefensible positions to debate from.
 
Being down only 35% seems pretty good considering that Ethereum mining and the virus thing both ended, and that killed a lot of the external demand.

Seems like gamers are doing a better job picking up the slack than most GPU price haters in 2023 want to admit 😂
 

STARSBarry

Gold Member
Yo why am I reading about a graphics war between two outdated pieces of hardware that run games like shit?

I don't have a 4 grand PC so I can flex on PS4 players, I have a 4 grand PC so I can flex on PS5 players... it's not about cost efficiency it's about my PC just flat out being better at playing games.

Oh I also have a PS5 just so I can enjoy the console exclusives until I play them at a higher frame rate on my PC, and I have an xbox series X because sometimes I like playing gamepass with my feet up.

With that said though, can we get back to GPUs costing laughable amounts? I didn't come here to be reminded about how shit GPUs were in the 200 dollar space, I knew that by them costing 200 dollars.
 

MikeM

Member
I've got a 3080fe 10gb, bought on release for $700. yes, I was one of the lucky few.
And it would still be great if not for the stupid 10gb... in A XX80 card. wtf. It really is already a problem at 4k in a few modern games.
I still don't think that card was a great deal at $700... and a $700 XX80 would be looked at as a godsend nowadays.
GPUs are just too expensive.
If you can get a ps5, a deck/switch and some games... for the price of a gpu alone? that's a problem.
The money is better spent on a psvr2 or an oled tv or any other gadget
Vram is why I went 7900xt over the 4070ti. Not running into vram issues with $1k+ CAD gpus.
 

MikeM

Member
Yo why am I reading about a graphics war between two outdated pieces of hardware that run games like shit?

I don't have a 4 grand PC so I can flex on PS4 players, I have a 4 grand PC so I can flex on PS5 players... it's not about cost efficiency it's about my PC just flat out being better at playing games.

Oh I also have a PS5 just so I can enjoy the console exclusives until I play them at a higher frame rate on my PC, and I have an xbox series X because sometimes I like playing gamepass with my feet up.

With that said though, can we get back to GPUs costing laughable amounts? I didn't come here to be reminded about how shit GPUs were in the 200 dollar space, I knew that by them costing 200 dollars.
Bruh- you don’t hook up your PC to an OLED TV?

GIF by LoveIndieFilms
 

STARSBarry

Gold Member
Bruh- you don’t hook up your PC to an OLED TV?

GIF by LoveIndieFilms
You're right, because an OLED TV hasn't got a 240Hz refresh rate. I mean, it caps out at a measly 120. How am I supposed to play competitively capped at such low frame rates?

Can we get back to the discussion about the GPUs that can perform at these numbers?
 

MikeM

Member
You're right, because an OLED TV hasn't got a 240Hz refresh rate. I mean, it caps out at a measly 120. How am I supposed to play competitively capped at such low frame rates?

Can we get back to the discussion about the GPUs that can perform at these numbers?
Ditch the Xbox. Put your feet up with PC on the TV. Why subject yourself to such mediocrity?

My PC is attached to both my monitor and TV. Have it all!
 

STARSBarry

Gold Member
Ditch the Xbox. Put your feet up with PC on the TV. Why subject yourself to such mediocrity?

My PC is attached to both my monitor and TV. Have it all!
No way, I love my little Xbox. I use it for 4K Blu-rays because I always have GT7 in my PS5 drive.
 

adamosmaki

Member
oh wow, you mean to tell me people aren't paying $400-500 for GPUs that only offer a 15-20% performance improvement over previous gen GPUs that used to cost $250-300? Or that people aren't willing to spend $1000+ on a damn GPU?
If F****g Nvidia and AMD want sales to pick up, they might want to release $300 mid-range GPUs with a 40%+ performance improvement over the previous gen
 

SolidQ

Member
Until games move to next-gen graphics like UE5 and push the new consoles to the max, people will not change their GPUs, because all cross-gen games run fine even on GTX 10x0 series cards.
Found a fresh new video; even the RTX 3070 is not worth the upgrade for me, from 60 fps to 60 fps is zero difference (yes, I always play with locked 60fps). Will wait for RDNA4, maybe for the RX 8800.
 

ClosBSAS

Member
LoL, the demand for PC is not declining, it's just the douchebag Nvidia CEO who is convinced his GPUs are worth 1200 bucks. Fucking joke
 

Chiggs

Member

I bet you know all about the x86 ISA, the problems it poses for developers and performance, and the complete fucking mess it is. You just chose not to comment on it... that would have been too great a display of your power. Instead, my profile pic. Oh, you savage master of wit!

LoL, the demand for PC is not declining, it's just the douchebag Nvidia CEO who is convinced his GPUs are worth 1200 bucks. Fucking joke

I'm sorry, but no.

Worldwide PC shipments totaled 65.3 million units in the fourth quarter of 2022, a 28.5% decrease from the fourth quarter of 2021, according to preliminary results by Gartner, Inc. This marks the largest quarterly shipment decline since Gartner began tracking the PC market in the mid-1990s. For the year, PC shipments reached 286.2 million units in 2022, a 16.2% decrease from 2021.
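A quick back-calculation from those Gartner numbers (rough sketch, nothing official; it just inverts the quoted percentages):

```python
# Implied prior-year shipments, derived from the figures quoted above.
q4_2022, q4_drop = 65.3, 0.285    # million units, 28.5% decline
fy_2022, fy_drop = 286.2, 0.162   # million units, 16.2% decline

print(f"implied Q4 2021 shipments: ~{q4_2022 / (1 - q4_drop):.0f}M units")  # ~91M
print(f"implied FY 2021 shipments: ~{fy_2022 / (1 - fy_drop):.0f}M units")  # ~342M
```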

And regionally-speaking:

[chart: regional PC shipment figures by vendor]


But hey, big ups to ASUS for its 5.3% increase, on its 5.6% market share. :messenger_grinning_squinting:
 

winjer

Gold Member

  • JPR found that AIB shipments during the quarter decreased from the last quarter by 12.6%, which is below the 10-year average of -4.9%.
  • Total AIB shipments decreased by 38.2% this quarter from last year to 6.3 million units, down from 7.16 million units last quarter.
  • AMD's quarter-to-quarter total desktop AIB unit shipments decreased by 7.5%.
  • Nvidia's quarter-to-quarter unit shipments decreased by 15.2%. Nvidia continues to hold a dominant market share position at 83.7%.
  • AIB shipments year over year decreased by 38.2% compared to last year.
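A quick sanity check on the quoted JPR figures (rough sketch; rounding in the press numbers explains the small mismatch with the quoted -12.6%):

```python
# Quarter-over-quarter change and implied year-ago volume, from the figures above.
this_q, last_q = 6.3, 7.16          # million AIB units

qoq = (this_q / last_q - 1) * 100
print(f"QoQ change: {qoq:.1f}%")    # ~-12%, close to the quoted -12.6%

yoy_drop = 0.382
print(f"implied year-ago quarter: ~{this_q / (1 - yoy_drop):.1f}M units")  # ~10.2M
```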

Another quarter, another drop in GPU shipments and sales.
This strategy of increasing prices in a recession is really working out....

You Are Dumb Patrick Star GIF by SpongeBob SquarePants
 

kruis

Exposing the sinister cartel of retailers who allow companies to pay for advertising space.

Another quarter, another drop in GPU shipments and sales.
This strategy of increasing prices in a recession is really working out....

You Are Dumb Patrick Star GIF by SpongeBob SquarePants

Nvidia and AMD are killing the market for discrete graphics cards with the constant price increases and lackluster performance improvements. Will they ever learn?
 

Kenpachii

Member
I think in general people are just starting to get tired of nvidia and amd. As a PC gamer myself, I am done with the constant bullshit from both of them: shipping gpus with just too little vram to sell the next card, making dlss an exclusive feature, half-baked performance in games with bad launches, etc. etc. Price hikes. At some point people will just not care anymore.
 
Demand for GPUs would have been solid had Nvidia and AMD released reasonably priced products with gen-on-gen performance uplifts in line with prior gens. I don't believe it's a structurally weak industry. Plenty of people (including me) WANT to upgrade their GPUs, but the value proposition offered by the latest products simply isn't there.

On the other hand, it's unfortunate that both of the big players no longer need to be "all-in" on the PC GPU space. They have higher-margin, higher-growth, lower-risk segments to focus on.
 

Tomi

Member
Prices will drop significantly very soon.
The market is flooded with GPUs and prices are starting to collapse
 