
Nvidia CEO Jensen Huang Trashes AMD's New GPU, the Radeon VII: 'The Performance Is Lousy'.

CuNi

Member
I gave a potential breakdown of what a 2080 Ti beater could be on the Vega arch in another thread. You see, AMD did not just take an Instinct card and boost the RAM and bandwidth; they did custom work for each variation of these cards. So Instinct has its custom features, but the Radeon VII got 128 ROPs, the boosting scheme was reworked, etc.

Now, for a 2080 Ti beater, AMD could have given us 256 ROPs and reworked the CUs to increase their count, or they could have done what they did with the PRO and increased the number of instructions delivered per CU, but that means the physical units would be much larger than typical.

They could then place 32GB of HBM on there, which means ridiculous bandwidth. Yet what is clear is that HBM needs to be fed to deliver the higher bandwidth we enjoy there, so TDP would be around 350W on such a card even at 7nm. So here's your 2080 Ti beater... and AMD could probably deliver that at $1100, $100 less than a Ti, and this card beats the RTX Titan too...

So we have +21GB of VRAM over the Ti, +8GB over the Titan, and much higher bandwidth than those cards. That card would beat everything, but a bit more work would be involved, especially on an older arch.

Yet let's say AMD delivered that: forget about the net pluses highlighted above, detractors would focus solely on TDP, TDP, TDP.

Let's be honest though, such cards are more for marketing and muscle flexing, but maybe it's essential for that very reason and for mindshare.

However, most people would still buy a $700 card over an $1100, $1200, or whatever exorbitant amount they are selling the Titan for... So maybe that's AMD's thinking there. Yet I think the real reason they did not provide a Ti beater with the Radeon VII is that they would rather do so on a new arch, with new engineering, lower TDP, and better yields, which would translate to a much cheaper card beating the Ti, maybe in the $800-1000 range...

Isn't there a lawsuit or some such against Nvidia? I remember so many RTX cards dying; man, had that been AMD, the press and the forums would have gone nuts. Yet I'm still looking to see the TDP and frame times on these Turing cards when they finally run the tensor + RT cores in tandem, DLSS included. They said DLSS was free anti-aliasing and a better image, but it seems to be more pro-aliasing than anything else...
People say TDP, TDP, TDP because it's so much worse on AMD cards than on Nvidia ones. If the difference is only 100 bucks, then of course the TDP counts, because in the long run the more expensive card comes out cheaper. Just like the VII really isn't competition for the 2080 overall, except on performance: 1) there are cheaper 2080s, so the price advantage is lost; 2) it performs the same, trading blows with the 2080, so neither one really crushes the other; 3) whether you use them or not doesn't really matter, but the 2080 has RTX and DLSS, which you could use, and it still manages to be cheaper.

So tell me, what's the benefit of this card over the 2080? VRAM? There will always be some way to max out and fill the VRAM in a not-so-well-optimized game, or in general. You can always release even higher-quality textures that will melt away your VRAM. It's just like with RAM: once RAM prices went down and capacities went up, software suddenly started taking a lot of RAM, because it was there. Suddenly you didn't need to optimize that much, since it didn't matter anymore. The same is happening to VRAM. It gets used up because devs get lazy and just flood it with data instead of optimizing as much as they should. I have a 970 with its crippled 3.5GB + 0.5GB, and I can still play many of my games in the 144 Hz range with good-looking textures. So what do I even need 16GB for? GPUs should keep 8GB as the standard and games should stick to 8GB. 16GB is just useless and encourages developers to be lazy and not optimize their games.
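For a rough sense of scale on the VRAM point, here's a back-of-the-envelope sketch (not a benchmark): it assumes 4096x4096 textures in a BC7-style format at roughly 1 byte per texel, a ~33% mip-chain overhead, and an arbitrary 2GB reserved for framebuffers and other allocations. Real games mix texture sizes and formats, so treat the numbers as illustrative only.

```python
# Back-of-the-envelope: how many 4096x4096 BC7-compressed textures fit in a
# given VRAM pool. Purely illustrative; real games mix sizes, formats, and
# plenty of non-texture allocations.

BYTES_PER_TEXEL_BC7 = 1.0        # BC7-class block compression: ~1 byte/texel
MIP_CHAIN_OVERHEAD = 4.0 / 3.0   # a full mip chain adds roughly a third

def textures_that_fit(vram_gb: float, reserved_gb: float = 2.0, side: int = 4096) -> int:
    """Estimate resident texture count after reserving `reserved_gb` (an assumed
    figure) for framebuffers, geometry, and driver overhead."""
    usable_bytes = (vram_gb - reserved_gb) * 1024**3
    tex_bytes = side * side * BYTES_PER_TEXEL_BC7 * MIP_CHAIN_OVERHEAD
    return int(usable_bytes // tex_bytes)

for vram in (8, 11, 16):
    print(f"{vram} GB card: ~{textures_that_fit(vram)} unique 4096x4096 textures resident")
```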
 

Ascend

Member
I can't speak for everybody, but I would never buy the GTX 1080 Ti over the RTX 2060 today.

The GTX 1080 Ti has more raw power but lacks in everything else.

The RTX 2060 has good performance... around the GTX 1070 level... it is just a bit too pricey, but we need to wait for the non-Founders editions.
The Founders Edition has the same MSRP as the AIB cards this time.

The RTX 2060 already has memory limit issues in BFV with ray tracing. What use is it for the future if this is already the case?
https://www.computerbase.de/2019-01/nvidia-geforce-rtx-2060-test/5/

Getting the RTX 2060 over the GTX 1080 Ti is not a smart move. And that's a huge understatement.
 

JohnnyFootball

GerAlt-Right. Ciriously.
I can't speak for everybody, but I would never buy the GTX 1080 Ti over the RTX 2060 today.

The GTX 1080 Ti has more raw power but lacks in everything else.

The RTX 2060 has good performance... around the GTX 1070 level... it is just a bit too pricey, but we need to wait for the non-Founders editions.
The Founders Edition has the same MSRP as the AIB cards this time.

The RTX 2060 already has memory limit issues in BFV with ray tracing. What use is it for the future if this is already the case?
https://www.computerbase.de/2019-01/nvidia-geforce-rtx-2060-test/5/

Getting the RTX 2060 over the GTX 1080 Ti is not a smart move. And that's a huge understatement.

Except it's no longer a move anyone can make. The 1080 Ti is completely sold out. It's not available anymore. It will now cost you more than a 2080 ($800-$1000) to get a 1080 Ti. People need to stop talking like it's a viable option. It's not. It's no longer available. You could get one at a decent price when the RTX cards launched, but that stock is all gone.
 

Ascend

Member
Except it's no longer a move anyone can make. The 1080 Ti is completely sold out. It's not available anymore. It will now cost you more than a 2080 ($800-$1000) to get a 1080 Ti. People need to stop talking like it's a viable option. It's not. It's no longer available. You could get one at a decent price when the RTX cards launched, but that stock is all gone.
That's true, but to pretend that the RTX 2060 is somehow a superior choice compared to the 1080 Ti really is stretching it. As for the RTX 2080, we all know it's overpriced, and we can thank that card for the Radeon VII's price.
 
Well, the problem Huang has is that Nvidia is losing market share; I think he is just feeling a bit of pressure, that's all. Yes, Nvidia dominates the high-end graphics card SKUs, but his problem is that the majority of PC gamers do not have high-end graphics cards. The AMD RX 580 and Vega 64 are good cards as far as price vs. performance goes; we all know the 2080 Ti is the best graphics card, but it's £300-500 too expensive.

So Huang is being a dick because he knows he messed up big time with the pricing of the RTX line.

That's what they get for pricing themselves out of market share and doubling down on cryptominers instead of their established consumer base. Even hardcore Nvidia fanboys are thinking of jumping ship these days.

As someone who's been team green for more than a decade, it's refreshing to see AMD finally giving me an alternative to look at when upgrading.
 

ZywyPL

Banned
The HBM is what's killing the card. The 7nm process = much smaller die size. No RT and Tensor cores = even smaller die size. They should've used ordinary 12GB GDDR5X, sold it for $500, $550 tops, and watched as they dominate the market, but of course they didn't learn anything from the Fury and Vega fiascos... It's logical that Jensen can laugh all he wants; once NV also switches to 7nm (and GDDR6 gets better and cheaper as well) there will be nothing AMD can compete with, because I seriously doubt Navi will be much/any better - it will still be a big monolithic GPU as opposed to the initial concept, with no RT support as they are still working on it, and since the VII already delivers 4K60, what will be the selling point for Navi?

AMD GCN is a seriously inefficient architecture, so I agree with Jensen.

Fixed ;)
 

Panajev2001a

GAF's Pleasant Genius
The HBM is what's killing the card. The 7nm process = much smaller die size. No RT and Tensor cores = even smaller die size. They should've used ordinary 12GB GDDR5X, sold it for $500, $550 tops, and watched as they dominate the market, but of course they didn't learn anything from the Fury and Vega fiascos... It's logical that Jensen can laugh all he wants; once NV also switches to 7nm (and GDDR6 gets better and cheaper as well) there will be nothing AMD can compete with, because I seriously doubt Navi will be much/any better - it will still be a big monolithic GPU as opposed to the initial concept, with no RT support as they are still working on it, and since the VII already delivers 4K60, what will be the selling point for Navi?



Fixed ;)

I think a $499-599 card would not have sold much more than this repurposed, super-high-bandwidth prosumer card will. They need to get the performance crown, and more visibly: a lot of the time they are fighting a not-always-fair fight against perception and marketing.
 
Check the VII thread. I have been very vocal about the VII being a failed product but mainly because of its price tag and power consumption. This doesn't change the fact that they are on the same level performance wise. You can't deny that. I still wouldn't suggest buying it but the 2080 certainly isn't crushing it in any way when it comes to fps in games.
They kind of aren't on the same level performance wise though?
https://www.eurogamer.net/articles/...-first-radeon-7-benchmark-results-in-25-games

The Radeon Fantasy VII still falls behind both the 2080 and the 1080 Ti. There is absolutely not one single gamer who should be buying the Radeon Fantasy VII. I can't comment on the content creation and research side too much beyond the fact that CUDA is completely entrenched there. Outside of the very tiny minority of people who build Hackintoshes where you pretty much have to use AMD or nothing works because Apple doesn't allow Nvidia to release drivers for Mac OS, most people who do content creation or research are part of the CUDA ecosystem and only use Nvidia.
 

LordOfChaos

Member
The HBM is what's killing the card. The 7nm process = much smaller die size. No RT and Tensor cores = even smaller die size. They should've used ordinary 12GB GDDR5X, sold it for $500, $550 tops, and watched as they dominate the market, but of course they didn't learn anything from the Fury and Vega fiascos... It's logical that Jensen can laugh all he wants; once NV also switches to 7nm (and GDDR6 gets better and cheaper as well) there will be nothing AMD can compete with, because I seriously doubt Navi will be much/any better - it will still be a big monolithic GPU as opposed to the initial concept, with no RT support as they are still working on it, and since the VII already delivers 4K60, what will be the selling point for Navi?



Fixed ;)

And since Navi still seems to be GCN, and as such will presumably have that four-shader-engine limit in tow, I'm not expecting the world from it either. Things might not get interesting until "Next Gen", if then. And yeah, agreed that if it had undercut the 2080 while matching it, it would have been something, but only matching the price and (sometimes) the performance in hand-picked games while missing DLSS/RTX, eh.

[Image: AMD GPU roadmap]




I don't think, though, that they could just slap in GDDR5X and be just as competitive while cutting the price - Nvidia architectures made a bet years ago on better compression and bandwidth use, while AMD bet on higher memory bandwidth to feed their architectures, and that's playing out now.
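To put the raw-bandwidth bet in numbers, peak memory bandwidth is just bus width times per-pin data rate divided by eight. A quick sketch using the commonly published specs for these cards; effective bandwidth also depends on compression, which is where Nvidia invested:

```python
# Peak theoretical bandwidth (GB/s) = bus width (bits) * per-pin rate (Gbps) / 8.
# Specs below are the commonly cited launch figures; effective bandwidth also
# depends on delta-color compression and access patterns.

def peak_bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    return bus_width_bits * gbps_per_pin / 8

cards = {
    "RTX 2080    (GDDR6, 256-bit @ 14 Gbps)": (256, 14.0),
    "RTX 2080 Ti (GDDR6, 352-bit @ 14 Gbps)": (352, 14.0),
    "Radeon VII  (HBM2, 4096-bit @ 2 Gbps)":  (4096, 2.0),
}

for name, (bus, rate) in cards.items():
    print(f"{name}: {peak_bandwidth_gbs(bus, rate):.0f} GB/s peak")
```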
 

CuNi

Member
They kind of aren't on the same level performance wise though?
https://www.eurogamer.net/articles/...-first-radeon-7-benchmark-results-in-25-games

The Radeon Fantasy VII still falls behind both the 2080 and the 1080 Ti. There is absolutely not one single gamer who should be buying the Radeon Fantasy VII. I can't comment on the content creation and research side too much beyond the fact that CUDA is completely entrenched there. Outside of the very tiny minority of people who build Hackintoshes where you pretty much have to use AMD or nothing works because Apple doesn't allow Nvidia to release drivers for Mac OS, most people who do content creation or research are part of the CUDA ecosystem and only use Nvidia.
I don't know if they show more on the desktop website, but on mobile I only saw 2 benchmarks. One was pretty much in the same performance area (a 2 FPS difference) and the other showed the 2080 well ahead. They mention themselves that AMD has also released benchmarks where the VII performs equally in BFV and is also far ahead in one other game, so it's mostly equal in performance. We still need more data and benchmarks to reach a final conclusion, I'd say.
 

dirthead

Banned
I gave a potential breakdown of what a 2080 Ti beater could be on the Vega arch in another thread. You see, AMD did not just take an Instinct card and boost the RAM and bandwidth; they did custom work for each variation of these cards. So Instinct has its custom features, but the Radeon VII got 128 ROPs, the boosting scheme was reworked, etc.

Now, for a 2080 Ti beater, AMD could have given us 256 ROPs and reworked the CUs to increase their count, or they could have done what they did with the PRO and increased the number of instructions delivered per CU, but that means the physical units would be much larger than typical.

They could then place 32GB of HBM on there, which means ridiculous bandwidth. Yet what is clear is that HBM needs to be fed to deliver the higher bandwidth we enjoy there, so TDP would be around 350W on such a card even at 7nm. So here's your 2080 Ti beater... and AMD could probably deliver that at $1100, $100 less than a Ti, and this card beats the RTX Titan too...

So we have +21GB of VRAM over the Ti, +8GB over the Titan, and much higher bandwidth than those cards. That card would beat everything, but a bit more work would be involved, especially on an older arch.

Yet let's say AMD delivered that: forget about the net pluses highlighted above, detractors would focus solely on TDP, TDP, TDP.

Let's be honest though, such cards are more for marketing and muscle flexing, but maybe it's essential for that very reason and for mindshare.

However, most people would still buy a $700 card over an $1100, $1200, or whatever exorbitant amount they are selling the Titan for... So maybe that's AMD's thinking there. Yet I think the real reason they did not provide a Ti beater with the Radeon VII is that they would rather do so on a new arch, with new engineering, lower TDP, and better yields, which would translate to a much cheaper card beating the Ti, maybe in the $800-1000 range...

Isn't there a lawsuit or some such against Nvidia? I remember so many RTX cards dying; man, had that been AMD, the press and the forums would have gone nuts. Yet I'm still looking to see the TDP and frame times on these Turing cards when they finally run the tensor + RT cores in tandem, DLSS included. They said DLSS was free anti-aliasing and a better image, but it seems to be more pro-aliasing than anything else...

The problem is that none of the expensive cards actually provide the performance people want, so they're a waste of money. There are certain thresholds for cards.

* This card is fast enough to do X resolution at Y framerate

The problem with the $1,200 cards is that they're $500 more expensive but don't cross the next threshold. A 1080 Ti wasn't fast enough for 4K @ 144Hz. A 2080 Ti isn't fast enough for 4K @ 144Hz. Get over the hill and then maybe you can think about bending consumers over a tree stump, but don't just put some paperweight out there that doesn't even clear the next hurdle but costs a lot more. What's really accentuating this problem is that video games aren't actually programmed for PCs anymore but are just lazy console ports, which means none of them actually require these new expensive cards.
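For a rough sense of how far off the 4K/144 threshold is, pixel throughput scales with resolution times target framerate. A small sketch, with an assumed 60fps-at-4K baseline rather than any measured result, and ignoring CPU, memory, and engine limits:

```python
# Rough scaling: required shading throughput ~ pixels per frame * target fps.
# The 60fps-at-4K baseline below is an assumed example, not a measurement.

def required_speedup(base_res, base_fps, target_res, target_fps):
    """How many times faster a GPU must be, assuming performance scales
    linearly with pixels per second (ignores CPU, memory, engine limits)."""
    base_rate = base_res[0] * base_res[1] * base_fps
    target_rate = target_res[0] * target_res[1] * target_fps
    return target_rate / base_rate

UHD, QHD = (3840, 2160), (2560, 1440)

# A card averaging ~60fps at 4K would need to be ~2.4x faster for 4K/144:
print(f"4K60 -> 4K144: {required_speedup(UHD, 60, UHD, 144):.2f}x")
# Even holding 144fps, going from 1440p to 4K is another 2.25x jump in pixel rate:
print(f"1440p144 -> 4K144: {required_speedup(QHD, 144, UHD, 144):.2f}x")
```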

When Quake came out in 1996, you had to upgrade if you even wanted to play it acceptably. You needed a Pentium. If you wanted to play Quake 2 at 640x480 or above, you had to buy a 3dfx card. It wasn't a choice. Up until Microsoft paid off Western developers to basically abandon native PC development for the original Xbox, games were still actually made for bleeding-edge PC hardware. They're not anymore. There's no ballistic game that requires a 2080 Ti to even run playably.
 

ethomaz

Banned
They kind of aren't on the same level performance wise though?
https://www.eurogamer.net/articles/...-first-radeon-7-benchmark-results-in-25-games

The Radeon Fantasy VII still falls behind both the 2080 and the 1080 Ti. There is absolutely not one single gamer who should be buying the Radeon Fantasy VII. I can't comment on the content creation and research side too much beyond the fact that CUDA is completely entrenched there. Outside of the very tiny minority of people who build Hackintoshes where you pretty much have to use AMD or nothing works because Apple doesn't allow Nvidia to release drivers for Mac OS, most people who do content creation or research are part of the CUDA ecosystem and only use Nvidia.
These are benchmarks supplied by AMD, and they are disastrous.

Imagine the real ones.
 

dirthead

Banned
To be fair, the performance of the 2080 Ti is lousy for what it costs. I seriously expect a GPU to smoke pretty much every game at 4K @ 120Hz for $1,200.
 

Lort

Banned
The problem is that none of the expensive cards actually provide the performance people want, so they're a waste of money. There are certain thresholds for cards.

* This card is fast enough to do X resolution at Y framerate

The problem with the $1,200 cards is that they're $500 more expensive but don't cross the next threshold. A 1080 Ti wasn't fast enough for 4K @ 144Hz. A 2080 Ti isn't fast enough for 4K @ 144Hz. Get over the hill and then maybe you can think about bending consumers over a tree stump, but don't just put some paperweight out there that doesn't even clear the next hurdle but costs a lot more. What's really accentuating this problem is that video games aren't actually programmed for PCs anymore but are just lazy console ports, which means none of them actually require these new expensive cards.

When Quake came out in 1996, you had to upgrade if you even wanted to play it acceptably. You needed a Pentium. If you wanted to play Quake 2 at 640x480 or above, you had to buy a 3dfx card. It wasn't a choice. Up until Microsoft paid off Western developers to basically abandon native PC development for the original Xbox, games were still actually made for bleeding-edge PC hardware. They're not anymore. There's no ballistic game that requires a 2080 Ti to even run playably.


4K at 144? Set the settings to max and even the 2080 Ti often fails to reach 60 fps at 4K.

https://www.eurogamer.net/articles/...-first-radeon-7-benchmark-results-in-25-games
 

Lort

Banned
That's my point. This batch of GPUs was fucking weak. They brought nothing new to the table. A 1080 Ti can basically do 60Hz @ 4K. These new ones can't do 120Hz @ 4K, no one cares about ray tracing yet, etc. It's fucking trash, especially for the price.

Yup.
 

Redneckerz

Those long posts don't cover that red neck boy
RTX ray tracing is clearly not powerful enough for great hybrid rendering. The one game supporting it only traces reflections and even in that case it's heavily supported by screen space reflections.
Perhaps because RTX support was an afterthought for that game, as it was experimental code?

Yes, I would also be really shocked that a game that was not built around RT would use SSR, and I would also be shocked that they only used reflections, given the code was still experimental.

You can't honestly be downplaying RTX simply because one game has it as an afterthought, which was really the only way they could add RT support quickly, since at the time BFV was conceived you obviously had to rely on SSR to begin with.

Artifacts from low ray count will be even more obvious in games that use it for shadows. I imagine it will be a novelty in a handful of games that most will turn off for performance.
This is literally fear-mongering, based on a single title that was not even built with the tech in mind. I am not even going to bat for the RTX cards, as I find their price premium bizarre, but come on.

I bought a 2080 Ti because of its 4K performance. I've been playing games on my 4K TV at a locked 60fps. It's great. But for me those extra features are just bad in their current incarnations. DLSS is more suited to the 2080, which could get good performance with some overhead at 1440p but not at 4K, which highlights the fact that DLSS is not flexible. An RTX 2070 or 2060 still has to run at 1440p, "scaled" with AI to 4K. So the 2070 will get like high 30s with DLSS in Final Fantasy XV, and good luck on the 2060. Hooray for DLSS.
  • Not understanding that RT has a significant perf impact yet has been improving since launch.
  • Not understanding that DLSS is still early-days code.
New tech, teething problems. A concept long familiar in most sectors of development.

Someone should tell Jensen that it's unsportsmanlike to kick a man when he's already down on the ground.
In this case, a woman, depending on who you ask.

To be fair, the performance of the 2080 Ti is lousy for what it costs. I seriously expect a GPU to smoke pretty much every game at 4K @ 120Hz for $1,200.
But as most users know, what you expect has nothing to do with reality. 4K120 for $1,200? Yeahokay.gif.

You can blame Nvidia for a lot of things, but not understanding the orders of magnitude by which performance has to increase as resolutions go higher is just being naive.

That's my point. This batch of GPUs was fucking weak. They brought nothing new to the table. A 1080 Ti can basically do 60Hz @ 4K. These new ones can't do 120Hz @ 4K, no one cares about ray tracing yet, etc. It's fucking trash, especially for the price.
Except they did bring Tensor cores and RT cores to the table. And no one cares, because the tech is still in active development.

I'll give you the price, but if it were cheaper, I can imagine you would be praising it to high heaven.
 
I mean, RT and DLSS are good advancements. I don't think anyone would say they aren't. The problem is how much we're paying for that. It just doesn't make sense.

How much did we use to pay for a high-end GPU some 5 years ago? And 10? It shouldn't have skyrocketed the way it did.
 

thelastword

Banned
The problem is that none of the expensive cards actually provide the performance people want, so they're a waste of money. There are certain thresholds for cards.

* This card is fast enough to do X resolution at Y framerate

The problem with the $1,200 cards is that they're $500 more expensive but don't cross the next threshold. A 1080 Ti wasn't fast enough for 4K @ 144Hz. A 2080 Ti isn't fast enough for 4K @ 144Hz. Get over the hill and then maybe you can think about bending consumers over a tree stump, but don't just put some paperweight out there that doesn't even clear the next hurdle but costs a lot more. What's really accentuating this problem is that video games aren't actually programmed for PCs anymore but are just lazy console ports, which means none of them actually require these new expensive cards.

When Quake came out in 1996, you had to upgrade if you even wanted to play it acceptably. You needed a Pentium. If you wanted to play Quake 2 at 640x480 or above, you had to buy a 3dfx card. It wasn't a choice. Up until Microsoft paid off Western developers to basically abandon native PC development for the original Xbox, games were still actually made for bleeding-edge PC hardware. They're not anymore. There's no ballistic game that requires a 2080 Ti to even run playably.
I can agree with lots of this, but I analyze everything on a case-by-case basis, and I think Nvidia's tech is simply not worth it. At least AMD offers me 16GB of HBM and crazy bandwidth better suited to 4K... at least they're putting their money where their mouth is when presenting the Radeon VII as a 4K card. Yet Nvidia wants to give me an 8GB card with cheaper GDDR6 memory instead of HBM, with 448GB/s of bandwidth, for $700, and ask me $100 more for the FE version at $800 for the privilege of OC'ing that bad boy? Yet the FE only matches the boost clocks of the reference Radeon VII when OC'd... so $800 for what exactly? For cheap cellphone RT hardware, which did in excess of 6 gigarays back in 2016? Hybrid ray tracing in one game, where some parts of the map are selectively ray traced with lots of rasterization still in place, a noisy image, and a performance and resolution executioner at that... At least I know HBM is expensive, so for the amount and the bandwidth it brings, I think it is well worth it in comparison... Radeon wins over NV with less stutter every time.
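Taking the figures quoted above at face value ($700 for both cards, 448GB/s for the 2080's GDDR6, and the Radeon VII's advertised ~1TB/s of HBM2), here is a quick per-dollar tally of the memory argument only, not a verdict on either card:

```python
# Per-dollar tally of the memory specs quoted above. Ignores features, drivers,
# power draw, and actual game performance entirely.

cards = {
    # name: (price_usd, vram_gb, peak_bandwidth_gbs)
    "RTX 2080 (8GB GDDR6)":   (700, 8, 448),
    "Radeon VII (16GB HBM2)": (700, 16, 1024),
}

for name, (price, vram, bandwidth) in cards.items():
    print(f"{name}: {vram / price * 100:.2f} GB of VRAM per $100, "
          f"{bandwidth / price:.2f} GB/s of bandwidth per dollar")
```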

All NV has done in this industry is cripple it, cripple price-to-performance... introduce lots of proprietary tools, software, and of course hardware (G-Sync chips in monitors), and then go on to monopolize the industry. GameWorks (set up to kill AMD performance); then people go on forums saying NV just crushes AMD in perf, when AMD cards have always been better on paper: more raw power, higher TFLOPS, etc. They pretend to be oblivious to the real issue and to why AMD's perf has been behind NV all these years. Yet things are changing; people are finally starting to see that power utilized when AMD has not been hamstrung by GameWorks, DX11, etc. Vulkan and DX12 have shown the power of AMD cards a bit better recently. So you wonder why a game like Assassin's Creed performs so badly on AMD, why FF does... then you look at FC5 and it performs great on AMD cards, you look at The Division and it performs great there; you look at your Forzas, Dirt, Battlefields, Strange Brigade and you see the uplift over Nvidia's cards. And it's a good thing AMD is partnering with game companies on select titles, not to cripple NV cards but to extract the best performance they can get from theirs.

If they did not do that, NV would just continue to riddle us with performance-defeating features like PhysX, HairWorks, the entire GameWorks portfolio ("smokeworks" included), other features like HBAO+ and VXAO, and the list goes on. I'm all for advancements in technology, but if you can't have those things enabled at playable framerates and at my current playing resolution, be gone with it; it's not ready for primetime. So NV calls on me to spend hundreds of dollars, and the features they boast about and tout those cards on, everybody just turns off for better perf and rez anyway, so what's the point? But no: buy this 2080; if it does not have enough power to enable those performance-crippling features, buy the Ti at $1200; not enough power still? No problem, we have a $2499 RTX Titan. They just hook you in the web, well, some people at least, and make you spend exorbitant amounts of cash on the premise of great new technology, which most will turn off, and they keep you spending more if you want to boast about GPU e-peens online.

Yeah, no bones about it, NV marketing is something else; it works, and most of their fans are the ones who do the most effective marketing for them, buying their cards. Even those on 1050 Tis pretending they're on 2080 Tis come online, even more so, to trash any AMD product that runs circles around their low-end NV cards, but they do not see the damage they're doing to the industry at large. Yet nothing lasts forever, and enough people have come forward to speak against the bewitching spell, or rather curse, NV has placed on this industry. The 3.5GB fiasco on the 970 was only the start of it, the articles on GameWorks were another, then that GPP thing really turned heads and caused a riot, and then the lackluster Turing cards at such exorbitant prices were the last straw. It's no wonder Jensen is spooked... that lawsuit doesn't help either.

As for the PC situation... I too miss the old days of devs going all-out to max out PC hardware. I miss Crytek, I miss the Monolith of old (F.E.A.R.). I think Far Cry (2004) from Ubisoft was a great accomplishment on PC as far as tech was concerned, and it was a time when you really took notice of PC hardware and what it could do. But right now it's just Nvidia implementing ridiculous features in games just to kill framerate and rez, instead of devs actually maxing out PC hardware smartly. At least with those games you saw the advancements, you saw the accomplishments, and you didn't mind upgrading to experience those titles. Hell, even I upgraded to an 8800 GTX Ultra back then to experience Crysis, and I still barely got a stable 30fps at 1280x1024. Yet don't people remember how expensive those cards were? They were easily over $600; I think the MSRP for the Ultra was over $800 at debut.

Still, that type of PC development died because people just pirated games to no end; Crytek and Ubisoft had huge issues with that in the early and mid 2000s. All the Skidrow releases, etc., did a number on PC development. What's the point of putting all that effort into PC development if people just download a torrent? Remember COD 1 (2003)? I played that on PC, along with United Offensive. That franchise only took off from COD 2 on the 360 and went bananas with Modern Warfare. So consoles offered something PC didn't; that's why the devs left. I still remember being wowed by Doom 3's reveal; that first trailer blew everyone away. I also remember the Half-Life 2/Source reveal; it was out of this world, the graphics and the physics combined. Those times with PC devs are gone; it's console devs doing the wowing now with much inferior kit. I'm just happy that next gen, consoles can finally have a decent CPU and GPU, and it's thanks to AMD. Sorry NV, but the last time NV gave Xbox a GPU, that ended in a fracas. The PS3 had to cut costs by not going the double-Cell route; they went to Nvidia and got a lousy GPU at a serious markup. Now AMD is feeding both consoles their technology and going deep into R&D to make powerful APUs and high-performance systems in SFFs. Thank heavens for AMD, because if consoles had to depend on NV or even Intel, we'd be in a lot of trouble... a 4-core Pentium with a 1070-class GPU for next gen, perhaps... Pffft! At a serious markup too?

He isn't wrong. AMD isn't really bringing anything new to the table.
What new tech is Nvidia bringing? You know there were ray-traced games and devs/companies utilizing that technology way before BFV or Nvidia, even on consoles. The hybrid reflection solution you see in BFV is not even impressive... and I hope no one is saying DLSS is new; it's just upscaling through AI cores. The image is worse, and if you just run your PC at native 1440p, you get much more performance. It's crazy, because when AMD/Sony started going the reconstruction route with checkerboard rendering, they never claimed you should go out and get an $800 GPU or console for such a feature. Soon people will try to justify RTX cards because they give you 16xAF.
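On the "DLSS is upscaling from a lower internal resolution" point, the pixel math behind the argument is simple to sketch. Treating 1440p as the internal resolution for a 4K target is an assumption; the actual resolution varies per game and quality mode:

```python
# Pixel counts behind the "render at 1440p, upscale to 4K" argument.
# 1440p as the DLSS internal resolution for a 4K target is an assumption;
# the real internal resolution varies per game and quality mode.

def megapixels(width: int, height: int) -> float:
    return width * height / 1e6

native_4k = megapixels(3840, 2160)   # ~8.29 MP per frame
internal  = megapixels(2560, 1440)   # ~3.69 MP per frame

print(f"4K native:      {native_4k:.2f} MP/frame")
print(f"1440p internal: {internal:.2f} MP/frame "
      f"({internal / native_4k:.0%} of native pixel work, before upscaling cost)")
```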

Food for thought...






If you feel the need to talk about the competition, then they actually are competition.
Succinct. I even heard from a little bird that Jensen cashed in over 100,000 of his personal shares and raked in 18 million dollars when Nvidia stock skyrocketed early last year... just after he did the Todd Howard to investors, btw.

 