
Nvidia is basically a giant advertisement for AMD at this point.

64bitmodels

Reverse groomer.
I don't get it, man. The 4080 12 GB model is on a completely different die than the 16 GB model, with less memory bandwidth and fewer CUDA cores (which basically confirms it was never intended to be a 4080, more likely a 4070).

Not to mention these prices... $900 for an xx80-series card. What the hell, Nvidia??? The 1080 and 980 weren't budget by any means, but their prices were very reasonable and manageable; hell, the 3080 had good bang for the buck. Two years later, and this GPU mining craze has made Jensen go friggin' mad.

That power draw is just abhorrent, too. 320 W for the 4080 alone. 320 W!!! Can Nvidia not optimize their cards for shit??? The 1080 was 180 watts! Three gens later and it's nearly doubled. Isn't tech supposed to get smaller, less power hungry and better as time goes on? What happened here???

And now DLSS 3.0 is 4000-series exclusive, so good luck, 3000-series guys. You won't get all the new fancy features, accuracy improvements and bug fixes; you're stuck with the old version, which will without question become outdated in a few years, while DLSS 3.0 thrives and keeps on living until the next DLSS comes along and ends support for that too.

Meanwhile, AMD is actually experimenting and innovating with their multi-chip GPU design, which will hopefully deliver similar performance to Nvidia's offerings at a fraction of the power budget, and their AI solution works on everything. Not to mention their cards will likely be cheaper, because it's not hard to be cheaper than this Apple bullshit Nvidia is pulling.

Jensen Huang has lost his goddamn mind. And if you're gonna drop $1,199 on an xx80-series GeForce, you have too.
 
Last edited:

GymWolf

Member
AMD has a golden opportunity on their hands to steal Nvidia's lunch money, but Nvidia is still most probably gonna sell out everything in the first hour, so they're not gonna learn the lesson immediately.

We can only hope that the less-than-crazy enthusiasts (like me and many other gaffers) are gonna refrain from buying everything when the second wave of GPUs becomes available after the initial shortage.

The first batch is probably sold out already.
 
Last edited:

Cryio

Member
The 4080 12 GB is so crippled in core count vs the 4090 that we might as well call it a 4060.

Can't wait for reviews for both RX 7000 and RTX 4000 GPUs. It's going to be a bloodbath.

Lovelace only upped core count by some ~70% vs Ampere, while RDNA 3 is up over 100% vs RDNA 2 (12,200-something vs 5,120 on the 6900 XT).
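A quick back-of-the-envelope check on those percentages (a minimal sketch; the Ampere/Lovelace numbers are the commonly cited full-die counts and the Navi 31 figure is the rumor quoted above, so treat them all as placeholders rather than confirmed specs):

```python
# Rough gen-over-gen core-count increases, using commonly cited full-die figures
# plus the rumored Navi 31 count quoted above (placeholders, not confirmed specs).
def pct_increase(old: int, new: int) -> float:
    return (new - old) / old * 100

ampere_cuda = 10_752      # GA102, full die (RTX 3090 Ti)
lovelace_cuda = 18_432    # AD102, full die
rdna2_sps = 5_120         # Navi 21 (RX 6900 XT) stream processors
rdna3_sps = 12_288        # rumored Navi 31 figure ("12,200-something")

print(f"Lovelace vs Ampere: +{pct_increase(ampere_cuda, lovelace_cuda):.0f}%")  # ~71%
print(f"RDNA 3 vs RDNA 2:   +{pct_increase(rdna2_sps, rdna3_sps):.0f}%")        # ~140%
```

Worth noting that RDNA 3's ALUs are reportedly dual-issue, so the raw count isn't directly comparable to RDNA 2's, and raw core-count growth won't map 1:1 to performance anyway.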
 

Leonidas

Member
What features does Nvidia have that warrant taking out a mortgage on a GPU?
Decent RT and better upscaling tech for those who are willing to pay a slight premium over AMD. No one knows the price of RDNA 3; knowing AMD, it will be almost as bad as the 40-series in pricing.
 
Last edited:

Wildebeest

Member
If you are running them for purposes like AI research or a crypto-scam mining farm, then I think the newer high-power-consumption cards actually do more work per watt. For a consumer it's more of a qualitative choice: do I really need to run Destiny 2 on a huge 8K TV at 240 fps, or could I cope with less?
 
Last edited:

Reallink

Member
I don't know. I fell for the AMD mystery hype trap last time, and missed out on getting a 3080 because I dropped my preorder to see what AMD was doing. Is there any serious info out there that makes it worth doing that again?

The only info you need is that they're letting Nvidia sell 4090s for weeks before they even announce anything. That should tell you all you need to know.
 

theclaw135

Banned
Decent RT and better upscaling tech for those who are willing to pay a slight premium over AMD. No one knows the price of RDNA 3; knowing AMD, it will be almost as bad as the 40-series in pricing.

I find it funny that power-guzzling CF/SLI setups costing thousands still don't upscale as seamlessly or as conveniently as the Xbox 360 did.
 
Last edited:

twilo99

Member
I don't get it, man. The 4080 12 GB model is on a completely different die than the 16 GB model, with less memory bandwidth and fewer CUDA cores (which basically confirms it was never intended to be a 4080, more likely a 4070).

 

SmokedMeat

Gamer™
Seeing Jensen in his little leather jacket, holding out that big brick - like we’re all supposed to swoon and give thanks. I just want to slap the shit out of him.

AMD has a golden opportunity to take away market share if they don't get greedy.

I’m both excited to see what AMD pulls off, and hoping Nvidia’s new cards bomb after the FOMO/Scalper crowd gets theirs.
 
RDNA 3 will edge out or even exceed Lovelace in rasterisation.

Vice versa, Lovelace will edge out RDNA 3 on ray tracing.

But over 2x RDNA 2's RT performance would be a massive improvement for RDNA 3.

I couldn't care less about cranking RT to the max, especially considering its lack of implementation in major titles. So rasterisation, efficiency and value are what I will be looking for.

It's on AMD now to not fuck it up.
 

64bitmodels

Reverse groomer.
If only AMD didn't suck at ray tracing, ML and a DLSS equivalent...
FSR has closed the gap with DLSS significantly, machine learning isn't all that important when you're talking about video games, and RT still isn't enough of a noteworthy feature to be a deal breaker. Also, AMD could easily level up their RT performance with these new cards.
 

CamHostage

Member
I don't know... are you seeing these RTX 4000 demos? For crazy-expensive video cards, they're at least showing crazy-good results.

I'd like to believe that Nvidia's hubris could be usurped by a worthy competitor, but their shit is pretty danged impressive, and the competition has not been making me too happy (although I'm a cheapo, so I can only afford the competition). When both new consoles moved away from Nvidia to AMD, I was still hesitantly worried that Nvidia had an edge, but figured that was the time to let go of that assumption... Then DLSS started making waves and other Nvidia features were really impressive (plus Omniverse, which isn't about the GPU, but I'm otherwise amazed at what they can do with that much data), while AMD has been playing catch-up, sort of getting close but not really getting there, and these new consoles have been slow to deliver something out-of-this-world (albeit no PC game on Nvidia has done better, since they're all made to run across systems). Maybe they're closer in parity than I think, but I can't help but wonder sometimes what the PS5 or Xbox Series would have looked like if they had GeForce GPUs instead of Radeons.

Whenever I see Nvidia announce something, I tend to go, "Ooh, this is a cool future... if only I could afford it", and when I see AMD announce something, my reaction tends to be, "Oh good, they're finally doing that too..."

..But that's just me from an outside perspective. I'm a laptop guy, so I don't have big graphics cards from either of these titans and don't really look at these things a lot aside from thinking of it in a console owner's perspective or as future outlook. Yet, from what I see, I would still buy one of these Nvidia monsters, if I had all the money in the world.
 
Last edited:

rodrigolfp

Haptic Gamepads 4 Life
FSR has closed the gap with DLSS significantly, machine learning isn't all that important when you're talking about video games, and RT still isn't enough of a noteworthy feature to be a deal breaker. Also, AMD could easily level up their RT performance with these new cards.
FSR is still far from DLSS, which also keeps improving. And I expect RT to be in almost every current-gen game going forward, so...
 

CamHostage

Member
Nvidia makes some amazing demos, but they are not representative of real-world game performance. They're just there to look really good. And I still don't see what features Nvidia has added that, again, warrant spending a grand on a graphics card.

Sure, graphics demos aren't games, I get you there.

But NVIDIA Racer RTX will be playable this November (...for whoever can afford and find an RTX 40-series card), so we'll see how representative it is of real-world performance through 'gameplay' in just a few months.

And if AMD has any playable demo as lifelike and rich in naturalistic effects as Marbles at Night (from 2020), my eyes would love to see it.

 
Last edited:

Buggy Loop

Member
I mean, I'm all for the price disgust, but...

Not to mention these prices... $900 for an xx80-series card. What the hell, Nvidia??? The 1080 and 980 weren't budget by any means, but their prices were very reasonable and manageable; hell, the 3080 had good bang for the buck. Two years later, and this GPU mining craze has made Jensen go friggin' mad.

And AMD followed Ampere's pricing when they had an opportunity to grab market share with respectable performance. Even worse than Nvidia, since their MSRP was a pure paper launch and AIB cards were +$100 to +$150 over their Ampere counterparts with the same goddamn coolers. The 980? Dude, that's ages ago in the silicon world. TSMC can just say fuck off if you don't want to pay the price; they've got more than enough demand.

That power draw is just abhorrent, too. 320 W for the 4080 alone. 320 W!!! Can Nvidia not optimize their cards for shit??? The 1080 was 180 watts! Three gens later and it's nearly doubled. Isn't tech supposed to get smaller, less power hungry and better as time goes on? What happened here???

They're hitting the limits of what's possible with silicon wafers. You think the lithography improvements where they claim "30% less power usage than the previous node" apply in this competitive field? That only holds if you keep the same number of transistors and just move to the new node. AMD and Nvidia cram in the maximum number of transistors their architectures allow; these GPU dies have never been denser. AMD drew maybe ~20-40 W less with RDNA 2, largely from picking slower memory offset by a huge SRAM bank, and that die area could have gone to more CUs and let them wipe the floor with Nvidia at rasterization that gen. Nobody would point to a 20-40 W saving on their setup as a reason to gimp a clear contender to Nvidia.
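To put numbers on that, here's a minimal sketch with purely illustrative figures: a node that cuts per-transistor power by ~30% at the same clocks, and a new die that packs ~70% more transistors.

```python
# Back-of-the-envelope: node power savings get eaten by transistor-count growth.
# Both numbers below are illustrative assumptions, not measured figures.
node_power_scaling = 0.70   # new node: ~30% less power per transistor at the same clocks
transistor_growth = 1.70    # new die: ~70% more transistors than the previous gen

relative_power = node_power_scaling * transistor_growth
print(f"Relative power vs previous gen: {relative_power:.2f}x")      # ~1.19x -> power goes UP

# Only a same-transistor-count shrink would actually show the headline saving:
print(f"Same design on the new node:    {node_power_scaling:.2f}x")  # 0.70x
```

And that's before the clock bumps, which push voltage and power up even further.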

And now DLSS 3.0 is 4000-series exclusive, so good luck, 3000-series guys. You won't get all the new fancy features, accuracy improvements and bug fixes; you're stuck with the old version, which will without question become outdated in a few years, while DLSS 3.0 thrives and keeps on living until the next DLSS comes along and ends support for that too.
Under the hood, DLSS 3 will work on all RTX cards with all its temporal improvements, just without the intermediate frames, which apparently add a lot of latency anyway.

Meanwhile, AMD is actually experimenting and innovating with their multi-chip GPU design, which will hopefully deliver similar performance to Nvidia's offerings at a fraction of the power budget, and their AI solution works on everything. Not to mention their cards will likely be cheaper, because it's not hard to be cheaper than this Apple bullshit Nvidia is pulling.
I mean, I don't want to sound pessimistic, but MCM has been a known option for a long time now. It's a matter of finding the right optimization point where your architecture and the foundry's density favor MCM over monolithic. One manufacturer going MCM before the other isn't necessarily a clear win; don't let the buzzword fool you. While you get less wafer scrap, since the smaller dies are easier to make, you also need more complex interconnects to link them, which adds cost and will never be as fast as a monolithic solution.
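To illustrate the wafer-scrap point, here's a minimal sketch using a simple Poisson yield model; the defect density and die sizes are made-up assumptions, just to show why smaller dies waste less silicon per defect.

```python
import math

# Simple Poisson yield model: yield = exp(-defect_density * die_area).
# Defect density and die areas are made-up assumptions for illustration only.
defect_density = 0.1          # defects per cm^2

def die_yield(area_cm2: float) -> float:
    return math.exp(-defect_density * area_cm2)

monolithic = 6.0              # one big ~600 mm^2 die
chiplet = 1.5                 # ~150 mm^2 chiplets covering the same total silicon

print(f"Monolithic: {1 - die_yield(monolithic):.1%} of die area scrapped")  # ~45%
print(f"Chiplet:    {1 - die_yield(chiplet):.1%} of die area scrapped")     # ~14%
# Chiplets can be tested before packaging, so products are assembled only from
# known-good dies; a defect costs you one small chiplet, not the whole big die.
```

The flip side, as said above, is the interconnect: the cross-die links cost packaging complexity, power and latency that a monolithic die simply doesn't pay.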

Btw, they don't have an AI solution. They have no AI.

Jensen Huang has lost his goddamn mind. And if you're gonna drop $1,199 on an xx80-series GeForce, you have too.

And if you think AMD will offer competitive tech for $400 less than Nvidia just to be the "good guy" or to prop up the creepy Lisa Su cult, you're out of your goddamn mind. They had the opportunity with RDNA 2; that's all the proof I need to say it ain't happening. They also said they wouldn't have a paper launch... well, they have 1.8% of the Steam hardware survey, so where are the cards?



I'm all for AMD stepping up and forcing Nvidia to lower prices. AMD did not do it for RDNA 2; they seem content to have a niche market share. Console margins must be so nice that they don't care about PC.
 
Last edited:

JohnnyFootball

GerAlt-Right. Ciriously.
AMD has no motivation to lower their prices. They will probably wait and see how quickly these cards sell out. If they sell out and are hard to find for a while, then you can expect AMD to price their cards similarly. That's a fact.
 

RoadHazard

Gold Member
Sure, graphics demos aren't games.

But NVIDIA Racer RTX will be playable this November (...for whoever can afford and find an RTX 40-series card), so we'll see how representative it is of real-world performance through 'gameplay' in just a few months.

And if AMD has any playable demo as lifelike and rich in naturalistic effects as Marbles at Night (from 2020), my eyes would love to see it.



Have they confirmed that Racer is actually a playable thing and not just a real-time movie?
 
I don't get it, man. The 4080 12 GB model is on a completely different die than the 16 GB model, with less memory bandwidth and fewer CUDA cores (which basically confirms it was never intended to be a 4080, more likely a 4070).

Not to mention these prices... $900 for an xx80-series card. What the hell, Nvidia??? The 1080 and 980 weren't budget by any means, but their prices were very reasonable and manageable; hell, the 3080 had good bang for the buck. Two years later, and this GPU mining craze has made Jensen go friggin' mad.

That power draw is just abhorrent, too. 320 W for the 4080 alone. 320 W!!! Can Nvidia not optimize their cards for shit??? The 1080 was 180 watts! Three gens later and it's nearly doubled. Isn't tech supposed to get smaller, less power hungry and better as time goes on? What happened here???

And now DLSS 3.0 is 4000-series exclusive, so good luck, 3000-series guys. You won't get all the new fancy features, accuracy improvements and bug fixes; you're stuck with the old version, which will without question become outdated in a few years, while DLSS 3.0 thrives and keeps on living until the next DLSS comes along and ends support for that too.

Meanwhile, AMD is actually experimenting and innovating with their multi-chip GPU design, which will hopefully deliver similar performance to Nvidia's offerings at a fraction of the power budget, and their AI solution works on everything. Not to mention their cards will likely be cheaper, because it's not hard to be cheaper than this Apple bullshit Nvidia is pulling.

Jensen Huang has lost his goddamn mind. And if you're gonna drop $1,199 on an xx80-series GeForce, you have too.
Power usage of CPUs and GPUs is rising so much because of the fierce competition. Both parties want the performance crown; that's why they keep raising the power draw.

If we as consumers stopped buying overpriced and overspecced products, the industry would have to create far more efficient designs.

And they kind of do. Look at the efficiency gains they are showing.
There's no need to run those components at max frequencies and so on.

I am very interested to see these new CPUs and GPUs underclocked to a reasonable power draw and compared to the previous gen at the same power usage.

I was thinking about buying an AMD system and underclocking both the CPU and GPU most of the time. I can easily play most games at 1080p/60.
And if I need more performance, I'll just change the settings and let both parts draw more power.

Might be unhealthy for the power supply, though. I haven't had the need or time to mess with overclocking/underclocking yet, but I will deep-dive into these topics once the new CPUs and GPUs are available.
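For anyone wanting to run that kind of same-power comparison themselves, here's a rough sketch of the idea using Nvidia's nvidia-smi power-limit controls (the caps are arbitrary examples, the benchmark step is a placeholder for whatever game or tool you'd actually measure with, setting the limit needs admin rights, and AMD cards would need their own tooling instead):

```python
import subprocess

def set_power_limit(watts: int) -> None:
    # nvidia-smi -pl sets the board power limit (requires admin privileges)
    subprocess.run(["nvidia-smi", "-pl", str(watts)], check=True)

def read_power_draw() -> str:
    # Query the current board power draw as reported by the driver
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=power.draw", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    return out.stdout.strip()

if __name__ == "__main__":
    for cap in (320, 250, 200):   # example caps in watts, not recommendations
        set_power_limit(cap)
        # run your benchmark of choice here and log average FPS at this cap,
        # then compare FPS per watt across generations at the same cap
        print(f"{cap} W cap, reported draw: {read_power_draw()}")
```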
 
Last edited:

CamHostage

Member
Have they confirmed that Racer is actually a playable thing and not just a real-time movie?

I was curious about that too.

Taking control of realistically rendered and physics-accurate RC cars, you'll navigate around four unique sand-box style environments, letting you freely interact with all the physically modeled objects scattered about. Each environment is filled with photo-real objects, simulated in real time in NVIDIA Omniverse, a platform for 3D content creation and collaboration. The Universal Scene Description (USD)-based levels, composed of 1,811 hand-modeled, textured and simulated assets, were built in 3 months by NVIDIA artists across 12 time zones using Omniverse. Artists collaboratively contributed to the shared Omniverse world using their preferred design and content creation tools, such as Autodesk 3ds Max, Maya, Blender, Modo, Maxon ZBrush, Adobe Substance 3D Painter, Substance 3D Designer, Photoshop, Illustrator, Rizom UV, and SideFX Houdini, achieving interoperability via the USD file format.

Each environment is rendered exclusively with full ray tracing, running at 60 FPS at 4K thanks to the power of GeForce RTX 40 Series graphics cards and NVIDIA DLSS 3’s game-changing Optical Multi Frame Generation technology. NVIDIA RTX Direct Illumination (RTXDI), NVIDIA Reservoir Spatio Temporal Importance Resampling Global Illumination (ReSTIR GI), multiple light bounces, and high ray counts create worlds with the most accurate shadows and realistic lighting ever seen in a real-time game. And in an instant, the time of day can be changed, updating world lighting accordingly, and switching-on lamps and other artificial light sources at night, transforming the scene.

https://www.nvidia.com/en-us/geforce/news/dlss3-supports-over-35-games-apps/#racer-rtx
 
Last edited:

Ironbunny

Member
Well, I'm hoping AMD will do for GPUs what they did with chiplets on the CPU side. Nvidia can start pulling an Intel and whine about AMD using glue. It's confirmed that RDNA 3 will use a chiplet design, but the performance is anyone's guess.

I was curious about that too.

Each environment is rendered exclusively with full ray tracing, running at 60 FPS at 4K thanks to the power of GeForce RTX 40 Series graphics cards and NVIDIA DLSS 3

https://www.nvidia.com/en-us/geforce/news/dlss3-supports-over-35-games-apps/#racer-rtx

So that's 20 FPS native, and 60 FPS with frame interpolation?
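For what it's worth, here's a rough model of how that arithmetic could shake out, assuming frame generation inserts one generated frame per rendered frame (the native and upscaled rates below are guesses, not disclosed numbers):

```python
# Illustrative DLSS 3 arithmetic only; the rates below are guesses, not Nvidia figures.
native_fps = 20                     # hypothetical native-res, fully ray-traced rate
upscaled_fps = 30                   # hypothetical rate after DLSS super resolution
displayed_fps = upscaled_fps * 2    # assuming one generated frame per rendered frame

print(f"{native_fps} native -> {upscaled_fps} upscaled -> {displayed_fps} displayed")
```

So the displayed 60 FPS could indeed come from a much lower internally rendered rate, which is presumably why the latency question keeps coming up.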
 

CamHostage

Member
Hmm, why isn't the video 60fps if it can actually run at that? Would be a lot more impressive.

It's not uncommon, unfortunately. (The sales reps who upload YT clips are not the same as the tech engineers capturing them or the video staff cutting them.) It looks like you can upload a video to YT all the way up to 4K / 2160p @ 60 fps, but there are compromises, and the bitrate doesn't seem to scale up enough to fully account for the extra video data.

Unfortunately, we're stuck with Youtube being the only way in the whole world to host a video...
 
Last edited:

IDKFA

I am Become Bilbo Baggins
I'm looking to buy a PC, but I'm not paying Nvidia's insane prices. If AMD launch their new cards at a few hundred quid cheaper than Nvidia, then I'm all in on AMD.

If AMD are thick as shit and price their cards the same as Nvidia, then I'll just buy an Xbox Series X and be done with it.
 

Buggy Loop

Member
I'm looking to buy a PC, but I'm not paying Nvidia's insane prices. If AMD launch their new cards at a few hundred quid cheaper than Nvidia, then I'm all in on AMD.

If AMD are thick as shit and price their cards the same as Nvidia, then I'll just buy an Xbox Series X and be done with it.

Strange logic

I too find these prices insane, but a 3060 Ti would still cover most of your needs (especially compared to consoles) for the gen. Why create a nonexistent problem by aiming for the newest generation's top-tier cards, only to go back to consoles if prices are "hundreds" away from what's already there?

(GIF: Seth Meyers "whatever" reaction)
 

MacReady13

Member
Been wanting to rebuild a gaming PC for a little while now. Was going to get a 3000 series card but decided to wait for 4000 series. These prices and shady practices have turned me off completely and I will be waiting for AMD to hopefully save the day.
 

SlimySnake

Flashless at the Golden Globes
Eh, they are doing this because AMD has dropped the ball for almost a decade. At the end of the day, they know they will have the better product. Yes, they are charging a premium, but like Apple, that premium typically translates into quality.

Now, the price gouging and deceptive 4070 (or even 4060) pricing has the potential to backfire, but currently I don't know anyone here who would've picked up a 6800 XT over a 3080. DLSS is just that good, and there is no comparison when it comes to ray tracing performance, which should become the norm going forward.

Oh, and AMD also inflated their prices. The $450 6700 XT was always in stock at Microcenter, and yet they kept it at $899, banking on GPU-starved consumers coming in and making a bad impulse purchase. To me, that's just as disgusting as what Nvidia is doing. AMD had an amazing opportunity to keep selling these GPUs at MSRP, where they would've been a far better value than Nvidia cards going for a $500 markup. But AMD simply decided to mark up their own cards instead of selling them to buyers at the affordable price they themselves set at the RDNA 2 reveal.

Oh, and 320 watts for the 4080 is exactly what my 3080 pulls. In fact, my 3080 FTW3 12 GB went over 400 watts pretty much all the time. I thought the wattage was going to be in the 600 range.
 
Last edited: