
AMD is Starfield’s Exclusive PC Partner

Buggy Loop

Member
If you think it's that easy, then why do you think NVidia, Intel and AMD have SDKs for their tech? And specific plugins for game engines like Unreal and Unity.
The reality is that it's a lot more complicated than what you try to make it seem.

Because the SDK is basically the sauce for what to do with the same motion vector inputs? What the hell, how do you think it's supposed to work?

Again...



What's anyone's excuse? A single guy working on Mario 64 RT implemented it mere hours after the SDK was made public. That's the reality. That's also what Nixxes is saying. The reality you're trying to portray is pulled out of thin fucking air.
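To be concrete about what "the same inputs" means: DLSS 2, FSR 2 and XeSS all consume the frame's colour, depth, motion vectors and camera jitter at the internal resolution. Here's a rough sketch of the kind of abstraction an engine can sit on top of; the struct and class names are made up for illustration, not the actual NGX/FidelityFX/XeSS APIs:

```cpp
// Hypothetical engine-side abstraction, not any vendor's real SDK.
// The point: every temporal upscaler wants the same per-frame inputs.
struct UpscalerInputs {
    void* color;             // low-res scene colour
    void* depth;             // low-res depth buffer
    void* motionVectors;     // per-pixel motion vectors
    float jitterX, jitterY;  // sub-pixel camera jitter used this frame
    float deltaTimeMs;       // frame time
    unsigned renderWidth, renderHeight;    // internal resolution
    unsigned displayWidth, displayHeight;  // output resolution
};

class IUpscaler {
public:
    virtual ~IUpscaler() = default;
    virtual void Dispatch(const UpscalerInputs& in, void* output) = 0;
};

// Once the engine fills UpscalerInputs each frame, supporting another vendor
// is mostly another thin wrapper mapping these fields onto that SDK's structs.
class Fsr2Upscaler : public IUpscaler {
public: void Dispatch(const UpscalerInputs&, void*) override { /* wrap FSR2 dispatch */ }
};
class DlssUpscaler : public IUpscaler {
public: void Dispatch(const UpscalerInputs&, void*) override { /* wrap DLSS evaluate */ }
};
```

Wiring up a second upscaler on top of that is mostly translating one struct into another vendor's equivalent, which is why you hear hours-to-days, not months.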

NVidia did have a good advantage in tessellation at that time, and they forced ridiculous levels of tessellation. Which hindered performance on all GPUs, even NVidia. But even more on AMD, which, like you said, had lower capabilities.
Here is the performance graph for enabling hairworks. Even on a top end PC of the time, with an NVidia card, the performance loss was huge. But the improvement to visuals was negligible.
People even did mods, reducing tessellation for hairworks without any visual difference and a big performance gain.
[image: HairWorks on/off performance graph]

So script kiddies fixed what AMD couldn't after working with the dev since the beginning? It was all AMD driver in the end? Surprised Pikachu

That's the whole point of DX11 era, driver team could alter the API to whatever they needed. Most of the drivers were actually fixing dev mistakes, something that can't be done anymore with DX12/Vulkan and thus we see the shitshow from some ports. For polaris then they had primitive discard accelerator. Late to the party of course with NV polymorph, while they had a fixed function one before that.

About Crysis 2, yes the sea was culled. But that was not the only object being tessellated. Many more were rendered, with ridiculous levels of tessellation.
Here is an example of a road cement block, that never needed this amount of triangles. Not even close.
Don't try to defend this crap and call other people fanboys, when you ignore all this BS.
I had an NVidia card at the time I played Crysis 2 and could have gotten better performance if it wasn't for this nonsense with tessellation.

[image: Crysis 2 wireframe of a heavily tessellated concrete road block]

Again, these are old, debunked debates. So old that the URL of the thread from the Crytek engineer on their forum doesn't exist anymore. Here's the saved Reddit text:
If you're not going to bother reading the replies then please don't respond with comments like "Crysis2 tessellation is almost garbage." If you want to argue a point, please address the ones already on the table.
  1. Tessellation shows up heavier in wireframe mode than it actually is, as explained by Cry-Styves.
  2. Tessellation LODs as I mentioned in my post which is why comments about it being over-tessellated and taking a screenshot of an object at point blank range are moot.
  3. The performance difference is nil, thus negating any comments about wasted resources, as mentioned in my post.
  4. Don't take everything you read as gospel. One incorrect statement made in the article you're referencing is about Ocean being rendered under the terrain, which is wrong, it only renders in wireframe mode, as mentioned by Cry-Styves.

This is a lame attempt to put this as Nvidia's fault.

Crytek's story is very particular in this case.

With the DX9-only version at launch, they were accused of abandoning PC gamers with Crysis 2. The first game was so infamous for kneecapping PC performance that it served as a benchmark title. At the time Crysis 2 came out, Crytek had not improved their parallax occlusion method, so parallax occlusion mapping (faking surface depth) was a no-go. In Crysis 2 & 3 it had a fatal flaw in the engine: it couldn't be used to define anything that changes the silhouette, only surface detail.

So Crytek at the time went full derp on tessellation. It was the more readily available technology. They sprinkled it on absolutely everything they could, and it became the focus of their marketing campaign, also to push the narrative that their engine could handle tessellation better than anyone else in the industry. The thing is that tessellation is super ineffective: to get fine detail out of it the polycount goes way up, and at the time the performance impact wasn't well documented in the engine.
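To put a rough number on "the polycount goes way up": with uniform tessellation, splitting each patch edge into F segments produces on the order of F² micro-triangles per input triangle, so factor 64 is roughly 4096 triangles where a factor-16 cap is roughly 256. A back-of-the-envelope sketch (simplified model; real D3D11 partitioning modes differ in the details):

```cpp
#include <cstdio>

// Very rough model: uniform tessellation factor F on a triangle patch
// produces about F*F micro-triangles. Real hardware tessellators differ in
// the details, but the quadratic growth is the point.
static unsigned long long approxTriangles(unsigned patches, unsigned factor) {
    return static_cast<unsigned long long>(patches) * factor * factor;
}

int main() {
    const unsigned patches = 10000; // e.g. triangles feeding the tessellator
    for (unsigned f : {1u, 8u, 16u, 64u}) {
        std::printf("factor %2u -> ~%llu triangles\n", f, approxTriangles(patches, f));
    }
    return 0;
}
```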

Then the above controversy happened (with Nvidia at the center of the target, of course...)

And it pushed Crytek to implement POM https://docs.cryengine.com/display/SDKDOC2/Silhouette+POM

A lot of the Crysis 2 issues were driver issues too. If you look at benchmarks done later, AMD magically improved.
http://www.guru3d.com/miraserver/images/2012/his-7950/Untitled-20.png
Crysis 2 used GPU writebacks for occlusion. Nvidia optimized for those and AMD was very late on those optimizations. AMD not supporting Driver Command lists for example (multi-threaded rendering/deferred contexts/command lists) also hurt them in many games.
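For anyone wondering what "GPU writebacks for occlusion" looks like in practice, here's a minimal D3D11-style sketch, assuming the post means occlusion query results being read back by the CPU. The painful part is the synchronous GetData spin, which is exactly where driver behaviour can differ between vendors:

```cpp
#include <d3d11.h>

// Issue an occlusion query around a draw and read the result back on the CPU.
// Spinning on GetData like this forces a CPU/GPU sync; engines usually read
// the result a frame or two later instead.
UINT64 CountVisiblePixels(ID3D11Device* device, ID3D11DeviceContext* ctx)
{
    D3D11_QUERY_DESC desc = {};
    desc.Query = D3D11_QUERY_OCCLUSION;

    ID3D11Query* query = nullptr;
    device->CreateQuery(&desc, &query);

    ctx->Begin(query);
    // ... issue the draw calls for the object being tested ...
    ctx->End(query);

    UINT64 pixelsPassed = 0;
    while (ctx->GetData(query, &pixelsPassed, sizeof(pixelsPassed), 0) == S_FALSE) {
        // busy-wait until the GPU has written the result back
    }
    query->Release();
    return pixelsPassed;
}
```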

Source for this?


In the 2010s, AMD had around 30-35% market share. Not 50%, don't try to inflate things.



[image: AMD vs Nvidia GPU market share chart]


2010 in finer detail:
[image: 2010 market share breakdown]


Supporting a low level API is too much for one company. Even with DX12, it took almost a decade for it to become the norm. And Mantle became Vulkan, so it was no loss.
Not even MS was able to push DX12 to be adopted faster. And they have much bigger budgets to invest in tech than AMD.

Neither Bullet, nor Havok were made by AMD.
And even GPU Physx died out, because neither devs nor gamers cared that much about physics effects in games.
If it wasn't for NVidia sponsoring some games to implement Physx, it probably would not be used in any game.

Not made by them, but part of their open source initiative.



PhysX didn't disappear, it has been in Unreal Engine 4 and Unity for a long time, and doesn't even require the logo to pop up at start anymore. Even the AMD-sponsored Star Wars Jedi: Survivor uses PhysX. Fucking Nvidia and their sponsored games, only way PhysX survives :messenger_tears_of_joy:

So again, what was the problem with PhysX outside of the initial dedicated card (also pre-Nvidia buyout), up to the multi-platform SDK 3 in 2011? How far back do we go for evil Nvidia shenanigans when the competition had no fucking solutions to fight back? Literally at the dawn of new tech and we expect everything to be working on all hardware on the first go, when NovodeX AG's solution was dedicated to begin with? You seem lenient on Mantle implementation time, not so much on the rest of Nvidia's technology suite.

Consoles don't use Mantle. Each has its own proprietary API.
No way AMD, or any company could convince MS or Sony to use some other API.

Not at console level, but the ports. What was the plan then if not that?


AMD sending engineers to studios to make all console ports Mantle, low effort, would have completely changed history as we know it today. AMD might actually have 80% market share.

If they had put in 2 "man months", 2 engineers for 1 month, to lock Mantle into 40 top tier games picked during a year, at say a good $200k salary, it would cost $8M (and probably way less than that, as the engineers would nail down the console → Mantle path)

They would have 1) bypassed any DX11→GameWorks path, 2) while the competition was lowering performance, they would have made it better

They had the full cake of market share for the taking, and they fumbled. AyyMD™
 

winjer

Gold Member
Because the SDK is basically the sauce for what to do with the same motion vector inputs? What the hell, how do you think it's supposed to work?

Again...



What's anyone's excuse? A single guy working on Mario 64 RT implemented it mere hours after the SDK was made public. That's the reality. That's also what Nixxes is saying. The reality you're trying to portray is pulled out of thin fucking air.


That's the point of the SDK, to make things easier and faster for devs.
Without the SDK and plugins it took days to implement. We had devs saying, when DLSS2 came out, that it took several days.

But if it's so easy to implement these upscalers, why are there so many games without FSR2 and XeSS? If all it takes is a few hours, then there is no excuse for DLSS having many more games than FSR2 and XeSS combined.
And let's remember that consoles can only use FSR2. So it would be expected that more games would use it. Unless something was blocking it.

So script kiddies fixed what AMD couldn't after working with the dev since the beginning? It was all AMD driver in the end? Surprised Pikachu

That's the whole point of the DX11 era: the driver team could alter API behaviour to whatever they needed. Most of the driver work was actually fixing dev mistakes, something that can't be done anymore with DX12/Vulkan, and thus we see the shitshow from some ports. For Polaris they then added a primitive discard accelerator. Late to the party of course compared to NV's PolyMorph, which had a fixed function unit for that well before.

Those "script kids" didn't fix AMD's drivers, they tweaked the tessellation scale not to use as many polygons as originally.
They fixed the crap that NVidia and CDPR did with Hairworks and tessellation.
And everyone got a boost in performance, be it users of nvidia of AMD GPUs.

Again, these are old, debunked debates. So old that the URL of the thread from the Crytek engineer on their forum doesn't exist anymore. Here's the saved Reddit text:

This is a lame attempt to put this as Nvidia's fault.

Crytek's story is very particular in this case.

With the DX9-only version at launch, they were accused of abandoning PC gamers with Crysis 2. The first game was so infamous for kneecapping PC performance that it served as a benchmark title. At the time Crysis 2 came out, Crytek had not improved their parallax occlusion method, so parallax occlusion mapping (faking surface depth) was a no-go. In Crysis 2 & 3 it had a fatal flaw in the engine: it couldn't be used to define anything that changes the silhouette, only surface detail.

So Crytek at the time went full derp on tessellation. It was the more readily available technology. They sprinkled it on absolutely everything they could, and it became the focus of their marketing campaign, also to push the narrative that their engine could handle tessellation better than anyone else in the industry. The thing is that tessellation is super ineffective: to get fine detail out of it the polycount goes way up, and at the time the performance impact wasn't well documented in the engine.

Then the above controversy happened (with Nvidia at the center of the target, of course...)

And it pushed Crytek to implement POM https://docs.cryengine.com/display/SDKDOC2/Silhouette+POM

A lot of the Crysis 2 issues were driver issues too. If you look at benchmarks done later, AMD magically improved.
http://www.guru3d.com/miraserver/images/2012/his-7950/Untitled-20.png
Crysis 2 used GPU writebacks for occlusion. Nvidia optimized for those and AMD was very late on those optimizations. AMD not supporting Driver Command lists for example (multi-threaded rendering/deferred contexts/command lists) also hurt them in many games.

So polygons magically appear when using the wireframe. Must be a magical game engine, because others don't do that.
And the loss in performance with tessellation must also be magical. Well, I call BS on Crytek. They screwed up with the tessellation on that game and then tried to shift the blame.

The reason why AMD performance eventually improved is because AMD implemented a limiter on tessellation factors, at the driver level.
That way, in any game where NVidia pushed tessellation to stupid levels, AMD's driver would just render at the necessary factor.

Driver command lists don't affect tessellation performance. The GPU is only sent the draw calls for the original geometry and the instruction to tessellate it; the tessellation itself is done inside the GPU.
And AMD did dabble a bit with driver command lists, for example in Civ V, where they got a nice performance boost.
But you are right in pointing that out as one of the issues with AMD's performance in DX11.
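For reference, "driver command lists" here is the D3D11 deferred-context path: a worker thread records a command list and the immediate context replays it, and D3D11_FEATURE_THREADING tells you whether the driver executes those lists natively or the runtime emulates them. A minimal sketch (error handling omitted):

```cpp
#include <d3d11.h>

// Check native driver support, record on a deferred context, replay on the
// immediate context.
void RecordAndReplay(ID3D11Device* device, ID3D11DeviceContext* immediate)
{
    D3D11_FEATURE_DATA_THREADING caps = {};
    device->CheckFeatureSupport(D3D11_FEATURE_THREADING, &caps, sizeof(caps));
    // caps.DriverCommandLists == TRUE  -> driver executes command lists natively
    // caps.DriverCommandLists == FALSE -> the D3D11 runtime emulates them

    ID3D11DeviceContext* deferred = nullptr;
    device->CreateDeferredContext(0, &deferred);

    // ... record draw calls on `deferred` from a worker thread ...

    ID3D11CommandList* commandList = nullptr;
    deferred->FinishCommandList(FALSE, &commandList);

    immediate->ExecuteCommandList(commandList, TRUE);

    commandList->Release();
    deferred->Release();
}
```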

[image: AMD vs Nvidia GPU market share chart]


2010 in finer detail:
[image: 2010 market share breakdown]

Really! You had to find a graph with Desktop, Servers and Workstations. That includes AMD's APUs.
Here is the graph for discrete GPUs on PC desktops. From Jon Peddie Research.
[image: Jon Peddie Research discrete desktop GPU market share graph]



Not made by them, but part of their open source initiative.



PhysX didn't disappear, it has been in Unreal Engine 4 and Unity for a long time, and doesn't even require the logo to pop up at start anymore. Even the AMD-sponsored Star Wars Jedi: Survivor uses PhysX. Fucking Nvidia and their sponsored games, only way PhysX survives :messenger_tears_of_joy:

So again, what was the problem with PhysX outside of the initial dedicated card (also pre-Nvidia buyout), up to the multi-platform SDK 3 in 2011? How far back do we go for evil Nvidia shenanigans when the competition had no fucking solutions to fight back? Literally at the dawn of new tech and we expect everything to be working on all hardware on the first go, when NovodeX AG's solution was dedicated to begin with? You seem lenient on Mantle implementation time, not so much on the rest of Nvidia's technology suite.

Read what I wrote carefully, you will find I wrote GPU PhysX. That path is dead and buried.

Lenient on Mantle? AMD never blocked any other API with Mantle.
They implemented it in BF4 and that's it. There were no shenanigans with it.
And its successor, Vulkan, runs pretty well on NVidia's hardware.

Not at console level, but the ports. What was the plan then if not that?


AMD sending engineers to studios to make all console ports Mantle, low effort, would have completely changed history as we know it today. AMD might actually have 80% market share.

If they had put in 2 "man months", 2 engineers for 1 month, to lock Mantle into 40 top tier games picked during a year, at say a good $200k salary, it would cost $8M (and probably way less than that, as the engineers would nail down the console → Mantle path)

They would have 1) bypassed any DX11→GameWorks path, 2) while the competition was lowering performance, they would have made it better

They had the full cake of market share for the taking, and they fumbled. AyyMD™

That would be an interesting thing, if Mantle had not been abandoned so soon.
But have you looked at how long DX12 took to become a standard? It was released in 2015, but only recently did it become the standard.
Even with the support of Microsoft and with so many games being made for consoles with AMD's hardware and low level APIs.
And you expect that AMD, at the worst time for the company, would have the resources to push Mantle.
Even if AMD did that, they would be up against Microsoft and DirectX. Look at how many games are not using DirectX in the last decade. It's a small fraction.
 

Buggy Loop

Member
Throw the Crytek graphics engine engineer's comments under the bus to continue the Nvidia conspiracy. You call "BS" on that, winjer of NeoGAF :messenger_tears_of_joy:



The cult is far closer to QAnon than i thought.

"I was talking about GPU-PhysX!" while the comment is "If it wasn't for NVidia sponsoring some games to implement Physx, it probably would not be used in any game.", about games... still using PhysX, including AMD sponsored ones 🤷‍♂️

I give up, as if i give a shit that you think Nvidia is behind everything that went wrong with AMD. Bigger victim mentality than any other tech company in the world.
 

winjer

Gold Member
Throw the Crytek graphics engine engineer's comments under the bus to continue the Nvidia conspiracy. You call "BS" on that, winjer of NeoGAF :messenger_tears_of_joy:



The cult is far closer to QAnon than i thought.

"I was talking about GPU-PhysX!" while the comment is "If it wasn't for NVidia sponsoring some games to implement Physx, it probably would not be used in any game.", about games... still using PhysX, including AMD sponsored ones 🤷‍♂️

I give up, as if i give a shit that you think Nvidia is behind everything that went wrong with AMD. Bigger victim mentality than any other tech company in the world.

If only the issues with tessellation had been limited to Crytek. Several games had the same issues, mostly sponsored by NVidia.
Yet you pretend it was all a misunderstanding with Crytek. Give me a break.

You were directly quoting me, when I said GPU PhysX is dead, to contradict me, while saying that PhysX still lives on in UE, which runs it on the CPU. My statement stands.

And you pretending it's all a conspiracy does not hide the fact that for over 2 decades NVidia has done much worse than anything AMD/ATI, or any company in the GPU space, ever did.
 

Buggy Loop

Member
If only the issues with tessellation had been limited to Crytek. Several games had the same issues, mostly sponsored by NVidia.
Yet you pretend it was all a misunderstanding with Crytek. Give me a break.

No, i addressed Witcher 3 too, and Project Cars. Are you following?


AMD's Huddy story doesn't even hold up.

You were directly quoting me, when I said GPU PhysX is dead, to contradict me, while saying that PhysX still lives on in UE, which runs it on the CPU. My statement stands.

Tech evolved from one form to another, fucking Nvidia!

And you pretending it's all a conspiracy does not hide the fact that for over 2 decades NVidia has done much worse than anything AMD/ATI, or any company in the GPU space, ever did.



Being late or having no alternative is not Nvidia's fucking problem. What's expected here? What other business even plays fair? Should Nvidia take AMD's hand and write them a CUDA alternative that they neglected for 15 years?

Just like DXR. OMG, Nvidia going RTX to hurt performance! Path tracing? Fucking Nvidia trying to make AMD look bad! While AMD has been part of the DXR consortium since, what, 2017? Always the same story. Always caught with their pants down, even when there are warning signals years ahead.

Intel will pop AMD out of GPU space between their fingers. Honestly i can't wait. Tired of AMD cult always being the victim. This is coming from 20 years of owning ATI/AMD. I'm fed up with the attitude, it made me ditch them. JUST FUCKING MAKE GREAT CARDS COMPETITIVE AND IT WILL WORK. STOP BLAMING NVIDIA.
 
In screenshots they can look very close but anyone that says they can't tell DLSS from FSR in motion either needs to get to an optician stat, is trolling, or has never actually tried both themselves.

For me DLSS is normally an auto pick even if I don't need the performance as it tends to be better than TAA while FSR gets tested for a few minutes to see if it has improved then permanently disabled when it inevitably gives a grainy image full of pixelated artifacts on fine detail.

I don't give a shit about brand loyalty and flip-flop between Amd, Intel & Nvidia for hardware depending on who's the best when I upgrade but Amd has been flop after flop for a while now (not CPUs as I love my 7800x3d) and locking out the superior option is very anti consumer.

Amd seem to get treated as the plucky underdog like they aren't also a gigantic corporation. If they start putting out better cards people will buy them just like the CPUs but stuff like RT and DLSS are key features at the high end and Amd are a gen or two behind Nvidia here.

Amd is lucky Nvidia has been cocky enough to stick tiny amounts of vram on anything that isn't high end so they benefit from all the bad press Nvidia get here. I am surprised that the lower end Nvidia cards are still by far the most popular stuff in the market as while they really benefit from DLSS they are too weak to do anything outside of console level RT so it goes to show Amd are miles behind in mindshare.

I would love Amd to put out some best in class GPUs just like I would love MS to have another x360 type generation as competition is vital from a consumer point of view.

Nvidia are really taking the piss these days but are so far ahead that it doesn't matter - if you want the best GPU you have one choice and shit like blocking DLSS isn't going to help Amd.

Anyhow rant over.
 

ToTTenTranz

Banned
This is all you need to know about this tool, sees Intel as the good guy and AMD as the bad guy and "can't wait" for AMD to leave the GPU space.
Beyond idiotic to want less competition in the space, but I suppose what else could you expect from a blind Nvidia fangirl?

Claiming "I can't wait" for less competition is definitely in the top #5 of biggest asinine stupidity one can ever read in videogame forums.

Dude's not even pretending anymore.
 

winjer

Gold Member
No, i addressed Witcher 3 too, and Project Cars. Are you following?


AMD's Huddy story doesn't even hold up.

Project cars was never about tessellation. It was about Physx drivers and game optimization. Don't mix up things.

Crysis and Witcher 3 were only some of the games that had performance issues due to overuse of tessellation.
Here are more wireframe shots from another nvidia sponsored game, Arkham City. Very similar to Crysis 2 in the stupid use of tessellation.
Don't try to defend this crap, when everyone at the time could see right through nvidia's tactics.
And this didn't just hurt AMD. It also hurt nvidia's users, especially those with older GPUs.
[images: Arkham City wireframe shots showing heavy tessellation]


And again, the issue with The Witcher 3's performance was not with drivers or anything like that. It was about tessellation factors.
That is how modders fixed its performance, for both AMD and NVidia users. Lowering the tessellation factors in the config files improved performance and made no difference in image quality.

And this is the answer nvidia gave to ExtremeTech.
As you can see, at the time, AMD never got a chance to look into and optimize their drivers for Gameworks, resulting in lower performance in several games.

According to Nvidia, developers can, under certain licensing circumstances, gain access to (and optimize) the GameWorks code, but cannot share that code with AMD for optimization purposes.

Tech evolved from one form to another, fucking Nvidia!

NVidia's PhysX always had a CPU and a GPU path. The GPU path died out.
Only the CPU path remains. There was no evolution.

Being late or having no alternative is not Nvidia's fucking problem. What's expected here? What other business even plays fair? Should Nvidia take AMD's hand and write them a CUDA alternative that they neglected for 15 years?

Just like DXR. OMG, Nvidia going RTX to hurt performance! Path tracing? Fucking Nvidia trying to make AMD look bad! While AMD has been part of the DXR consortium since, what, 2017? Always the same story. Always caught with their pants down, even when there are warning signals years ahead.

Intel will pop AMD out of GPU space between their fingers. Honestly i can't wait. Tired of AMD cult always being the victim. This is coming from 20 years of owning ATI/AMD. I'm fed up with the attitude, it made me ditch them. JUST FUCKING MAKE GREAT CARDS COMPETITIVE AND IT WILL WORK. STOP BLAMING NVIDIA.

You start by saying that no business plays fair and that we should not expect Nvidia to do it either.
And then you say that we should stop blaming nvidia.

The reality is that AMD for a long time made competitive cards. But because of nvidia's tactics to undermine their performance, AMD lost more and more of the market, resulting in having less and less money to invest in development.
The result is that today, RTG has much less capability to compete. But this was the result of decades of NVidia playing dirty.
Now we don't have competition in the GPU market, so NVidia can charge whatever ridiculous prices they want. And gamers are the ones that lost.
 

Kenpachii

Member
Project cars was never about tessellation. It was about Physx drivers and game optimization. Don't mix up things.

Crysis and Witcher 3 were only some of the games that had performance issues due to overuse of tessellation.
Here are more wireframe shots from another nvidia sponsored game, Arkham City. Very similar to Crysis 2 in the stupid use of tessellation.
Don't try to defend this crap, when everyone at the time could see right through nvidia's tactics.
And this didn't just hurt AMD. It also hurt nvidia's users, especially those with older GPUs.
[images: Arkham City wireframe shots showing heavy tessellation]


And again, the issue with The Witcher 3's performance was not with drivers or anything like that. It was about tessellation factors.
That is how modders fixed its performance, for both AMD and NVidia users. Lowering the tessellation factors in the config files improved performance and made no difference in image quality.

And this is the answer nvidia gave to ExtremeTech.
As you can see, at the time, AMD never got a chance to look into and optimize their drivers for Gameworks, resulting in lower performance in several games.





NVidia's PhysX always had a CPU and a GPU path. The GPU path died out.
Only the CPU path remains. There was no evolution.



You start by saying that no business plays fair and that we should not expect Nvidia to do it either.
And then you say that we should stop blaming nvidia.

The reality is that AMD for a long time made competitive cards. But because of nvidia's tactics to undermine their performance, AMD lost more and more of the market, resulting in having less and less money to invest in development.
The result is that today, RTG has much less capability to compete. But this was the result of decades of NVidia playing dirty.
Now we don't have competition in the GPU market, so NVidia can charge whatever ridiculous prices they want. And gamers are the ones that lost.

Physics went the CPU path because of consoles. This is also why physics died out big time.

AMD cards and AMD's driver support were complete dog shit in the Witcher 3 era. And only AMD is to blame for this. Users that owned AMD cards were basically begging AMD to fix their shit more than anything else. Instead they dug in their heels like they always did, blamed nvidia and released new dog shit hardware that again featured no driver support.

Just look at the reality right now.

upscaling is all the big shit right now. Where is FSR 3? Where is an FSR 2 that competes with DLSS 2? It doesn't exist. It's all jank. I can tell you this, if u sit at higher resolutions, u will want to use DLSS, and amd cards are straight up seen as useless at that point as they simply don't compete.

The same goes for so much more, gsync was godly, freesync was trash for years on end, in the end it worked out but years and years later, and nvidia just adopted it and never looked back, while with amd it was a huge wait to get anything good going.

Then nvenc, the whole twitch community uses it, even a high profile streamer like summit, who sits on an AMD GPU, has an nvidia gpu to render his stream on a second setup. Where's AMD's answer? Doesn't exist.

Shadowplay before that, almost no overhead recording/streaming, AMD's reaction? Some utterly broken pile of garbage software that looks like shit, performs like shit, just to die off without any further support. Sure they have AV1 or whatever support. Sadly the biggest streaming platform doesn't support it, so kinda useless, and even that took them years.

I remember the people at the 5000 series AMD cards, buying them because they wanted competition to lower nvidia prices. Only for AMD to up their top model by 2.5x the price the next series.

There is a reason why AMD cards are a meme, and nobody buys them. The only thing they are good at is APUs. Everything else they have only themselves to blame for. Provide a product that does everything better than your competition, and provide it for multiple generations, and people will swing. CPUs are a good example of this. Luckily CPUs don't have to deal with AMD's dog shit software developers.

As a buyer, you're a complete idiot if u move with AMD at this point. And that's why their shit isn't selling, looking at the steam chart, even with the insanely overpriced 4000 series and low sales as a result, the 7000 series is nowhere to be seen.
 

winjer

Gold Member
Physics went the CPU path because of consoles. This is also why physics died out big time.

Even at the launch of the PS4, Cerny demoed Havok running on the GPU. So devs could choose where to do these calculations.
GPU Physx is a whole different matter, as this was always locked to NVidia's GPUs. Of the two paths Physx had, only the CPU one could run on consoles, but that was a limitation imposed by NVidia.

AMD cards and AMD's driver support were complete dog shit in the Witcher 3 era. And only AMD is to blame for this. Users that owned AMD cards were basically begging AMD to fix their shit more than anything else. Instead they dug in their heels like they always did, blamed nvidia and released new dog shit hardware that again featured no driver support.

Correct, AMD drivers at that time were lacking. But the whole hairworks situation made it much worse.
And remember that NVidia admitted at the time, to the media that they forbade AMD from accessing the source code of Gameworks. This made optimization for AMD much harder than it should have been.
All of this piled on and on, to erode AMD's market share. And today, it's barely 10%.

Just look at the reality right now.

upscaling is all the big shit right now. Where is FSR 3? Where is an FSR 2 that competes with DLSS 2? It doesn't exist. It's all jank. I can tell you this, if u sit at higher resolutions, u will want to use DLSS, and amd cards are straight up seen as useless at that point as they simply don't compete.

DLSS2 is the gold standard. No doubt about it. Anyone with an NVidia GPU has no reason to use FSR2 or XeSS.
But FSR2.2 is decent enough for anyone without an RTX card. Sadly there are still games being released with FSR 2.0. Or older games that never got an update.
Dying Light got a massive update this week. And it still uses FSR2.0, and XeSS 1.1 is blocked on AMD GPUs. But it works fine with NVidia GPUs.
Do you think this is normal?

The same goes for so much more, gsync was godly, freesync was trash for years on end, in the end it worked out but years and years later, and nvidia just adopted it and never looked back, while with amd it was a huge wait to get anything good going.

NVidia didn't improve Freesync. They just rebranded it to Gsync compatible, when they saw that Freesync was the default standard for the vast majority of monitors.
And Intel joined in with Freesync support soon after AMD announced it.
The ones that did the work were monitor brands that kept improving the tech.
And Gsync is pretty much dead, as there are almost no monitors being released with that module.

Then nvenc, the whole twitch community uses it, even a high profile streamer like summit, who sits on an AMD GPU, has an nvidia gpu to render his stream on a second setup. Where's AMD's answer? Doesn't exist.

Shadowplay before that, almost no overhead recording/streaming, AMD's reaction? Some utterly broken pile of garbage software that looks like shit, performs like shit, just to die off without any further support. Sure they have AV1 or whatever support. Sadly the biggest streaming platform doesn't support it, so kinda useless, and even that took them years.

I remember the people at the 5000 series AMD cards, buying them because they wanted competition to lower nvidia prices. Only for AMD to up their top model by 2.5x the price the next series.

There is a reason why AMD cards are a meme, and nobody buys them. The only thing they are good at is APUs. Everything else they have only themselves to blame for. Provide a product that does everything better than your competition, and provide it for multiple generations, and people will swing. CPUs are a good example of this. Luckily CPUs don't have to deal with AMD's dog shit software developers.

You are right, AMD in the GPU space is well behind NVidia. Two decades of sabotage by the competition finally took their toll.
But you do realize that the ones paying for it are us, the gamers. Because now NVidia sets whatever price it wants. And the result is a 4050, rebranded as a 4060, close to the price of a 4070.
 

GymWolf

Member
Will laugh when it still runs better on Nvidia cards.
Raster vs raster amd should be on top except against a 4090.

A 7900xtx probably beats a 4080 and a 7900xt probably beats a 4070ti. They also have more vram.

Hopefully i'm wrong since i have a 4080:lollipop_grinning_sweat:
 

adamsapple

Or is it just one of Phil's balls in my throat?
NXGamer talks about this at length in his new video, he calls it a "tech media hypocrisy" for not bringing it up nearly as much when Nvidia does it and gives examples like Plague Tale Requiem, previous Far Cry games etc where features are locked to Nvidia specific cards.


 

01011001

Banned
NXGamer talks about this at length in his new video, he calls it a tech media hypocrisy for not bringing it up nearly as much when Nvidia does it and gives examples like Plague Tale Requiem, previous Far Cry games etc where features are locked to Nvidia specific cards.




uhm what? which features were blocked by Nvidia in plague tale?
 

adamsapple

Or is it just one of Phil's balls in my throat?
uhm what? which features were blocked by Nvidia in plague tale?

I think he used the word when talking about a few examples at the same time (or that's what I caught at least).

But Plague Tale Requiem does not have an FSR option in the PC version. (nor does the newly released AEW game, but both have DLSS).
 

Topher

Gold Member
NXGamer talks about this at length in his new video, he calls it a "tech media hypocrisy" for not bringing it up nearly as much when Nvidia does it and gives examples like Plague Tale Requiem, previous Far Cry games etc where features are locked to Nvidia specific cards.




But as he points out later, a larger portion of the market is affected by this when DLSS is not included. That, along with the fact that this is potentially an issue with one of the biggest Bethesda games in years, is greatly broadening awareness of it. That isn't to say "it's different" if/when Nvidia does this sort of thing. It's not. But it also doesn't make sense to focus on those two and leave out the publishers/devs who are agreeing to do this. Microsoft/Bethesda need to answer the slew of questions about this and so far they are opting for silence. Not cool no matter which way we try to look at this.
 

adamsapple

Or is it just one of Phil's balls in my throat?
Metro, Control, BattleField, Doom Eternal etc
P.S Even Cyberpunk with one year delay?

I'm sure there's a lot of examples, I'm not primarily a PC player so don't know these. Just saw NXG's commentary as something interesting and contrary to basically what most of the internet is talking about.
 

SF Kosmo

Al Jazeera Special Reporter
It's not really AMD's fault.

Why does Starfield need a sponsorship? This isn't some indie developer who is barely making rent. This game is going to make a billion with or without an AMD check.
Todd is greedy.
I'm guessing they found it valuable to partner with them because the console versions are running on AMD silicon and they actually wanted help optimizing to that.

There are tiers of partnership in these deals, and we don't really know yet which tier Starfield resides in. While it does look like AMD pays some devs to lock out features from competitors, it isn't yet clear that that's the case with Starfield.
 

GHG

Member
This is all you need to know about this tool, sees Intel as the good guy and AMD as the bad guy and "can't wait" for AMD to leave the GPU space.
Beyond idiotic to want less competition in the space, but I suppose what else could you expect from a blind Nvidia fangirl?

It's not about wanting "less competition", it's about wanting better competition.

A competent Intel competing with Nvidia is far better for all of us than an incompetent AMD competing with Nvidia.

People really need to stop defending and supporting incompetence. We also see this attitude in the console space and it just stinks the place out and drags standards down for everyone. Demand better and stop blaming the competent company for another company's incompetence.
 

Buggy Loop

Member
It's not about wanting "less competition", it's about wanting better competition.

A competent Intel competing with Nvidia is far better for all of us than an incompetent AMD competing with Nvidia.

People really need to stop defending and supporting incompetence. We also see this attitude in the console space and it just stinks the place out and drags standards down for everyone. Demand better and stop blaming the competent company for another company's incompetence.

I only bought ATI/AMD from 1996 to 2016, always been exclusively AMD for CPU even now

But I’m a fan girl of Nvidia 😂

I was on ATI 2D cards probably before these clowns were born. I gave them 20 years. Supporting underdogs is something that I guess is a strong sentiment when you’re young. I stopped pity buying and just bought the best product I felt like, including non open source features, because I don’t give a single fuck. What does open source do for me? Nothing. Always reactionary to Nvidia tech, always catching up or screaming sabotage even from AMD management. Let’s not even go through the “wait for X or Y driver” I went through all those years

Don’t go into my PC build recommendations either where I recommend 6700XT.

Intel will eat them alive. The AMD cult is always the victim. A loser attitude.
 

Soltype

Member
It's not about wanting "less competition", it's about wanting better competition.

A competent Intel competing with Nvidia is far better for all of us than an incompetent AMD competing with Nvidia.

People really need to stop defending and supporting incompetence. We also see this attitude in the console space and it just stinks the place out and drags standards down for everyone. Demand better and stop blaming the competent company for another company's incompetence.
I've been wanting AMD to legitimately have something worthwhile for years; the only reason Nvidia can do the BS that it does is because AMD is not a threat. Do AMD fans really believe people wouldn't start buying their products if they were honestly compelling? It feels like they are constantly two steps behind Nvidia and it sucks, because if you want a premium product you have nowhere else to go.
 

yamaci17

Member
I think he used the word when talking about a few examples at the same time (or that's what I caught at least).

But Plague Tale Requiem does not have an FSR option in the PC version. (nor does the newly released AEW game, but both have DLSS).
doesn't need to, a plague tale requiem has its own sophisticated temporal upscaler, it works pretty damn good too.

also, FSR 2 released much, much later than DLSS. it is very normal that there are games that have no FSR. not every dev will go back and add stuff. the examples they will give you will be mostly comprised of such titles, which is not a point that can be used as a "gotcha moment".
 

Sleepwalker

Member
The no comment answer sounds like a no to me. But adding dlss is just one more mod out of the bunch this game will need anyway.
 

01011001

Banned
I think he used the word when talking about a few examples at the same time (or that's what I caught at least).

But Plague Tale Requiem does not have an FSR option in the PC version. (nor does the newly released AEW game, but both have DLSS).

but there's zero evidence suggesting that Nvidia has anything to do with that.
especially for AEW, which most likely only has DLSS either due to them not giving a shit, or using an older version of UE4.

unlike AMD, Nvidia publicly stated they do not block FSR from games they sponsor (why would they? it's good PR every time people compare the 2 in new games).
Nvidia could be lying, but no one has come forward claiming that yet, and AMD is declining any comment even with all the bad press happening currently, which, let's be honest, might as well be an admission of guilt
 

Kenpachii

Member
Even at the launch of the PS4, Cerny demoed Havok running on the GPU. So devs could choose where to do these calculations.
GPU Physx is a whole different matter, as this was always locked to NVidia's GPUs. Of the two paths Physx had, only the CPU one could run on consoles, but that was a limitation imposed by NVidia.



Correct, AMD drivers at that time were lacking. But the whole hairworks situation made it much worse.
And remember that NVidia admitted at the time, to the media that they forbade AMD from accessing the source code of Gameworks. This made optimization for AMD much harder than it should have been.
All of this piled on and on, to erode AMD's market share. And today, it's barely 10%.



DLSS2 is the gold standard. No doubt about it. Anyone with an NVidia GPU has no reason to use FSR2 or XeSS.
But FSR2.2 is decent enough for anyone without an RTX card. Sadly there are still games being released with FSR 2.0. Or older games that never got an update.
Dying Light got a massive update this week. And it still uses FSR2.0, and XeSS 1.1 is blocked on AMD GPUs. But it works fine with NVidia GPUs.
Do you think this is normal?



NVidia didn't improve Freesync. They just rebranded it to Gsync compatible, when they saw that Freesync was the default standard for the vast majority of monitors.
And Intel joined in with Freesync support soon after AMD announced it.
The ones that did the work were monitor brands that kept improving the tech.
And Gsync is pretty much dead, as there are almost no monitors being released with that module.



You are right, AMD in the GPU space is well behind NVidia. Two decades of sabotage by the competition finally took their toll.
But you do realize that the ones paying for it are us, the gamers. Because now NVidia sets whatever price it wants. And the result is a 4050, rebranded as a 4060, close to the price of a 4070.


Honestly i wonder if u even experienced that time. Even the biggest die hard AMD fans kicked the company into the bucket for their bullshit

If you bought nvidia:
- It just works, nice driver updates on time, new updates for drivers arrive fast within hours or a day after a patch for a game arrives, stable performance, no insane cpu overhead in games. U press update on a driver and 3 minutes later u are playing a game again.

U bought AMD:
- memory modules on your 290's shitting the brick all day long
- cold boots into black screens
- driver updates arrive sometimes half a year later
- updating your drivers was the most cancerous experience i ever had with amd, total dog shit, safe mode > ddu > reboot, into windows, install driver and have fun having your desktop wrecked.
- a new game updates, drivers break constantly no updates from amd, google what gpu driver and game version u had to have for optimal experiences holy shit what a pain.
- have to tinker with your own drivers because AMD doesn't care about you or their product. Google all day long
- Heavy cpu overhead, which means even with a 2x faster AMD gpu u would hit 50% slower performance when CPU demand goes up what a shit show.
- Crossfire not even once
- AMD too busy blaming another company because they invest millions into techniques for their cards and don't give them away for free, while refusing to spend 2 seconds to ask to implement a feature they already have because it's too much time for them to do so.
- nvidia drops optimisation guides left and right, AMD nowhere to be seen.
- Gsync super smooth future tech that removes a ton of visual jank, AMD releases freesync years later and it works like absolute shit for multiple years
- shadowplay it just works, amd version it just never works and is far worse.
- nvenc it just works, amd's equivalent doesn't exist, guys buy our cpu's to render streams completely inefficiently guys. pls buy.
etc etc.

Honestly a fucking script kiddy solved the entire hairworks problem, and even had a better solution than Nvidia had access to, within 2 days, yet AMD was screaming from the rooftops for weeks if not months about how much of a victim of nvidia they were. What a joke. They could have brute forced x16 tessellation in a driver update; barely anybody would see the difference and it would give a 50% performance improvement over nvidia's insane x64 bullshit. But that would only have worked if they hadn't forgotten about their last driver update, which was almost half a year ago lmao.

Sorry mate, AMD made their own bed by neglecting their users, giving a worse experience for decades. People hoped when AMD took the helm things would improve. But instead more unstable hardware with no driver support got delivered and that was that.

I sold my 290's, bought a cheap 970, never looked back. Shit just worked. The 10 people they had left after the 200 series probably all deserted AMD once they dropped their HBM cards, oh boy what a shit show that was.


Here's a Forbes article on the matter that's paywalled, so I copy pasted it:




The increasingly bitter war between GPU manufacturers AMD and Nvidia continues this month with the release of CD Projekt Red’s The Witcher 3 and with it, another GameWorks controversy. Except this time it’s much easier to see the naked truth.

The story so far: AMD believes that the implementation of an Nvidia-developed graphics feature called HairWorks (part of the company’s GameWorks library) in The Witcher 3 is deliberately crippling performance on AMD Radeon graphics cards. HairWorks — similar in functionality to AMD’s TressFX — taps into DirectX 11 to tessellate tens of thousands of strands of hair, making them move and flow realistically.


Early and exhaustive benchmarks from German site HardwareLuxx indicate that when HairWorks is activated on higher-end Nvidia cards like the GTX 980, framerate performance drops by 30% (which of course it does, because extra eye candy affects performance!) But on a Radeon 290x? Up to a 61% hit to average framerates.


If you’re following this story, you may be aware of CD Projekt Red’s official statement on the matter. They told Overclock3D that yes, HairWorks can run on AMD hardware, but that “unsatisfactory performance may be experienced as the code of this feature cannot be optimized for AMD products.”

The problem with this statement is that it was overly vague and left a lot of possibilities dangling. Possibilities that can be interpreted for a variety of arguments. Why can’t it be optimized? Was it merely an issue of limited resources or man hours? Is it because, as some have argued, AMD’S GCN 1.1 Directx 11 tessellation is sub-par? Or did Nvidia explicitly prevent CD Projekt Red from optimizing their code on AMD hardware or from inserting their own tech?


The answer to the latter question is a decisive no. We saw technologies from both companies in Rockstar’s Grand Theft Auto V, a game that was optimized quite efficiently on a wide range of hardware from both Team Green and Team Red. And Nvidia’s Brian Burke recently reiterated to PCPer.com that “our agreements with developers don’t prevent them from working with other IHVs.”

But let’s get to the heart of this article. AMD’s chief gaming scientist Richard Huddy recently went on the offensive, claiming that Nvidia’s HairWorks code is somehow deliberately sabotaging Witcher 3 performance on AMD hardware. Speaking to ArsTechnica, he said the following:


“We’ve been working with CD Projekt Red from the beginning. We’ve been giving them detailed feedback all the way through. Around two months before release, or thereabouts, the GameWorks code arrived with HairWorks, and it completely sabotaged our performance as far as we’re concerned. We were running well before that… it’s wrecked our performance, almost as if it was put in to achieve that goal.
That’s funny, since I attended an Nvidia press conference all the way back in June 2013 that showed an early version of Nvidia’s HairWorks — then unnamed — running on multiple wolves in The Witcher 3. Later, in January 2014 — 16 months ago — Nvidia officially christened the technology “HairWorks” and showed it off again, using several examples of the tech implemented into The Witcher 3.


Here’s a video from Gamescom 2014 (August) showing HairWorks running in The Witcher 3.

Let’s assume Huddy’s claim of working with the developer “from the beginning” is true. The Witcher 3 was announced February 2013. Was 2+ years not long enough to approach CD Projekt Red with the possibility of implementing TressFX? Let’s assume AMD somehow wasn’t brought into the loop until as late as Gamescom 2014 in August. Is 9 months not enough time to properly optimize HairWorks for their hardware? (Apparently Reddit user “FriedBongWater” only needed 48 hours after the game’s release to publish a workaround enabling better performance of HairWorks on AMD hardware, so there’s that.)



Hell, let’s even assume that AMD really didn’t get that code until 2 months prior, even though they’ve been working with the developer since day 1. Do you find that hard to swallow?

That’s all irrelevant in my eyes, because the ask never came in time. Via Ars Technica, Huddy claims that when AMD noticed the terrible HairWorks performance on their hardware two months prior to release, that’s when they “specifically asked” CD Projekt Red if they wanted to incorporate TressFX. The developer said “it was too late.”

Well, of course it was too late. Nvidia and CD Projekt Red spent two years optimizing HairWorks for The Witcher 3. But here’s the bottom line: The developer had HairWorks code for nearly two years. The entire world knew this. If AMD had been working with the developer “since the beginning” how on earth could they have been blindsided by this code only 2 months prior to release? None of it adds up, and it points to a larger problem.



Look, I respect AMD and have built many systems for personal use and here at Forbes using their hardware. AMD’s constant pulpit of open source drivers and their desire to prevent a fragmented PC gaming industry is honorable, but is it because they don’t want to do the work?

A PC enthusiast on Reddit did more to solve the HairWorks performance problem than AMD has apparently done. AMD's last Catalyst WHQL driver was 161 days ago, and the company hasn't announced one on the horizon. Next to Nvidia's monthly update cycle and game-ready driver program, this looks lazy.

If you want a huge AAA game release to look great on your hardware, you take the initiative to ensure that it does. What you don’t do is expect your competitor to make it easier for you by opening up the technology they’ve invested millions of dollars into. You innovate using your own technologies. Or you increase your resources. Or you bolster your relationships and face time with developers.

In short, you just find a way to get it done.

If I sound frustrated, it’s because I am. I’ve been an enthusiastic fan of AMD for a good long while (just look at the numerous DIY builds and positive reviews I’ve given them), and last year at this same time I was admittedly on the other side of this argument. But what I’m seeing now is a company who keeps insisting their sole competitor make their job easier “for the good of PC gaming.” And I see said competitor continuing to innovate with graphics technologies that make games more beautiful. And I see promises like the concept of “OpenWorks” laying stagnant a full year after they’re hyped up. And I see AMD’s desktop GPU market share continue to slip and think to myself “maybe this is not a coincidence.”

Sorry, AMD didn't get killed off because of Nvidia playing dirty.

AMD got killed by ignoring their own products and users.
 

Fredrik

Member
But as he points out later, a larger portion of the market is affected by this when DLSS is not included.
Yeah everything else is not even worth talking about imo. If Nvidia has like 70% of the core PC gamer market then of course it’ll be a big deal if one of Nvidia’s best features isn’t included. And if Bethesda simply can’t talk about DLSS being included because of the AMD deal, well then they were dumb to sign that deal.
What positive things will this deal do for Bethesda or Microsoft or Starfield?
 

Silver Wattle

Gold Member
It's not about wanting "less competition", it's about wanting better competition.

A competent Intel competing with Nvidia is far better for all of us than an incompetent AMD competing with Nvidia.

People really need to stop defending and supporting incompetence. We also see this attitude in the console space and it just stinks the place out and drags standards down for everyone. Demand better and stop blaming the competent company for another company's incompetence.
Don't try to rephrase the comment for him.
Intel will pop AMD out of GPU space between their fingers. Honestly i can't wait.
That is what he said, no bullshit about wanting "better" competition, it was he "can't wait" for Intel to "pop AMD out of the GPU space".

Also AMD does need to offer a more competitive product, but to call them incompetent is ridiculous, I think RDNA3 is a disappointment, but it's still a solid generation of cards after RDNA2 was a good generation of cards.
People were writing AMD off after RDNA1, all the rumours for RDNA2 were estimating their halo card to compete with the 3070, instead the 6800XT was a very good competitor to the 3080.

AMD has some issues they need to address, but in no way are they incompetent.
 
I think that one thing AMD has over Nvidia is the control panel since 6000 series. Overclocking, application selection, settings, etc. are all only a few clicks away with a UI that is informative and easy to navigate. Nvidia's control panel sucks. A lot of people use 3rd party apps to use some basic functions on Nvidia cards.
 

winjer

Gold Member
Honestly i wonder if u even experienced that time. Even the biggest die hard AMD fans kicked the company into the bucket for their bullshit

If you bought nvidia:
- It just works, nice driver updates on time, new updates for drivers arrive fast within hours or a day after a patch for a game arrives, stable performance, no insane cpu overhead in games. U press update on a driver and 3 minutes later u are playing a game again.

U bought AMD:
- memory modules on your 290's shitting the brick all day long
- cold boots into black screens
- driver updates arrive sometimes half a year later
- updating your drivers was the most cancerous experience i ever had with amd, total dog shit, safe mode > ddu > reboot, into windows, install driver and have fun having your desktop wrecked.
- a new game updates, drivers break constantly no updates from amd, google what gpu driver and game version u had to have for optimal experiences holy shit what a pain.
- have to tinker with your own drivers because AMD doesn't care about you or their product. Google all day long
- Heavy cpu overhead, which means even with a 2x faster AMD gpu u would hit 50% slower performance when CPU demand goes up what a shit show.
- Crossfire not even once
- AMD too busy blaming another company because they invest millions into techniques for their cards and don't give them away for free, while refusing to spend 2 seconds to ask to implement a feature they already have because it's too much time for them to do so.
- nvidia drops optimisation guides left and right, AMD nowhere to be seen.
- Gsync super smooth future tech that removes a ton of visual jank, AMD releases freesync years later and it works like absolute shit for multiple years
- shadowplay it just works, amd version it just never works and is far worse.
- nvenc it just works, amd's equivalent doesn't exist, guys buy our cpu's to render streams completely inefficiently guys. pls buy.
etc etc.

Honestly a fucking script kiddy solved the entire hairworks problem, and even had a better solution than Nvidia had access to, within 2 days, yet AMD was screaming from the rooftops for weeks if not months about how much of a victim of nvidia they were. What a joke. They could have brute forced x16 tessellation in a driver update; barely anybody would see the difference and it would give a 50% performance improvement over nvidia's insane x64 bullshit. But that would only have worked if they hadn't forgotten about their last driver update, which was almost half a year ago lmao.

Sorry mate, AMD made their own bed by neglecting their users, giving a worse experience for decades. People hoped when AMD took the helm things would improve. But instead more unstable hardware with no driver support got delivered and that was that.

I sold my 290s, bought a cheap 970, and never looked back. Shit just worked. The ten users they had left after the 200 series probably all deserted AMD once they dropped their HBM cards; oh boy, what a shit show that was.


Here's a Forbes article on the matter; it's paywalled, so I copy-pasted it.


Sorry, AMD didn't get killed off because of Nvidia playing dirty.

AMD got killed by ignoring their own products and users.

At the time I had plenty of hardware to play with and test: an HD 7950, a GTX 680, a GTX 970 and an R9 390.
I never had any issues with memory in any of these GPUs. Never had cold boots into black screens. Never had issues with driver updates screwing up my desktop and much less if using DDU.
The HD7950 is on my list of the best GPUs I ever owned. Can't say the same thing about the 390, as it was a worse GPU than the 970.
I was one of the first unraveling the controversy about the GTX 970's memory pools, on the Guru3D forums, as I was one of the first to manage to fill the whole 4 GB of the card, hit that last slow memory pool and see the performance drop. And all it took me was a heavily modded Skyrim.
And that BS about script kids fixing a game shows you don't know what they did. Guess what: they only reduced the tessellation factor in the config files, to fix the mess Nvidia made, and that mess affected even Nvidia users.
I was playing The Witcher 3 at the time on an Nvidia GPU, and even then I used that mod to increase performance, with no visual difference (rough numbers in the sketch below).
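For a rough sense of scale, here's some back-of-the-envelope math. It assumes roughly uniform tessellation of a triangle patch, where the sub-triangle count grows with about the square of the factor (real tessellator patterns and adaptive factors differ); the patch count is made up purely for illustration.

```cpp
#include <cstdio>

// Rough estimate: a triangle patch uniformly tessellated at factor N is split
// into roughly N*N sub-triangles. Exact counts depend on the tessellator's
// partitioning mode, but the quadratic growth is the point here.
static long long approxTriangles(int factor) {
    return 1LL * factor * factor;
}

int main() {
    const int patches = 10000;  // hypothetical number of hair/terrain patches on screen
    for (int factor : {16, 64}) {
        long long tris = approxTriangles(factor) * patches;
        std::printf("factor x%-2d -> ~%lld triangles\n", factor, tris);
    }
    // x64 generates roughly 16x the geometry of x16, mostly as sub-pixel
    // triangles you can't see at normal viewing distances.
    return 0;
}
```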
You are only half right about the CPU overhead. The main reason for AMD's DX11 driver performance was its frontend, which was designed with low-level APIs in mind, so it really struggled with DX11. The other reason was the lack of driver command lists: AMD's drivers only supported them for Civilization V, while Nvidia supported them for every game.
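If anyone wants to check this themselves, D3D11 lets an application ask the driver whether it natively supports deferred-context command lists; as far as I remember, AMD's DX11 drivers of that era reported no native support, so the runtime had to emulate it. A minimal sketch of that query (standard D3D11 boilerplate, nothing game-specific):

```cpp
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main() {
    ID3D11Device* device = nullptr;
    ID3D11DeviceContext* context = nullptr;

    // Create a default hardware device on the primary adapter.
    HRESULT hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                   nullptr, 0, D3D11_SDK_VERSION,
                                   &device, nullptr, &context);
    if (FAILED(hr)) return 1;

    // Ask the driver whether it natively supports multithreaded resource
    // creation and deferred-context command lists.
    D3D11_FEATURE_DATA_THREADING caps = {};
    device->CheckFeatureSupport(D3D11_FEATURE_THREADING, &caps, sizeof(caps));
    std::printf("DriverConcurrentCreates: %d\n", caps.DriverConcurrentCreates);
    std::printf("DriverCommandLists:      %d\n", caps.DriverCommandLists);

    context->Release();
    device->Release();
    return 0;
}
```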
You are also right about the Nvidia guides; I used them a lot at the time. A real shame that Nvidia stopped making them.
You are wrong about FreeSync. The issues that existed at the time were not because of AMD's drivers or the tech itself, but because the monitors available back then were very limited. With time, we got much better monitors that were able to use the tech properly.
But credit is due to Nvidia for sparking such an important revolution. For me, this is the most important thing Nvidia has done in its whole history. I can never go back to not having a variable refresh rate monitor.
But we also have to give credit to AMD for making this tech affordable to everyone. If it wasn't for FreeSync, we would still be paying $150 more for that module. And the reality is that today, virtually all monitors use FreeSync, despite the 'G-Sync Compatible' rebrand, and almost none use the dedicated G-Sync module.
I never used shadowplay, nor the AMD recording feature. I just don't care about streaming. But yes, for anyone that cares about that, NVidia was the better choice, by far.
Considering how much you hate AMD, I'm surprised you didn't bring up the issue with frame times in AMD's drivers. In 2012, Scott Wasson from The Tech Report published an article showing that AMD's cards, despite having great average FPS, had much worse frame times than Nvidia's. Here is a quote:

Pop back and forth between the 99th percentile and average FPS plots, and you’ll see two different stories being told. The FPS average suggests near-parity performance between the 7950 and the GTX 660 Ti, with a tiny edge to the GeForce. The 99th percentile frame time, though, captures the impact of the Radeon’s frame latency issues and suggests the GTX 660 Ti is easily the superior performer. That fact won’t be a surprise to anyone who’s read this far.

AMD did manage to fix this issue in subsequent drivers, but it was still a significant win for Nvidia at the time.
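To make the average-FPS vs. frame-time distinction concrete, here is a minimal sketch with made-up frame times (not benchmark data): a handful of long frames barely moves the average FPS, but completely dominates the 99th-percentile frame time.

```cpp
#include <algorithm>
#include <cstdio>
#include <vector>

int main() {
    // Hypothetical frame times in milliseconds: mostly smooth, with ten stutters.
    std::vector<double> frames;
    for (int i = 0; i < 990; ++i) frames.push_back(14.0);  // ~71 FPS frames
    for (int i = 0; i < 10;  ++i) frames.push_back(60.0);  // ten nasty spikes

    double total = 0.0;
    for (double f : frames) total += f;
    double avgFps = 1000.0 * frames.size() / total;

    std::sort(frames.begin(), frames.end());
    double p99 = frames[static_cast<size_t>(frames.size() * 0.99)];

    std::printf("Average FPS:           %.1f\n", avgFps);  // ~69 FPS, barely dented
    std::printf("99th percentile frame: %.1f ms\n", p99);  // 60 ms, exposes the stutter
    return 0;
}
```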

You end by talking as if you are trying to shift the blame onto AMD for Nvidia's sabotage.
The reality is that AMD was struggling a lot at the time, and not just the graphics division. Even the CPU division was struggling because of the issues with GF and Bulldozer.
But Nvidia's sabotage with GameWorks, PhysX and tessellation made things increasingly worse. I don't understand why you are trying to hide all the bad things Nvidia was doing at the time, which were much worse than the FSR deals we have today.
You make such a fuss about FSR2 deals today, but completely brush off Nvidia's dealings as if they were nothing.
And it wasn't just AMD that was suffering because of Nvidia's shenanigans. This was the time when AIBs started to get squeezed hard by Nvidia. But once again, you pretend like nothing happened.
 

Zathalus

Member
NXGamer talks about this at length in his new video; he calls it "tech media hypocrisy" that it isn't brought up nearly as much when Nvidia does it, and gives examples like A Plague Tale: Requiem and previous Far Cry games, where features were locked to Nvidia-specific cards.



Some points he gets horribly wrong:

A Plague Tale: Requiem didn't support FSR; this is correct. However, there is zero evidence that this was due to Nvidia, so why would anyone blame Nvidia? The reverse is not true: unless new evidence comes to light, the fact is that AMD is blocking DLSS. That is where the problem lies. If AMD were not blocking DLSS and the game simply did not ship with it, nobody would be angry at AMD; the ire would be directed at Bethesda.

The second point honestly makes me curious as to why NXGamer does not bother with basic research; he makes the following three claims:

1. Developers need to pay to use DLSS
2. Nvidia needs access to your game code.
3. Nvidia needs to train the ML engine on your specific game.

None of that is true. Nvidia requires only that you advertise that your game is using DLSS (and even this can be negotiated away). They don't require access to your game, and DLSS hasn't required per-game training since the original DLSS 1.0.

I'm actually shocked anybody makes such a claim. Digital Foundry gets absolutely wrecked on this forum, yet NXgamer can make absurd claims like that and is held up as some sort of standard?
 

Topher

Gold Member
Yeah everything else is not even worth talking about imo. If Nvidia has like 70% of the core PC gamer market then of course it’ll be a big deal if one of Nvidia’s best features isn’t included. And if Bethesda simply can’t talk about DLSS being included because of the AMD deal, well then they were dumb to sign that deal.
What positive things will this deal do for Bethesda or Microsoft or Starfield?

Bethesda gets a chunk of cash. Can't think of anything else. Starfield will be worse off, especially if the game has performance issues early on. Just hoping this is all a nothing burger, but it is going to suck if we have to wait on reviews to find out.
 

StereoVsn

Member
People are defending a large corp like AMD over its shenanigans as if it were their close family.

It fucking doesn't matter that Jensen and Co. are as scummy or scummier (EVGA just quit the GPU business largely because of Nvidia fucking over AIBs).

The move by AMD to enforce the absence of DLSS in its partnered games sucks for gamers. No need to defend this shit with whataboutism about Nvidia. Corpos aren't your friends.
 

Topher

Gold Member
Some points he gets horribly wrong:

A Plague Tale: Requiem didn't support FSR; this is correct. However, there is zero evidence that this was due to Nvidia, so why would anyone blame Nvidia? The reverse is not true: unless new evidence comes to light, the fact is that AMD is blocking DLSS. That is where the problem lies. If AMD were not blocking DLSS and the game simply did not ship with it, nobody would be angry at AMD; the ire would be directed at Bethesda.

I don't see how A Plague Tale is any different from Starfield. Nvidia partnered with Asobo and Focus just like AMD is partnering with Bethesda.


And we don't have any evidence that DLSS will not be in Starfield at this point. It is all speculation.

The second point honestly makes me curious as to why NXGamer does not bother with basic research; he makes the following three claims:

1. Developers need to pay to use DLSS
2. Nvidia needs access to your game code.
3. Nvidia needs to train the ML engine on your specific game.

None of that is true. Nvidia requires only that you advertise that your game is using DLSS (and even this can be negotiated away). They don't require access to your game, and DLSS hasn't required per-game training since the original DLSS 1.0.

I'm actually shocked anybody makes such a claim. Digital Foundry gets absolutely wrecked on this forum, yet NXgamer can make absurd claims like that and is held up as some sort of standard?

But he does say that devs can make a deal to get out of paying for DLSS. And he starts out saying devs give their game to Nvidia, but then changes course and says that he is "not sure" if Nvidia still has to train the engine.

Both DF and NXG get "wrecked" on this forum if they get something wrong. I mean....that is essentially what you are doing with this post, isn't it?
 

Fredrik

Member
Bethesda gets a chunk of cash. Can't think of anything else. Starfield will be worse off, especially if the game has performance issues early on. Just hoping this is all a nothing burger, but it is going to suck if we have to wait on reviews to find out.
Do they still need a chunk of cash from deals like this when they have access to Microsoft’s wallet?
I find the whole thing strange tbh. Just a plain bad strategy.
 

Topher

Gold Member
Do they still need a chunk of cash from deals like this when they have access to Microsoft’s wallet?
I find the whole thing strange tbh. Just a plain bad strategy.

It's a head scratcher. I mean......I can almost understand smaller devs like Asobo going after deals like these, but Bethesda? Yeah.....don't get it.
 