
AMD is Starfield’s Exclusive PC Partner

Gudji

Member
My RX6600 is ready.

 
The thing is, these upscalers don't use that much tensor power. For example, in CP2077, XeSS running on DP4A is within 2-3 fps of DLSS.
AMD could very well have an ML upscaling pass like XeSS that ran just fine on their RDNA 2 and 3 cards. RDNA 3 even has a bunch of WMMA instructions to accelerate these calculations, so it probably wouldn't even lose performance.
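For reference, DP4A is just a packed 4-way int8 dot product with an int32 accumulate, which is what XeSS's fallback path leans on when a GPU has no dedicated matrix units. A minimal scalar sketch of its semantics (my code, not vendor code):

```cpp
// Scalar emulation of what a DP4A-style instruction computes:
// dot(a.xyzw, b.xyzw) + c, where a and b each pack four signed
// 8-bit lanes into one 32-bit word. Illustrative only.
#include <cstdint>

int32_t dp4a(uint32_t a, uint32_t b, int32_t c) {
    for (int lane = 0; lane < 4; ++lane) {
        int8_t av = static_cast<int8_t>((a >> (8 * lane)) & 0xFF);
        int8_t bv = static_cast<int8_t>((b >> (8 * lane)) & 0xFF);
        c += static_cast<int32_t>(av) * static_cast<int32_t>(bv);
    }
    return c;
}
```

A GPU executes this whole loop as one instruction, which is why int8 inference passes stay cheap even without tensor hardware.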

If AMD doesn't improve FSR2, then XeSS will become the preferred open upscaler, even for AMD users.
AMD has zero competence in AI, so I'm not surprised they can't seem to do this. They can't even write a driver set that holds a candle to Nvidia's drivers after 30 years of trying.
 

Fredrik

Member
PC gamers aren't all that different from console gamers. No one complained about checkerboard upscaling; they definitely won't care about the differences between other upscaling techs.
You assume too much. This is about increasing performance without destroying the image quality. Anyone with a mid-range Nvidia graphics card, which is the majority, could easily go from unacceptable to great framerates with DLSS without a bad image.
Now it’s all up in the air. The majority of PC gamers could be in a position where they have no choice but to accept worse image quality with only a minor increase in performance. Or they need to upgrade.
 

bbeach123

Member
It's easy money for devs, I guess. You don't need to do anything: just add the FSR you already had on console and call it an AMD partner game. Done.
 

Silver Wattle

Gold Member
You don’t provide anything to the discussion

Are you saying Nvidia won't Nvidia if it comes to it?
What is there to discuss? You spout some bullshit about how AMD should be scared to wake the dragon of Nvidia. AMD are already getting stomped by Nvidia in market share, so why shouldn't they try to compete with exclusive partnerships? The same type that Nvidia does, and even started back in the day.
If anything, AMD have nothing to lose by attacking Nvidia.
 

nemiroff

Gold Member
AMD have nothing to lose by attacking Nvidia.

AMD have everything to lose by attacking consumers.

They should attack Nvidia, yes, but by creating better technology and products instead of taking hostages.

The underdog status is pretty much all AMD have, but their pay-to-win strategy may very well put that at risk.
 
Last edited:

winjer

Gold Member
AMD has zero competence in AI, so I'm not surprised they can't seem to do this. They can't even write a driver set that holds a candle to Nvidia's drivers after 30 years of trying.

Zero competence? No. Not even close.
You are only considering RDNA. And you are forgetting AMD has some of the best and most efficient CPUs, something important for some AI workloads.
They have CDNA, which has tensor cores. And they have Xilinx, which plays an important role in FPGAs with their Vitis AI.

Are they behind NVidia? Yes. But everyone is behind NVidia.
But AMD does have strong competence in AI. The problem is that their efforts all go into the professional market.
Lisa needs to hire a few more AI engineers and send them to RTG.
 

john2gr

Member
Nvidia Physx?

Hardware-accelerated PhysX used proprietary technology (it's like complaining that DLSS only works on NVIDIA's GPUs). CPU-accelerated PhysX runs on everything, from NVIDIA to AMD and Intel. So no, that wasn't a good example.

A better example of a "shady move" is the tessellation in Crysis 2. NVIDIA knew that its GPUs were better at tessellation and overloaded the game in order to cripple its performance on AMD's GPUs. It's not exactly the same as what is happening now with AMD blocking DLSS, but it's still a "shady move".
 
Last edited:

winjer

Gold Member
multiple people have said this, but I have yet to see someone bring up an example of this happening.

when did Nvidia block a competitor's technology by making marketing deals?

A few examples off the top of my head:
Splinter Cell: Chaos Theory, forcing SM3.0 or SM1.1
Ghost Recon AW, forcing Ubisoft to remove the DirectX 10.1 patch
During the Riva generation, making devs use the texture formats that their GPUs used
Trying to block the UT2003 DirectX 8.1 render path
Running a campaign among devs and media to convince people that tile-based rendering was a bad idea, when Kyro launched
PhysX, which NVidia claimed could only run on NVidia GPUs, while having a highly de-optimized CPU path. And of course, the GPU path was locked to NVidia GPUs.
The tessellation in Crysis 2 that crippled performance on AMD GPUs
Blocking the FSR2 mod, and so many games that only have DLSS2
Hairworks being highly de-optimized to run on AMD GPUs, and closed source, to make it very hard to work with
There were also a ton of games that Nvidia blocked AMD's access to before launch, so AMD could not optimize their drivers in time, and performance on AMD GPUs would be much lower for a few days after launch.
 
Last edited:

EDMIX

Member
what the absolute fuck? lol wooooooooow



So I built my sister a PC for work/gaming, used the NZXT H1, and decided to go with one of those 6000 series cards... looks like she's about to get a better run on this game than me with a 4090 lol (I kid, but maybe not) lol


The NZXT H1 case is on sale at Best Buy right now if anyone is planning on doing a mini-type build.

 

winjer

Gold Member
so PhysX is blocking AMD technology?

PhysX had two paths. One on the GPU, which was locked to run only on NVidia GPUs, even though it was only shader code and could very well run on any GPU.
It also had a CPU path, but it was highly de-optimized, to pretend that the advanced PhysX mode could only run on NVidia GPUs.
So it was blocked for AMD GPUs, and effectively crippled on Intel and AMD CPUs.
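Context for "de-optimized": analyses published at the time reported that the CPU path ran scalar x87 code on a single thread. A toy sketch, not PhysX source, of the kind of throughput that leaves on the table:

```cpp
// Toy illustration of scalar vs. 4-wide SSE particle updates.
// Hypothetical code; only meant to show the gap a scalar,
// single-threaded path leaves against basic SIMD.
#include <immintrin.h>

void update_scalar(float* pos, const float* vel, int n, float dt) {
    for (int i = 0; i < n; ++i)           // one float per iteration
        pos[i] += vel[i] * dt;
}

void update_sse(float* pos, const float* vel, int n, float dt) {
    __m128 d = _mm_set1_ps(dt);
    for (int i = 0; i + 4 <= n; i += 4) { // four floats per iteration
        __m128 p = _mm_loadu_ps(pos + i);
        __m128 v = _mm_loadu_ps(vel + i);
        _mm_storeu_ps(pos + i, _mm_add_ps(p, _mm_mul_ps(v, d)));
    }
    for (int i = n & ~3; i < n; ++i)      // scalar tail
        pos[i] += vel[i] * dt;
}
```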

Intel had Havok at the time, and they showed it could run on the CPU and do things as impressive as GPU PhysX.
But Intel did a poor job of supporting it with devs, so it saw limited adoption.
 

01011001

Banned
A few examples off the top of my head:
Splinter Cell: Chaos Theory, forcing SM3.0 or SM1.1
Ghost Recon AW, forcing Ubisoft to remove the DirectX 10.1 patch
During the Riva generation, making devs use the texture formats that their GPUs used
Trying to block the UT2003 DirectX 8.1 render path
Running a campaign among devs and media to convince people that tile-based rendering was a bad idea, when Kyro launched
PhysX, which NVidia claimed could only run on NVidia GPUs, while having a highly de-optimized CPU path. And of course, the GPU path was locked to NVidia GPUs.
The tessellation in Crysis 2 that crippled performance on AMD GPUs
Blocking the FSR2 mod, and so many games that only have DLSS2
Hairworks being highly de-optimized to run on AMD GPUs, and closed source, to make it very hard to work with
There were also a ton of games that Nvidia blocked AMD's access to before launch, so AMD could not optimize their drivers in time, and performance on AMD GPUs would be much lower for a few days after launch.

that formatting is giving me a stroke.

most of this I can't comment on, since you don't link any sources of these things actually happening (many are so old that I could see them being unconfirmed gossip).

the other half is games implementing technology that only Nvidia had, which is not the same as blocking a competitor's technology.
having developers implement tessellation, a completely optional thing, or PhysX, another completely optional thing, is not even in the same universe as blocking DLSS.

lastly, DLSS2-only games are super rare, and anything that released before FSR2 cannot be counted whatsoever, for obvious reasons
 

01011001

Banned
PhysX had two paths. One on the GPU, which was locked to run only on NVidia GPUs, even though it was only shader code and could very well run on any GPU.
It also had a CPU path, but it was highly de-optimized, to pretend that the advanced PhysX mode could only run on NVidia GPUs.
So it was blocked for AMD GPUs, and effectively crippled on Intel and AMD CPUs.

Intel had Havok at the time, and they showed it could run on the CPU and do things as impressive as GPU PhysX.
But Intel did a poor job of supporting it with devs, so it saw limited adoption.

PhysX is Nvidia (bought) technology; AMD/ATi had no competing physics tech, therefore it's not comparable, as no competitor's equivalent technology was blocked from being implemented
 
Last edited:

Tams

Member
DLSS is noticeably better, but it is also proprietary technology that requires developers to put not-insubstantial work into implementing it, and then the majority of players wouldn't even enjoy said extra work.

If AMD are paying developers to not have to do extra work, why would developers turn that down?

Moan at the gatekeeper.
 

01011001

Banned
DLSS is noticeably better, but it is also proprietary technology that requires developers to put not-insubstantial work into implementing it, and then the majority of players wouldn't even enjoy said extra work.

1 guy can mod it in, in less than a week.
no AAA studio has any excuse to not implement it.

and also the majority of officially supported/recommended GPUs for Starfield support DLSS.

the last 3 Nvidia generations all support it. that's dozens of models of graphics cards from the market leader.
 

winjer

Gold Member
PhysX is Nvidia (bought) technology; AMD had no competing physics tech, therefore it's not comparable, as no competitor's equivalent technology was blocked from being implemented

Yes, there were physics-based techs at the time; Havok was the most prominent. But there was also proprietary tech in specific game engines, like Source, for example.
But the thing is, in games where NVidia sponsored physics, users either had a path that ran only on NVidia GPUs, or a very de-optimized path on the CPU that caused performance losses. This was terrible for everyone.
And there were several games sponsored by Nvidia to use PhysX. So please, don't try to make excuses for NVidia and PhysX.

Truth be told, PhysX was one of the biggest scams in PC gaming.
First it started with the original company claiming that their physics were so advanced that they couldn't run on CPUs or GPUs of the time. Which was a complete lie, proved by Havok doing similar things on the CPU, and by Nvidia then buying PhysX and running it on their GPUs.
And second, it was a scam when NVidia claimed it could only run on their GPUs, and that CPUs could not run such advanced effects.
 

john2gr

Member
DLSS is noticeably better, but it is also proprietary technology that requires developers to put not-insubstantial work into implementing it, and then the majority of players wouldn't even enjoy said extra work.

If AMD are paying developers to not have to do extra work, why would developers turn that down?

Moan at the gatekeeper.

You clearly don't know how DLSS and FSR 2 work.

DLSS, FSR 2 and XeSS require the same data. All of them use the same technique in order to work (which is why it's so easy to replace FSR 2 with DLSS 2 via mods and vice versa). These upscaling techniques use current and previous frames (temporal data), motion vectors, and the depth buffer. So, once you've implemented DLSS 2, you've already done 90% of the work for both FSR 2 and XeSS.
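To make that concrete, here is a hypothetical sketch (made-up struct and function names, not any vendor's actual SDK) of the shared per-frame inputs, and why swapping backends is cheap once the engine produces them:

```cpp
// Hypothetical sketch of the shared inputs. Real SDK structs differ in
// naming, but all three upscalers consume essentially this data each frame.
struct UpscalerInputs {
    void*  colorBuffer;      // current frame, rendered at a lower resolution
    void*  depthBuffer;      // per-pixel depth
    void*  motionVectors;    // per-pixel screen-space motion
    float  jitterX, jitterY; // sub-pixel camera jitter used this frame
    bool   resetHistory;     // true on camera cuts, drops temporal data
};

enum class Backend { DLSS2, FSR2, XeSS };

// Once the engine fills UpscalerInputs, choosing a backend is a dispatch.
// The commented calls are placeholders for each vendor's evaluate function.
void upscale(Backend b, const UpscalerInputs& in) {
    switch (b) {
        case Backend::DLSS2: /* ngx_evaluate(in);  */ break;
        case Backend::FSR2:  /* fsr2_dispatch(in); */ break;
        case Backend::XeSS:  /* xess_execute(in);  */ break;
    }
}
```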

This isn't PhysX vs Havok (which are completely different physics solutions).
 
Last edited:

01011001

Banned
Yes, there were physics-based techs at the time; Havok was the most prominent. But there was also proprietary tech in specific game engines, like Source, for example.
But the thing is, in games where NVidia sponsored physics, users either had a path that ran only on NVidia GPUs, or a very de-optimized path on the CPU that caused performance losses. This was terrible for everyone.
And there were several games sponsored by Nvidia to use PhysX. So please, don't try to make excuses for NVidia and PhysX.

Truth be told, PhysX was one of the biggest scams in PC gaming.
First it started with the original company claiming that their physics were so advanced that they couldn't run on CPUs or GPUs of the time. Which was a complete lie, proved by Havok doing similar things on the CPU, and by Nvidia then buying PhysX and running it on their GPUs.
And second, it was a scam when NVidia claimed it could only run on their GPUs, and that CPUs could not run such advanced effects.

so first of all, PhysX was almost always used for random small details that not even Nvidia users turned on at the time.

secondly, many things that were done with PhysX never materialised again since.

we basically haven't seen detailed interactive smoke and/or fog in any modern game since Arkham Knight. the closest to that is Counter-Strike 2, but very simplified and less realistic (for balance reasons).

so I wonder why that is, honestly. PhysX did some really cool stuff back in the day, stuff you never saw any other tech do in any game really.
the dynamically ripping plastic sheets in Mirror's Edge are another example of that.
 
Last edited:

Tams

Member
You clearly don't know how DLSS and FSR 2 work.

DLSS, FSR 2 and XeSS require the same data. All of them use the same technique in order to work (which is why it's so easy to replace FSR 2 with DLSS 2 via mods and vice versa). These upscaling techniques use current and previous frames (temporal data), motion vectors, and the depth buffer. So, once you've implemented DLSS 2, you've already done 90% of the work for both FSR 2 and XeSS.

This isn't PhysX vs Havok (which are completely different physics solutions).

I know that. But it is still extra work, and they are being paid to not do work. It's a no-brainer.
 

winjer

Gold Member
so first of all, PhysX was almost always used for random small details that not even Nvidia users turned on at the time.

secondly, many things that were done with PhysX never materialised again since.

we basically haven't seen detailed interactive smoke and/or fog in any modern game since Arkham Knight. the closest to that is Counter-Strike 2, but very simplified and less realistic (for balance reasons).

so I wonder why that is, honestly. PhysX did some really cool stuff back in the day, stuff you never saw any other tech do in any game really.
the dynamically ripping plastic sheets in Mirror's Edge are another example of that.

PhysX did nothing special. In most games it was just a gimmick that ran like crap.
Fortunately, GPU PhysX has died off.
I'll repeat it again: Havok was doing things as good, if not better.
Do you remember the PS4 presentation? They used Havok to show a few interesting demos.

 

john2gr

Member
Yes, there were physics-based techs at the time; Havok was the most prominent. But there was also proprietary tech in specific game engines, like Source, for example.
But the thing is, in games where NVidia sponsored physics, users either had a path that ran only on NVidia GPUs, or a very de-optimized path on the CPU that caused performance losses. This was terrible for everyone.
And there were several games sponsored by Nvidia to use PhysX. So please, don't try to make excuses for NVidia and PhysX.

Truth be told, PhysX was one of the biggest scams in PC gaming.
First it started with the original company claiming that their physics were so advanced that they couldn't run on CPUs or GPUs of the time. Which was a complete lie, proved by Havok doing similar things on the CPU, and by Nvidia then buying PhysX and running it on their GPUs.
And second, it was a scam when NVidia claimed it could only run on their GPUs, and that CPUs could not run such advanced effects.

There isn't ANY CPU-based PhysX or Havok game that CAN COME CLOSE to the hardware-accelerated PhysX games.

Even to this day, Cryostasis (which came out 14 years ago) has physics BETTER than current-gen games ->

Batman: Arkham Knight's smoke effects (which react with the vehicle and the environment) are unmatched ->

Batman: Arkham City also had impressive physics that even current-gen games don't have ->

Mafia 2 is BETTER than its recent remaster as it has more advanced physics thanks to hardware-accelerated PhysX. Imagine that, a remaster having WORSE physics than its original version ->

Say what you will, but hardware-accelerated PhysX gave us a look at the future. And now, after ten whole years, we are STILL a generation behind in physics...
 

kuncol02

Banned
Yes, there were physics-based techs at the time; Havok was the most prominent. But there was also proprietary tech in specific game engines, like Source, for example.
But the thing is, in games where NVidia sponsored physics, users either had a path that ran only on NVidia GPUs, or a very de-optimized path on the CPU that caused performance losses. This was terrible for everyone.
And there were several games sponsored by Nvidia to use PhysX. So please, don't try to make excuses for NVidia and PhysX.

Truth be told, PhysX was one of the biggest scams in PC gaming.
First it started with the original company claiming that their physics were so advanced that they couldn't run on CPUs or GPUs of the time. Which was a complete lie, proved by Havok doing similar things on the CPU, and by Nvidia then buying PhysX and running it on their GPUs.
And second, it was a scam when NVidia claimed it could only run on their GPUs, and that CPUs could not run such advanced effects.
They were even blocking GPU PhysX when they detected an AMD driver on your PC, even if you were using an Nvidia card to play. For some time you could actually use an old Nvidia card as a PhysX accelerator and render on AMD, but that was quickly blocked by Nvidia.

we are STILL a generation behind in physics
And we can thank Nvidia for that.

edit:
GPU-based PhysX could also run on consoles. That means there was no technical reason for it to be blocked on AMD GPUs.
 
Last edited:

01011001

Banned
PhysX did nothing special. In most games it was just a gimmick that ran like crap.
Fortunately, GPU PhysX has died off.
I'll repeat it again: Havok was doing things as good, if not better.
Do you remember the PS4 presentation? They used Havok to show a few interesting demos.



I have never seen any game use physics gimmicks comparable to PhysX, even to this day.
what they did in Borderlands 2 or the Batman games is something that would be nice to see in modern games tbh... but alas, nothing.

and I also never saw any game that uses Havok do similar things.

and yes, it was usually a gimmick that ran like crap, which is why this isn't comparable.

developers made a deal with Nvidia to add some gimmicky things that they could show in trailers.
that's not the same as blocking a widely used upscaling technology that's vastly superior to your own.
 
Last edited:

winjer

Gold Member
There isn't ANY CPU-based PhysX or Havok game that CAN COME CLOSE to the hardware-accelerated PhysX games.

That's because NVidia was pushing PhysX into games, so they forced more advanced effects. But in most games, they were just gimmicks, like the toilets in Borderlands 2 and the smoke in Arkham.
And it always caused a significant drop in performance, even on the GPU-accelerated path. So most just turned it off.
Havok could do anything that PhysX could, probably better, because it was well optimized. But devs had no incentive to make it as flashy as PhysX, in part because they were not getting paid by NVidia and in part because it would drop performance.
But if you look at demos of the time, Havok was doing very impressive stuff.
 

john2gr

Member
They were even blocking GPU PhysX when they detected an AMD driver on your PC, even if you were using an Nvidia card to play. For some time you could actually use an old Nvidia card as a PhysX accelerator and render on AMD, but that was quickly blocked by Nvidia.


And we can thank Nvidia for that.

NVIDIA should have kept pushing hardware-accelerated PhysX for even more games. Seriously, imagine what we'd be getting with the raw power of today's GPUs. It's a shame that they abandoned it.

PS: And yes, NVIDIA's "shady move" was blocking GPU PhysX when using a secondary NVIDIA GPU (for handling those effects) on an AMD PC system. That's a perfect example of what AMD is doing now with DLSS 2.
 

winjer

Gold Member
I have never seen any game use physics gimmicks comparable to PhysX, even to this day.
what they did in Borderlands 2 or the Batman games is something that would be nice to see in modern games tbh... but alas, nothing.

and I also never saw any game that uses Havok do similar things.

and yes, it was usually a gimmick that ran like crap, which is why this isn't comparable.

developers made a deal with Nvidia to add some gimmicky things that they could show in trailers.
that's not the same as blocking a widely used upscaling technology that's vastly superior to your own.

That's because no one cares about it. The tech exists. There are game engines and middleware that can do advanced physics effects.
Havok was already running on the PS4, with very impressive performance and complexity. Sony did all the work with their SDK. But devs and gamers didn't care.

And GPU PhysX also died out, and no one cared about it.
The only reason people still talk about it is because of NVidia's shenanigans with it.
 
Last edited:

Buggy Loop

Member
What is there to discuss? You spout some bullshit about how AMD should be scared to wake the dragon of Nvidia. AMD are already getting stomped by Nvidia in market share, so why shouldn't they try to compete with exclusive partnerships? The same type that Nvidia does, and even started back in the day.
If anything, AMD have nothing to lose by attacking Nvidia.

 

Mister Wolf

Gold Member
At first I was pissed, but after modding DLSS2 and Frame Generation (PureDark's mods) into Jedi Survivor yesterday, I'm not even tripping anymore. The modders will fix this bullshit fast.
 
Last edited:

Buggy Loop

Member
DLSS is noticeably better, but it is also proprietary technology that requires developers to put not-insubstantial work into implementing it, and then the majority of players wouldn't even enjoy said extra work.

If AMD are paying developers to not have to do extra work, why would developers turn that down?

Moan at the gatekeeper.

Yeah… no

That’s not how it works

Script kiddies can even change a .DLL file to replace FSR with DLSS

In fact, modder PureDark can implement DLSS in any game that has TAA
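For anyone wondering how the .DLL swap works: the game loads the upscaler DLL by file name and calls into its exports, so a proxy DLL exporting the same symbols can forward the same inputs to a different backend. A heavily simplified, hypothetical sketch (every name below is invented, not the real FSR2 or NGX interfaces):

```cpp
// Hypothetical proxy-DLL sketch; the export name, struct, and forwarder
// are invented for illustration and are NOT real FSR2/NGX APIs.
struct FrameInputs;  // color, depth, motion vectors (see the earlier sketch)

// Hypothetical translator that feeds the same inputs to a DLSS evaluate
// call loaded from the real NGX DLL at runtime. Defined elsewhere.
int EvaluateDlssInstead_Hypothetical(const FrameInputs* in);

#ifdef _WIN32
// The game thinks it is calling its FSR2-shaped entry point every frame,
// but the proxy intercepts the dispatch and hands it to DLSS instead.
extern "C" __declspec(dllexport)
int Fsr2Dispatch_Hypothetical(const FrameInputs* in) {
    return EvaluateDlssInstead_Hypothetical(in);
}
#endif
```

This only works because, as noted above, all three upscalers consume essentially the same per-frame data.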




Indie guys had implemented DLSS hours after the SDK was out




Or the Unreal/Unity engines, which have a toggle once the plugin is installed




AMD is more like..


Removing RT and DLSS that were already implemented..

what is this? AMDGaf nowhere to be found? Since when did this board go all pro-Nvidia, anti-AMD?

AMD's market share is tumbling down to the ~8-10% range while Nvidia and Intel climb. Why would this reflect any differently on NeoGAF? The thread about which GPU you got is also pretty much in that range.

Nobody is pro-Nvidia, I would say; this ain't sports teams, in the sense that no, I do not recommend any GPUs of this generation (outside the balls-to-the-wall 4090). I also no longer pity-buy underdogs like I did ATI/AMD for so long.

The problem is AMD is fleecing just as much as Nvidia, for worse tech and features, worse resale value, less professional-centric products, etc. What is supposed to happen? They are nowhere near aggressive enough to retake the market.
 
Last edited:

Mister Wolf

Gold Member
Yeah… no

That’s not how it works

Script kiddies can even change a .DLL file to replace FSR with DLSS

In fact, modder PureDark can implement DLSS in any game that has TAA




Indie guys had implemented DLSS hours after the SDK was out




Or the Unreal/Unity engines, which have a toggle once the plugin is installed




AMD is more like..


Removing RT and DLSS that were already implemented..


PureDark is going to make tons of cash off Starfield. I bet when he saw this news about AMD being the official partner, he was ecstatic.
 

welshrat

Member
Can't say it bothers me much. I am rocking an RX 6800 and it's the best card I have owned; drivers and hardware are excellent, and I will likely jump on an end-of-line 7900 XT or an RDNA4 card. I've always swapped between Nvidia and AMD wherever I saw the best value. I appreciate the 4090 is a cracking card, but it's far too much money and energy draw for my liking.
 

Danknugz

Member
Yeah… no

That’s not how it works

Script kiddies can even change a .DLL file to replace FSR with DLSS

In fact, modder PureDark can implement DLSS in any game that has TAA




Indie guys had implemented DLSS hours after the SDK was out




Or the Unreal/Unity engines, which have a toggle once the plugin is installed




AMD is more like..


Removing RT and DLSS that were already implemented..



AMD's market share is tumbling down to the ~8-10% range while Nvidia and Intel climb. Why would this reflect any differently on NeoGAF? The thread about which GPU you got is also pretty much in that range.

Nobody is pro-Nvidia, I would say; this ain't sports teams, in the sense that no, I do not recommend any GPUs of this generation (outside the balls-to-the-wall 4090). I also no longer pity-buy underdogs like I did ATI/AMD for so long.

The problem is AMD is fleecing just as much as Nvidia, for worse tech and features, worse resale value, less professional-centric products, etc. What is supposed to happen? They are nowhere near aggressive enough to retake the market.

so people here dropped all loyalty since May (I haven't been here for a few months) just because Nvidia outperforms in the market? that's kind of strange, to equate NeoGAF popularity with market performance, when you compare MSFT vs Sony. as far as I'm aware, Nvidia is just up due to people all of a sudden throwing money at anything AI-related. a lot of shit-tier AI tickers are also up heavily since May.
 

Buggy Loop

Member
so people here dropped all loyalty since May (I haven't been here for a few months) just because Nvidia outperforms in the market? that's kind of strange, to equate NeoGAF popularity with market performance, when you compare MSFT vs Sony. as far as I'm aware, Nvidia is just up due to people all of a sudden throwing money at anything AI-related. a lot of shit-tier AI tickers are also up heavily since May.

What? There are still pro-AMD people here, have you read the thread?
 
Can't say it bothers me much. I am rocking an RX 6800 and it's the best card I have owned; drivers and hardware are excellent, and I will likely jump on an end-of-line 7900 XT or an RDNA4 card. I've always swapped between Nvidia and AMD wherever I saw the best value. I appreciate the 4090 is a cracking card, but it's far too much money and energy draw for my liking.
As a 3070 owner I don't mind it. I actually played Death Stranding with FSR over DLSS when I had a 2060 (DLSS had these weird ghosting effects that FSR didn't have).
 
Maybe I missed it. I was under the impression this board was heavily AMD, similar to Sony, and when I skimmed this thread all I saw was jokes about AMD and cheering for Nvidia.
Should we be cheering for inferior tech? What has AMD's GPU division done to earn your loyalty?
 

Danknugz

Member
Should we be cheering for inferior tech? What has AMD's GPU division done to earn your loyalty?
Well, AMD tech is in the consoles, and historically I've seen them held in high regard not just here. But there seems to be a strange kind of cult-like following around AMD, which is probably nothing more than a result of them being cheaper for relatively the same performance.

Personally, it's my opinion that there isn't anything magical: you get what you pay for, and in AMD's case it's less about Nvidia being overpriced and more about AMD cutting corners, running hot, and their drivers being so poor. In my experience, it won't be a huge difference at first, but eventually some small inconsistency or incompatibility will rear its head and stop you from running some game in a certain way, or from using some niche software, because of some strange quirk with an AMD driver, which you almost never find with Nvidia drivers. Just my experience.
 