I agree, this is why Nvidia hasn't been able to innovate anything in the past two decades: their bad business behaviour.
My RX6600 is ready.
AMD has zero competence in AI, so I'm not surprised they can't seem to do this. They can't even write a driver set which holds a candle to Nvidia's drivers after 30 years of trying.

The thing is, these upscalers don't use that much tensor power. For example, in CP2077, XeSS running on DP4a is within 2-3 fps of DLSS.
AMD could very well have an ML upscaling pass like XeSS that ran just fine on their RDNA 2 and 3 cards. RDNA 3 even has a bunch of WMMA instructions to accelerate these calculations, so it probably wouldn't even lose performance.
If AMD doesn't improve FSR2, then XeSS will become the preferred open upscaler, even for AMD users.
You assume too much. This is about increasing performance without destroying the image quality. Anyone with a mid-range Nvidia graphics card, which is the majority, could easily go from unacceptable to great framerates with DLSS without a bad image.

PC gamers aren't all that different from console gamers. No one complained about checkerboard upscaling, so they definitely won't care about the differences between other upscaling techs.
What is there to discuss? You spout some bullshit about how AMD should be scared to wake the dragon of Nvidia. AMD are already getting stomped by Nvidia in market share, so why shouldn't they try to compete with exclusive partnerships? The same type that Nvidia does, and even started back in the day. You don't provide anything to the discussion.
Are you saying Nvidia won't Nvidia if it comes to it?
AMD have nothing to lose by attacking Nvidia.
multiple people have said this, but I have yet to see someone bring up an example of this happening.
when did Nvidia block a competitor's technology by making marketing deals?
Nvidia Physx?
so Physx is blocking AMD technology?
A few examples off the top of my head:
Splinter Cell Chaos Theory, forcing SM3.0 or SM1.1
Ghost Recon AW, forcing Ubisoft to remove the DirectX 10.1 patch
During the Riva generation, making devs use the texture formats that their GPUs used
Trying to block UT2003 DirectX 8.1 render path
Making a campaign among devs and media to convince people that tile-based rendering was a bad idea, when the Kyro launched
PhysX, which NVidia claimed could only run on NVidia GPUs, while having a highly de-optimized CPU path. And of course, the GPU path was locked to NVidia GPUs.
The tessellation in Crysis 2, which was cranked up on flat surfaces and invisible geometry, tanking performance on AMD GPUs
Blocking the FSR2 mod, and so many games that only have DLSS2
Hairworks being highly de-optimized on AMD GPUs, and closed source, making it very hard to work with.
There were also a ton of games that Nvidia blocked AMD's access to before launch, so AMD could not optimize their drivers in time, and performance on AMD GPUs would be much lower for a few days after launch.
PhysX had two paths. One ran on the GPU and was locked to NVidia GPUs, but it was only shader code, so it could very well have run on any GPU.
It also had a CPU path, but it was highly de-optimized, to pretend that the advanced PhysX mode could only run on NVidia GPUs.
So it was blocked on AMD GPUs, and crippled on Intel and AMD CPUs.
Intel had Havok at the time, and they showed it could run on the CPU and do things as impressive as GPU PhysX.
But Intel did a poor job of supporting it with devs, so it saw limited adoption.
DLSS is noticeably better, but it is also proprietary technology that requires developers to put not-insubstantial work into implementing it, and then the majority of players wouldn't even get to enjoy said extra work.
PhysX is Nvidia's (bought) technology; AMD had no competing equivalent, so it's not comparable, as no competitor's equivalent technology was blocked from being implemented.
If AMD are paying developers to not have to do extra work, why would developers turn that down?
Moan at the gatekeeper.
Yes, there were physics based techs at the time, Havok was the most prominent. But there were also proprietary tech on specific game engines, like Source, for example.
But the thing is, in games where NVidia sponsored physics, users either had a path that ran only on NVidia GPUs, or a very de-optimized path on the CPU that caused performance losses. This was terrible for everyone.
And there were several games sponsored by Nvidia to use Physx. So please, don't try to make excuses for NVidia and Physx.
Truth be told, Physx was one of the biggest scams in PC gaming.
First, it started with the original company claiming that their physics were so advanced they couldn't run on CPUs or GPUs of the time. Which was a complete lie, proved by Havok doing similar things on the CPU, and by Nvidia then buying PhysX and running it on their GPUs.
And second, it was a scam when NVidia claimed it could only run on their GPUs, and that CPUs could not run such advanced effects.
You clearly don't know how DLSS and FSR 2 work.
DLSS, FSR 2 and XeSS require the same data. All of them use the same technique in order to work (which is why it's so easy to replace FSR 2 with DLSS 2 via mods and vice versa). These upscaling techniques use current and previous frames (temporal data), motion vectors, and the depth buffer. So, once you've implemented DLSS 2, you've already done 90% of the work for both FSR 2 and XeSS.
This isn't PhysX vs Havok (which are completely different physics solutions).
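To illustrate the point about shared inputs, here's a rough sketch of why wiring up one temporal upscaler gets you most of the way to the others. The names and interface below are my own illustration, not any vendor's actual SDK:

```python
from dataclasses import dataclass

# The per-frame data that DLSS 2, FSR 2 and XeSS all consume.
# Field names are illustrative, not from any real SDK.
@dataclass
class UpscalerInputs:
    color: object           # current low-resolution frame
    depth: object           # depth buffer
    motion_vectors: object  # per-pixel motion vectors
    jitter: tuple           # sub-pixel camera jitter offset
    history: object         # previous (upscaled) frame for temporal reuse

def upscale(backend: str, inputs: UpscalerInputs) -> str:
    # Once the engine produces UpscalerInputs, switching backends is
    # just a dispatch -- which is why mods can swap FSR 2 for DLSS by
    # intercepting the same data the game already feeds one of them.
    dispatch = {
        "dlss2": lambda i: f"DLSS2({i.color})",
        "fsr2":  lambda i: f"FSR2({i.color})",
        "xess":  lambda i: f"XeSS({i.color})",
    }
    return dispatch[backend](inputs)
```

The point being: the engine-side work (jittered rendering, motion vectors, depth) is the hard 90%; the backend call at the end is the easy part.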
So first of all, PhysX was almost always used for the most random small details, which not even Nvidia users turned on at the time.
Secondly, many things that were done with PhysX have never materialised again since.
We basically haven't seen detailed interactive smoke and/or fog in any modern game since Arkham Knight. The closest is Counter-Strike 2, but it's very simplified and less realistic (for balance reasons).
So I wonder why that is, honestly. PhysX did some really cool stuff back in the day, stuff you never saw any other tech do in any game.
The dynamically ripping plastic sheets in Mirror's Edge are another example of that.
They were even blocking GPU PhysX when they detected an AMD driver on your PC, even if you were using an Nvidia card to play. For some time you could actually use an old Nvidia card as a PhysX accelerator and render on AMD, but that was quickly blocked by Nvidia.
And we can thank Nvidia for that. We are STILL a generation behind in physics.
PhysX did nothing special. In most games it was just a gimmick that ran like crap.
Fortunately, GPU PhysX has died off.
I'll repeat it again: Havok was doing things just as well, if not better.
Do you remember the PS4 presentation? They used Havok to show a few interesting demos.
There isn't ANY CPU-based PhysX or Havok game that CAN COME CLOSE to the hardware-accelerated PhysX games.
I have never seen any game use comparable physics gimmicks to PhysX, even to this day.
What they did in Borderlands 2 or the Batman games is something that would be nice to see in modern games tbh... but alas, nothing.
And I also never saw any game that uses Havok do similar things.
And yes, it was usually a gimmick that ran like crap, which is why this isn't comparable.
Developers made a deal with Nvidia to add some gimmicky things that they could show in trailers.
that's not the same as blocking a widely used upscaling technology that's vastly superior to your own.
What is this? AMDGaf nowhere to be found? Since when did this board go all pro-Nvidia, anti-AMD?
Yeah… no
That’s not how it works
Script kiddies can even change a .DLL file to replace FSR with DLSS.
In fact, the modder PureDark can implement DLSS in any game that has TAA.
Indie devs had implemented DLSS hours after the SDK was out.
Or the Unreal/Unity engines, which have a DLSS toggle once the plugin is added.
AMD is more like..
Boundary EA Launch Q&A - Devs Explain Long Delays, Confirm Removal of DLSS (and RTX) in Favor of FSR2 and XeSS
The zero-G competitive FPS Boundary launches soon on Steam Early Access. We've interviewed the dev team's CEO to discuss the upcoming release. (wccftech.com)
Removing RT and DLSS that were already implemented...
AMD's market share is tumbling down to the ~8-10% range while Nvidia and Intel climb. Why would this reflect any differently on NeoGAF? The thread about which GPU you've got is also pretty much in that range.
Nobody is pro-Nvidia, I would say; this ain't sports teams, in the sense that no, I do not recommend any GPUs of this generation (outside the balls-to-the-wall 4090). I also no longer pity-buy underdogs like I did with ATI/AMD for so long.
The problem is AMD is fleecing just as much as Nvidia, for worse tech and features, worse resale value, not as professional-centric, etc. What is supposed to happen? They are nowhere near aggressive enough to retake the market.
So people here dropped all loyalty since May (I haven't been here for a few months) just because Nvidia outperforms in the market? That's kind of strange, to equate NeoGAF popularity with market performance, when you compare MSFT vs Sony. As far as I'm aware, Nvidia is just up due to people all of a sudden throwing money at anything AI-related. A lot of shit-tier AI tickers are also up heavily since May.
Maybe I missed it. I was under the impression this board was heavily AMD, similar to Sony, and when I skimmed this thread all I saw was jokes about AMD and cheering for Nvidia.

What? There are still pro-AMD people here, have you read the thread?
As a 3070 owner I don't mind it. I actually played Death Stranding with FSR over DLSS when I had a 2060 (DLSS had these weird ghosting effects that FSR didn't have).

Can't say it bothers me much. I am rocking an RX6800 and it's the best card I have owned; drivers and hardware are excellent, and I will likely jump on an end-of-line 7900XT or an RDNA4 card. I've always swapped between Nvidia and AMD wherever I saw the best value. I appreciate the 4090 is a cracking card, but it's far too much money and energy draw for my liking.
Should we be cheering for inferior tech? What has AMD's GPU division done to earn your loyalty?
Well, AMD tech is in the consoles, and historically I've seen them held in high regard not just here. But there seems to be a strange kind of cult-like following around AMD, which is probably nothing more than a result of them being cheaper for relatively the same performance.
If it has RT, it will obviously run better on Nvidia with RT enabled.

Will laugh when it still runs better on Nvidia cards.