
AMD gives you maximum RT performance

poppabk

Cheeks Spread for Digital Only Future
My first Nvidia card was GeForce 2. I just bounce around depending on what's best for me price and performance wise, currently on a 6900XT which will hopefully last me 5+ years.
 

Hot5pur

Member
Yeah, early 2000s 9800 Pro, the GOAT
I still have that one in the closet somewhere haha... legendary card

I'd say if you are on a 2060-class GPU or above at sub-4K you can probably skip this gen. Ray tracing is hardly noticeable outside of very few games. Maybe 2-3 where it makes a tangible difference.

If I were getting a 4K card today I'd go AMD for the larger VRAM pool for the same money. Again, mostly because ray tracing is a waste.
 

SmokedMeat

Gamer™
AMD’s ray tracing is good, but I don’t know why they’d make this video when Nvidia’s equivalent cards perform better. Well, with the exception of Unreal 5’s Lumen, which could make things interesting.

Play to your strengths, like VRAM and rasterization. Show off your superior control panel.

If I were AMD I’d be taking the old-school Sega approach of painting Jensen’s followers as sheep. Show them turning down texture settings and encountering more stuttering. Needing to crank up the fake frames to make games playable. Paint them as schmucks.
 

DaGwaphics

Member
AMD managed to get their best card to more or less match Nvidia's last-gen best. If anything, AMD is closing the gap.

All jokes aside, they've definitely made big gains in this area. Once they erase most of Nvidia's faux frames advantage things will look even better for them. If they can advance at this same pace moving to the 8000 series they might start to catch up.
 

Pedro Motta

Member
My last Radeon card 🫡
[picture of the card]
heyyyy had one of those, what a beast.



This demo was insane at the time.
 

Buggy Loop

Member
AMD’s ray tracing is good, but I don’t know why they’d make this video when Nvidia’s equivalent cards perform better. Well, with the exception of Unreal 5’s Lumen, which could make things interesting.

Play to your strengths, like VRAM and rasterization. Show off your superior control panel.

If I were AMD I’d be taking the old-school Sega approach of painting Jensen’s followers as sheep. Show them turning down texture settings and encountering more stuttering. Needing to crank up the fake frames to make games playable. Paint them as schmucks.

How did it go for Sega?

Sponsor a game that uses 6.2 GB of VRAM at 1080p for low textures and looks worse than games that used 256 MB of VRAM a few years ago. "But you need >8GB! Clearly, look at this!"

[screenshot]


Should go well for AMD.

Always pushing visuals, AyyMD

We're all salivating for their next sponsored games

Meanwhile, 6.2 GB VRAM @ 4K ultra settings

[A Plague Tale: Requiem screenshots]
 

SmokedMeat

Gamer™
How did it go for Sega?

Sponsor a game that uses 6.2 GB of VRAM at 1080p for low textures and looks worse than games that used 256 MB of VRAM a few years ago. "But you need >8GB! Clearly, look at this!"

[screenshot]


Should go well for AMD.

Always pushing visuals, AyyMD

We're all salivating for their next sponsored games

Meanwhile, 6.2 GB VRAM @ 4K ultra settings

[A Plague Tale: Requiem screenshots]

Naughty Dog’s crap isn’t AMD’s fault. Coincidentally you’re showing off a game that also runs better on AMD GPUs. 🤷🏼‍♂️


[A Plague Tale: Requiem benchmark chart]


If I were AMD I’d absolutely be demanding that VRAM was pushed bigtime in all of my sponsored games. Making Nvidia look inferior with their low VRAM would be a wise business decision.
 

Buggy Loop

Member
Naughty Dog’s crap isn’t AMD’s fault. Coincidentally you’re showing off a game that also runs better on AMD GPUs. 🤷🏼‍♂️


[A Plague Tale: Requiem benchmark chart]

Right, rasterized. Not sure what your point is. Game looks gorgeous, has reasonable VRAM usage. AMD’s marketing around VRAM using TLOU as a reference? They’re high as fuck.

AMD runs Plague Tale faster in rasterization, good for them 🤷‍♂️

I’m pointing out that their sponsored games have been a fucking mess so far. They somehow all align with AMD for the worst PC ports, how peculiar.
 

OverHeat

« generous god »
Naughty Dog’s crap isn’t AMD’s fault. Coincidentally you’re showing off a game that also runs better on AMD GPUs. 🤷🏼‍♂️


[A Plague Tale: Requiem benchmark chart]


If I were AMD I’d absolutely be demanding that VRAM was pushed bigtime in all of my sponsored games. Making Nvidia look inferior with their low VRAM would be a wise business decision.
Laughing in 4090 😂😂😂 leave the low ram cards for the plebs
 

SmokedMeat

Gamer™
Right, rasterized. Not sure what your point is. Game looks gorgeous, has reasonable VRAM usage. AMD’s marketing around VRAM using TLOU as a reference? They’re high as fuck.

AMD runs Plague Tale faster in rasterization, good for them 🤷‍♂️

I’m pointing out that their sponsored games have been a fucking mess so far. They somehow all align with AMD for the worst PC ports, how peculiar.

These aren’t sports teams. I’m not going to argue it’s not fair that your team is losing to my team in sponsored games. Optimizing for the sponsoring company’s GPU line is arguably the whole point of sponsoring in the first place, right?
I can’t help it if AMD-sponsored games always trounce Nvidia in performance. Especially Ubi’s games.
Naughty Dog’s game is shit regardless of what GPU is in your rig.

If anything, instead of excusing Nvidia for giving the bare minimum of VRAM that literally everyone is telling you will be a problem in the next year or two, get mad at Jensen! You should be demanding more value for your money from Nvidia, not getting mad at AMD. These aren’t sports teams! I swear Nvidia fans are their own worst enemy.
If Nvidia rolls out a 5070 with 16GB of VRAM, then that’s a win for everyone. Stop excusing them like 8GB should be enough. In the real world that’s not always going to be the case anymore.
 

Mr.Phoenix

Member
Technically I like the concept of AMD's approach. One core that does everything, with RT built into every core. Eventually that will pay off, but they are 2 years behind.
Architecturally, their one-core-for-everything approach is good. But the problem is that they are focusing on adding more cores, more cache, or higher clocks instead of making the individual core more advanced.

We are now at a point where raster performance is not the limiting factor in any GPU; RT performance is. Hence, no matter how good their GPUs are at raster, once they're doing an RT workload they get crippled.
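To put toy numbers on that (completely made-up figures, just to illustrate why raster gains stop mattering once RT dominates the frame):

```python
# Toy Amdahl's-law style illustration -- hypothetical numbers, not real benchmarks for any GPU.

def frame_time_ms(raster_ms, rt_ms, raster_speedup=1.0, rt_speedup=1.0):
    """Total frame time when the raster and RT portions are scaled independently."""
    return raster_ms / raster_speedup + rt_ms / rt_speedup

# Pretend an RT-heavy frame spends 6 ms on raster work and 14 ms on RT work.
base        = frame_time_ms(6.0, 14.0)                      # 20 ms (~50 fps)
fast_raster = frame_time_ms(6.0, 14.0, raster_speedup=2.0)  # 17 ms (~59 fps)
fast_rt     = frame_time_ms(6.0, 14.0, rt_speedup=2.0)      # 13 ms (~77 fps)

print(f"{base:.0f} ms / {fast_raster:.0f} ms / {fast_rt:.0f} ms")
```

Doubling the raster side barely moves the needle; the RT portion of the frame is what has to get faster.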
 

night13x

Member
AMD is pleased to announce ULTRA MAX RT PERFORMANCE. SEE THE DIFFERENCE!

*tested at low RT settings with FSR Performance mode
 