
AMD Removes ‘4GB VRAM is not enough for games’ Marketing Hours Before Radeon RX 6500 XT 4 GB Launch

ManaByte

Gold Member
Wizard Of Oz GIF
 

lukilladog

Member
Worst ripoff since the GT 1030 DDR4? AMD and Nvidia have reached new levels of pathetic.
 
Last edited:

Boglin

Member
Maybe the monopoly is justified? Nobody else has stepped up in what 15 years...
Yeah, was just joking around. If we're looking at the current offerings then of course it would be justified, and I'm not charitable enough to buy an inferior product just for the sake of competition.
If Nvidia truly does become a monopoly in the GPU space one day and starts leveraging it in anti-consumer ways, then I'll let the regulatory bodies deal with it
 

hlm666

Member
WTF is AMD doing? They're losing dGPU market share with their most competitive high-end parts in years, and they do this not long before Intel is set to release their own cards. They've gotten high off their own farts from the praise their CPU success has gotten them. This GPU actually supports RT, but it's so bad they should have disabled it at the driver level.

 

Kenpachii

Member
WTF is AMD doing? They're losing dGPU market share with their most competitive high-end parts in years, and they do this not long before Intel is set to release their own cards. They've gotten high off their own farts from the praise their CPU success has gotten them. This GPU actually supports RT, but it's so bad they should have disabled it at the driver level.


People cheer for AMD because it's the underdog; sadly, they don't realize AMD is just as shit as Nvidia on the consumer side, or even worse.
 
Last edited:

01011001

Banned
this card is such an absolute piece of shit lol... even at MSRP it is dogshit, but no one ever sold it at MSRP to begin with; even before scalpers came into the picture, that thing went on store shelves for well over $300
 

supernova8

Banned
Can we have reviewers other than Hardware Unboxed and Gamers Jesus? Those people are just boring now. It's the same old depressed bollocks every time. Even when there are good products (not defending the 6500 shit-t) they aren't even that positive.
 

hlm666

Member
People cheer for AMD because it's the underdog; sadly, they don't realize AMD is just as shit as Nvidia on the consumer side, or even worse.
I just want as many competing players in the PC hardware space as possible, and it frustrates me that AMD seems to think treating Nvidia like it's Intel stuck on a 5-year-old chip node is going to work. Intel's marketing department for the coming Xe cards went home early today; AMD just gave them a product that makes those stupid out-of-context bar graphs unnecessary.
 

hlm666

Member
Can we have reviewers other than Hardware Unboxed and Gamers Jesus? Those people are just boring now. It's the same old depressed bollocks every time. Even when there are good products (not defending the 6500 shit-t) they aren't even that positive.
My link above is to a site with a 40-page written review of the card; you have options other than those two outlets.
 

Kenpachii

Member
(benchmark chart image, ultra settings)

I just want as many competing players in the PC hardware space as possible, and it frustrates me that AMD seems to think treating Nvidia like it's Intel stuck on a 5-year-old chip node is going to work. Intel's marketing department for the coming Xe cards went home early today; AMD just gave them a product that makes those stupid out-of-context bar graphs unnecessary.

What did intel do? link pls
 

01011001

Banned
Can we have reviewers other than Hardware Unboxed and Gamers Jesus? Those people are just boring now. It's the same old depressed bollocks every time. Even when there are good products (not defending the 6500 shit-t) they aren't even that positive.

sounds to me like:
"give me reviewers with lower standards please"
 

Sosokrates

Report me if I continue to console war
(benchmark chart image, ultra settings)



What did intel do? link pls

Holy smokes, I was not expecting that.

Is it really memory bound?

Seems really odd for a last-gen game.

I wonder how the 6500 XT compares to a 5500 XT 4GB.

Because compute- and bandwidth-wise they are both very similar.

Edit:
Something seems up with that benchmark
(benchmark chart image, medium settings)
 
Last edited:

hlm666

Member
What did intel do? link pls
I was being sarcastic about how Intel normally uses questionable bar graphs in their marketing slides. But with AMD releasing this card, Intel will just use it to compare their coming cards against, and the bar graphs won't need to pull tricks like making one bar twice as long as the other while the axis doesn't actually show the whole bar.
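For anyone who hasn't seen the trick, here's a quick sketch of it (made-up FPS numbers, nothing from a real slide): the same two bars plotted with a truncated axis next to a zero-based one.

Python:
import matplotlib.pyplot as plt

# Made-up numbers: two cards that differ by about 7%
cards = ["Card A", "Card B"]
fps = [100, 107]

fig, (ax_trunc, ax_honest) = plt.subplots(1, 2, figsize=(8, 3))

# Marketing version: the y-axis starts at 98, so a ~7% gap looks like 4x
ax_trunc.bar(cards, fps)
ax_trunc.set_ylim(98, 108)
ax_trunc.set_title("Truncated axis (slide version)")

# Honest version: the y-axis starts at 0, and the bars look nearly identical
ax_honest.bar(cards, fps)
ax_honest.set_ylim(0, 120)
ax_honest.set_title("Zero-based axis")

plt.tight_layout()
plt.show()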

Something seems up with that benchmark
Your benchmarks are at medium and the others are at ultra. The 5500 XT in the ultra benchmarks probably has 8GB of VRAM, so it isn't hitting the VRAM capacity.
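For a rough idea of why spilling past 4GB hurts this card in particular, here's a back-of-envelope sketch. The numbers are assumed from the commonly reported specs (64-bit GDDR6 at 18 Gbps and a PCIe x4 link), so treat them as approximations:

Python:
# Back-of-envelope: local VRAM bandwidth vs. the PCIe link the card
# falls back on once data spills past its 4GB of VRAM.
vram_bus_bits = 64           # 64-bit memory bus
gddr6_gbps_per_pin = 18      # 18 Gbps GDDR6
vram_bw = vram_bus_bits * gddr6_gbps_per_pin / 8      # ~144 GB/s

# Approximate usable bandwidth per PCIe lane (GB/s, per direction)
pcie_lane_gbs = {"PCIe 4.0": 1.97, "PCIe 3.0": 0.985}
lanes = 4

print(f"Local VRAM: ~{vram_bw:.0f} GB/s")
for gen, per_lane in pcie_lane_gbs.items():
    link = per_lane * lanes
    print(f"{gen} x4: ~{link:.1f} GB/s (~{vram_bw / link:.0f}x slower than VRAM)")

So once the working set no longer fits in 4GB, the overflow traffic runs over a link more than an order of magnitude slower than local VRAM (and roughly twice as bad again on a PCIe 3.0 board), which lines up with the ultra-settings cliff in those charts.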
 

Sosokrates

Report me if I continue to console war
I was being sarcastic about how Intel normally uses questionable bar graphs in their marketing slides. But with AMD releasing this card, Intel will just use it to compare their coming cards against, and the bar graphs won't need to pull tricks like making one bar twice as long as the other while the axis doesn't actually show the whole bar.


Your benchmarks are at medium and the others are at ultra. The 5500 XT in the ultra benchmarks probably has 8GB of VRAM, so it isn't hitting the VRAM capacity.

It would have been a good idea to include some other 4GB cards to see if they have the same issue.
 

Kenpachii

Member
Holy smokes, I was not expecting that.

Is it really memory bound?

Seems really odd for a last-gen game.

I wonder how the 6500 XT compares to a 5500 XT 4GB.

Because compute- and bandwidth-wise they are both very similar.

Edit:
Something seems up with that benchmark
(benchmark chart image, medium settings)

Medium settings for you; the one I posted was at ultra settings.

You can read the entire benchmark here:

https://www.guru3d.com/articles_pages/radeon_rx_6500_xt_review,16.html

Ray tracing is particularly funny on that card.

(ray tracing benchmark chart from the Guru3D review)
 
Last edited:
When your $200 GPU from 5 years ago (RX 580, hell, even an RX 480 from 6 years ago) completely destroys your current $200 offering, you know you fucked up.

Absolute fucking waste of sand. Imagine being so bad that you make another terribly priced card, the RTX 3050, sound like a way better deal for $50 more at $250, when that card itself shouldn't cost more than $150 like its predecessors.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Wasted wafer in these hard times. Would've been better used for PS5 and Xbox consoles.
This hit me harder than it should have.
That 6500XT space could have been used for basically anything else and it would have been of better use.

Heck they could literally have rereleased the 5600XT with more VRAM and that would make more sense than this piece of shit of a graphics card.

Who the fuck approved this?
 

Bo_Hazem

Banned
This hit me harder than it should have.
That 6500XT space could have been used for basically anything else and it would have been of better use.

Heck they could literally have rereleased the 5600XT with more VRAM and that would make more sense than this piece of shit of a graphics card.

Who the fuck approved this?

This also reminds me of Samsung laughing at Apple over everything, only to follow them a year later. Lack of character and pride.
 
I have a 3060 which as I understand it was supposed to be the budget GPU option. But of course cryptocurrency and COVID happened.
 

STARSBarry

Gold Member
I have a 3060 which as I understand it was supposed to be the budget GPU option. But of course cryptocurrency and COVID happened.

An incredibly good budget option for its price when compared to whatever this is....

Shame you can't get them for RRP; they sell for 3080 RRP prices now.
 
This hit me harder than it should have.
That 6500XT space could have been used for basically anything else and it would have been of better use.
The die is probably very similar to what would appear in a Ryzen integrated GPU with 16 CUs.

In fact, it's using the 6nm node like the new Ryzen 6x00U/H CPUs, and a narrow PCIe bus (typical of integrated graphics), which makes it a bit suspicious.

The RX 6400 with 12 CUs matches the top-end integrated product this year, so if they end up having a GPU chiplet separate from the CPU (I doubt it, but...), there's a high chance the 6500 XT is the result of binning and the rest have CUs disabled due to heat or bad yields.

Anyway, it's a GPU that is only good enough for integrated graphics, and its bus tells its heritage quite well.
 
Last edited:

RoboFu

One of the green rats
Wasted wafer in these hard times. Would've been better used for PS5 and Xbox consoles.
More than likely these were some last binned leftovers lying around in a warehouse somewhere that they thought they could make a quick buck on.
 