
Nvidia's RTX 4060 Ti and AMD's RX 7600 highlight one thing: Intel's $200 Arc A750 GPU is the best budget GPU by far

hlm666

Member
can it be used for 3D/Rendering and such things?
It has Blender support. Here's an article covering some of that stuff, although I'm not sure if the driver improvements have had any effect on these numbers.

 

Buggy Loop

Member
The 6700 XT is, imo.

The A750 will work just fine for esports titles and the like, but if you plan to play upcoming AAA games then I don’t know if I would recommend an 8GB card at any price.

This

Intel's performance is a bit too inconsistent with these drivers. Ideally, Intel improves a lot for the next iteration.

The other alternative is the 6700 non-XT, rarely seen in reviews. It's $10 more, sometimes with deals that bring it down to the same price as the 7600, and offers +6% performance and +2 GB of VRAM at a mere 28 W difference.

Personally, if you want to hop in with a GPU in this state of the market, pick a 6700 XT or 6700 and wait for next-gen GPUs.
 
Last edited:

Rentahamster

Rodent Whores
I hope Intel's next generation of cards has massively increased performance at killer prices. I can't believe we're at the point where we're asking Intel of all companies to save us. This is how bad things have gotten.
 

Soodanim

Gold Member
I hope Intel's next generation of cards has massively increased performance at killer prices. I can't believe we're at the point where we're asking Intel of all companies to save us. This is how bad things have gotten.
It's funny and sad. I was thinking the same thing. If Intel is the one to force the others into being competitive again, that's all sorts of bizarro world.
 

hlm666

Member
After the financials they released and the stock price surge from their AI lead, Nvidia is probably already working on its exit plan from consumer GPUs. AI is the major market for them going forward, and cloud will be where they bother with game hardware, maybe.

"Nvidia’s market cap is just over $950 billion as of Thursday afternoon, up a whopping $170 billion since the 30-year-old company released an earnings report Wednesday night that pushed its share price up 25.6% to $383.43."

 

Rentahamster

Rodent Whores
After the financials they released and the stock price surge from their AI lead, Nvidia is probably already working on its exit plan from consumer GPUs. AI is the major market for them going forward, and cloud will be where they bother with game hardware, maybe.
How do you figure that? Their hardware runs AI algorithms very well, and if they keep all the secret sauce AI hardware for themselves, they'll have a leg up on the competition.
 

PeteBull

Member
Intel's 8 GB proposition is pretty solid value-wise, as long as you only play super mainstream games. If you like niche/older games, the drivers still have tons of room for improvement =/
 

Crayon

Member
Depending on how the Linux support shapes up (did it already? Not paying attention) I'll definitely consider Intel.
 

hlm666

Member
How do you figure that? Their hardware runs AI algorithms very well, and if they keep all the secret sauce AI hardware for themselves, they'll have a leg up on the competition.
Waste of silicon when they can sell it into AI datacentres at much better margins. Did you look at the numbers there?

"Nvidia’s earnings report estimated $11 billion in sales for the second quarter, more than 50% higher than analysts’ predictions."

Gaming used to be their largest revenue source; it was roughly 50/50 the past few years, and now gaming is only about 50% the size of data centre. The writing is on the wall.
 

Rentahamster

Rodent Whores
Waste of silicon when they can sell it into AI datacentres at much better margins. Did you look at the numbers there?

"Nvidia’s earnings report estimated $11 billion in sales for the second quarter, more than 50% higher than analysts’ predictions."

Gaming used to be their largest revenue source; it was roughly 50/50 the past few years, and now gaming is only about 50% the size of data centre. The writing is on the wall.
Oh, I see what you mean. I thought you were suggesting that they'd abandon hardware completely. Yes, if gaming becomes a smaller and smaller piece of their revenue pie, there is less incentive to focus on it. However, if it still prints money, then I don't see why they would abandon it completely.
 

Buggy Loop

Member
Oh, I see what you mean. I thought you were suggesting that they'd abandon hardware completely. Yes, if gaming becomes a smaller and smaller piece of their revenue pie, there is less incentive to focus on it. However, if it still prints money, then I don't see why they would abandon it completely.

Doesn’t make sense to abandon it, since their R&D for professional tasks, ray tracing and AI for enterprise, also contributes to upselling their gaming hardware at higher margins while holding a near monopoly with 82% market share. If the enterprise and gaming models were on drastically different paths, then maybe.

We already see that gaming is of less interest to them as a business. High prices mean less demand, which means more foundry slots for enterprise solutions with better margins. But that’s the thing: they still sell gaming cards at probably a nifty margin.

Sadly, it’s the same for AMD. Enterprise, CPUs, and consoles are their bread and butter. Why enter a price war with Nvidia over products that barely make waves in the hardware world? Especially the flagship cards; those have no chance of going down in price. They’re <1% of the market, so who would sacrifice margin for a meaningless win?

I’ll give them that AMD is smarter than ATI, which insisted on bleeding itself almost to bankruptcy over epeen wars. Sucks for consumers, but this is the reality: these companies don’t need your money, so they build in lots of margin.

Dropping the hardware entirely would be a bigger mindshare problem, and a bigger hit on the stock market, than the cost of keeping a foundry line running for ~1% products.
 

winjer

Gold Member
Intel does have one advantage over AMD's current GPUs, as they already have pretty good dedicated units for ray-tracing and machine learning.
Sadly, AMD is still using shaders for ML and TMUs for RT.
 

StereoVsn

Member
I think for the money it's a decent card. Personally I would spring for a 6700 XT or even 6750 XT, depending on price/sales.

Or maybe the Intel alternative with over 8 GB, but I'm not sure on pricing for that. I wouldn't worry about RT because, let's face it, with these cards you wouldn't use it all that much.
 

Buggy Loop

Member
Intel does have one advantage over AMD's current GPUs, as they already have pretty good dedicated units for ray-tracing and machine learning.
Sadly, AMD is still using shaders for ML and TMUs for RT.

Next iteration, if they play their cards right, they can capture some nice market share. Unless Nvidia sees a threat to its >80% market share and decides to compete on price.

Sadly, it’ll be mid-range only on Intel's side. That’s the smartest decision for them, but it sucks a bit for those waiting for the next 3080-level bang for the buck.
 

LordOfChaos

Member
The driver improvements for these cards have been extremely impressive and keep coming. However, I’d say the Intel cards are attractive because of how bad the others are, not due to how good they innately are.

Yes, of course. Nvidia and AMD just haven't been that interested in selling low end silicon when they can move every high end chip they make. I'd be happy for Intel to start gobbling some share starting with the low end in response.

Only thing is, they practically require Resizable BAR or performance is shit, so it excludes a lot of potential systems a few years old from a modern cheap GPU upgrade.
 

LordOfChaos

Member
can it be used for 3D/Rendering and such things?

Pretty good on Blender.



[Chart: Blender Cycles GPU render performance — Scanlands scene, no OptiX]



Nvidia still has a lead with OptiX, using their ray tracing acceleration hardware, but Intel is already beating the equivalent tier of AMD card without OptiX. And given that Intel already has stronger ray tracing acceleration hardware than AMD, using it could give even better results down the line.
 
Really makes you wonder what the next decade is going to be like. I wonder if Intel’s GPUs put them in the console business if Nvidia decides it’s not worth it. That would be terrible for the prospects of a BC Super Switch.
 
I saw a pretty good prebuilt (AM5 DDR5 Ryzen 7) with a 6800XT.

Fuck these new overpriced GPUs; go with that if you want to go budget.
 
Friends don't let friends buy Intel GPUs. The Arc 370m in my laptop is WORSE in a lot of games than the 12700h's integrated graphics. The drivers have not improved performance in my favorite titles, so I really couldn't care less about the supposed 'support' these cards are receiving. A whole generation of beta testers, from a company with immense resources, is unacceptable imo.
 
The 6700 XT is, imo.

The A750 will work just fine for esports titles and the like, but if you plan to play upcoming AAA games then I don’t know if I would recommend an 8GB card at any price.

Yeah 6700 XT is the best value on market if you want something that'll last a few years at least.

6600 XT is best if you don't quite need all the performance of a 6700 XT (you game at 1080p)
 
No clue about the price difference between the 4060 Ti and A750, but the A750 does 60–70 FPS in Forza Horizon 5 maxed at 1080p, while Nvidia's card does around 150 FPS with DLSS.

This applies to pretty much any DLSS 3 game, which is going to be almost every big game from now on except the ones with marketing deals, and not much else.

Also, the 4060 Ti's power consumption is about half the A750's.

Now, if technologies like DLSS 3 or RT didn't exist, I might ignore both and go for AMD.
 