
PCGamer || Intel : 'We're definitely competitive or better than Nvidia with ray tracing hardware'; 'a card that's faster than the [RTX] 3060 at prices

Draugoth

Gold Member
Intel is taking a poke at Nvidia:

https://www.pcgamer.com/intel-arc-a770-a750-rtx-3060-performance/

https://www.pcgamer.com/intel-arc-ray-tracing-performance/

"When you have a title that is optimised for Intel, in the sense that it runs well on DX12, you're gonna get performance that's significantly above an [RTX] 3060,"
"And this is A750 compared to a 3060, so 17%, 14%, 10%. It's going to vary of course based on the title."
"And so from my perspective, the gamer says, 'Okay, if I want to play DX12 right now, I'm gonna get an amazing value.' You're gonna get a card that's faster than 3060, at prices that are lower. If you think you're playing DX12 and DX11, and you're splitting it, you're still gonna get amazing value. Because we're gonna price it for the performance that we're delivering today."

I've only the CliffsNotes version and not a complete whitepaper, but one important puzzle piece to this touted performance is a BVH cache within the GPU. This is solely used to accelerate BVH traversal—BVH stands for bounding volume hierarchy and is a cornerstone of how modern ray tracing is implemented in real-time in games.
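For anyone unfamiliar with the term: a BVH is just a tree of nested bounding boxes, so a ray that misses a box can skip everything inside it. Here's a minimal Python sketch of that idea (purely illustrative; the node layout and names are my own assumptions, not anything from Intel's design):

```python
# Minimal sketch of stack-based BVH traversal. A BVH (bounding volume
# hierarchy) wraps geometry in nested axis-aligned boxes so a ray can
# prune whole subtrees whose box it misses.
from dataclasses import dataclass, field

@dataclass
class Node:
    lo: tuple                                      # AABB min corner (x, y, z)
    hi: tuple                                      # AABB max corner
    children: list = field(default_factory=list)   # empty => leaf
    prims: list = field(default_factory=list)      # primitive ids at a leaf

def ray_hits_box(origin, inv_dir, lo, hi):
    """Slab test: does the ray intersect the axis-aligned box?"""
    tmin, tmax = 0.0, float("inf")
    for o, d, l, h in zip(origin, inv_dir, lo, hi):
        t0, t1 = (l - o) * d, (h - o) * d
        if t0 > t1:
            t0, t1 = t1, t0
        tmin, tmax = max(tmin, t0), min(tmax, t1)
        if tmin > tmax:
            return False
    return True

def traverse(root, origin, direction):
    """Collect primitive ids in every leaf whose box the ray enters."""
    inv_dir = tuple(1.0 / d if d != 0 else float("inf") for d in direction)
    hits, stack = [], [root]
    while stack:
        node = stack.pop()
        if not ray_hits_box(origin, inv_dir, node.lo, node.hi):
            continue                     # prune the whole subtree
        if node.children:
            stack.extend(node.children)  # descend into inner node
        else:
            hits.extend(node.prims)      # leaf: candidate primitives
    return hits
```

A BVH cache would keep the hot upper levels of a tree like this on-chip, so traversal doesn't re-fetch them from memory for every ray.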
"We tried to make ours generic because we know that we're not the established GPU vendor, right. So all of our technology pretty much has to work with low dev rel (developer relations) or dev tech engagement. And so things like our cache structure and our hierarchy, you know, our thread sorting unit, which are the two techs that we're going to talk about in this video, they work without any dev rel or dev tech work."
"I'm kind of torn on this one. Because to your point, there's some things that you would normally expect to lag. And the reason you would expect them to lag is because they're hard, and they need to come after you have a solid base. But for better or worse, we just said we need all these things. And so we did XeSS, we did RT, we did AV1, we kind of have a lot on the plate, right? I think we've learned that maybe, you know, in this case, we have a lot on the plate and we're gonna land all the planes, and that's taken us longer than we would have expected."
 

jigglet

Member
I would never buy a GPU from a new maker at least for the first few years.

I work in IT and I'm pretty good at fixing problems, but when I game, I don't even want to waste 1 minute of my spare time to install a patch or troubleshoot an issue. Even if it's a quick fix, it's too much. I want a mature hardware / driver system that I never have to think about. I wouldn't even consider Intel / a new entrant until like 5 years has passed.
 

Crayon

Member
They have to take a bath on the prices for this thing to get out of the gate at all. Let the first gen limp along and try to make the second gen a lot stronger. I'd love to have a third player, and Intel should be able to swing it.
 

Reallink

Member
Problem for Intel is Nvidia could literally (if they so chose) launch a $349 4060 next week that's faster than a 3080 while still earning a very tidy profit on it. Meanwhile Intel will have to eat shit on a card that's significantly slower than a 3060 in like 80-90% of games.
 

BadBurger

Is 'That Pure Potato'
I'll just wait on the inevitable tests once hardware gets into people's hands. It reads like some executive who used to work in some other division took over this new GPU endeavor and was just winging it in the interview / Q&A.
 
I'm confused at the tone of some comments in this thread. Short of large quantities of stock in Intel competition in the GPU space (or I guess a short position on Intel itself), what possible reason could a NeoGAF member have to want this to fail?

Another competitor in the space would be awesome for gamers. GPU prices have been completely inflated for years, and this will hopefully help.

This place is really strange sometimes.
 
It would be great if Intel could deliver quality at a reasonable price point, but if you look at the benchmarks you really don’t see it yet.
 
When you have a title that is optimised for Intel [...] it runs well
and
So all of our technology pretty much has to work with low dev rel (developer relations) or dev tech engagement. And so things like our cache structure and our hierarchy, you know, our thread sorting unit, which are the two techs that we're going to talk about in this video, they work without any dev rel or dev tech work.

contradicts itself a bit.
It just works because it has to, but it only runs well if you put in the work as a dev.
I guess one is a bottom-line statement and the other is talking about details of the entire pipeline, but it still sounds a bit odd.

Intel just has to catch up, since they were never properly in this business, although all those iGPUs should have taught them more over the years. Wasn't it about a decade ago that laptops were shown running the then-current F1 game at roughly Steam Deck level, on Intel HD graphics? AMD has struggled to have its Zen moment in the GPU space since the 9800 Pro, and Nvidia has had some stinkers now and then too; hopefully Intel figures things out in the coming years and at least helps bring prices down.
 

Panajev2001a

GAF's Pleasant Genius
I'm confused at the tone of some comments in this thread. Short of large quantities of stock in Intel competition in the GPU space (or I guess a short position on Intel itself), what possible reason could a NeoGAF member have to want this to fail?

Another competitor in the space would be awesome for gamers. GPU prices have been completely inflated for years, and this will hopefully help.

This place is really strange sometimes.
I think the backlash may come from fear on the AMD side plus NVIDIA aficionados; plus, Intel themselves are in need of a humbling as a company.

Still, Intel becoming a viable competitor does not help the market become more competitive unless… they steal NVIDIA market share. If NVIDIA's market share stays roughly the same and Intel only cuts away at AMD's GPU market share, that just makes NVIDIA happier, as it helps kill AMD off and paves the way for a more complete monopoly.
 
Well, they're launching two years after the RTX 3060, so launching a new card with a more expensive, bigger chip on a more advanced N6 process node and worse power/performance isn't much of an achievement, even if you claim 15% better ray tracing performance.

It's still great that a new player has finally arrived on the GPU market, but I don't know if Intel should be playing the "performance crown" card. They'd better come up with some crazy good prices.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I would never buy a GPU from a new maker at least for the first few years.

I work in IT and I'm pretty good at fixing problems, but when I game, I don't even want to waste 1 minute of my spare time to install a patch or troubleshoot an issue. Even if it's a quick fix, it's too much. I want a mature hardware / driver system that I never have to think about. I wouldn't even consider Intel / a new entrant until like 5 years has passed.
You say that as if you've lived through multiple generations of new gaming GPU makers entering the market.
If you remember ATI and Nvidia entering the GPU space, how long did it take them to become dominant?

Not that I'm championing Intel's Arc GPUs. I'm already well above their highest spec, but if their RT implementation is as good as they say it is, then the B series GPUs could actually be worthwhile for a lot of people.

The A series is just too weak and too late to the party. Ada and RDNA3 are right around the corner, and we don't have any independent benchmarks of the A770 yet.

The B series needs to come out sometime next year, or they should skip B entirely and go straight to the C series, so they can at least match AMD's and Nvidia's mid-rangers.
 

GreatnessRD

Member
Intel dropped the ball so hard with this GPU launch. I just hope they stick with it and become a true third competitor in the space, but man, the A series is looking real bad right now, lol.
 

Skifi28

Member
By the time they launch they'll be competing with the 4060. But setting that aside, you'll maybe be getting 10-15% better performance in some DX12 games in exchange for half the performance in most older titles. You decide if it's worth it; I'd still take the 3060.
 
If you remember ATI and Nvidia entering the GPU space how long did it take them to become dominant?
Neither was king on their first try, but the Riva TNT and GeForce 256 already killed 3dfx, the first dominant card manufacturer. Matrox and SiS never got off the ground, and the 9000 series was the only time ATI was actually ahead. So if Intel sticks the landing with its third or fourth iteration, it would be a similar story. The GPU market was still in its infancy back then, though, while Intel is entering very late against competitors with a ton of driver development history and far more features, so they have to put in much more effort just to get up to speed and maybe squeeze in.
 

Drew1440

Member
I would never buy a GPU from a new maker at least for the first few years.

I work in IT and I'm pretty good at fixing problems, but when I game, I don't even want to waste 1 minute of my spare time to install a patch or troubleshoot an issue. Even if it's a quick fix, it's too much. I want a mature hardware / driver system that I never have to think about. I wouldn't even consider Intel / a new entrant until like 5 years has passed.
Intel have been making graphics processors for over 22 years.
https://en.m.wikipedia.org/wiki/Intel740
 

winjer

Member
Competitive with Nvidia? LOL
Even Intel's own slides say the opposite.




The best that the fastest Arc GPU, the A770, can do is match the second-slowest Ampere GPU.
A card that Intel was promoting as having performance between the 3060 Ti and 3070.
And let's remember that Ada Lovelace is almost on the market.
Probably the best that Arc GPUs can do is beat RDNA2 in ray tracing. Maybe.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Competitive with Nvidia? LOL
Even Intel's own slides say the opposite.




The best that the fastest Arc GPU, the A770, can do is match the second-slowest Ampere GPU.
A card that Intel was promoting as having performance between the 3060 Ti and 3070.
And let's remember that Ada Lovelace is almost on the market.
Probably the best that Arc GPUs can do is beat RDNA2 in ray tracing. Maybe.
If the A750 is matching the RTX 3060, I'm pretty sure the A770 should handily beat it, maybe reaching RTX 3060 Ti levels.

They haven't shown us any of the RT benchmarks yet.
That's what I'm interested in.
 

winjer

Member
If the A750 is matching the RTX 3060, I'm pretty sure the A770 should handily beat it, maybe reaching RTX 3060 Ti levels.

They haven't shown us any of the RT benchmarks yet.
That's what I'm interested in.

Maybe, but the difference between a 3060 and a 3060Ti is quite big. Much more than the name suggests.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Maybe, but the difference between a 3060 and a 3060Ti is quite big. Much more than the name suggests.
Yeah, the gulf between the 3060 and 3060 Ti is massive. Nvidia was smart to give them such similar names so those not in the know assume they're close.

The A750 is a 28 Xe-core GPU.
The A770 is a 32 Xe-core GPU.

There isn't a huge difference between them, but it might be enough to get the A770 closer to the 3060 Ti than to the 3060.
I don't expect it to actually beat the 3060 Ti, but if it can get close enough in pure raster and be a proper match in ray tracing, then they've basically achieved what they set out to achieve.

It's late. Very late. Very, very late. But at least it's done.

The only interesting thing about the Arc GPUs is all the talk about them having developed some better way to accelerate ray tracing. If it's as efficient and performant as they say, their cards may lose in raster even in the B series, but then they'll just bench games with ray tracing enabled to make themselves look better.
I don't expect Intel to have xx80-level cards at any point before the D series, but if they always have an xx70-level card, then they're at least in the running.
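The A750 vs A770 core-count gap is easy to sanity-check. A quick back-of-the-envelope calculation (core counts as quoted above; perfect scaling is my assumption and is only an upper bound, since clocks and memory bandwidth usually drag real uplift below it):

```python
# Xe-core counts as quoted in the post above.
a750_cores, a770_cores = 28, 32

# Relative uplift assuming perfect scaling with core count (upper bound).
uplift = a770_cores / a750_cores - 1
print(f"A770 has {uplift:.1%} more Xe cores than the A750")  # 14.3%
```

So even in the best case the A770 only has ~14% more compute than the A750, which is why "closer to the 3060 Ti" is about the ceiling one could expect.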
 

hlm666

Member
It's Tom Petersen, so no wonder it sounds like the bullshit spin from Nvidia during the Turing launches. He kept trying to say XeSS was open source in the DF interview as well, then tacking things on in brackets like "it plugs into our custom engines/binaries in the background, but the SDK is open source". The guy couldn't lie straight in bed.
 

supernova8

Member
I dont expect Intel to have xx80 level cards at any point before the D series, but if they always have an xx70 level card, then they are in the running atleast.
Plus, for anyone in the market for a 3060-level card at a good price (assuming Intel prices aggressively), xx80 and xx70 cards are irrelevant anyway. Steam hardware surveys suggest the vast majority of people are not remotely interested in anything over $300 or so. I would get the 3060 competitor if it does the job.
 

Skifi28

Member
Intel have been making graphics processors for over 22 years.
https://en.m.wikipedia.org/wiki/Intel740
And despite that, their drivers have been so shit that they've completely abandoned them and are going with emulation for DX9 and earlier, despite supposedly building on them for 22 years. Actions speak louder than words. I don't trust their GPUs to work properly, and I don't just play games released in the past two years.
 
I don't think it's a big deal; we're still in the infancy of ray tracing anyway, at least until new consoles come along that can do it at a reasonable level. But progress is progress, and dynamic resolution plus reconstruction tech will definitely help.
 

winjer

Member
And despite that, their drivers have been so shit that they've completely abandoned them and are going with emulation for DX9 and earlier, despite supposedly building on them for 22 years. Actions speak louder than words. I don't trust their GPUs to work properly, and I don't just play games released in the past two years.

Exactly. And if we count all GPUs, including integrated, Intel is the biggest producer in the world.
So they should have had an incentive to provide good drivers even before releasing Arc.

The hardware might have potential, but those drivers are very bad, and it will take a long time to fix and improve everything they need to.
Maybe by the time Intel releases Battlemage, a couple of years from now, their drivers will be on point.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Plus for anyone who was in the market for a 3060 level card for a good price (assuming Intel prices aggressively), xx80 and xx70 cards are irrelevant anyway. Steam hardware surveys suggest that the vast vast majority of people are not remotely interested in stuff over like $300ish. I would get the 3060-competitor if it does the job.


Exactly.
Focusing on the vast majority of the market makes more sense than aiming for the upper echelon.
xx60-class cards are what sell the most.
Intel making cards that compete at that level makes them viable for a lot more people, especially if they price them competitively.
Add in moneyhatting XeSS adoption plus their supposed RT advantage, and Intel's game plan is sound; they were just late to the party with the A series.
Focus on getting the B and C series out in time and this really could be a three-horse race. (Not really; we know Nvidia is gonna be killing it either way.)
 

supernova8

Member


Exactly.
Focusing on the vast majority of the market makes more sense than aiming for the upper echelon.
xx60-class cards are what sell the most.
Intel making cards that compete at that level makes them viable for a lot more people, especially if they price them competitively.
Add in moneyhatting XeSS adoption plus their supposed RT advantage, and Intel's game plan is sound; they were just late to the party with the A series.
Focus on getting the B and C series out in time and this really could be a three-horse race. (Not really; we know Nvidia is gonna be killing it either way.)
I just wonder what price they'll go for. We already hear they're raising prices on the CPU side.

I suspect we'll either get loads of stock from day one at a "hmm, fair enough" price, or not much stock at a "wow, bargain" price, followed by a bit of a drought (to drum up demand) and then more stock.
 

Tams

Member
It's hilarious seeing all these posts mocking Intel's latest GPU efforts, considering how sure many on this forum were that Intel would blow NVidia and AMD away with their first desktop GPU line.... b-b-but Raja... lol. Who's your daddy now, bitches?
Don't count me in this. I always thought this was going to be a dumpster fire.

Intel's history of doing graphics themselves. Their company culture. Their greed. Their ignorance. Raja fucking Koduri (thank goodness AMD managed to get rid of that clown).
 

supernova8

Member
Also this bit in the interview is hilarious
Petersen wouldn't unveil the price of either card just yet, but mentioned that the launch will happen "very, very soon." So keep your eyes peeled as we should hear about pricing and availability momentarily.
There is definitely a limit to how many times you can say "it's coming soon" before people start laughing, and Intel has exceeded that number already.

Plus, it would be funny if PC Gamer ran a much shorter article saying "Intel approached us to talk about their upcoming Arc graphics cards but we told them to swivel and come back when they can actually give us review units to play with and are not under any restrictions (beyond an embargo)".

 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
It's hilarious seeing all these posts mocking Intel's latest GPU efforts, considering how sure many on this forum were that Intel would blow NVidia and AMD away with their first desktop GPU line.... b-b-but Raja... lol. Who's your daddy now, bitches?
Intel internally said they were targeting ~RTX 3060 Ti/RTX 3070 with their top-end card.

I don't think anyone was expecting RTX 3080 levels of performance, when everything reported on Arc before we even saw the units said their top card was basically a midranger.

Assuming the A770 is within spitting distance of the RTX 3060 Ti, then they achieved what they wanted to achieve.
Just a little late, because Ada is around the corner. Had the A770 launched properly in 2021, it wouldn't look as dire as it does now with RDNA3 and Ada imminent.

This is the leaked slide from early 2021:


They were always aiming for SOC2, the small one, to race the GTX 1650,
and SOC1 (A7) to race the RTX 3060 - RTX 3070.


So who exactly are you implying was expecting anything more?
 

I Master l

Member
Their GPUs have dedicated AI hardware, which is impressive considering AMD's GPUs don't, and the performance is not bad for a first gen.
 

//DEVIL//

Member
Lol 🤣 at "competitive". Were they able to get Dirt 5 to run? Last time I checked, the game didn't even start on Intel drivers according to Hardware Unboxed (or maybe it was a similarly silly game like that, I forget).
 

FStubbs

Member
I'm confused at the tone of some comments in this thread. Short of large quantities of stock in Intel competition in the GPU space (or I guess a short position on Intel itself), what possible reason could a NeoGAF member have to want this to fail?

Another competitor in the space would be awesome for gamers. GPU prices have been completely inflated for years, and this will hopefully help.

This place is really strange sometimes.
You're looking at Intel for price relief?
 