
Apple's M1 Max GPU is as powerful as an Nvidia RTX 2080 desktop GPU and the Sony PS5 gaming console

ethomaz

Banned


Intel's Open Image Denoise benchmark.

The Ryzen 4700S's score of 4.4 is about half the Ryzen 7 4750G's 8.1, and in line with the Ryzen 7 2700X's 4.4.

PS: the Intel Core i5 11400 (Rocket Lake) has AVX-512. In real-world gaming workloads, the PS5's raytracing denoise pass runs on the GPU's shaders.

The Intel Core i5 11400's (Rocket Lake, with AVX-512) score of 11.5 is nearly double the Ryzen 5 5600X's 7.0.
Which part of "heavily limited by cache/memory latency" didn't you understand?

GDDR6 kills the 4700S's performance in these benchmarks.
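As a quick sanity check of the relative positions quoted above (scores copied from the post, higher is better; this is just arithmetic, not a re-run of the benchmark):

```python
# Open Image Denoise scores as reported in the post (higher is better).
scores = {
    "Ryzen 7 4750G": 8.1,
    "Ryzen 4700S": 4.4,
    "Ryzen 7 2700X": 4.4,
    "Core i5 11400": 11.5,
    "Ryzen 5 5600X": 7.0,
}

# 4700S lands at roughly half the 4750G despite sharing Zen 2 cores,
# while the AVX-512-equipped 11400 is well ahead of the 5600X.
ratio_4700s = scores["Ryzen 4700S"] / scores["Ryzen 7 4750G"]    # ~0.54
ratio_11400 = scores["Core i5 11400"] / scores["Ryzen 5 5600X"]  # ~1.64
print(f"4700S vs 4750G: {ratio_4700s:.0%}")
print(f"11400 vs 5600X: {ratio_11400:.0%}")
```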
 

rnlval

Member
Which part of "heavily limited by cache/memory latency" didn't you understand?

GDDR6 kills the 4700S's performance in these benchmarks.
Nope. SIMD image processing is a streaming workload, closer to a GPU's streaming behavior. Game logic is what needs lower latency.





The Ryzen 4700S's Geekbench 5 integer score is similar to the Ryzen 4750G's; the 4700S's integer performance is unaffected by the GDDR6 memory design.
 

rnlval

Member
Which part of "heavily limited by cache/memory latency" didn't you understand?

GDDR6 kills the 4700S's performance in these benchmarks.
The Ryzen 4700S's results mirror the Intel oneAPI raytracing benchmark results.




Notice that the Ryzen 4700S's basic integer IPC results are close to desktop Zen 2's, while its vector math takes a large performance hit. The PS5's CPU is tailored to its intended workloads.

Instruction set IPC benchmark table from https://www.hardwareluxx.de/index.p...-ryzen-4700s-desktop-kit-im-test.html?start=1
 

LordOfChaos

Member
Holy Shit 128 Core Quad GPU?


The performance, with reasonable scaling across that off-die hop, looks pretty great.

Two things I'm wondering about. First, if I only want to add GPU cores, am I always stuck adding more CPU cores with another M1 Max die? My workload is actually the opposite: I crush CPUs and RAM 99% of the time, not so much the GPU, so I'd always be stuck adding GPU cores with it. I wonder if there could be binned chips where only the CPU or GPU cores are enabled, plus the connector and everything else needed to make it work.

The other thing is that with a GPU of this scale (128 cores in Apple speak means 16,384 shaders), I really want to see an approach to accelerated ray tracing at this point. M1 supports it, but only in the sense that GPUs supported it in software before the RTX 20 series: not very fast.
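The shader math above is easy to verify: Apple counts 128 FP32 ALUs per GPU "core". A rough sketch (the ~1.27 GHz clock is an assumption back-solved from the commonly cited 10.4 TFLOPS figure for the 32-core M1 Max, not an official spec):

```python
# Back-of-envelope for the "128 cores = 16384 shaders" figure above.
ALUS_PER_CORE = 128   # Apple counts 128 FP32 ALUs per GPU "core"
CLOCK_GHZ = 1.27      # assumed; back-solved from the cited 10.4 TFLOPS

def shaders(cores):
    return cores * ALUS_PER_CORE

def fp32_tflops(cores):
    # 2 FLOPs per ALU per cycle (fused multiply-add)
    return shaders(cores) * 2 * CLOCK_GHZ / 1000

print(shaders(128))               # 16384
print(f"{fp32_tflops(32):.1f}")   # ~10.4 TFLOPS for one M1 Max
print(f"{fp32_tflops(128):.1f}")  # ~41.6 TFLOPS for a quad-die part
```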
 
Will it run at least 10 decent titles before the M2 release? I highly doubt it. I always use this logic when deciding whether to build a newer Nvidia rig, because the outcome has always been disappointing, so I stay sceptical.
 

T-Cake

Member
Right? There's nothing to play on it, and what there is performs poorly relative to equally priced competition. Delusional.

I've been watching a couple of lads test out the M1 Max chips, and because there are literally only one or two games for ARM+Metal, most titles don't even run at 1080p60 because you have to use CrossOver and/or Parallels.
 

Drew1440

Member
Are Apple still using PowerVR-based designs for the M1 chips? The IMG CXT was recently introduced, claiming 1.5 TFLOPS per core (but the maximum number of supported cores isn't listed).
 

RPSleon

Member
Apple wrote the book on spin…Sony just copied it

just chiming in to say M1 chip support on most mainstream apps is pretty shitty outside of the marketing deals Apple has

u buy an M1 mac and u will quickly hate your decision
If you buy it for gaming*

I love mine. My first Apple product, and it's great for what I do. I'm kinda glad I can't game in my workspace tbh 😂
 

jufonuk

not tag worthy
I got a MacBook Air with an M1 chip. Does that mean I can play PS5 games? Which of the two, Thunderbolt or headphone jack (wtf, iPhones have no jacks but this does?!), do I insert the disc into?
 

JCK75

Member
Wow, it's almost as powerful as a GPU that costs a quarter of its price (and that's counting the inflated price gouging currently taking place).
Can't wait until this powers the next Nintendo Switch that will retail for $9K
 

Panajev2001a

GAF's Pleasant Genius
Are Apple still using PowerVR based designs for the M1 chips? The IMG CXT has recently been introduced which claims 1.5TFlops per core (but does not list the max amount of cores supported)
Apple designs its own GPUs and graphics APIs, but it does have a long-term licensing agreement with Imagination Technologies, so it must have access to that IP and future IP too.

I think they have a year or so before they decide to integrate ray tracing in their GPUs; they likely need something that works acceptably well on the iPhone and then scales outwards, and they will possibly want to make their own customisations.

Predicting the performance of Apple's SoCs is interesting because: a) gaming-wise, Metal is still not a target for devs, and few AAA games are produced for it without layers and layers of abstraction; and b) the mix of CPU, GPU, local bus, and custom memory setup is still carrying them very far. Tons of caches and local storage on the SoC, plus very low-latency main RAM on the same package as the SoC: they are just flexing those low-level balancing optimisations, bumping frequency, adding cores and execution units (in the last two years they basically doubled the execution pipelines of the "efficiency cores"), and so on. They have not felt the need to bring in massive new features like HW raytracing, HW decompression, or additional co-processors for I/O. For the latter they are pushing an interface as fast as 7.2 GB/s of raw bandwidth, but it is unclear what HW they provide to avoid saturating the CPU cores shuttling all that data back and forth at that speed.
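To give that 7.2 GB/s figure some scale, a trivial illustration (illustrative arithmetic only, ignoring decompression and filesystem overhead; the 64 GB working-set size is an arbitrary example, not from the post):

```python
# Rough feel for a 7.2 GB/s raw storage interface: how long it takes
# to stream a given working set at that rate, with no overheads.
RAW_GBPS = 7.2

def stream_seconds(gigabytes):
    return gigabytes / RAW_GBPS

print(f"{stream_seconds(64):.1f} s")  # ~8.9 s to read 64 GB
```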
 

UnNamed

18+ Member
Apple makes very powerful products, no doubt. But for now they lack any proof that their products are actually better in real usage.

Why should I care about the M1's powerful X and Y specs if a $4K Apple laptop is blown away by a much cheaper PC laptop in real games and real applications? Why should I buy Apple in 2022 when the software will (maybe) be optimized in 2023?
 