You have no idea how this works, right?
Imagine you can render your Ratchet at native 4K 60 fps.
But now you want the super cool ray-traced reflections. Unfortunately you can't have nice reflections at native 4K 60 fps because the RT cores are weak, so you have to compromise.
It's even worse if AMD doesn't have dedicated RT cores like NVIDIA does; again, the hardware is weak.
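The compromise above is just frame-budget arithmetic. Here is a minimal sketch with made-up, hypothetical pass timings (real costs vary per game and GPU; the point is only that the passes have to fit inside one 16.7 ms frame at 60 fps):

```python
# Hypothetical frame-time budget math. All timings are invented for illustration.
TARGET_FPS = 60
budget_ms = 1000 / TARGET_FPS  # ~16.67 ms available per frame at 60 fps

raster_4k_ms = 14.0      # assumed cost to rasterize the frame at native 4K
rt_reflections_ms = 6.0  # assumed cost of a ray-traced reflection pass

def fits(*passes):
    """True if the listed passes fit inside one 60 fps frame."""
    return sum(passes) <= budget_ms

print(fits(raster_4k_ms))                     # native 4K alone: True, it fits
print(fits(raster_4k_ms, rt_reflections_ms))  # 4K + RT: False, over budget

# The usual compromise: drop the internal resolution so rasterization gets
# cheaper and the RT pass fits (cost scales very roughly with pixel count).
raster_1440p_ms = raster_4k_ms * (2560 * 1440) / (3840 * 2160)
print(fits(raster_1440p_ms, rt_reflections_ms))  # 1440p + RT: True, it fits
```

That is the whole trade-off: native resolution or RT effects, rarely both, unless the RT hardware is fast enough.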
You are insulting other users because you pathetically want to comment on something you know nothing about, lol.
This is exactly what I was talking about in the OP.
You should click it; it will show all the Jaguar CPUs and how the consoles' APUs were superior to everything else, including APUs made for desktops.
The AnandTech article quotes the journalists' opinion (which you can't even understand).
The Wikipedia page shows facts.
Next step: logic. But oh boy, you have a lot of work to do on that one.
The dedicated RT hardware is an opportunity, not a cost.
It can be used for many things, including RT lighting.
If it is powerful enough, it can replace things that were done with the rest of the hardware, therefore freeing resources.
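The "freeing resources" point can be sketched with the same kind of arithmetic, again with invented numbers: if dedicated RT units take over a pass the shader cores used to run (say, screen-space reflections), that shader time goes back into the budget. This is a simplification (real GPUs don't overlap work this cleanly), but it shows the idea:

```python
# Hypothetical, purely illustrative numbers.
budget_ms = 1000 / 60

# Without dedicated RT units: everything runs on the shader cores.
shader_only = {"geometry+shading": 12.0, "ssr_reflections": 3.0}

# With dedicated RT units: the reflection pass moves onto the RT hardware,
# assumed here to run alongside the shader cores.
shader_side = {"geometry+shading": 12.0}
rt_side = {"rt_reflections": 4.0}

freed_ms = sum(shader_only.values()) - sum(shader_side.values())
print(f"shader time freed: {freed_ms:.1f} ms")  # time the shaders can spend elsewhere
```

So the RT units don't just buy you reflections; whatever the shader cores no longer have to fake (here 3 ms of screen-space work) can be spent on more geometry, effects, or resolution instead.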
Yes, they look good, but not jaw-dropping like previous next-gen titles.
More insults at a random user on the internet; you are definitely the stable guy here.
If more detailed graphics equal more man-hours, how are indie teams made up of a handful of people able to create games that look much better than anything created in previous generations?
You may be a "graphics whore", but you don't seem to understand much about it. Sad...