Well, I don't know about AMD, but there were reports where Nvidia did manipulate the performance of their older cards to make their newer cards look better. You have to keep in mind that on PC a lot of optimization is done through the GPU driver, and those guys want to sell new GPUs, while console tech teams do API optimizations throughout the gen. So to say consoles don't punch above their weight is just wrong; the amount, though, is debatable.
Kepler was rigged from the get-go, not later on:
1) Kepler needed specific codepath optimizations to get the best performance out of it.
2) Kepler was gimped for DX12: no proper support, only compatibility. Most DX12 features actively hurt Kepler's performance.
The days when GPUs needed driver optimizations to perform well are over. DX12 is a different beast: it's now up to devs to code for architectures instead of the other way around. This is why Pascal, 7 years on, is still performant. This is why a 1080 Ti can still beat a brand-new 3060 even in recent titles: devs can't simply choose to ignore those cards. They must actively test their game so that Pascal performs well in their title. This is why, even in 2022, games like Spider-Man run great on Pascal:
1) It has proper DX12 capabilities.
2) It does not need or depend on code optimizations from Nvidia.
The card just works. Same for Turing and Ampere. It's been 5 years since Turing released, and the 2080 Ti still matches or sometimes outperforms the 3070 (sometimes thanks to the 3070's VRAM problems, but generally they're on par). Same goes for the other cards in the stack.
So the Kepler/Fermi days, when NVIDIA had full control over how their cards would perform, are over. They had to be; DX12 is not built like that. Nvidia likely had to design their cards so they aren't strangled by specific driver codepaths, and it shows. It's also best for their business interests, as the myriad of Pascal users is what drives AAA game sales on Steam. Without the famous 1060s, 2060s, and 3060s performing as expected, game sales would be jank (as if they're not already) outside of popular multiplayer titles.
Turing and Ampere are even two or three steps further along than Pascal; they're actually future-proofed, as long as you have enough VRAM, with extra DX12.2 capabilities that are yet to be explored.
Practically, a 12 GB 3060 is the exact opposite of what a 2 GB GTX 770 was. One was gimped from the start, never intended for future console ports (DX12), and equipped with a hilariously small amount of VRAM. The other comes with the features that future console ports will use (DX12.2) and is equipped with plenty of VRAM.
Cards like the 3060 Ti/2070 will be okay at 1080p, but the 3070 is already drawing a lot of attention, with 8 GB being problematic at a 1440p target.
Another perspective if you're not convinced: it took only 2-2.5 years for a GTX 970 to decimate a 780 Ti. The 970 was supposed to sit between the 780 and the 780 Ti, but the situation went so sour that it ended up destroying the 780 Ti in certain DX12 titles, and even in some DX11.1 titles (because Kepler was so gimped that it literally didn't even support DX11.1 features).
If such a thing were to happen in an era where PC gaming is much more popular and cards like the 1060 and 2060 dominate the Steam surveys, it would be a PR nightmare. It simply can't happen anymore. PC gaming got too big for any dev or for Nvidia to purposefully gimp the performance of cards; it isn't even beneficial for any of the parties involved.