Each time I jump from Destiny 2 on PS4 to the PS5 version, it feels like I'm playing games from two completely different generations of consoles, not a last-gen title with a couple of new features. 60fps makes that much of a difference.
Really depends on the game and the design and engine choices the devs make. I use a 4K display, so I'd like the image to be as close to native as possible, but it isn't a deal breaker. Playing primarily on PC, I know how much of a hog native 4K is.
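To put rough numbers on that (just pixel arithmetic on my part, not benchmarks): 3840 × 2160 is about 8.3 million pixels, versus roughly 2.1 million at 1920 × 1080, so native 4K pushes four times the pixels of 1080p every single frame.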
On the consoles, AMD FidelityFX Super Resolution should hopefully deliver results somewhat comparable to DLSS, letting a native 1080p image be upscaled to look really close to native 4K. Where that works, I think it becomes a non-issue whether devs render at native 4K or upscale from a lower base resolution.
The extra horsepower should be used for better physics, performance, and of course graphics.
I don't think the focus even needs to stop at 60fps; a lot of more demanding games could run at 1080p/120fps, perhaps with RT included to some degree, with upscaling used to make the RT effects look much better.
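For frame-budget context (simple arithmetic, not profiling data): 120fps leaves the GPU roughly 8.3 ms per frame versus about 16.7 ms at 60fps, which is exactly why dropping to a 1080p base resolution frees up the headroom.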
I don't mind a lower internal resolution, as in Returnal, if the upscaling tech produces a crisp image; essentially, you don't see the difference.
However, once I knew it was rendering at 1080p internally, I started looking for artefacts and found a few.
My biggest gripe is the low-quality in-game face of Selene; when you pan the camera and look at it, it does look like a PS3 game.