Think about what you're saying here: you're saying there is no difference between console and PC optimisation, yet you're saying the PC port is bad because the gap between PC and console fps isn't as large as in other games. So why can't it work the other way? If there were no difference, wouldn't the delta always be proportional?
What I'm saying is that 50fps@4K for a game that looks as good as it does is in fact good performance. TLOU running at a slightly higher 36fps on PS5 than A Plague Tale doesn't take away from that; it suggests they optimised the PS5 version of TLOU better than A Plague Tale was optimised. A Plague Tale running at 35fps@1440p on PS5, without really beating TLOU in graphics, isn't necessarily a sign of better PC performance.
What I should have said is: PC developers have the ABILITY to make games run just as well as on consoles, without much API overhead.
It's all on the developers in the end, and games can be (and are) unoptimized. Most PC games run as they should, but there are always a few bad releases that should perform better.
In this comparison you have:
AW2 - ~2x performance on PC
A Plague Tale - ~2x performance on PC
TLOU - only ~36% better on PC
Which game is the outlier? This is Naughty Dog's first PC game, so I guess it can be excused. It doesn't show magic optimization on the PS5, just a lack of optimization on PC.
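To make the comparison concrete, here is a quick sketch of the arithmetic behind those deltas, using the rough fps figures quoted in the thread (they are the commenter's approximations, not verified benchmarks; the 2x example uses a hypothetical fps pair). Note that with 50 and 36 the delta comes out near 39%, close to the "36% better" quoted, so the original poster likely used slightly different figures:

```python
def speedup(pc_fps: float, console_fps: float) -> float:
    """PC advantage over console, expressed as a percentage."""
    return (pc_fps / console_fps - 1) * 100

# Figures quoted in the thread: TLOU at ~50fps on PC (4K) vs ~36fps on PS5.
print(f"TLOU: {speedup(50, 36):.0f}% better on PC")      # → ~39%

# A "2x performance" game (hypothetical fps pair) comes out at 100%:
print(f"2x game: {speedup(70, 35):.0f}% better on PC")   # → 100%
```

Framed this way, the point is that TLOU's PC-over-PS5 delta is well under half of what the other two ports show.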
In the end, the PS5 is just a ~10TF RDNA2 GPU and should perform like one; there is no magic attached to it other than its decompression hardware.