
Digital Foundry vs Watch Dogs on Wii U

It's a requirement.



Tearing IS really bad. There is no middle ground for tearing. But here you are acknowledging that your statement of "significant number of titles" is incorrect with the tacit admission that, in fact, there are ports that are ALSO identical or improved, so there's that. The fact of the matter is that the system is in the same bracket, so it's going to produce similar results. Not sure why you need to continue your hilarious trolling tirade when it's already known. Watch Dogs on Wii U is not a good buy, and it's clear that it wasn't a focus for Ubisoft and their port team, so I'm not sure why you're defending Ubisoft here.

This is false; I'm not sure what you're talking about here. There are only 3 games where the Wii U gets the nod for having overall better graphics/performance from DF, while 12-13 games get a clear nod for the 360 having the better version, not even counting Disney Infinity 2.0, where the Wii U version has the lowest resolution, or Ninja Gaiden 3: Razor's Edge on 360/PS3, which was reported here on NeoGAF by most people who played it on Wii U to have a significantly better frame rate than the Wii U version.
 

Gears

Member
As an owner of every console except for an XB1, along with a decent PC, I must admit I never got the Wii U with its specs in mind as a selling point. I know the reason for threads like this, but all the systems' specs and shortcomings are old news. It's essentially just a beast of a co-op machine. If I want dem good graphics, I'll make my purchase on a system that is more capable.
 

Spinluck

Member
Didn't FIFA 13 Wii U outsell Watch Dogs Wii U since November 18th? No one gives a shit about this turd of a game, and Ubisoft clearly didn't care about making a sufficient port.

Buy Smash Bros or Mario Kart and call it a day.
 
It's super easy: either your game tears or it doesn't, and if it does, it's BAD, aka no middle ground.

OK, this is ridiculous. There can be minor screen tearing and there can be heavy tearing, just like frame rate, where it can be extremely bad or minor.

quote from DF

Performance-wise, both consoles do a good job of delivering a stable, consistent frame-rate but it is the Xbox One version that delivers the smoothest experience overall. Frame-rate drops are extremely rare, only appearing in the most extreme circumstances - for all intents and purposes we're looking at a locked 30fps. Unfortunately, the higher resolution on PS4 comes with a catch in the form of noticeable performance dips during strenuous sequences. During a normal run of play, the game does a good job of maintaining the target frame-rate but frame-rate faltered sometimes during battle sequences and sometimes even traversal across the landscape. Both versions use an adaptive v-sync setup that results in torn frames when the 30fps performance target isn't met, but thankfully, torn frames are contained mostly within the top 25 per cent of the screen, with less of a serious impact on image quality than you might expect.
 
If they had heavily altered the game world, it would probably have taken more than 6 months to develop this port. Is there also a point of comparison for how much better it would run had they completely scaled back everything about the game?

Every PC open world title with a user-editable configuration and an open resource monitor can show you exactly which effects stress which parts of a system, and what frame rate improvements result from turning them down.

If Ubisoft wanted to release Watch Dogs on Wii U at a steady frame rate, they very easily could have, even with 'only' an additional 6 months of development time.
The fact that it has triple-buffered v-sync and frame rate drops (like many other Wii U multiplatform titles) suggests that there was a decision to optimise for image quality, not performance.

The further fact that so many multiplatform titles have zero tearing but frame rate drops suggests that the Wii U's CPU is 'good enough' to handle unoptimised 360 code most of the time without much in the way of profiling or optimisation passes, and that the superior GPU has enough RAM to allow for 'free' triple buffering, but that is more speculative on my part.
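For anyone unfamiliar with why triple buffering is relevant here, a rough sketch of the flip logic (plain illustrative C++, not actual Wii U/GX2 code): with three buffers the GPU always has a spare buffer to draw into, and the display only ever flips to a completed frame, so a missed deadline costs some smoothness rather than producing a torn image.

#include <array>
#include <cstdio>

int main() {
    std::array<int, 3> buffers{};  // frame number stored in each of the three buffers
    int displayed = 0;             // buffer currently being scanned out to the screen
    int rendering = 1;             // buffer the GPU is drawing into right now

    for (int frame = 1; frame <= 6; ++frame) {
        buffers[rendering] = frame;            // GPU finishes drawing a new frame
        int ready = rendering;
        rendering = 3 - displayed - ready;     // the one buffer that is neither on screen
                                               // nor just completed, so the GPU never
                                               // stalls waiting for the flip
        displayed = ready;                     // at vblank the display flips to the newest
                                               // complete frame only -> no tearing
        std::printf("vblank shows frame %d\n", buffers[displayed]);
    }
}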
 

Mithos

Member
OK, this is ridiculous. There can be minor screen tearing and there can be heavy tearing, just like frame rate, where it can be extremely bad or minor.

YOU, ninjablade, do not get to decide what other people's opinions should be. If SteveP thinks that tearing = bad no matter how small, then that is the END of it. You may disagree, but that's all; there is nothing false about his statement.
 

bobeth

Member
YOU, ninjablade, do not get to decide what other people's opinions should be. If SteveP thinks that tearing = bad no matter how small, then that is the END of it. You may disagree, but that's all; there is nothing false about his statement.
It's not an opinion when you claim there's no middle ground, sorry...
 
YOU, ninjablade, do not get to decide what other people's opinions should be. If SteveP thinks that tearing = bad no matter how small, then that is the END of it. You may disagree, but that's all; there is nothing false about his statement.

Of course tearing is bad, but the fact is there are varying degrees of how bad tearing is, just like frame rate. He was trying to tell me there is no middle ground like it was a fact.
 
It's not an opinion when you claim there's no middle ground, sorry...

There are people in this topic claiming the WiiU has the "worst" versions of multiplatform games because apparently frame rate drops matter more to them than things like exclusive content or control innovations, so I don't see why it's so unbelievable that someone would be just as bothered by tearing.
 
TBH there's tearing and there's TEARING; it's not always as prevalent or as noticeable.

Frame drops are always there; they're something you see every time they happen.
 

bobeth

Member
There are people in this topic claiming the WiiU has the "worst" versions of multiplatform games because apparently frame rate drops matter more to them than things like exclusive content or control innovations, so I don't see why it's so unbelievable that someone would be just as bothered by tearing.
Not the point...
 

TI82

Banned
Which is a 2D sidescroller, which Nintendo excels at. I think 3D is just a terrible option for Wii U, or Nintendo consoles in general. In hindsight, 3D games were never good on Nintendo consoles anyway.

I think Rayman Legends is better than any 2d platformer Nintendo has made in a very long time. :/
 

SpiderCrab

Neo Member
Every PC open world title with a user-editable configuration and an open resource monitor can show you exactly which effects stress which parts of a system, and what frame rate improvements result from turning them down. If Ubisoft wanted to release Watch Dogs on Wii U at a steady frame rate, they very easily could have, even with 'only' an additional 6 months of development time.

The idea that PC graphic tweak options make it easy to port CPU bound games across platforms is laughable. Typical current gen games use job systems across their 6 cores, often combined with a heavy game and render thread. Good luck porting that down to the 3 WiiU cores. And that doesn't even begin to approach the problems of porting vector4 style math libraries down to paired singles.
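For what it's worth, here's a minimal sketch of the kind of job system being described (hypothetical C++, not any engine's real scheduler): per-frame work gets split into many small jobs pulled by a worker pool sized to the hardware, so the same CPU-bound workload that saturates Xenon's 6 hardware threads only gets 3 on the Wii U's CPU and simply takes longer per frame.

#include <algorithm>
#include <atomic>
#include <cstdio>
#include <thread>
#include <vector>

int main() {
    // Worker count follows the hardware: 6 hardware threads on Xenon, 3 on the Wii U CPU.
    const unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    const int job_count = 1024;        // e.g. AI ticks, physics islands, culling batches
    std::atomic<int> next_job{0};

    std::vector<std::thread> pool;
    for (unsigned i = 0; i < workers; ++i) {
        pool.emplace_back([&] {
            // Each worker pulls jobs off the shared counter until the frame's work runs out.
            while (next_job.fetch_add(1) < job_count) {
                // ... one slice of per-frame CPU work would go here ...
            }
        });
    }
    for (auto& t : pool) t.join();
    std::printf("ran %d jobs on %u workers\n", job_count, workers);
}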

The further fact that so many multiplatform titles have zero tearing but frame rate drops suggests that the Wii U's CPU is 'good enough' to handle unoptimised 360 code most of the time without much in the way of profiling or optimisation passes, and that the superior GPU has enough RAM to allow for 'free' triple buffering, but that is more speculative on my part.

The WiiU GX2 API doesn't provide the option to disable vsync. It is as simple as that.
 
The idea that PC graphic tweak options make it easy to port CPU bound games across platforms is laughable.

I'm not saying you can wholesale equate a PC's configuration settings; I'm saying that if you need an example of which variables can affect performance, a configurable PC title offers real-world examples of that.

Or are you suggesting that Ubisoft don't have the access or capabilities to change things that traditionally cause cpu bottlenecks in open world games?

Typical current gen games use job systems across their 6 cores, often combined with a heavy game and render thread. Good luck porting that down to the 3 WiiU cores.

Presumably that work has already been done to run on a three-core Xenon.
 

prag16

Banned
The WiiU GX2 API doesn't provide the option to disable vsync. It is as simple as that.

Citation needed. And then how do you explain Darksiders 2?


It still boggles my mind how tolerant so many of you seem to be of tearing. Frame drops suck, but tearing looks absolutely awful. But as was said it's a matter of opinion what repulses us more. If tearing repulses StevieP more than it does DF, then so be it.

I disagree vehemently with DF on this too, as I've said before. The PS3 version of Blacklist was an unplayable pile of hot garbage due to screen tearing. I had NO issue with the Wii U version, which they claimed was the inferior version. I have no horse in this race, and would have loved to have played through the PS3 version for free (borrowed from a friend). But no, I ended up PAYING for the Wii U version.

I guess both sides need to stop saying what is objectively better/worse. Opinion is a huge factor. And just because one side is backed by DF doesn't make their opinions better. I still say tearing is worse. (Even "minor" tearing; the 25% top-of-screen tearing in AC:U on PS4 stuck out to me like a sore thumb in their videos even before I noticed how the tearing is denoted on their graph.)
 
This thread is precisely why putting technical jargon ahead of everything else is stupid and makes everyone doing so look like a snobbish nerd in front of everyone.
 

soy.

Banned
Didn't FIFA 13 Wii U outsell Watch Dogs Wii U since November 18th? No one gives a shit about this turd of a game, and Ubisoft clearly didn't care about making a sufficient port.

Buy Smash Bros or Mario Kart and call it a day.
i agree that nobody should give a shit about this piece of turd
i mean, even ubisoft themselves are shitting on what is supposed to be this child of theirs...

yet gaf are discussing it heavily for over 13 pages :D
it's like we're licking ubi's huge stinkin dung :D

This thread is precisely why putting technical jargon ahead of everything else is stupid and makes everyone doing so look like a snobbish nerd in front of everyone.
this is a rather entertaining thread, mate!
it's not every day you see people blatantly admitting that they're snobbish nerds :D
 

Amir0x

Banned
I'm not saying that it's not a bad port. I am however saying that the developer itself isn't the main issue, and like others have said, customizing graphics settings for a port isn't as easy as flipping a couple switches, setting code to wiiU and then calling it a day. Graphics aren't infinitely scalable.

Yes, it's not about flipping a couple switches.

It's about doing their damn job and doing the fucking programming necessary to tone the graphics settings down to get it to function properly. It's called putting in the damn effort or not trying at all. You don't make garbage and release on the system for full price and then say "not our fault herpity derp."

There is literally no one at fault for releasing Ubisoft's shit product other than Ubisoft. Anyone who releases a product to market is fully responsible for the state it is in. If you cannot get it to function properly, then you put in the extra effort to do so, or you cancel it and state why (can't get it to run decently on the system), or any of a million other possible scenarios in which a capable business that isn't as abhorrent as Ubisoft might actually try one day.

But of course, it is Saint Ubisoft, so it must be someone else's fault for the port being a disaster. I mean, surely someone else had their hands on programming the game for the Wii U. Clearly, they had no clue about the Wii U's limitations for like 3 years now (pre-release of the system even) and could not have made a gameplan to ensure games like this are appropriately tailored for the platform. Nopers, release it in a garbage state and blame someone else.

You're remarkable Crossing Eden, I really mean that.
 

jimi_dini

Member
This is false; I'm not sure what you're talking about here. There are only 3 games where the Wii U gets the nod for having overall better graphics/performance from DF, while 12-13 games get a clear nod for the 360 having the better version

Because DF doesn't seem to care about tearing.

DF: "yeah, Arkham Origins tears a lot on PS3/360, Wii U doesn't. Verdict: there is not a single reason to buy the Wii U version"

I simply can't take them seriously anymore.

Seriously? What were they thinking when they developed this system?

That there are lots of people who get super annoyed when they see tearing. Fuck tearing. Seriously.
 

Astral Dog

Member
come on now. they went for the bare minimum power-wise, just to make the console smaller: a GPU with only 176 gflops, and a weak CPU to boot. it wasn't just a choice, they just don't care about specs. it was all about low power draw, small size, and the GamePad; no effort was put into making decent specs.

Now, this is old, but whatever.

Wii U is what it is, but it's not fair to say they did not care about specs.
They care, but in a different way than Sony did.
Wii U is a system with very well-defined limitations; for all intents and purposes, it's a seventh-generation system, but more efficient and modern (in that it uses GPGPU features and a more recent shader model) and with double the memory.

It seems like a system that is not cheap to manufacture: it uses 32MB of eDRAM, a customized CPU and GPU, and the GamePad controller, and as I understand it, things like eDRAM don't come cheap. Nintendo did not just put a 360 in another box, for example (though maybe they should have).
Why? Well, as Nintendo is not as big a company as Sony or Microsoft, or even some third-party companies, they can't compete in resources, even with the money from Wii and DS, so they designed a system that was an extension of what they were used to rather than one that competes directly with the other next-gen systems. They tried to give it other value; it just didn't work out.
But even without the GamePad, I don't think the graphics would have changed much.

No matter what anyone says, the jump to HD was a very difficult one for any company. Imagine developers going from making Wii and DS games to PS4- or One-style graphics; even the jump from Wii to Wii U must have been hell for them.
I guess Nintendo at the very least had some good reasons to make the hardware underpowered and take that risk with the GamePad; maybe it could even have worked, with a slightly different strategy, of course.
Wii U is like 'My First HD System' for developers, and they have been releasing very good games, so there is that, but it was a bad choice in the end; they simply came far too late to the party.
 

adamsapple

Or is it just one of Phil's balls in my throat?
Because DF doesn't seem to care about tearing.

DF: "yeah, Arkham Origins tears a lot on PS3/360, Wii U doesn't. Verdict: there is not a single reason to buy the Wii U version"

I simply can't take them seriously anymore.

That is a very sensational way of posting. If you actually read the Arkham Origins article, the following is mentioned in plain English:

- WiiU has the worst texture streaming, low rez textures stay on screen the longest.
- WiiU version has the worst texture filtering out of the consoles.
- Also the lowest shadow resolution
- The performance on WiiU is so bad that the game PAUSES for a few seconds in detailed environments, like Skyrim PS3 pre-patch.
- On top of all that, the multiplayer mode is absent from the Wii U version entirely.

Sure it has VSync engaged, but the negatives faaaaaar outweigh the one single positive, CLEARLY DF is teh bias !

But you go ahead and pick and champion the one factor that suits your personal needs better :)
 

prag16

Banned
That is a very sensational way of posting. If you actually read the Arkham Origins article, the following is mentioned in plain English:

- WiiU has the worst texture streaming, low rez textures stay on screen the longest.
- WiiU version has the worst texture filtering out of the consoles.
- Also the lowest shadow resolution
- The performance on WiiU is so bad that the game PAUSES for a few seconds in detailed environments, like Skyrim PS3 pre-patch.
- On top of all that, the multiplayer mode is absent from the Wii U version entirely.

Sure it has VSync engaged, but the negatives faaaaaar outweigh the one single positive, CLEARLY DF is teh bias !

But you go ahead and pick and champion the one factor that suits your personal needs better :)
Nevertheless, they still don't seem nearly as bothered by tearing as many of us here. Granted, the Arkham Origins example he chose wasn’t the best one.
 

Seik

Banned
All this talk about Vsync is getting me depressed.

Screen tearing should be made illegal, like, as a criminal act, and its absence should be a baseline of quality for all games. :p

I wish v-sync weren't such a hit on frame rate, though. I'm glad Nintendo seems to make it obligatory in their games.

Note that I'm not defending the Wii U ports or anything like that, just saying. To stay on topic: the Batman games, while they have v-sync, suffer from being freaking slideshows sometimes and I just can't. (Well, I've only played Arkham City so far, but I guess Origins ain't much better, right?)
 

dark10x

Digital Foundry pixel pusher
Nevertheless, they still don't seem nearly as bothered by tearing as many of us here. Granted, the Arkham Origins example he chose wasn’t the best one.
I am bothered a lot by tearing myself *BUT* most UE3 games use a dynamic v-sync solution. That is, screen tearing only appears when the frame-rate dips below 30 fps. On 360 UE3 games generally offer a fairly steady 30 fps with occasional dips.

What I'm saying is that all tearing is not created equal. Watch Dogs on PS3 or 360 is an example of bad tearing where the issue is visible at almost all times. That's just not acceptable but in a lot of other games the problem is far less severe.

Tearing is quite rare on the new consoles, however, with the few exceptions typically taking a different approach where tearing can only appear within the top 25% of the screen, and not to the same egregious degree that we'd find unacceptable. Most games are triple buffered, though.
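To illustrate what that adaptive v-sync behaviour amounts to, a rough sketch (hypothetical C++, not any console's actual swap-chain API): the flip waits for vblank while the frame makes its 33.3 ms budget, and a late frame flips immediately, trading a torn frame for avoiding a bigger stutter.

#include <cstdio>
#include <initializer_list>

constexpr double kFrameBudgetMs = 1000.0 / 30.0;   // 33.3 ms target for 30 fps

void wait_for_vblank() { /* stub: block until the next vertical blank */ }
void flip()            { /* stub: swap the front and back buffers */ }

void present(double frame_time_ms) {
    if (frame_time_ms <= kFrameBudgetMs) {
        wait_for_vblank();   // frame made the budget: clean, tear-free flip
        flip();
    } else {
        flip();              // frame ran long: flip immediately and accept a torn
                             // frame instead of stalling and dropping to 20 fps
    }
}

int main() {
    for (double t : {25.0, 31.0, 40.0}) {           // three sample frame times in ms
        present(t);
        std::printf("frame took %.1f ms -> %s\n", t,
                    t <= kFrameBudgetMs ? "v-synced flip" : "torn frame");
    }
}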

The PS3 version of Blacklist was an unplayable pile of hot garbage due to screen tearing. I had NO issue with the Wii U version, which they claimed was the inferior version.
I still take issue with this. I wouldn't consider any of the console versions playable, to be honest. The tearing ruins the game on 360 and PS3 *BUT* the WiiU average frame-rate is low enough that it makes the game difficult to play. The PC version is the only truly solid version of the game in that it looks and runs dramatically smoother.

I mean, I could barely stomach the performance problems in The Evil Within yet, in reality, its frame-rate was still much higher than Blacklist on WiiU which people here seem to think was a great port. :\ I'm just surprised at the violent response against tearing when those same people turn around and don't seem to mind ~20 fps.
 

jimi_dini

Member
Sure it has VSync engaged, but the negatives faaaaaar outweigh the one single positive, CLEARLY DF is teh bias !

But they didn't write that. They literally wrote "not a single reason", which is factually incorrect.

For Arkham Origins there are actually 2 reasons, which I consider extremely important.

1. no tearing at all
2. fewer glitches, fewer freezes

But hey, that's just me. I know, it's hard to understand. I got used to broken software quality and freezes and tearing and shit on PS3 as well. But I'm currently getting used again to proper software quality.

It's simply about choice. When you call yourself objective, then you should simply list all the negatives and all the positives, even when you personally don't consider the positives to be worth it.
 

adamsapple

Or is it just one of Phil's balls in my throat?
But they didn't write that. They literally wrote "not a single reason", which is factually incorrect.

For Arkham Origins there are actually 2 reasons, which I consider extremely important.

1. no tearing at all
2. fewer glitches, fewer freezes

But hey, that's just me.

What is your source on Reason 2? Please don't say "personal experience", because I could make a whole book out of personal experiences.

The article acknowledges micro-freezes of a few seconds on the WiiU whenever the engine is under load, which directly negates "less freezing", etc.

Regarding tearing, sure, I can imagine that is a plus; I personally know people who get nauseous playing games with a lot of tearing, though I'm not one of them. However, in this particular case, the hit on frame rate seems big enough that the lack of tearing doesn't make the game look like anything special.

Regarding SC Blacklist above, yes, I've recently played through the PS3 version, and even that version isn't a smooth, locked 30 FPS performer; however, the DF article says it has the best frame rate of the console versions. I can't imagine how bad the judder and how inconsistent the controller response must be on the other versions, especially WiiU.

I personally have little to no issue with tearing if it means a better frame rate. A playable frame rate should always be far more important.
 
I'm just surprised at the violent response against tearing when those same people turn around and don't seem to mind ~20 fps.

It is amusing but not surprising. I hate tearing as much as the next guy, but come on... We are reaching unplayable frame rates here.
 

jimi_dini

Member
What is your source on Reason 2? Please don't say "personal experience", because I could make a whole book out of personal experiences.

for example:
http://www.polygon.com/2013/10/25/5026574/batman-arkham-origins-review-knightfall

In the five or so hours I've put into the game since that patch, I've had around half a dozen crashes, including some cases where the game would not even load to the main menu following a crash. In addition to my own problems with Arkham Origins, a cursory look at WB's support forums makes it clear that I'm not alone in these issues. Some players have avoided major problems, but others have run into constant crashes, corrupted save data, and a major bug that prevents use of one of the game's fast travel points. Most distressingly, these issues appear to exist across all platforms, save one. We have not been able to find word of any major problems with Arkham Origins on the Wii U.

I also know quite a few people who played multiple versions and they say exactly that as well. Simply use google.

Maybe it's because Wii U has more memory. I know quite a few PS3 games that really love freezing after playing for a few hours at once. Possibly memory leaks. Maybe because it was ported by a different developer?

Oh and btw. there is even a thread on GAF: http://www.neogaf.com/forum/showthread.php?t=704251, plenty of GAFfers say that as well.

The article acknowledges micro-freezes of a few seconds on the WiiU whenever the engine is under load, which directly negates "less freezing", etc.

freezing as in the console freezing, which means that stupid beep beep beep on PS3.

Regarding tearing, sure, I can imagine that is a plus; I personally know people who get nauseous playing games with a lot of tearing, though I'm not one of them.

That's great and all, but that's your opinion. My opinion is that tearing is unacceptable.

Regarding SC Blacklist above, yes, I've recently played through the PS3 version, and even that version isn't a smooth, locked 30 FPS performer; however, the DF article says it has the best frame rate of the console versions.

I guess because of all the tearing. I played Wii U + PS3 myself. Enjoyed Wii U way more.
 
They must have been N64 fans...

It's probably them being biased and grasping at straws. I mean, come on: over half of the ports where the Wii U loses the DF verdict have a huge frame rate gap compared to 360 versions with minimal tearing, yet they have no problem playing a slideshow because minor tearing ruins the game.
 
It's probably them being biased and grasping at straws. I mean, come on: over half of the ports where the Wii U loses the DF verdict have a huge frame rate gap compared to 360 versions with minimal tearing, yet they have no problem playing a slideshow because minor tearing ruins the game.

Good thing is that they can buy a PC, enable V-Sync and not buy a new GPU for 15 years.
 

Hermii

Member
Now, this is old, but whatever.

Wii U is what it is, but it's not fair to say they did not care about specs.
They care, but in a different way than Sony did.
Wii U is a system with very well-defined limitations; for all intents and purposes, it's a seventh-generation system, but more efficient and modern (in that it uses GPGPU features and a more recent shader model) and with double the memory.

It seems like a system that is not cheap to manufacture: it uses 32MB of eDRAM, a customized CPU and GPU, and the GamePad controller, and as I understand it, things like eDRAM don't come cheap. Nintendo did not just put a 360 in another box, for example (though maybe they should have).
Why? Well, as Nintendo is not as big a company as Sony or Microsoft, or even some third-party companies, they can't compete in resources, even with the money from Wii and DS, so they designed a system that was an extension of what they were used to rather than one that competes directly with the other next-gen systems. They tried to give it other value; it just didn't work out.
But even without the GamePad, I don't think the graphics would have changed much.

No matter what anyone says, the jump to HD was a very difficult one for any company. Imagine developers going from making Wii and DS games to PS4- or One-style graphics; even the jump from Wii to Wii U must have been hell for them.
I guess Nintendo at the very least had some good reasons to make the hardware underpowered and take that risk with the GamePad; maybe it could even have worked, with a slightly different strategy, of course.
Wii U is like 'My First HD System' for developers, and they have been releasing very good games, so there is that, but it was a bad choice in the end; they simply came far too late to the party.

This. When a game is tailored around the limitations of the console and utilizes its strong points, it's incredible what it does with specs that are so weak on paper. The specs are well thought out, so real-world results get very close to those theoretical peak performance figures which get paid so much attention on other systems.

This topic has been discussed to death already, but with hindsight, in my opinion they should have continued to evolve the Wiimote and sold it at a cheaper price point rather than going for the GamePad. The Wii U should have been a Wii in HD with more accurate motion controls, imo.
 

KageMaru

Member
I'm not saying you can wholesale equate a PC's configuration settings; I'm saying that if you need an example of which variables can affect performance, a configurable PC title offers real-world examples of that.

Or are you suggesting that Ubisoft don't have the access or capabilities to change things that traditionally cause cpu bottlenecks in open world games?

These PC config settings would have to be able to adjust the number of AI characters, physics, geometry, and other parts of the game that hit the CPU, if that's the bottleneck on the Wii-U. I'm not sure what graphical options open world games offer these days, but I don't think your assumption applies at all.

Presumably that work has already been done to run on a three-core Xenon.

You presume incorrectly. Game code is written to the 3-core/6-thread Xenon chip, so moving that over to the 3-core/3-thread, lower-clocked Wii-U CPU must be a daunting task.
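To put the point about CPU-side knobs in concrete terms, a purely illustrative sketch (hypothetical C++ with made-up numbers, not Ubisoft's actual configuration) of the kind of settings a port like this would have to scale down; note that these change the simulation itself, not just the rendering, which is why they aren't free to turn down.

#include <cstdio>

struct SimulationSettings {
    int   max_pedestrians;   // AI agents ticked per frame
    int   max_vehicles;      // traffic simulation load
    int   physics_substeps;  // physics/collision iterations per frame
    float streaming_radius;  // how far out the world is kept "live"
};

// Hypothetical scaling step for a CPU with fewer/slower hardware threads.
SimulationSettings scale_for_weaker_cpu(SimulationSettings s, float cpu_ratio) {
    s.max_pedestrians  = static_cast<int>(s.max_pedestrians * cpu_ratio);
    s.max_vehicles     = static_cast<int>(s.max_vehicles * cpu_ratio);
    s.physics_substeps = s.physics_substeps > 1 ? s.physics_substeps - 1 : 1;
    s.streaming_radius = s.streaming_radius * cpu_ratio;
    return s;
}

int main() {
    SimulationSettings baseline{60, 40, 4, 300.0f};              // made-up baseline numbers
    SimulationSettings reduced = scale_for_weaker_cpu(baseline, 0.5f);
    std::printf("pedestrians %d -> %d, vehicles %d -> %d\n",
                baseline.max_pedestrians, reduced.max_pedestrians,
                baseline.max_vehicles, reduced.max_vehicles);
}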

Yes, it's not about flipping a couple switches.

It's about doing their damn job and doing the fucking programming necessary to tone the graphics settings down to get it to function properly. It's called putting in the damn effort or not trying at all. You don't make garbage and release on the system for full price and then say "not our fault herpity derp."

There is literally no one at fault for releasing Ubisoft's shit product other than Ubisoft. Anyone who releases a product to market is fully responsible for the state it is in. If you cannot get it to function properly, then you put in the extra effort to do so, or you cancel it and state why (can't get it to run decently on the system), or any of a million other possible scenarios in which a capable business that isn't as abhorrent as Ubisoft might actually try one day.

But of course, it is Saint Ubisoft, so it must be someone else's fault for the port being a disaster. I mean, surely someone else had their hands on programming the game for the Wii U. Clearly, they had no clue about the Wii U's limitations for like 3 years now (pre-release of the system even) and could not have made a gameplan to ensure games like this are appropriately tailored for the platform. Nopers, release it in a garbage state and blame someone else.

You're remarkable Crossing Eden, I really mean that.

When it comes to graphics settings, I feel like it's a lose-lose situation for the devs. Keep the graphics at a level comparable to last gen and performance suffers; dial back the graphics/AI/physics/etc. to improve performance and your game doesn't even look as good as the last-gen versions. Either way the devs would be labeled as "lazy".

I do agree that as a publisher Ubisoft should have done more or just pulled the game entirely.

Because DF doesn't seem to care about tearing.

DF: "yeah, Arkham Origins tears a lot on PS3/360, Wii U doesn't. Verdict: there is not a single reason to buy the Wii U version"

I simply can't take them seriously anymore.

You are leaving out a lot of context with that generalization. Reading every article, it's not hard to see why so few articles have claimed the Wii-U as the winner.
 

prag16

Banned
You are leaving out a lot of context with that generalization. Reading every article, it's not hard to see why so few articles have claimed the Wii-U as the winner.

He's right about the first general statement he made, though. DF absolutely doesn't care about tearing nearly as much as many of us here, and doesn't seem to care about tearing as much as about many other graphical shortcomings. For example, they said Wii U SC Blacklist was worse than the PS3 version, but the PS3 version made my eyes bleed (okay, an exaggeration, but it was a poor experience imo due to all the tearing), while the Wii U version was pretty much fine for me despite some drops worse than the PS3 version's that I hardly noticed; the PS3 version was so busy tearing that I couldn't appreciate the possibility that the frame rate was a little better at times.

This is one of the reasons why I only read DF for the objective analysis aspect, and totally disregard most of the subjective analysis.
 

danmaku

Member
Please let me disable V-Sync for Bayo1, so the game actually plays as well as the 360 version.

This ^

I used to think tearing was the worst, but Bayo1 slows down to a crawl in certain sections due to vsync. It's as if witch time was activated for everyone, including me. It fucks up my inputs and it's a pain in the ass. If the alternative is tearing, please give me tearing.
 
This ^

I used to think tearing was the worst, but Bayo1 slows down to a crawl in certain sections due to vsync. It's as if witch time was activated for everyone, including me. It fucks up my inputs and it's a pain in the ass. If the alternative is tearing, please give me tearing.

Finally someone else who noticed this. DF certainly didn't mention it in their tech analysis. I really wish they gave the option in the game at least. Maybe I should tweet Kamiya!
 

adamsapple

Or is it just one of Phil's balls in my throat?
Finally someone else who noticed this. DF certainly didn't mention it in their tech analysis. I really wish they gave the option in the game at least. Maybe I should tweet Kamiya!

He would probably block you before replying to that tweet :p
he's in a particularly block friendly mood the past few days.
 

KageMaru

Member
He's right about the first general statement he made, though. DF absolutely doesn't care about tearing nearly as much as many of us here, and doesn't seem to care about tearing as much as about many other graphical shortcomings. For example, they said Wii U SC Blacklist was worse than the PS3 version, but the PS3 version made my eyes bleed (okay, an exaggeration, but it was a poor experience imo due to all the tearing), while the Wii U version was pretty much fine for me despite some drops worse than the PS3 version's that I hardly noticed; the PS3 version was so busy tearing that I couldn't appreciate the possibility that the frame rate was a little better at times.

This is one of the reasons why I only read DF for the objective analysis aspect, and totally disregard most of the subjective analysis.

I think you should read that face-off again. Yes, the PS3 version has tearing, but the frame rate on the Wii-U seems to drop to the low 20s/high teens any time there's action with 2 or more enemies. The PS3 frame rate is more than just a little bit better here. You might be fine with that poor performance, but I'd bet most people would find the tearing to be less destructive to the experience. On top of that, you're ignoring the downgrade in assets and disgustingly long load times.

If it's honestly hard for you to see why they think the PS3 version is better, I'm not sure what to tell you.
 

dark10x

Digital Foundry pixel pusher
Finally someone else who noticed this. DF certainly didn't mention it in their tech analysis. I really wish they gave the option in the game at least. Maybe I should tweet Kamiya!
I didn't mention it as I didn't notice it.

I wish I still had a CRT handy to really test properly. When I was playing the two back to back, I had them running into the video capture setup, which adds a lot of lag already.

you're ignoring the downgrade in assets and disgustingly long load times.
Yeah, the loading times were insanely long and the game lacked high-res textures on WiiU.
 

prag16

Banned
I am bothered a lot by tearing myself *BUT* most UE3 games use a dynamic v-sync solution. That is, screen tearing only appears when the frame-rate dips below 30 fps. On 360 UE3 games generally offer a fairly steady 30 fps with occasional dips.

What I'm saying is that all tearing is not created equal. Watch Dogs on PS3 or 360 is an example of bad tearing where the issue is visible at almost all times. That's just not acceptable but in a lot of other games the problem is far less severe.

Tearing is quite rare on the new consoles, however, with the few exceptions typically taking a different approach where tearing can only appear within the top 25% of the screen, and not to the same egregious degree that we'd find unacceptable. Most games are triple buffered, though.


I still take issue with this. I wouldn't consider any of the console versions playable, to be honest. The tearing ruins the game on 360 and PS3 *BUT* the WiiU average frame-rate is low enough that it makes the game difficult to play. The PC version is the only truly solid version of the game in that it looks and runs dramatically smoother.

I mean, I could barely stomach the performance problems in The Evil Within yet, in reality, its frame-rate was still much higher than Blacklist on WiiU which people here seem to think was a great port. :\ I'm just surprised at the violent response against tearing when those same people turn around and don't seem to mind ~20 fps.

I'm surprised at the violent response to 20fps when those same people don't seem to mind horrendous full-screen tearing by comparison.

In Blacklist on Wii U, there generally only seemed to be obtrusive issues during larger firefights with a lot going on. Such scenarios were pretty few and far between while going full stealth through the entire game. Maybe that's why it barely bothered me. But in the PS3 version, almost any time I rotated the camera while anything beyond the simplest of things was onscreen, I was treated to many delightful torn-to-shreds frames.

To me the choice was clear. Though I will agree with you on the other big complaint: the load times were horrendous. The saving grace was that deaths/reloads were nearly instant.

I think you should read that face-off again. Yes, the PS3 version has tearing, but the frame rate on the Wii-U seems to drop to the low 20s/high teens any time there's action with 2 or more enemies. The PS3 frame rate is more than just a little bit better here. You might be fine with that poor performance, but I'd bet most people would find the tearing to be less destructive to the experience. On top of that, you're ignoring the downgrade in assets and disgustingly long load times.

If it's honestly hard for you to see why they think the PS3 version is better, I'm not sure what to tell you.

I just pulled up that face-off while I was replying to dark10x. You're exaggerating what they actually said. I played this game in full-blown stealth mode, and this isn't a twitch shooter. Some drops to 20 fps are far less jarring in that context than full-screen tearing.

Differences of opinion are possible, you know. That's why, as I said before, I value the objective analysis from DF but tend to ignore many of the subjective conclusions drawn, and other such editorial commentary.
 