Xbox One vs. PS4: The overlooked element.
This isn't going to be an article that pits one console against the other. Rather, it takes the rationale currently being used and turns it against itself. We all know by now that, on paper, the PS4 is more powerful than the Xbox One, and we also know there is a growing number of examples of the PS4 showing much better results than the Xbox One.
But the problem with the arguments and the conclusions drawn is that, if true, they don't bode well for either platform.
Recently, I wrote a post based on a comparison of the new Tomb Raider game on the Xbox One and PS4. The most surprising part was not that the Xbox One was indeed capped at 30fps, but that the PS4 wasn't able to maintain a stable 60fps. And not only was it unstable; throughout virtually all of the gameplay shown, it was never anywhere near 60fps.
This is disconcerting for one very simple reason. If people are right, and the ONLY reason for the performance gap is the quality of the hardware rather than how well it was utilized, then even the more powerful of the two consoles isn't truly able to play a revamped last-gen title at 1080p and 60fps.
That should take the wind out of the sails of proponents of BOTH consoles.
IF any of that were the whole truth, it would mean you should expect gaming NOT to get substantially better on these new platforms.
But the problem is this: the specs don't tell that story from last gen to this gen. Both current gen consoles have 16 times the RAM of their predecessors, and that memory is faster. The processors have more cores, and while the clock speed per core may be lower, those cores are far more efficient, can clock closer to the 3GHz mark, and add new instruction sets.
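To put that 16x figure in perspective, here is a quick back-of-the-envelope sketch. It uses the commonly cited memory totals (512 MB for the Xbox 360/PS3, 8 GB for the Xbox One/PS4) and deliberately ignores details like unified versus split memory pools and OS reservations:

```python
# Rough comparison of last-gen vs. current-gen system memory.
# Figures are the commonly cited totals; memory architecture
# differences (unified vs. split pools, OS reservation) are ignored.
LAST_GEN_RAM_MB = 512          # Xbox 360 / PS3 total system memory
CURRENT_GEN_RAM_MB = 8 * 1024  # Xbox One / PS4 total system memory

ratio = CURRENT_GEN_RAM_MB / LAST_GEN_RAM_MB
print(f"Current gen has {ratio:.0f}x the RAM of last gen")
# prints "Current gen has 16x the RAM of last gen"
```

Even granting that raw memory size isn't everything, a 16x jump is not the profile of a generation that should be struggling with remastered last-gen titles.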
The story is the same on the GPU front. Both current gen consoles sport massive gains in dedicated graphics memory, GPU speed, and instruction set support, which means these systems are theoretically far more powerful than their predecessors. This article states that the Xbox One, as a standalone unit, is 10x more powerful than the Xbox 360.
All you really need to read into this is the following. If both systems were fully utilized by current gen titles, there is no way either should be unable to play even a revamped version of any last-gen game at 1080p and 60fps. And they should be stable there. In many cases they should even have been able to reach a stable 120fps to match the refresh rate of the now-typical 120Hz flat screen TV.
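It's worth spelling out what those frame-rate targets actually demand of the hardware. The sketch below computes the per-frame time budget at each target; a console is only "stable" at a given rate if every single frame, including the worst ones, finishes inside that budget:

```python
# Per-frame time budget at common target frame rates.
# A game holds a target rate only if EVERY frame completes
# within its budget; one slow frame causes a visible drop.
for fps in (30, 60, 120):
    budget_ms = 1000 / fps  # milliseconds available per frame
    print(f"{fps:>3} fps -> {budget_ms:.2f} ms per frame")
# prints:
#  30 fps -> 33.33 ms per frame
#  60 fps -> 16.67 ms per frame
# 120 fps -> 8.33 ms per frame
```

Halving the budget from 33ms to 16.7ms is exactly the gap a generational hardware leap should cover with room to spare, which is why a last-gen port failing to hold 60fps points at software maturity, not silicon.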
The fact that not even ported last-gen games are reaching those levels SHOULD tell everyone critiquing these differences that the hardware isn't the true story. A failure to properly optimize these games for the hardware they run on, plus a lack of time for the platforms and SDKs to mature, has left early games, which SHOULD have been able to run better, hitting limits imposed by system elements that should never have been a factor in the first place.
But this is typical of every new console release. Go compare some of the final titles for the PS2 to launch titles on the PS3, and do the same for the original Xbox and the Xbox 360. In virtually every case, the prior generation had drastically better-looking titles on its way out than the new console had on the way in. And for the same reasons, if games had been ported across prior and next gen, you would have seen the same underachieving results.
Reference hardware and SDKs haven't been available for a full AAA game development cycle. The drivers and SDKs haven't matured to the level of their last-gen counterparts. And the studios haven't had time to master the tools.
I have no doubt that a year or two from now, both the PS4 and Xbox One will be pumping out games more visually advanced than Tomb Raider, running at native 1080p and a stable 60fps.