Xbox Series X/S vs. PlayStation 5 - A direct comparison and the Ars launch-month verdict
- Kyle Orland and Sam Machkovech
- Nov 22 2020
- Word count: 1068 words
We've definitely hit a point where game consoles, in trying to keep up with PCs, are going to have a harder and harder time matching that performance in a small form factor.
Also, I absolutely relish this quote, considering how long my console buddies kept insisting "no, 30 FPS is fine".
That was never anything but Stockholm Syndrome.
That's not true at all. As long as the framerate is around 30 and steady, I honestly cannot tell while playing games.
My PS4 Pro delivered pretty enough graphics already. Watching from the couch, most of the time I can't even tell the difference between 1080p and 4k.
The one exception is framerate: with 60 FPS, I definitely can see the difference. The smoothness is extremely pleasant to my eyes.
I'm not even a competitive gamer, so the gains for me are mostly aesthetic.
Have you had a chance to play on a monitor/TV that supports ≥120hz, with a GPU capable of keeping the framerate consistently above that? Because, at least in my experience, the difference between my old 60hz monitors and my new 100/120/144hz ones was pretty apparent, especially when playing shooters, or whenever I rapidly shifted my POV around even in slower-paced games. I haven't had a chance to experience 240hz myself yet, but I imagine it's only at that point that it becomes genuinely hard to notice any significant difference with the naked eye, and the diminishing returns will likely kick in there too.
p.s. You might also be interested in this (which seems to support my feelings/assumptions):
Does High FPS make you a better gamer? Ft. Shroud - FINAL ANSWER
Agreed that 120FPS is beautiful, though I'll also acknowledge that the smoothness starts hitting diminishing returns harder after about 85 fps.
However, FPS also determines the latency added to the output/input stack (screen -> eye -> button press -> game engine -> screen).
At 30 fps, that's 33.3 ms between frames, which is a pretty high latency to add to the stack (time from making your input to seeing frame rendered).
60fps: 16.6 ms
120fps: 8.3 ms
After 120fps, reducing latency will mostly have to come from the rest of the stack, specifically display lag and input device lag. Input latency has gotten a lot worse since the switch away from CRTs and wired, interrupt-driven controllers to LCDs and USB/wireless. A Sega Genesis hooked to a CRT TV has about 33ms of total input latency (from button press to screen).
Nice video on that last bit: https://www.youtube.com/watch?v=_qIj_Ooq85Q
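The frame-time arithmetic above is easy to sanity-check; a tiny sketch (pure arithmetic, nothing console-specific, the function name is mine):

```python
# Frame interval in milliseconds: the minimum latency a given framerate
# adds between an input and the next rendered frame.
def frame_interval_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (30, 60, 120, 240):
    print(f"{fps:>3} fps -> {frame_interval_ms(fps):5.1f} ms between frames")
```

Note this is only the floor: vsync, render pipelines, and display processing can each add one or more additional frame intervals on top.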
There obviously seems to be a soft limit on noticeable increases. But it's always seemed painfully obvious that people who claim they can't tell the difference were just seeing it on mismatched hardware. Of course you won't be able to notice 120fps on a 59hz monitor. I'm (sort of consistently ;-;) able to run Destiny 2 at around 90 fps on my 144hz monitor, and whenever it jumps up there it's so clear that I have to stop and behold it lol
If you actually can't tell the difference between 60 and 120 you should try VR. It's much more FPS sensitive.
That kinda goes both ways. I think the 4K/60fps obsession is a way louder force on the internet than the people insisting that 30fps/1080p is fine. I actually think it kinda brought down Halo: Infinite. They wanted that 4K/60fps label, since it now seems to be the only thing that matters in graphics marketing, and then people saw the actual footage and went, "uhm, that doesn't look so good". Well, of course it doesn't! They made a box that's twice as powerful as the One X (this might actually be one of the smallest generational processing-power upgrades in console history) and threw 4 times the pixels at the screen at 60fps, so the graphics/lighting had to get way worse.
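The "4 times the pixels" figure is straightforward arithmetic; a quick sketch (the resolution figures are standard, everything else is illustrative):

```python
# Pixels per frame at common resolutions, and relative throughput.
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "4K": (3840, 2160),
}

def pixels(name: str) -> int:
    width, height = RESOLUTIONS[name]
    return width * height

pixel_ratio = pixels("4K") / pixels("1080p")  # 4.0: 4K is 4x the pixels of 1080p
# Going from 1080p/30fps to 4K/60fps multiplies pixel throughput by 8.
throughput_ratio = (pixels("4K") * 60) / (pixels("1080p") * 30)  # 8.0
```

So a box with roughly 2x the GPU power pushing 8x the pixel throughput has to claw back the difference somewhere, usually in shading and lighting quality.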
I have a gaming PC that generally can do 60 fps, but certain genres aside (competitive shooters, racing games, etc) 30fps is... fine. For my entire playthrough of Breath of the Wild, never did I stop and think "huh, this would be much better at 60fps/4K". 60fps is a sport. If graphics don't matter or extremely fast movement is important, go 60fps, sure. But for everything else, it's a case of "this feels 10% smoother", which, at double the hardware requirements, is not worth it for me. I'd much rather see that processing power invested in things like better lighting, raytracing, or generally improved LOD/view distance. That's something I actually notice. And it seems a lot of recent AAA releases give you the option for exactly that reason.
I think more than anything 4K isn't really worth it at all if you have to sacrifice fidelity, especially if you're more than 8 ft from a TV. Increasing pixel density at this point only helps if you've already maxed graphics settings.
High quality, 60fps 1080p is far preferable to low quality 4k. I've got a GPU that easily hits that, and 60fps 4k on very low settings sometimes.
Warning: broad generalizations follow:
The problem is that consoles have to chase marketing claims they can't reasonably live up to. Upper-midrange PCs can do 4k60fps easily now, and PC gamers benefit from being able to make that choice and upgrade as needed. But console users see that happening and start demanding it (because it's now obvious they don't have 'the best'), so Sony/Microsoft chase that market, either advertising 4k that isn't really 4k or making huge sacrifices to get there (because their hardware can't keep up).
Looks like the new consoles can hit that. But when today's highest-end PC capability (8k60fps or higher) trickles down to the upper-mid range, the flamewars will rise anew and you'll start seeing that same kind of '8k but with massive sacrifices' again.
I'd actually go a little further with the marketing problem: You can't really tell the difference, nowadays. 5 year old games look practically the same as games that come out today. That certainly wasn't the case up until the late 00s. Every new console generation easily had a 5x performance increase, sometimes way more. That's just what you got by waiting 5 years. And the difference was huge. 2D->3D. Nearest neighbor texture sampling -> bilinear filtering. Static lighting -> real-time shadows. Things that you could actually categorize as "not there before, now it's there!".
Nowadays, it's "it was there before, but now it looks slightly better at certain angles". I'm actually quite excited about raytracing but I have to admit, the difference to screen-space reflections is only visible in niche cases. And that's the one big example I can come up with.
What's so convenient about 4K/60fps is that it's a measurable number. You can't argue with it being better than 1080p/30fps, it just is. It's math, goddammit! You don't have to put a big, red arrow pointing at that bush in the background being reflected in a puddle, it's just a better number you can't argue with. But it takes up massive amounts of hardware power and it is, IMO, just as subtle in most games. I'd much rather see that hardware working on better shaders and animations used for clever art direction. But that's harder to sell. It's not a number you can just stick into your game ad and get people excited. And that's why the 4K/60fps narrative is getting on my nerves a bit. It will probably take another generation before we can actually get 4K/60fps as a stable default, and that's because it's simply not that important. Modern AAA games sell because of production value, not tech. Asking for higher resolution or framerates often means asking for fewer assets drawn and less detailed effects. I don't want that, at least not for games where half the appeal is cinematic immersion.
I think that depends on your device. I have a large 4k TV, and 1080p vs 4k is extremely noticeable. I don't want to say you can see the pixels, but you can kinda see the virtual pixels with 1080p. 1080p is just not a lot of information when spread across the very large, very affordable TVs that exist today.
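The viewing-distance angle can be put in back-of-envelope terms. Normal visual acuity is often quoted at around 60 pixels per degree; a sketch for a hypothetical 65" 16:9 TV at 8 feet (all figures illustrative, the function is mine):

```python
import math

# Pixels per degree of visual field for a given screen and viewing distance.
def pixels_per_degree(h_pixels: int, screen_width_in: float, distance_in: float) -> float:
    fov_deg = math.degrees(2 * math.atan(screen_width_in / (2 * distance_in)))
    return h_pixels / fov_deg

width_in = 65 * 16 / math.hypot(16, 9)  # a 65" 16:9 diagonal is ~56.7" wide
distance_in = 8 * 12                    # 8 feet

ppd_1080p = pixels_per_degree(1920, width_in, distance_in)  # ~58 ppd
ppd_4k = pixels_per_degree(3840, width_in, distance_in)     # exactly double
```

Under these assumptions, 1080p sits right around the acuity limit at 8 feet, which is why both camps in this thread can be right: sit closer (or buy bigger) and 4k clearly helps, sit further back and it mostly doesn't.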
I would take 4k 30fps over 1080p/60fps for anything that's not like a fighting game.
I feel like the whole "30fps is fine, you can't even see 60fps" thing was just propagated by console fanboys haha. If someone hasn't seen 60fps before then I get it, but after having played PC games for a few years now, 30fps is painful to look at. Some games I can even run near 144fps, and on my 144hz monitor that is even more amazing.
I'm leaning towards the Xbox Series X in a year or two. Backwards compatibility is probably my most important desired feature, and you can put the system into developer mode to access RetroArch. Also, that Quick Resume has me jealous, coming from a gaming PC.
This time around, I will go out of my way to buy a limited edition console when one is for sale. If I end up selling it early, I can hopefully get a few extra dollars for a limited edition over the plain console. The PS5 will eventually come down in price, and I will probably buy one when there's a slim version; it's way too large as-is for my media area.
I didn't know about that at all, but just watched this video earlier today that walked through the possibility of doing it on a Series S: The $299 XBOX Series S is an Emulation Beast
That's actually really cool, I had no idea it was so easy to set it up for emulation. I assumed all the modern consoles were heavily locked-down. Seems like the Series S could be a great option for an emulator box or even putting inside something like a MAME cabinet.
Part of me does wonder if the DualSense controller, arguably the most interesting part of these "next-gen" consoles, is going to go the way of Nintendo's HD Rumble - hardly ever used except by certain first-party games, and even then never utilized to its full extent (outside of demos like Astro's Playground or 1-2-Switch). I certainly hope not.
Silly as it may sound, the PS5's controllers are selling me on the PS5. I love cool shit like that and it's surprising me that Sony is innovating that way. It definitely does remind me of something Nintendo would have tried. I really hope more games utilize it
Honestly, I hope it takes off. The Xbox has had something like this since last gen. Playing Forza with force feedback coming through the haptic triggers really is something to behold and I'm disappointed more games from last gen didn't make use of it. I've also enjoyed it in Gears and Tomb Raider.
I'm not sure what the big difference is between the Xbox One controller's implementation and the PS5's, but even if the Xbox One's is just a baseline, I think it's something worth implementing across the board.
To me, one of the most important points of comparison is that, starting with the PS3, all Sony consoles (with the exception of the Vita) were significantly more buggy and less resistant to failure than their competitors. On my PS4 Pro, messing with safe mode, rebuilding the database, and trying over and over to turn it on while following multiple online tips was a common occurrence. A power failure can mean a world of annoyances. The PS5 is no different: https://kotaku.com/the-ps5-is-kind-of-buggy-right-now-1845727223. It even has some of the same problems as the PS4!
I’m not sure why many reviews and comparisons fail to take that into account.
I expect a videogame console to work 99.99% of the time.
Me and most of my gamer friends are extremely fed up and don’t even consider buying a PS5 just because of Sony’s history of bugginess and constant annoyances on its consoles.
I feel like that's the kind of thing that reviews wouldn't really get into, because they tend to only come up over longer-term use and when a huge number of the consoles are out in the world. If there's a 1% chance of a failure issue, it would be very unlikely for a reviewer to run into it, but it could affect thousands of people once the actual release happens. I don't know if it happens, but I also wouldn't be surprised if companies put review units through stricter quality control to make sure reviewers get ones without issues.
I definitely wouldn't have chosen the PS3 generation as the starting point though. The "red ring of death" was a huge issue with Xbox 360s, with tons of people having to get their consoles replaced or even trying crazy home fixes like putting it inside the oven for a few minutes (it was supposed to fix the soldering or something).
Oh yeah, reviews like this one definitely won't have time for these kinds of issues to present themselves. But there are definitely enough complaints right now to warrant at least a mention, I think, especially because many are very similar to those from the PS4.
In my experience, though, the risk of moderate/solvable failure on a PS3/PS4 is a lot higher than 1%. I wasn't really talking about catastrophic failures like rings of death; both the Xbox 360 and PS3 had their versions of those.
I'm talking about pervasive annoyances that impact day to day operation. I've had an Xbox 360 that worked like a charm until I sold it many years later. The PS3/PS4, on the other hand, were a constant source of grievances. I don't think there's much attention to that in the press, maybe because it doesn't make for very enticing reporting. For what it's worth, my friends who own both Sony and Microsoft consoles (current and previous generation) tell me that Sony gives them a lot more headaches overall.
EDIT: I'm also not an early adopter, so catastrophic failures are usually sorted out in the revisions I buy. Small annoyances, not so much.
It's kinda incredible looking back at the 7th gen just how unreliable those machines were. Both the 360 and PS3 had major overheating problems and were generally just finicky beasts.
I don't have a PS4 or Xbox One, but in my experience both the PS3 and 360 were prone to crashes, big performance drops even in menus, and other little glitches all the time. Looking back further, I didn't feel like stability was really a problem on the PS2 or OG Xbox. Not saying your experiences are invalid, but for me it'd at most be one generation where Sony had worse stability than Microsoft.
For me it comes down to the fact that last generation, Sony blew Microsoft out of the water when it came to exclusives. As long as studios like Naughty Dog (The Last of Us, Uncharted) and FromSoftware (Bloodborne; not a first-party studio, but a negotiated exclusive) are only available on PlayStation, I'll continue to buy PlayStations. Now, if Halo ever returns to form, I might be convinced to go back.
I actually do prefer the UX on Xbox, but exclusives matter more to me.
I personally did not care for any of the exclusives. I may have a weird taste.
Ugh, this is making it difficult to justify a brand new gaming PC. Thinking of a rig with the new Zen 3s, Ampere/RDNA2, and a Valve Index. I know the specs aren't comparable, but between the price/performance ratio and the PS5 exclusives...
Really I'm just looking for someone to help me justify this, as I'm probably still going to go for the PC...
You'll be contributing to the fight against the facebookulusization of VR!
Just up the thread @nothis made a great point about how minor the generational changes are now compared to what we used to see. Gaming is a mature industry, and for all the good that brings it does make for far fewer jaw-dropping "holy shit" experiences.
VR is something entirely different - it's as big a change as the move from 2D to 3D all those years ago, and the current crop of hardware can comfortably push the limits of any PC out there.
That's not an unequivocally good thing: the whole field is still rough around the edges, and much as I hate to say it, the overall UX is quite a lot cleaner on the Quest than on SteamVR. But the immersion on the Index is absolutely second to none, and it's exciting in exactly the same way console gaming was back in the 16- and 32-bit era.
Yea, that's my primary motivation, which nucleated with my experience of HL:Alyx on low settings. It felt like a revolution in the gameplay experience, not just enhanced 3D. I just want to know I'm not spending $2500 to play one game... Creation in VR is great too, but I'm not sure it requires high specs anyway.
Honestly, get the PC.
You won't be getting it just to play HL:Alyx; there's a load of other games out there that are good fun. Boneworks is another one, along with the trusty Beat Saber.
There's also this list here - https://docs.google.com/spreadsheets/d/1s7yu7EMzrLTeLJU4yYHeZorC1AeWpi6h6s2OQ4ffdW0/edit#gid=0
I'd also suggest picking up this bundle (to any other VR players too), as it's active for another 2 days and you get £100+ worth of VR games for about £17 - https://www.humblebundle.com/games/fall-vr
If you're curious about playing GTA5 in vr (with a controller) which is pretty insane, look here - https://github.com/LukeRoss00/gta5-real-mod
Honestly, do it; the jump is so worth it. You may miss out on a few PS5 exclusives, but you can always play them at a later date when the prices of the console (and games, respectively) drop. VR changes the game forever.
Damn, thanks for all those resources! In your experience, do any of those compare to HL:A?
I'm probably going to grab the bundle myself, not having played most of the games in it, but I've played a fairly wide selection of VR games, and I've never played anything like Alyx. It's not really a fair comparison, though. The thing to understand about VR is that it's a lot like the early days of video game development: it's still a new platform that's mostly served by small developers releasing quirky, creative, experimental titles, not one swimming in triple-A bucks and polished releases. In fact, most mainstream studios' VR ports are half-assed shovelware, to be blunt. But if you're willing to dig a little, there's a lot of fun to be had (Rec Room, The Lab, VRChat if that's your thing), and a lot of the best experiences are free.
Although the hardware cost is a lot higher for PC, the games are a lot cheaper and moddable, and there's a much wider selection and a huge back catalog. £70 for a PS5 game that, realistically, will also have microtransactions is actually insane.
Plus you can use your PC for things other than just gaming - such as software development, web browsing or whatever else you like to use a computer for.
Any Steam sale, Humble game bundle, or even Epic store free games could let you build up a solid back catalog of quality titles, and that may include some recent and traditionally console-centric games (I'm thinking of games like Devil May Cry, Yakuza, Horizon Zero Dawn).
The £70 is what shocks me. I recently paid £25 for a Steam game and that was the most expensive game I can remember purchasing. I guess when you tally it all up I spent £50 on Civ 6, but that was in 3 small portions over more than a year. Do you get DLC for console games the same way?
I almost never end up paying full price for games because I don’t have time to keep up with the latest. I only buy a game if I’m planning to play it immediately, and by the time I get around to anything it’s like, $40.
It's true that games are cheaper on PC, but comparing launch day price isn't that productive, IMO. You can get AAA PS4/Xbox games for like $20 a year or two after release and at launch, PC game prices aren't that cheap, either. It's ultimately a case of games being $10 to $15 cheaper and sales hitting a little earlier. But unless you buy like 2 games a month, it could take close to a console's life span to make up that difference.
IMO the only real argument for PC gaming is getting access to crazy high-end modes and early trends that aren't yet big enough to make it to consoles. You can do more with PCs than gaming, but I guess most people already have a laptop that can do most of that already (including playing stuff like Factorio, running emulators or installing Minecraft mods). Arguing for gaming PCs as a cheaper option... I dunno, it feels like a bit of a stretch.
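To put rough numbers on that break-even point (all figures hypothetical, the function is mine):

```python
# Hypothetical break-even: how many games the per-title PC savings must
# cover before they pay back the extra hardware cost.
def games_to_break_even(pc_premium: float, savings_per_game: float) -> float:
    return pc_premium / savings_per_game

# e.g. a PC costing $600 more than a console, with games ~$12.50 cheaper:
n = games_to_break_even(600, 12.50)  # 48.0 games
```

At one game a month, 48 games is four years of purchases, which is in the ballpark of the "close to a console's life span" estimate above; heavy bundle-buyers obviously get there much faster.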
Does higher end VR count?
With things like Humble and Fanatical bundles, PC gaming has become extremely cheap. I've been keeping tabs on it over the last generation to see how best to get the games I want and, invariably, they're always cheapest on PC even if they don't come as part of a greater bundle. For example, I've gotten Yakuza 0, Kiwami, and Kiwami 2 all through Humble Bundles. The cheapest I could get those for on console is still the price of at least Kiwami 2 at half price, which is still more than the Humble Bundle distribution.
Though this could be regional. I imagine if I lived in the US, the discrepancy would be a lot smaller.
Otherwise, the main advantage for me is modding. I never start a game now without taking a peek at its PC Gaming Wiki page to see if there are fixes or QoL mods that help smooth over technical issues. That's especially important if you're gaming at high refresh rates.
Depending on the games you play, it's worth remembering some can't run on console at all, and you can always play with a controller on PC if you prefer.
PC exclusives? I thought Crysis was the last of that!
Maybe among triple A titles, but there are a ton of indie games that are never going to see a console release.
And right now, indie games are where the creativity is. By not running in the AAA graphics race, there's a lot more room to experiment with gameplay.
I agree. I think 4/5 of my favorite games in the past few years were indie games that cost <$20, some <$10, with okay graphics but amazing everything else.
What would those be? Because I'm always a fan of a cheap indie game.
Not necessarily recent, and off the top of my head. In no order:
*Biggest bangs for the buck but you can beat them in one sitting.
It's games that you can't easily play without a keyboard and mouse, like Anno, Civilization, MOBAs, Age of Empires, and all that ilk.