Assassin’s Creed Odyssey PC Performance Explored

Today is all about Assassin’s Creed Odyssey and its PC port. Assassin’s Creed hasn’t exactly had the best history on the PC platform. In its early days, paired with a shaky-at-best Uplay, it wasn’t uncommon for game saves to go missing unexpectedly. Later down the road Ubisoft partnered with NVIDIA to bring new features through GameWorks, and that wasn’t well received either, since around that time it was character faces that went missing along with the save files. Origins introduced a new style of gameplay and an insane demand on the CPU, and GeForce cards took a commanding lead at launch. This time around Ubisoft has partnered with AMD, even putting the game in their Radeon Rewards bundle. It should be interesting to see how things have progressed since Origins.

We set out to be a bit more ambitious on this title than we have in the past, hoping to really round out the testing, but Denuvo had different plans. After getting locked out of our retail-purchased copies many times, we had to scale back some of the testing. The idea was to bring different platforms into the fold and see how performance looked across them, giving more people an idea of what to expect from a platform similar to theirs. We wanted to look at graphics cards as usual, but also at preset scaling and core scaling across both AMD and Intel platforms. This look at Assassin’s Creed Odyssey is what you can expect to see more of as we move forward, and no, our GTX 1080 Ti has not made it to this office yet.


Testing Methodology

Running through the tests for Assassin’s Creed Odyssey was pretty straightforward, as the in-game benchmark was representative of the performance I saw after about an hour and a half of gameplay. It did, however, have one glaring catch that slowed things down considerably: who in the world puts dynamic weather patterns in a canned benchmark? We made sure to take results only from runs with clear weather, since rain introduced a wild variable; the clouds were sporadic and impacted performance a bit.

We used FRAPS to capture frame times for every frame rather than relying on the in-game benchmark to report averages, and we needed the 1% and 0.1% low metrics that the game wouldn’t give us. Something to note on the in-game results: they were higher than FRAPS, and the benchmark also appeared to be culling results, reporting artificially higher numbers than what you are actually getting.

As far as settings go, we explored the different presets and found Odyssey very reminiscent of Deus Ex: Mankind Divided in terms of how demanding each setting is. Rather than comparing everything at Ultra, and since others have already tested the highest settings, we went with the ‘High’ preset as it’s more likely what people will actually choose. ‘Low’ and ‘Medium’ have clear visual disadvantages, while ‘Very High’ and ‘Ultra High’ offer very little over ‘High’ but come with a massive drop-off in performance.
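For readers curious how those figures are derived: average FPS and the 1% / 0.1% lows can all be computed from a log of per-frame times. The sketch below is a hypothetical illustration of the common approach (averaging the slowest slice of frames); the log format and exact method are assumptions on our part, not the precise FRAPS pipeline used here.

```python
def low_metrics(frame_times_ms):
    """Return (avg_fps, 1%_low_fps, 0.1%_low_fps) from per-frame times in ms.

    Hypothetical sketch: 'lows' here are the average FPS of the slowest
    1% and 0.1% of frames, one common definition of these metrics.
    """
    times = sorted(frame_times_ms, reverse=True)  # slowest frames first

    def avg_fps(frames):
        # average FPS of a set of frames = frames rendered / seconds elapsed
        return 1000.0 * len(frames) / sum(frames)

    n1 = max(1, len(times) // 100)    # slowest 1% of frames
    n01 = max(1, len(times) // 1000)  # slowest 0.1% of frames
    return avg_fps(times), avg_fps(times[:n1]), avg_fps(times[:n01])
```

A run with one bad hitch shows why these numbers matter: 999 frames at 10 ms plus a single 100 ms frame still averages around 99 FPS, but the 0.1% low collapses to 10 FPS.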

Test Systems

Components  | X370                                 | Z370
CPU         | Ryzen 7 1700 @ 4GHz                  | Core i5 8600K @ 5GHz
Memory      | 16GB G.Skill Flare X DDR4 3200       | 16GB GeIL EVO X DDR4 3200
Motherboard | MSI X370 XPower Gaming Titanium      | MSI Z370 Gaming Plus
Storage     | Adata SU800 128GB + 2TB Seagate SSHD | Adata SU800 128GB + 2TB Seagate SSHD
PSU         | Cooler Master V1200 Platinum         | Cooler Master V1200 Platinum

Graphics Cards Tested

GPU                        | Architecture | Core Count | Clock Speed (MHz) | Memory Capacity | Memory Speed
NVIDIA RTX 2080 Ti         | Turing       | 4352       | 1350/1635         | 11GB GDDR6      | 14Gbps
NVIDIA RTX 2080 FE         | Turing       | 2944       | 1515/1800         | 8GB GDDR6       | 14Gbps
NVIDIA GTX 1080 FE         | Pascal       | 2560       | 1607/1733         | 8GB GDDR5X      | 10Gbps
NVIDIA GTX 1070 FE         | Pascal       | 1920       | 1506/1683         | 8GB GDDR5       | 8Gbps
NVIDIA GTX 1060 FE 6GB     | Pascal       | 1280       | 1506/1708         | 6GB GDDR5       | 8Gbps
XLR8 GTX 1060 3GB          | Pascal       | 1152       | 1506/1708         | 3GB GDDR5       | 8Gbps
AMD RX Vega 64             | Vega 10      | 4096       | 1247/1546         | 8GB HBM2        | 945MHz
XFX RX Vega 56             | Vega 10      | 3584       | 1156/1471         | 8GB HBM2        | 800MHz
XFX RX 480 8GB             | Polaris 10   | 2304       | 1266              | 8GB GDDR5       | 8Gbps
Sapphire RX 570 Nitro+ 4GB | Polaris 20   | 2048       | 1340              | 4GB GDDR5       | 7Gbps

Drivers Used

Driver          | Version
Radeon Settings | 18.9.3
GeForce         | 416.16

Preset Scaling At 4K


We wanted to start things off by seeing how the game scales across the built-in presets at 4K, using the i5 8600K and RTX 2080 Ti, so that we could get a real look at the impact and decide which preset to use. Again, ‘Low’ and ‘Medium’ offer substantial performance increases, but at a noticeable hit to image quality, especially evident in the more open areas.

Ryzen 7 1700 Core Scaling

Core scaling on Ryzen is a similar song and dance to what we’ve seen before, showing gains all the way up through 6 cores and 12 threads, but this time around we still see gains when moving to the 8-core, 16-thread configuration. Because Denuvo treats a change in core and thread count as a ‘new’ system, this was one of the tests limited by that wonderful DRM lockout we kept experiencing, so we kept the test limited to typical, available Ryzen core configurations.

Core i5 8600K Core Scaling

Only having the Core i5 8400 and Core i5 8600K on hand limited our Coffee Lake testing to straight core counts. Seeing how this game scales with available cores, I was a little concerned at this point that the hex-core would struggle, but the high clocks on the 8600K make up for what it lacks in Hyper-Threading: the 6-core configuration outpaces the 6-core, 12-thread Ryzen, and the 4-core configuration matches the 4-core, 8-thread Ryzen. Also, stay home, dual cores without Hyper-Threading; nothing to see here.

8600K vs 1700

But what about the Ryzen 7 1700 (4GHz) put next to the Core i5 8600K (5GHz), both paired with the RTX 2080 Ti? It looks like a pretty solid competition, with both trading blows and neither coming out the clear winner. The 8600K inches ahead in the 0.1% lows at 1080p but loses that edge at 4K.

1080p

1440p

4K

Conclusion

Well, when it comes to performance it looks a lot like Assassin’s Creed Origins all over again. While Radeon is behind again, it’s at least not a stutter fest, so I guess that’s a positive. The CPU is still getting pounded to the point that there’s a clear and present bottleneck at 1080p, and even then the frame rates aren’t anything to write home about. Having to drop to 1080p High and pair it with at least a GTX 1080 Ti (I know we don’t have one, but I think we can all surmise) to keep a constant 60 FPS or better is far from anything anyone should be happy about. This would be understandable if the game could claim to be ‘The New Crysis’, but it simply isn’t.

Now, with that rant out of the way: since this isn’t a twitch shooter or a game that benefits from higher frame rates beyond feeling smoother, if you’re staying over 45 FPS on a variable refresh rate monitor it’s going to be okay. The game plays well enough around the 45 FPS level. I know how that sounds, and it’s not the coveted 60 FPS barrier, but it is okay. The caveat is you’re going to want a variable refresh monitor, or grab a controller and turn on VSYNC, because this engine exhibits insane levels of tearing otherwise. This appears to be another game that is actually good on its own but will be remembered on PC more for its uninspiring performance than its actual gameplay. It might be time for Ubisoft to sit back, take the Shadow of the Tomb Raider approach, and see what DX12 could do for them; it did wonders for the Tomb Raider franchise on PC.


