Gears of War 4 PC Performance Review

Introduction

When the last iteration of Gears of War was released on PC, gamers were not impressed, with the port suffering from huge technical issues and lacking support for key features like FreeSync and G-Sync due to the game's use of the Universal Windows Platform (UWP).

Since then, Microsoft has made several key improvements to UWP and announced the Xbox Play Anywhere program, which allows owners of the Xbox version of a game to access the PC version on the same Microsoft account. Alongside these changes, The Coalition, the developer of Gears of War, has learnt a lot from its mistakes with Gears of War: Ultimate Edition on PC and is looking to show off its newfound PC development skills.

Gears of War 4 is an ambitious release on PC, looking to regain the support of PC users with 4K texture support, built-in benchmarking tools and one of the largest options menus that we have ever seen in a PC game. Will Gears of War 4 perform well on PC, or is this another shoddy Xbox One port?

Drivers 

For this game, we will be using the newest drivers that were available when the game released: Nvidia's Game Ready GeForce 373.06 driver and AMD's 16.10.1 driver, the most recent releases from each company at the time of testing.

 

Test Setup  

We will be testing this game on our dedicated GPU test rig using high-end, mid-range and lower-end GPUs from both AMD and Nvidia.

 

Game Test Rig
Intel i7 6850K
ASUS X99 Strix
G.Skill Ripjaws 4x4GB DDR4 3200MHz
Corsair HX1200i
Corsair H110i GT
Windows 10 x64 


Nvidia GTX 980Ti (Left), AMD R9 Fury X (Middle), GTX 1070 Founders Edition (Right)

 

For the high-end, we will be testing AMD's R9 Fury X, Nvidia's GTX 980Ti and the GTX 1070. Sadly, we do not have a GTX 1080 to test at this time.

For the mid-range offerings, we will be testing the new RX 480 and GTX 1060, both of which will be the ASUS Strix Gaming models.

 


ASUS GTX 1060 Strix (Left), ASUS RX 480 Strix (Right)

 

To represent AMD and Nvidia's lower-end GPU offerings we have decided to use the AMD R9 380 and the Nvidia GTX 960. Both of these GPUs will be the ASUS Strix models. 

Both of these GPUs offer similar performance in most scenarios and come in at similar price points, so it will be interesting to see which one comes out on top.

 


Nvidia GTX 960 (Left), AMD R9 380 (Right)

 


Most Recent Comments

13-10-2016, 16:23:05

AngryGoldfish
For every turd there is a chocolate bar.

13-10-2016, 18:14:42

gijoe50000
Would be nice if they optimized the textures/game for certain amounts of VRAM. Having 4210MB makes no sense since the level down for 1080p is 2770MB. Everyone with a 4GB GPU will be missing out here.

There seems to be a huge gap between Ultra and High.

It would make more sense to have the VRAM optimized as, say: 6GB for Ultra, 4GB for Very High, 3GB for High, 2GB for Medium.
Does that make sense?

13-10-2016, 18:47:06

NeverBackDown
Strange that the only Nvidia card to lose performance when using Async was actually the 1080.

14-10-2016, 04:26:27

AlienALX
Might actually pull the trigger on this once I'm done with Mafia III (not even started yet lol).

14-10-2016, 09:11:55

SPS
Quote:
Originally Posted by gijoe50000
Would be nice if they optimized the textures/game for certain amounts of VRAM. Having 4210MB makes no sense since the level down for 1080p is 2770MB. Everyone with a 4GB GPU will be missing out here.

There seems to be a huge gap between Ultra and High.

It would make more sense to have the VRAM optimized as, say: 6GB for Ultra, 4GB for Very High, 3GB for High, 2GB for Medium.
Does that make sense?
In UE4 you set a pre-defined texture streaming pool; this can either be set exactly or as a percentage of VRAM. The streaming system will then only stream in the correct mips per texture where it is able to do so. If the texture pool is set to 20% of VRAM at low settings on a 1080 (which has a fair bit of VRAM), it's possible that the VRAM report actually shows more VRAM being used than necessary. Once the pool is full, it will only stream out textures when the memory being used is required by a texture coming into view. This helps prevent wasted time streaming out textures for the sake of it.

https://docs.unrealengine.com/latest...res/Streaming/
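
To make the streaming-pool behaviour SPS describes a little more concrete, here is a minimal, hypothetical C++ sketch of a budgeted mip-streaming pool: the budget is set either as an exact size or as a percentage of VRAM, and resident mips of out-of-view textures are only streamed out when an incoming, visible texture actually needs the space. All of the names here (TextureStreamingPool, RequestMip and so on) are illustrative assumptions, not UE4's actual API; the linked documentation describes the real system.

// Hypothetical sketch of a budgeted texture mip-streaming pool.
// Not UE4 code; it only illustrates the behaviour described above.
#include <cstdint>
#include <string>
#include <vector>

struct StreamedTexture {
    std::string name;
    std::vector<uint64_t> mipSizes; // bytes needed at each mip level, index 0 = highest detail
    int residentMip = -1;           // -1 = no mip currently resident
    bool visible = false;           // is the texture currently in view?
};

class TextureStreamingPool {
public:
    // The pool budget can be an exact size or a percentage of total VRAM.
    static TextureStreamingPool FromExactBytes(uint64_t bytes) {
        return TextureStreamingPool(bytes);
    }
    static TextureStreamingPool FromVramPercent(uint64_t vramBytes, double percent) {
        return TextureStreamingPool(static_cast<uint64_t>(vramBytes * percent / 100.0));
    }

    // Try to bring 'tex' up to 'wantedMip'. Non-visible textures are only
    // streamed out when the incoming texture actually needs the space,
    // rather than being evicted speculatively.
    bool RequestMip(StreamedTexture& tex, size_t wantedMip,
                    std::vector<StreamedTexture*>& allTextures) {
        const uint64_t needed = tex.mipSizes.at(wantedMip);
        const uint64_t current = (tex.residentMip >= 0) ? tex.mipSizes[tex.residentMip] : 0;
        if (needed <= current) {
            return true; // already resident at this detail level or better
        }

        const uint64_t extra = needed - current;
        if (used + extra > budget) {
            // Pool is full: free non-visible textures until the incoming one fits.
            for (StreamedTexture* other : allTextures) {
                if (other == &tex || other->visible || other->residentMip < 0) {
                    continue;
                }
                used -= other->mipSizes[other->residentMip];
                other->residentMip = -1;
                if (used + extra <= budget) {
                    break;
                }
            }
            if (used + extra > budget) {
                return false; // still no room, keep the lower mip for now
            }
        }

        used += extra;
        tex.residentMip = static_cast<int>(wantedMip);
        return true;
    }

    uint64_t UsedBytes() const { return used; }

private:
    explicit TextureStreamingPool(uint64_t b) : budget(b) {}
    uint64_t budget = 0;
    uint64_t used = 0;
};

As a rough usage example under the same assumptions, a 20% pool on an 8GB card (FromVramPercent(8ULL * 1024 * 1024 * 1024, 20.0)) gives roughly a 1.6GB budget, which is why a large-VRAM card can appear to "use" more memory than the settings strictly require.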
