Test Configuration
Test Configuration – Hardware
- Intel Core i7-4790K (4.0 GHz reference; HyperThreading on and Turbo Boost enabled up to 4.4 GHz; DX11 CPU graphics), supplied by Intel
- ASUS Z97-E motherboard (Intel Z97 chipset, latest BIOS, PCIe 3.0 specification, CrossFire/SLI 8x+8x)
- Kingston 16 GB HyperX Beast DDR3 RAM (2×8 GB, dual-channel at 2133MHz, supplied by Kingston)
- GTX 1080, 8GB, Founder’s Edition, reference clocks, supplied by Nvidia
- GTX 1070, 8GB Founder’s Edition, reference clocks, supplied by Nvidia
- GeForce GTX 980 Ti, 6GB in SLI and also tested as single GPU, reference clocks, supplied by Nvidia
- EVGA GTX 980 Ti SC, 6GB in SLI and also tested as single GPU, at reference clocks, supplied by EVGA
- 2 x GeForce GTX 980, 4GB, reference clocks, in SLI and also tested as single GPU, supplied by Nvidia
- GALAX GTX 970 EXOC 4GB, GALAX factory overclock, supplied by GALAX
- PowerColor R9 Fury X 4GB, at reference clocks.
- VisionTek R9 290X 4GB, reference clocks, in CrossFire and also tested as single GPU; fan set to 100% to prevent throttling.
- PowerColor R9 290X, 4GB, reference clocks, in CrossFire; fan set to 100% to prevent throttling.
- PowerColor R9 280X, 3GB, reference clocks, supplied by PowerColor.
- Two 2TB Toshiba 7200 rpm HDDs
- EVGA 1000G 1000W power supply unit
- Cooler Master Seidon 2.0 CPU cooler, supplied by Cooler Master
- Onboard Realtek Audio
- Genius SP-D150 speakers, supplied by Genius
- Thermaltake Overseer RX-I full tower case, supplied by Thermaltake
- ASUS 12X Blu-ray writer
- Monoprice Crystal Pro 4K monitor
Test Configuration – Software
- Nvidia’s GeForce GTX 1080 launch drivers 368.13 were used for the GTX 1080 and the TITAN X, and the 368.19 launch drivers were used to benchmark the GTX 1070. The latest WHQL public drivers from the same family, GeForce 368.22, were used for the GTX 980 Ti and the GTX 980 – including for SLI – and also for the GTX 970 EXOC, as noted on the Big Picture chart. High Quality, prefer maximum performance, single display.
- The AMD Crimson Software 16.5.3 beta hotfix driver was used for benching the Fury X, the 290X and for 290X CrossFire, and 16.5.2.1 Hotfix was used for the 280X. Global settings are noted below after the benchmark suite.
- VSync is off in the control panel.
- AA enabled as noted in games; all in-game settings are specified, with 16xAF always applied.
- All results show average frame rates; minimum frame rates are shown in italics on the chart next to the averages, in a smaller font.
- Highest quality sound (stereo) used in all games.
- Clean install of Windows 10 Home 64-bit with the latest DirectX; all DX11 titles were run under the DX11 render path, and DX12 titles under the DX12 render path.
- All games are patched to their latest versions at time of publication.
- EVGA’s Precision XOC, latest beta version, was used for the Nvidia cards.
The 25 PC Game benchmark suite & 1 synthetic test
- Synthetic
- Fire Strike – Basic & Extreme
DX11* Games
- Crysis 3
- Metro: Last Light Redux (2014)
- GRID: Autosport
- Middle-earth: Shadow of Mordor
- Alien Isolation
- Dragon Age: Inquisition
- Dying Light
- Total War: Attila
- Grand Theft Auto V
- Project CARS
- The Witcher 3
- Batman: Arkham Knight
- Mad Max
- Fallout 4
- Star Wars Battlefront
- Assassin’s Creed Syndicate
- Just Cause 3
- Rainbow Six Siege
- DiRT Rally
- Far Cry Primal
- Tom Clancy’s The Division
- DOOM (OpenGL game)
DX12 Games
- Ashes of the Singularity
- Rise of the Tomb Raider
- Hitman
Here are the general settings that we always use in AMD’s Crimson Control Center for our default benching. Specific settings for AMD CrossFire are shown under CrossFire Options. The new Power Efficiency toggle, which was made available for the Fury X and some 300 series cards after Crimson Software 16.3, is left off in our benching of the Fury X. Please note that 100% fan speed is used for benching the 290X reference versions, and they do not throttle at all.
Nvidia’s Control Panel settings
These are the general settings used in Nvidia’s Control Panel. Specific SLI settings are shown under SLI Options.
How We Benchmarked
We noted that CrossFire support, which scaled better for DX12 Hitman with 16.3, now gives somewhat less scaling for 290X CrossFire with the latest driver. We did not enable any workarounds for CrossFire or SLI – such as using Nvidia Inspector – but instead benchmarked using the default multi-GPU settings that the latest performance drivers offered us.
We tested 25 games with the options provided for us in the drivers – for example, forcing AFR-1 and AFR-2 for games where SLI did not work well – and we also used the profiles provided by the Crimson Software drivers.
CrossFire Options
These are the Global settings that we used with Crimson Software 16.5.3 for 290X CrossFire. CrossFire bridges are not used with the 290X, 390X or Fury X, unlike multi-GPU Nvidia cards, which require an SLI bridge.
In Global Settings we set “CrossFire” to On and “AMD CrossFire Logo” to On. Enabling the second setting displayed the AMD CrossFire logo overlay in the upper right corner of most DX11 games, although it does not display in any of our 3 DX12 games.
The CrossFire logo was displayed in Fallout 4 even though the game mostly scaled negatively for us using 290X CrossFire compared with using just one 290X.
Here are the Global settings that allow us to change profiles for each individual game.
Let’s use Rise of the Tomb Raider as an example to see what AMD allows us to do. First we see the default settings, which in our global settings enable or allow CrossFire for all games.
We saw little to no scaling in our three DX12 games as shown by our charts on the next page, although DX11 CrossFire may allow for it. Next we tried “AFR Friendly”.
Again, no real change to performance and no scaling for CrossFire over using just one 290X. Now we tried “AFR Compatible”.
We also tried AMD’s “Predefined Settings” which appear to be the same as the Global Settings.
Finally, we tried Optimize 1×1. Again, we saw no real change in performance and no scaling for CrossFire over a single 290X, just as with the other settings and as reflected by our charts on the next page. Generally, there is no extra performance over using just one GPU in Rise of the Tomb Raider under DX12, and we see the same with more games on our charts on the next page.
The Optimize 1×1 setting is the same as disabling CrossFire or using just one 290X. Again, with most games that do not scale there was no change to performance, although games that scale negatively with CrossFire no longer do so, making it a very useful setting.
Let’s look at Nvidia’s SLI options.
SLI Options
First of all, it is important to select “Maximize 3D Performance” in Nvidia’s Control Panel if you have two identical GPUs set up with an SLI bridge.
Using Precision XOC we make sure that our GPUs are linked (upper right corner) and set to the same settings.
Under 3D settings, you can change SLI settings for individual games. We are using Just Cause 3 and Rise of the Tomb Raider as examples to illustrate the options available to us.
First, here are the default settings, “Nvidia Recommended”, which in this case means SLI is enabled for Rise of the Tomb Raider.
We saw no real scaling over single-GPU performance, as reflected by our charts on the next page, so we tried “Force Alternate Frame Rendering-1”.
This time, the game locked up and we had to restart it. Next we tried “Force Alternate Frame Rendering-2”.
This time, our PC locked up and we had to restart it. Forcing AFR apparently does not work well with our three DX12 games, although it does work positively with some DX11 games.
There is another option which disables SLI for each game individually when it scales negatively. In this case, we used Just Cause 3 as our example although the same setting exists for Rise of the Tomb Raider and for most games.
Just Cause 3 evidently scales only very slightly with SLI, and setting it to Single-GPU disables SLI, returning performance to that of a single GPU. Single-GPU is quite useful when a game scales negatively.
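For readers new to multi-GPU benchmarking, the “scaling” we refer to throughout is simply the multi-GPU frame rate relative to the single-GPU frame rate. A quick sketch with hypothetical numbers (not measured results) shows the arithmetic:

```python
# Hypothetical example of how SLI/CrossFire scaling is computed.
# These fps values are made up for illustration, not benchmark data.
single_gpu_fps = 60.0
sli_fps = 63.0

# Percent gain of the second GPU over a single GPU.
scaling = (sli_fps / single_gpu_fps - 1) * 100

print(f"SLI scaling: {scaling:+.0f}%")
# A negative value means the second GPU actually hurts performance,
# which is exactly when the Single-GPU setting is worth using.
```

A result near 0% (or below it) is why disabling multi-GPU per game can be the better choice.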
Let’s head right to our results.
I wonder how the scaling is in Surround/Eyefinity. I’m running two 980 Tis in SLI at 7680×1440, which is 11 million pixels compared to the 8.3 million pixels of 4K. DOOM runs around 45-50 fps with everything maxed except for textures (running those at Ultra, not Nightmare), and I’m pretty sure a single card wouldn’t be able to do that FPS. But I could be wrong.
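For reference, the two pixel counts being compared work out as follows (resolutions taken from the comment above):

```python
# Pixel counts for triple-1440p Surround/Eyefinity vs. 4K UHD.
surround = 7680 * 1440   # triple-1440p spanned resolution
uhd_4k = 3840 * 2160     # 4K UHD

print(surround)          # 11,059,200 pixels (~11.1 M)
print(uhd_4k)            # 8,294,400 pixels (~8.3 M)
print(surround / uhd_4k) # Surround pushes about 1.33x the pixels of 4K
```

So the triple-1440p setup is indeed roughly a third more demanding than 4K in raw pixel throughput.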
290X CrossFire runs at 25-38 fps in AOTS? An optimized multi-GPU pro-AMD game? Seriously?
Unfortunately I don’t find some of the above results reasonable. I own an RX 480 and I have run many games.
I can tell you clearly that GTA V scales very well with it; you may also refer to TechPowerUp and see their results, which are consistent with what I observed at various resolutions. The only issue is microstuttering, but it is well acceptable to me.
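On the microstuttering point: in AFR, each GPU renders alternate frames, so the gaps between frames can alternate short/long even when the average frame rate looks healthy. A small sketch with made-up frame gaps (illustrative numbers only, not measurements) shows why headline fps can hide it:

```python
# Hypothetical AFR frame-time trace: alternating short/long gaps
# between frames, typical of microstutter. Values are made up.
frame_gaps_ms = [8, 25, 9, 24, 8, 26, 9, 24]

avg_gap = sum(frame_gaps_ms) / len(frame_gaps_ms)
avg_fps = 1000 / avg_gap               # the headline average fps
worst_fps = 1000 / max(frame_gaps_ms)  # fps implied by the longest gap

print(f"average: {avg_fps:.0f} fps, worst gap: {worst_fps:.0f} fps")
```

The average looks fine, but the uneven pacing between consecutive frames is what the eye perceives as stutter.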
I can’t say for sure if this is because 290X wasn’t optimized well, but definitely it doesn’t reflect what Crossfire and SLI are really doing.
You may refer to this instead for Crossfire performance:
http://amdcrossfire.wikia.com/wiki/Crossfire_Game_Compatibility_List
This review was posted in June and the SLI and CF results were accurate. Since then, AMD has made good progress with CrossFire in GTA V and it scales better now.
As you can see, Nvidia has also made good progress with SLI since June. This article shows GTX 1070 SLI vs. Titan XP and it was posted in September.
http://www.babeltechreviews.com/titan-x-vs-gtx-1070-sli/
From time to time, BTR will update CrossFire and SLI results in future evaluations.
Right, but that’s a good reason why an article like this in particular might need to be pulled entirely. A huge point was made, and a strong conclusion drawn, over what amounts to software tuning issues. These issues have largely been resolved, but the article doesn’t even have a note indicating that, yet it still attracts comments and stands as advice (and from the comments you can see supporting confirmation bias).
Every review is a snapshot of what is happening at the moment it is written, much like a historical document. There is no reason to take this article down.
As a matter of fact, BTR has an even newer SLI evaluation which is less than a month old:
http://www.babeltechreviews.com/gtx-1080-ti-sli-performance-25-games/
“GTX 1080 Ti SLI Performance in 25 Games”
I have a water-cooled tri-fire setup, and I have suffered greatly at the hands of microstutter and non-scaling. At this point, however, cost-wise I would have to put in at least another $400 (CAD) after selling my stuff to get a GTX 1080. The idea that my cards can beat the 1080 or come close, and do so quietly, has me holding on to what I’ve got.
I’m shaking my head that games are not optimized for crossfire/sli. Don’t they want crazy people like us spending all this money on multiple cards and accessories?
Bottom line, never again. Especially if support is getting weaker as new games come out.
I upgraded my horrible video card to a gtx 950 about 6 months ago. It was a massive improvement for the money. Now, I’ve decided to build a whole new box and realize that I should have spent that money on a 1070/1080. Should I recoup the money spent by adding another 950, or go with a 1070?
lol i think the article answers that for you(!)
avoid low-end SLI like the plague
get a 1070 bro you will be able to play games max setting on 1080p for a few years