11.11.2017 Update: For the most up-to-date SLI benching results using Core i7-8700K at 4.6GHz, please see The GTX 1070 Ti SLI Performance Review vs. the GTX 1080 Ti – 35 Games Tested.
What’s better than a GTX 1080 Ti, the fastest video card in the world?  Two of them in SLI!  But is it worth the extra $700 for a second GTX 1080 Ti, plus the cost of an HB bridge, for the added performance?

This follow-up to BTR’s launch evaluation of the GTX 1080 Ti is going to test the same 25 modern PC games at 4 resolutions – 1920×1080, 2560×1440, 3440×1440, and 3840×2160 – to see how well SLI’d GTX 1080 Tis scale.  We have tested SLI and CrossFire before with rather mixed results.  We concluded from our last evaluation of the TITAN X vs. GTX 1070 SLI: “It is pretty clear that CrossFire or SLI scaling in the newest games, especially with DX12, are going to depend on the developers’ support for each game requiring a mGPU gamer to fall back to DX11.  We also note that recent drivers may break multi-GPU scaling that once worked.  Even a new game patch may affect multi-GPU game performance drastically.”
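“Scaling” here just means the percentage gain of two cards over one, computed from average FPS. As a quick sketch of the arithmetic (all FPS numbers below are hypothetical examples, not our measured results):

```python
# Hypothetical illustration of how SLI scaling percentages are derived.
# These FPS figures are made up; see the charts for actual results.

def sli_scaling(single_fps: float, sli_fps: float) -> float:
    """Return the percentage gain of two cards over one."""
    return (sli_fps / single_fps - 1.0) * 100.0

print(sli_scaling(60.0, 110.0))  # ~83% -- a title where SLI scales well
print(sli_scaling(60.0, 55.0))   # negative -- SLI actively hurts
```

Perfect scaling would be +100%; in practice even well-supported titles fall short of that, and a game with a broken profile can scale at zero or go negative.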

We were able to borrow a second GTX 1080 Ti and we ran the same benchmarks as in the GTX 1080 Ti launch evaluation last Thursday.  We removed the backplate from the bottom GTX 1080 Ti so that the top card could intake air more easily and we used our EVGA HB SLI bridge.

The EVGA HB SLI Bridge

No longer do the flexible ribbon SLI bridges bundled free with SLI motherboards carry enough bandwidth for Pascal SLI.  Now High Bandwidth (HB) SLI bridges are necessary to support the bandwidth for high display resolutions.  We received an HB SLI bridge from EVGA, which enabled us to run these benchmarks.

Our HB bridge is “single spacing”, and it also features an RGBW switcher that can display green, blue, white, or (even) red.

Here’s a closer look.


Here is the other side:

Here is GTX 1080 Ti SLI installed and lit up.  Since we use the “zero spacing” configuration, we removed the bottom card’s backplate so the top card could intake air more easily, which lowered the hot-running GTX 1080 Tis’ temperatures by a few degrees.  Temperatures of both cards generally stayed in the mid-80s C and occasionally neared 90C.  Better cooling would be helpful, as it is very likely that at least one of the GTX 1080 Tis throttled fairly regularly in our very warm (Summer-like) test room.

Let’s check out the test configuration.


  1. Thank you for providing SLI results for the 1080 Ti and more importantly thank you for testing at 3440×1440. As someone who games at 3440×1440, I like to see that resolution tested so I don’t have to extrapolate from another resolution. I’m running 1080s in SLI and though the extra card is of dubious value, my gaming rig has a bit more flair with a pair!

    • That’s not true. Maxing out settings at 4K, there’s no way you end up CPU bound in anything. You can see that in any 1080 Ti or Titan X Pascal review. There are a ton of games where they can’t even touch 60fps at 4K max.

      I moved from Titan X Pascal to 1080Ti SLI on a lowly 6700K with no overclock and saw a massive jump at 4k max in the games where SLI actually scales.

    • There is no such thing as your GPU being CPU bound. The GPU is always working at 100%. Always.

      Some games have higher CPU requirements, so you get lower framerate because the CPU cannot keep up. That is independent of your GPU. In fact, upgrading your GPU will give you a higher framerate even when the game is CPU bound, because the time leftover to render is the same but the power of the GPU is increased. When you have lower utilization, it isn’t because your GPU is limited. It is because you have a limited amount of time to render thanks to incompetent programming that took up too much CPU. When your CPU is not utilized & your GPU is at 100%, it is because you have a lot of free time to render.

      BOTH situations (low & high amount of time left to render in 1 second) mean the GPU is working at 100% rendering as fast as it can & that the faster your GPU the more frames you will get within that time.

      If the CPU limited your GPU, then you wouldn’t see a gain when upgrading GPUs. You would see your CPU at 100% & no matter what GPU you used, it would net you 0 extra frames (within some margin of GPU capacity).

      Instead, you do see a change in fps when you change your GPU. So this is a myth propagated by people who have no idea what they’re talking about (nearly everyone on tomshardware forums, where the myth is extremely strong).

      As for SLI, there are additional benefits outside of just higher framerates (options in Nvidia control panel) but I doubt anyone thinks those are worth $400-$800+ extra.

      Developers are always the most important part of a game’s performance.

      1. Is the Developer competent?

      Did the developer incompetently target a low FPS, lazily port over from a console (targeting 30 or 60), break Game Programming 101 by tying physics to framerate (worse-than-a-newbie Bethesda), or incompetently create some horribly performing UI (ex. SWTOR gains 10-20 fps by turning off the UI, because it is such a CPU drain)?

      Is the game CPU or GPU bound? The game should never be CPU bound (modern CPUs are blazing fast, and thus gamedev 101 is to make sure your game performs well). SWTOR, EverQuest 2 & SupCom are great examples of games that were CPU bound. They should not be.

  2. Here is footage captured from my computer, which is running 2 GTX 1080 Tis in SLI mode. It is Tom Clancy’s Ghost Recon Wildlands captured in 4K UHD 2160p at 60 FPS. I just got it uploaded, so it may not be processed into 2160p by YouTube yet, but here it is.

    Even with the killer rig I’m running, the game is VERY demanding and it taxes even my system. Nonetheless, I average between 80 and 100 FPS and have never dipped below 60. I think it could be optimized for SLI better.

    • If they are throttling due to high temps, then the results are nearly worthless. It tells you nothing.

      It’s strange that a website which benchmarks expensive hardware with multiple monitor setups & high end systems wouldn’t use an open case & GPU risers.

      I have two Zotac Extreme Core 1080 Tis in SLI in a Thermaltake Core P5 open case, both cards extended out, with a Zotac HB bridge with 2 slots open (so there is actual airflow for GPU1).

      In Witcher 3, with Hairworks on for everything, and every setting maxed, with DSR x4 resolution, I get 140fps average at the starting area, only dipping to 120 for a brief second once or twice sprinting around outside. I’m sure there are more performance-intensive locations in that game, but the fact that I’m running at DSR x4 @ 2560×1440 @ 130-140 avg fps & 110-120 lowest ever (and rarely that low) is pretty damn fantastic. I could overclock my cards too, but I don’t like the fan speed above 70% while playing, so I haven’t tested it.

    • With the current gen of monitors you don’t really need FPS above 60 at 4K, but you should take a look at the monitors coming in 2017: 4K HDR at 144Hz. You really need the scaling; otherwise nothing would be able to push the limits of these new monitors.

  3. I run two reference 980s (not Tis) and get very close to these numbers at 4K resolution. We’re talking maybe a 10fps difference. I’m running an older i7-4960X, and I bought the Nvidia HB 2-slot bridge, even though Maxwell supposedly doesn’t utilise it. My cards are also zero spacing & run 80-82 degrees Celsius. Imo, not worth $1400 atm.

    • Not to sound like a dick, but here it comes…… if you think that a 980 even in SLI is only a 10fps difference, you are talking out of your ass. 980s run roughly 50% slower than a 1080 Ti at 4K in ANY game….. you would need to run the 980s in SLI to get FPS comparable to 1 card. Good job using the games where SLI does not scale as your benchmark for value.

      • You are correct. It’s obvious 1080 Tis will blow 980s out of the water. You cannot run any of these titles with only one 980. Maybe if you really dumb down the graphics, but then what’s the point, might as well stick to 1080p resolution. My point is, when I SLI 980s, I have more scalability in these titles than 1080 Tis in SLI. This is evidence of that. My performance goes up, not parallel, or down. As of right now, you spend an additional $699 for bragging rights. That’s it. In the real world, cost vs performance is just plain stupid. Idk how many times you’ve spent hard-earned cash on the latest & greatest product, plugged it in, and its performance wasn’t what you expected. There is “On paper” & “The Real World”. A lot of hype. If developers find it profitable to spend the time to optimize SLI, then, duh…. I would jump on the bandwagon.

        • It’s called drivers. They get updated. Articles like this run once, somehow never get updated, and become gospel forever. 1080Ti SLI is handling 4K, max settings at 60fps+ in everything listed here at this point. I’ve personally tested probably 80% of this list.

          Stick with 1080p if you want, but that’s a completely different and unrelated point. Relative scaling percentage between generations is even more pointless of a metric.

          Fact is, if I turn off one of my 1080 Tis I do *not* maintain 60fps at 4K ultra, and if I run both I exceed 60fps. Lowering res, settings, “value”, whatever isn’t the point. Not near 60fps vs over 60fps, apples to apples. That’s the point.

          • Good! I’m happy to hear NVidia has made driver improvements to SLI. I only wish game developers worked with hardware manufacturers in a timely fashion. How long has it been? Since my last response, I’ve replaced my SLI 980s with one EVGA 1080 Ti FTW3. It’s 2000MHz out of the box! I’ll admit the one 1080 Ti outperforms 980 SLI, and most importantly without SLI issues. I will never buy a second card to SLI again. I’m just not that guy. I spend too much of my day diagnosing & repairing vehicles to then come home and diagnose and tinker with my recreational equipment. Some folks need 1080 Ti SLI, or their soul will wither. To me, it’s a huge waste. It runs new titles in 4K @ 60fps, and my Oculus Rift does great. I believe it’s an exaggeration just how much improvement a second 1080 Ti gives vs the price. I also believe it is that way by design, with drivers, development, and future model sales.

          • Are you still running 1080ti sli? Have there been any improvements due to driver updates yet or is the scaling still horrible?

      • The older Maxwell architecture GPUs obviously aren’t as powerful on their own, but they scale much better in SLI than Pascal does; they even natively support 3- and 4-way. Nvidia has been trying to phase out SLI, I think purely to entice people to buy a brand new card when they upgrade as opposed to connecting a second one. If you are going for a multi-GPU setup, you’re better off playing team red, because CrossFire is used in the PS4 Pro (maybe the Xbox Scorpio as well), so obviously they have to continue to support it. I’m not expecting the upcoming Vega to match a 1080 Ti, but I think it will scale much better in CrossFire than the 1080 Ti does in SLI.

        • Welp, Maxwell had a wall around 1500MHz, I believe, while Pascal is about 2100MHz. Out of the box, most 1080 Tis do 1800MHz, and overclocking makes them able to hit 2100MHz. The 980 ran about 1250MHz out of the box and could overclock to 1500-ish MHz, so they are pretty much on par if you consider overclocking headroom.

  4. I understand it’s probably too time consuming for your test – but you should put forth a little more effort to find a multi-GPU profile that is better optimized for each game. You can see gains of over 80% on Mankind Divided if properly configured, not 29% as the default is (at 4K). This looks like test results from a multi-GPU setup by a person who has no clue what they’re doing.
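Several comments above debate what “CPU bound” actually means. The conventional way to reason about it is a simple per-frame timing model: each frame costs some CPU time (game logic, draw-call submission) and some GPU time (rendering), and whichever is slower caps the frame rate. A minimal sketch of that model (all timings below are hypothetical):

```python
# Hypothetical per-frame timing model for CPU- vs GPU-bound frame rates.
# Assumes the CPU and GPU pipeline successive frames in parallel, so the
# slower of the two stages sets the overall frame rate.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frames per second when the slower stage limits throughput."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# GPU bound: a faster GPU (or a second card) raises FPS.
print(fps(cpu_ms=8.0, gpu_ms=16.0))   # 62.5 FPS, limited by the GPU
print(fps(cpu_ms=8.0, gpu_ms=10.0))   # 100.0 FPS after a GPU upgrade

# CPU bound: the same GPU upgrade changes nothing.
print(fps(cpu_ms=20.0, gpu_ms=16.0))  # 50.0 FPS, limited by the CPU
print(fps(cpu_ms=20.0, gpu_ms=10.0))  # still 50.0 FPS
```

In this model a GPU upgrade only raises FPS while the GPU is the larger term, which is consistent with SLI gains showing up most strongly at 4K and shrinking at lower resolutions, where per-frame CPU cost starts to dominate.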