Overclocking the XFX Fury X vs. overclocking the GTX 980 Ti

 

Overclocking the GTX 980 Ti 

Overclocking the GTX 980 Ti is just as easy as overclocking the GeForce TITAN X, the GTX 980, or any other Maxwell architecture-based card. Our reference GTX 980 Ti accepted a stable offset of +230MHz core/+500MHz memory over its 1000MHz base clock. The reference card's Boost clock peaked at 1418MHz and regularly hit 1408MHz.
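For context, here is a minimal arithmetic sketch of what that offset works out to, using only the clocks quoted above; the variable names are ours, and GPU Boost behavior is simplified since the actual Boost clock varies with temperature and power headroom.

```python
# Simple arithmetic on the GTX 980 Ti overclock described above.
# Numbers are from this review; GPU Boost adds its own variable bin on top,
# so the observed 1418MHz peak is measured, not derived.

BASE_CLOCK = 1000      # MHz, reference GTX 980 Ti base clock
CORE_OFFSET = 230      # MHz, stable offset applied in PrecisionX
OBSERVED_PEAK = 1418   # MHz, highest Boost clock logged during testing

overclocked_base = BASE_CLOCK + CORE_OFFSET
print(f"Base clock after offset: {overclocked_base} MHz "
      f"({CORE_OFFSET / BASE_CLOCK:.1%} over stock base)")
print(f"Observed Boost peak: {OBSERVED_PEAK} MHz "
      f"({(OBSERVED_PEAK - BASE_CLOCK) / BASE_CLOCK:.1%} over stock base)")
```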

We did not adjust any of our cards' fan profiles (except for the 290Xes, whose fans were allowed to spin up to 100% to prevent throttling), nor did we adjust voltage for our benchmark runs. We did, however, push the temperature and power target controls to maximum, since we tested in summer-like (warm, 78F-80F) conditions. We also made sure to warm up all of our cards before benching.

We used the latest version of EVGA PrecisionX for GPU overclocking. You can download PrecisionX for free directly from www.evga.com/precision, and it is also available for free on Steam.

Moving the power slider up to 110% and the temperature target up to the maximum of 91C in EVGA's PrecisionX showed very little performance gain over simply setting the temperature limiter to 85C. However, we always run our benchmarks with both Nvidia's power target and AMD's PowerTune limit set to maximum, although we never touch the voltage or fan profiles (except for the 290X fan profile, which is allowed to spin up to 100% to prevent throttling).

The reference GTX 980 Ti's fan became noticeable above 60% speed and much more so at 75%. Our reference GTX 980 Ti appears to let its fan spin up a bit higher than the TITAN X's does, producing a little more noise under full load.

Overclocking the XFX Fury X

We used the latest version of MSI's Afterburner, which exposes HBM overclocking. Overclocking the Fury X is problematic, and there appears to be almost no headroom: the 1050MHz stock clock appears to be very close to its maximum even with the stock watercooling. After many hours of experimenting with our XFX Fury X, we found we could manage only a +45MHz offset to the core, for a maximum of 1095MHz. Adding +5MHz more to reach 1100MHz caused instability in many games. We also found that the heat from the radiator increased significantly with this mini-overclock.

Overclocking the HBM was just as disappointing. We finally settled on a +30MHz overclock after finding that, although we could push it further to +50MHz, there was no performance improvement.
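As a rough sketch of how small that headroom is, the arithmetic below uses the clocks quoted above plus the Fury X's published 500MHz HBM clock and 4096-bit memory bus; those two figures are AMD's specifications, not measurements from our testing.

```python
# Rough headroom arithmetic for the Fury X overclock described above.
# The 500MHz HBM clock and 4096-bit bus are AMD's published Fury X specs;
# the core clocks are the ones reported in this review.

CORE_STOCK = 1050   # MHz, stock core clock
CORE_OC = 1095      # MHz, best stable core clock we reached
HBM_STOCK = 500     # MHz (AMD spec)
HBM_OC = 530        # MHz, the +30MHz offset we settled on
BUS_WIDTH_BITS = 4096

def bandwidth_gbps(clock_mhz):
    # HBM is double data rate: bits per transfer * 2 transfers/cycle * cycles/s -> bytes/s
    return BUS_WIDTH_BITS * 2 * clock_mhz * 1e6 / 8 / 1e9

print(f"Core headroom: {(CORE_OC - CORE_STOCK) / CORE_STOCK:.1%}")
print(f"Memory bandwidth: {bandwidth_gbps(HBM_STOCK):.0f} -> {bandwidth_gbps(HBM_OC):.0f} GB/s "
      f"({(HBM_OC - HBM_STOCK) / HBM_STOCK:.1%})")
```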

The Fury X runs cool, but it requires watercooling to do it. Voltage tools were demonstrated over a month ago yet have still not been made publicly available, and even so, the Fury X is not a great overclocker, as it appears to be already pushed to its edge.

Let's head to the performance charts and graphs to see how the GTX 980 Ti compares with the XFX Fury X and with the other top cards of autumn 2015.

3 COMMENTS

  1. If you don't OC video cards, the Fury X looks okay, but since most 980 Tis will do >1400MHz boost even without additional voltage, not OCing one would almost be a crime against humanity. I get it, with DX12 the Fury X looks better, but it still doesn't match an overclocked 980 Ti (which most reviewers are refusing to include in recent graphs for "reasons"), and you'd be stuck with a noisy pump, tons of additional heat (maybe okay in the winter, to help warm your feet), and 50% less framebuffer. It just isn't a sound tradeoff.

  2. This article has been biased since page one, paragraph one. Half the information has no confirming reports; it's mainly a "trust us" article. I'm not really an AMD or an Nvidia fan, both companies are great, but your emphasis on the GTX 980 Ti's pros and the Fury X's cons is very noticeable. Having a stand-off between AMD and Nvidia right now is like conducting a running contest between two men, one of them with a large ball and chain strapped to his legs. I've been reading about DirectX 12 lately, and until it launches, you cannot really compare the two brands. Nearly everyone knows that AMD suffers bottlenecking with DX11, as it favors SCT.

    Personally, I'd pick my "favorite" brand when enough tests have been done comparing the two with DX12, and frankly Nvidia doesn't seem like it has a firm foothold so far… In fact, it doesn't seem to have a step forward at all!
