Intro
15 Premium VR Oculus Rift benchmarks
BTR benchmarked 10 VR games in late August, and we concluded that AMD had delivered on its promise of entry-level VR with Polaris, but that the RX Vega 64 needed serious driver improvement before its performance could be called “premium”. Four months later, we now have another premium AMD VR card to test – the PowerColor Red Devil RX Vega 56 – along with our liquid-cooled edition of the RX Vega 64, AMD’s top card. This time, we are benchmarking 15 VR games on the Oculus Rift, including Fallout 4 VR, using FCAT VR, pitting the RX Vega 56 and 64 against the GTX 1070 Ti, the GTX 1080, and the GTX 1080 Ti.
Over the past year, we have played more than 50 Oculus Rift VR games using midrange and high-end NVIDIA and AMD video cards. Since posting our original VR evaluation last January, we have benchmarked 6 VR games in our FCAT VR follow-up, followed by 3 more VR games. We have favorably compared FCAT VR with our own video benchmarks, which use a camera to capture images directly from a Rift HMD lens. For BTR’s VR testing methodology, please refer to this evaluation. Currently, we are benching 15 VR games, and we will continue to expand our VR benchmarking suite for 2018.
We are going to test 15 VR games using the GTX 1080 Ti FE, the GTX 1080 FE, the GTX 1070 Ti FE, a PowerColor Red Devil RX Vega 56, and a Gigabyte RX Vega 64 liquid cooled edition on a Core i7-8700K with all 6 cores turboing to 4.6GHz, an EVGA Z370 FTW motherboard, and 16GB of HyperX DDR4 at 3333MHz, running Windows 10 64-bit Home Edition. Here are the fifteen VR games we are benchmarking:
- Alice VR
- Batman VR
- Battlezone
- Chronos
- DiRT: Rally
- EVE: Valkyrie
- Fallout 4
- Landfall
- The Mage’s Tale
- Obduction
- Project CARS 2
- Robinson: The Journey
- Serious Sam: The Last Hope
- The Unspoken
- The Vanishing of Ethan Carter
Until FCAT VR was released in March, there was no universally acknowledged way to accurately benchmark the Oculus Rift, as there were no SDK logging tools available. To compound the difficulties of benchmarking the Rift, there are additional complexities because of the way it uses a type of frame reprojection called Asynchronous Spacewarp (ASW) to keep framerates steady at either 90 FPS or 45 FPS. It is important to be aware of VR performance, since poorly delivered frames will make a VR experience quite unpleasant, and the user can even become VR sick.
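To make that idea concrete, here is a minimal, purely conceptual Python sketch of how a log of per-frame render times might be classified into fully delivered, synthesized (reprojected), and dropped frames. The function name, thresholds, and sample data are our own illustrative assumptions; this is not Oculus runtime or FCAT VR code.

```python
# Conceptual sketch only: how ASW-style frame delivery is often described.
# Thresholds, names, and data below are illustrative assumptions, not the
# Oculus runtime's or FCAT VR's actual implementation.

REFRESH_HZ = 90
FRAME_BUDGET_MS = 1000.0 / REFRESH_HZ  # ~11.1 ms per refresh at 90 Hz

def classify_frame(app_frametime_ms, asw_active=True):
    """Classify one frame as delivered, synthesized, or dropped."""
    if app_frametime_ms <= FRAME_BUDGET_MS:
        return "delivered"      # the app rendered a new frame in time (90 FPS)
    if asw_active and app_frametime_ms <= 2 * FRAME_BUDGET_MS:
        return "synthesized"    # a warped frame is extrapolated; the app runs at ~45 FPS
    return "dropped"            # nothing new or warped was ready for this refresh

# Hypothetical frametimes in milliseconds from a short capture
frametimes = [10.8, 11.0, 16.5, 21.9, 11.1, 30.2]
summary = {}
for ft in frametimes:
    kind = classify_frame(ft)
    summary[kind] = summary.get(kind, 0) + 1

print(summary)  # {'delivered': 3, 'synthesized': 2, 'dropped': 1}
```

This distinction is why a card that cannot hold 90 FPS in a title is typically pinned at 45 FPS with synthesized frames, and why per-frame analysis is more meaningful for VR than a single average framerate.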

It is very important to understand how NVIDIA’s VRWorks and AMD’s LiquidVR each work to deliver a premium VR experience, and it is also important to understand how we can accurately benchmark VR as explained here. And before we benchmark our 15 VR games, let’s take a look at our Test Configuration on the next page.
The review looks dodgy because the reviewers disabled AMD Surface Format Optimization, disabled AMD Tessellation Mode, set Texture Filtering Quality to High and set Vega memory frequency to 800MHz (the minimum).
Texture filtering is set to High for ALL cards including GeForce and Radeon. Tessellation mode is tested equally for ALL cards – any other way gives an unfair advantage to AMD cards. And the memory was set at the default 945MHz for the Liquid Cooled 64 edition and the default 800MHz for the Vega 56 edition.
No, I don’t buy your arguments. “Oh, I am going to use NVidia-like settings on my AMD card” said no AMD user ever. Your settings for AMD cards are unrealistic.
Red Team damage control is obvious here. We know about AMD’s stealth marketing.
We don’t favor one card over another at BTR and the playing field is level.
Funny you would say that, since all the NVidia cards in your review are “supplied by NVIDIA”. Anyway, how do I get my money from “AMD’s stealth marketing” budget?
And the Vega 56 card is supplied by PowerColor. Next week, we are benching a Gigabyte GTX 1070 Ti which we purchased ourselves.
We disclose everything at BTR. However, AMD employees are allowed to post freely on Social Media without a requirement to identify themselves by their company. And we have never seen a Red Team member identify themselves when they shill for AMD.
I would suggest leaving these settings on their DEFAULT configuration in future tests!
BECAUSE that is how 99% of users are going to play these titles in VR with their hardware.
Many Rift/Vive enthusiasts who have premium video cards are quite technically savvy. And if you are comparing AMD to Nvidia cards, the tests need to be run with identical settings for them to have any meaning.
I’ve gotta agree with the others: you shouldn’t start playing around with the settings for any reason unless you’re doing an overclocking review. I’ve just finished my Ryzen Vega build and I haven’t played around with settings like that, never have. If you want to start overclocking and tweaking settings, fine, but this should have been an out-of-the-box test. This reminds me of the RE7 review where the Grenada cards took advantage of certain tech which allowed them to outperform everything, including Pascal and Fiji. The reviewer then found the cause and turned it off to redo all the tests with the Grenada cards nerfed.
BTR does not “play around” with settings. Rather, we level the playing field between all cards by giving no card an advantage with “optimizations”. Settings are identical for tessellation, quality, and performance across all 5 cards tested.
Should we have left Vega on its out-of-the-box “balanced” setting? Performance would then be lower for the AMD cards.
If those are its out-of-the-box settings, then yes, you should have left it as is.
What part of “level the playing field” do you not understand?
Hello. A level playing field is leaving everything as it would be straight out of the box, because that’s what makes testing reliable. When a new game releases and you want to see how the cards perform with it, you should turn every setting on and max everything out so you get a reliable baseline. Then, if you want to start messing around with various settings or even overclocking, do the tests again knowing you have the baseline to compare them to. When Resident Evil 7 released and one review site found the 390 & 390X doing better than they usually did, they messed around with the settings until they found it was the shadow cache setting the Grenada cards were taking advantage of, so they turned it off and redid all the tests so everything would sit where you would expect. Turning something off because only one brand is taking advantage of it defeats the purpose of testing, because it’s about finding the best way for an architecture to shine. Does anyone ask for GPU Boost 3.0 to be turned off so the Pascal architecture can’t overclock the GPU on the fly? Of course not, because that’s the way it comes out of the box. So why is it okay for one to play on its strengths and not the other?
No it isn’t. If you leave settings at default, you will have to deal with Vsync in some games and then your comparison is ridiculous.
We turn off AMD cheats or optimizations just as we turn off NVIDIA’s to make sure everything is completely equal between the cards tested.
NVIDIA, AMD, PowerColor, VisionTek, EVGA, ASUS, Gigabyte, PNY and other manufacturers have approved our method of benching. We have been benching this way for 10 years and we are not going to change the way we do it. We do not do things exactly like “other” tech sites but offer an alternative way that is very useful for testing the differences between the way cards perform.
You’re just being silly now; obviously V-sync gets turned off, that shouldn’t even have needed explaining. What would be the point of running performance tests with an artificial FPS limiter in play? Digital Foundry used to do that. I like how you call them AMD cheats. Maybe you should look up the official boost speeds of the Nvidia cards from the board partners and lock them to that as well; after all, if you’re going to limit anyone, shouldn’t it be everyone to ensure a fair playing field? No, of course not, because that would be silly.
I’m sorry that you don’t understand BTR’s benchmarking methods. Our results are understood, valid, and well-respected by the vast majority of our readers – including the vendors who send us their expensive hardware for review. We are not going to change our benching methodology of nearly ten years just to suit a few AMD advocates.
AMD approved the way we benchmark years ago, and we have never varied from it, in order to ensure complete fairness when comparing video cards.
I did not single out AMD as you falsely claim. “We turn off AMD cheats or optimizations just as we turn off NVIDIA’s”.
I’m sorry that I’ve offended you, as it wasn’t my intent. You’ve incorrectly assumed the reasons why I said what I did; I say what I think regardless of whether it negatively impacts a brand, any brand.
We are not offended, and intent does not matter. We choose an alternate method of benching that does not negatively impact any brand, and it is acceptable to both NVIDIA and AMD, as well as every brand of video card vendor that sends them to us for review. Thank you for your opinion.
I agree. Why turn off features that are defaults for a card? Let them run a game the way they’re meant to be run.
did you just seriously accuse someone of being a shill because he called you out for using unrealistic settings on AMD cards?
do you benchmark cards with GameWorks features off, as they are on AMD?
do you benchmark games at x8 tessellation, as they are on NVIDIA?
that is some next level ignorance right there
I wrote that we are aware of AMD’s stealth marketing and the Red Team’s efforts to undermine independent tech reviews.
We benchmarked all of our VR games with Nvidia optimizations off.
We benchmark games with the *default* tessellation that the developer implements in the game. We do not limit tessellation for any card in an attempt to make one card perform better than another.