AMD 6800 series thread (Big Navi)
#1
This launch from AMD was way more than I expected. For AMD to catch up to nVidia in terms of rasterized game performance is nothing short of astounding. It's going to take a few years for AMD to truly be back in the game, though. They need an answer to DLSS, and they need to fix the ray tracing performance. Apparently AMD is working on their own kind of DLSS; we shall see how it turns out.

I would still go with nVidia this round unless I was a 1440p or lower gamer.

I think AMD is going to have some killer low-end/midrange cards based on these products.

The real shocker for me this round is that the AMD cards actually run cooler than the nVidia cards. For a very long time now the nVidia based cards have typically run cooler than the AMD cards.

This is overall a really amazing launch for AMD and I'm happy to see them back in the game. The new consoles will probably be a big hit/win for them also.

It's nice to see some competition for Intel and nVidia. I was disappointed in the pricing for the nVidia 2000 series cards. The current gen is pretty compelling if you're in the market for a card for 4k gaming.
#2
Well summed up. I got a late RX 6800 XT from AMD, and a Red Devil 6800 XT is also arriving tomorrow for an in-depth, more traditional review later this week covering workstation apps, 35 games including ray tracing at 3 resolutions, and overclocking - plus another VR follow-up using the 3070.

From the 9 pancake games that I benchmarked, AMD has mostly caught up to Nvidia in rasterized performance. But not yet in ray tracing, as this is their first gen - and AMD is already talking about Navi 3. So we'll also see a DLSS equivalent of some kind that will progress as AMD works on it.

Also, AMD fell short in VR - they win only 4 out of 15 VR games tested vs the RTX 3080, and their frame delivery is not as smooth in some games.

https://babeltechreviews.com/vr-wars-the...nchmarked/
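For anyone wondering what "not as smooth" means in numbers, here is a minimal sketch of the kind of frametime math involved - assuming a hypothetical CSV log with one frametime in milliseconds per row under a "frametime_ms" column (the file name and column name are just placeholders for illustration):

Code:
import csv
import statistics

def frametime_stats(path):
    # Assumed log format: one frametime in milliseconds per row, "frametime_ms" column.
    with open(path, newline="") as f:
        times = sorted(float(row["frametime_ms"]) for row in csv.DictReader(f))
    avg_fps = 1000.0 / statistics.mean(times)
    # 99th-percentile frametime: the occasional slow frames that make delivery feel rough
    p99 = times[min(len(times) - 1, int(len(times) * 0.99))]
    return {"avg_fps": round(avg_fps, 1),
            "p99_frametime_ms": round(p99, 2),
            "1%_low_fps": round(1000.0 / p99, 1)}

print(frametime_stats("6800xt_run.csv"))

Two cards with the same average FPS can feel very different if one has a fatter tail of slow frames - and that tail matters even more in VR, where a missed frametime target means a dropped or reprojected frame.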

Of course, these are the launch drivers and AMD hopefully will be able to address VR performance and smoothness in future drivers. But overall, the 6800 XT is a very nice card and I can't find any flaws with the reference version. It's a large step up over my 5700 XT.

BTW, record traffic crashed BTR's host server - within an hour of my VR review going live, it was being discussed simultaneously on 9 Reddit subs (the big ones - r/hardware, r/nvidia, r/amd, r/valveindex, r/virtualreality, and others). I never would have dreamed, starting out with VR four years ago, that I could post a very popular video card launch review that led with VR performance. And it has been much more successful than my traditional review.
#3
Congrats, apoppin, I'm really happy for you. I was wondering why BTR was down yesterday and I had thoughts of emailing you about it. It's nice to see things turning out so nicely for you with BTR.

And yeah, AMD fell short in VR, but not by much as far as I could tell - they appeared to be within 15% of the nVidia card. It reminded me of the 4k rasterized gaming situation with the two cards. It's nice to see AMD at least in the same ballpark as nVidia. Things got really bad for a while there; I honestly didn't think AMD would ever catch up with nVidia again.
#4
I shouldn't say this, but ... I told you not to count AMD out. Lisa Su is the equal of her cousin Jensen, and probably even more motivated to catch up to and then surpass Nvidia. And I am sure Jensen wants to stay ahead of AMD graphics. I love these kinds of family quarrels.

And I will say it again now - don't count Intel out either. They may produce a solid CPU to beat AMD again in a few years or less, and fix their process issues. So this competition is good. With the M1, Apple has ended its dependence on any other chip maker, which is awesome for them and for the way they integrate their software. BUT don't count ARM out either - with their pending acquisition by Nvidia, they could see a massive leap in performance using more GPU-like functions.

So it's all good and exciting for the near future in hardware and software. And it's good to see the consoles also now pushing graphics forward with real-time ray tracing.

BTR is doing very well and will continue to focus on pancake games and PC VR gaming. I think it's official now - you video gamers know what we VR enthusiasts call traditional PC gaming: "pancake gaming".
#5
nVidia faces a serious uphill battle trying to compete with Apple in CPU design. nVidia appears to have largely dropped development on their Tegra line of products. It will indeed be interesting to see what they can come up with once they regroup and leverage their pending Arm purchase.

Apple has been consistently developing the best ARM based CPUs for a very long time now. Every single iPhone generation has had the fastest smartphone CPU on the market, often by a very large margin.

It's great to see competition, but Apple is going to have what looks like a decent 2-3 year run at things where they might take a pretty decent dent out of the PC pie.
#6
Well, I will just post what I posted at Ars Technica and tack a little bit on the end to rile Jensen up. Basically, though, the RT performance gap vs Nvidia doesn't matter, thanks to the consoles (games will be developed for console first and PC second, and this will dictate their basic RT needs).

Quote:Well, I see it a little differently. This time around, the consoles also have ray tracing capability, and they will set the standard developers code for (sponsored efforts on PC notwithstanding) until the PS5/Series X gets replaced - so 2080 Super-ish performance or better should be adequate on PC to play upcoming games.

Hardware-accelerated ray tracing itself is still in its infancy despite Turing being over two years old now. I expect PC hardware will get more performant with newer GPUs, but the consoles and their eventual replacements will drive what developers build their games around - unless, for some unknown reason, the new consoles are a massive market failure (which I do not see happening).

Quote:I don't think all of us bought into the hype; I know I sure as hell didn't. At least it isn't a situation where the industry has to downgrade a bunch of stuff it was planning, like last gen (think Witcher 3, Watch Dogs, AC Unity, etc., and the downgrades vs what was initially promised due to how badly Jaguar performance sucked).

Regardless, what was delivered in silicon for the consoles is what developers will be targeting in games for the next 5+ years.

Both quotes are from the AMD 6000 review thread there if curious.

Now for the riling-up-Jensen bit. Ray tracing is still in its infancy for both players; it has a long way to go before it becomes mainstream. Turing was NV30, Ampere is NV35 (the infamous GeForce FX series, for those unaware). Nvidia is fortunate that AMD currently has no R300 to counter with. RDNA2 is good, but it is more like R200 (the Radeon 8000 series) answering NV2x (the GeForce 3 series). NV40 is currently nowhere in sight.
#7
I hate to disappoint you, but Jensen doesn't read what we post here. I have reason to believe he has read only one of my reviews (the AMID Evil review - and he evidently liked it). He is a pretty busy guy, and he probably doesn't have a lot of time to read social media posts like he did back in university.

Anyway, ray tracing is going to be in every major AAA game - even if only as a checkbox feature. Nvidia popularized it, and it exceeded their expectations. AMD, MS, and Intel have adopted it, and it's the new buzzword for "amazing" graphics - even if you need a magnifying glass to see it when the action is paused.

The consoles will use it also - at low RT settings, but just enough for people to say they can tell the difference. And here is where the PC will differentiate itself as a platform from the other two: it will allow for more ray tracing. AMD, Nvidia, MS, and Intel will all be happy that their shiny, expensive new HW toys are "ray tracing ready", the game devs will jump on the RT gravy train, and we consumers will finally get a little more eye candy instead of raw FPS for a lot more money. And ray tracing will head into VR also.

What I want to see is those Tensor cores used for NPC and game AI enhancements. I want to *talk* to NPCs instead of picking 'A', 'B', or 'C' for my dialogue "choices" in a RPG - and have them reply to *my* questions.
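To make that concrete, here is a toy sketch of free-text NPC dialogue using an off-the-shelf language model (Hugging Face's transformers with GPT-2; the blacksmith persona, prompt format, and player line are made up for illustration):

Code:
# Toy free-text NPC dialogue: the player types anything, the model answers in character.
# Requires: pip install transformers torch
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

# Hypothetical persona - a real game would ship something far more detailed.
PERSONA = ("You are Garrick, a gruff blacksmith in a fantasy RPG. "
           "Answer the traveler briefly and in character.\n")

def npc_reply(player_line):
    prompt = PERSONA + "Traveler: " + player_line + "\nGarrick:"
    out = generator(prompt, max_new_tokens=40, do_sample=True, temperature=0.8)
    reply = out[0]["generated_text"][len(prompt):]
    return reply.split("\n")[0].strip()  # keep only Garrick's first line

print(npc_reply("Where can I find the dragon's lair?"))

GPT-2 will wander off-script, of course - the point is just that the plumbing for free-text dialogue already exists, and the Tensor cores would be there to run a much better model locally while the game renders.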

Anyway, although I could read the tea leaves, the AMD fans were holding on to the vain hope that the AMD launch would somehow be "better" than Nvidia's. Well, they didn't disappoint: AMD now holds the paper launch crown, having won it fair and square from Nvidia and Intel with the 6800 launch. Reference versions sold out in seconds, AIB cards are priced $100 to $200 (plus) above reference MSRP - and we hear the reference versions have been discontinued.

Did MLID make a video decrying what just happened, or was their hate reserved only for Nvidia?

I am glad BTR is a review site, and hardware "politics" mostly stays out of my reviews other than to report what was/is/will be happening. I do put my personal feelings here, and I don't like the consumer-unfriendly choices we have seen recently. At any rate, the 6800 series - the hardware and the software - is solid and definitely competitive with Nvidia for rasterized games, although it falls short in ray tracing and in VR. It may be drivers, but we will have to wait and see.

But the cards offer a viable alternative for the vast majority of gamers - if only they could be found in stock at reasonable prices. It isn't as bad as during the mining craze, but it's still bad, and it will be months before we see any really good stock of the new cards.
#8
MLID has criticized AMD just as much for their paper launch as he did nvidia.

AMD's predicament is a lot more understandable than nvidia's, however. Every man and his dog is fighting for a share of TSMC's 7nm process, and there is only so much capacity to go around; within that allocation AMD has to produce Ryzen CPUs, console APUs, and also their new graphics cards. Nvidia only has to produce GPUs, at a different foundry on a different node that is all but barren of production competition compared to TSMC 7nm.

Apparently nvidia has been very busy selling Ampere to coin miners while they let gamers dangle in the breeze...
#9
What coin miners? Ampere sucks at mining, as does Big Navi. The best mining cards are still Vegas, and it is no longer profitable to mine cryptocurrency using video cards.
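To put rough numbers on "not profitable", the back-of-envelope math looks like this - every figure below is a placeholder assumption for illustration, not real market data:

Code:
# Back-of-envelope GPU mining profitability. Every figure is a placeholder
# assumption - plug in real numbers for your card, coin, and electricity rate.
HASHRATE_MH = 60.0          # card hashrate in MH/s (assumed)
COIN_PER_MH_DAY = 0.00002   # coin earned per MH/s per day (assumed pool payout)
COIN_PRICE_USD = 500.0      # coin price in USD (assumed)
POWER_W = 220.0             # card power draw while mining, in watts (assumed)
KWH_USD = 0.12              # electricity price per kWh (assumed)

revenue = HASHRATE_MH * COIN_PER_MH_DAY * COIN_PRICE_USD
power_cost = (POWER_W / 1000.0) * 24 * KWH_USD
print(f"daily revenue ${revenue:.2f} - power ${power_cost:.2f} = net ${revenue - power_cost:.2f}")

With those placeholder numbers the card is roughly break-even on electricity alone, before you even count what the card cost to buy - which is the whole point.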

It appears that Nvidia may not be too happy with Samsung's yields and it looks like TSMC may get even more busy. Neither AMD, Nvidia, nor Intel has been particularly consumer friendly with their recent "low supply/high demand" launches.
#10
https://www.tweaktown.com/news/76468/can...index.html
Quote:Yeah, it's going to be months before you can buy a new GeForce RTX 30 series card -- and that's directly from NVIDIA. But now we're hearing why there are so few cards -- $175,000,000 worth of them were reportedly sold to crypto miners.

In a new report from Barron's, it appears that NVIDIA made around $175 million selling Ampere GPUs to crypto miners. Ethereum changes are coming that will keep miners from using older GPUs, so many of them are upgrading right now -- and picking up whatever Ampere GPUs they can in the process.

