
Samsung Galaxy Note 20 Ultra vs Galaxy Note 20: What’s the difference?

(Pocket-lint) – Samsung revealed two new models of the Samsung Galaxy Note earlier in 2020 – the Galaxy Note 20 Ultra and the regular Galaxy Note 20.

As fans of the series will recall, in 2019 Samsung offered two sizes of this phone, positioning the Note 10 as the smaller device while pushing the Note 10+ as the larger model. In reality, it was the Note 10+ that was the true successor to the Note crown, while the “normal” model slipped into a smaller and more affordable position.

That gap between the Note 20 and the Ultra model has become wider in 2020. Here’s how they compare.


Design

  • Note 20 Ultra: 164.8 x 77.2 x 8.1mm, 208g, Gorilla Glass Victus back
  • Note 20: 161.6 x 75.2 x 8.3mm, 192g, polycarbonate back

Never has so much been written about design when it comes to two phones in the same family. In the past, Samsung has often offered much the same design between regular and plus models. That changed with the launch of the S20 Ultra – and the Note 20 Ultra is different to the regular Note 20 too. 

While the difference in size is to be expected because the displays are a different size, the design itself is quite different too. The Note 20 Ultra has flattened ends and squared corners, while the Note 20 has softer curved corners. 

The Note 20 also moves to a plastic back – or “glasstic” as Samsung calls it – rather than glass. This is quite a move, considering that Samsung has been using glass for its rear panels for some time. It also means the Note 20 is positioned quite differently to the Note 20 Ultra, with the Ultra being the far more premium model.

Display 

  • Note 20 Ultra: 6.9in, 3088 x 1440 pixels (496ppi), 120Hz
  • Note 20: 6.7in, 2400 x 1080 pixels (393ppi), 60Hz 

While the displays are a different size, there’s a big difference in technology too. The Ultra gets a 6.9-inch AMOLED display with a 120Hz refresh rate and Quad HD+ resolution. It’s pretty much as flagship as you can get.

The Note 20 has the same display as the Note 10 Lite: a 6.7-inch Full HD+ AMOLED at 60Hz, and it’s flat – so it misses Samsung’s signature flagship curved edges.

It’s a pretty big difference, although there will be many who don’t mind the lower resolution or refresh rate. What’s important is that it still offers the S Pen features on a big display, which is a hallmark of the Galaxy Note family.

What is different from previous years is that the Note 20 doesn’t get a smaller display like the 6.3-inch panel the Note 10 offered.

Hardware

  • Note 20 Ultra: Qualcomm SD865 Plus or Exynos 990, 8GB/12GB RAM, 128GB/256GB/512GB storage, 4500mAh
  • Note 20: Qualcomm SD865 Plus or Exynos 990, 8GB RAM, 128GB/256GB storage, 4300mAh

When it comes to the core hardware, we return to some sort of parity between the two Note models. Both are powered either by the Qualcomm Snapdragon 865 Plus or the Exynos 990, obviously using Qualcomm in some regions and Exynos in others as we’ve previously seen from Samsung. 

The Ultra comes with 8GB of RAM in the LTE model and 12GB RAM in the 5G model, while the Note 20 sticks to 8GB in both, reinforcing a different positioning of these phones. Storage options differ depending on LTE or 5G too.

The Note 20 LTE comes in one model with 256GB storage, while the 5G model is offered in 128GB and 256GB options, all region dependent. The Note 20 Ultra comes in 128GB, 256GB and 512GB storage options in the 5G model and 256GB and 512GB storage options in the LTE model, again region dependent. Only the Ultra offers microSD support for storage expansion, like the Note 10+ did.

When it comes to batteries, the Note 20 Ultra has a 4500mAh capacity while the Note 20 has a 4300mAh capacity – yet despite the smaller cell, the Note 20 can match or even better the Ultra’s stamina, because its lower-resolution 60Hz display places lighter demands on the battery.

Cameras

  • Note 20 Ultra
    • Main: 108MP f/1.8
    • Ultra-wide: 12MP f/2.2
    • Zoom: 12MP f/3.0 5x, 50X SpaceZoom
  • Note 20
    • Main: 12MP f/1.8
    • Ultra-wide: 12MP f/2.2
    • Zoom: 64MP f/2.0 3x, 30X SpaceZoom

If you’re a Samsung fan, then the cameras in these respective devices might look familiar. At first glance they are similar to the load-out on the S20 Ultra and S20 models, although the 48-megapixel zoom of the S20 Ultra has been swapped out for a 12-megapixel zoom on the Note 20 Ultra, giving you 50X Space Zoom with 5x optical, rather than the 100X zoom of the S20 Ultra.

The regular Note 20 also gets a respectable camera load-out. It has a system very similar to the Galaxy S20, with a 12-megapixel main sensor with big pixels. It also offers zoom, but only 30X digital – which is 3x optical. It uses the 64-megapixel sensor here to enable the 8K video capture (as it did on the S20), while the Ultra uses the 108-megapixel sensor for 8K. 

Both phones offer the same ultra-wide camera, and they both have the same 10-megapixel front selfie camera.

What’s clear here is this is an area where Samsung doesn’t appear to be dropping the Note 20 too far. Sure, it’s not the same as the Ultra, but increasing the resolution just so you can combine pixels back to 12-megapixels doesn’t automatically make for a better camera – a lot will come down to the computation behind the lens, and that’s pretty much the same on both.


Summing up 

The two Galaxy Note 20 models are radically different this year, with Samsung seemingly aiming to open up a wider gap between these two devices than it did in 2019. That might be a reflection of how the Galaxy Note 10, or the Galaxy Note 10 Lite, was received.

The Note 20 picks up some of what the Note 10 Lite offered but sticks to some of the premium aspects in the core hardware and the camera. This is reflected in the price of the handset somewhat. Even without the top specs, that larger display is much more useful for the S Pen.

The Note 20 Ultra is rather more predictable. It is the true flagship with a high price to match and the best of everything Samsung has to offer. At its heart, that’s what the Galaxy Note should be – but with so many affordable big-screen phones around, we suspect that’s what’s driven Samsung to make the regular Note 20 a little more ordinary.

Writing by Chris Hall. Editing by Britta O’Boyle.


Samsung Galaxy Note 20 Ultra vs Galaxy S20 Ultra vs S20+: Which should you buy?

(Pocket-lint) – Samsung revealed the Galaxy Note 20 Ultra on 5 August 2020, alongside the Note 20, but it’s the Ultra model that has the top specifications, just as the S20 Ultra does for the Galaxy S range.

How does the top Note model compare to the top Galaxy S models though? Should you go with the Galaxy S20 Ultra, Galaxy S20+ or the Galaxy Note 20 Ultra if you’re in the market for the best Samsung has to offer?

Here are the similarities and differences between the Galaxy S20 Ultra, Galaxy S20+ and the Note 20 Ultra to help you decide which is right for you.


Design

  • Note 20 Ultra: 164.8 x 77.2 x 8.1mm, 208g
  • S20 Ultra: 166.9 x 76 x 8mm, 220g
  • S20+: 161.9 x 73.7 x 7.8mm, 186g

The Samsung Galaxy Note 20 Ultra, Galaxy S20 Ultra and Galaxy S20+ all feature metal and glass designs, with curved edges, centralised punch hole cameras at the top of their displays and pronounced camera housings in the top left corner on the rear.

As you might expect, the Note 20 Ultra has a slightly different look to the Galaxy S20 Ultra and the Galaxy S20+. It is squarer in its approach, has a built-in S Pen and it features a more prominent camera system on the rear. Meanwhile, the Galaxy S20 Ultra and S20+ are almost identical, with rounder edges than the Note, but the S20 Ultra has a wider camera housing on the rear.

All devices have microSD slots and USB Type-C charging ports, none has a 3.5mm headphone jack, and they all offer IP68 water and dust resistance. The Galaxy S20 Ultra is the largest and heaviest, followed by the Note 20 Ultra and then the Galaxy S20+.

Display

  • Note 20 Ultra: 6.9-inch, AMOLED, 3088 x 1440 (496ppi), 120Hz
  • S20 Ultra:  6.9-inch, AMOLED, 3200 x 1440 (509ppi), 120Hz
  • S20+: 6.7-inch, AMOLED, 3200 x 1440 (524ppi), 120Hz

The Samsung Galaxy Note 20 Ultra, S20 Ultra and S20+ all have Infinity-O Dynamic AMOLED displays with HDR10+ certification and 120Hz refresh rates, though the Note 20 Ultra is said to be brighter.

The Galaxy Note 20 Ultra and S20 Ultra both have 6.9-inch screens, while the S20+ is a little smaller at 6.7-inches. All have a Quad HD+ resolution, making the S20+ the sharpest in terms of pixels per inch, but in reality the difference is not something the human eye would be able to see easily, if at all.
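
For anyone curious where those pixels-per-inch figures come from, the density is just the diagonal pixel count divided by the screen diagonal. A quick sketch in Python using the figures from the spec list above (the slight mismatch on the Note 20 Ultra comes from the rounded 6.9-inch diagonal):

```python
from math import hypot

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixel density: diagonal pixel count divided by the diagonal length in inches."""
    return hypot(width_px, height_px) / diagonal_in

# Resolutions and diagonals from the spec list above.
displays = {
    "Note 20 Ultra": (3088, 1440, 6.9),
    "S20 Ultra": (3200, 1440, 6.9),
    "S20+": (3200, 1440, 6.7),
}

for name, (w, h, diag) in displays.items():
    print(f"{name}: {ppi(w, h, diag):.0f} ppi")
# Note 20 Ultra: ~494 ppi (the quoted 496ppi implies a diagonal fractionally under 6.9in)
# S20 Ultra: ~509 ppi; S20+: ~524 ppi
```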


Hardware and specifications

  • Note 20 Ultra: SD865 Plus/Exynos 990, 12/8GB RAM, 128/256/512GB storage, S Pen
  • S20 Ultra: SD865/Exynos 990, 16/12GB RAM, 128/256/512GB storage
  • S20+: SD865/Exynos 990, 12/8GB RAM, 128/256/512GB storage

The Samsung Galaxy Note 20 Ultra, S20 Ultra and S20+ all feature an under display fingerprint sensor, 5G and LTE model variants and they all have microSD support for storage expansion. 

The Note 20 Ultra runs on the Qualcomm Snapdragon 865 Plus, or the Exynos 990, while the S20 Ultra and the S20+ run on the Qualcomm Snapdragon 865 or the Exynos 990, region dependent.

In terms of RAM and storage support, the Note 20 Ultra 5G has 12GB with storage options of 128GB, 256GB and 512GB. The 4G model has 8GB with storage options of 256GB and 512GB. Not all variants will be available in all countries though. 

The S20 Ultra meanwhile, offers 16GB of RAM with 512GB storage in its 5G model, while the 4G model has 12GB of RAM and 128GB or 256GB of storage. The S20+ has 12GB of RAM in its 5G model with storage options of 128GB, 256GB and 512GB, like the Note 20 Ultra. The 4G model has 8GB of RAM and 128GB storage.

The Note 20 Ultra also offers S Pen functionality, the ability to connect to DeX wirelessly, and Ultra Wideband technology on board.

Battery

  • Note 20 Ultra: 4500mAh
  • S20 Ultra: 5000mAh
  • S20+: 4500mAh

The Samsung Galaxy Note 20 Ultra and Galaxy S20+ have a 4500mAh battery under their hoods, while the S20 Ultra has a 5000mAh battery capacity. All offer similar overall performance.

All devices offer wireless charging and fast charging. 

Cameras

  • Note 20 Ultra: Triple rear (12MP + 108MP + 12MP), 10MP front 
  • S20 Ultra: Quad rear (12MP + 108MP + 48MP + DepthVision), 40MP front
  • S20+: Quad rear (12MP + 12MP + 64MP + DepthVision), 10MP front

The Samsung Galaxy Note 20 Ultra has a triple rear camera system, comprising a 12-megapixel ultra wide-angle, a 108-megapixel wide-angle and a 12-megapixel telephoto. There is also a laser autofocus sensor on board, while the telephoto offers 5x optical zoom and 50X Space Zoom.

The S20 Ultra, meanwhile, has a 12-megapixel ultra-wide, a 108-megapixel wide-angle and a 48-megapixel telephoto sensor. It also has a DepthVision sensor and is capable of 100X zoom, although the long zoom isn’t really that useful, so it’s understandable why Samsung dropped it for the Note 20 Ultra.

The S20+ has a 12-megapixel ultra-wide, a 12-megapixel wide-angle and a 64-megapixel telephoto, and it offers 30X zoom. The large telephoto sensors on the Galaxy S20 models are really there to support 8K video.

The Note 20 Ultra has a 10-megapixel front camera, as does the S20+, while the S20 Ultra has a 40-megapixel front camera.


Pricing and conclusion

The Samsung Galaxy S20 Ultra started at £1199 or $1399.99 when it first launched. The S20+ started at £999 or $1199.99 when it first launched. The Note 20 Ultra starts at £1179 or $1299, placing it in between the S20 Ultra and the S20+.

On paper, the Galaxy S20 Ultra remains the device with the top hardware specs, offering more RAM and a larger battery capacity than the Note 20 Ultra, but you miss out on the extra power from the SD865 Plus processor (in those regions), the S Pen and the Ultra Wideband technology. If it’s the S Pen you want, then it’s the Note that you choose.

The Galaxy S20+ meanwhile offers plenty of reasons to buy it, including the same battery capacity as the Note 20 Ultra and a similar hardware loadout. It is also a little cheaper if the S Pen doesn’t bother you, and the only area where it really can’t match the S20 Ultra is in zoom performance.

Writing by Britta O’Boyle. Editing by Chris Hall.


Samsung Galaxy S20 FE vs Galaxy S20+: What’s the difference?

(Pocket-lint) – Samsung announced the Galaxy S20 Fan Edition, or Galaxy S20 FE, in September 2020. It’s a lighter take on the Galaxy S20 family, offering many of the important specs, but making a few cuts so it’s a little more affordable.

It goes head-to-head with the Samsung Galaxy S20+, which was our pick of the previous models, and the device that potentially has the greatest appeal. Has it now been undercut?

Let’s take a look at how these phones compare.


Prices and availability

  • Galaxy S20 FE: £599 (4G), £699 (5G)
  • Galaxy S20+: £999 (5G)

Price comparisons are a little tricky given that there are so many different versions of the Galaxy S20+ globally, but the easy version is this: the Galaxy S20 FE is cheaper, no matter which you choose.

Even with discounts from the original launch price, the S20+ is still more expensive than the S20 FE. Not all regions get all models of the S20+ and not all regions get all versions of the S20 FE, but whichever way you cut it, the FE costs less.

Build and dimensions 

  • Samsung Galaxy S20 FE: 159.8 x 74.5 x 8.4mm, 190g
  • Samsung Galaxy S20+: 161.9 x 73.7 x 7.8mm, 186g

The sizes of the S20+ and the S20 FE are surprisingly close. There are only a few millimetres in it, with the FE being slightly shorter – explained by the smaller display – and slightly wider, likely because the display is flat. It’s also a little thicker, not that you’d notice. There are wider bezels on the S20 FE, again most likely due to the flatter display, so it doesn’t look quite as premium as the S20+.

Both of these phones offer IP68 waterproofing, both offer stereo speakers supporting Dolby Atmos and both have a similar camera arrangement on the back of the phone.

The major difference is that the rear of the S20 FE is glasstic – plastic – rather than the glass of the S20+. This might make it more durable, and it might mean it doesn’t feel as premium, but it does have a matte finish, so it’s less likely to gather fingerprints.

The Galaxy S20 FE also comes in a range of colours – blue, red, lavender, mint, white, orange – whereas the Galaxy S20+ is all about the serious grey, black and light blue models.

In reality, there’s very little difference.

Display 

  • Samsung Galaxy S20 FE: 6.5-inch, 120Hz, AMOLED, Full HD+
  • Samsung Galaxy S20+: 6.7-inch, 120Hz, AMOLED, Quad HD+

When it comes to the display, both use the same type of panel, AMOLED in both cases with a punch hole for the front camera. Technically, Samsung says that the S20 FE has a Super AMOLED X2, while the Galaxy S20+ has a Dynamic AMOLED X2.

The real difference is in the resolution. The Samsung Galaxy S20+ offers Quad HD+, that’s 3200 x 1440 pixels (524ppi), while the Galaxy S20 FE offers 2400 x 1080 pixels (404ppi). Technically, the S20+ can render finer detail – but Samsung’s default on the S20+ is Full HD anyway and many people never use the full resolution, so it’s arguably no big loss.

Both phones also offer 120Hz, and that’s going to be something that fans want, so seeing it in the cheaper device is welcome.

As we mentioned above, the display on the Galaxy S20 FE is slightly smaller at 6.5 inches, a small reduction of 0.2 inches over the S20+ which won’t make a huge difference in reality. The S20 FE is also flat, so there are no curves to the edges.

This might actually be a benefit: although curves look good and make it slightly easier to grip a large phone, they can lead to some reduction in touch sensitivity towards the edges. Give us a flat display for gaming any day of the week.


Core hardware and battery

  • Samsung Galaxy S20 FE: Qualcomm Snapdragon 865 (5G), Exynos 990 (4G), 6GB RAM, 128GB storage, 4500mAh
  • Samsung Galaxy S20+: Qualcomm Snapdragon 865 (4G/5G), Exynos 990 (4G/5G), 8GB/12GB RAM, 128GB storage, 4500mAh

When it comes to the core hardware, the story really reveals itself. The headline is that the Galaxy S20 FE 5G is powered by the Qualcomm Snapdragon 865 globally. The 4G version will be Exynos 990, but it won’t be available in all regions (the US, for example).

The Galaxy S20+ is much more complicated: there are 4G and 5G versions of both the Snapdragon 865 and the Exynos 990. In Europe, it’s been the Exynos 990 version that’s been available – so the S20 FE is a chance to get a Qualcomm-powered Samsung device, but make sure you buy the 5G version.

There’s a reduction in RAM to 6GB with 128GB storage. In reality, the reduction in RAM is unlikely to have a big impact on how the phone runs. MicroSD storage expansion is supported on all devices.

They both also have the same battery capacity at 4500mAh. The S20 FE has slightly greater endurance thanks to its slightly lower-spec display, but there isn’t much difference.

Cameras

  • Galaxy S20 FE:
    • Main: 12MP, f/1.8, 1.8µm, OIS
    • Tele: 8MP, f/2.4, 1.0µm, OIS, 3x
    • Ultra-wide: 12MP, f/2.2, 1.12µm
    • Front: 32MP, f/2.2, 0.8µm, FF
  • Galaxy S20+:
    • Main: 12MP, f/1.8, 1.8µm, OIS  
    • Tele: 64MP, f/2.0, 0.8µm, OIS, 3x
    • Ultra-wide: 12MP, f/2.2, 1.4µm
    • DepthVision
    • Front: 10MP, f/2.2, 1.22µm, AF

There’s a lot in common between the Samsung Galaxy S20+ and the Galaxy S20 FE cameras. Broadly they have the same selection, based around the same main camera. That’s a 12-megapixel camera with nice large pixels to absorb lots of light without the nonsense of pixel combining that’s popular elsewhere.

It’s joined by ultra-wide and telephoto cameras, but here the specs are different. Starting with the telephoto, the big switch is from a 64-megapixel sensor to an 8-megapixel sensor. It’s a totally different approach from the hardware, but both offer 3x optical zoom, which has OIS, while both then also offer 10x digital zoom for Samsung’s 30X Space Zoom feature.
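
The Space Zoom numbers quoted across these comparisons are simply the optical (or hybrid) zoom multiplied by a further 10x of digital zoom, which is where the 30X, 50X and 100X figures come from. A minimal sketch of that relationship:

```python
def space_zoom(optical_zoom: float, digital_factor: float = 10.0) -> float:
    """Samsung's headline zoom figure: optical (or hybrid) zoom times the digital multiplier."""
    return optical_zoom * digital_factor

# Optical/hybrid zoom figures quoted in these comparisons.
for phone, optical in [("Galaxy S20 FE / S20+", 3), ("Note 20 Ultra", 5), ("S20 Ultra (hybrid)", 10)]:
    print(f"{phone}: {optical}x -> {space_zoom(optical):.0f}X Space Zoom")
# 3x -> 30X, 5x -> 50X, 10x -> 100X
```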

Why the switch? The 64-megapixel camera on the S20+ also handles 8K video capture, so we suspect the reason for the change is that the S20 FE doesn’t offer 8K capture.

The ultra-wide is also a different camera and, with a switch to a smaller sensor in the S20 FE, it doesn’t quite match the performance of the S20+.

The S20+ has a decent 10-megapixel selfie camera. For some reason, Samsung has moved to a 32-megapixel front camera on the S20 FE. There doesn’t seem to be any logic to this move that we can see, and it’s fixed focus rather than autofocus, so it’s a little weaker.

Finally, the S20+ also has a DepthVision sensor, but we don’t really think it does very much, so it won’t be missed on the S20 FE.


Conclusion 

Given that the Samsung Galaxy S20 FE is the cheaper phone, it has a lot going for it. So what do you actually miss out on? There are some small camera changes, although with the same main camera, the experience is going to be broadly the same.

There are some minor spec changes like less RAM, although that doesn’t have a huge impact in use, while the option for those in Europe to get a Qualcomm device instead of Exynos is likely to prove popular.

There are changes in the display: the flatter display may actually suit some, and again the reduction in resolution is only going to bother some people, making very little difference to things like games or media consumption.

Finally, there’s the plastic back. Sure, you won’t have the most premium finish, but at the same time, you’ll have more cash in your pocket still. Considering this, the Samsung Galaxy S20 FE looks like a win to us, a respectable push back against the rising power of mid-range devices and the antidote to over-specced and over-priced flagships.

Writing by Chris Hall. Editing by Britta O’Boyle.


MSI GeForce RTX 3090 Suprim X Review

Introduction

The MSI GeForce RTX 3090 Suprim X is the company’s new flagship air-cooled graphics card; it also introduces the new Suprim brand extension denoting the highest grade of MSI engineering and performance tuning. Last week, we brought you our review of the RTX 3080 Suprim X and today, we have with us its bigger sibling. Actually, both the RTX 3080 Suprim X and RTX 3090 Suprim X are based on a nearly identical board design, but the silicon underneath—the mighty RTX 3090—and support for NVLink SLI is what’s new. The Suprim X series is positioned a notch above the company’s Gaming X Trio and likely a replacement for the company’s Gaming Z brand, which probably had too many similarities in board design to the Gaming X to warrant a price increase. MSI is also giving its product stack a new class of graphics cards to compete against the likes of the EVGA air-cooled FTW3 Ultra. It’s also taking a crack at NVIDIA’s Founders Edition in the aesthetics department.

With the RTX 30-series “Ampere,” NVIDIA reshaped the upper end of its GeForce GPU family. The RTX 3080 is designed to offer premium 4K UHD gaming with raytracing and is already being referred to as the company’s flagship gaming graphics card. NVIDIA has been extensively comparing the RTX 3080 to the RTX 2080 Ti, which it convincingly beats. The new RTX 3090, on the other hand, is what NVIDIA is positioning as its new “halo” product with comparisons to the $2,500 TITAN RTX, while being $1,000 cheaper, starting at $1,500. Both the RTX 3080 and RTX 3090 share a common piece of silicon, with the RTX 3090 almost maxing it out, while the RTX 3080 is quite cut down.

The GeForce Ampere graphics architecture represents the 2nd generation of the company’s RTX technology, which combines conventional raster 3D graphics with certain real-time raytraced elements, such as lighting, shadows, reflections, ambient-occlusion, and global-illumination, to radically improve realism in games. Processing these in real time requires fixed-function hardware as they’re extremely taxing on programmable shaders. The GeForce Ampere architecture hence combines the new Ampere CUDA core, which can handle concurrent FP32+INT32 math operations, significantly increasing performance over past generations; the new 2nd generation RT core, which in addition to double the ray intersection and BVH performance over Turing RT cores offers new hardware to accelerate raytraced motion blur; and the 3rd generation Tensor core, which leverages the sparsity phenomenon in AI deep-learning neural networks to increase AI inference performance by an order of magnitude over the previous generation.

NVIDIA is equipping the RTX 3090 with a mammoth 24 GB of video memory and targets it at creators as well, not just gamers. Creators can pair it with NVIDIA’s feature-rich GeForce Studio drivers, while gamers can go with GeForce Game Ready drivers. That said, the RTX 3090 isn’t strictly a creator’s card, either. NVIDIA is taking a stab at the new 8K resolution for gaming, which is four times the pixels of 4K and sixteen times Full HD—not an easy task even for today’s GPUs. The company hence innovated the new 8K DLSS feature, which leverages AI super-resolution to bring higher fidelity gaming than previously thought possible, for 8K.
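
The resolution claim is straightforward arithmetic – 8K (7680 x 4320) carries exactly four times the pixels of 4K UHD and sixteen times those of Full HD – as a quick check shows:

```python
resolutions = {
    "Full HD": (1920, 1080),
    "4K UHD": (3840, 2160),
    "8K UHD": (7680, 4320),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

print(pixels["8K UHD"] / pixels["4K UHD"])   # 4.0  -> 8K has four times the pixels of 4K UHD
print(pixels["8K UHD"] / pixels["Full HD"])  # 16.0 -> and sixteen times Full HD
```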

As we mentioned earlier, the RTX 3090 is based on the 8 nm “GA102” silicon, nearly maxing it out. All but one of the 42 TPCs (84 streaming multiprocessors) are enabled, resulting in a CUDA core count of 10,496, along with 328 Tensor cores, 82 RT cores, 328 TMUs, and 112 ROPs. To achieve 24 GB, the RTX 3090 maxes out the 384-bit wide memory bus on the “GA102” and uses the fastest 19.5 Gbps GDDR6X memory, which gives the card an astounding 940 GB/s of memory bandwidth.
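
That bandwidth figure follows from the per-pin data rate and the bus width: 19.5 Gbps across 384 bits works out to 936 GB/s, which the text rounds to roughly 940 GB/s. A minimal check (the RTX 3080 line assumes its stock 19 Gbps GDDR6X, which isn't quoted in the text):

```python
def memory_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin data rate times bus width, divided by 8 bits per byte."""
    return data_rate_gbps * bus_width_bits / 8

print(memory_bandwidth_gbs(19.5, 384))  # 936.0 GB/s - RTX 3090: 19.5 Gbps GDDR6X over a 384-bit bus
print(memory_bandwidth_gbs(19.0, 320))  # 760.0 GB/s - RTX 3080 at its stock 19 Gbps over 320-bit, for comparison
```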

The MSI GeForce RTX 3090 Suprim X is designed to give NVIDIA’s RTX 3090 Founders Edition a run for its money in a beauty contest, with lavish use of brushed aluminium in the construction of the cooler shroud, perfect symmetry throughout the card, and sharp edges beautifully finished off with RGB LED elements. The amount of illumination on this card is similar to the Gaming X Trio, but more tastefully designed. The RTX 3090 Suprim X also features MSI’s highest factory overclock for the RTX 3090, with the core ticking at 1860 MHz (vs. 1695 MHz reference and 1785 MHz on the Gaming X Trio). MSI is pricing the RTX 3090 Suprim X at $1,750, a $250 premium over the NVIDIA baseline pricing and $160 pricier than the RTX 3090 Gaming X Trio. In this review, we find out if it’s worth spending the extra money on this card over the Gaming X Trio, or even NVIDIA’s Founders Edition.
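
For a sense of scale, those clock and price deltas amount to roughly a 10 percent boost-clock uplift over the reference specification for about a 17 percent price premium. A quick calculation from the figures quoted above (the Gaming X Trio price is inferred from the $160 gap):

```python
reference = {"boost_mhz": 1695, "price_usd": 1500}      # NVIDIA reference boost and baseline pricing
gaming_x_trio = {"boost_mhz": 1785, "price_usd": 1590}  # $1,750 minus the quoted $160 gap
suprim_x = {"boost_mhz": 1860, "price_usd": 1750}

def premium(card: dict, baseline: dict, key: str) -> float:
    """Percentage premium of one card over a baseline for a given spec."""
    return (card[key] / baseline[key] - 1) * 100

print(f"Boost clock vs reference: +{premium(suprim_x, reference, 'boost_mhz'):.1f}%")    # ~+9.7%
print(f"Price vs reference: +{premium(suprim_x, reference, 'price_usd'):.1f}%")          # ~+16.7%
print(f"Price vs Gaming X Trio: +{premium(suprim_x, gaming_x_trio, 'price_usd'):.1f}%")  # ~+10.1%
```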

GeForce RTX 3090 Market Segment Analysis

Card | Price | Shader Units | ROPs | Core Clock | Boost Clock | Memory Clock | GPU | Transistors | Memory
GTX 1080 Ti | $650 | 3584 | 88 | 1481 MHz | 1582 MHz | 1376 MHz | GP102 | 12000M | 11 GB, GDDR5X, 352-bit
RX 5700 XT | $370 | 2560 | 64 | 1605 MHz | 1755 MHz | 1750 MHz | Navi 10 | 10300M | 8 GB, GDDR6, 256-bit
RTX 2070 | $340 | 2304 | 64 | 1410 MHz | 1620 MHz | 1750 MHz | TU106 | 10800M | 8 GB, GDDR6, 256-bit
RTX 2070 Super | $450 | 2560 | 64 | 1605 MHz | 1770 MHz | 1750 MHz | TU104 | 13600M | 8 GB, GDDR6, 256-bit
Radeon VII | $680 | 3840 | 64 | 1802 MHz | N/A | 1000 MHz | Vega 20 | 13230M | 16 GB, HBM2, 4096-bit
RTX 2080 | $600 | 2944 | 64 | 1515 MHz | 1710 MHz | 1750 MHz | TU104 | 13600M | 8 GB, GDDR6, 256-bit
RTX 2080 Super | $690 | 3072 | 64 | 1650 MHz | 1815 MHz | 1940 MHz | TU104 | 13600M | 8 GB, GDDR6, 256-bit
RTX 2080 Ti | $1000 | 4352 | 88 | 1350 MHz | 1545 MHz | 1750 MHz | TU102 | 18600M | 11 GB, GDDR6, 352-bit
RTX 3070 | $500 | 5888 | 96 | 1500 MHz | 1725 MHz | 1750 MHz | GA104 | 17400M | 8 GB, GDDR6, 256-bit
RX 6800 | $580 | 3840 | 96 | 1815 MHz | 2105 MHz | 2000 MHz | Navi 21 | 23000M | 16 GB, GDDR6, 256-bit
RX 6800 XT | $650 | 4608 | 128 | 2015 MHz | 2250 MHz | 2000 MHz | Navi 21 | 23000M | 16 GB, GDDR6, 256-bit
RTX 3080 | $700 | 8704 | 96 | 1440 MHz | 1710 MHz | 1188 MHz | GA102 | 28000M | 10 GB, GDDR6X, 320-bit
RTX 3090 | $1500 | 10496 | 112 | 1395 MHz | 1695 MHz | 1219 MHz | GA102 | 28000M | 24 GB, GDDR6X, 384-bit
MSI RTX 3090 Suprim X | $1750 | 10496 | 112 | 1395 MHz | 1860 MHz | 1219 MHz | GA102 | 28000M | 24 GB, GDDR6X, 384-bit

AMD Radeon RX 6800 XT and RX 6800 Review: Nipping at Ampere’s Heels

(Image credit: Tom’s Hardware)

The AMD Radeon RX 6800 XT and Radeon RX 6800 have arrived, joining the ranks of the best graphics cards and making some headway into the top positions in our GPU benchmarks hierarchy. Nvidia has had a virtual stranglehold on the GPU market for cards priced $500 or more, going back to at least the GTX 700-series in 2013. That’s left AMD to mostly compete in the sub-$500 high-end, mid-range, and budget GPU markets. “No longer!” says Team Red.

Big Navi, aka Navi 21, aka RDNA2, has arrived, bringing some impressive performance gains. AMD also finally joins the ray tracing fray, both with its PC desktop graphics cards and the next-gen PlayStation 5 and Xbox Series X consoles. How do AMD’s latest GPUs stack up to the competition, and could this be AMD’s GPU equivalent of the Ryzen debut of 2017? That’s what we’re here to find out.

We’ve previously discussed many aspects of today’s launch, including details of the RDNA2 architecture, the GPU specifications, features, and more. Now, it’s time to take all the theoretical aspects and lay some rubber on the track. If you want to know more about the finer details of RDNA2, we’ll cover that as well. If you’re just here for the benchmarks, skip down a few screens because, hell yeah, do we have some benchmarks. We’ve got our standard testbed using an ‘ancient’ Core i9-9900K CPU, but we wanted something a bit more for the fastest graphics cards on the planet. We’ve added more benchmarks on both Core i9-10900K and Ryzen 9 5900X. With the arrival of Zen 3, running AMD GPUs with AMD CPUs finally means no compromises.

Update: We’ve added additional results to the CPU scaling charts. This review was originally published on November 18, 2020, but we’ll continue to update related details as needed.

AMD Radeon RX 6800 Series: Specifications and Architecture 

Let’s start with a quick look at the specifications, which have been mostly known for at least a month. We’ve also included the previous generation RX 5700 XT as a reference point. 

Graphics Card | RX 6800 XT | RX 6800 | RX 5700 XT
GPU | Navi 21 (XT) | Navi 21 (XL) | Navi 10 (XT)
Process (nm) | 7 | 7 | 7
Transistors (billion) | 26.8 | 26.8 | 10.3
Die size (mm^2) | 519 | 519 | 251
CUs | 72 | 60 | 40
GPU cores | 4608 | 3840 | 2560
Ray Accelerators | 72 | 60 | N/A
Game Clock (MHz) | 2015 | 1815 | 1755
Boost Clock (MHz) | 2250 | 2105 | 1905
VRAM Speed (MT/s) | 16000 | 16000 | 14000
VRAM (GB) | 16 | 16 | 8
Bus width | 256 | 256 | 256
Infinity Cache (MB) | 128 | 128 | N/A
ROPs | 128 | 96 | 64
TMUs | 288 | 240 | 160
TFLOPS (boost) | 20.7 | 16.2 | 9.7
Bandwidth (GB/s) | 512 | 512 | 448
TBP (watts) | 300 | 250 | 225
Launch Date | Nov 2020 | Nov 2020 | Jul 2019
Launch Price | $649 | $579 | $399

When AMD fans started talking about “Big Navi” as far back as last year, this is pretty much what they hoped to see. AMD has just about doubled down on every important aspect of its architecture, plus adding in a huge amount of L3 cache and Ray Accelerators to handle ray tracing ray/triangle intersection calculations. Clock speeds are also higher, and — spoiler alert! — the 6800 series cards actually exceed the Game Clock and can even go past the Boost Clock in some cases. Memory capacity has doubled, ROPs have doubled, TFLOPS has more than doubled, and the die size is also more than double.
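
The TFLOPS row in the table above falls straight out of the shader count and boost clock: peak FP32 throughput is cores times two FLOPs per clock (one fused multiply-add) times clock speed. A quick check against the table:

```python
def fp32_tflops(shader_cores: int, boost_clock_mhz: int) -> float:
    """Peak FP32 throughput: each shader core retires one FMA (2 FLOPs) per clock."""
    return shader_cores * 2 * boost_clock_mhz * 1e6 / 1e12

for card, cores, boost_mhz in [("RX 6800 XT", 4608, 2250), ("RX 6800", 3840, 2105), ("RX 5700 XT", 2560, 1905)]:
    print(f"{card}: {fp32_tflops(cores, boost_mhz):.1f} TFLOPS")
# RX 6800 XT: 20.7, RX 6800: 16.2, RX 5700 XT: 9.8 (the table rounds this one down to 9.7)
```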

Support for ray tracing is probably the most visible new feature, but RDNA2 also supports Variable Rate Shading (VRS), mesh shaders, and everything else that’s part of the DirectX 12 Ultimate spec. There are other tweaks to the architecture, like support for 8K AV1 decode and 8K HEVC encode. But a lot of the underlying changes don’t show up as an easily digestible number.

For example, AMD says it reworked much of the architecture to focus on a high speed design. That’s where the greater than 2GHz clocks come from, but those aren’t just fantasy numbers. Playing around with overclocking a bit — and the software to do this is still missing, so we had to stick with AMD’s built-in overclocking tools — we actually hit clocks of over 2.5GHz. Yeah. I saw the supposed leaks before the launch claiming 2.4GHz and 2.5GHz and thought, “There’s no way.” I was wrong.

AMD’s cache hierarchy is arguably one of the biggest changes. Besides a shared 1MB L1 cache for each cluster of 20 dual-CUs, there’s a 4MB L2 cache and a whopping 128MB L3 cache that AMD calls the Infinity Cache. It also ties into the Infinity Fabric, but fundamentally, it helps optimize memory access latency and improve the effective bandwidth. Thanks to the 128MB cache, the framebuffer mostly ends up being cached, which drastically cuts down memory access. AMD says the effective bandwidth of the GDDR6 memory ends up being 119 percent higher than what the raw bandwidth would suggest.
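
To put numbers on that claim: the raw figure for 16 Gbps GDDR6 on a 256-bit bus is 512 GB/s, and a 119 percent uplift would put the effective bandwidth somewhere around 1.1 TB/s. A rough sketch, assuming the uplift applies directly to the raw number:

```python
raw_gbs = 16.0 * 256 / 8              # 16 Gbps GDDR6 over a 256-bit bus -> 512 GB/s
effective_gbs = raw_gbs * (1 + 1.19)  # AMD's claimed 119 percent effective-bandwidth uplift
print(raw_gbs, round(effective_gbs))  # 512.0 GB/s raw, ~1121 GB/s effective
```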

The large cache also helps to reduce power consumption, which all ties into AMD’s targeted 50 percent performance per Watt improvements. This doesn’t mean power requirements stayed the same — RX 6800 has a slightly higher TBP (Total Board Power) than the RX 5700 XT, and the 6800 XT and upcoming 6900 XT are back at 300W (like the Vega 64). However, AMD still comes in at a lower power level than Nvidia’s competing GPUs, which is a bit of a change of pace from previous generation architectures.

It’s not entirely clear how AMD’s Ray Accelerators stack up against Nvidia’s RT cores. Much like Nvidia, AMD is putting one Ray Accelerator into each CU. (It seems we’re missing an acronym. Should we call the ray accelerators RA? The sun god, casting down rays! Sorry, been up all night, getting a bit loopy here…) The thing is, Nvidia is on its second-gen RT cores that are supposed to be around 1.7X as fast as its first-gen RT cores. AMD’s Ray Accelerators are supposedly 10 times as fast as doing the RT calculations via shader hardware, which is similar to what Nvidia said with its Turing RT cores. In practice, it looks as though Nvidia will maintain a lead in ray tracing performance.

That doesn’t even get into the whole DLSS and Tensor core discussion. AMD’s RDNA2 chips can do FP16 via shaders, but they’re still a far cry from the computational throughput of Tensor cores. That may or may not matter, as perhaps the FP16 throughput is enough for real-time inference to do something akin to DLSS. AMD has talked about FidelityFX Super Resolution, which it’s working on with Microsoft, but it’s not available yet, and of course, no games are shipping with it yet either. Meanwhile, DLSS is in a couple of dozen games now, and it’s also in Unreal Engine, which means uptake of DLSS could explode over the coming year.

Anyway, that’s enough of the architectural talk for now. Let’s meet the actual cards.

Meet the Radeon RX 6800 XT and RX 6800 Reference Cards 


We’ve already posted an unboxing of the RX 6800 cards, which you can see in the above video. The design is pretty traditional, building on previous cards like the Radeon VII. There’s no blower this round, which is probably for the best if you’re worried about noise levels. Otherwise, you get a similar industrial design and aesthetic with both the reference 6800 and 6800 XT. The only real change is that the 6800 XT has a fatter heatsink and weighs 115g more, which helps it cope with the higher TBP.

Both cards are triple fan designs, using custom 77mm fans that have an integrated rim. We saw the same style of fan on many of the RTX 30-series GPUs, and it looks like the engineers have discovered a better way to direct airflow. Both cards have a Radeon logo that lights up in red, but it looks like the 6800 XT might have an RGB logo — it’s not exposed in software yet, but maybe that will come.


Otherwise, you get dual 8-pin PEG power connections, which might seem a bit overkill on the 6800 — it’s a 250W card, after all, why should it need the potential for up to 375W of power? But we’ll get into the power stuff later. If you’re into collecting hardware boxes, the 6800 XT box is also larger and a bit nicer, but there’s no real benefit otherwise.
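
That 375W ceiling is just the connector budget added up: each 8-pin PEG connector is specified for 150W, and the PCIe x16 slot itself can supply up to 75W.

```python
eight_pin_w = 150  # each 8-pin PEG connector is specified for 150 W
pcie_slot_w = 75   # the PCIe x16 slot itself supplies up to 75 W
print(2 * eight_pin_w + pcie_slot_w)  # 375 W of potential input vs the RX 6800's 250 W board power
```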

The one potential concern with AMD’s reference design is the video ports. There are two DisplayPort outputs, a single HDMI 2.1 connector, and a USB Type-C port. It’s possible to use four displays with the cards, but the most popular gaming displays still use DisplayPort, and very few options exist for the Type-C connector. There also aren’t any HDMI 2.1 monitors that I’m aware of, unless you want to use a TV for your monitor. But those will eventually come. Anyway, if you want a different port selection, keep an eye on the third party cards, as I’m sure they’ll cover other configurations.

And now, on to the benchmarks.

Radeon RX 6800 Series Test Systems 


It seems AMD is having a microprocessor renaissance of sorts right now. First, it has Zen 3 coming out and basically demolishing Intel in every meaningful way in the CPU realm. Sure, Intel can compete on a per-core basis … but only up to 10-core chips without moving into HEDT territory. The new RX 6800 cards might just be the equivalent of AMD’s Ryzen CPU launch. This time, AMD isn’t making any apologies. It intends to go up against Nvidia’s best. And of course, if we’re going to test the best GPUs, maybe we ought to look at the best CPUs as well?

For this launch, we have three test systems. First is our old and reliable Core i9-9900K setup, which we still use as the baseline and for power testing. We’re adding both AMD Ryzen 9 5900X and Intel Core i9-10900K builds to flesh things out. In retrospect, trying to do two new testbeds may have been a bit too ambitious, as we have to test each GPU on each testbed. We had to cut a bunch of previous-gen cards from our testing, and the hardware varies a bit among the PCs.

For the AMD build, we’ve got an MSI X570 Godlike motherboard, which is one of only a handful that supports AMD’s new Smart Memory Access technology. Patriot supplied us with two kits of single bank DDR4-4000 memory, which means we have 4x8GB instead of our normal 2x16GB configuration. We also have the Patriot Viper VP4100 2TB SSD holding all of our games. Remember when 1TB used to feel like a huge amount of SSD storage? And then Call of Duty: Modern Warfare (2019) happened, sucking down over 200GB. Which is why we need 2TB drives.

Meanwhile, the Intel LGA1200 PC has an Asus Maximus XII Extreme motherboard, 2x16GB DDR4-3600 HyperX memory, and a 2TB XPG SX8200 Pro SSD. (I’m not sure if it’s the old ‘fast’ version or the revised ‘slow’ variant, but it shouldn’t matter for these GPU tests.) Full specs are in the table below.

Anyway, the slightly slower RAM might be a bit of a handicap on the Intel PCs, but this isn’t a CPU review — we just wanted to use the two fastest CPUs, and time constraints and lack of duplicate hardware prevented us from going full apples-to-apples. The internal comparisons among GPUs on each testbed will still be consistent. Frankly, there’s not a huge difference between the CPUs when it comes to gaming performance, especially at 1440p and 4K.

Besides the testbeds, I’ve also got a bunch of additional gaming tests. First is the suite of nine games we’ve used on recent GPU reviews like the RTX 30-series launch. We’ve done some ‘bonus’ tests on each of the Founders Edition reviews, but we’re shifting gears this round. We’re adding four new/recent games that will be tested on each of the CPU testbeds: Assassin’s Creed Valhalla, Dirt 5, Horizon Zero Dawn, and Watch Dogs Legion — and we’ve enabled DirectX Raytracing (DXR) on Dirt 5 and Watch Dogs Legion.

There are some definite caveats, however. First, the beta DXR support in Dirt 5 doesn’t look all that different from the regular mode, and it’s an AMD promoted game. Coincidence? Maybe, but it’s probably more likely that AMD is working with Codemasters to ensure it runs suitably on the RX 6800 cards. The other problem is probably just a bug, but AMD’s RX 6800 cards seem to render the reflections in Watch Dogs Legion with a bit less fidelity.

Besides the above, we have a third suite of ray tracing tests: nine games (or benchmarks of future games) and 3DMark Port Royal. Of note, Wolfenstein Youngblood with ray tracing (which uses Nvidia’s pre-VulkanRT extensions) wouldn’t work on the AMD cards, and neither would the Bright Memory Infinite benchmark. Also, Crysis Remastered had some rendering errors with ray tracing enabled (on the nanosuits). Again, that’s a known bug.

Radeon RX 6800 Gaming Performance

We’ve retested all of the RTX 30-series cards on our Core i9-9900K testbed … but we didn’t have time to retest the RTX 20-series or RX 5700 series GPUs. The system has been updated with the latest 457.30 Nvidia drivers and AMD’s pre-launch RX 6800 drivers, as well as Windows 10 20H2 (the October 2020 update to Windows). It looks like the combination of drivers and/or Windows updates may have dropped performance by about 1-2 percent overall, though there are other variables in play. Anyway, the older GPUs are included mostly as a point of reference.

We have 1080p, 1440p, and 4K ultra results for each of the games, as well as the combined average of the nine titles. We’re going to dispense with the commentary for individual games right now (because of a time crunch), but we’ll discuss the overall trends below.

(Benchmark charts – 9 Game Average, plus individual results for Borderlands 3, The Division 2, Far Cry 5, Final Fantasy XIV, Forza Horizon 4, Metro Exodus, Red Dead Redemption 2, Shadow of the Tomb Raider and Strange Brigade. Image credit: Tom’s Hardware)

AMD’s new GPUs definitely make a good showing in traditional rasterization games. At 4K, Nvidia’s 3080 leads the 6800 XT by three percent, but it’s not a clean sweep — AMD comes out on top in Borderlands 3, Far Cry 5, and Forza Horizon 4. Meanwhile, Nvidia gets modest wins in The Division 2, Final Fantasy XIV, Metro Exodus, Red Dead Redemption 2, Shadow of the Tomb Raider, and the largest lead is in Strange Brigade. But that’s only at the highest resolution, where AMD’s Infinity Cache may not be quite as effective.

Dropping to 1440p, the RTX 3080 and 6800 XT are effectively tied — again, AMD wins several games, Nvidia wins others, but the average performance is the same. At 1080p, AMD even pulls ahead by two percent overall. Not that we really expect most gamers forking over $650 or $700 or more on a graphics card to stick with a 1080p display, unless it’s a 240Hz or 360Hz model.

Flipping over to the vanilla RX 6800 and the RTX 3070, AMD does even better. On average, the RX 6800 leads by 11 percent at 4K ultra, nine percent at 1440p ultra, and seven percent at 1080p ultra. Here the 8GB of GDDR6 memory on the RTX 3070 simply can’t keep pace with the 16GB of higher clocked memory — and the Infinity Cache — that AMD brings to the party. The best Nvidia can do is one or two minor wins (e.g., Far Cry 5 at 1080p, where the GPUs are more CPU limited) and slightly higher minimum fps in FFXIV and Strange Brigade.

But as good as the RX 6800 looks against the RTX 3070, we prefer the RX 6800 XT from AMD. It only costs $70 more, which is basically the cost of one game and a fast food lunch. Or put another way, it’s 12 percent more money, for 12 percent more performance at 1080p, 14 percent more performance at 1440p, and 16 percent better 4K performance. You also get AMD’s Rage Mode pseudo-overclocking (really just increased power limits).
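
Those price-versus-performance figures come straight from the launch prices and the averages quoted above; a quick sketch:

```python
price_6800_xt, price_6800 = 649, 579  # launch prices from the spec table earlier

price_premium = (price_6800_xt / price_6800 - 1) * 100
print(f"Price premium: {price_premium:.0f}%")  # ~12%

# Performance leads of the 6800 XT over the 6800, as quoted above.
perf_leads = {"1080p ultra": 12, "1440p ultra": 14, "4K ultra": 16}
for res, lead in perf_leads.items():
    print(f"{res}: +{lead}% performance for +{price_premium:.0f}% price")
```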

Radeon RX 6800 CPU Scaling and Overclocking

Our traditional gaming suite is due for retirement, but we didn’t want to toss it out at the same time as a major GPU launch — it might look suspicious. We didn’t have time to do a full suite of CPU scaling tests, but we did run 13 games on the five most recent high-end/extreme GPUs on our three test PCs. Here’s the next series of charts, again with commentary below. 

(Benchmark charts – 13-Game Average, plus individual results for Assassin’s Creed Valhalla, Borderlands 3, The Division 2, Dirt 5, Far Cry 5, Final Fantasy XIV, Forza Horizon 4, Horizon Zero Dawn, Metro Exodus, Red Dead Redemption 2, Shadow of the Tomb Raider, Strange Brigade and Watch Dogs Legion. Image credit: Tom’s Hardware)

These charts are a bit busy, perhaps, with five GPUs and three CPUs each, plus overclocking. Take your time. We won’t judge. Nine of the games are from the existing suite, and the trends noted earlier basically continue.

Looking just at the four new games, AMD gets a big win in Assassin’s Creed Valhalla (it’s an AMD promotional title, so future updates may change the standings). Dirt 5 is also a bit of an odd duck for Nvidia, with the RTX 3090 actually doing quite badly on the Ryzen 9 5900X and Core i9-10900K for some reason. Horizon Zero Dawn ends up favoring Nvidia quite a bit (but not the 3070), and lastly, we have Watch Dogs Legion, which favors Nvidia a bit (more at 4K), but it might have some bugs that are currently helping AMD’s performance.

Overall, the 3090 still maintains its (gold-plated) crown, which you’d sort of expect from a $1,500 graphics card that you can’t even buy right now. Meanwhile, the RX 6800 XT mixes it up with the RTX 3080, coming out slightly ahead overall at 1080p and 1440p but barely trailing at 4K. Meanwhile, the RX 6800 easily outperforms the RTX 3070 across the suite, though a few games and/or lower resolutions do go the other way.

Oddly, my test systems ended up with the Core i9-10900K and even the Core i9-9900K often leading the Ryzen 9 5900X. The 3090 did best with the 5900X at 1080p, but then went to the 10900K at 1440p and both the 9900K and 10900K at 4K. The other GPUs also swap places, though usually the difference between CPU is pretty negligible (and a few results just look a bit buggy).

It may be that the beta BIOS for the MSI X570 board (which enables Smart Memory Access) still needs more tuning, or that the differences in memory came into play. I didn’t have time to check performance without enabling the large PCIe BAR feature either. But these are mostly very small differences, and any of the three CPUs tested here are sufficient for gaming.

As for overclocking, it’s pretty much what you’d expect. Increase the power limit, GPU core clocks, and GDDR6 clocks and you get more performance. It’s not a huge improvement, though. Overall, the RX 6800 XT was 4-6 percent faster when overclocked (the higher results were at 4K). The RX 6800 did slightly better, improving by 6 percent at 1080p and 1440p, and 8 percent at 4K. GPU clocks were also above 2.5GHz for most of the testing of the RX 6800, and its lower default boost clock gave it a bit more room for improvement.

Radeon RX 6800 Series Ray Tracing Performance 

So far, most of the games haven’t had ray tracing enabled. But that’s the big new feature for RDNA2 and the Radeon RX 6000 series, so we definitely wanted to look into ray tracing performance more. Here’s where things take a turn for the worse because ray tracing is very demanding, and Nvidia has DLSS to help overcome some of the difficulty by doing AI-enhanced upscaling. AMD can’t do DLSS since it’s Nvidia proprietary tech, which means to do apples-to-apples comparisons, we have to turn off DLSS on the Nvidia cards.

That’s not really fair because DLSS 2.0 and later actually look quite nice, particularly when using the Balanced or Quality modes. What’s more, native 4K gaming with ray tracing enabled is going to be a stretch for just about any current GPU, including the RTX 3090 — unless you’re playing a lighter game like Pumpkin Jack. Anyway, we’ve looked at ray tracing performance with DLSS in a bunch of these games, and performance improves by anywhere from 20 percent to as much as 80 percent (or more) in some cases. DLSS may not always look better, but a slight drop in visual fidelity for a big boost in framerates is usually hard to pass up.

We’ll have to see if AMD’s FidelityFX Super Resolution can match DLSS in the future, and how many developers make use of it. Considering AMD’s RDNA2 GPUs are also in the PlayStation 5 and Xbox Series S/X, we wouldn’t count AMD out, but for now, Nvidia has the technology lead. Which brings us to native ray tracing performance.

10-game DXR Average

Image 1 of 6

(Image credit: Tom’s Hardware)

Image 2 of 6

(Image credit: Tom’s Hardware)

Image 3 of 6

(Image credit: Tom’s Hardware)

Image 4 of 6

(Image credit: Tom’s Hardware)

Image 5 of 6

(Image credit: Tom’s Hardware)

Image 6 of 6

(Image credit: Tom’s Hardware)

3DMark Port Royal

Image 1 of 6

(Image credit: Tom’s Hardware)

Image 2 of 6

(Image credit: Tom’s Hardware)

Image 3 of 6

(Image credit: Tom’s Hardware)

Image 4 of 6

(Image credit: Tom’s Hardware)

Image 5 of 6

(Image credit: Tom’s Hardware)

Image 6 of 6

(Image credit: Tom’s Hardware)

Boundary Benchmark

Image 1 of 6

(Image credit: Tom’s Hardware)

Image 2 of 6

(Image credit: Tom’s Hardware)

Image 3 of 6

(Image credit: Tom’s Hardware)

Image 4 of 6

(Image credit: Tom’s Hardware)

Image 5 of 6

(Image credit: Tom’s Hardware)

Image 6 of 6

(Image credit: Tom’s Hardware)

Call of Duty Black Ops Cold War

Image 1 of 6

(Image credit: Tom’s Hardware)

Image 2 of 6

(Image credit: Tom’s Hardware)

Image 3 of 6

(Image credit: Tom’s Hardware)

Image 4 of 6

(Image credit: Tom’s Hardware)

Image 5 of 6

(Image credit: Tom’s Hardware)

Image 6 of 6

(Image credit: Tom’s Hardware)

Control

Image 1 of 6

(Image credit: Tom’s Hardware)

Image 2 of 6

(Image credit: Tom’s Hardware)

Image 3 of 6

(Image credit: Tom’s Hardware)

Image 4 of 6

(Image credit: Tom’s Hardware)

Image 5 of 6

(Image credit: Tom’s Hardware)

Image 6 of 6

(Image credit: Tom’s Hardware)

Crysis Remastered

Image 1 of 6

(Image credit: Tom’s Hardware)

Image 2 of 6

(Image credit: Tom’s Hardware)

Image 3 of 6

(Image credit: Tom’s Hardware)

Image 4 of 6

(Image credit: Tom’s Hardware)

Image 5 of 6

(Image credit: Tom’s Hardware)

Image 6 of 6

(Image credit: Tom’s Hardware)

Dirt 5

Image 1 of 6

(Image credit: Tom’s Hardware)

Image 2 of 6

(Image credit: Tom’s Hardware)

Image 3 of 6

(Image credit: Tom’s Hardware)

Image 4 of 6

(Image credit: Tom’s Hardware)

Image 5 of 6

(Image credit: Tom’s Hardware)

Image 6 of 6

(Image credit: Tom’s Hardware)

Fortnite

Image 1 of 6

(Image credit: Tom’s Hardware)

Image 2 of 6

(Image credit: Tom’s Hardware)

Image 3 of 6

(Image credit: Tom’s Hardware)

Image 4 of 6

(Image credit: Tom’s Hardware)

Image 5 of 6

(Image credit: Tom’s Hardware)

Image 6 of 6

(Image credit: Tom’s Hardware)

Metro Exodus

(Image credit: Tom’s Hardware)

Shadow of the Tomb Raider

(Image credit: Tom’s Hardware)

Watch Dogs Legion

(Image credit: Tom’s Hardware)

Well. So much for AMD’s comparable performance. AMD’s RX 6800 series can definitely hold its own against Nvidia’s RTX 30-series GPUs in traditional rasterization, but turn on ray tracing, even with DLSS out of the picture, and things can get ugly. AMD’s RX 6800 XT does tend to come out ahead of the RTX 3070, but then it should: it costs more and has twice the VRAM. But again, DLSS (which is supported in seven of the ten games/tests we used) would turn the tables, and even the DLSS Quality mode usually improves performance by 20-40 percent (provided the game isn’t bottlenecked elsewhere).

Ignoring the often-too-low framerates, overall, the RTX 3080 is nearly 25 percent faster than the RX 6800 XT at 1080p, and that lead only grows at 1440p (26 percent) and 4K (30 percent). The RTX 3090 is another 10-15 percent ahead of the 3080, which is very much out of AMD’s reach if you care at all about ray tracing performance — ignoring price, of course.
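For reference, here’s roughly how a suite-wide lead like that gets calculated from per-game results. The fps values below are placeholders, and the use of a geometric mean is our illustrative assumption, not a description of the exact aggregation behind these charts.

```python
from math import prod

# Hypothetical per-game average fps for two cards across a ten-game suite.
# All numbers are placeholders for illustration, not measured data.
rtx_3080 = [92, 61, 45, 78, 55, 70, 84, 66, 49, 73]
rx_6800_xt = [70, 50, 38, 66, 41, 60, 72, 55, 40, 58]

def geomean(values):
    """Geometric mean keeps one very high-fps game from dominating the average."""
    return prod(values) ** (1 / len(values))

lead = geomean(rtx_3080) / geomean(rx_6800_xt) - 1
print(f"Suite-wide lead: {lead:.1%}")
```

The geometric mean is a common choice for multi-game summaries precisely because it stops one outlier result from skewing the overall figure.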

The RTX 3070 comes out with a 10-15 percent lead over the RX 6800, but individual games can behave quite differently. Take the new Call of Duty: Black Ops Cold War. It supports multiple ray tracing effects, and even the RTX 3070 holds a significant 30 percent lead over the 6800 XT at 1080p and 1440p. Boundary, Control, Crysis Remastered, and (to a lesser extent) Fortnite also have the 3070 leading the AMD cards. But Dirt 5, Metro Exodus, Shadow of the Tomb Raider, and Watch Dogs Legion have the 3070 falling behind the 6800 XT at least, and sometimes the RX 6800 as well.

There is a real question about whether the GPUs are doing the same work, though. We haven’t had time to really dig into the image quality, but Watch Dogs Legion for sure doesn’t look the same on AMD compared to Nvidia with ray tracing enabled. Check out these comparisons:

Ubisoft is apparently aware of the problem. In a statement to us, it said, “We are aware of the issue and are working to address it in a patch in December.” But right now, there’s a good chance that AMD’s Watch Dogs Legion performance with ray tracing enabled is higher than it should be.

Overall, AMD’s ray tracing performance looks more like Nvidia’s RTX 20-series GPUs than the new Ampere GPUs, which is roughly what we expected. This is first-gen ray tracing for AMD, after all, while Nvidia is on round two. Frankly, looking at games like Fortnite, where ray-traced shadows, reflections, global illumination, and ambient occlusion are all available, we probably need fourth-gen ray tracing hardware before we’ll hit playable framerates with every bell and whistle enabled. And we’ll likely still need DLSS, or AMD’s Super Resolution, to reach acceptable framerates at 4K.

Radeon RX 6800 Series: Power, Temps, Clocks, and Fans 

We’ve got our usual collection of power, temperature, clock speed, and fan speed testing using Metro Exodus running at 1440p, and FurMark running at 1600×900 in stress test mode. While Metro is generally indicative of how other games behave, we loop the benchmark five times, and there are dips where the test restarts and the GPU gets to rest for a few seconds. FurMark, on the other hand, is basically a worst-case scenario for power and thermals. We collect the power data using Powenetics software and hardware, which uses GPU-Z to monitor GPU temperatures, clocks, and fan speeds. 
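Our Powenetics rig is dedicated hardware, but if you just want a rough GPU-Z-style log at home, a simple polling loop gets you usable trend data. This is only a sketch: read_gpu_sensors() is a hypothetical placeholder for whatever sensor interface your card exposes, not part of our test setup.

```python
import csv
import random
import time

def read_gpu_sensors():
    """Hypothetical stand-in for a real sensor interface (GPU-Z logging,
    vendor APIs, etc.). Replace with actual reads; random values here
    just keep the sketch runnable."""
    return (round(random.uniform(200, 320), 1),   # power_w
            round(random.uniform(60, 80), 1),     # temp_c
            random.randint(1800, 2300),           # clock_mhz
            random.randint(1200, 1900))           # fan_rpm

POLL_INTERVAL_S = 0.1   # 100 ms between samples
DURATION_S = 10         # keep the demo short; raise for a real logging run

with open("gpu_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["time_s", "power_w", "temp_c", "clock_mhz", "fan_rpm"])
    start = time.time()
    while time.time() - start < DURATION_S:
        power_w, temp_c, clock_mhz, fan_rpm = read_gpu_sensors()
        writer.writerow([round(time.time() - start, 2), power_w, temp_c, clock_mhz, fan_rpm])
        time.sleep(POLL_INTERVAL_S)
```

Polling at 100ms is fine for spotting trends, but as we discuss further down, it will smooth over millisecond-scale power spikes.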

GPU Total Power

(Image credit: Tom’s Hardware)

AMD basically sticks to the advertised 300W TBP on the 6800 XT with Metro Exodus, and it even comes in slightly below the 250W TBP on the RX 6800. Enabling Rage Mode on the 6800 XT obviously changes things, and you can also see our power figures for the manual overclocks. Basically, Big Navi can match the RTX 3080 when it comes to power if you increase the power limits.

FurMark pushes power on both cards a bit higher, which is pretty typical. If you check the line graphs, you can see our 6800 XT OC starts off at nearly 360W in FurMark before it throttles down a bit and ends up closer to 350W. There are some transient power spikes that can go a bit higher as well, which we’ll discuss in more detail below.

GPU Core Clocks

(Image credit: Tom’s Hardware)

Looking at the GPU clocks, AMD is pushing some serious MHz for a change. This is easily the highest-clocked GPU we’ve ever seen, and when we manually overclocked the RX 6800, we were able to hit a relatively stable 2550 MHz. That’s pretty damn impressive, especially considering power use isn’t any higher than on Nvidia’s GPUs. Both cards also clear their respective Game Clocks and Boost Clocks, which is a nice change of pace.

GPU Core Temp

(Image credit: Tom’s Hardware)

GPU Fan Speed

(Image credit: Tom’s Hardware)

Temperatures and fan speeds are directly related. Ramp up the fan speed, as we did for the overclocked 6800 cards, and you get lower temperatures at the cost of higher noise levels. We’re still investigating overclocking, as there’s some odd behavior so far: the cards will run fine for a while, then suddenly drop into a weak performance mode where framerates might be half the normal level, or even worse. That’s probably due to the lack of overclocking support in MSI Afterburner for the time being. By default, though, the cards strike a good balance between cooling and noise. We’ll have exact SPL readings later (we’re still benchmarking a few other bits), but it’s interesting that all of the new GPUs (RTX 30-series and RX 6000) run lower fan speeds than the previous generation.

(Image credit: Future)

We observed some larger-than-expected transient power spikes with the RX 6800 XT, but to be absolutely clear, these spikes shouldn’t be an issue, particularly if you don’t plan on overclocking. However, it is important to keep these peak power measurements in mind when you spec out your power supply.

Transient power spikes are common but are usually of such short duration (in the millisecond range) that our power measurement gear, which records measurements at roughly a 100ms granularity, can’t catch them. Typically you’d need a quality oscilloscope to measure transient power spikes accurately, but we did record several spikes even with our comparatively relaxed polling.

The charts above show total power consumption of the RX 6800 XT at stock settings, overclocked, and with Rage Mode enabled. In terms of transient power spikes, we don’t see any issues at all with Metro Exodus, but we do see brief peaks during FurMark of 425W with the manually overclocked config, 373W with Rage Mode, and 366W with the stock setup. Again, these peaks were measured within a single 100ms polling cycle, which means they could certainly trip a PSU’s over-power protection if you’re running close to the limit of its power delivery.
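To illustrate why coarse polling understates transients, here’s a quick simulation. The baseline draw and spike duration are illustrative assumptions, loosely based on the figures above.

```python
# Simulate a 100 ms polling window that contains a short transient spike.
# Numbers are illustrative, loosely based on the measurements above.
BASELINE_W = 320      # steady-state draw during the window
SPIKE_W = 425         # transient peak
SPIKE_MS = 5          # spike duration in milliseconds
WINDOW_MS = 100       # polling granularity

# A meter that averages over the window reports far less than the true peak.
avg_reported = (SPIKE_W * SPIKE_MS + BASELINE_W * (WINDOW_MS - SPIKE_MS)) / WINDOW_MS
print(f"True peak: {SPIKE_W} W, window average: {avg_reported:.0f} W")
# -> True peak: 425 W, window average: 325 W
```

In other words, a millisecond-scale spike can trip a touchy protection circuit while barely nudging an averaged power reading, which is exactly why these peaks are easy to miss.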

(Image credit: Future)

To drill down on the topic, we split out our power measurements from each power source, which you’ll see above. The RX 6800 XT draws power from the PCIe slot and two eight-pin PCIe connectors (PEG1/PEG2). 

Power consumption over the PCIe slot is well managed during all the tests (as a general rule of thumb, this value shouldn’t exceed 71W, and the 6800 XT is well below that mark). We also didn’t catch any notable transient spikes during our real-world Metro Exodus gaming test at either stock or overclocked settings.

However, during our FurMark test at stock settings, we see a power consumption spike to 206W on one of the PCIe cables for a very brief period (we picked up a single measurement of the spike during each run). After overclocking, we measured a simultaneous spike of 231W on one cable and 206W on the other for a period of one measurement taken at a 100ms polling rate. Naturally, those same spikes are much less pronounced with Rage Mode overclocking, measuring only 210W and 173W. A PCIe cable can easily deliver ~225W safely (even with 18AWG), so these transient power spikes aren’t going to melt connectors, wires, or harm the GPU in any way — they would need to be of much longer duration to have that type of impact.  
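As a quick sanity check on those numbers, you can back out the per-wire current from the 231W spike. The assumption that an 8-pin PCIe cable carries its 12V load on three conductors is ours, made for the sake of the estimate.

```python
# Back-of-the-envelope check on the 231 W spike over a single 8-pin cable.
# Assumption for this estimate: the 12 V load is shared across three
# conductors; ground return wires are ignored.
SPIKE_W = 231
RAIL_V = 12.0
CONDUCTORS = 3

amps_total = SPIKE_W / RAIL_V
amps_per_wire = amps_total / CONDUCTORS
print(f"{amps_total:.1f} A total, ~{amps_per_wire:.1f} A per 12V conductor")
# -> 19.2 A total, ~6.4 A per 12V conductor
```

A momentary 6-7A per conductor lines up with the ~225W figure above, which is why these spikes aren’t a safety concern for the cable or connector.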

But the transient spikes are noteworthy because some CPUs, like the Intel Core i9-9900K and i9-10900K, can consume more than 300W, adding to the total system power draw. If you plan on overclocking, it’s best to factor the RX 6800 XT’s transient power consumption into your total system power budget.

Power spikes of 5-10ms can trip the overcurrent protection (OCP) on some multi-rail power supplies, because those units tend to have relatively low per-rail OCP thresholds. As usual, a PSU with a single 12V rail tends to be the best solution, since its OCP threshold is set much higher, and you’re also better off using a dedicated PCIe cable for each 8-pin connector rather than daisy-chaining one cable.
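Putting it all together, a conservative worst-case budget for an overclocked RX 6800 XT build might look something like the sketch below. The GPU transient comes from our FurMark measurement; the CPU, rest-of-system, and headroom figures are illustrative assumptions, not measurements.

```python
# Rough worst-case power budget for PSU sizing. Only the GPU transient
# figure comes from the measurements above; the rest are illustrative.
gpu_transient_w = 425      # overclocked RX 6800 XT FurMark spike (measured above)
cpu_peak_w = 300           # e.g. a heavily loaded Core i9-10900K (assumed)
rest_of_system_w = 75      # fans, drives, RAM, motherboard, USB (assumed)
headroom = 1.2             # ~20% margin so transients don't trip protection

recommended_psu_w = (gpu_transient_w + cpu_peak_w + rest_of_system_w) * headroom
print(f"Recommended PSU: ~{recommended_psu_w:.0f} W")   # ~960 W
```

Sustained draw will sit far below that peak sum, but sizing against brief worst cases is exactly what keeps protection circuits from tripping.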

(Image credit: Tom’s Hardware)

Radeon RX 6800 Series: Prioritizing Rasterization Over Ray Tracing 

It’s been a long time since AMD had a legitimate contender for the GPU throne. The last time AMD was this close … well, maybe Hawaii (Radeon R9 290X) was competitive in performance at least, while using quite a bit more power. That’s sort of been the standard disclaimer for AMD GPUs for quite a few years. Yes, AMD has some fast GPUs, but they tend to use a lot of power. The other alternative was best illustrated by one of the best budget GPUs of the past couple of years: AMD isn’t the fastest, but dang, look how cheap the RX 570 is! With the Radeon RX 6800 series, AMD is mostly able to put questions of power and performance behind it. Mostly.

The RX 6800 XT ends up just a bit slower than the RTX 3080 overall in traditional rendering, but it costs less, and it uses a bit less power (unless you kick on Rage Mode, in which case it’s a tie). There are enough games where AMD comes out ahead that you can make a legitimate case for AMD having the better card. Plus, 16GB of VRAM is definitely helpful in a few of the games we tested — or at least, 8GB isn’t enough in some cases. The RX 6800 does even better against the RTX 3070, generally winning most benchmarks by a decent margin. Of course, it costs more, but if you have to pick between the 6800 and 3070, we’d spend the extra $80.

The problem is, that’s a slippery slope. At that point, we’d also spend an extra $70 to step up to the RX 6800 XT, and from there, $50 more for the RTX 3080, with its superior ray tracing and DLSS support, is easy enough to justify. Now we’re looking at a $700 graphics card instead of a $500 graphics card, but at least it’s a decent jump in performance.

Of course, you can’t buy any of the Nvidia RTX 30-series GPUs right now. Well, you can, if you get lucky. It’s not that Nvidia isn’t producing cards; it’s just not producing enough cards to satisfy the demand. And, let’s be real for a moment: There’s not a chance in hell AMD’s RX 6800 series are going to do any better. Sorry to be the bearer of bad news, but these cards are going to sell out. You know, just like every other high-end GPU and CPU launched in the past couple of months. (Update: Yup, every RX 6800 series GPU sold out within minutes.)

What’s more, AMD is better off producing more Ryzen 5000 series CPUs than Radeon RX 6000 GPUs. Just look at the chip sizes and other components. A Ryzen 9 5900X packs two compute dies of roughly 80mm² each plus a 12nm IO die into a relatively compact package, and AMD is currently selling every single one of those CPUs for $550, or $800 for the 5950X. The Navi 21 GPU, by comparison, is made on the same TSMC N7 wafers but measures 519mm², and the card also needs GDDR6 memory, a beefy cooler and fan, and all sorts of other components. And it still only sells for roughly the same price as the 5900X.
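To put the silicon economics in rough numbers, here’s the standard dies-per-wafer approximation applied to the die sizes above. It ignores yield, scribe lines, and edge exclusion, so treat the output as a ballpark comparison rather than AMD’s actual figures.

```python
import math

# Classic dies-per-wafer approximation for a 300 mm wafer, ignoring yield,
# scribe lines, and edge exclusion. Die areas come from the discussion above.
WAFER_DIAMETER_MM = 300.0

def dies_per_wafer(die_area_mm2):
    d = WAFER_DIAMETER_MM
    return (math.pi * (d / 2) ** 2 / die_area_mm2
            - math.pi * d / math.sqrt(2 * die_area_mm2))

ccd = dies_per_wafer(80)       # Zen 3 compute die, ~80 mm^2
navi21 = dies_per_wafer(519)   # Big Navi, ~519 mm^2
print(f"~{ccd:.0f} Zen 3 CCDs vs ~{navi21:.0f} Navi 21 dies per 300mm wafer")
```

That works out to roughly 800 compute dies versus roughly 100 Navi 21 dies per wafer, and yield only widens the gap, since small dies lose proportionally less silicon to defects. Every Navi 21 wafer is an expensive trade against Ryzen production.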

(Image credit: Tom’s Hardware)

Which isn’t to say you shouldn’t want to buy an RX 6800 card. It’s really going to come down to personal opinions on how important ray tracing will become in the coming years. The consoles now support the technology, but even the Xbox Series X can’t keep up with an RX 6800, never mind an RTX 3080. Plus, while some games like Control make great use of ray tracing effects, in many other games, the ray tracing could be disabled, and most people wouldn’t really miss it. We’re still quite a ways off from anything approaching Hollywood levels of fidelity rendered in real time.

In terms of features, Nvidia still comes out ahead. Faster ray tracing, plus DLSS — and whatever else those Tensor cores might be used for in the future — seems like the safer bet long term. But there are still a lot of games forgoing ray tracing effects, or games where ray tracing doesn’t make a lot of sense considering how it causes frame rates to plummet. Fortnite in creative mode might be fine for ray tracing, but I can’t imagine many competitive players being willing to tank performance just for some eye candy. The same goes for Call of Duty. But then there’s Cyberpunk 2077 looming, which could be the killer game that ray tracing hardware needs.

We asked earlier if Big Navi, aka RDNA2, was AMD’s Ryzen moment for its GPUs. In a lot of ways, it’s exactly that. The first-generation Ryzen CPUs brought 8-core chips to mainstream platforms at aggressive prices, something Intel had avoided doing. But those first-gen Zen CPUs and motherboards were raw and had some issues; it wasn’t until Zen 2 that AMD really started winning key matchups, and Zen 3 finally put AMD in the lead. Perhaps it’s better to say that Navi, in general, is AMD trying to repeat what it did on the CPU side of things.

The RX 6800 (Navi 21) is literally a bigger, enhanced version of last year’s Navi 10 GPUs. It has up to twice the CUs and twice the memory, and it’s at least a big step closer to feature parity with Nvidia. If you can find a Radeon RX 6800 or RX 6800 XT in stock any time before 2021, it’s definitely worth considering. The RX 6800 series isn’t priced particularly aggressively, but the two cards slot in nicely just above and below Nvidia’s competing RTX 3070 and 3080.