Nvidia has scheduled a press event for January 12th that will focus on its RTX products. The company hasn’t specified what will be presented during this event, but rumours suggest it could be RTX 30 series mobile GPUs, RTX 3080Ti, RTX 3060 or any combination of the three.
Just one day after CES 2021 begins, Nvidia will host its own event to “unveil the latest innovations in gaming and graphics”. Although the company has not been clear about what will be shown, there’s a good chance that Nvidia will present the 12GB variant of the RTX 3060, as we have previously covered. This GPU is expected to compete with the AMD Radeon RX 6700 XT, which might also be introduced during AMD’s CES 2021 conference.
Another RTX product that might be announced during Nvidia’s conference is the RTX 30 Mobile series of GPUs, which is expected to include RTX 3080, RTX 3070, and RTX 3060 SKUs. These SKUs are aimed at gaming laptops.
The last SKU that might be introduced is the RTX 3080Ti, which is expected to release in February. This rumoured 20GB graphics card is expected to compete with the currently “available” AMD Radeon RX 6900XT at the same price-point, while offering similar performance to an RTX 3090.
KitGuru says: What do you expect from Nvidia’s event? Do you think Nvidia will announce both the RTX 3060 and 3080Ti? Or will it just focus on the RTX 30 mobile SKUs?
After a six-month test phase, the Khronos Group consortium has published the finalized ray tracing support for the Vulkan graphics API. The corresponding graphics effects run on both GeForce and Radeon graphics cards and use the ray tracing cores of modern GPUs (RTX 2000/3000, RX 6000). On older models the whole thing works via compute shaders, albeit with significantly lower frame rates.
Vulkan version 1.2.162 contains seven ray tracing extensions, including VK_KHR_acceleration_structure, VK_KHR_ray_tracing_pipeline and VK_KHR_ray_query. The current Software Development Kit (SDK) provides developers with tools for integrating and testing ray tracing effects in 3D games.
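To illustrate how an application might use these extension names in practice, here is a minimal sketch (plain Python; the reported-extensions list is a stand-in for what `vkEnumerateDeviceExtensionProperties` would return on a real device) that gates a ray tracing path on the three extensions named above:

```python
# Hypothetical illustration: gate a renderer's ray tracing path on the
# Vulkan extensions named above. The list passed in stands in for what
# vkEnumerateDeviceExtensionProperties would report on a real device.
REQUIRED_RT_EXTENSIONS = {
    "VK_KHR_acceleration_structure",
    "VK_KHR_ray_tracing_pipeline",
    "VK_KHR_ray_query",
}

def ray_tracing_supported(reported_extensions):
    """Return (supported, missing) given the device's extension names."""
    missing = REQUIRED_RT_EXTENSIONS - set(reported_extensions)
    return (not missing, sorted(missing))

# Example: a device exposing the full set
ok, missing = ray_tracing_supported([
    "VK_KHR_acceleration_structure",
    "VK_KHR_ray_tracing_pipeline",
    "VK_KHR_ray_query",
    "VK_KHR_swapchain",
])
print(ok, missing)  # True []
```

On hardware that predates the extensions (or with an older driver), the same check would come back with the missing names, which is where a compute-shader fallback like the one described above would kick in.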
AMD and Nvidia provide suitable drivers: Vulkan ray tracing works with AMD graphics cards from Radeon Software Adrenalin 20.11.3 onwards, and with Nvidia GPUs from GeForce driver 460.89 (Windows) or 460.27.04 (Linux). AMD’s Linux driver has not yet received a corresponding update.
Simple DirectX 12 porting: with the help of the open-source HLSL compiler DXC, porting titles that use Microsoft’s DirectX 12 API with integrated DirectX Raytracing (DXR) to Vulkan turns out to be very simple. In addition, porting layers such as Proton can display DXR graphics effects via Vulkan.
Nvidia released patch 1.4.0 for “Quake 2 RTX” on 15 December 2020, updating the ray tracing implementation from Nvidia’s own Vulkan extension VK_NV_ray_tracing to Khronos’ final version. As a result, the first-person shooter now also runs on AMD graphics cards, though so far not particularly fast: a GeForce RTX 3090 achieves about twice the frame rates of a Radeon RX 6900 XT. Mind you, “Quake 2 RTX” has not yet been optimized for AMD’s hardware ray tracing; if anything, Nvidia has tailored the game to its own architectures.
Quake 2 RTX (vanilla presets), AMD Radeon RX 6900 XT vs. MSI GeForce RTX 3090 Gaming X Trio:
1080p: 269.7 fps vs. 515.9 fps
1440p: 89.2 fps vs. 176.5 fps
2160p: 52.8 fps vs. 106.6 fps
Measured under Windows 10 (20H2), Ryzen 9 5900X, 32 GB DDR4-3600 RAM. (mma)
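The “about twice” claim can be sanity-checked directly from those measurements; a quick sketch (Python, using the fps figures from the table):

```python
# Nvidia-vs-AMD frame rate ratios from the Quake 2 RTX measurements above:
# (RX 6900 XT fps, RTX 3090 fps) per resolution.
results = {
    "1080p": (269.7, 515.9),
    "1440p": (89.2, 176.5),
    "2160p": (52.8, 106.6),
}

for res, (amd_fps, nvidia_fps) in results.items():
    ratio = nvidia_fps / amd_fps
    print(f"{res}: RTX 3090 is {ratio:.2f}x the RX 6900 XT")
```

The ratios come out between roughly 1.9x and 2.0x at every resolution, which matches the “about twice as high frame rates” summary.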
12/7/2020 Update: We have now tested Cyberpunk 2077 benchmarks on a collection of GPUs, using a preview build. Based on our initial testing results, it looks like the official Cyberpunk 2077 system requirements target performance of 30-40 fps.
The official Cyberpunk 2077 system requirements have been updated with recommendations for ultra quality as well as ray tracing. As we expected, running at higher settings and resolutions will need a fair amount of hardware, at least on the graphics card front. After pushing the launch back to December 10 (from November 19), here’s the sort of PC hardware you’ll need to run the game in all its glory.
We’ve known that Cyberpunk 2077 will support ray tracing and DLSS 2.x for a while. It’s no surprise then that if you want the full fidelity experience, you’ll need one of the best graphics cards, something from the top of our GPU benchmarks — and by that, we mean you’ll want at least a GeForce RTX GPU. Cyberpunk also requires DirectX 12, though Windows 7 is supported as DX12 has been ported to it.
If you have the money and some luck, any of the latest GPUs should suffice. In dream territory there’s the GeForce RTX 3090, though a GeForce RTX 3080 or GeForce RTX 3070 will also do the job. The new AMD Radeon RX 6800 cards don’t show up on the list, but we’ll assume that’s because they were just launched on November 18. Unfortunately, the latest word is that supply likely won’t catch up to demand until February 2021.
We have our own thoughts on the type of hardware you’ll need as well, but let’s start with the official Cyberpunk 2077 system requirements. We’ll then move on to our own recommendations. So let’s jack into the matrix, put on a Cyberpunk 2077 Brain Dance, and get ready for the year’s most anticipated game.
Cyberpunk 2077 Minimum PC: 1080p Low
CD Projekt Red and Nvidia provided the above slide detailing all of the recommended specs. Interestingly, all mention of AMD graphics cards has been scrubbed from the list. However, the left two columns of the list basically haven’t changed since what was stated in Night City Wire episode 3 (starting around the 20:30 mark), with Nvidia apparently adding five additional columns. Let’s start with the minimum specs first:
Core i5-3570K or FX-8310
GTX 780 3GB (or RX 470 4GB)
8GB RAM
3GB VRAM
70GB storage
Windows 7 or Windows 10 64-bit
Target: 1080p Low
So, that’s your clunker standard cyberdeck that’s barely enough to get you started as a netrunner. It has a CPU from 2012, a GPU from 2013 (Nvidia) or 2016 (AMD), a modest amount of memory, and a 64-bit version of Windows. Cyberpunk 2077 requires DirectX 12 (DX12), which was backported to Windows 7 but not Windows 8/8.1.
What sort of experience will this deck get you? It says 1080p low, but it doesn’t state whether that’s for 60 fps or 30 fps. If we were to hazard a guess, it’s closer to 30 than 60.
Cyberpunk 2077 Recommended PC: 1080p High
Core i7-4790 or Ryzen 3 3200G
GTX 1060 6GB, GTX 1660 Super (or R9 Fury)
12GB RAM
6GB VRAM
70GB SSD storage
Windows 10 64-bit
Target: 1080p High
The recommended hardware for 1080p high is still pretty tame. The GPUs are moderately faster — according to our GPU benchmarks and performance hierarchy, Nvidia’s GTX 1060 6GB is about 30 percent faster than the GTX 780; meanwhile, AMD’s R9 Fury X is about 30 percent faster than the RX 570 4GB, so the vanilla R9 Fury is perhaps 35 percent faster than the minimum RX 470. Note that the R9 Fury X is also about 25 percent faster than the GTX 1060 6GB.
For newer GPUs, both the GTX 1650 Super and RX 5500 XT provide performance similar to the GTX 1060 6GB, but the 1650 cards only have 4GB VRAM, so Nvidia also suggests the GTX 1660 Super. That’s a big spread in performance, as the 1660 Super is over 40 percent faster than the 1060 6GB.
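Note that “percent faster” figures chain multiplicatively, not additively; a quick sketch of the arithmetic (the 30 and 40 percent inputs are the rounded figures quoted above):

```python
# "X% faster" claims chain multiplicatively, not additively.
def chain_speedups(*percent_faster):
    """Compound several 'percent faster' steps into one overall figure."""
    total = 1.0
    for p in percent_faster:
        total *= 1.0 + p / 100.0
    return (total - 1.0) * 100.0

# GTX 1060 6GB ~30% faster than GTX 780, GTX 1660 Super ~40% faster
# than the 1060: the 1660 Super ends up ~82% faster than the 780,
# not the 70% a naive addition would suggest.
print(f"{chain_speedups(30, 40):.0f}% faster")  # -> 82% faster
```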
On the CPU side, things are a bit weird. The Core i7-4790 is similar to the i7-4770K, and both should be substantially faster than the Ryzen 3 3200G. A Ryzen 5 1400 would have been a more sensible minimum CPU, so it feels a bit like CDPR is just pulling models out of a hat. Ryzen 3 3200G is limited to an x8 PCIe bus link, plus it’s a 4-core/4-thread CPU. It’s not going to beat a Core i7 Haswell or Devil’s Canyon in gaming performance, in other words.
While the recommended PC hardware targets 1080p high, we again don’t know if that’s 30 fps or 60 fps. The 1060 will probably land in the 35-45 fps range on average, while the GTX 1660 Super should be close to 60. Again, it’s a pretty low bar for a gaming PC.
Cyberpunk 2077 Recommended PC: 1440p Ultra
Core i7-4790 or Ryzen 3 3200G
RTX 2060 (or RX 5600 XT)
12GB RAM
6GB VRAM
70GB SSD storage
Windows 10 64-bit
Target: 1440p Ultra
Moving up to 1440p ultra recommendations, the only real change is in the GPU department. This is still without ray tracing enabled, and CD Projekt Red (or Nvidia) recommends an RTX 2060. It’s not clear if that’s with or without DLSS, but we figure an RX 5600 XT should be relatively close to the RTX 2060 if the recommendation is for without DLSS.
Cyberpunk 2077 Recommended PC: 4K Ultra
Core i7-4790 or Ryzen 3 3200G
RTX 2080 Super, RTX 3070 (or RX 6800)
16GB RAM
8GB VRAM
70GB SSD storage
Windows 10 64-bit
Target: 4K Ultra
4K ultra continues to push up the GPU ladder, and it also moves to recommending 16GB of system RAM and 8GB of VRAM. The RTX 2080 Super and RTX 3070 are recommended here, but is this with DLSS or not? We don’t know for sure. An RX 6800 from AMD should also do pretty well, we think, but we’ll have to test next month to see how they actually stack up.
Cyberpunk 2077 Recommended PC: 1080p Medium Ray Tracing
Core i7-4790 or Ryzen 3 3200G
RTX 2060 (or RX 6800)
16GB RAM
6GB VRAM
70GB SSD storage
Windows 10 64-bit
Target: 1080p RT Medium
Of course, turning on ray tracing kicks the requirements up quite a bit on the GPU front. For the medium ray tracing setting at 1080p (and almost certainly with DLSS enabled now), the CPU continues to be a bit of an oddity, and the VRAM obviously drops back to 6GB (because that’s what the 2060 has), but otherwise it’s basically the 1440p ultra non-RT spec. AMD’s RX 6800 is most likely going to be highly competitive in this range, but then it’s a $580 card going up against a $300-$350 card.
Cyberpunk 2077 Recommended PC: 1440p Ultra Ray Tracing
Core i7-6700 or Ryzen 5 3600
RTX 3070 (or RX 6800 XT?)
16GB RAM
8GB VRAM
70GB SSD storage
Windows 10 64-bit
Target: 1440p RT Ultra
1440p ultra ray tracing continues the march up the GPU ladder, with the RTX 3070 getting the primary recommendation. Again, likely with DLSS enabled, we’re looking at hardware that AMD may not actually be able to match in terms of RT performance. 1440p DLSS quality means rendering at something like 1810×1018 and upscaling, and as we showed in the RX 6800 XT review, Nvidia generally has superior ray tracing performance plus Tensor cores that AMD can’t match. If you want ray tracing at 1440p, your best bet will be the 3070 or a 2080 Ti, and maybe the RX 6800 XT will come close.
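The 1810×1018 figure quoted above corresponds to a per-axis scale factor of about 1/√2 (i.e. half the total pixels); a small sketch of that arithmetic, with the scale factor treated as an assumption inferred from the quoted numbers rather than an official DLSS specification:

```python
import math

def dlss_render_resolution(out_w, out_h, scale=1 / math.sqrt(2)):
    """Internal render resolution before DLSS upscales to the output size.

    The 1/sqrt(2) per-axis factor (half the pixel count) is an assumption
    that matches the 1810x1018 figure quoted for 1440p DLSS Quality; the
    real factor is Nvidia's choice and varies by DLSS mode.
    """
    return round(out_w * scale), round(out_h * scale)

print(dlss_render_resolution(2560, 1440))  # -> (1810, 1018)
print(dlss_render_resolution(3840, 2160))  # -> (2715, 1527)
```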
The CPU finally gets a bump up as well, to the i7-6700 or Ryzen 5 3600. The latter certainly makes a lot of sense. Can the original Skylake 4-core/8-thread i7-6700 actually keep up, though? We doubt it, unless everything is just GPU limited at this point (which it probably is). Still, we’d suggest aiming for the i7-8700 or higher on the Intel side if you want to run at maxed-out settings.
Cyberpunk 2077 Recommended PC: 4K Ultra Ray Tracing
Core i7-6700 or Ryzen 5 3600
RTX 3080 (or RTX 3090)
16GB RAM
10GB VRAM
70GB SSD storage
Windows 10 64-bit
Target: 4K RT Ultra
Finally, for 4K ultra with ray tracing enabled, only the RTX 3080 will suffice, or the RTX 3090 if you really have deep pockets and can find one. The CPU recommendations are the same as for 1440p RT Ultra, and VRAM is bumped to 10GB (because that’s what the 3080 has). Will this provide 60 fps using DLSS quality, DLSS balanced, or DLSS performance? That’s what we want to know.
Tom’s Hardware Cyberpunk 2077 Recommended PC
Given the above recommendations and trends, here’s our recommendation for a complete high-end Cyberpunk 2077 PC build. We still need to run the benchmarks, but with DLSS, even 4K should be viable.
Core i7-10700K: $378
GeForce RTX 3080: $700 (when supply improves)
NZXT Kraken X63: $149
Asus TUF Gaming Z490-Plus: $180
G.Skill Aegis 2x16GB DDR4-3200: $115
Adata XPG Gammix S5 1TB M.2 NVMe SSD: $110
Phanteks Eclipse P400A: $70
Thermaltake GF1 850W Gold: $130
TOTAL PRICE: $1,832
Obviously, that’s a lot of money for a gaming PC, but there’s a good chance you won’t need to upgrade everything just to play Cyberpunk 2077 at max settings. Mostly, getting the GeForce RTX 3080 is your best bet at being able to handle anything Night City might throw at you. Good luck finding one this side of February 2021.
In terms of performance, while we don’t know exactly how demanding Cyberpunk 2077 will be, having the fastest current GPU (that doesn’t cost over $1,000) should suffice. If you already have a 2070 Super or similar GPU, 1080p or 1440p with ray tracing should be fine as well.
Cyberpunk 2077 ‘Budget’ Ray Tracing Build
If you’re more interested in the minimum requirements to get Cyberpunk 2077 running with ray tracing support, here’s a lesser build. It’s still moderately expensive, and we can’t guarantee high framerates with all the graphics settings turned up. But it matches the CDPR / Nvidia recommended RT PC and should deliver a good gaming experience.
Ryzen 5 3600 (6-core/12-thread): $200
ASRock B550 Phantom Gaming 4/ac: $125
G.Skill Aegis 2x8GB DDR4-3200: $57
Adata XPG Gammix S5 512GB M.2 NVMe SSD: $65
Asus ROG Strix RTX 2060: $360
Corsair Carbide 175R: $60
Thermaltake 600W Gold: $74
TOTAL PRICE: $941
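Both totals (and the “about half the cost” comparison) can be checked by summing the line items; a quick sketch using the prices listed above:

```python
# Line-item prices from the two builds above, in US dollars.
high_end = {
    "Core i7-10700K": 378, "GeForce RTX 3080": 700, "NZXT Kraken X63": 149,
    "Asus TUF Gaming Z490-Plus": 180, "G.Skill Aegis 2x16GB DDR4-3200": 115,
    "Adata XPG Gammix S5 1TB": 110, "Phanteks Eclipse P400A": 70,
    "Thermaltake GF1 850W Gold": 130,
}
budget = {
    "Ryzen 5 3600": 200, "ASRock B550 Phantom Gaming 4/ac": 125,
    "G.Skill Aegis 2x8GB DDR4-3200": 57, "Adata XPG Gammix S5 512GB": 65,
    "Asus ROG Strix RTX 2060": 360, "Corsair Carbide 175R": 60,
    "Thermaltake 600W Gold": 74,
}

print(sum(high_end.values()))  # 1832
print(sum(budget.values()))    # 941
# The budget build lands at roughly 51% of the high-end build's cost.
print(round(sum(budget.values()) / sum(high_end.values()), 2))  # 0.51
```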
That’s about half the total cost, but it’s also about half the performance. It’s also half the memory and half the storage capacity — you might want to add a secondary drive or just upgrade to a 1TB SSD for $35 more. Nvidia hasn’t officially announced the RTX 3060 Ti or an RTX 3060, but both are expected to arrive in the coming month or two. Whether they’ll actually be available in sufficient quantities is another matter, but if you’re looking to buy a new GPU, we wouldn’t pay for an RTX 20-series these days.
Hopefully, this type of PC will be able to run Cyberpunk 2077 at 1080p and high settings, with ray tracing and DLSS, while still getting close to 60 fps. However, that’s only a guesstimate as we don’t actually know what actual performance will be like. We do know that adding even one ray tracing effect can drop performance quite a bit in other games, and there are four RT effects planned for Cyberpunk 2077.
Cyberpunk 2077 Graphics Card Considerations
You don’t need to buy an entire PC either, naturally. If you already have a decent PC, the main consideration for running Cyberpunk 2077 is your graphics card. You can see how performance stacks up between the various options in our full GPU benchmarks and performance hierarchy, but Nvidia’s RTX 30-series GPUs are obviously enticing, what with their superior ray tracing performance and the added benefit of DLSS.
We selected the second-fastest consumer GPU right now, the RTX 3080. The GeForce RTX 3090 might be a bit faster, but at more than double the price, we’re not going to heavily recommend it. The RTX 3080 is a beast on its own, pummeling the last-gen RTX 2080 Ti by over 30% on average at 4K, and sometimes by more in ray tracing and DLSS games.
What about AMD GPUs? Nvidia has been working with CDPR to get ray tracing effects incorporated into Cyberpunk 2077 for at least the past year, plus DLSS, so an Nvidia GPU is probably the safer bet. The RX 6800 and RX 6800 XT should also be able to do ray tracing, but probably only at 1080p, considering they’ll have to stick to native rendering.
Based on the various rendering features planned for Cyberpunk 2077, DLSS is pretty much required for a decent ray tracing experience. Here’s the rundown of what CDPR has implemented, courtesy of Nvidia’s blog post:
Ray-traced ambient occlusion – Ambient occlusion is a shading and rendering technique used to calculate how exposed each point in a scene is to ambient lighting. The result is a diffuse shading effect that darkens enclosed and sheltered areas and enhances the rendered image’s overall tone. In Cyberpunk 2077, ray-traced ambient occlusion additionally can be used with local lights to approximate local shadowing effects where shadows are missing.
Ray-traced diffuse illumination – This technique is used to capture sky radiance as well as emissive lighting from various surfaces, which is difficult to achieve with traditional rendering techniques.
Ray-traced reflections – In Cyberpunk 2077, ray-traced reflections are used on all surfaces and can trace ranges for up to several kilometers. They are present on both opaque and transparent objects to simulate the way light reflects from glossy and metal surfaces by tracing a single bounce of reflection rays against the scene. This includes smooth natural mirrors like window glass, but also rougher surfaces like brushed metal. Unlike screen space techniques which can only reflect what’s on screen, ray-traced reflections incorporate the entire scene around the character, and can accurately represent objects outside the camera view or facing away from the camera.
Ray-traced shadows – Cyberpunk 2077 preview supports directional shadows from the sun and the moon. These shadows aim to be physically accurate and even account for light scattering from clouds. Shadows may be enhanced in the final release to support other types of light sources where it is needed.
If that’s all a bit complicated, let’s sum up: At maximum quality settings, you can expect Cyberpunk 2077 to push ray tracing hardware to the limit. Many previous games have only used one of those techniques: RT shadows are in Shadow of the Tomb Raider and Call of Duty: Modern Warfare; RT reflections are used in Battlefield V, Control, and Wolfenstein Youngblood; and RT AO and diffuse lighting are used in Metro Exodus.
Combine all of those in one game, and we expect framerates to plummet. Just look at the Control and Fortnite RT benchmarks from the RX 6800 review as an example. (This is at native resolution, without DLSS.)
DLSS 2.0 will help offset that, but perhaps more important than having a GPU that can do ray tracing effects will be having a second-gen ray tracing GPU. Which brings us back to the RTX 3080 and Ampere.
The RTX 3080 is nearly double the performance of the RTX 2080, and roughly twice the ray tracing performance of Turing, thanks to improvements in the architecture. You’re basically going to need that for 4K ultra. Or for 1440p, the RTX 3070 is probably the next best option.
Cyberpunk 2077 CPU Considerations
Cyberpunk 2077 doesn’t seem to be going too high on CPU requirements, but it’s always good to have a fast processor for the smoothest experience. Average fps might not drop too much by running on an old Haswell i7 chip, but minimum fps will almost certainly suffer.
Our best recommendation for a CPU you can actually buy right now is the Core i7-10700K, which is basically just a new name on the old i9-9900K. Yes, AMD’s new Ryzen 7 5800X is likely a close match, maybe even superior in performance … but it’s not in stock. That’s also why we have the Ryzen 5 3600 (and not the Ryzen 5 5600X) as the baseline recommendation.
Games also tend to be more forgiving of older CPUs than of previous-gen GPUs, so even a CPU that’s several generations old should still be okay. As far as minimum CPU requirements go, Cyberpunk 2077 will probably still run okay even on a second-gen Core i5, FX-series AMD, or similar. Just don’t plan on a smooth 60 fps or more if you’re sporting an old CPU. Worst case, though: Give it a shot. If it doesn’t run well, you can always upgrade after the fact.
Cyberpunk 2077 System Requirements, Closing Thoughts
The main hurdle for any PC to run Cyberpunk 2077 is undoubtedly going to be the graphics card. If you’re willing to run at minimum quality and a lower resolution, or maybe enable resolution scaling, and if you’re okay with 30 fps, it will probably run just fine on whatever hardware your current gaming PC has. That’s assuming you have a GTX 970 or R9 390 or better GPU. You might even be able to go to older / slower hardware and still run the game, but no guarantees — and none of the fancy graphics effects.
But if you want to get the most out of Cyberpunk 2077, judging by what we’ve seen and the promised graphics features, we recommend at least running a 6-core CPU to hit a consistent 60 fps or more in the crowds of Night City, and perhaps even that won’t be sufficient.
One thing to keep in mind is that while Cyberpunk 2077 will be launching on PC and the next-generation PlayStation 5 and Xbox Series X consoles, it will also be available on current-gen consoles. The hardware in the PlayStation 4 and Xbox One is pretty decrepit by today’s standards, so any modest PC should be fine if you just want to run the game. It might be at 30 fps, but it should still be playable. That’s basically what CDPR seems to be aiming for.
So we’re not suggesting that you run out and buy a new PC or upgrade your existing PC in advance of the game launch. If you’re already running an Nvidia RTX graphics card, you should be okay for at least trying ray tracing. If you have an AMD, maybe the ray tracing effects won’t really be that amazing (or worth the hit to performance). Rest assured, we’re planning to run a full suite of Cyberpunk 2077 benchmarks once the game arrives. We’ll see you then.
The official integration of ray tracing into the Vulkan API brings a first novelty: Nvidia has updated Quake II RTX to support the new extensions, allowing the title to also run on the latest AMD Radeon RX 6000 GPUs.
by Manolo De Agostini, published 16 December 2020 at 10:41 in the Videogames channel
Nvidia updated Quake II RTX to version 1.4.0, introducing some improvements, in particular support for the official Vulkan ray tracing extensions: this means that even the latest video cards from AMD can play this version of the game with ray tracing effects enabled.
Quake II RTX debuted in June last year, but until today the rendering of RT effects was based on the VKRay extensions designed by Nvidia itself, created to make up for the lack of official Vulkan extensions. The Khronos Group, the consortium responsible for developing the open APIs, has now completed development and all the formal steps, introducing official ray tracing extensions into the API.
Nvidia therefore updated Quake II RTX to take advantage of the official extensions (consequently opening the title to owners of Radeon RX 6000 cards as well) and published the new GeForce Game Ready 460.89 WHQL drivers (downloadable from the Nvidia site or GeForce Experience) precisely to support the new features of the Vulkan API. Quake II RTX is available on Steam, Nvidia’s website and GitHub. The game uses path tracing, that is, it applies real-time ray tracing to all the light effects present, not just some of them.
“Bringing ray tracing to Vulkan is the fruit of a multi-year effort by many companies. Nvidia has assumed a leadership role at every stage of its evolution,” commented the US company. “We were elected to chair the subgroup dedicated to the integration of ray tracing in Vulkan, contributing with our extension to the rapid progress of the working group. We distributed drivers for the interim version of the Vulkan RT extensions in order to obtain developer feedback, and now we are the first to adopt these extensions in a game.”
“Standardization of ray tracing in Vulkan is an important step to make the technology available on a wide range of devices, as well as allowing developers to use it to their advantage,” said Andrej Zdravkovic, AMD’s senior vice president of software development. “We support all major features of this extension, including ray shading and ray queries, via our driver. We are also working with developers to deliver exceptional performance by supporting hardware ray tracing on RDNA 2 architecture-based video cards.”
Gigabyte has revealed information and details about the Aorus Master and Gaming OC, its custom designs based on the Radeon RX 6800 XT and the RX 6800. Generous dimensions and substantial cooling systems aim to make the new AMD GPUs perform at their best.
by Manolo De Agostini, published 16 December 2020 at 08:41 in the Video Cards channel
Gigabyte presented its custom video cards based on AMD’s Radeon RX 6800 XT and RX 6800. The Taiwanese company offers two lines, Aorus and Gaming, which complement the reference-design products. As for the Aorus Master models, we have two RX 6800 XT cards and an RX 6800.
In both cases the cards are equipped with a huge cooling solution, occupying 3.2 slots and featuring three fans that rotate in different directions (the two outer ones counterclockwise and the central one clockwise) to reduce turbulence. Under the fans sits a huge radiator with multiple heatpipes and a large copper GPU contact base. The whole thing makes up the cooling design dubbed Max-Covered.
The cards are also equipped with a backplate with an opening at the end that allows air to pass through, pushing the heat toward the upper part of the case so it can be captured by the case fans and expelled. Also on the back, the Aorus logo lights up and can be synchronized via RGB Fusion. To power the cards we find two 8-pin PCIe connectors.
One of the most interesting features of these cards is the LCD screen on the side, which can display GPU information (temperatures, frequencies, etc.) or be customized at will with text, images and GIFs. As for frequencies, the Gigabyte RX 6800 XT Aorus Master operates at 2065 MHz (Game Clock) and 2310 MHz (Boost Clock), while the RX 6800 Aorus Master is set to 1980 MHz (Game Clock) and 2190 MHz (Boost Clock).
The cards have two HDMI and two DisplayPort outputs, but there is also a Type-C variant of the XT model that replaces a DisplayPort with a USB Type-C port. The RX 6800 XT Aorus Master has a US list price of 899 dollars, while the 6800 model sells for 719 dollars.
As for the Gaming OC models, they use a Windforce 3X cooling system with three 11-blade fans. Here too the fans spin in alternating directions, and there are two 8-pin PCIe power connectors and an output configuration of two HDMI and two DisplayPort.
The Gigabyte RX 6800 XT Gaming OC comes with a Game Clock of 2045 MHz, rising to 2285 MHz in Boost, while the RX 6800 Gaming OC works at 1925 MHz (Game Clock) and 2155 MHz (Boost Clock). The prices indicated by Gigabyte are 849 and 649 dollars, respectively.
As I told you not long ago, custom implementations of the AMD Radeon RX 6000 series have started arriving in the lab. After seeing what the first RX 6800 implementation we received for testing looks like, today it’s time to analyze an RX 6800 XT, with a GIGABYTE flavor.
This is the GIGABYTE Radeon RX 6800 XT GAMING OC 16G, an implementation that boasts frequencies similar to Rage Mode on the reference card, and it is also equipped with an oversized cooling system that occupies 3 slots in the case.
Of course, I’ve seen such beasts in recent years, but they were generally Nvidia graphics cards. A good example is the AORUS GeForce RTX 3080 XTREME 12G, one of the most impressive versions of the RTX 3080 that I have had the opportunity to test.
Well, the GIGABYTE Radeon RX 6800 XT GAMING OC 16G is not part of the AORUS range, but it has comparable dimensions, which means that today we get to see how a truly aggressive RX 6800 XT implementation behaves.
Several ray tracing games using Nvidia’s proprietary extensions have already been released for the Vulkan API, but the Quake II RTX update made it the first game to support the official extensions.
Khronos, maintainer of multiple open APIs, has released a major update to the Vulkan API. The ray tracing extensions released last month are now supported by, among other things, the SDK (Software Development Kit).
Valve-funded LunarG has released a new version of the official Vulkan SDK, which brings full support for the fresh ray tracing extensions. At the same time, Microsoft’s open-source HLSL compiler DXC has been updated with ray tracing support for Vulkan, making it much easier to port applications that use the DirectX Raytracing API to Vulkan.
Khronos has also released new open-source example applications using Vulkan ray tracing and updated the Vulkan Guide to include information on ray tracing.
In the same context, Quake II RTX, developed by NVIDIA, was updated to support the official Vulkan extensions in addition to NVIDIA’s own. With the update, the game, which uses advanced path tracing, now also works with AMD’s Radeon RX 6000 series graphics cards and will work with Intel’s Xe-HPG gaming graphics cards due next year. PC Gamer has already run quick tests with the new version, according to which the RX 6800 XT achieves more than 60 fps at 1080p and the RTX 3080 at 1440p, but at 4K resolution 60 fps remains a distant dream for both.
The official ray tracing extensions for the Vulkan API are supported by AMD’s Radeon Software Adrenalin 20.11.3 and later drivers.
If you missed the restock of the PS5 and Xbox Series X / S at Best Buy, you have another chance at Walmart. The retailer announced that at 3PM ET on December 15th, it will have both next-gen consoles available for purchase exclusively through its website.
Visiting the retailer page for either the PlayStation 5 or Xbox Series X / S will display a message that reads, “Available online only, Dec 15 at 3:00 PM ET. While supplies last,” as seen in screenshots below.
A Walmart spokesperson told The Verge that anyone successful in purchasing either a PlayStation 5 or Xbox Series X / S will have to wait until the end of the month to get their new console(s) as the items will be available for delivery only and will arrive “after December 25th.”
Like most retailers, Walmart has exclusively sold both consoles online. Much like Nvidia’s RTX 30-series or AMD’s Radeon RX 6000 graphics cards, both the PlayStation 5 and Xbox Series X / S are some of the most popular tech products on the market right now, and they are also some of the most difficult to find in stock.
German news outlet Igor’s Lab has received some tasty information regarding Nvidia’s approaching Ampere-powered graphics cards. It would appear that the chipmaker has reshuffled the launch dates for the GeForce RTX 3080 Ti and RTX 3060. Until we receive official confirmation, take this rumor with a pinch of salt.
The GeForce RTX 3080 Ti was rumored to debut next month, probably at CES 2021. However, Igor’s sources claim that Nvidia has pushed the launch to after the Chinese New Year holidays. Therefore, the up-and-coming challenger to AMD’s Radeon RX 6900 XT won’t arrive until after February 17.
Staying true to its name, the GeForce RTX 3080 Ti is the SKU that bridges the gap between the GeForce RTX 3080 and RTX 3090. If the buzz around town is accurate, the GeForce RTX 3080 Ti may inherit the 10,496 CUDA cores of the GeForce RTX 3090, while sporting 20GB of 19.5 Gbps GDDR6X memory, just 4GB shy of the GeForce RTX 3090.
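Taken together with the 320-bit memory bus these rumors attribute to the card, the 19.5 Gbps figure implies a theoretical bandwidth of 780 GB/s; a quick sketch of the standard calculation (rumored inputs, not confirmed specs):

```python
def memory_bandwidth_gbps(data_rate_gbps, bus_width_bits):
    """Theoretical memory bandwidth in GB/s: per-pin data rate times
    bus width in bits, divided by 8 bits per byte."""
    return data_rate_gbps * bus_width_bits / 8

# Rumored RTX 3080 Ti: 19.5 Gbps GDDR6X on a 320-bit bus (both figures
# come from the rumors discussed here, not from confirmed specs).
print(memory_bandwidth_gbps(19.5, 320))  # -> 780.0 (GB/s)

# For comparison, the RTX 3080's 19 Gbps on the same 320-bit bus:
print(memory_bandwidth_gbps(19.0, 320))  # -> 760.0 (GB/s)
```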
According to Igor, the GeForce RTX 3060 will take the GeForce RTX 3080 Ti’s place at CES 2021. The GeForce RTX 3060 is rumored to be available in two variants: one with 12GB of GDDR6 memory and another with 6GB of GDDR6 memory. The first could launch around CES 2021, while the latter might not come until the end of the month.
The differences don’t stop at the memory configuration though. Igor’s sources said that the CUDA cores will also vary between the two models. In order to protect his sources, Igor didn’t share the CUDA core count for the GeForce RTX 3060.
All in all, Nvidia seems to have a couple of interesting products to take on Big Navi in the months to come, but you can bet AMD won't approach the fight with its hands down. The Radeon RX 6700 XT is the model to watch since it'll be a tough nut to crack.
The latest rumors point to an alleged slip of the GeForce RTX 3080 Ti to February. In the meantime, however, at least two other cards will arrive: two variants of the GeForce RTX 3060 that differ not only in VRAM but also in specifications.
by Manolo De Agostini, published 15 December 2020 at 16:21 in the Video Cards channel (GeForce, NVIDIA, Ampere)
According to the German website Igor's Lab, Nvidia has reportedly moved the release of the GeForce RTX 3080 Ti from January to February. The new model should offer 10,496 CUDA cores like the GeForce RTX 3090, but combined with a 320-bit bus and the same TGP as the RTX 3080, i.e. 320 W. On board we should also find 20 GB of GDDR6X memory, double that of the current RTX 3080.
As for the price, rumors point to a list price of 999 dollars before tax, the same as AMD's Radeon RX 6900 XT. If confirmed, it would be 300 dollars more than the RTX 3080 ($699) but at the same time 500 dollars less than the RTX 3090. The GeForce RTX 3080 Ti could debut around the Chinese New Year, i.e. between February 11 and 17, so we will most likely have to wait for the second half of the month.
The German site also confirmed that the GeForce RTX 3060 will come in two variants, with 12 GB or 6 GB of GDDR6 memory. The two models should also have a different number of CUDA cores, and this could create some headaches for less experienced buyers. In practice, Nvidia has reportedly decided for the moment to cancel the RTX 3050 Ti 6GB from its plans, renaming it the RTX 3060 6GB.
Consequently, we expect the arrival of a GeForce RTX 3060 12GB with a GA106 GPU equipped with 3,840 CUDA cores, while the 6GB model will only have 3,584, although it is based on the same graphics chip. Both proposals should have a 192-bit bus. It is not clear whether we will see a GeForce RTX 3050 Ti later, but it does seem certain that we will see the debut of an RTX 3050 based on a GA107 GPU with 2,304 CUDA cores, a 128-bit bus and 4GB of GDDR6 memory. There may therefore be ample room for a Ti model, perhaps based on a fully enabled GA107.
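Rumored core counts like these can be sanity-checked with simple arithmetic: consumer Ampere GPUs carry 128 FP32 CUDA cores per SM, so any plausible total must be an exact multiple of 128. A minimal sketch (the 128-cores-per-SM figure is Nvidia's published Ampere spec; the card names and counts are the rumored models above):

```python
# Sanity-check rumored Ampere CUDA core counts:
# consumer Ampere GPUs have 128 FP32 CUDA cores per SM,
# so a plausible total must divide evenly by 128.
CORES_PER_SM = 128

rumored = {
    "RTX 3090":      10496,  # 82 SMs
    "RTX 3060 12GB":  3840,  # 30 SMs
    "RTX 3060 6GB":   3584,  # 28 SMs
    "RTX 3050":       2304,  # 18 SMs
}

for card, cores in rumored.items():
    sms, rem = divmod(cores, CORES_PER_SM)
    status = f"{sms} SMs" if rem == 0 else "implausible (not a multiple of 128)"
    print(f"{card}: {cores} CUDA cores -> {status}")
```

This quick divisibility test is also how leaks are often vetted: a count such as 3,548 would be impossible, while 3,584 works out to a clean 28 SMs.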
Unlike the RTX 3080 Ti, the two RTX 3060 models should be unveiled in January, even if there is no real certainty about the actual debut date: it seems that the first model to arrive on the market will be the 12 GB one.
The premiere of Cyberpunk 2077 was not as successful as Polish studio CD Projekt RED would have wished. The biggest problem is the technical condition of the new RPG, which runs on an improved version of the engine from The Witcher 3 (Red Engine 4), especially in the versions for old-generation consoles, where the game struggles and performance suffers. The rash of bugs in Cyberpunk 2077 plagues every hardware platform, but the console editions are – rightly – the most criticized. The Digital Foundry team from the British website Eurogamer decided to look at the game running on each console. What do the graphics and optimization look like on PlayStation 4 (Pro), PS5, Xbox One (X), Series X and Series S? Things are not rosy on the older devices, to say the least…
The Digital Foundry team checked Cyberpunk 2077 on all consoles. Conclusions? The Xbox One version was the worst in the tests – both in terms of graphics and performance, while the best visual experience is provided by the game running on the Xbox Series X.
Cyberpunk 2077 PC performance test – What are the hardware requirements? AMD Radeon and NVIDIA GeForce tested
Cyberpunk 2077 has not yet received a full-fledged release on PS5 and Xbox Series X/S (that will come only next year), so the title runs there via backwards compatibility. How does it fare? On PS5 the game runs at a dynamic resolution (minimum 972p, maximum 1188p); dips typically go to the low 50s in frames per second while driving. On the Xbox Series X there are two graphics modes: performance (1080p at 60 FPS) and quality (resolution between 1512p and 1728p/1800p at a fixed 30 FPS, with increased crowd and vehicle density plus better reflection quality and ambient occlusion). Performance on the more powerful Microsoft console is a bit shakier than on the PS5, as it drops to approx. 40 frames per second during major firefights. In the quality mode, the smoothness of the animation is practically maintained. For people who prefer graphics over performance, the best choice for Cyberpunk 2077 is the Xbox Series X.
However, for many players the performance and appearance of the game on the other consoles – Xbox Series S, PlayStation 4 (Pro) and Xbox One (X) – may be more important. On the weaker representative of the new generation, the game offers a variation on the quality mode from the Xbox Series X, which translates into a resolution that sometimes falls below 1080p but reaches up to 1296p, i.e. even more than on PS5, with a practically stable 30 FPS, despite the increased number of NPCs and cars and activated graphics options like ambient occlusion.
Cyberpunk 2077 – An insightful review. We look at the Samurai under the kimono, and there… a sinusoid of ups and downs. What more?
Digital Foundry finally looked at Cyberpunk 2077 running on PlayStation 4 (Pro) and Xbox One (X), and it is hardly surprising that players were bitter – some of them decided to return the game because they felt cheated by CD Projekt RED. After all, the developers did not show the game running on the base versions of the previous-generation consoles, something they recently apologized for. The game was tested after update 1.04. On the regular PS4 the maximum resolution is 900p with drops to 720p, but graphics aside, fluidity can fall to 15–17 frames per second and often does not exceed 25. A value of 30 FPS is actually achieved mainly indoors. The situation looks better on PS4 Pro, which uses the same resolution range as the PS5, i.e. 972p–1188p depending on the on-screen action. Cyberpunk 2077 works slightly better on this console, but it also drops below 20 FPS, e.g. in crowded places or while driving around Night City.
CD Projekt RED apologizes for the console version of Cyberpunk 2077. Possible return of the game, more patches to fix the game on the way
As predicted, Cyberpunk 2077 fares worst on the base Xbox One, with a resolution of at most 810p and dips below 720p in more demanding scenes. In turn, performance while exploring Night City is not very stable – the number of frames per second can drop into the low teens. The comfort of play on the Xbox One X is better (drops to slightly below 20 frames), although the game performs worse than on PS4 Pro (e.g. 25–26 FPS vs 28–29 FPS during the scene of shooting enemies from the car). The bigger difference can be seen in image quality thanks to the higher resolution (the rest looks the same). The game runs on Xbox One X at a dynamic 1674p, with decreases down to the maximum resolution of PS4 Pro and PS5, i.e. 1188p, which is a very good result considering the current state of the game and how it behaves on other consoles. Cyberpunk 2077 will get a new update, 1.05, in the next few days, which is said to slightly improve optimization (also on PC), followed by larger patches in January and February. Hopefully, by then, the game will be brought to the state it should have been in on the release date, i.e. December 10.
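The frame-rate figures above are easier to interpret as frame times: each frame stays on screen for 1000 / FPS milliseconds, which is why a drop from 30 FPS to the mid-teens feels so jarring. A quick sketch using the rates discussed above:

```python
# Convert frames per second to frame time in milliseconds:
# frame_time_ms = 1000 / fps. The sample rates are the ones
# reported in the console performance figures above.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (60, 30, 25, 17, 15):
    print(f"{fps:>3} FPS -> {frame_time_ms(fps):5.1f} ms per frame")

# A dip from 30 FPS (~33 ms) to 15 FPS (~67 ms) doubles the time
# each frame lingers on screen, which reads as severe stutter.
```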
Hardly any other topic currently occupies the DIY PC hobbyist as much as the availability of graphics cards. Whether NVIDIA's new GeForce RTX 30 series or AMD's Radeon RX 6000 series: not only are the cards difficult to get because quantities are too low, but prices are far above what the GPU manufacturers (i.e. AMD and NVIDIA) call for the entry-level models, or above the recommended retail prices for the custom models.
However, you have to differentiate between what manufacturers do with their own online shops and what the offer on the overall market looks like. Yes, AMD and NVIDIA sell the reference versions or the Founders Editions at their own recommended retail prices. Currently, however, none of the GeForce RTX 30 Founders Edition cards is available from NVIDIA, so a card at RRP cannot be had this way. The same applies to the Radeon RX 6000 series and AMD's direct offer via AMD.com.
Manufacturers such as ASUS and EVGA have their own Web shops and there the cards are of course offered at their own RRPs, but are also currently not available. Although the manufacturers point out that small numbers of items are always available in the web shop, there can be no question of the willing buyer simply being able to purchase the card – the timing must be right in this case.
What is offered via the price search engines in "free trade" then usually amounts to a few models offered far above the manufacturers' recommended prices. Supply and demand determine the price, and with the gigantic demand this means that surcharges of 150 to 300 euros, depending on the model, are now common – regardless of whether it's a GeForce RTX 30 or Radeon RX 6000 series card.
Partners consider the RRPs to be unrealistic
In the past few days and weeks we have spoken with numerous partners of AMD and NVIDIA and came across the same assessments over and over again. According to the partners, AMD is said to have come under pressure from NVIDIA's price targets. AMD had to keep up and not only place its products on an equal footing in terms of performance, but also price them in such a way that they can be an alternative. We now know that the performance (and mostly also the price/performance ratio) is right, but NVIDIA's aggressive price specifications have evidently pushed all manufacturers (its own partners and AMD together with its partners) into a corner.
The RRPs for the smallest, entry-level models are too tight. In most cases, a manufacturer – or the entire value chain, from the manufacture of the individual components through to the sale of the finished card in stores – can no longer earn much from a card.
AMD and NVIDIA charge fixed prices for the packages consisting of GPU and memory, and the manufacturer also has little leeway with its customers. The shops are currently the big winners, because they can write almost any price on the cards – and the middleman, if there is one, will certainly also get his piece of the pie.
Improvement is not in sight
In the first quarter of 2021, a certain part of the demand should be covered – so the manufacturers promise. Then the price structure should more or less normalize again. But there is great doubt that the majority of the cards will then be available at the original MSRP. In 2021, the prices of almost all components and assemblies involved are expected to rise. This starts with the manufacture of the GPUs and extends to the memory, the PCB and many other components. The production lines will also be well utilized in 2021, so the corresponding capacities will cost more.
It will hardly be possible to lace all of this into an overall price package that still corresponds to the current specifications. Some manufacturers therefore assume that prices will remain roughly as high as they currently are, or at least will not fall as far as one would hope.
But just as the current high demand was apparently difficult to assess, the same applies to the graphics card market in 2021. At the moment, it does not look like normalization is coming.
This year saw the premiere of four Nvidia GeForce RTX 3000 series graphics cards: the RTX 3080, RTX 3090, RTX 3070 and RTX 3060 Ti (in chronological order). The last of them debuted on December 2 after a few slips intended to improve availability. It seems that a similar fate awaits the Nvidia GeForce RTX 3080 Ti, a model meant to compete directly with the Radeon RX 6900 XT (released on December 8, also priced at 999 dollars). As Igor Wallosek from the German service Igor's Lab reports, the Greens have changed their plans regarding the release date of the next top design. It was originally supposed to be released in January 2021, but this graphics card has been delayed by at least a few weeks. The GeForce RTX 3060 models are still due next month.
The Nvidia GeForce RTX 3080 Ti is expected to be released at the end of February at the earliest. In January we would receive the GeForce RTX 3060 with 12 GB of VRAM, which will later also get a second variant (6 GB).
NVIDIA GeForce RTX 3090 graphics card test – Mega (expensive) card
The Nvidia GeForce RTX 3080 Ti graphics card is expected to hit the market after the Chinese New Year celebration, which falls on February 12. People in China will return to work on February 18, so the new GPU would be released at the end of February 2021 at the earliest. Igor Wallosek noted that the delay could be even greater (taking into account the poor availability of the current models, this move would not be a surprise). According to rumors, the GeForce RTX 3080 Ti will be based on the Ampere GA102-250 chip with 10,496 CUDA cores (the same as in the RTX 3090). The card is to be equipped with 20 GB of GDDR6X memory (320-bit) with an effective clock speed of 19,000 MHz and 760 GB/s of bandwidth.
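The quoted 760 GB/s figure follows directly from the rumored bus width and effective memory clock: peak bandwidth = (bus width in bits / 8) × data rate per pin, with 19,000 MHz effective equaling 19 Gbps per pin. A quick check of the numbers:

```python
def memory_bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bytes moved per transfer
    (bus width / 8) times transfers per second per pin."""
    return bus_width_bits / 8 * data_rate_gbps

# Rumored RTX 3080 Ti: 320-bit bus, 19 Gbps GDDR6X
print(memory_bandwidth_gbps(320, 19.0))   # -> 760.0 GB/s, matching the leak
# RTX 3090 for comparison: 384-bit bus, 19.5 Gbps
print(memory_bandwidth_gbps(384, 19.5))   # -> 936.0 GB/s
```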
AIDA64's new version adds support for the NVIDIA GeForce RTX 3080 Ti, RTX 3060, RTX 3050 and mobile RTX 3000 chips
In January, we would receive at least one of the weaker models from the Ampere family. The unofficial reports from Igor's Lab coincide with what the VideoCardz portal previously suggested. The Nvidia GeForce RTX 3060 with 12 GB of GDDR6 memory is expected to be available in mid-January, around CES 2021 (January 11–14, 2021), but this is not the only card variant that will go on sale. The Greens also plan a 6GB version, which will debut a bit later – possibly at the end of January, although the Chinese source ChannelGate has revealed that this variant has been delayed, following the example of the RTX 3080 Ti. Finally, it is worth mentioning that Nvidia has decided to rename the GeForce RTX 3050 Ti to the RTX 3050 6 GB (GA106 core), so this model would probably be sold in two versions (the other one based on GA107 with 4 GB of memory). The card was also supposed to hit the market in January, but the release date has not been confirmed by Igor's Lab or VideoCardz, so there may be further slips.
As part of a live-streamed presentation, Gigabyte has shown its first own models of the Radeon RX 6800 XT and Radeon RX 6800. Unlike the competition, the manufacturer also named price recommendations for the custom designs, and they have it all: the top model Radeon RX 6800 XT Aorus Master costs 885 US dollars, the Radeon RX 6800 XT Gaming OC with a less expensive cooler 850 US dollars, and the Radeon RX 6800 (non-XT) Aorus Master 720 US dollars.
Converted and including 19 percent VAT (not included in US prices), that corresponds to just under 885, 835 and 710 euros. Gigabyte names the prices in the introduction video from minute 40:40. For comparison: AMD recommends about 650 euros for the Radeon RX 6800 XT and about 580 euros for the Radeon RX 6800.
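The euro conversion above is simple arithmetic: USD price × exchange rate × 1.19 for German VAT. A minimal sketch; the 0.825 EUR/USD rate is an assumed, illustrative figure (the article does not state which rate it used), so the results only approximate the article's euro numbers:

```python
# Reconstruct the article's euro figures: the USD price converted
# at an exchange rate, then German VAT (19%) added on top.
# EUR_PER_USD = 0.825 is an illustrative assumption, roughly the
# rate in December 2020, not a figure taken from the article.
EUR_PER_USD = 0.825
VAT = 0.19

for usd in (885, 850, 720):
    eur_incl_vat = usd * EUR_PER_USD * (1 + VAT)
    print(f"${usd} -> ~{eur_incl_vat:.0f} EUR incl. 19% VAT")
```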
Excessive prices
So far, all graphics cards of the current generation are poorly available and therefore overpriced. In addition to AMD's Radeon RX 6000 series, this also applies to Nvidia's GeForce RTX 3000. So far, however, the high prices have primarily come from retailers. Gigabyte is now the first manufacturer to openly trigger the price problem at the root.
Gigabyte Radeon RX 6800 XT Aorus Master and Gaming OC (Image: Gigabyte)
No matter how good the cooler is – for the first time Gigabyte is offering an Aorus Radeon with overlapping fans, previously reserved for GeForce graphics cards – no air cooler should be worth a surcharge of more than 200 euros. The GPU clock speeds are just 30 to 60 MHz higher.
AMD is apparently also aware of this: manager Scott Herkelman recently announced on Twitter that the reference models of the Radeon RX 6000 graphics cards will continue to be produced until further notice. Normally, the AMD versions are only available at launch, until the partner manufacturers' own creations are widely available.
Breaking down the best 4K gaming monitors we’ve tested. (Image credit: Shutterstock / Krivosheev Vitaly)
With great pixels comes great image quality. So it’s not surprising when PC gamers drool over monitors with 4K resolution. A panel packing 8.3 million pixels (3840 x 2160) makes your favorite games look incredibly sharp and realistic. In addition to being the highest resolution you can get in a good gaming monitor these days, going 4K also offers the ability to expand past 20-inch screens. With that loaded pixel army, you can stretch your screen size well past 30 inches without having pixels so big that you can see them. And the new graphics cards from Nvidia’s RTX 30-series and AMD’s Radeon RX 6000-series make the move to 4K even more tempting.
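The "8.3 million pixels" figure and the claim about stretching past 30 inches both fall out of resolution arithmetic: total pixels = width × height, and pixel density in PPI = diagonal pixel count / diagonal size in inches. A quick sketch (the 27-, 32- and 43-inch sizes are illustrative examples, not figures from the guide):

```python
import math

W, H = 3840, 2160  # 4K UHD

# Total pixel count: width x height
total = W * H
print(f"{total:,} pixels")  # -> 8,294,400, i.e. ~8.3 million

# Pixel density: diagonal resolution divided by diagonal size
diag_px = math.hypot(W, H)
for inches in (27, 32, 43):
    ppi = diag_px / inches
    print(f'{inches}" 4K panel: {ppi:.0f} PPI')
```

Even at 43 inches a 4K panel stays above 100 PPI, which is why individual pixels remain hard to pick out at normal viewing distances.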
But that image quality comes at a steep price. Anyone who’s shopped for a 4K monitor before knows they’re not cheap. Yes, 4K is about high-res gaming, but you’re still going to want solid gaming specs, like a 60Hz-plus refresh rate, low response time and your choice of Adaptive-Sync (Nvidia G-Sync or AMD FreeSync, depending on your system’s graphics card). And you can’t forget the cost of the decently beefy graphics card you’ll require to game properly in 4K.
If you’re not 4K-ready yet, check out lower-resolution recommendations on our Best Gaming Monitors page.
If you’re ready to dive into high-res gaming (lucky you), below are the best 4K gaming monitors of 2020, based on our own testing.
Best 4K gaming monitors at a glance:
1. LG 27GN950-B
2. Asus ROG Strix XG27UQ
3. Acer Predator XB273K
4. Asus TUF Gaming VG289Q
5. Asus ROG Swift PG27UQ
6. Acer Predator X27
7. Acer Predator CG437K
8. Acer ConceptD CP7271K
9. Asus ROG Swift PG43UQ
10. Alienware AW5520QF
11. HP Omen X 65 Emperium
When seeking the best 4K gaming monitor for you, consider the following:
4K gaming requires a high-end graphics card. If you’re not using an Nvidia SLI or AMD Crossfire multi-graphics card setup, you’ll want at least a GTX 1070 Ti or RX Vega 64 for games at medium settings or an RTX-series card or Radeon VII for high or greater settings. Visit our Graphics Card Buying Guide for help.
G-Sync or FreeSync? A monitor’s G-Sync feature will only work with PCs using an Nvidia graphics card. FreeSync will only run with PCs carrying an AMD card. Only FreeSync monitors work over HDMI (for more, see our DisplayPort vs. HDMI analysis), but we’ve seen negligible differences in mainstream gaming capabilities for fighting screen tearing between the two. Our Nvidia G-Sync vs. AMD FreeSync article offers an in-depth performance comparison.
4K and HDR go hand-in-hand. 4K displays often support HDR content for extra bright and colorful images. But for Adaptive-Sync optimized for HDR media, you’ll want a G-Sync Ultimate or FreeSync Premium Pro (formerly FreeSync 2 HDR) display. For a noticeable upgrade from an SDR monitor, opt for at least 600 nits brightness. You can learn more in our article on HDR’s meaning and our buying guiding for picking the best HDR monitor.
For more guidance picking a monitor of any resolution–gaming or otherwise–check out our PC Monitor Buying Guide.
With speed, accurate color and high contrast, the LG 27GN950-B is the best 4K gaming monitor and our top recommendation. There's tough competition on this page, but the 27GN950-B stands out with some of the best input lag scores we've seen from a 144Hz monitor (tying with the Asus ROG Strix XG27UQ below) while also keeping up with its rivals in our response time testing.
Image quality is also a sight to behold. With an edge-array backlight featuring local dimming, the 27GN950-B doesn't quite hit FALD-level HDR but still delivered stellar performance with an 8,475.3:1 contrast ratio. LG also implemented its Nano IPS panel, the answer to Samsung's Quantum Dot tech, to achieve massive color coverage (94.5% of DCI-P3 and 133.9% of sRGB after our recommended calibration) that really made games pop.
The Asus ROG Strix XG27UQ is the best 144Hz 4K gaming monitor and may be cheaper than you expect. Even though it competes specs and performance-wise with the Acer Predator X27 and Asus ROG Swift PG27UQ, the ultimate in 4K gaming, the ROG Strix XG27UQ should be much more affordable. We’ve seen it listed for $800, but, sadly, as of writing we’re only seeing it in stock at $1,099. It doesn’t have the premium FALD backlight that results in beautiful HDR; however, HDR performance was still impressive, with thanks due to an effective edge-array backlight and Dynamic Dimming feature.
The ROG Strix XG27UQ stacked up well in our testing when it came to both response time and input lag. In the input lag test, it outperformed other 144Hz monitors, including the aforementioned X27 and PG27UQ. And while it’s listed as a FreeSync monitor, we were able to run G-Sync on it successfully.
If you prefer the viewing angles and color of IPS monitors, the Acer Predator XB273K is the best 4K gaming monitor for you. It’s a tough competitor with a 144 Hz refresh rate. During fast-paced games with settings maxed, there was no blur. G-Sync worked successfully–with both standard and HDR content– to fight screen tearing when paired with an Nvidia graphics card. The monitor kept up well with other 144Hz displays during our testing and even beat the Asus ROG Swift PG27UQ and Acer Predator X27 when it came to input lag.
Of course, image quality is also important. The Predator delivers with pro-level color accuracy and contrast that reached over 4,000:1 during our testing and over 2,000:1 after our calibration. Again, HDR doesn't look as good as it does on the Asus ROG Swift PG27UQ or Acer Predator X27, because those two displays pack FALD backlights. But we consider the Predator XB273K the next best thing.
You don’t often see the word budget associated with a 4K monitor, but the Asus TUF Gaming VG289Q isn’t just affordable, it’s a great gaming monitor too. Despite being available for $330 – $350 as of writing, the display offers a great amount of performance, making it a fantastic value for gamers looking to get to 4K without breaking the bank. We’ve even seen 4K monitors at the $400 mark offer lesser gaming performance.
There was no ghosting when we gamed on the VG289Q, and overdrive successfully helped eliminate motion blur. SDR titles looked extra colorful, but there was hardly any improvement when moving over to HDR games.
With the VG289Q priced so low, it's not surprising that its refresh rate is limited to just 60 Hz (FreeSync works down to 48 Hz). Hardcore gamers will want more Hz, but casual players can make do with fast-paced scenes showing sufficient detail and great pixel density.
The Asus ROG Swift PG27UQ is the best 4K gaming monitor for enjoying HDR. When it comes to mouthwatering HDR delivery, nothing can beat a full-array WLED backlight with zone dimming. We were able to tell the difference in HDR games, like Call of Duty: WWII, through detailed shadows, brilliant highlights and realistic-looking textures.
Of course, this is also a great monitor for competitive gaming, thanks to its high 120Hz refresh rate at 4K resolution that can climb to 144Hz with overclock. The PG27UQ’s closest rival is the Acer Predator X27, which has that same type of backlight that makes HDR look its best. But the PG27UQ has a small leg up on the X27 with a 1ms faster response time.
For more on picking an HDR monitor and additional recommendations, see our How to Choose the Best HDR Monitor buying guide.
The Acer Predator X27 offers near-identical performance to the Asus ROG Swift PG27UQ above. The Asus edges the Acer out spec-wise with a 1ms shorter response time, but in our testing we found the monitors comparable in both gaming capabilities and SDR and HDR quality. On the other hand, the Predator X27 showed a slight edge over the PG27UQ in out-of-box color accuracy and comes with a light-blocking hood. If you’re stuck between the two, your best bet is likely to opt for the one currently selling at the lower price.
Like the PG27UQ, the Predator X27 has a FALD backlight with zone dimming that produces mouth-watering HDR. Its insane gaming specs ensured tear-free gaming at high frame rate during our testing. If you’ve got the graphics horsepower to make the most of it, this monitor sits atop the 4K gaming displays list.
If you're looking for a 4K gaming monitor that's big but still able to fit on your desk, the Acer Predator CG437K is a great fit (pun intended). In our time with the monitor, we found it big enough to fill our peripheral vision from 3-4 feet away. Plus, you get a remote in the box. Ultimately, the Predator CG437K is like having an ultra-wide without the curve but with the extra height you crave.
Its gaming credentials are out of this world too. The Predator CG437K comes with G-Sync Compatibility from 48-120 Hz. The monitor can also hit 144 Hz with overclock –but only if you have two DisplayPort cables and don’t need G-Sync or HDR. With a big-screen VA panel boasting 4,000:1 contrast, games looked lifelike and proved a different experience than when using other premium displays, including the 4K Acer Predator X27 and 3440 x 1440 Acer Predator X35.
Pros: HDR with a FALD backlight at 1,000 nits brightness and G-Sync Ultimate. Cons: no portrait mode and no selectable color gamuts.
The lines separating the monitor needs of gamers and professionals keep blurring. Besides, there's nothing wrong with a photo editor wanting to game during their free time, right? The Acer ConceptD CP7271K is the best 4K gaming monitor for professionals because it boasts impressive gaming specs coupled with accurate color space coverage.
Creative professionals can get work done with the monitor’s 110% coverage of the DCI-P3 color space, although the very meticulous will find that to be slightly too colorful. You can, however, reduce color with a software look-up table. You also get great HDR output with a FALD backlight that reaches 1,000 nits.
At the same time, the ConceptD CP7271K offers gamers accurate sRGB coverage (96.3%), as well as powerful performance that kept up with 144 Hz gaming monitors in our response time and input lag benchmarking.
We loved the Asus ROG Swift PG43UQ when we first saw it in June, but it’s been hard to find online ever since. We reviewed it at $1,500 but have seen it sell for more. If you can find this massive screen at the right price, you won’t be disappointed.
At 43 inches, the ROG Swift PG43UQ is a juggernaut that makes for a great TV replacement – it even has a remote. From a 4-foot distance, it lends itself to a highly immersive experience that rivals a curved ultra-wide. And with DisplayHDR 1000 certification, HDR movies pack a punch. Games looked incredibly realistic and warm in HDR and also natural and vibrant in SDR.
Most importantly, the PG43UQ is specced for high-performance gaming. Response time competed well against other 144 Hz screens in our benchmarks, and input lag was better than the Acer Predator CG437K, Asus ROG Swift PG27UQ and Acer Predator X27 above.
If you have a lot of room in your budget and are seeking the best-looking 4K experience, it doesn’t get better than the Alienware AW5520QF, the first real OLED gaming monitor. With its unprecedented technology, it delivered the best image quality we’ve ever seen, boasting immeasurable blacks and, therefore, theoretically unlimited contrast.
But the Alienware OLED still isn't perfect. Max brightness with regular SDR content is just 130 nits, while HDR only bumps it up to 400 nits. That means its potential is best realized in a darker room. But keep in mind that with its large size, 150 nits with SDR would've been acceptable, so the Alienware is just a little off. Of course, as our resident splurge, the AW5520QF is also expensive – even by OLED TV standards. And for better audio, consider the HP Omen X 65 Emperium, which features a built-in soundbar.
If you're a couch gamer, you need a monitor that's fit for replacing your TV. With a 64.5-inch display, the HP Omen X 65 Emperium is amply equipped to do just that. This juggernaut of a gaming monitor offers larger-than-life gaming. In testing, performance matched its high price tag with zero gaming hiccups and high frame rates at high settings.
HP also included some unique bonuses that make this monitor even more fitting for the living room. An included soundbar featuring four 4-inch woofers, two 1-inch tweeters and two passive radiators adds to the feeling of immersion. The monitor also comes with Nvidia's Shield Android-based streaming interface, which means gaming, TV and movie-streaming options are built right into your gaming display. A remote completes the living room package.
And if you’re into HDR, the Omen X 65 Emperium would make a great HDR display with high contrast, according to our tests, and certification to play HDR at a minimum brightness of 1,000 nits.
More: HP Omen X 65 Emperium review
MORE: Best Gaming Monitors
MORE: How We Test Monitors
MORE: All Monitor Content
MORE: HDMI vs. DisplayPort: Which Is Better For Gaming?