Biostar has announced the new B550M-Silver motherboard for gamers, mainstream users and content creators. The motherboard aims to offer high performance at an accessible price point.
As its name suggests, Biostar's new offering features a black PCB with passive heatsinks finished in silver to accentuate the design. Powered by AMD's B550 chipset, the B550M-Silver comes in a standard microATX form factor with native support for Ryzen 5000 (Vermeer) processors. The motherboard is armed with four DDR4 memory slots that support speeds above DDR4-4400 and a maximum capacity of 128GB of memory.
High-speed storage options on the B550M-Silver consist of one speedy M.2 PCIe 4.0 x4 slot and a standard M.2 PCIe 3.0 x4 slot. Both accept SSDs up to 80mm long, whether they're SATA- or PCIe-based, though you'd want to use the latter to exploit the interfaces' full performance. The motherboard also lands with six conventional SATA III connectors.
In terms of expansion, the B550M-Silver supplies a single PCIe 4.0 x16 slot for AMD's Radeon RX 6000 (Big Navi) or Nvidia's GeForce RTX 30-series (Ampere) graphics cards. There's also a PCIe 3.0 x16 slot that runs at x4 and a PCIe 3.0 x1 slot for connecting less bandwidth-hungry devices.
The B550M-Silver doesn't suffer from slow Internet connectivity either. The motherboard has a 2.5 Gigabit Ethernet port, which is based on the Realtek RTL8125B controller. There is support for Wi-Fi 6 too, but you'll have to buy your own wireless card to get that feature.
In case you plan to use the B550M-Silver with a compatible APU, the motherboard puts one DVI-D port, one HDMI port and one DisplayPort output at your disposal. The motherboard also comes with a PS/2 keyboard and mouse combo port, one USB 3.2 Gen 2 Type-C port, one USB 3.2 Gen 2 port, four USB 3.2 Gen 1 ports and two USB 2.0 ports.
The Realtek ALC1150 codec takes care of the audio workloads on the B550M-Silver. The motherboard has three standard 3.5mm audio jacks, but supports 7.1-channel audio.
Biostar didn’t reveal when the B550M-Silver will be available or how much it’ll cost.
We had the opportunity to run some 8K tests with the RTX 3090 in order to verify Nvidia's claim that it is the world's first gaming GPU capable of playing at that resolution. We also included the RTX 3080 and the RX 6900 XT in the tests.
by Manolo De Agostini, published December 2020
Upon presentation of the GeForce RTX 3090, Nvidia spoke of it as a multipurpose video card, equipped with a lot of video memory to manage particularly complex renders and, at the same time, with higher specifications than the RTX 3080 for gaming. We have already observed how the performance difference with the RTX 3080, even in 4K, does not justify the purchase; however, there was an open question to verify, namely whether the RTX 3090, as stated by Nvidia, truly represents "the world's first gaming GPU to play in 8K". We carried out some tests to draw the often blurred boundary between marketing and reality.
Before we get to the numbers, remember that 8K resolution (7680 x 4320 pixels) requires the GPU to push 16 times the number of pixels of 1080p (1920 x 1080 pixels), so a large amount of VRAM is required, as well as a very, very powerful GPU. The GeForce RTX 3080, with its 10 GB of memory, cannot handle 8K in every situation (and we will see this later). For a video card to drive an 8K TV or monitor at that resolution over a single cable, an HDMI 2.1 connection is also required, a feature that both the GeForce cards and the latest-generation Radeons offer.
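To make the arithmetic concrete, here is a quick sketch of the pixel counts involved and the raw size of a single 8K framebuffer (our own back-of-the-envelope figures, not from the original article):

```python
# Pixel-count math behind the paragraph above.
resolutions = {
    "1080p": (1920, 1080),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
print(pixels["8K"] / pixels["1080p"])       # 16.0 -> 8K pushes 16x the pixels of 1080p

# One 8K frame at 32 bits (4 bytes) per pixel, ignoring every other buffer a
# renderer actually needs (depth, G-buffers, textures, ...):
frame_mib = pixels["8K"] * 4 / 2**20
print(f"{frame_mib:.0f} MiB per frame")     # ~127 MiB
```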
But all this is not enough. The GA102 GPU on board the RTX 3090, however powerful with its 10,496 CUDA cores, 82 second-generation RT cores and 328 third-generation Tensor cores, cannot always manage 8K with native rendering, especially with details set to the maximum and, above all, with ray tracing enabled. This is where DLSS comes into play (more details here), a technology that Nvidia has fine-tuned and that supports the performance of the GeForce RTX cards when ray tracing is activated.
Together with the RTX 3090, Nvidia introduced "Ultra Performance Mode", a further branch of DLSS designed specifically for 8K. In practice, and simplifying a lot, DLSS performs a sort of upscaling from 1440p to 8K, a nine-times boost, more than the four times provided by the "simple" Performance Mode. Image quality remains high, but the performance cost is significantly lower than native rendering.
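For reference, here is a small sketch of the render resolutions each DLSS mode implies at an 8K output. The per-axis scale factors are the commonly cited ones (an assumption on our part; this article only states the 9x and 4x pixel ratios):

```python
# Commonly cited per-axis DLSS render scales (assumed, see note above).
scales = {
    "Quality":           2 / 3,
    "Balanced":          0.58,
    "Performance":       1 / 2,
    "Ultra Performance": 1 / 3,
}

out_w, out_h = 7680, 4320                    # 8K output
for mode, s in scales.items():
    w, h = round(out_w * s), round(out_h * s)
    ratio = (out_w * out_h) / (w * h)
    print(f"{mode:<17} renders {w} x {h} (~{ratio:.1f}x fewer pixels)")
# Ultra Performance -> 2560 x 1440, the 9x pixel ratio mentioned above.
```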
Control, Death Stranding and Wolfenstein: Youngblood are among the games that support Ultra Performance Mode, so we ran tests with these and other titles using both the RTX 3090 and the RTX 3080 – in some cases, where possible, we also included the Radeon RX 6900 XT. For the test we used a Samsung Q950TS, a 65-inch 8K QLED TV, to check the behavior of Nvidia's video cards at that resolution. Below are the game detail settings:
Control: DX12, high quality, high ray tracing, DLSS enabled, render resolution 1440p (for the test with DLSS), final resolution 8K (in the "no DLSS" test the rendering resolution equals the final resolution)
Wolfenstein: Youngblood: Vulkan, high quality, ray-traced reflections active, DLSS Ultra Performance (for testing with DLSS, otherwise off), built-in Riverside benchmark
Doom Eternal: Vulkan, Ultra, resolution scaling disabled, HDR active
Metro Exodus: Ultra details, Ultra ray tracing, DLSS On (Off in the “no DLSS” test)
DiRT 5: 8K resolution locked, Ultra details
Shadow of the Tomb Raider: Maximum detail setting, RTX Ultra, DLSS On (Off for “no DLSS” test)
Death Stranding: DX12, high details, TAA, medium model details, DLSS Ultra Performance (for DLSS testing, otherwise off)
Our tests tell us that in titles with Ultra Performance Mode, the GeForce RTX 3090 offers almost playable performance. There is also the case of Doom Eternal, a game that focuses on responsiveness and speed (based on the Vulkan API), in which native 8K rendering lets the card approach the fateful 30 fps. Death Stranding and Wolfenstein: Youngblood deliver playable performance even on the RTX 3080 when the new DLSS mode is used.
Regarding the AMD card, which is not advertised in any way for 8K, we must report that in some cases we were unable to run the tests (for example Tomb Raider or Control) due to sudden crashes or 1-2 fps performance. This also applies to the RTX 3080: in some cases we saw crashes, and in Wolfenstein and Control we were told that VRAM had run out. In general the RX 6900 XT performs worse than the RTX 3090, with the exception of DiRT 5, a title particularly favorable to AMD GPUs, where it is slightly faster (although it is really more of a draw).
Nvidia, therefore, does not lie when it talks about a GPU capable of handling 8K, but we need to add a qualifier, namely "with DLSS". There are titles that can come close to 60 fps even with native rendering, especially by lowering the level of detail, but they must be rather light or dated games. That said, seeing some titles reach 30 fps bodes well for future generations of GPUs, DLSS or not. In any case, it is further proof of how this generation of GPUs represents a clear step forward over previous products. In conclusion, even if marketing does its job by advertising the card for 8K as well, this promise should certainly not guide your purchase, even with an eye to the future: by the time this resolution becomes popular, and years will pass, the GeForce RTX 3090 will already be a technological relic.
Last week Cyberpunk 2077 finally made its debut; it currently looks best on powerful computers with NVIDIA GeForce RTX cards (thanks to ray tracing). Good rasterization performance in Full HD and Quad HD is also achieved by the AMD Radeon RX 6000 cards, which have just made their debut. The situation is completely different on Xbox One and PlayStation 4. The quality of this release was shrouded in mystery before the premiere, because CD Projekt RED was not eager to show it. When something was finally shown, it was only on Xbox One X, Xbox Series X (backwards compatible), PlayStation 4 Pro and PlayStation 5 (also backwards compatible). After the premiere, unfortunately, it turned out that the console editions are disastrous in terms of technical refinement (or rather, the lack of it). CD Projekt RED has apologized to console players in an official announcement.
CD Projekt RED apologizes to console players for the quality of Cyberpunk 2077 on Xbox One and PlayStation 4 devices. The developers promise to help with possible refunds; moreover, more patches are coming soon to improve the questionable quality of the game on consoles.
Frequent game freezes, crashes and automatic exits to the console's main menu – this is what Cyberpunk 2077 actually looks like today on the cheaper Sony and Microsoft devices. Patch 1.04 was released before the weekend (on Friday for PlayStation, Saturday for Xbox) to improve game stability and change the HDR implementation (the one before version 1.04 was fit for the trash). Things improved a bit – the image became a little sharper and HDR worked better – but the game is still far from a state that would satisfy players. Over the weekend there were already reports that Sony and Microsoft are refunding players' money for digital copies of Cyberpunk 2077 without much trouble. Today, CD Projekt RED officially apologized to console players for the game's terrible technical condition. The studio promised to help with returns of both digital and boxed editions – in the latter case, it suggests first contacting the store where the game was purchased.
On this occasion, CD Projekt RED announced that it is working hard to improve the technical side of the game on all target platforms. This week (maybe even tomorrow) patch 1.05 will be released, which will further improve the stability of the title and fix bugs that prevent completing some quests – both story and side quests. In the near future, the developer intends to publish two major updates – Patch #1 and Patch #2 – which are to fix most of the bugs encountered in the game. Patch #1 will be released in January 2021, Patch #2 in February. In the meantime, we can also count on smaller patches that improve the stability of Cyberpunk 2077. In the first quarter of 2021, an update prepared for PlayStation 5 and Xbox Series X/S should also be made available.
The same German store that leaked the ASUS TUF Gaming A17 with the AMD Ryzen 7 5800H processor has revealed three new ASUS models with the low-power Ryzen 5 5500U and Ryzen 7 5700U variants of AMD's new processor family. Although the listing with the 5800H has already been deleted, these three still seem to be online:
ASUS S553UA-BQ048T with AMD Ryzen 7 5700U
ASUS S746UA-AU059T with AMD Ryzen 7 5700U
ASUS TM420UA-EC004T with AMD Ryzen 5 5500U
We can see that the AMD Ryzen 5 5500U has 6 cores at a 2.1 GHz base and 4.0 GHz boost, while the AMD Ryzen 7 5700U offers 8 cores at a 1.8 GHz base and 4.3 GHz boost. Both models will, at least according to the leaked data, be based on the previous-generation Zen 2 architecture, so they would in principle be rebadged versions of the AMD Ryzen 7 4800U and Ryzen 5 4600U.
AMD Ryzen 5000U series (leaked specifications):

| Processor | Architecture | Cores / Threads | Base clock | Boost clock | GPU | L3 cache | TDP |
| --- | --- | --- | --- | --- | --- | --- | --- |
| AMD Ryzen 7 5800U | Zen 3 | 8 / 16 | 2.0 GHz | 4.4 GHz | 8 CU @ 2.0 GHz | 16 MB | 12 – 30 W |
| AMD Ryzen 7 5700U | Zen 2 | 8 / 16 | 1.8 GHz | 4.3 GHz | 8 CU @ 1.9 GHz | 8 MB | 11 – 25 W |
| AMD Ryzen 5 5600U | Zen 3 | 6 / 12 | 2.3 GHz | 4.2 GHz | 7 CU @ 1.8 GHz | 16 MB | 10 – 30 W |
| AMD Ryzen 5 5500U | Zen 2 | 6 / 12 | 2.1 GHz | 4.0 GHz | 7 CU @ 1.6 GHz | 8 MB | 10 – 25 W |
| AMD Ryzen 3 5400U | Zen 3 | 4 / 8 | 2.6 GHz | 4.0 GHz | 6 CU @ 1.6 GHz | 8 MB | 11 – 25 W |
| AMD Ryzen 3 5300U | Zen 2 | 4 / 8 | 2.6 GHz | 3.85 GHz | 6 CU @ 1.5 GHz | 4 MB | 10 – 30 W |
Before they are removed, we leave you the screenshots and their complete specifications:
One of the fastest 4K monitors at 144 Hz, the LG 27GN950-B delivers accurate color and excellent HDR for cheaper than a top-of-the-line FALD display.
For
Local dimming produces VA-like contrast
Excellent HDR
Wide and accurate color gamut
Fast and responsive
FreeSync & G-Sync Compatible
Against
Oversaturated sRGB mode
Features and Specifications
The pinnacle of desktop PC monitor resolution is 4K (3840 x 2160 pixels), at least for now and the foreseeable future. Even at 32 inches diagonal, pixel density is enough to hide any trace of the image's pixel structure. A 27-inch 4K monitor sports a densely packed 163 pixels per inch, which means you can sit super close and never see the dots.
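The 163 ppi figure follows directly from the panel geometry; a quick sketch of the math:

```python
import math

# Pixel density of a 27-inch 16:9 panel at 3840 x 2160.
w_px, h_px, diag_in = 3840, 2160, 27.0
diag_px = math.hypot(w_px, h_px)       # diagonal length in pixels
print(f"{diag_px / diag_in:.0f} ppi")  # ~163 ppi, as quoted above
```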
What does this mean for those seeking the best gaming monitor? High resolution is great but has a significant impact on frame rates. It takes one of the top cards on our GPU benchmarks hierarchy to drive 8.3 million pixels faster than 100 frames per second (fps). That fact, and the high price of speedy 4K monitors, has made this a small and exclusive club.
LG adds to this short list of choices with the LG 27GN950-B ($800 as of writing). It combines a 144 Hz refresh rate with AMD FreeSync Premium Pro and G-Sync Compatibility, DisplayHDR 600 and a wide color gamut. Is it a worthy competitor to the leaders on our list of the Best 4K Gaming Monitors? Can it take on the standard-bearing Acer Predator X27 and Asus ROG Swift PG27UQ? Let's take a look.
LG 27GN950-B Specifications
| Specification | Detail |
| --- | --- |
| Panel Type & Backlight | IPS / W-LED, edge array |
| Screen Size & Aspect Ratio | 27 inches / 16:9 |
| Max Resolution & Refresh | 3840 x 2160 @ 144 Hz; FreeSync: 48-144 Hz; G-Sync Compatible |
| Native Color Depth & Gamut | 10-bit (8-bit+FRC) / P3; HDR10, DisplayHDR 600 |
| Response Time (GTG) | 1 ms |
| Brightness (mfr) | 400 nits SDR; 600 nits HDR |
| Contrast (mfr) | 1,000:1 |
| Speakers | None |
| Video Inputs | 1x DisplayPort 1.4 (DSC); 2x HDMI 2.0 |
| Audio | 3.5mm headphone output |
| USB 3.0 | 1x up, 2x down |
| Power Consumption | 30.4 W, brightness @ 200 nits |
| Panel Dimensions (WxHxD w/base) | 23.9 x 18.1-22.5 x 11.5 inches (607 x 460-572 x 292 mm) |
| Panel Thickness | 2.1 inches (53 mm) |
| Bezel Width | Top/sides: 0.2 inch (5 mm); Bottom: 0.4 inch (10 mm) |
| Weight | 16.9 pounds (7.7 kg) |
| Warranty | 3 years |
The 27GN950-B’s price of entry is the first thing we noticed. At this writing, LG is selling it for $800. That undercuts the aforementioned Asus and Acer monitors significantly. The principal reason for this is its backlight. Rather than the full-array local-dimming (FALD) units used by Asus and Acer, LG employs an edge-lit backlight but offers a local dimming feature of its own. By selectively dimming the individual LEDs, it achieves HDR quality that comes close to its FALD cousins.
The panel is dubbed “Nano IPS” and that refers to the IPS panel’s sub-pixel structure. Its goal is to widen the color gamut, much like the Quantum Dot technology used by Samsung and others. LG achieves that goal with a measured 95% coverage of the DCI-P3 color space. The backlight also delivers over 600 nits in HDR mode, enough to earn a VESA DisplayHDR 600 certification (see our article on how to choose the best HDR monitor).
The big story here is the 27GN950-B’s refresh rate. 144 Hz 4K monitors are rare and have serious bandwidth requirements. To run at full honk, you’ll need a graphics card (likely one of the best graphics cards) with DisplayPort 1.4 capability because LG uses Display Stream Compression to get all those pixels over a single cable. Definitely check your graphics card specs before pulling the trigger on this monitor.
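A rough back-of-the-envelope check (ours, ignoring blanking overhead) shows why DSC is unavoidable: 4K at 144 Hz with 10-bit color already exceeds the roughly 25.92 Gbit/s payload of a four-lane DisplayPort 1.4 HBR3 link.

```python
# Uncompressed video data rate for 4K 144 Hz at 10 bits per channel.
active_px   = 3840 * 2160
refresh_hz  = 144
bits_per_px = 30                       # 10-bit RGB

raw_gbps = active_px * refresh_hz * bits_per_px / 1e9
print(f"{raw_gbps:.1f} Gbit/s")        # ~35.8 Gbit/s, before blanking overhead

hbr3_payload_gbps = 25.92              # 4 lanes x 8.1 Gbit/s, after 8b/10b coding
print(raw_gbps > hbr3_payload_gbps)    # True -> needs Display Stream Compression
```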
If you have the hardware, the 27GN950-B can deliver 144 Hz with HDR and FreeSync Premium Pro over DisplayPort. That means it includes Low Framerate Compensation for speeds below 48 Hz. It also carries G-Sync Compatibility certification from Nvidia with the same capabilities.
Assembly and Accessories for LG 27GN950-B
After mating the base and upright with two captive bolts, the 27GN950-B’s panel snaps in place. You’ll find a tiny clip in the box for cable management, along with wires for DisplayPort, HDMI and USB. The power supply is an external brick. Given the monitor’s bandwidth requirements, LG recommends using the supplied cables to run 4K resolution at 144 Hz.
LG 27GN950-B Product 360
The LG 27GN950-B has the thinnest bezel we’ve seen to date, just 5mm at the top and sides and 10mm on the bottom. While not truly frameless, it’s closer to that goal than any other monitor we know of.
Styling is angular with corners rounded just enough to not be sharp. Finishes are the usual matte black in brushed and pebble styles. The base features red trim on the bottom with LG and G-Sync logos in proud view.
In back is a swoopy version of the LG logo, along with a ring of RGB LEDs around the upright's attachment point. You can remove the stand to use the 100mm VESA mount and still enjoy the lights. You control the RGB with a dial on the bottom-center. Pressing it turns them on. Then, you can change the effect by scrolling through six options. You can also coordinate the effects with sound and on-screen content by using LG's downloadable Ultragear Control Center app.
The stand offers a 4.4-inch height adjustment with 5/15 degree tilt and a portrait mode. There is no swivel function. Movements are very solid and feel like a premium monitor should.
LG’s trademark rear-facing input panel makes it super easy to see what you’re plugging in. There are two HDMI 2.0 ports and one DisplayPort 1.4 (see DisplayPort vs HDMI: which is better for gaming?). Also included is USB 3.0, one upstream and two down. A 3.5mm audio jack supports headphones or powered speakers. There are no speakers built into the 27GN950-B.
OSD Features of LG 27GN950-B
The 27GN950-B’s OSD and power are controlled by a tiny joystick found in the bottom center of the panel. You can click it to the sides for volume control and fore/aft to adjust brightness. Press it, and you get a quick menu with input selection, game mode, power and the full menu.
Game Mode refers to the monitor's eight picture presets. Gamer 1 is the default and accurate enough to be used without calibration if the P3 gamut is your goal. sRGB is included but only shrinks the gamut slightly, with over 130% coverage. If you'd like to apply the look of HDR to SDR content, HDR Effect can do this. It preserves highlight and shadow detail well, but its look is a matter of personal preference. You also get two Calibration modes, which require LG's True Color Pro software to unlock.
Game Adjust has an Adaptive-Sync toggle, Black Stabilizer for brighter shadows, three-level overdrive (Fast is the best setting) and a selection of aiming points.
In the Gamer 1 preset, all the LG 27GN950-B’s calibration controls are available. There are four gamma presets, three color temps, plus a custom mode, a method of selection by Kelvin value and a six-color hue and saturation menu. All adjustments are precise with fine resolution and excellent accuracy. Minor changes will produce a very accurate monitor.
Though it has an edge backlight, the LG 27GN950-B features local dimming. It has a small impact on SDR quality but a big one in HDR mode, offering deep contrast, true blacks, bright highlights and, therefore, some of the best HDR performance we've seen among mainstream monitors.
LG 27GN950-B Calibration Settings
The 27GN950-B doesn’t require calibration, but there are small gains available with a few tweaks. The best gamma preset is Mode 4, and that requires a single click of the red slider to bring grayscale tracking to an error-free state.
Color is always in the P3 gamut, regardless of picture mode. Selecting sRGB reduces saturation a little but not enough to qualify as a true sRGB monitor. We also lowered Contrast a little to improve gamma and fix a red clipping issue at 100% brightness. Note that any change to gamma requires recalibration of the white point.
Here are the settings we used for SDR content:

| Setting | Value |
| --- | --- |
| Picture Mode | Gamer 1 |
| Brightness (200 nits) | 26 |
| Brightness (120 nits) | 7 |
| Brightness (100 nits) | 3 (min. 89 nits) |
| Contrast | 68 |
| Gamma | Mode 4 |
| Color Temp | Custom: Red 51, Green 50, Blue 50 |
HDR locks out all image controls except brightness.
Gaming and Hands-on with the LG 27GN950-B
The 27GN950-B offers a few different dynamic contrast options. You can just use it as it comes from the box in Gamer 1 mode with Local Dimming turned off, which looks quite good. The monitor has similar contrast to other IPS monitors but a bit more color saturation, thanks to its wide P3 gamut coverage. With the backlight set for 200 nits brightness, we could work all day without fatigue on documents and graphic editing.
With Local Dimming on, contrast doubled, and it shows. Color looked more vivid, with blacks appearing deeper. Highlight and shadow detail was also typically preserved without clipping. A few times, we saw it working when a small bright object appeared against a black background. Then, it was possible to see a vertical band where the backlight was stronger than the surrounding area. It’s like the halo effect sometimes seen in FALD displays but the halo extends the full height of the screen. This was a rare occurrence.
This is one of the very few monitors that looks good running Windows productivity apps in HDR mode. Color is a little hazy, but contrast is very good. And if it looks too harsh, you can turn down the brightness. The LG 27GN950-B is one of the few monitors we’ve seen that lets you dial down the light level in HDR mode. An alternative to this is the HDR Effect mode. It’s searingly bright at its default brightness setting of 100. But turn the backlight down to 26 (around 200 nits) and it’s more palatable. Users have several good, useful options here.
SDR gaming was much the same experience. Tomb Raider always looks great when there’s extra color available. The LG 27GN950-B has plenty to offer. We stuck with the Gamer 1 mode and Local Dimming turned on. Contrast was superb, and color was extremely enjoyable. Motion processing was flawless as we played at around 110 fps.
The GeForce GTX 1080 Ti in our test system can't do 4K at 144 Hz because its DisplayPort 1.4 output lacks Display Stream Compression. But the LG 27GN950-B ran fine at 120 Hz, and we never hit 120 fps in the game with details maxed. When we played on a Radeon RX 5700 XT, the full 144 Hz became available, as this card supports DSC over DisplayPort 1.4. In both cases, though, Adaptive-Sync worked perfectly.
Turning to Call of Duty: WWII and its HDR capability, we enjoyed the effect immensely. Highlights popped brightly while shadows looked dark and detailed. If you’re shopping for an HDR monitor, the 27GN950-B is a great alternative to a more expensive FALD display. We played with HDR and 144 Hz on our RX 5700 XT-based system, though in-game speeds never exceeded 110 fps. The same was true of the GTX 1080 Ti platform. You’ll need a stout graphics board (or two) to hit 144 Hz at 4K.
Last week I told you about the excellent AMD Radeon RX 6800 XT and AMD Radeon RX 6800, two graphics cards through which AMD managed to return to competition in the high-end segment. This is due to a combination of factors, chief among them the optimizations at the architectural level, the use of a mature manufacturing process, the integration of technologies whose usefulness has been proven in the processor segment (Infinity Cache) and, why not, access to R&D funds significantly higher than 2-3 years ago.
Of course, the list of novelties doesn't stop there, with AMD implementing a number of new or significantly improved technologies, such as Radeon Anti-Lag, Radeon Boost, FidelityFX Super Resolution, Variable Rate Shading and Smart Access Memory. We will discuss the latter today. You will probably wonder why we did not treat this topic from the beginning, and why we dedicate an entire article to it. Well, the answer simply takes time.
You see, when I developed the new methodology for testing video cards, I tested 11 graphics cards, ten of them models from the previous Nvidia and AMD generations, the eleventh being, obviously, the RTX 3080. At that time I was moving, for the first time in our lab's history, from an Intel platform (Core i7-8700K), which I had been using for two years, to an AMD platform, the Ryzen 9 3900XT being at the time among the best AMD processors.
Since then, we have tested 4 new models (RTX 3090, RTX 3070, RX 6800 XT and RX 6800), as well as 9 partner implementations of RTX 3000 series graphics cards. Basically, in two months we tested 13 graphics cards, all on the new test platform based on the AMD Ryzen 9 3900XT.
Well, Smart Access Memory is a technology that relies on the new AMD Ryzen 5000 series processors, launched two weeks ago, together with an X570 chipset motherboard and an AMD Radeon RX 6000 series graphics card. Smart Access Memory gives the processor the ability to access the entire amount of VRAM, where it normally has access to only a 256 MB window.
However, although AMD first implemented this feature on its RX 6000 series graphics cards, the technology itself has been present in the PCI-E specifications for some time, and most motherboards already expose the two necessary options (Above 4G Decoding and Resizable BAR Support) in the BIOS. As a result, we may soon see this feature used by any CPU/GPU combination, whether we're talking about AMD, Nvidia or Intel.
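On Linux, you can check whether a GPU already exposes the PCIe Resizable BAR capability by scanning the verbose lspci output. A minimal sketch, assuming pciutils is installed and that your lspci build prints the "Resizable BAR" capability string (run as root so the capability list is readable):

```python
import subprocess

def gpus_with_resizable_bar():
    """Return the lspci headline of every VGA device advertising Resizable BAR."""
    out = subprocess.run(["lspci", "-vv"], capture_output=True, text=True).stdout
    found = []
    for block in out.split("\n\n"):                  # one block per PCI device
        if not block:
            continue
        headline = block.splitlines()[0]
        if "VGA compatible controller" in headline and "Resizable BAR" in block:
            found.append(headline)
    return found

if __name__ == "__main__":
    for dev in gpus_with_resizable_bar():
        print(dev)
```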
Until then, however, in order to benefit from this technology you need an AMD Ryzen 5000 series processor. As a result, to test this function we had to use such a processor, which means we had to test not only the RX 6800 XT, with and without SAM activated, but also its main competitors, i.e. the RTX 3080 and RTX 3090, thus also preparing for the launch of the AMD Radeon RX 6900 XT. In our case this meant 5 more sets of tests, in addition to the 3 prepared for the launch of the RX 6800 XT, which was not possible in the time available between receiving the samples and the launch. Moreover, we changed the test methodology for DXR games, retesting 5 Nvidia boards without DLSS and testing 3 configurations for AMD (6800, 6800 XT, 6800 XT Rage Mode), which also consumed time.
Last but not least, although we can't wait to move to a new methodology based on an AMD Ryzen 5000 series processor, this will not happen this year, because new models will be launched by Nvidia and AMD, custom RX 6800 XT and RX 6800 models will appear, and a whole series of new games has arrived, so we may replace 1-2 games in the current methodology with new titles. After most of the major releases (3060 Ti, 6900 XT) have taken place, we will retest all the AMD Radeon RX 6000 and Nvidia GeForce RTX 3000 reference cards on an AMD Ryzen 5000 series processor, add the new games where appropriate, and develop the video card testing methodology for 2021.
Resident Evil VII: Biohazard appeared at the beginning of 2017. In the months after the premiere, players realized that we would not get the "eight" soon: publishing priority was given to two remakes from the cult trilogy, Resident Evil 2 and 3. Today, nothing stands in the way of talking about Resident Evil 8 as a game that is just around the corner. The first hints about the production appeared this summer, when Resident Evil 8: Village made its debut (apparently by mistake) in the catalogue of the German game store GamesOnly. At that time we also learned the first few details about the game, but only now can we talk specifics: a trailer has appeared on the web.
A long trailer for Resident Evil: Village has appeared on the web. It acquaints us with the plot, the characters and the mountain setting of the game. The premiere is due in 2021.
As the Resident Evil Village trailer indicates, it will be a direct continuation of Resident Evil 7. We will again play as the well-known Ethan, although we will get a completely different setting, namely a village located in mountainous terrain. So it seems that in moments free from fear and fleeing, we will also be able to admire the picturesque landscapes; after all, they look really nice in the trailer. As you may know, a month ago Capcom's servers were hacked…
… For this reason, quite a lot of spoiler-rich content has already appeared on the web. On forums and social media you can find detailed descriptions of the plot, events and character models. I deliberately do not include them here so as not to spoil anyone's fun, but they are very easy to find. Meanwhile, I leave you with the trailer, which should sufficiently whet your appetite for the game. It will probably appear in April next year on PC, PlayStation 5 and Xbox Series X.
João Silva
AIDA64's latest beta release notes show that the diagnostic tool has received support for a plethora of unannounced Nvidia GeForce RTX 30 series cards, including the RTX 3080 Ti, the RTX 3060, the RTX 3050, and multiple RTX 3070/3060 mobile graphics cards.
As 2021 approaches, the release of the remaining Nvidia RTX 30 series lineup is imminent. With the rest of the SKUs expected to release in early 2021, FinalWire has added support for the remaining RTX 30 series desktop cards and for some RTX 30 series mobile GPUs, as noted by @KOMACHI_ENSAKA.
The official release notes for AIDA64 Extreme 6.30.5523 beta appear to be based on a PCI ID repository, meaning the GPU information in the application might be based on rumours. Nonetheless, this beta version of AIDA64 Extreme carries "GPU information" for the RTX 3080 Ti, the RTX 3060, the RTX 3050, the RTX 3060 Mobile/Max-Q, and the RTX 3070 Mobile/Max-Q and its respective 16GB variant.
The full release notes of the AIDA64 Extreme version 6.30.5523 beta are:
Computer / Sensor / improved dynamic refresh when number of lines changes
Motherboard / SPD / improved picking of XMP profile with the highest clock
GPU information for AMD Radeon RX 6900 XT (Navi 21)
GPU information for nVIDIA GeForce RTX 3050 (GA107)
GPU information for nVIDIA GeForce RTX 3060 (GA106)
GPU information for nVIDIA GeForce RTX 3060 Mobile / Max-Q (GA106M)
GPU information for nVIDIA GeForce RTX 3070 Mobile (GA104M)
GPU information for nVIDIA GeForce RTX 3070 Mobile 16GB (GA104M)
GPU information for nVIDIA GeForce RTX 3070 Mobile / Max-Q (GA104M)
GPU information for nVIDIA GeForce RTX 3080 Ti (GA102)
GPU information for nVIDIA Quadro T500 (TU117GL)
GPU information for nVIDIA RTX A40 (GA102GL)
sensor support for ITE IT8638 sensor chip
sensor support for Dell SMI of Vostro 3501, Vostro 3671, Vostro 3888
motherboard specific sensor info for Asus Prime X299 Edition 30, Vanguard B85
motherboard specific sensor info for MSI MS-7C67
KitGuru says: Are you planning on acquiring any of the new upcoming RTX 30 series desktop cards, or even an RTX 30 series powered laptop in 2021?
AMD Radeon Pro W5500 Professional Graphics Card Review
We look at AMD’s W5500 workstation GPU – how does it compare to Quadro P2200?
James Morris
AMD has been providing stiff competition for Intel in the workstation CPU market, with the Ryzen and Ryzen Threadripper now delivering a lot more performance for the money for professional content creators. But the company’s pro-grade graphics haven’t been giving NVIDIA such a hard time. The Radeon Pro WX 8200 posed a short-lived challenge to the Quadro P4000, until the RTX 4000 came along, and then the Radeon Pro W5700 arrived to take over the battle, with only some success. Now, we have its little brother, the Radeon Pro W5500.
The W5500 is another AMD graphics card based on the 7nm process and the Navi GPU family, like the W5700. However, it uses a more recent spin of Navi, the Navi 14 RDNA GPU. It has only 22 compute units and 1,408 stream processors, compared to the W5700's 36 and 2,304 respectively. However, it still comes with 8GB of GDDR6 memory, so it supports resolutions up to 8K, although the interface is merely 128-bit, halving the bandwidth.
But the W5500 is also less than half the price of the W5700, so it is not competing against the market leading NVIDIA Quadro RTX 4000. In fact, NVIDIA doesn’t have an RTX card aimed at this price level, so the Quadro P2200 from the Pascal GPU generation is the main opposition for almost exactly the same money. This gives the W5500 a clear window of opportunity, because if it can deliver better performance, it also has more frame buffer (the P2200 only has 5GB), so it could be a budget contender for professional content creation. Let’s find out how it fares.
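As a sanity check on the table below, the memory bandwidth figures fall straight out of bus width and effective data rate; the arithmetic here is ours, not KitGuru's:

```python
# bandwidth (GB/s) = bus width in bytes x effective transfer rate
def mem_bandwidth_gb_s(bus_bits: int, effective_mt_s: int) -> float:
    return bus_bits / 8 * effective_mt_s * 1e6 / 1e9

print(mem_bandwidth_gb_s(256, 14_000))   # W5700: 448.0 GB/s
print(mem_bandwidth_gb_s(128, 14_000))   # W5500: 224.0 GB/s
print(mem_bandwidth_gb_s(160, 10_000))   # P2200: 200.0 GB/s (listed as 200.2)
```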
| GPU | AMD Radeon Pro W5700 | AMD Radeon Pro W5500 | NVIDIA Quadro P2200 |
| --- | --- | --- | --- |
| Compute Units / SMs | 36 | 22 | 10 |
| Stream Processors / CUDA cores | 2,304 | 1,408 | 1,280 |
| GPU Architecture / Variant | Navi 10 XL | Navi 14 PRO XL | Pascal GP106 |
| Base Clock | 1,183 MHz | 1,354 MHz | 1,000 MHz |
| GPU Boost Clock | 1,930 MHz | 1,855 MHz | 1,493 MHz |
| Total Video Memory | 8 GB GDDR6 | 8 GB GDDR6 | 5 GB GDDR5X |
| Memory Clock (Effective) | 1,750 (14,000) MHz | 1,750 (14,000) MHz | 1,251 (10,000) MHz |
| Memory Bandwidth | 448 GB/s | 224 GB/s | 200.2 GB/s |
| Bus Width | 256-bit | 128-bit | 160-bit |
| Manufacturing Process | 7 nm | 7 nm | 16 nm |
| TDP | 205 W | 125 W | 75 W |
| Display Outputs | 5x Mini-DisplayPort 1.4, USB-C | 4x DisplayPort 1.4 | 4x DisplayPort 1.4 |
| Display Resolution (all at 60 Hz) | 5 @ 1920x1080; 5 @ 3840x2160; 5 @ 5120x2880; 1 @ 7680x4320 | 4 @ 1920x1080; 4 @ 3840x2160; 4 @ 5120x2880; 1 @ 7680x4320 | 4 @ 1920x1080; 4 @ 4096x2160; 4 @ 5120x2880 |
| Software API Support | DirectX 12, OpenGL 4.6, OpenCL 2.1, Vulkan 1.2 | DirectX 12, OpenGL 4.6, OpenCL 2.1, Vulkan 1.2 | DirectX 12, OpenGL 4.6, OpenCL 1.2, Vulkan 1.2, CUDA 6.1 |
AMD Radeon Pro W5500 Retail Price: £404.30
On 28 October AMD announced the Radeon RX 6000 series, an announcement that simply seemed too good to be true. This was followed by the presentation of the first two models, the RX 6800 and RX 6800 XT, and by the launch review, published on 18 November. Reviews of partner implementations should have followed on 25 November, but in Romania they were completely missing, both from the editorial offices of the profile sites and from the stores.
Because, as 2020 has taught us, even things that really do seem too beautiful to be true, such as AMD's return to competition at the highest level, can sometimes come with footnotes. The footnote in this case reads: contains extraordinarily strong products, equivalent in performance to Nvidia's RTX 3000 series. Warning – side effects may include extremely limited availability and out-of-control pricing.
Unlike Nvidia, AMD kept its word, and the reference RX 6800 XT and RX 6800 implementations were sold at the communicated price. However, their availability was extremely low, and partner implementations appeared in stores late and at prices much higher than what we saw for the reference cards. Moreover, the RX 6900 XT has not appeared in stores in Romania so far, not even as a reference implementation.
In other words, the situation is not much different from what we see with Nvidia – beautiful products, but completely missing from stores. And for the parallel to be complete, starting today we will discuss partner implementations, as we did for Nvidia graphics cards. How to put it… if you still can't find the cards in stores, at least you know what you're missing out on.
To get an idea of Nvidia's new GPU family, we got hold of representatives of the GeForce RTX 3070 (RRP: 500 Euro), RTX 3080 (RRP: 700 Euro) and RTX 3090 (RRP: 1,500 Euro). The TUF Gaming GeForce RTX 3070 OC comes from Asus and, with its three fans and long cooler, makes an impressive appearance. A slide switch toggles between quiet operation in Quiet mode and more cooling plus a bit more 3D power in Performance mode.
MSI contributed the factory-overclocked GeForce RTX 3080 Gaming X Trio 10G, a three-slot monster with three eight-pin PCIe connectors. The GeForce RTX 3090 is represented by Nvidia's in-house Founders Edition, which we bought in stores. It also occupies three full PCIe expansion slots. Further features are the twelve-pin power connector, with which Nvidia wants to replace the PCIe power plug (an adapter for two eight-pin plugs is included), and the cooling system with two fans: one blows onto the card from above, the other pushes the cooling air from below through the open heatsink towards the CPU.
AMD is represented by a Sapphire Radeon RX 5700 XT, which plays a good one and a half price and performance classes lower. The Radeon RX 6800 reached us too late for this article.
ASRock has introduced a second mini PC series with AMD's Ryzen 4000U mobile processors, alias Renoir, intended for end customers. At 26 mm, the Mars 4000U models are flatter than the 4X4 BOX-4000, but thanks to the wider housing they still have space for a 2.5-inch drive – a fast PCI Express SSD (PCIe 3.0 x4) can also be installed in an M.2 slot.
ASRock installs all Renoir CPUs in the Mars 4000U PCs, from the quad-core Ryzen 3 4300U to the eight-core Ryzen 7 4800U with 16 threads thanks to simultaneous multithreading (SMT), each including an integrated Radeon Vega graphics unit. A radial fan keeps the processor at the right temperature. There is also room for up to 64 GB of DDR4-3200 RAM in the form of two SO-DIMMs in dual channel. An M.2 E-key slot optionally houses a WLAN card such as Intel's AX200 for Wi-Fi 6.
Few display connections

Compared to the 4X4 BOX-4000, ASRock makes significant cuts to the ports: the manufacturer does without USB 3.2 Gen 2 Type-C with 10 Gbit/s transfers and DisplayPort Alt Mode. Instead there are five USB 3.2 Gen 1 ports at 5 Gbit/s, including one Type-C, and two USB 2.0 Type-A. As the Type-C port does not support Alt Mode, users can only connect monitors via one HDMI 2.0 and one D-Sub output. That is enough for one Ultra HD and one Full HD display at 60 Hz.
Realtek's ALC233 sound chip for connecting a headset, Gigabit Ethernet and an SD card reader round off the equipment.
Manufacturers usually sell mini PCs like the Mars 4000U as barebone systems without memory. ASRock does not comment on availability or prices in the press release. Most recently, the 4X4 BOX-4000 mini PCs disappeared from German retail again, and competitor models such as Asus' PN50 with Ryzen 4000U are also poorly available. A look at the notebook market suggests that AMD cannot deliver enough mobile processors.
Now that AMD's Radeon RX 6800 series finally gives NVIDIA's GeForce RTX 3000 series an opponent, we can for the first time compare the two directly in a game with the associated ray tracing effects. Call of Duty: Black Ops Cold War is the title in question; it uses the DXR interface to calculate shadows and ambient lighting.
In addition to pure ray tracing performance, we will also look at the FPS fundamentally possible on some current cards. The focus is of course on the new series from AMD and NVIDIA.
Call of Duty: Black Ops Cold War was developed by Treyarch and Raven Software. It uses the in-house "IW engine", which carried version number 3.0 for the original Call of Duty: Black Ops; whether and to what extent it has been improved since is not known. The predecessor also used ray tracing to calculate shadows and ambient lighting. The reflections otherwise found everywhere are still absent here; instead, the developers rely on classic screen space reflections.
Let’s first take a look at the settings:
Here the rendering resolution and output resolution can be set separately, so downscaling or upscaling is possible directly in the game. V-Sync can be deactivated both in-game and in the menu. There are also different texture and model quality levels and an HD texture and model pack, which must be downloaded directly from the Battle.net client. Reflections are realized using screen space reflections.
Light and shadow quality can also be selected in several stages without ray tracing acceleration. Sun shadows, local shadows (or better, contact shadows) and ambient occlusion, i.e. the lighting situation of the scene, are calculated using ray tracing. We will now examine some of the settings in more detail with regard to their influence on performance.
We ran the first tests with a GeForce RTX 3080 Founders Edition and a Radeon RX 6800 XT on the graphics card test system and looked at what influence the quality settings have on performance. The gallery above shows the corresponding display quality.
Call of Duty: Black Ops Cold War – Quality Presets
3,840 x 2,160 – ray tracing off
[Benchmark chart: frames per second, more is better]
If you only change the texture and model quality, performance for both cards drops by only 10% from the smallest to the largest setting. These settings are not necessarily decisive for performance, but they have a major influence on quality, so it is advisable to choose the higher settings here.
DXR-Raytracing
For the ray tracing settings you can choose between Off, Medium, High and Ultra. The following gallery is intended to show the differences:
Since the differences are hard to see, we have made some direct comparisons:
When making comparisons, pay particular attention to areas that would otherwise be displayed much too brightly and are only darkened by ray tracing. Static shadow maps cannot cover every detail, so the dynamic ray tracing calculation enables a much more realistic representation.
Contact shadows of objects lying on a table or hanging on a wall are also much clearer and more realistic. They are calculated depending on the angle of incidence and the distance from the light source, and are not static. All screenshots for the ray tracing comparison can be downloaded here in full resolution and uncompressed for a better view.
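To illustrate the principle, here is a toy sketch of a ray-traced shadow test (entirely ours, in no way the game's actual code): from a surface point, cast a ray toward the light and check whether an occluder intersects it first.

```python
import math

def ray_hits_sphere(origin, direction, center, radius):
    # Nearest t > 0 with |origin + t*direction - center| = radius.
    # `direction` must be unit length, so the quadratic's a-term is 1.
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 1e-6 else None

def in_shadow(point, light_pos, occluder_center, occluder_radius):
    to_light = [l - p for l, p in zip(light_pos, point)]
    dist = math.sqrt(sum(v * v for v in to_light))
    direction = [v / dist for v in to_light]
    t = ray_hits_sphere(point, direction, occluder_center, occluder_radius)
    return t is not None and t < dist        # blocked before reaching the light

print(in_shadow((0, 0, 0), (0, 10, 0), (0, 5, 0), 1.0))  # True: sphere blocks
print(in_shadow((3, 0, 0), (0, 10, 0), (0, 5, 0), 1.0))  # False: ray passes by
```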
The influence of the DXR settings on performance is as follows:
Call of Duty: Black Ops Cold War – DXR-Presets
3,840 x 2,160 – Ultra
[Benchmark chart: frames per second, more is better]
Bear in mind that Call of Duty: Black Ops Cold War was created in cooperation with NVIDIA, so an optimization took place here that, in case of doubt, did not happen for AMD hardware.
The NVIDIA hardware loses around a quarter to a third of its performance with DXR enabled. With AMD hardware, we are talking about a halving of the FPS in the worst case. A GeForce RTX 3000 card at full DXR settings is even faster than a Radeon RX 6800 XT at the lowest DXR settings.
DLSS
Deep Learning Super Sampling is available exclusively on NVIDIA GPUs. The following gallery shows the differences in rendering without DLSS and with the Quality, Balanced, Performance and Ultra Performance modes.
Between the "Quality" and "Balanced" options there are no recognizable differences in the presentation. Only from "Performance" do you see slightly reduced quality, and this becomes clearer with "Ultra Performance". Uncompressed and in full resolution, you can find the screenshots in this archive.
The influence on the performance is as follows:
With DLSS set to "Quality" we already gain 50% in FPS. For "Balanced" we are talking about 80%, with "Performance" the frame rate can already double, and with "Ultra Performance" the displayed FPS are more than doubled.
Ampere vs. Big Navi
Now let's look at the performance of the graphics cards with AMD and NVIDIA GPUs, with and without ray tracing as well as with DLSS (NVIDIA only), and for three resolutions get a glimpse of the performance the cards offer.
Call of Duty: Black Ops Cold War
1,920 x 1,080 pixels – Ultra – DXR: High – DLSS: Balanced
[Benchmark chart: frames per second, more is better]
Call of Duty: Black Ops Cold War
2,560 x 1,440 pixels – Ultra – DXR: High – DLSS: Balanced
[Benchmark chart: frames per second, more is better]
Call of Duty: Black Ops Cold War
3,840 x 2,160 pixels – Ultra – DXR: High – DLSS: Balanced
[Benchmark chart: frames per second, more is better]
The GeForce RTX 3090 is unsurprisingly the fastest card across all resolutions. The Radeon RX 6800 XT beats the GeForce RTX 3080 at least in 1080p and 1440p; in 4K the NVIDIA model is ahead again. A Radeon RX 6800 is always faster than a GeForce RTX 3070. As soon as ray tracing comes into play, the Radeon RX cards drop sharply, sometimes falling far behind their GeForce competitors. DLSS only really comes into play from 1440p and can bring the performance of the GeForce cards with ray tracing effects back to the level reached without them. For 4K output this means more than 100 FPS for the GeForce RTX 3080 and GeForce RTX 3090.
Assessment
Call of Duty: Black Ops Cold War is and remains an action fireworks display. Where the story takes place doesn't really matter; it's always about completing a specific mission objective so that, at the end of the campaign, the big bad guy can't start a world war or a global pandemic. But the content shouldn't be the focus of this article; instead, it was about the technology.
Although the game already looks very good without the RTX effects, its hardware hunger stays within limits: 4K is possible without problems even on a previous-generation GeForce RTX card. However, one should distinguish between single player and multiplayer here; multiplayer mode is usually a bit more demanding. A lot of performance can be gained via the graphics settings without the visuals falling behind too much. The graphics engine knows how to convince from a technical and visual point of view.
The ray tracing effects are less noticeable than in games that use them to calculate reflections; instead, shadows and the lighting situation are computed. On the other hand, effects in this form contribute more to immersion – at least more than some reflections. The benchmarks speak for themselves and allow an assessment of performance at the various settings. If you want to enable ray tracing, you will generally be better off with an NVIDIA GPU, where missing FPS can be compensated with DLSS. The Radeon RX 6800 and Radeon RX 6800 XT, however, are also fast and absolutely on eye level without ray tracing.
Anyone who thought a GeForce RTX 3080 or GeForce RTX 3090, whether Founders Edition or one of the custom models, was difficult to obtain had not yet experienced yesterday's "sales start" of the custom Radeon RX 6800 and Radeon RX 6800 XT models. That was a little more than 24 hours ago, and we immediately tested the ASUS ROG Strix LC Radeon RX 6800 XT OC Edition, a model with AiO cooling.
We published a commentary on the topic a few days ago, because the reference versions were, and still are, difficult or even impossible to obtain. Nothing has changed in this situation to this day, and for the custom models of the Radeon RX 6800 and Radeon RX 6800 XT, yesterday's start looked really bad. While at the launches of the past few weeks you could still get lucky and grab a card, the first custom Big Navi models were practically not available at all, and if so, only in quantities of less than a handful.
Our "AMD RDNA2 availability and chatter" thread is bursting at the seams. Of course, heated discussions flare up here too about whether one can even speak of a launch, since virtually no cards were available. Prices of over 1,000 Euro for a custom Radeon RX 6800 XT model are, of course, far too high, and yet some cards were sold at these prices over the virtual counter.
AMD and NVIDIA naturally defend themselves against talk of a paper launch: as long as a certain number of items have been brought to market, there can be no question of a paper launch – according to AMD and NVIDIA. We would contradict that. NVIDIA spoke of having delivered similar quantities to the market at the start of the first two models as in previous launches; for the GeForce RTX 3070, thanks to a two-week delay, even larger quantities could be offered at launch. But even this was not enough. At least NVIDIA confirms high unit numbers, even if they did not suffice.
This should not look any different at AMD – in theory. Scott Herkelman, Vice President & General Manager of the Graphics Business Unit at AMD, said in an interview:
"You don't have to wait long for your delivery. (...) For a few weeks now, we have been shipping our own ASICs every day so that they can be installed on the custom designs. (...)"
AMD, then, claims to have prepared its supply chain, but here too demand came as a surprise. Now the situation for graphics cards with Big Navi GPUs is even more dire than it was, and in some cases still is, for the GeForce RTX 3000 cards. The big question is whether AMD and NVIDIA could have been better prepared. However, it is not possible to react quickly to such an onslaught.
From the actual GPUs to all other components in the supply chain, the manufacturers are limited in number and capacity. It is not possible to simply ramp up production by a factor of two. In addition, the production capacities in Asia are already being pushed to their limits – there is not much more leeway here.
But we still see the manufacturers, and thus AMD and NVIDIA, as having a duty here to act better and, above all, to communicate better. Nothing annoys a (potential) customer more than being left without feedback, not knowing whether his order has gone through or how long he will have to wait.
This is not a pleasant situation for us as tech press and article authors either. We present you tests of products that are hardly available, if at all. The price information we provide is either no longer valid when the test is published or changes every few minutes – depending on which shop has found a box and charges an almost arbitrary surcharge. An assessment based on price is therefore currently not possible for us.
For many, it means still waiting
So if you want to get hold of one of the new cards – whether Founders Edition, AMD reference design or one of the many custom designs from the various manufacturers – at a market-appropriate price, you have to be patient or very lucky. At the moment it looks like the current situation will continue into next year. While the situation with the GeForce cards has already eased somewhat, it will take some time for the cards with AMD GPUs – despite all the assurances of the companies involved, which of course are trying to get as many cards as possible to customers.
By the way: AMD and NVIDIA probably benefit the least from the high, or rather excessive, prices. The shops and middlemen adjust prices to demand and can apparently charge whatever they want at the moment.
Availability notes for graphics cards and processors
Our forum moderator @ralle_h has started a small project that queries shop APIs for the availability of the new graphics cards and processors. You can find the threads here:
GeForce RTX 3000 series availability notes
Radeon RX 6000 series availability notes
AMD Ryzen 5000 series availability notes
Please note that all information is provided without guarantee! The data from a shop's API does not necessarily represent availability in the shop's front end. Some shops cache their article pages, so the shop API may mark an article as "available" while it is still displayed as "not available" in the shop frontend; you may have to wait until the page is regenerated or the cache is cleared. At Saturn and Mediamarkt it sometimes takes 20-40 minutes before the article pages update and reflect the availability reported by the API; using the wish list may be faster.
But it can also happen that a shop is, or was, immediately overrun by purchase bots, and the available items sold out almost instantly.
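For the curious, a stock checker in the spirit of that project might be sketched as below. The endpoint URL and JSON fields are entirely hypothetical, not those of any real shop API:

```python
import time
import requests

# Hypothetical endpoint; real shops each need their own URL and response parsing.
SHOP_API = "https://shop.example.com/api/products/{sku}/availability"

def poll(sku: str, interval_s: int = 60) -> None:
    while True:
        resp = requests.get(SHOP_API.format(sku=sku), timeout=10)
        resp.raise_for_status()
        data = resp.json()
        if data.get("available"):                 # field name is an assumption
            print(f"{sku}: available at {data.get('price', '?')}")
            return
        time.sleep(interval_s)                    # be polite, don't hammer the shop

if __name__ == "__main__":
    poll("RX6800XT-REF")
```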
ASRock has announced its thinnest mini PC to date with AMD's Ryzen 4000U processors: the Mars 4000U series.
We already saw a few months ago what plans the manufacturer had for these low-power processors, when it implemented them in its 4X4 BOX-4000 mini PC. Now ASRock wants to integrate them into an even smaller format, just 0.7 liters in volume: the ASRock Mars 4000U series mini PC. These barebones come with support for the AMD Ryzen 3 4300U, Ryzen 5 4500U and 4600U, Ryzen 7 4700U and Ryzen 7 4800U processors. Up to 64 GB of RAM can be added through two slots, as well as an M.2 2280 PCIe SSD and a 2.5-inch SATA drive, although you will have to choose one of the latter two, as they overlap.
They retain a total of seven USB ports (one of them USB-C) and incorporate an SD card reader. Likewise, we find a Gigabit RJ-45 port and a slot for the WiFi + Bluetooth module in the usual M.2 format. As image outputs, the ASRock Mars 4000U series offers one HDMI and one D-Sub, while for sound we have the usual audio output and microphone input.
| Model | Cores / Threads | TDP | Boost / Base (GHz) | Graphics | GPU Cores | L2+L3 Cache (MB) |
| --- | --- | --- | --- | --- | --- | --- |
| AMD Ryzen 7 4800U | 8C / 16T | 15 W | Up to 4.2 / 1.8 | Radeon Graphics | 8 | 12 |
| AMD Ryzen 7 4700U | 8C / 8T | 15 W | Up to 4.1 / 2.0 | Radeon Graphics | 7 | 12 |
| AMD Ryzen 5 4600U | 6C / 12T | 15 W | Up to 4.0 / 2.1 | Radeon Graphics | 6 | 11 |
| AMD Ryzen 5 4500U | 6C / 6T | 15 W | Up to 4.0 / 2.3 | Radeon Graphics | 6 | 11 |
| AMD Ryzen 3 4300U | 4C / 4T | 15 W | Up to 3.7 / 2.7 | Radeon Graphics | 5 | 6 |
The manufacturer has not disclosed prices for these small barebones with Ryzen 4000U chips, although they will surely compete with the recently sighted Gigabyte AMD BRIX.