We have with us the Zotac GeForce RTX 3080 AMP Holo, the company’s top custom-design RTX 3080 offering available in the North American market. The AMP Holo is positioned a notch above the RTX 3080 Trinity, which we had a chance to review last year. For the most part, the card retains the design of the Trinity’s IceStorm 2.0 cooling solution, but with lavish use of ARGB LED illumination. The lighting is tastefully executed through a large diffuser along the top edge, and across the metal back-plate. The card also features a higher factory overclock than the RTX 3080 Trinity OC.
The NVIDIA GeForce RTX 3080 “Ampere” is NVIDIA’s current-generation flagship gaming product. Despite the existence of the faster RTX 3090, NVIDIA continues to refer to the RTX 3080 as its flagship because it fulfills everything an enthusiast-gamer would want—maxed out gaming at 4K UHD with RTX raytracing enabled. The GeForce “Ampere” graphics architecture introduces the second generation of NVIDIA’s path-breaking RTX technology, which combines conventional raster 3D graphics with certain real-time raytraced elements to significantly improve realism, such as lighting, reflections, shadows, and global illumination. With 2nd Gen RTX, NVIDIA is also introducing raytraced motion blur, an extremely difficult effect to pull off in real-time, which required the company to innovate a whole new component into its 2nd Gen RT core.
The GeForce Ampere architecture combines new-generation Ampere CUDA cores capable of concurrent INT32+FP32 math operations, 2nd Gen RT cores that double ray-intersection performance over the previous generation and introduce new temporal components, and 3rd Gen Tensor cores, which exploit sparsity in deep neural networks (DNNs) to dramatically accelerate AI workloads. NVIDIA leverages AI for de-noising and its DLSS performance enhancement.
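To make the sparsity idea concrete, here is a small illustrative sketch (plain NumPy, not tied to any NVIDIA API) of the 2:4 fine-grained structured sparsity pattern Ampere's sparse Tensor cores are built around: in every group of four weights, the two smallest-magnitude values are zeroed so the hardware can skip them.

```python
import numpy as np

def prune_2_4(weights: np.ndarray) -> np.ndarray:
    """Zero the 2 smallest-magnitude values in every group of 4 weights.

    Mimics the 2:4 structured sparsity pattern that sparse Tensor
    cores can exploit by skipping the zeroed multiplications.
    """
    flat = weights.reshape(-1, 4).copy()
    # Indices of the two smallest-magnitude entries per group of four.
    drop = np.argsort(np.abs(flat), axis=1)[:, :2]
    np.put_along_axis(flat, drop, 0.0, axis=1)
    return flat.reshape(weights.shape)

w = np.array([[0.9, -0.1, 0.05, -0.8],
              [0.2,  0.7, -0.6,  0.01]])
pruned = prune_2_4(w)
print(pruned)  # exactly half of each group of four is now zero
```

The hypothetical `prune_2_4` helper is for illustration only; real workflows retrain or fine-tune after pruning to recover accuracy.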
Based on the 8 nm “GA102” silicon, the GeForce RTX 3080 more than doubles the number of unified shaders over the previous generation RTX 2080. It packs 8,704 CUDA cores, 68 RT cores, and 272 Tensor cores. The memory amount has been increased by 25%, to 10 GB, as has the memory bus width, to 320-bit. NVIDIA and Micron Technology have innovated a whole new memory standard for the RTX 3080 and RTX 3090, which they call GDDR6X. This memory operates at a blistering data rate of 19 Gbps and helps NVIDIA hit memory bandwidth levels of 760 GB/s.
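The 760 GB/s figure follows directly from those two numbers: per-pin data rate times bus width, converted from bits to bytes. A quick sanity check:

```python
# Memory bandwidth = per-pin data rate x bus width, bits converted to bytes.
data_rate_gbps = 19    # GDDR6X data rate, gigabits per second per pin
bus_width_bits = 320   # RTX 3080 memory bus width

bandwidth_gbs = data_rate_gbps * bus_width_bits / 8
print(bandwidth_gbs)  # 760.0 GB/s, matching the quoted figure
```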
As we mentioned earlier, the Zotac GeForce RTX 3080 AMP Holo features the company’s highest factory overclock available in the North American market, with the GPU Boost frequency set at 1770 MHz instead of the 1710 MHz reference. It features the company’s IceStorm 2.0 cooling solution that has two large aluminium fin stacks to which heat drawn from the base is conveyed by five heat pipes. A trio of fans ventilate the cooler. The Spectra 3.0 ARGB lighting package could be the brightest spot in your gaming PC build.
After almost a decade of total market dominance, Intel has spent the past few years on the defensive. AMD’s Ryzen processors continue to show improvement year over year, with the most recent Ryzen 5000 series taking the crown of best gaming processor: Intel’s last bastion of superiority.
Now, with a booming hardware market, Intel is preparing to retake some of that lost ground with the new 11th Gen Core Processors. Intel is claiming these new 11th Gen CPUs offer double-digit IPC improvements despite remaining on a 14 nm process. The top-end 8-core Intel Core i9-11900K may not be able to compete against its AMD rival Ryzen 9 5900X in heavily multi-threaded scenarios, but the higher clock speeds and alleged IPC improvements could be enough to take back the gaming crown. Along with the new CPUs, there is a new chipset to match, the Intel Z590. Last year’s Z490 chipset motherboards are also compatible with the new 11th Gen Core Processors, but Z590 brings some key advantages.
First, Z590 offers native PCIe 4.0 support from the CPU, which means the PCIe and M.2 slots powered off the CPU will offer PCIe 4.0 connectivity when an 11th Gen CPU is installed. The PCIe and M.2 slots controlled by the Z590 chipset are still PCIe 3.0. While many high-end Z490 motherboards advertised this capability, it was not a standard feature for the platform. In addition to PCIe 4.0 support, Z590 offers USB 3.2 Gen 2×2 from the chipset, a standard with speeds of up to 20 Gb/s. Finally, Z590 boasts native support for 3200 MHz DDR4 memory. With these upgrades, Intel’s Z series platform reaches feature parity with AMD’s B550. On paper, Intel is catching up to AMD, but only testing will tell if these new Z590 motherboards are up to the challenge.
The AORUS line from Gigabyte spans a broad range of products: laptops, peripherals, and core components. Across the enthusiast spectrum, the AORUS name denotes Gigabyte’s gaming-focused products, with the AORUS motherboard range featuring a consistent naming scheme that includes the Pro, Elite, Ultra, Master, and Extreme motherboards. Within this lineup, the Master serves as the high-end mainstream option offering prime features at a high but attainable price point.
The Gigabyte Z590 AORUS Master features a monster 19-phase VRM utilizing 90 A power stages and Gigabyte’s signature finned cooling solution. Both Q-Flash and a dual BIOS have been included, providing a redundant safety net for ambitious overclocking. The Gigabyte Z590 AORUS Master also offers a full-coverage aluminium backplate for added rigidity and additional VRM cooling. Additionally, Gigabyte has included a 10 Gb/s LAN controller from Aquantia. All of the features are in order, so let’s see how the Gigabyte Z590 AORUS Master stacks up against the competition.
Rear I/O:
1x Q-Flash Plus button
1x Clear CMOS button
2x SMA antenna connectors
1x DisplayPort
1x USB Type-C® port, with USB 3.2 Gen 2×2
5x USB 3.2 Gen 2 Type-A ports (red)
4x USB 3.2 Gen 1 ports
1x RJ-45 port
1x optical S/PDIF Out connector
5x audio jacks
Audio:
1x Realtek ALC1220 Codec
Fan Headers:
9x 4-pin
Form Factor:
ATX Form Factor: 12.0 x 9.6 in.; 30.5 x 24.4 cm
Exclusive Features:
APP Center
@BIOS
EasyTune
Fast Boot
Game Boost
RGB Fusion
Smart Backup
System Information Viewer
USB TurboCharger
Support for Q-Flash Plus
Support for Q-Flash
Support for Xpress Install
Testing for this review was conducted using a 10th Gen Intel Core i9-10900K. Stay tuned for an 11th Gen update when the new processors launch!
The Redmi Note 10 Pro is the best phone of the series that Xiaomi announced on Thursday, though the naming could have been better. In India fans will find an almost identical phone under the Redmi Note 10 Pro Max name (the key difference is the removal of NFC).
Xiaomi Redmi Note 10 Pro
The two phones are certainly similar enough that our written review applies to both. The same goes for our video review, which you can watch below.
The Note 10 Pro impresses with the value it brings on a modest budget. This €250 phone stands out with its 120 Hz AMOLED display; most phones in its price range switch to LCD in order to afford a high refresh rate. The other highlight is the 108 MP main camera, which shoots flagship-level photos (though low-light performance leaves something to be desired).
If you like what you see, the Redmi Note 10 Pro will become available globally on March 8 (Monday) and you’ll be able to grab one for $300/€250. India’s Redmi Note 10 Pro Max will be up on Mi.com and Amazon from March 18 (the starting price is INR 19,000 for a 6/64 GB unit).
(Pocket-lint) – Think ‘Montblanc’ and in your mind’s eye you could be picturing any number of things: wallets, pens, jewellery, watches, bags, belts, or even notebooks. The one thing that they all have in common (apart from often being made from black leather) is that they’re luxury items and aren’t cheap. A Meisterstück gold-coated Classique ballpoint pen could set you back hundreds.
So when Montblanc launches a Wear OS smartwatch it’s best to go in with the expectation that it won’t be cheap. But actually, if you compare this second-gen watch – here the Summit Lite – to other Montblanc watches, it’s relatively cost efficient. That means there’s still definitely some appeal here for anyone wanting a luxury smartwatch but who doesn’t wish to spend more than a grand.
Design
Colours: Grey or black
43mm aluminium case
Straps: Fabric or rubber
Anti-scratch crystal glass
Water resistant to 50m (5ATM)
Rotating crown and 3 push buttons
Montblanc’s first smartwatch, the Summit, was pretty but underwhelming. From a design perspective there was a missed opportunity – it had a stylish looking crown, but it didn’t rotate and it was the only button on the side; and we found the whole device too big.
The company improved things considerably with the Summit 2, which launched in 2019, and now there’s the new Summit Lite model – hence that slightly more affordable price point.
The Summit Lite has three buttons on its side. Each of them feels sumptuous when pressed, giving a lovely ‘click’ and feeling just like a proper watch with proper buttons should. But the best thing about these buttons is that the middle one has a proper rotating crown.
Rotating it is smooth and effortless without feeling too loose. Doing so enables you to interact with elements on the screen. For instance, you can use it to scroll up and down lists or messages, or – when on the watch face – bring up notifications or the quick settings tiles.
Our only complaint about the rotating crown – as pretty as it is – is that the surface is just a little too smooth and shiny. That means you need a little firm pressure to make sure your finger gets enough traction to turn it. A slightly toothier edge would have made this easier.
What’s great about traditional fashion and design companies getting involved in the smartwatch market is that they deliver decent case designs. For its full-fat Summit watches, Montblanc uses stainless steel for the case material. With the Lite model it’s aluminium.
The 43mm case isn’t too big and sits comfortably on the wrist. The contrast between the glossy bezel and buttons with their softer anodised finish on the case is eye-catching. It has that glint of dress watch that looks great just subtly poking out from under your blazer or cardigan sleeve.
There are some subtle angles on the lugs that make the edges softer in appearance, while they curve downwards towards the strap to create a skinny side-on profile. It’s nice and lightweight too thanks to that shift from steel to aluminium.
It’s not just about being pretty though. The casing feels like it’s well put together, while the screen is capped off with crystal glass to help avoid scratches from when you inevitably brush it against all manner of hard surfaces in your daily activity.
Our unit shipped with a thick black rubber strap which had something of a ‘sticky’ feel when we first put it on, but that sensation has since tamed. Other fabric strap options are available too. However, the case will fit any 22mm strap and the quick-release catches mean it’s super simple to swap for one you really want.
Turn the Summit Lite upside down and you’ll see its well-considered underside. Right in the centre is the optical heart-rate sensor – built within a subtle protrusion that’s surrounded by a metal ring – and accompanied by a four-pin connector for the charging base.
It looks and feels more purposeful than a lot of other Wear OS undersides and, happily, it snaps onto its magnetic charging cradle with ease. The cradle holds the watch securely in position and – thanks to a rounded cutout for the rotating crown – only fits the watch one way, so you’ll never seat it incorrectly.
If there’s any criticism it’s that the cradle itself is relatively lightweight plastic and so – because of the strong connection – if you try and remove the watch one-handed you’ll more than likely take the cradle with you. You need to hold both in order to separate them.
On the plus side, the underside is coated in an almost-sticky rubber-like material that helps it not to slide around all over the place.
Display and software
1.2-inch circular AMOLED display
390 x 390 resolution
Wear OS software
For the most part, the software situation with the Montblanc Summit Lite is the same as pretty much every other Google Wear OS watch. The main interfaces and preinstalled apps are the same, but it comes with Montblanc’s own watch faces.
Press the middle button and it launches your apps list, and the top and bottom buttons can be customised to launch any number of functions or apps. By default, however, they launch two elements of Montblanc’s own activity tracker screens. And this is where the Summit Lite is slightly different to some of the other Wear OS devices.
The activity app can be used to manually track any workout, but will also track your movement, heart-rate and stress levels throughout the day, and your sleep quality at night. Combining that information it can also measure how well rested you are and give you an Energy Level reading. It’s similar in theory to Garmin’s Body Battery feature.
Go running and it’ll work out your VO2 Max (your maximum rate of oxygen consumption during exercise) and judge your fitness level. It’ll even tell you how long you need to rest in order to recover for your next workout session. Interestingly, there’s also a Cardio Coach function which tells you what to aim for in terms of heart rate intensity and duration for your next activity.
There are some pretty glaring holes in this workout software though. Firstly, there’s no mobile companion app. That means all that useful data and detail just stays on the watch. Secondly, if you go on a run or bike ride, there’s no map to look at afterwards to see if it tracked your route properly.
The solution to these issues is to use third-party apps – like Strava for running/cycling – or just use the Google Fit app that’s built-in as standard to all Wear OS watches.
For those who want those features it makes more sense to completely bypass Montblanc’s offering. It’s a shame really, because otherwise that data and information on the watch could be really useful. It’d just be nice to get access to it from a phone.
Otherwise accuracy seems on point. Comparing the Summit Lite’s data to that captured by the Garmin Vivoactive 4 reveals that the average heart rate was within one or two beats per minute. There was a slight difference in distance measured and, as a result, pace – but not enough to make any serious difference to the tracked activity. It was about 10-20 metres out on a 25-minute 4 km run, which is a pretty standard discrepancy between watches.
All of this software and detail is shown on a fully round AMOLED panel. It’s a 1.2-inch screen with 390 pixels both vertically and horizontally, making it pretty much on par with the latest hardware from the likes of Fossil.
Hardware and battery performance
Snapdragon Wear 3100 platform
1GB RAM + 8GB storage
Tech aficionados will complain that a watch in 2021 doesn’t feature the newest Snapdragon Wear 4100 processor. Nonetheless, there’s not a huge amount wrong with the way the Montblanc Summit Lite performs.
The Wear 3100 processor here ensures that the interface and animations are mostly smooth and responsive. There are elements that still feel a little laggy and slow, however, which is usually when extra data is required – like when browsing the Google Play Store on the wrist to download apps. There’s a little bit of a wait launching most apps, too. You’ll maybe need to wait three seconds for Google’s Keep Notes to launch, for example.
As far as connectivity and modern tech goes, the Summit Lite has pretty much everything you’d want from a smartwatch. There’s NFC (near field communication) to enable Google Pay for contactless payments. There’s Wi-Fi for direct downloading apps on to the watch. And there’s GPS for location tracking.
Battery life is pretty standard for a Wear OS watch too: you’ll get roughly two days between charges. We managed to get through two work days even with the always-on display switched on – because the always-on faces run at a lower brightness and refresh rate than the main watch face.
Verdict
The Montblanc Summit Lite’s side buttons have been purposefully redesigned with a proper rotating crown for enhanced interaction, paired with a great all-round display, plus all the features you’d expect from a Wear OS watch.
Despite being a ‘Lite’ model it’s still expensive, though, so you’re very much paying for the Montblanc brand name. Furthermore, Montblanc’s otherwise useful activity tracking doesn’t have a companion phone app for downloading and viewing your data in detail. So it’s more a piece of decoration than a tool for those super serious about fitness tracking.
Overall, things have improved dramatically since the first Montblanc Summit watch. The Summit Lite is really well designed, with its subtle, stylish and almost minimalist look, while also featuring practical material choices and the durability you’d expect from any modern smartwatch.
Also consider
Tag Heuer Connected 2020
Compare the prices and the Montblanc starts to look like good value for money. The Tag is about double the price, but it’s still the luxury smartwatch champ that has a lot going for it.
Fossil Gen 5 Garrett HR
On the complete opposite end of the scale, but with a similar approach to style, Fossil’s Garrett is one of the nicest looking and more affordable options from the popular fashion brand.
Intel recently dropped its new 670p SSD onto the market with a slew of upgrades over its predecessor, the 660p, making Intel’s new NVMe SSD very competitive with some of the best SSDs on the market. But in our review of the 670p, we found Intel’s pricing too high for the performance and capacity you get in return.
Fortunately, it seems that Newegg (one of the only retailers currently selling the 670p) has dropped 670p prices by 16-24% — depending on the storage capacity you choose.
However, we aren’t sure if Newegg is giving us the new official prices for the Intel 670p or if Newegg is simply offering these SSDs at a major discount for the time being. So keep that in mind.
Intel 670p Price Changes

Capacity    Price at Launch    Current Newegg Pricing
512GB       $89.00             $69.99
1TB         $154.00            $129.99
2TB         $329.00            $249.99
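As a quick check of the quoted 16-24% range, the discounts can be recomputed from the launch and current prices above:

```python
# Launch vs. current Newegg prices for the Intel 670p, per capacity.
prices = {"512GB": (89.00, 69.99), "1TB": (154.00, 129.99), "2TB": (329.00, 249.99)}

# Percentage discount relative to launch price, rounded to one decimal.
discounts = {cap: round((launch - now) / launch * 100, 1)
             for cap, (launch, now) in prices.items()}
print(discounts)  # roughly 16-24% off, depending on capacity
```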
If these discounted prices stick, Intel’s 670p SSD should be a much more competitive option in the NVMe SSD landscape. We were already impressed with its performance, as the 670p beat out popular SSDs like the Samsung 970 EVO Plus in some tests. The big price cut should help secure the 670p as a good value offering.
While not the best looking M.2 NVMe SSD we’ve come across, Silicon Power’s UD70 is a responsive PCIe 3.0 SSD with an attractive price point.
For
Competitive performance
Single-sided form factor
AES 256-bit hardware encryption
5-year warranty
Against
Low endurance ratings
Slow write speed after write cache fills
Features and Specifications
Silicon Power’s UD70 oozes value and is a perfect option for budget-conscious gamers and everyday office users. The UD70 pairs Phison’s E12S PCIe 3.0 NVMe SSD controller with Micron’s latest QLC flash, making it quite similar to the Sabrent Rocket Q we reviewed last year.
The UD70 delivers solid performance for a PCIe 3.0 SSD, has an inexpensive price tag, and comes with hardware encryption support. But due to its rather weak sustained write performance, it’s not too appealing to the prosumer crowd.
Specifications

Product                  UD70 500GB               UD70 1TB                 UD70 2TB
Pricing                  $76.50                   $109.99                  $195.99
Capacity (User / Raw)    500GB / 512GB            1000GB / 1024GB          2000GB / 2048GB
Form Factor              M.2 2280                 M.2 2280                 M.2 2280
Interface / Protocol     PCIe 3.0 x4 / NVMe 1.3   PCIe 3.0 x4 / NVMe 1.3   PCIe 3.0 x4 / NVMe 1.3
Controller               Phison PS5012-E12S       Phison PS5012-E12S       Phison PS5012-E12S
DRAM                     DDR3L                    DDR3L                    DDR3L
Memory                   Micron 96L QLC           Micron 96L QLC           Micron 96L QLC
Sequential Read          3,400 MBps               3,400 MBps               3,400 MBps
Sequential Write         1,000 MBps               1,900 MBps               3,000 MBps
Random Read              95,000 IOPS              120,000 IOPS             250,000 IOPS
Random Write             250,000 IOPS             500,000 IOPS             650,000 IOPS
Security                 AES 256-bit encryption   AES 256-bit encryption   AES 256-bit encryption
Endurance (TBW)          120 TB                   260 TB                   530 TB
Part Number              SP500GBP34UD7005         SP01KGBP34UD7005         SP02KGBP34UD7005
Warranty                 5-Years                  5-Years                  5-Years
Silicon Power’s UD70 comes in capacities of 500GB, 1TB, and 2TB and is priced as low as $0.10 per GB. The 2TB model is rated for peak speeds of up to 3.4/3 GBps of sequential read/write throughput, while the 500GB and 1TB models write a little slower at up to 1.0 GBps and 1.9 GBps, respectively. The entire series is rated for up to 250,000/650,000 random read/write IOPS.
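The cost-per-gigabyte claim is easy to verify from the listed prices; a quick sketch dividing street price by user-accessible capacity:

```python
# UD70 street price divided by user-accessible capacity, per model.
models_usd = {500: 76.50, 1000: 109.99, 2000: 195.99}

cost_per_gb = {cap: price / cap for cap, price in models_usd.items()}
for cap, cpg in cost_per_gb.items():
    print(f"{cap}GB: ${cpg:.3f}/GB")
```

The 2TB model works out to just under $0.10 per gigabyte, matching the "as low as" figure.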
These peak performance ratings are measured within the SLC cache, though. The UD70 features a large dynamic SLC cache that spans up to one-quarter of its capacity; the cache shrinks as you fill the drive and grows as you free up space. And because the SSD is armed with QLC flash, sustained performance won’t hold up for long once you begin filling the drive with data.
The UD70 comes backed by a five-year warranty and supports standard data integrity mechanisms like end-to-end data protection, LDPC ECC, and RAID-like parity protection. These mechanisms help enable reasonable endurance ratings for a low-end device with QLC flash — our 2TB sample can absorb up to 530TB of writes within its warranty period. The drive also comes with optional support for hardware-accelerated AES 256-bit encryption, enabling responsive and secure data storage.
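To put the 530 TB endurance rating in perspective, it can be converted into a daily write budget over the five-year warranty (a standard back-of-envelope calculation, not a Silicon Power specification):

```python
# Convert the 2TB model's rated terabytes written (TBW) into daily figures.
tbw_tb = 530                 # rated endurance, 2TB model
capacity_tb = 2
warranty_days = 5 * 365      # five-year warranty

gb_per_day = tbw_tb * 1000 / warranty_days
dwpd = tbw_tb / (warranty_days * capacity_tb)  # drive writes per day
print(f"{gb_per_day:.0f} GB of writes per day, {dwpd:.2f} drive writes per day")
```

Roughly 290 GB of writes every day for five years, far beyond typical consumer use.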
Software and Accessories
Silicon Power also includes access to the company’s SSD toolbox. While not as robust or polished as some of the bigger brands, SP Toolbox enables end-users to monitor the SSD’s SMART data and run diagnostic tests.
A Closer Look
Image 1 of 3
Image 2 of 3
Image 3 of 3
Silicon Power’s UD70 comes in a compact single-sided M.2 2280 form factor at all capacities, enabling broad compatibility. Like the US70, the UD70 isn’t the best-looking M.2 SSD we’ve come across. The blue PCB and red-themed sticker clash, and the compliance marks and barcodes detract from the aesthetic even more.
At the heart of the UD70 is a Phison PS5012-E12S PCIe 3.0 x4 8-channel NVMe 1.3-compliant SSD controller. The E12S utilizes dual Arm Cortex-R5 CPUs clocked at 666 MHz and leverages CoXProcessor 2.0 technology (dual coprocessors) to help maintain a consistent latency profile during write workloads.
The controller is built on a 12 nm node and uses a smaller physical design than its predecessor, the original E12, downsizing from a 16 x 16 mm to a 12 x 12 mm package while retaining the same performance. This enables more flexible NAND placement, allowing up to four NAND packages on a single side of the PCB, where the original could only handle up to two.
The controller leverages Active State Power Management (ASPM) and Autonomous Power State Transition (APST). The company dubs these features a ‘dual self-cooling system,’ but this tech is found in most other NVMe SSDs on the market.
Image 1 of 2
Image 2 of 2
The controller caches its FTL metadata with a single 4Gb Kingston DDR3L DRAM chip on our 2TB review sample. The controller interfaces with sixteen dies of Micron’s 1Tb N28A 96-layer QLC flash that operate at 666 MTps. Each die is split into four virtual planes, as Micron’s design leverages a tile-based architecture for both performance and reliability. Micron claims the floating gate architecture provides better data retention than most charge trap flash designs.
The Yamaha SR-B20A isn’t perfect, but it does offer an impressive spread of sound for minimal outlay
For
Clean and detailed
Good spread of effects
Decent bass
Against
Could be more expressive
Timing not expert
Yamaha describes both of its latest ‘entry-level’ soundbars, of which the SR-B20A is the larger, as upgrades for your TV sound. That’s something very much required these days when flat TV screens generally mean weedy built-in TV sound.
Unlike many rivals at this price, there is mercifully little pretence from Yamaha that the bars will deliver ‘surround’ sound, other than options labelled ‘3D Surround’ and some rather optimistic claims for the inclusion of DTS Virtual:X. These marketing extravagances aside, what’s promised is simply solid sound with which to enjoy your shiny new flatscreen.
What we particularly like about the SR-B20A on review here is that it aims to perform without the usual wireless subwoofer. This not only keeps the price down, it makes the whole package far more convenient to use.
The question is whether a long flat bar of a speaker like this can create solid enough sound without that subwoofer in support. And thankfully the answer here is a fairly resounding yes.
Build
Despite the entry-level price, the Yamaha SR-B20A feels solidly built, and looks stylish too, with black fabric wrap and curving ends. It stretches to 91cm wide, which makes it a good match for 55-inch TVs but won’t prevent its use with either smaller or larger TVs.
Yamaha SR-B20A tech specs
Power 120W
Outputs HDMI out (TV ARC)
Inputs 2x digital optical
4K passthrough No
Surround tech DTS Virtual:X
Dimensions (hwd) 5 x 91 x 13cm
Weight 3.2kg
Among the literature in the box is a mounting template that shows the soundbar can be stuck flat to the wall with the supplied foam spacers and a couple of sturdily fixed screws.
If you hang it on the wall, the controls, the indicator lights and most of the drivers usefully face the listener. You can always bench it in front of your TV instead, though in that case the controls and lights are hidden, which is less useful.
The SR-B20A is a stereo soundbar, with six drivers in all. Four of these fire upwards if the unit is benched (forward if on the wall) – according to the specs these are 55mm mid-range units near each end and 75mm bass drivers halfway from each end to the centre, augmented with side-firing ports in the curves of the bar.
The quoted power ratings also suggest that the woofers are running in a dual-mono configuration rather than stereo, which Yamaha confirms. Finally, there are two 25mm tweeters on the front edge (benched) or firing down (wall-mounted); no amount of torchlight could reveal their location through the grille fabric.
Features
The bar can get sound from your TV in two (or potentially three) ways. The best option is an HDMI cable from the bar to an ARC-equipped HDMI socket on your TV, assuming it has one. If it doesn’t, or if you don’t wish to give up one of the HDMI sockets on your TV, the next best option is an optical digital connection, for which a cable comes in the box; there is no HDMI cable included.
There is no analogue mini-jack fall-back input here, though one does feature on the smaller SR-C20A (£229, $180, AU$299), and there’s no networking of any kind. There is Bluetooth with support for both SBC and AAC codecs, primarily intended for music streaming from a smart device, although TVs that can output audio via Bluetooth could also send their audio to the bar in this way (at the peril of potential transmission delay, depending on the system).
There are a couple of other connections, besides the mains cable – a second optical input for any suitable device, and a coaxial digital output.
Given that adding a separate subwoofer of quality will more than double the price, you’d do better to buy one in a package if you’re after that bottom octave of movie-style bass: Yamaha has a number of such combos, but the SR-B20A does a good job without additional support.
Sound
Despite its slim dimensions, the SR-B20A carries that welcoming full Yamaha tone. It’s close to its smaller sibling in terms of character, only the scale here is predictably greater.
The balance is good, too; it is easy for manufacturers to roll off a lot of top-end at this entry-level price point, to avoid anything too harsh, but the SR-B20A extends into the high frequencies with confidence.
Those built-in bass units get through a fair bit of work as well. You can dial in as much or as little as you like – you might want to tweak a bit if your TV rack is less than robust, to avoid any flabbiness – but with good solid support this is a weighty performance not short of presence. It doesn’t quite rumble like a dedicated subwoofer, but that rarely equates to great sound in an all-in-one soundbar anyway.
More impressive, though, is how the SR-B20A is able to spread effects. This is a long soundbar, and it makes good use of size. While you aren’t going to get anything like 3D sound in reality, this Yamaha is skilled enough to place sounds either side of the listening position in a manner you certainly wouldn’t regularly associate with this kind of price tag.
However, you lose a bit of focus with the soundbar firing upwards at the TV screen, which leads us to consider whether the SR-B20A is actually more suited to wall mounting.
That slight lack of precision is present in the SR-B20A’s dynamic and rhythmic performance, too. Large-scale dynamic shifts are presented well, but we end up wishing there was a little more in the way of expression on offer. Music playback also reveals this Yamaha isn’t quite the last word on timing, but again we’re far off calling it a poor performance.
Verdict
It’s difficult to know what to expect when approaching an entry-level product such as the Yamaha SR-B20A, but it’s safe to say on this occasion any expectations have been met.
This is a big-sounding soundbar that makes full use of its size. It might be best suited to wall mounting, but it’s pretty difficult to make it sound bad.
(Pocket-lint) – Think ‘Montblanc’ and in your mind’s eye you could be picturing any number of things: wallets, pens, jewellery, watches, bags, belts, or even notebooks. The one thing that they all have in common (apart from often being made from black leather) is that they’re luxury items and aren’t cheap. A Meisterstück gold-coated Classique ballpoint pen could set you back hundreds.
So when Montblanc launches a Wear OS smartwatch it’s best to go in with the expectation that it won’t be cheap. But actually, if you compare this second-gen watch – here the Summit Lite – to other Montblanc watches, it’s relatively cost efficient. That means there’s still definitely some appeal here for anyone wanting a luxury smartwatch but who doesn’t wish to spend more than a grand.
Design
Colours: Grey or black
43mm aluminium case
Straps: Fabric or rubber
Anti-scratch crystal glass
Water resistant to 50m (5ATM)
Rotating crown and 3 push buttons
Montblanc’s first smartwatch, the Summit, was pretty but underwhelming. From a design perspective there was a missed opportunity – it had a stylish looking crown, but it didn’t rotate and it was the only button on the side; and we found the whole device too big.
The company improved things considerably with the Summit 2, which launched in 2019, and now there’s the new Summit Lite model – hence that slightly more affordable price point.
The Summit Lite has three buttons on its side. Each of them feels sumptuous when pressed, giving a lovely ‘click’ and feeling just like a proper watch with proper buttons should. But the best thing about these buttons is that the middle one has a proper rotating crown.
Rotating it is smooth and effortless without it feeling too loose. Doing so enables you to interact with elements on the screen. For instance, you can use it to scroll up and down lists or messages, or – when on the watch face – bring up notifications or the quick settings tiles.
Our only complaint about the rotating crown – as pretty as it is – is that the surface is just a little too smooth and shiny. That means you need a little firm pressure to make sure your finger gets enough traction to turn it. A slightly toothier edge would have made this easier.
What’s great about traditional fashion and design companies getting involved in the smartwatch market is that they deliver decent case designs. For its full-fat Summit watches, Montblanc uses stainless steel for the case material. With the Lite model it’s aluminium.
The 43mm case isn’t too big and sits comfortably on the wrist. The contrast between the glossy bezel and buttons with their softer anodised finish on the case is eye-catching. It has that glint of dress watch that looks great just subtly poking out from under your blazer or cardigan sleeve.
There are some subtle angles on the lugs that make the edges softer in appearance, while they curve downwards towards the strap to create a skinny side-on profile. It’s nice and lightweight too thanks to that shift from steel to aluminium.
It’s not just about being pretty though. The casing feels like it’s well put together, while the screen is capped off with crystal glass to help avoid scratches from when you inevitably brush it against all manner of hard surfaces in your daily activity.
Our unit shipped with a thick black rubber strap which had something of a ‘sticky’ feel when we first put it on, but that sensation has since tamed. Other fabric strap options are available too. However, the case will fit any 22mm strap and the quick-release catches mean it’s super simple to swap for one you really want.
Turn the Summit Lite upside down and you’ll see its well-considered underside. Right in the centre is the optical heart-rate sensor – built within a subtle protrusion that’s surrounded by a metal ring – and accompanied by a four-pin connector for the charging base.
It looks and feels more purposeful than a lot of other Wear OS undersides and, happily, it snaps onto its magnetic charging cradle with ease. It holds the watch in position well and – thanks to having a rounded cutout for the rotating crown – only fits the watch one way, so there’s no chance you’ll ever find yourself placing the watch in the wrong way.
If there’s any criticism it’s that the cradle itself is relatively lightweight plastic and so – because of the strong connection – if you try and remove the watch one-handed you’ll more than likely take the cradle with you. You need to hold both in order to separate them.
On the plus side, the underside is coated in an almost-sticky rubber-like material that helps it not to slide around all over the place.
Display and software
1.2-inch circular AMOLED display
390 x 390 resolution
Wear OS software
For the most part, the software situation with the Montblanc Summit Lite is the same as pretty much every other Google Wear OS watch. The main interfaces and preinstalled apps are the same, but it comes with Montblanc’s own watch faces.
Press the middle button and it launches your apps list, and the top and bottom buttons can be customised to launch any number of functions or apps. By default, however, they launch two elements of Montblanc’s own activity tracker screens. And this is where the Summit Lite is slightly different to some of the other Wear OS devices.
The activity app can be used to manually track any workout, but will also track your movement, heart-rate and stress levels throughout the day, and your sleep quality at night. Combining that information it can also measure how well rested you are and give you an Energy Level reading. It’s similar in theory to Garmin’s Body Battery feature.
Go running and it’ll work out your VO2 Max (a measure of maximal oxygen uptake) and judge your fitness level. It’ll even give you the time frame you need to rest for in order to recover for your next workout session. Interestingly, there’s also a Cardio Coach function which tells you what you should aim for in terms of heart rate intensity and duration for your next activity.
There are some pretty glaring holes in this workout software though. Firstly, there’s no mobile companion app. That means all that useful data and detail just stays on the watch. Secondly, if you go on a run or bike ride, there’s no map to look at afterwards to see if it tracked your route properly.
The solution to these issues is to use third-party apps – like Strava for running/cycling – or just use the Google Fit app that’s built-in as standard to all Wear OS watches.
For those who want those features it makes more sense to completely bypass Montblanc’s offering. It’s a shame really, because otherwise that data and information on the watch could be really useful. It’d just be nice to get access to it from a phone.
Otherwise accuracy seems on point. Comparing the Summit Lite’s data to that captured on the Garmin Vivoactive 4 reveals that the average heart-rate was within one or two beats per minute of matching. There was a slight difference in distance measured and, as a result, pace – but not enough to make any serious difference to the tracked activity. It was about 10-20 metres out on a 25-minute 4km run, which is a pretty standard discrepancy between watches.
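To put that discrepancy in perspective, here’s a rough back-of-the-envelope sketch (plain Python, using the run figures quoted above) of how little a 20-metre distance difference moves the calculated pace:

```python
# How much does a 20 m GPS discrepancy shift pace on a 25-minute, 4 km run?
dist_km = 4.0
time_min = 25.0

pace = time_min / dist_km                  # true pace: 6.25 min/km
pace_short = time_min / (dist_km - 0.020)  # if the watch reads 3.98 km

diff_sec = (pace_short - pace) * 60        # pace error in seconds per km
print(f"{diff_sec:.1f} s/km difference")   # ~1.9 s/km - negligible in practice
```

In other words, even at the worse end of the measured error the reported pace shifts by only a couple of seconds per kilometre, which supports the “not enough to make any serious difference” conclusion.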
All of this software and detail is shown on a fully round AMOLED panel. It’s a 1.2-inch screen, and boasts 390 pixels both vertically and horizontally, making it pretty much on par with the latest hardware from the likes of Fossil.
Hardware and battery performance
Snapdragon Wear 3100 platform
1GB RAM + 8GB storage
Tech aficionados will complain that a watch in 2021 doesn’t feature the newest Snapdragon Wear 4100 processor. Nonetheless, there’s not a huge amount wrong with the way the Montblanc Summit Lite performs.
The Wear 3100 processor here ensures that the interface and animations are mostly smooth and responsive. There are elements that still feel a little laggy and slow, however, which is usually when extra data is required – like when browsing the Google Play Store on the wrist to download apps. There’s a little bit of a wait launching most apps, too. You’ll maybe need to wait three seconds for Google’s Keep Notes to launch, for example.
As far as connectivity and modern tech goes, the Summit Lite has pretty much everything you’d want from a smartwatch. There’s NFC (near field communication) to enable Google Pay for contactless payments. There’s Wi-Fi for direct downloading apps on to the watch. And there’s GPS for location tracking.
Battery life is pretty standard for a Wear OS watch too: you’ll get roughly two days between charges. We managed to get through two work days even with the always-on display switched on – because the always-on faces run at a lower brightness and refresh rate than the main watch face.
Verdict
The Montblanc Summit Lite’s side buttons have been purposefully redesigned with a proper rotating crown for enhanced interaction, paired with a great all-round display, plus all the features you’d expect from a Wear OS watch.
Despite being a ‘Lite’ model it’s still expensive, though, so you’re very much still paying for the Montblanc brand name. Furthermore Montblanc’s otherwise useful activity tracking doesn’t have a companion phone app to download and view your data in much detail. So it’s more decoration than designed for those super serious about tracking fitness.
Overall, things have improved dramatically since the first Montblanc Summit watch. The Summit Lite is really well designed, with its subtle, stylish and almost minimalist look, while also featuring practical material choices and the durability you’d expect from any modern smartwatch.
Also consider
Tag Heuer Connected 2020
Compare the prices and the Montblanc starts to look like good value for money. The Tag is about double the price, but it’s still the luxury smartwatch champ that has a lot going for it.
Read our review
Fossil Gen 5 Garrett HR
On the complete opposite end of the scale, but with a similar approach to style, Fossil’s Garrett is one of the nicest looking and more affordable options from the popular fashion brand.
With memory prices at one of the lowest points in years, now is a great time to be looking for memory upgrades. Summer hardware releases are in full swing, there is competition from both Intel and AMD, and the Red brand has thoroughly fixed the memory issues of generations past. No longer do users have to worry about memory compatibility or shopping for expensive AMD-branded kits. With 3200 MHz natively supported on the new Ryzen platform, options for enthusiasts have never been more open.
Corsair is a brand that needs no introduction. The California-based company has been a staple of the enthusiast community for decades, producing power supplies, peripherals, SSDs, coolers, and even entire pre-built systems. One of the brand’s most storied and successful families of products consists of its high-performance memory kits. Balancing high performance with style and impeccable quality, Corsair memory kits are a staple of the enthusiast hardware market.
The Corsair Vengeance RGB Pro SL is a lower-profile take on the popular Corsair Vengeance RGB Pro. Corsair aims to give builders with potential cooler clearance concerns a no compromise option. As for performance, my test kit is a 2×8 GB kit of 3600 MHz at 18-22-22-42 and 1.35 V. Let’s see how this new edition of the Corsair Vengeance RGB Pro SL performs!
(Pocket-lint) – Xiaomi has officially launched the global Redmi Note 10 range. The naming convention is rather convoluted, however, as there are a lot of models and a shift in naming based on region.
There’s the Redmi Note 10 at the base of the range, the Redmi Note 10S above that (which, in India arrives as a larger-screen variant called the Redmi Note 10 Pro), and the Redmi Note 10 Pro above that again (which is the Redmi Note 10 Pro Max in India). There’s also a Note 10 5G model (not available in India), which feels like a total departure from the series.
Do keep this naming in mind when looking over the below, as we have run with the global naming and specification. So which of those Redmi Note 10 models is most fitting for you? Here we break down the differences between the four handsets.
Design
Note 10 & 10S: 160.5 × 74.5mm × 8.3mm / 179g
Note 10 5G: 161.8 × 75.3 × 8.9mm / 190g
Note 10 Pro: 164 × 77 × 8.1mm / 193g
All: Side-mounted fingerprint scanner
All: IP53 splashproof design
Although the design language is more-or-less mirrored across each Note 10 model – ignoring the obvious differences in physical size – there are different colour options to help different models stand out.
The main trio of Note 10 devices have an Onyx Gray option, while the Note 10 5G shifts this to Graphite Gray. As you’ll see, the 5G model is largely different from the main trio – and we’re frankly not sure why it’s been made part of the series. Here’s the full colourway breakdown per model:
Display
Note 10 & 10S: 6.34-inch AMOLED, 60Hz
Note 10 5G: 6.5-inch, 90Hz
Note 10 Pro: 6.67-inch AMOLED, 2400 x 1080 resolution, 120Hz
All: central punch-hole front-facing camera
Although front-on the four models look the same, with the punch-hole camera front and centre, they’re different sizes on account of different displays.
The base Note 10 and its 10S counterpart get a 6.34-inch AMOLED with 60Hz refresh rate. The Note 10 Pro bumps this size up to 6.67-inch, with a doubling of the refresh rate to 120Hz – making it the top of the bunch. The Note 10 5G is in-between those, at 6.5-inch and 90Hz.
However, the India versions – i.e. the Note 10 Pro and 10 Pro Max – both feature 6.67-inch displays with 120Hz refresh. Just to add to the confusion.
The rumour for a long time was that the Note 10 series would run on Qualcomm’s Snapdragon 732 platform. That’s true – but only for the Note 10 Pro model.
Wind further down the series and the base Note 10 has a Snapdragon 678, while the 10S and 10 5G make a departure for MediaTek hardware instead.
It’s the first time we’ve seen MediaTek’s Dimensity 700 deployed, utilised for its 5G connectivity – which is, of course, only possible in the Note 10 5G model.
Elsewhere there’s a 5,000mAh battery minimum for all models, so great longevity, along with 33W fast-charging (it’s only 18W for the 5G model).
Cameras
Note 10: 48MP main, 8MP wide, 2MP macro, 2MP depth / 13MP front camera
Note 10S: 64MP main, 8MP wide, 2MP macro, 2MP depth / 13MP front camera
Note 10 5G: 48MP main, 2MP macro, 2MP depth / 8MP front camera
Note 10 Pro: 108MP main, 8MP wide, 5MP macro, 2MP depth / 16MP front camera
The main trio of devices features a quad rear camera setup with main, ultra-wide, macro and depth sensor. The Note 10 5G is the odd one out, once again, with a triple rear camera setup – as it ditches the ultra-wide.
The top-end model is the most accomplished, as its 108-megapixel main sensor and 5-megapixel telemacro – similar to what you’ll find on the Xiaomi Mi 11 – are a cut above. The 108MP lens uses a nine-in-one pixel method to produce 12-megapixel results as standard.
The other models have a mere 2-megapixel macro and, we believe, lack autofocus for this particular camera too. This is typical of budget devices, but we would rather this optic was entirely absent – as, from past experience, results are poor.
You’ll see that there’s no optical zoom for any Redmi Note 10, which isn’t a surprise at this price point. But the only thing that we think is really absent is any form of optical image stabilisation – which we highlighted in our Note 10 Pro review, link below.
Conclusion
Pricing: TBC
Clearly it’s all about the Note 10 Pro – effectively a cut-price version of the Xiaomi Mi 11, which will give it lots of appeal.
Otherwise we find Xiaomi’s choice to release so many Note 10 variants simply confusing. Not to mention the name and spec shift of these handsets in different regions. Oh, and that the 5G model is such an odd-one-out that it doesn’t belong in the series as we see it.
Brush all the other models aside and opt for the Note 10 Pro (Note 10 Pro Max in India) and Xiaomi is onto a winner here. But it really needs to sort out this naming malarkey.
(Pocket-lint) – Redmi is fast becoming a key disruptor in the affordable phones market. The company’s Note 10 Pro, as reviewed here, makes it clear to see why: it’s dripping with specification that puts it a cut above its nearest of competition.
The brand name might not be instantly recognisable to all – Redmi is an offshoot of Xiaomi, hence no surprise the Note 10 Pro is like a watered-down Xiaomi Mi 11 in many respects – but when affordability is your main goal, and it simply functions as well as this, that’s not going to be a major barrier.
So if you’re seeking a phone that costs around a couple of hundred, is the Redmi Note 10 Pro appealing enough to knock the likes of the Motorola G30 out of contention?
Design & Display
6.67-inch AMOLED display, 1080 x 2400 resolution, 120Hz refresh rate
Upon pulling the Redmi Note 10 Pro from its box – here in “Onyx Gray”, which has a soft, almost blue hue about it – it comes across as a pretty good-looking slab of glass and plastic. There’s Gorilla Glass 5 to protect the front, and not a mass of bezel cutting into the screen either.
What is cutting into that screen more prominently than most is the punch-hole camera. It’s not even the scale of it – the diameter is smaller than you’ll find on recent Motorola handsets, for example – rather the silvery, shiny ring around it, which can catch the light and is a bit distracting. We’d rather it was pushed to the left side, more out of sight, and darkened please.
The Note 10 Pro’s rear is plastic, but not in a budget-looking way. Indeed it catches fingerprints in a similar fashion to glass, but it’s easy enough to wipe clean. And Redmi has chosen some pretty classy colour options too – none of the “Pastel Sky” (read: pink and mud-green) nonsense that Motorola opted for with the G30.
The only bother of the rear is that protruding camera bump. Not only is it large, it’s off-centre and, therefore, the phone wobbles about all over the place when laid upon a desk. Not that the main goal of a phone is to use it flat on a desk – you’ll normally have it in the hand – but it’s still a bugbear. A different camera enclosure would have negated this little aspect of the design.
The Note 10 Pro’s side-mounted fingerprint scanner is very neatly integrated, though, and we’ve found it to function very rapidly for logins. There’s also face unlock by using that front-facing camera, should you prefer. Oh, and if you’re still part of the wired headphones gang then the 3.5mm jack will prove a point of appreciation for you too.
And so to the screen. This is one aspect of the phone that really helps to sell it for a number of reasons. First, it’s large, at 6.67-inches on the diagonal. But, more important than that, it’s got a Full HD+ resolution that puts it a step beyond many of its near competitors. Motorola, for example, has dropped to just HD+ in its lower-end Moto G family (so around 50 per cent fewer pixels).
The Note 10 Pro’s screen is AMOLED based, too, meaning it can have an always-on display activated – which illuminates the edges in a subtle fashion when there’s a notification, as one example – for visuals to be available without actively needing to turn the display on.
That screen tech also means deep blacks, while colour is decent. As the software allows a brightness selection for night use we’ve not found the auto-brightness to be of any bother here either – which is refreshing, as it’s been a pain in basically every other MIUI software-based handset of recent times.
The other big feature of this screen is that it offers a 120Hz refresh rate. The theory here is that it can run at double the rate – 120 refreshes per second – to give a smoother visual experience. That can often be the case, too, just not in every single aspect of use. That’s the oddity of higher refresh rates: if you don’t have the hardware-software combination to handle it, then it’ll come a cropper. Thankfully it’s not too bad here, but there are some moments where the ultra-smooth swiping in, say, the Photos app gets stuttery when moving over to a different app instead.
Faster refresh is one of those nice-to-haves, sure, but 120Hz isn’t on by default – and even when you do go to activate it, MIUI describes it as a “medium” level refresh. It’s “low” for 60Hz, apparently, despite that being perfectly fine. And, um, there is no “high” – so the scale doesn’t make a huge amount of sense. But it’s all a distraction really, from what’s an otherwise perfectly decent screen.
Performance & Battery
Qualcomm Snapdragon 732G platform, 6GB RAM
5,020mAh battery, 33W fast-charging
MIUI 12 software (over Android 11)
Even with the 120Hz refresh rate activated, the Redmi Note 10 Pro doesn’t suffer from limited battery life. We’ve been using the phone for the week prior to the launch event as our own device – and in that time there’s usually 50 per cent battery remaining by bedtime. That’s 16 hours at a time, so it’s on the edge of being a two-day laster.
The battery capacity is large, which is part of the reason for this longevity, but the processor and software combination plays its part too. With Qualcomm’s Snapdragon 732G platform under the hood the Redmi hits that sweetspot of reasonable performance with limited heat output – and there’s no 5G to grind the battery down either.
As chipsets go, the SD732 is capable of handling multiple apps, including games, without notable graphical shortcomings – meaning whether you want to run Zwift on your phone, dabble in a bit of PUBG Mobile, or hit some South Park: Phone Destroyer, it’s all within the Redmi Note 10 Pro’s reach.
The only slight stutters – and we mean very slight – tend to appear when jumping between apps. That’s when you can visually see a lowering of the frame-rate, hence the question over whether 120Hz is actually all that important here.
Running everything is Xiaomi’s MIUI 12 software, skinned over the top of Google’s Android 11 operating system. We’ve had very mixed experiences with this software in the recent past – with the Xiaomi Mi 11 it was limiting, in the Poco M3 it was irksome – but, oddly, in the Redmi Note 10 it’s caused us no significant issues. We’ve previously criticised Xiaomi’s software for being wildly inconsistent between devices (sometimes even on the same software version), but at least the Redmi gets the upper hand here.
That said, MIUI 12 does need some ‘training’, if you like. By default it battery limits every app, which you need to dig into in individual settings to rectify and ensure there’s no issue with limiting what an app can do and when, or how much power it can or can’t use in the background. However, even with the default option selected we’ve not had notification delays like we did have with the Xiaomi Mi 11. So there’s greater stability here.
In the past there’s been criticism of targeted ads in Xiaomi software, but that’s not proven a bother in this Redmi setup either. Yes, there’s still a separate Xiaomi store in addition to Google Play – which sometimes means some apps will update from one, some from the other – but it’s enough in the background and out of the way that you basically needn’t worry about it.
So while we’d usually be criticising the software experience as the thing to hold a MIUI handset back, the Redmi Note 10 Pro actually fares well. In combination with its hardware loadout that makes for a generally smooth experience, too, plus a long-lasting one. Can’t say much better than that.
Cameras
In terms of cameras the Redmi Note 10 Pro features what it calls a quad rear setup. That’s a bit of a stretch, really, as the depth sensor isn’t really needed or useful at all. And the ultra-wide angle isn’t the best of quality. But that’s most of the bad news out of the way.
The 5-megapixel macro sensor that’s on board is, just like that of the Mi 11, rather good fun. It’s not wildly accurate with autofocus, but at least it offers some. And sharpness isn’t pristine either – but it’s far better than what we’ve seen from umpteen lower-resolution so-called macro sensors on other phones.
The real take-away of the setup, however, is the 108-megapixel camera. If you can really consider it as that. While most makers use a four-in-one pixel methodology to gather more information and produce an image a quarter the size of the headline resolution, this Redmi goes with a nine-in-one pixel method. That means you’ll get 12-megapixel results as standard instead.
By using these nine pixels – think of it as a three-by-three square – there’s the prospect of gathering more light and colour data, all of which can be processed into a sharp-looking shot. Even in low-light conditions the Redmi Note 10 Pro’s results hold up well. We’ve been impressed.
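The arithmetic of nine-in-one binning can be sketched in a few lines (plain Python; a toy 6×6 “sensor” stands in for the real 108MP array, and the actual ISP pipeline is of course far more sophisticated):

```python
# Minimal sketch of 3x3 ("nine-in-one") pixel binning: each output pixel
# pools a 3x3 block of sensor pixels, so resolution drops by a factor of
# nine - which is exactly how 108 million pixels become 12 million.
def bin_3x3(frame):
    h, w = len(frame), len(frame[0])
    out = []
    for r in range(0, h, 3):
        row = []
        for c in range(0, w, 3):
            block = [frame[r + dr][c + dc] for dr in range(3) for dc in range(3)]
            row.append(sum(block) // 9)  # average the nine samples
        out.append(row)
    return out

# A toy 6x6 frame bins down to 2x2 at the same 9:1 ratio.
toy = [[10] * 6 for _ in range(6)]
print(bin_3x3(toy))  # [[10, 10], [10, 10]]
print(108 // 9)      # 12 (megapixels)
```

Averaging nine neighbouring samples is why the binned 12MP output holds up better in low light than reading the full 108MP grid would: each output pixel is built from roughly nine times the captured signal.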
There are limits though. As there’s no optical image stabilisation here, you’ll need a steady hand. And the Night Mode – which uses long exposure to combine multiple frames into one ‘brighter’ shot – doesn’t work well as a result. Without stabilisation things just don’t line up well, making for soft, ‘mushy’ results.
No, there’s no zoom lens, so you don’t get any optical zoom fanciness, and the camera app is a bit compartmentalised in its approach, but the overall take-away from the Redmi Note 10 Pro’s camera is that the main lens delivers a lot from an affordable device. You could do a lot worse elsewhere.
First Impressions
As we said up top, Redmi is becoming a key disruptor in the affordable phone market. The Note 10 Pro makes it clear to see why: this device doesn’t just have a decent specification, it comes good on delivery too.
There’s more resolution here than on close rival Motorola handsets, the software is more stable than we’ve seen from other MIUI 12 handsets (although Motorola’s approach is clearly better), and that main 108-megapixel camera is a most capable unit (although it does output at 12MP by default).
The shortcomings are only few and far between – that punch-hole camera is weird, the lack of optical stabilisation is a shame, and the camera bump (which causes ‘desk wobble’) jars somewhat – making the Redmi Note 10 Pro the most accomplished affordable phone we’ve yet seen in 2021.
Also consider
Moto G30
Motorola always delivers better on software experience – and it’s the same here – but you’ll have to accept a lower-level processor and less attractive overall design as part of this otherwise well-priced budget handset.
While we still don’t have an Intel Rocket Lake-S Core i9-11900K CPU to use for testing, Intel Z590 boards have been rolling in. So while we await benchmark results, we’ll be walking in detail through the features of these brand-new boards. First up on our bench was the ASRock Z590 Steel Legend 6E Wi-Fi, followed by the Gigabyte Z590 Aorus Master and Gigabyte’s Z590 Vision G. Today, we take a close look at the MSI MEG Z590 Ace. We’ll have to wait for benchmark results, though, to see if it deserves a spot on our best motherboards list.
The latest version of the Ace board features robust power delivery, four M.2 sockets, a premium audio codec and more. The new Ace also has updated styling on the heatsink and shrouds while still keeping the black with gold highlights theme from the previous generation. Emblazoned on the rear IO is the MSI Dragon (with RGB LEDs) and the Ace name (no lighting). We don’t have an exact price for the MEG Z590 Ace. However, the Z490’s MSRP was $399, so we expect the Z590 version to cost the same or slightly more.
MSI’s current Z590 product stack consists of 11 models, with most falling into the MEG (high-end), MPG (mid-range) and MAG (budget) lineups. We’re greeted by several familiar SKUs and a couple of new ones. Starting at the top is the flagship MEG Z590 Godlike, the Ace we’re looking at now, and a Mini ITX MEG Z590I Unify. The mid-range MPG line consists of four boards (Carbon EK X, Gaming Edge WiFi, Gaming Carbon WiFi and Gaming Force), while the less expensive MAG lineup consists of two boards (Z590 Tomahawk WiFi and Torpedo). Wrapping up the current product stack are two ‘Pro’ boards in the Z590 Pro WiFi and Z590-A Pro. The only thing missing out of the gate is a Micro ATX board, but it’s likely we’ll see one or two down the line.
We can’t talk about Rocket Lake-S performance yet — not that we have a CPU at this time to test boards with anyway. All we’ve seen at this point are rumors and a claim from Intel of a significant increase to IPC. But the core count was lowered from 10 cores/20 threads in Comet Lake (i9-10900K) to 8 cores/16 threads in the yet-to-be-released i9-11900K. To that end, we’ll stick with specifications and features, adding a full review that includes benchmarking, overclocking and power consumption shortly.
MSI’s MEG Z590 Ace includes all the bits you expect from a premium motherboard. The board has a stylish appearance, very capable power delivery (16-phase 90A Vcore) and the flagship Realtek ALC4082 audio codec with included DAC. We’ll cover these features and much more in detail below. First, here are the full specs from MSI.
(1) Intel Wi-Fi 6E AX210 (MU-MIMO, 2.4/5/6GHz, BT 5.2)
USB Controllers
??
HD Audio Codec
Realtek ALC4082
DDL/DTS Connect
✗ / DTS:X Ultra
Warranty
3 Years
The accessories included with the board are reasonably comprehensive, including most of what you need to get started. Below is a full list.
Manual
Quick Installation Guide
USB drive (Drivers)
Cleaning brush
Screwdrivers
Stickers (MEG/Cable)
(4) SATA cables
(4) Screws/standoff sets for M.2 sockets
Thermistor cable
1 to 2 RGB LED Y cable, Corsair RGB LED cable, Rainbow RGB LED cable
DP to mini DP cable
Looking at the Z590 Ace for the first time, we see the black PCB along with black heatsinks and shrouds covering most of the board. MSI stenciled on identifying language such as the MEG Ace name and the MSI Gaming Dragon in gold, setting this SKU apart from the rest. The VRM heatsinks are both made from a solid block of aluminum with lines cut out. Additionally, the shroud is made of metal and connected to the heat pipes, increasing surface area significantly. Also worth noting: the VRM heatsinks share the load, connected via heatpipe. RGB LED lighting is minimal here, with a symbol shining through a mesh cover on the chipset heatsink and the MSI dragon above the rear IO. While tastefully done, some may want more. With its mostly black appearance, the board won’t have trouble fitting in with most build themes.
Focusing on the top half of the board, we get a better look at what’s going on with the VRM heatsinks and other board features in this area. In the upper-left corner, we spot two 8-pin EPS connectors, one of which is required for operation. Just below this is the shroud covering the rear IO bits and part of the VRM heatsink. On it is a carbon-fiber pattern along with the MSI Gaming Dragon illuminated by RGB LEDs. The socket area is relatively clean, with only a few caps visible.
Just above the VRM heatsink is the first of eight fan headers. All fan headers on the board are the 4-pin type and support PWM- and DC-controlled fans and pumps. The CPU_FAN1 header supports up to 2A/24W and auto-detects the attached device type. The PUMP_FAN1 supports up to 3A/36W. The rest of the system fan headers support up to 1A/12W. This configuration offers plenty of support for most cooling systems. That said, I would like to have seen all pump headers auto-detect PWM/DC modes instead of only CPU_FAN1.
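Those amp and watt ratings are internally consistent, assuming the standard 12 V fan rail. As a quick illustrative check (the header names follow the review; the arithmetic is ours, not MSI's):

```python
# Sanity check: the fan header ratings above all assume a 12V rail.
FAN_RAIL_V = 12

headers = {"CPU_FAN1": 2, "PUMP_FAN1": 3, "SYS_FAN": 1}  # rated amps
for name, amps in headers.items():
    print(f"{name}: {amps}A x {FAN_RAIL_V}V = {amps * FAN_RAIL_V}W")
```

This reproduces the 24W, 36W, and 12W figures quoted for the respective header types.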
To the right of the socket are four reinforced DRAM slots. The Z590 Ace supports up to 128GB of RAM with speeds listed up to DDR4 5600 (for one stick with one rank). The highest supported speed with two DIMMs is DDR4 4400+, which is plenty fast enough for an overwhelming majority of users.
Moving down the right edge of the board, we see the 2-character debug LED up top, a system fan header, five voltage read points (Vcore/DRAM/SA/IO/IO2), a 4-LED debug display, the 24-pin ATX connector, and finally, a USB 3.2 Gen2 Type-C front-panel header. Between both debug tools and the voltage read points, you’ll have an accurate idea of what’s going on with your PC.
With the MEG Z590 Ace towards the top of the product stack, you’d expect well-built power delivery and you wouldn’t be wrong. MSI lists the board as 16+2+1 (Vcore/GT/SA) and it uses a Renesas ISL69269 (X+Y+Z = 8+2+1) PWM controller that feeds power to eight-phase doublers (Renesas ISL617A), then onto 16 90A Renesas ISL99390B MOSFETs for the Vcore. This configuration yields 1440A of power for the CPU, which is plenty for ambient and sub-ambient/extreme overclocking. It won’t be this board holding you back in any overclocking adventures, that’s for sure.
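That 1440A figure is simply the per-stage rating multiplied out. A trivial sketch, using only values from the review:

```python
# Combined theoretical current capacity of the Vcore VRM,
# using the stage count and per-stage rating quoted above.
vcore_stages = 16       # Renesas ISL99390B power stages
amps_per_stage = 90     # rated current per stage, in amps

print(vcore_stages * amps_per_stage)  # 1440 (amps)
```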
As we focus on the bottom half, we’ll take a closer look at the integrated audio, PCIe slot configuration and storage. Starting with the audio bits on the left side, under the shroud is Realtek’s latest premium codec, the ALC4082. Additionally, the Z590 Ace includes an ESS Sabre 9018Q2C combo DAC, a dedicated headphone amplifier (up to 600 Ohm) and high-quality Chemicon audio capacitors. This audio solution should be more than adequate for most users.
In the middle of the board are four M.2 sockets and five PCIe slots. With the PCIe connectivity, all three full-length slots are reinforced to prevent shearing and EMI, while the two PCIe x1 slots don’t have any reinforcement. The top slot supports PCIe 4.0 x16 speeds, with the second and third slots PCIe 3.0. The full-length slots break down as x16/x0/x4, x8/x8/x4, or x8/x4+x4/x4. This configuration supports 2-Way Nvidia SLI and 2-Way AMD CrossFire. All x1 slots and the full-length bottom slot are fed from the chipset, while the top two full-length slots source their lanes from the CPU.
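The bifurcation options are easier to scan as a lookup. The mode labels below are our own illustrative names, not MSI's; the third full-length slot is chipset-fed and stays at x4 throughout, so only the top two slots split CPU lanes:

```python
# The slot breakdown quoted above as a simple lookup. Mode names are
# hypothetical labels; slot 3 (chipset-fed) remains x4 in every mode.
slot_modes = {
    "single card": ("x16", "x0", "x4"),
    "dual card":   ("x8", "x8", "x4"),
    "split":       ("x8", "x4+x4", "x4"),
}
for mode, lanes in slot_modes.items():
    print(f"{mode}: {'/'.join(lanes)}")
```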
M.2 storage on the Z590 Ace consists of four onboard sockets supporting various speeds and module lengths. The top slot, M2_1, supports PCIe 4.0 x4 modules up to 110mm. Worth noting on this socket is that it only works with an 11th Gen Intel CPU installed. M2_2, M2_3, M2_4 are fed from the chipset, with M2_2 and M2_3 supporting SATA- and PCIe-based modules up to 80mm, while M2_4 supports PCIe only. M2_2/3/4 are all PCIe 3.0 x4.
The way this is wired, you will lose some SATA ports and PCIe bandwidth depending on the configuration. For example, SATA2 is unavailable when using a SATA-based SSD in the M2_2 socket. SATA 5/6 are unavailable when using the M2_3 socket with any type of device. Finally, the bandwidth on M2_4 switches from x4 to x2 when PCI_E5 (bottom x1 slot) is used. The M.2 sockets support RAID 0/1 for those who would like additional speed or redundancy.
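The sharing rules above can be summarized in a small sketch. The socket and port names follow the review text; the function itself is purely illustrative, not anything from MSI's documentation:

```python
# Illustrative sketch of the Z590 Ace's M.2 / SATA sharing rules
# described above. Socket/port names follow the review text.

def disabled_sata_ports(m2_2_has_sata_ssd: bool, m2_3_used: bool) -> set:
    """Return the SATA ports lost for a given M.2 configuration."""
    lost = set()
    if m2_2_has_sata_ssd:        # SATA SSD in M2_2 disables SATA2
        lost.add("SATA2")
    if m2_3_used:                # any device in M2_3 disables SATA5/6
        lost.update({"SATA5", "SATA6"})
    return lost

# Worst case: SATA drive in M2_2 plus any module in M2_3 loses
# three of the six ports, leaving three SATA ports available.
print(sorted(disabled_sata_ports(True, True)))  # ['SATA2', 'SATA5', 'SATA6']
```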
Finally, along the right edge of the board are six horizontally oriented SATA ports. The Z590 Ace supports RAID 0, 1 and 10 on the SATA ports. Just be aware you lose a couple of ports on this board if you’re using some of the M.2 sockets. Above these ports is a USB 3.2 Gen1 front panel header along with another 4-pin system fan header.
Across the board’s bottom edge are several headers, including more USB ports, fan headers, and more. Below is the full list, from left to right:
Front Panel Audio
aRGB and RGB headers
(3) System Fan headers
Supplemental PCIe power
Tuning controller connector
Temperature sensor
(2) USB 2.0 headers
LED switch
BIOS selector switch
OC Retry jumper
TPM header
Power and Reset buttons
Slow mode jumpers
Front panel connectors
Moving to the rear IO area, we see the integrated IO plate sporting a black background with gold writing matching the board theme. There are eight USB Type-A ports (two USB 3.2 Gen2, four USB 3.2 Gen1 and two USB 2.0 ports). On the Type-C front, the Z590 Ace includes two Thunderbolt 4 ports capable of speeds up to 40 Gbps. Just to the right of those are Mini-DisplayPort inputs for running video through the Thunderbolt connection(s). Handling the video output for the CPU’s integrated graphics is a single HDMI (2.0b) port. We also spy here the Wi-Fi antenna connections, 5-plug plus SPDIF audio stack, Intel 2.5 GbE and finally, a Clear CMOS button and BIOS Flashback button that can be used without a CPU.
Software
For Z590, MSI has changed up its software offerings. We used to have several individual programs to adjust the system, but MSI moved to an all-in-one application called MSI Center with this board. The new software is a central repository for many of the utilities (12) MSI offers. These include Mystic Light (RGB control), AI Cooling (adjust fan speeds), LAN Manager (control the NIC), Speed Up (for storage), and Gaming Mode (auto-tune games), among several others (see the screenshots below for details). The User Scenario application has a couple of presets for system performance and is where you manually adjust settings, including CPU clock speeds and voltage, RAM timings, and more. Overall, I like the move to a single application. The user interface is easy to read and get around in. However, sometimes loading these applications takes longer than I would like to see. But MSI Center does an excellent job of pulling everything in.
Firmware
To give you a taste of the firmware, we’ve gathered screenshots showing most BIOS screens. MSI’s BIOS differs from other board partners’ in that the headings aren’t at the top but split out to the sides. In each section, all the frequently used options are easy to find and not buried deep within menus. Overall, MSI didn’t change much here when moving from Z490 to Z590, and its BIOS continues to be easy to use.
Future Tests and Final Thoughts
With Z590 boards arriving but no Rocket Lake-S CPUs yet, we’re in an odd place. We know most of these boards should perform similarly to our previous Z490 motherboard reviews. And while there are exceptions, they likely sit mostly at the bottom of the product stack. To that end, we’re posting these as detailed previews until we get data using a Rocket Lake processor.
Once we receive a Rocket Lake CPU and as soon as any embargos have expired, we’ll fill in the data points, including the benchmarking/performance results, as well as overclocking/power and VRM temperatures.
We’ll also be updating our test system hardware to include a PCIe 4.0 video card and storage so we can utilize the platform to its fullest with the fastest supported protocols. We will also update to the latest Windows 10 64-bit OS (20H2) with all threat mitigations applied, and use the newest video card driver available when we start this testing. We use the latest non-beta motherboard BIOS available to the public unless otherwise noted. While we do not have performance results from the yet-to-be-released Rocket Lake CPU, we’re confident the 90A VRMs will handle the i9-11900K processor without issue. A quick test with the i9-10900K found the board quite capable with that CPU, easily allowing the 5.2 GHz overclock we set. For now, we’ll focus on features, price, and appearance until we gather performance data from the new CPU.
The MSI MEG Z590 Ace is a premium motherboard adorned with several high-end features, including a very robust VRM capable of handling 10th and 11th generation flagship Intel processors at both stock speeds and overclocked. Additionally, the board includes four M.2 sockets, 2.5 GbE and integrated Wi-Fi 6E, and two Thunderbolt 4 ports for increased bandwidth and peripheral flexibility.
The MEG Z590 Ace’s 16-phase 90A VRM handled our i9-10900K without issue, even overclocked to 5.2 GHz. We’ll retest once we receive our Rocket Lake-based i9-11900K, but so long as the BIOS is right, it shouldn’t pose any problems for this board. Unlike the Gigabyte Z590 Vision G, populating the four M.2 sockets here causes SATA ports to drop, because more lanes are shared through the chipset on this board. That said, even in the worst-case scenario, you can run four M.2 modules and still have three SATA ports left over. Most users should find this acceptable.
As far as potential drawbacks go, the price point of $400-plus will be out of reach for some users. Another concern for some may be the lack of RGB elements on the board. The MSI dragon and chipset heatsink light up with RGB LEDs, but that’s it. If you like a lot of RGB LED bling, you can add it via the four aRGB/RGB headers located around the board. The other drawback is the lack of a USB 3.2 Gen2x2 Type-C port, but the faster Thunderbolt 4 ports certainly make up for that.
Direct competitors at this price point are the Asus ROG Strix Z590-E Gaming, Gigabyte Z590 Aorus Master, and the ASRock Z590 Taichi. All of these boards are plenty capable, with the differences residing in VRMs (Gigabyte gets the nod here), M.2 storage (MSI and Gigabyte both have four sockets) and audio (the Ace has the most premium codec). Beauty is in the eye of the beholder, but if you forced me to pick among these, the Taichi would be the board I’d want to show off the most. That said, no board here is a turnoff, and each has its own benefits over the others.
The Ace’s appearance, including the brushed aluminum and carbon fiber-like finish, really gives it a premium look and feel, while easily blending in with your build theme. If your budget allows for a ~$400 motherboard and you’re looking for a lot of M.2 storage and enjoy a premium audio experience, the MEG Z590 Ace is an excellent option near that price point. Stay tuned for benchmarking, overclocking, and power results using the new Rocket Lake CPU.
(Pocket-lint) – There are countless TVs out there that might have amazing picture quality and resolution, but a bad software experience – whether it’s slow and annoying, or missing key apps.
Media streaming sticks are the easiest and generally most affordable way around problems like these. They’re one-stop shops for your streaming needs, packing in your favourite apps like Netflix and Amazon Prime Video into a small package that generally plugs into an HDMI slot behind your TV. They don’t need space for a set-top box, and are quick and easy to set up – but which ones are the best? Read on to find out.
Which is the best streaming stick for you?
Amazon Fire TV 4K
squirrel_widget_146520
Amazon’s 4K stick is a marvel that boasts all the major streaming apps in one place, and is quick and responsive to use.
Its 4K output is reliable and also speedy, and the stick itself is unobtrusive enough, while the remote is similarly intuitive. The inclusion of Alexa means that you can use voice search if you’re so inclined, although we’re not always convinced by its results.
Amazon Fire TV Stick 4K review: Superbly priced Prime streamer
Chromecast with Google TV
squirrel_widget_2709201
Google’s recent return to the world of streaming has produced the new Chromecast with Google TV – it’s not technically a stick, but it’s small and hangs off your TV, so in every way that matters, it counts.
The software experience is great, and being able to cast to it is super helpful, although there are a couple of services that haven’t quite made it to the new Google TV yet, which is the only thing holding it back.
Google Chromecast with Google TV review: Welcome to the party, Google
Roku Streaming Stick+
squirrel_widget_143466
Roku has made good inroads in the last few years with really low-cost streaming devices that are easy to use, and this stick is no exception.
It’s well-priced, and the UI is really easy to use, plus it’s got all the major apps accounted for to make sure that you can watch whatever you want – in 4K.
Roku Streaming Stick+ review: First-class streaming
Amazon Fire TV Stick (2020)
squirrel_widget_140302
If you fancy the Amazon experience for your streaming (and that’s sensible – it’s great), but you don’t have a 4K TV or don’t think you want to spend that much, this is a great alternative.
It’s basically Amazon’s full package, just without 4K capability, and it works great as just that. You’ll save a bit of cash, although if you think you’ll go 4K soon we’d probably splash out for the pricier, future-proofed version. For an even bigger saving, check out the Fire TV Stick Lite.
Now TV Smart Stick
squirrel_widget_143487
You might notice the similarities between the remotes on Roku’s stick and this Now TV-branded one – they’re near identical, and the experiences are similarly close.
For those in the UK, Now TV has a range of good options available that make it a great place to start with streaming, and while it’s not quite as adaptable as some of the others on this list, and is missing some key services, it’s a good pick if you want a simple solution. It maxes out at 1080p, though.
Now TV Smart Stick review: Flexible passes, now in Full HD
Writing by Max Freeman-Mills. Editing by Dan Grabham.
We now have official specs for the AMD Radeon RX 6700 XT, yet another poorly kept secret in the land of GPUs you can’t actually buy. We’ve been expecting Navi 22 to join the ranks of the best graphics cards and land somewhere near the RTX 3060 Ti in our GPU benchmarks hierarchy for several months now, and it will officially arrive on March 18, 2021, at 9am Eastern. It will be completely sold out by 9:00:05, and based on recent events like the RTX 3060 12GB, we doubt more than a handful of people will manage to acquire one at whatever MSRP AMD sets.
Speaking of which, AMD revealed that it plans to launch the RX 6700 XT with a starting price of $479. Considering AMD expects it to be faster than the RTX 3070, never mind the RTX 3060 Ti, that’s a reasonable target. The die size also appears to be relatively large, thanks to a still-sizeable Infinity Cache. Here’s the full list of known specs:
The AMD Radeon RX 6700 XT comes in with the highest GPU clocks we’ve seen to date, 2424 MHz. The RX 6800 XT and RX 6900 XT both have 2250 MHz game clocks, though in actual benchmarks, we’ve seen speeds of more than 2500 MHz already — the Game Clock is more of a conservative boost clock. Even with a drop down to 40 CUs (from 60 CUs on the RX 6800), the higher clock speeds should prove relatively potent. Raw theoretical performance sits at 12.4 TFLOPS, and assuming AMD uses 16Gbps GDDR6 again (which is likely), it will have 384GBps of bandwidth. That may sound modest, but it’s backed by a honking 96MB L3 Infinity Cache.
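Those two headline numbers check out with standard RDNA 2 math. The 64 stream processors per CU and two FLOPS per clock (via FMA) are architectural constants rather than AMD-supplied figures for this card, so treat this as a back-of-the-envelope sketch:

```python
# Back-of-the-envelope check of the RX 6700 XT figures quoted above.
# 64 stream processors per CU and 2 FLOPS/clock (FMA) are standard
# RDNA 2 assumptions, not AMD-provided numbers for this SKU.
cus = 40
shaders = cus * 64            # 2560 stream processors
clock_ghz = 2.424             # the quoted 2424 MHz

tflops = shaders * 2 * clock_ghz / 1000
print(round(tflops, 1))       # 12.4 TFLOPS

bus_bits = 192
gddr6_gbps = 16               # assumed 16Gbps GDDR6
bandwidth = bus_bits / 8 * gddr6_gbps
print(bandwidth)              # 384.0 GBps
```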
We were very curious about how far AMD would cut down the Infinity Cache from Navi 21. The answer appears to be “not very much.” The biggest Navi chip has up to 80 CUs and 128MB of Infinity Cache, so AMD cut the computational resources in half but only lopped off a quarter of the cache. That should keep cache hit rates high, which means effective bandwidth — even from a 192-bit memory interface — should be much higher than Nvidia’s similarly equipped RTX 3060 12GB.
Let’s go back to that TFLOPS number for a moment, though. 12.4 TFLOPS may not sound like much, but it’s a big jump from the previous gen 40 CU part. The RX 5700 XT had a theoretical 9.8 TFLOPS, and we know the Infinity Cache allows the GPU to get closer to that maximum level of performance in games. That means a 40-50 percent jump in performance might be possible. On the other hand, the RX 6800 with 60 CUs, even at lower clocks, is rated for 16.2 TFLOPS, a 31% increase in compute potential. It also has 33% more memory bandwidth, which means on average it should be at least 20% faster than the 6700 XT, for about 20% more money (well, if MSRP was anything but a fantasy right now).
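To put rough numbers on that comparison: the RX 6800's $579 MSRP isn't stated above but is AMD's announced launch price, and the percentages are simple ratios of the figures already quoted:

```python
# Compute-vs-price ratio between the RX 6800 and RX 6700 XT, using
# the TFLOPS figures above and each card's launch MSRP (in USD).
tflops_6700xt, tflops_6800 = 12.4, 16.2
price_6700xt, price_6800 = 479, 579

print(f"compute: +{(tflops_6800 / tflops_6700xt - 1) * 100:.0f}%")  # +31%
print(f"price:   +{(price_6800 / price_6700xt - 1) * 100:.0f}%")    # +21%
```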
There are other indications this will still be a performant card, like the 230W board power (just 20W lower than RX 6800). And then there’s the die shot comparison.
AMD didn’t reveal all of the specs, but based on that image, it looks like RX 6700 XT / Navi 22 will max out at 96 ROPs (Render Outputs), and the total die size looks to be in the neighborhood of 325mm square, with around 16-17 billion transistors (give or take 10%). That’s quite a bit smaller than Navi 21 (520mm square and 26.8 billion transistors), and perhaps the above images aren’t to scale, but clearly, there’s a lot of other circuitry besides the GPU cores that still needs to be present — the cores and cache only account for about half of the die area.
By way of comparison, Nvidia’s GA106 measures 276mm square with 12 billion transistors, while the GA104 has 17.4 billion transistors and a 393mm square die size. AMD’s Navi 22 should be competitive with GA104, but with a smaller size thanks to its TSMC N7 process technology. However, TSMC N7 costs more and is in greater demand, which leads back to the $479 price point.
Performance, as usual, will be the real deciding factor on how desirable the RX 6700 XT ends up being. AMD provided some initial benchmark results — using games and settings that generally favor its GPUs, naturally. Take these benchmarks with a grain of salt, in other words, but even reading between the lines, the 6700 XT looks pretty potent.
That’s eight games, three with definite AMD ties (Assassin’s Creed Valhalla, Borderlands 3, and Dirt 5) and two with Nvidia ties (Cyberpunk 2077 and Watch Dogs Legion). AMD says “max settings,” but we suspect that means max settings but without ray tracing effects. Still, there are a lot of games that don’t use RT, and of those that have it, the difference in visual quality isn’t even that great for a lot of them, so rasterization performance still reigns as the most important factor. Based on AMD’s data, it looks like the RX 6700 XT will trade blows with the RTX 3070.
AMD had a few other announcements today. It’s bringing resizable BAR support, called AMD Smart Access Memory, to Ryzen 3000 processors. That excludes the Ryzen 3200G and 3400G APUs, which of course, are technically Zen+ architecture and have a limited x8 PCIe link to the graphics. AMD also didn’t mention any Ryzen 4000 mobile or desktop APUs (i.e., Renoir), so those may not be included either, but every Zen 2 and Zen 3 AMD CPU will have Smart Access Memory.
AMD didn’t discuss future Navi 22-derived graphics cards, but there will inevitably be more products built around the GPU. From what we can tell, RX 6700 XT uses the fully enabled chip with 40 CUs. Just as we’ve seen with Navi 21 and previous GPUs like Navi 10, not all chips are fully functional, and harvesting those partial dies is a key component of improving yields. We expect to see an RX 6700 (non-XT) at the very least, and there are opportunities for OEM-only variants as well (i.e., similar to the RX 5500 non-XT cards of the previous generation). We’ll probably see the RX 6700 (or whatever the final name ends up being) within the next month.
Again, pricing and availability are critical factors for any GPU launch, and while we have no doubt AMD will sell every RX 6700 XT it produces, we just hope it can produce more than a trickle of cards. When asked about this, AMD issued the following statement:
“We hear, and understand, the frustration from gamers right now due to the unexpectedly strong global demand for graphics cards. With the AMD Radeon RX 6700 XT launch, we are on track to have significantly more GPUs available for sale at launch. We continue to take additional steps to address the demand we see from the community. We are also refreshing stock of both AMD Radeon RX 6000 Series graphics cards and AMD Ryzen 5000 Series processors on AMD.com on a weekly basis, giving gamers and enthusiasts a direct option to purchase the latest Ryzen CPUs and Radeon GPUs at the suggested etail and retail price.”
That’s nice to hear, but we remain skeptical. We’ve been tracking general trends in the marketplace, and it’s clear Nvidia continues to sell far more graphics cards than AMD, and it’s still not coming anywhere close to meeting demand. Will Navi 22 buck that trend? Our Magic 8-Ball was cautiously optimistic, as you can see:
All joking aside, we’re looking forward to another likely frustrating GPU launch. There’s no indication that AMD will follow Nvidia’s example and try to limit mining performance on its future GPUs, but with or without high mining performance, the RX 6700 XT will inevitably sell out. There’s at least some good news in recent GPU mining profitability trends, however: Cards that were making $12–$15 per day last month are now mining in the $6–$8 range and dropping. That’s not going to stop mining completely, but hopefully it means fewer people trying to start up mining farms if the potential break-even point is more than a year away, rather than 3–4 months out.
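The break-even claim is simple division. Here's a sketch using the $479 MSRP and the daily-profit range quoted above; real returns shift constantly with network difficulty, coin prices, and electricity costs:

```python
# Rough mining break-even estimate for a card bought at MSRP, using
# the daily-profit figures quoted above. Power costs are ignored,
# so these are best-case (shortest) break-even times.
card_price = 479.0  # RX 6700 XT MSRP in USD

for daily_profit in (15.0, 8.0, 6.0):
    days = card_price / daily_profit
    print(f"${daily_profit}/day -> {days:.0f} days to break even")
```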
The AMD Radeon RX 6700 XT officially launches on March 18. We’ll have a full review at that time. Given the pictures AMD sent along, we expect there will be dual-fan reference cards, but AMD will want to shift the bulk of cards over to its AIB partners. We should see various models from all the usual partners, and we’re eager to see how the GPU fares in independent testing. Check back on March 18 to find out.
Below is the full slide deck from AMD’s announcement today.
Image 1 of 35
Image 2 of 35
Image 3 of 35
Image 4 of 35
Image 5 of 35
Image 6 of 35
Image 7 of 35
Image 8 of 35
Image 9 of 35
Image 10 of 35
Image 11 of 35
Image 12 of 35
Image 13 of 35
Image 14 of 35
Image 15 of 35
Image 16 of 35
Image 17 of 35
Image 18 of 35
Image 19 of 35
Image 20 of 35
Image 21 of 35
Image 22 of 35
Image 23 of 35
Image 24 of 35
Image 25 of 35
Image 26 of 35
Image 27 of 35
Image 28 of 35
Image 29 of 35
Image 30 of 35
Image 31 of 35
Image 32 of 35
Image 33 of 35
Image 34 of 35
Image 35 of 35