Microsoft has pulled the trigger on the May 2021 update, and the patch is available to download right now. The update comes with a handful of new features, like Windows Hello multicamera support and the removal of the legacy Microsoft Edge browser, plus a ton of bug fixes and other quality-of-life improvements to Windows 10.
If you want to download the update, you can either use Microsoft's Update Assistant/Media Creation Tool or manually "check for updates" through Windows Update (in Settings). However, not every PC will have 21H1 available if you use the Windows Update method. This is perfectly normal, as Microsoft releases new updates in "waves" to ensure the reliability of the product.
If you use your PC for work or you need a stable system, we recommend waiting until Windows Update automatically installs 21H1 instead of manually updating to it. While this new May update does bring a lot of bug fixes and improvements, plenty of Microsoft's updates in the past have had major problems upon first release.
Microsoft's focus on bug fixes and quality of life for the 21H1 update makes a lot of sense, as the company's next big feature update for Windows 10, codenamed "Sun Valley," is supposed to be the largest update to Windows 10 we've ever seen. Sun Valley plans to give Windows 10 a big overhaul to its user interface and also bring in features from the now-canceled Windows 10X operating system.
Microsoft finally addressed the state of Windows 10X in a blog post about the May 2021 update. Reports of Windows 10X being shelved started surfacing earlier this month.
“Following a year-long exploration and engaging in conversations with customers, we realized that the technology of Windows 10X could be useful in more ways and serve more customers than we originally imagined. We concluded that the 10X technology shouldn’t just be confined to a subset of customers,” vice president of program management John Cable wrote.
“Instead of bringing a product called Windows 10X to market in 2021 like we originally intended, we are leveraging learnings from our journey thus far and accelerating the integration of key foundational 10X technology into other parts of Windows and products at the company.”
We’re unlikely to see too much of that here, but the upcoming 21H2 update may have a far bigger redesign.
There are new features, but it’s the biggest design update in years
Google is announcing the latest beta for Android 12 today at Google I/O. It has an entirely new design based on a system called “Material You,” featuring big, bubbly buttons, shifting colors, and smoother animations. It is “the biggest design change in Android’s history,” according to Sameer Samat, VP of product management, Android and Google Play.
That might be a bit of hyperbole, especially considering how many design iterations Android has seen over the past decade, but it’s justified. Android 12 exudes confidence in its design, unafraid to make everything much larger and a little more playful. Every big design change can be polarizing, and I expect Android users who prefer information density in their UI may find it a little off-putting. But in just a few days, it has already grown on me.
There are a few other functional features being tossed in beyond what’s already been announced for the developer betas, but they’re fairly minor. The new design is what matters. It looks new, but Android by and large works the same — though, of course, Google can’t help itself and again shuffled around a few system-level features.
I've spent a couple of hours demoing all of the new features and the few days since previewing some of the new designs in the beta that's being released today. Here's what to expect in Android 12 when it is officially released later this year.
Material You design and better widgets
Android 12 is one implementation of a new design system Google is debuting called Material You. Cue the jokes about UX versus UI versus… You, I suppose. Unlike the first version of Material Design, this new system is meant to mainly be a set of principles for creating interfaces — one that goes well beyond the original paper metaphor. Google says it will be applied across all of its products, from the web to apps to hardware to Android. Though as before, it’s likely going to take a long time for that to happen.
In any case, the point is that the new elements in Android 12 are Google’s specific implementations of those principles on Pixel phones. Which is to say: other phones might implement those principles differently or maybe even not at all. I can tell you what Google’s version of Android 12 is going to look and act like, but only Samsung can tell you what Samsung’s version will do (and, of course, when it will arrive).
The feature Google will be crowing the most about is that when you change your wallpaper, you’ll have the option to automatically change your system colors as well. Android 12 will pull out both dominant and complementary colors from your wallpaper automatically and apply those colors to buttons and sliders and the like. It’s neat, but I’m not personally a fan of changing button colors that much.
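Google hasn't detailed the exact algorithm behind that color extraction, but Android has long shipped a public library, androidx.palette, that does something conceptually similar. Here's a rough Kotlin sketch of the idea; to be clear, this is an illustration, not the Android 12 theming engine itself:

```kotlin
// Illustrative only: pulling dominant and accent colors from a wallpaper
// bitmap with AndroidX's Palette library, in the spirit of Android 12's
// wallpaper-based theming. Google's actual implementation isn't public.
import android.graphics.Bitmap
import androidx.palette.graphics.Palette

fun extractThemeColors(wallpaper: Bitmap, fallback: Int): Pair<Int, Int> {
    val palette = Palette.from(wallpaper).generate() // analyze the bitmap
    val dominant = palette.getDominantColor(fallback) // e.g. for backgrounds
    val accent = palette.getVibrantColor(dominant)    // e.g. for buttons and sliders
    return dominant to accent
}
```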
The lock screen is also set for some changes: the clock is huge and centered if you have no notifications and slightly smaller but still more prominent if you do. It also picks up an accent color based on the theming system. I especially love the giant clock on the always-on display.
Android's widget system has developed a well-deserved bad reputation. Many apps don't bother with widgets, and many more haven't updated their widgets' look since they first made one in days of yore. The result is a huge swath of ugly, broken, and inconsistent widgets for the home screen.
Google is hoping to fix all of that with its new widget system. As with everything else in Android 12, the widgets Google has designed for its own apps are big and bubbly, with a playful design that’s not in keeping with how most people might think of Android. One clever feature is that when you move a widget around on your wallpaper, it subtly changes its background color to be closer to the part of the image it’s set upon.
I don’t have especially high hopes that Android developers will rush to adopt this new widget system, so I hope Google has a plan to encourage the most-used apps to get on it. Apple came very late to the home screen widget game on the iPhone, but it’s already surpassed most of the crufty widget abandonware you’ll find from most Android apps.
Bigger buttons and more animation
As you’ve no doubt gathered already from the photos, the most noticeable change in Android 12 is that all of the design elements are big, bubbly, and much more liberal in their use of animation. It certainly makes the entire system more legible and perhaps more accessible, but it also means you’re just going to get fewer buttons and menu items visible on a single screen.
That tradeoff is worth it, I think. Simple things like brightness and volume sliders are just easier to adjust now, for example. As for the animations, so far, I like them. But they definitely involve more visual flourish than before. When you unlock or plug in your phone, waves of shadow and light play across the screen. Apps expand out clearly from their icon’s position, and drawers and other elements slide in and out with fade effects.
More animations mean more resources and potentially more jitter, but Samat says the Android team has optimized how Android displays core elements. The window and package managers use 22 percent less CPU time, the system server uses 15 percent less of the big (read: more powerful and battery-intensive) core on the processor, and interrupts have been reduced, too.
Android has another reputation: solving for jitter and jank by just throwing ever-more-powerful hardware at the problem: faster chips, higher refresh rate screens, and the like. Hopefully none of that will be necessary to keep these animations smooth on lower-end devices. On my Pixel 5, they’ve been quite good.
One last bit: there’s a new “overscroll” animation — the thing the screen does when you scroll to the end of a page. Now, everything on the screen will sort of stretch a bit when you can’t scroll any further. Maybe an Apple patent expired.
Shuffling system spaces around
It wouldn’t be a new version of Android without Google mucking about with notifications, Google Assistant, or what happens when you press the power button. With Android 12, we’ve hit the trifecta. Luckily, the changes Google has made mostly represent walking back some of the changes it made in Android 11.
The combined Quick Settings / notifications shade remains mostly the same — though the huge buttons mean you’re going to see fewer of them in either collapsed or expanded views. The main difference in notifications is mostly aesthetic. Like everything else, they’re big and bubbly. There’s a big, easy-to-hit down arrow for expanding them, and groups of notifications are put together into one bigger bubble. There’s even a nice little visual flourish when you begin to swipe a notification away: it forms its own roundrect, indicating that it has become a discrete object.
The thing that will please a lot of Android users is that after just a year, Google has bailed on its idea of creating a whole new power button menu with Google Pay and smart home controls. Instead, both of those things are just buttons inside the quick settings shade, similar to Samsung's solution.
Holding down the power button now just brings up Google Assistant. Samat says it was a necessary change because Google Assistant is going to begin to offer more contextually aware features based on whatever screen you’re looking at. I say the diagonal swipe-in from the corner to launch Assistant was terrible, and I wouldn’t be surprised if it seriously reduced how much people used it.
I also have to point out that it's a case of Google adopting gestures already popular on other phones: the iPhone's power button brings up Siri, and a Galaxy's brings up Bixby.
New privacy features for camera, mic, and location
Google is doing a few things with privacy in Android 12, mostly focused on three key sensors it sees as trigger points for people: location, camera, and microphone.
The camera and mic will now flip on a little green dot in the upper-right of the screen, indicating that they’re on. There are also now two optional toggles in Quick Settings for turning them off entirely at a system level.
When an app tries to use one of them, Android will pop up a box asking if you want to turn it back on. If you choose not to, the app thinks it has access to the camera or mic, but all Android gives it is a black nothingness and silence. It’s a mood.
For location, Google is adding another option for what kind of access you can grant an app. Alongside the options to limit access to one time or just when the app is open, there are settings for granting either “approximate” or “precise” locations. Approximate will let the app know your location with less precision, so it theoretically can’t guess your exact address. Google suggests it could be useful for things like weather apps. (Note that any permissions you’ve already granted will be grandfathered in, so you’ll need to dig into settings to switch them to approximate.)
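For developers, this surfaces through the existing fine and coarse location permissions: an app targeting Android 12 requests both, and the system dialog lets the user pick which to grant. A minimal Kotlin sketch follows; the activity name and callback handling are hypothetical, but the permission constants and the activity-result API are standard Android:

```kotlin
// When both permissions are requested on Android 12, the system dialog
// offers a "precise" vs. "approximate" choice. If the user picks
// approximate, only ACCESS_COARSE_LOCATION is granted.
import android.Manifest
import androidx.activity.result.contract.ActivityResultContracts
import androidx.appcompat.app.AppCompatActivity

class WeatherActivity : AppCompatActivity() {
    private val locationRequest = registerForActivityResult(
        ActivityResultContracts.RequestMultiplePermissions()
    ) { grants ->
        val precise = grants[Manifest.permission.ACCESS_FINE_LOCATION] == true
        val approximate = grants[Manifest.permission.ACCESS_COARSE_LOCATION] == true
        // A weather app works perfectly well with approximate alone.
    }

    fun requestLocation() = locationRequest.launch(
        arrayOf(
            Manifest.permission.ACCESS_FINE_LOCATION,
            Manifest.permission.ACCESS_COARSE_LOCATION
        )
    )
}
```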
Google is also creating a new “Privacy Dashboard” specifically focused on location, mic, and camera. It presents a pie chart of how many times each has been accessed in the last 24 hours along with a timeline of each time it was used. You can tap in and get to the settings for any app from there.
The Android Private Compute Core
Another new privacy feature is the unfortunately named “Android Private Compute Core.” Unfortunately, because when most people think of a “core,” they assume there’s an actual physical chip involved. Instead, think of the APCC as a sandboxed part of Android 12 for doing AI stuff.
Essentially, a bunch of Android machine learning functions are going to be run inside the APCC. It is walled off from the rest of the OS, and the functions inside it are specifically not allowed any kind of network access. It literally cannot send or receive data from the cloud, Google says. The only way to communicate with the functions inside it is via specific APIs, which Google emphasizes are "open source" as some kind of talisman of security.
Talisman or no, it's a good idea. The operations that run inside the APCC include Android's feature for ambiently identifying playing music. That needs to have the microphone listening on a very regular basis, so it's the sort of thing you'd want to keep local. The APCC also handles the "smart chips" for auto-reply buttons based on your own language usage.
An easier way to think of it: if there's an AI function you might think is creepy, Google is running it inside the APCC so its powers are limited. And it's also a sure sign that Google intends to introduce more AI features into Android in the future.
No news on app tracking — yet
Location, camera, mic, and machine learning are all privacy vectors to lock down, but they’re not the kind of privacy that’s on everybody’s mind right now. The more urgent concern in the last few months is app tracking for ad purposes. Apple has just locked all of that down with its App Tracking Transparency feature. Google itself is still planning on blocking third-party cookies in Chrome and replacing them with anonymizing technology.
What about Android? There have been rumors that Google is considering some kind of system similar to Apple’s, but there won’t be any announcements about it at Google I/O. However, Samat confirmed to me that his team is working on something:
There’s obviously a lot changing in the ecosystem. One thing about Google is it is a platform company. It’s also a company that is deep in the advertising space. So we’re thinking very deeply about how we should evolve the advertising system. You see what we’re doing on Chrome. From our standpoint on Android, we don’t have anything to announce at the moment, but we are taking a position that privacy and advertising don’t need to be directly opposed to each other. That, we don’t believe, is healthy for the overall ecosystem as a company. So we’re thinking about that working with our developer partners and we’ll be sharing more later this year.
A few other features
Google has already announced a bunch of features in earlier developer betas, most of which are under-the-hood kind of features. There are “improved accessibility features for people with impaired vision, scrolling screenshots, conversation widgets that bring your favorite people to the home screen” and the already-announced improved support for third-party app stores. On top of those, there are a few neat little additions to mention today.
First, Android 12 will (finally) have a built-in remote that will work with Android TV systems like the Chromecast with Google TV or Sony TVs. Google is also promising to work with partners to get car unlocking working via NFC and (if a phone supports it) UWB. It will be available on “select Pixel and Samsung Galaxy phones” later this year, and BMW is on board to support it in future vehicles.
For people with Chromebooks, Google is continuing the trend of making them work better with Android phones. Later this year, Chrome OS devices will be able to immediately access new photos in an Android phone’s photo library over Wi-Fi Direct instead of waiting for them to sync up to the Google Photos cloud. Google still doesn’t have anything as good as AirDrop for quickly sending files across multiple kinds of devices, but it’s a good step.
Android already has fast pairing for quickly setting up Bluetooth devices, but it’s not built into the Bluetooth spec. Instead, Google has to work with individual manufacturers to enable it. A new one is coming on board today: Beats, which is owned by Apple. (Huh!) Ford and BMW cars will also support one-tap pairing.
Android Updates
As always, no story about a new version of Android would be complete without pointing out that the only phones guaranteed to get it in a timely manner are Google’s own Pixel phones. However, Google has made some strides in the past few years. Samat says that there has been a year-over-year improvement in the “speed of updates” to the tune of 30 percent.
A few years ago, Google changed the architecture of Android with something called Project Treble. It made the system a little more modular, which, in turn, made it easier for Android manufacturers to apply their custom versions of Android without mucking about in the core of it. That should mean faster updates.
Some companies have improved slightly, including the most important one, Samsung. However, it’s still slow going, especially for older devices. As JR Raphael has pointed out, most companies are not getting updates out in what should be a perfectly reasonable timeframe.
Beyond Treble, there may be some behind-the-scenes pressure happening. More and more companies are committing to providing updates for longer. Google also is working directly with Qualcomm to speed up updates. Since Qualcomm is, for all intents and purposes, the monopoly chip provider for Android phones in the US, that should make a big difference, too.
That’s all heartening, but it’s important to set expectations appropriately. Android will never match iOS in terms of providing timely near-universal updates as soon as a new version of the OS is available. There will always be a gap between the Android release and its availability for non-Pixel phones. That’s just the way the Android ecosystem works.
That’s Android 12. It may not be the biggest feature drop in years, but it is easily the biggest visual overhaul in some time. And Android needed it. Over time and over multiple iterations, lots of corners of the OS were getting a little crufty as new ideas piled on top of each other. Android 12 doesn’t completely wipe the slate clean and start over, but it’s a significant and ambitious attempt to make the whole system feel more coherent and consistent.
The beta that’s available this week won’t get there — the version I’m using lacks the theming features, widgets, and plenty more. Those features should get layered in as we approach the official release later this year. Assuming that Google can get this fresh paint into all of the corners, it will make Google’s version of Android a much more enjoyable thing to use.
Microsoft is confirming today that it no longer plans to release Windows 10X. The operating system was originally supposed to arrive alongside new dual-screen devices like the Surface Neo, with a more lightweight and simplified interface and features. This was all before the pandemic hit, and Microsoft then decided to prioritize Windows 10X for single-screen laptops instead. Windows 10X is now officially over, and Microsoft plans to bring its best bits into Windows 10.
“Instead of bringing a product called Windows 10X to market in 2021 like we originally intended, we are leveraging learnings from our journey thus far and accelerating the integration of key foundational 10X technology into other parts of Windows and products at the company,” confirms John Cable, head of Windows servicing and delivery.
Some of that has already started appearing in the form of a new app container technology, better voice typing, and a modernized touch keyboard for Windows 10. Microsoft says it will now “continue to invest in areas where the 10X technology” makes sense for both software and hardware in the future. It’s unlikely that we’ll ever see the Surface Neo device now, though.
Windows 10X was going to appear in 2021 as more of a Chrome OS competitor, and Microsoft had spent years trying to modernize Windows in an attempt to bring a more lightweight version to market. Windows RT first launched alongside the original Surface tablet in 2012, and then Windows 10 S arrived in 2017. Both failed to simplify Windows, but Windows 10X had some promising features that will now start to appear in Windows 10 instead.
While Microsoft released a smaller Windows 10 May 2021 Update today, a larger one is planned for October. This next major update includes some big visual changes in the form of new system icons, File Explorer improvements, and the end of Windows 95-era icons. Microsoft is also focusing on some key features and additions like fixing the rearranging apps issue on multiple monitors, adding the Xbox Auto HDR feature, and also improving Bluetooth audio support.
Microsoft’s next major Windows 10 update is starting to roll out to devices today. The Windows 10 May 2021 Update focuses on improving remote work scenarios, with changes like being able to use multiple Windows Hello cameras on a single machine. That’s particularly useful for Surface devices that owners might want to connect to a monitor with an additional webcam while working from home.
Here are the full new features of the Windows 10 May 2021 Update (version 21H1):
Windows Hello multicamera support, which sets the external camera as the default when both external and internal Windows Hello cameras are present.
Windows Defender Application Guard performance improvements including optimizing document opening scenario times.
Windows Management Instrumentation (WMI) Group Policy Service (GPSVC) updating performance improvement to support remote work scenarios.
Microsoft typically delivers a big major update of Windows 10 during the springtime, with a smaller one in the fall. The company has reversed that cadence for 2021, so the update that will likely arrive in October will be full of changes.
The next major update will include new system icons, File Explorer improvements, and even the end of Windows 95-era icons. Microsoft has some even broader visual changes arriving in Windows 10, as part of a “sweeping visual rejuvenation of Windows.” The October update will also fix the rearranging apps issue on multiple monitors, add the Xbox Auto HDR feature, and even improve Bluetooth audio support.
Today’s May 2021 Update is so small that you’ll barely even notice it install. Microsoft has been using a special enablement package so that the features are simply hidden on your Windows 10 PC right now, and this update switches them on.
As always, the Windows 10 May 2021 Update will be available on Windows Update, but if you don't see it yet, it's because Microsoft is rolling this out in waves to ensure there are no compatibility issues. If you're feeling brave, Microsoft does let people force the update through its installation media tool.
The Alienware m15 Ryzen Edition R5 is so good that it makes us wonder why Dell didn’t team up with AMD on a laptop sooner.
For
+ Strong gaming performance
+ Excellent productivity performance
+ Unique chassis
+ Not too costly for its power
Against
– Internals run hot
– Middling audio
– Bad webcam
It's been 14 years since Alienware last used an AMD CPU in one of its laptops, but AMD's recent Ryzen processors have proven to be powerhouses that have generated a strong gamer fanbase. It also doesn't hurt that AMD-based laptops have frequently undercut Intel in price. Point being, times have changed, and now Team Red can easily compete with the best gaming laptops that Intel has to offer.
So it makes sense that Alienware's finally been granted permission to board Dell's UFO. And with the Alienware m15 Ryzen Edition R5, it's getting first-class treatment.
Alienware m15 Ryzen Edition R5 Specifications
CPU: AMD Ryzen 7 5800H
Graphics: Nvidia GeForce RTX 3060 6GB GDDR6, 1,702 MHz boost clock, 125 W total graphics power
Memory: 16GB DDR4-3200
Storage: 512GB M.2 PCIe NVMe SSD
Display: 15.6-inch, 1920 x 1080, 165Hz, IPS
Networking: 802.11ax Killer Wi-Fi 6, Bluetooth 5.2
Ports: 3x USB-A 3.2 Gen 1, HDMI 2.1, 1x USB-C 3.2 Gen 2 (DisplayPort), RJ-45 Ethernet, 3.5mm combination headphone/microphone port
Camera: 720p
Battery: 86 WHr
Power Adapter: 240W
Operating System: Windows 10 Home
Dimensions (WxDxH): 14.02 x 10.73 x 0.9 inches (356.2 x 275.2 x 22.85 mm)
Weight: 5.34 pounds (2.42 kg)
Price (as configured): $1,649
Design of the Alienware m15 Ryzen Edition R5
Unlike other recent Alienware laptops, the m15 Ryzen Edition R5 only comes in black. The "lunar light" white isn't an option here. Still, it's a bold design that puts the emphasis on the laptop's build quality rather than on decoration, and it pays off. The m15 R5 feels sturdy in the hand, and its smooth edges lend it a premium feel. It's not too plain, either: lighting options for the Alienware logo on the lid plus a circular LED strip along the back rim add a touch of flair. On that note, the stylized "15" on the lid is stylish, though it can look a bit too much like a "13" from the wrong angle.
Hexagonal vents that sit above the keyboard and along the back also give the m15 R5 a bit of functional decoration and help make up for the small and well hidden side vents. The keyboard on this model has four-zone RGB, but it can be a little dim in well-lit areas.
This laptop veers toward the large and heavy end for systems with an RTX 3060. At 14.02 x 10.73 x 0.9 inches and 5.34 pounds, it's generally bulkier than the Asus TUF Dash F15 we reviewed, which has a mobile RTX 3070, measures 14.17 x 9.92 x 0.78 inches and weighs 4.41 pounds. The Acer Predator Triton 300 SE, which manages to fit a mobile RTX 3060 into a 14-inch device, is also especially impressive next to the m15 R5. Granted, both of those use lower-power processors designed for thinner machines. Specifically, the Acer measures 12.7 x 8.97 x 0.70 inches and weighs 3.75 pounds.
The Alienware m15 R4, which has a 10th gen 45W Intel Core i7 processor and an RTX 3070, is 14.19 x 10.86 x 0.78 inches large and weighs 5.25 pounds. That leaves it not as bulky as the m15 Ryzen Edition R5, but about as heavy.
Port selection is varied, although distribution differs from my usual preferences. The left side of the laptop only has the Ethernet port and the 3.5mm headphone/microphone jack, which is a shame as that’s where I typically like to connect my mouse. The back of the laptop has a few more connections, including the DC-in, an HDMI 2.1 port, a USB 3.2 Gen 1 Type-A port and a USB 3.2 Gen 2 Type-C port that also supports DisplayPort. The right side of the laptop has two additional USB 3.2 Gen 2 Type-A ports.
Gaming Performance on the Alienware m15 Ryzen Edition R5
Our review configuration of the Alienware m15 Ryzen Edition R5 came equipped with an 8-core, 16-thread Ryzen 7 5800H CPU and an RTX 3060 laptop GPU. It's the first time we've tested a 45W CPU with an RTX 3060 and, to that end, we've decided to compare it to one 35W laptop with an RTX 3070 GPU, the Asus TUF Dash F15 with an Intel Core i7-11370H, and one 35W laptop with an RTX 3060 GPU, the Acer Predator Triton 300 SE with an Intel Core i7-11375H. We've also thrown the Alienware m15 R4 into the mix, which has a 45W 10th-gen Intel CPU and an admittedly more powerful RTX 3070, plus a significantly higher price tag than any other competitor, even in its cheapest configuration (the thing starts at $2,149).
I played Control on the Alienware laptop for a half hour to get a personal feel for gaming on the system. I tended to fall between 60 and 70 fps at high settings throughout, and turning ray tracing on at its high preset dropped that to 30 to 40 fps. The fans are certainly noticeable but aren't ear-splitting, and the laptop never got hot to the touch, nor did it spray hot air on my hands.
In Shadow of the Tomb Raider’s benchmark running at highest settings, the m15 Ryzen Edition R5’s CPU seemed to do it a favor, as its 73 fps average only barely fell behind the m15 R4’s 77 fps average. The Acer laptop was next in line with 61 fps, while the Asus laptop was significantly behind all other options at 54 fps.
Scores were a bit more even in Far Cry: New Dawn’s benchmark running at ultra settings. While the m15 R4 hit 91 fps, everything else was in the 70s. The m15 Ryzen Edition R5 had an average of 79 fps, while the Asus scored 74 fps and the Acer reached 73 fps.
The m15 Ryzen Edition R5 fell to third place in the Grand Theft Auto V benchmark running at very high settings, where it hit an 82 fps average and the Asus laptop achieved an 87 fps average. The Acer laptop was significantly behind at 72 fps, while the m15 R4 was significantly ahead at 108 fps.
Red Dead Redemption 2's benchmark running at medium settings saw the m15 Ryzen Edition R5 once again stay in third place, though by a more significant margin this time. The R5 achieved a 53 fps average, while the Asus hit 61 fps. The Acer was once again behind at 48 fps, while the m15 R4 stayed ahead at 69 fps.
We also ran the Alienware m15 Ryzen Edition R5 through the Metro Exodus RTX benchmark 15 times in a row to test how well it holds up to a sustained heavy load. During this benchmark, it hit an average of 56 fps. The CPU ran at an average clock speed of 3.63 GHz, while the GPU ran at an average clock speed of 1.82 GHz. The CPU's average temperature was 90.36 degrees Celsius (194.65 degrees Fahrenheit), and the GPU's was 82.02 degrees Celsius (179.64 degrees Fahrenheit).
Productivity Performance for the Alienware m15 Ryzen Edition R5
While Alienware is a gaming brand, the use of a 45W AMD chip does open the Alienware m15 Ryzen Edition R5 up to high productivity potential.
On Geekbench 5, which is a synthetic test for tracking general PC performance, the m15 Ryzen Edition R5 hit 1,427 points on single-core tests and 7,288 points on multi-core tests. While its single-core score was on the lower end when compared to the Asus TUF Dash F15's 1,576 points and the Acer Predator Triton 300 SE's 1,483 points, the Alienware blew those laptops away on multi-core scores. The Asus' multi-core score was 5,185, while the Acer's was 5,234.
The Alienware m15 R4 was a bit more even with its AMD cousin, scoring 1,209 on single-core Geekbench 5 tests and 7,636 on the program’s multi-core benchmarks.
Unfortunately, the m15 Ryzen Edition R5 couldn't maintain that momentum in our 25GB file transfer benchmark. Here, it transferred files at 874.14 MBps, while the Asus hit 1,052.03 MBps and the Acer reached 993.13 MBps. The m15 R4 hit speeds of 1,137.34 MBps.
The m15 Ryzen Edition R5 was the fastest contender in our Handbrake video encoding test, though, where we track how long it takes a computer to transcode a video down from 4K to FHD. The m15 Ryzen Edition R5 completed this task in 7:05, while the Asus took 10:41 and the Acer was even slower at 11:36. The m15 R4 almost caught up to its AMD cousin with a time of 7:07.
Display for the Alienware m15 Ryzen Edition R5
Our configuration for the Alienware m15 Ryzen Edition R5 came with a 15.6 inch 1920 x 1080 IPS display with a 165Hz refresh rate. While it boasted impressive gaming performance and strong benchmark results, it still proved problematic for viewing content.
I watched the trailers for Nomadland and Black Widow on the m15 Ryzen Edition R5, where I found the blacks to be shallow and the viewing angles restrictive. In my office during the daytime, I couldn't easily see the screen's picture unless I was sitting directly in front of it. Turning my lights off and closing my curtain only extended viewing angles to about 30 degrees. Glare also proved to be an issue in the light, although turning the lights off did fix this problem.
Colors were bright enough to pop occasionally but not consistently, with bolder tones like reds and whites holding up better than more subdued ones. Here, Black Widow came across a bit more vividly than the naturalistic style of Nomadland, so this screen might be better suited for more colorful, heavily produced films.
Our testing put the m15 Ryzen Edition R5's color range above its closest competitors, the Asus TUF Dash F15 and Acer Predator Triton 300 SE, though not by much. With an 87.3% DCI-P3 color gamut, it's only slightly ahead of the Acer's 80.6% DCI-P3 score. The TUF Dash F15 had a starker difference, with a 78.5% DCI-P3 color gamut.
Our brightness testing saw the Alienware pull a more solid lead. With an average of 328 nits, it easily surpassed the Acer’s 292 nits and the Asus’ 265 nits.
The Alienware m15 R4 blew all of these systems out of the water, although the OLED screen our configuration had makes the comparison more than a bit unfair. Its DCI-P3 gamut registered at 150% while its average brightness was 460.2 nits.
To test the m15 Ryzen Edition R5's 165Hz screen, I also played Overwatch on it. Here, I had a much more pleasant experience than I did when watching movie trailers. The game's bright colors appeared quite vivid, and the fast refresh rate was perfectly able to keep up with the 165 fps I was hitting on Ultra settings.
Keyboard and Touchpad on the Alienware m15 Ryzen Edition R5
The Alienware m15 Ryzen Edition R5 configuration we received has a 4-zone RGB membrane keyboard, though other configurations do offer mechanical switches made in collaboration with Cherry. You can currently get that upgrade for an additional $98.
The membrane nature of this keyboard doesn't keep it from impressing, though. Keys have a noticeable resistance when pressed, and 1.7mm of key travel gives you plenty of tactile feedback. I consistently scored around 83 words per minute on the 10fastfingers.com typing test, which is impressive as my average is usually around 75 wpm.
In an unusual choice, the Alienware's audio control keys sit in the keyboard's rightmost column rather than being mapped to the Fn row as secondary functions. Instead, the Page Up and Page Down keys that would normally be found there are secondary functions on the arrow keys.
The 4.1 x 2.4-inch touchpad doesn’t fare as well. While it has precision drivers and is perfectly smooth when scrolling with one finger, I felt too much friction when using multi-touch gestures to pull them off comfortably or consistently. For instance, when trying to switch apps with a three-fingered swipe, I would frequently accidentally pinch zoom instead.
Audio on the Alienware m15 Ryzen Edition R5
The Alienware m15 Ryzen Edition R5 has two bottom-firing speakers that are loud with surprisingly decent bass, but they tend to get tinny on higher notes.
I tested the m15 Ryzen Edition R5's audio by listening to "Save Your Tears" by The Weeknd, which easily filled my whole two-bedroom apartment with sound. I was also surprised to be able to hear the strum of the song's bass guitar, as it's not uncommon for other laptops to cut it out, make it quiet, or give it a more synth-like quality. Unfortunately, higher notes suffered from tinniness and echo.
Upgradeability of the Alienware m15 Ryzen Edition R5
The Alienware m15 Ryzen Edition R5 is easy to open and offers plenty of room for user upgrades. Just unscrew the four screws closest to the back of the laptop, then loosen the four screws on the front (we used a PH0 Phillips-head bit).
Gently pry the case off, and you’ll see the networking card, two swappable DIMMs of RAM, the M.2 SSD and a second, open M.2 SSD slot (if you don’t buy the laptop with dual SSDs).
The only tradeoff here is that the SSDs use the smaller, less common M.2 2230 form factor (most are 2280), so you'll probably need to buy a specialized drive for this laptop.
Battery Life on the Alienware m15 Ryzen Edition R5
The Alienware m15 Ryzen Edition R5 is a power hog, with half the non-gaming battery life of the RTX 3060 and RTX 3070 35W laptops we tested it against. This shouldn’t come as too much of a surprise, since it also has a 45W CPU, but don’t expect to be able to spend too much time away from an outlet.
In our non-gaming battery test, which continually streams video, browses the web and runs OpenGL tests over Wi-Fi at 150 nits of brightness, the m15 Ryzen Edition R5 held on for 3:29. That's about three hours less than we got out of both the Asus TUF Dash F15, which had a 6:32 battery life, and the Acer Predator Triton 300 SE, which lasted for 6:40.
The Alienware m15 R4, with its 45W Intel chip, also had a shorter battery life than our 35W laptops, though it was slightly longer than the m15 Ryzen Edition R5's. It lasted 4:01 on our non-gaming test.
Heat on the Alienware m15 Ryzen Edition R5
The Alienware m15 Ryzen Edition R5’s surface temperature was impressively cool during non-gaming use but could get toasty in select areas during our gaming benchmarks. For our tests, we measured its temperature both after 15 minutes of streaming video and during the sixth consecutive run of the Metro: Exodus extreme benchmark.
The laptop’s touchpad proved coolest during the video test, registering 81.1 degrees Fahrenheit. This was only slightly behind the center of the keyboard’s temperature, as the typer hit 85.5 degrees Fahrenheit in between the G and H keys. The bottom of the laptop was warmer, hitting 90.9 degrees, although the center-left of the display hinge is where it was hottest, registering 101.1 degrees Fahrenheit.
Our gaming test saw a mild jump in temperatures in all areas except the bottom and the hinge, where numbers spiked much higher. The touchpad was 83.3 degrees Fahrenheit and the center of the keyboard was 90.9 degrees Fahrenheit. By contrast, the bottom of the laptop was now 121.5 degrees Fahrenheit and the hot zone on the hinge was now 136.1 degrees Fahrenheit.
Despite these higher numbers, though, the laptop never became too hot to touch while gaming. It did feel pleasantly warm, however.
Webcam on the Alienware m15 Ryzen Edition R5
The Alienware m15 Ryzen Edition R5's 720p webcam is, like those of many premium gaming laptops, a bit of an afterthought. Regardless of lighting conditions, its shots always have a blocky, fuzzy appearance. Adding light also adds a distracting halo effect to silhouettes, while dimming your surroundings just brings down detail even further.
Software and Warranty on the Alienware m15 Ryzen Edition R5
The Alienware m15 Ryzen Edition R5 comes packed with software, although most of it serves a genuinely useful purpose.
Most of these are apps like Alienware Command Center, which lets you customize lighting and thermals as well as set up macros. Some are less useful than others — Alienware Customer Connect simply exists to get you to fill out surveys — but apps like Alienware Mobile Connect, which lets you easily mirror your phone's screen, transfer its files, or take phone calls from your laptop, are definite standouts. It might be easier to navigate these functions if they were all centralized into one hub app rather than each being its own standalone program, though. My Alienware tries to be this hub app, but it's mostly just a redirect to Alienware Command Center with a bunch of ads on the side.
This laptop also comes with typical Windows pack-ins like Microsoft Solitaire Collection and Spotify. Its default warranty is limited to one year, although you can extend it at checkout.
Configurations for the Alienware m15 Ryzen Edition R5
Our configuration of the Alienware m15 Ryzen Edition R5 came with an AMD Ryzen 7 5800H CPU, an RTX 3060 laptop GPU, 16GB of RAM, a 512GB SSD and a 1920 x 1080, 165Hz display for $1,649. That actually puts it toward the lower end of what's available.
You can upgrade this laptop’s CPU to the Ryzen 9 5900HX, which has the same thread count but boosts up to 4.6 GHz, and its GPU to an RTX 3070 laptop card. Memory options range from 8GB to 32GB, while storage options range from 256GB to 2TB. You can also add on an additional SSD with the same range of options, making for up to 4TB of total combined storage.
There’s also a 360Hz version of the FHD display available, as well as a QHD version with a 240Hz refresh rate and G-Sync support.
Perhaps the most interesting option that wasn't included in our configuration is the mechanical keyboard, which features physical ultra-low-profile switches made in collaboration with Cherry MX.
These upgrades can raise the price up to $2,479, with the display and keyboard upgrades being the most costly components in Dell's customization tool. The Cherry MX keyboard adds $98 to your price at checkout, while the QHD display costs $78. The 360Hz FHD display is only available on the highest preset option, which locks you into a Ryzen 9 5900HX chip and starts at $2,332.
By contrast, the low end of this laptop starts at $1,567.
Bottom Line
The Alienware m15 Ryzen Edition R5 proves that Team Red and Alienware make a strong pairing. While it's not quite the beast that the minimum $2,149 Alienware m15 R4 is, it still manages performance that matches and sometimes beats peers in its price range on most titles, all while rocking Alienware's unique premium looks. At $1,649 for our configuration, it's an easy premium choice over the $1,450 Asus TUF Dash F15. And if you prefer power over size, it's also a better option than the $1,400 Acer Predator Triton 300 SE.
While it’s certainly not the most portable contender and could do with more even port distribution and stronger audio, its 45W CPU lends it just enough of an edge on power to make it a solid first step into Dell’s flagship gaming brand.
Astell & Kern’s digital expertise comes good in this entertaining USB-C cable DAC
For
Notable improvement to audio
Clean, precise character
Nicely made
Against
No iOS device compatibility
No MQA support
For a relatively simple product, Astell & Kern’s first portable DAC has a rather convoluted moniker. ‘Astell & Kern AK USB-C Dual DAC Cable’ isn’t something you’d want to say out loud (or type) often but, to the company’s credit, it sums up the product perfectly: it’s a USB-C cable with two DACs inside.
Thankfully, the name doesn’t attempt to further explain its purpose, so let us fill in the gaps.
Features
Portable DACs – compact DACs that don’t rely on mains power – have arrived in force in recent years with the mission of conveniently improving the sound quality between your phone or computer and wired headphones. That’s because the digital-to-analogue converters and analogue output stages of these do-all devices are generally pretty poor.
Though wireless headphones connected to a device may be the portable audio preference of many nowadays, a wired set-up generally still offers the best performance-per-pound value, particularly if you want to play hi-res audio.
Astell & Kern AK USB-C Dual DAC Cable tech specs
Input: USB-C
Output: 3.5mm
Hi-res audio: PCM 32-bit/384kHz, DSD256
Weight: 27g
While there are a number of traditional box or USB stick portable DACs in existence, the AK USB-C Dual DAC Cable is one of an increasingly common group of DACs designed to enhance on-the-go or desktop sound quality in cable form. This Astell & Kern, like the Zorloo Ztella and THX Onyx, is essentially an extension of your headphones cable; the discreet middleman between them and your source device.
At one end is a 3.5mm output, and at the other is a USB-C connector for plugging into any device with such a port, like an Android phone, Windows 10 PC, tablet or macOS computer. For the bulk of our testing, we use it with a Samsung Galaxy S21 and an Apple MacBook Pro.
Some portable DACs, such as the multi-Award-winning Audioquest DragonFly Red, have a USB-A connection instead, but now that USB-C is becoming more prevalent it makes sense for a portable DAC like this one to adopt it. You can always buy a USB-C-to-USB-A adapter to cater for devices with such ports.
Portable DACs can often be used with Apple’s camera adapter to make them compatible with iPhones and iPads, but Astell & Kern says that isn’t the case here “due to the dual DAC incompatibility and power restrictions of iOS devices”. So iPhone users will have to look elsewhere.
The dual DACs (specifically, two Cirrus Logic CS43198 MasterHIFi chips) support native high-resolution audio playback of PCM files up to 32-bit/384kHz and DSD256. However, due to the AK USB-C Dual DAC Cable’s lack of MQA file support, Tidal HiFi subscribers won’t be able to benefit from the (MQA-encoded) hi-res Tidal Masters that are part of the tier’s offering. It’s also worth noting that the DAC has been built for sound output only, so it won’t work with headphones with an in-line remote.
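To put those hi-res figures in perspective, here's some back-of-the-envelope maths of our own (not figures from Astell & Kern's spec sheet) on what a PCM stream at the DAC's ceiling actually carries compared with CD quality:

```kotlin
// Rough PCM data-rate arithmetic: bit depth x sample rate x channels.
// Illustrative calculations only, not from the product spec sheet.
fun pcmBitsPerSecond(bitDepth: Int, sampleRateHz: Int, channels: Int): Long =
    bitDepth.toLong() * sampleRateHz * channels

fun main() {
    val cd = pcmBitsPerSecond(16, 44_100, 2)    // CD quality
    val max = pcmBitsPerSecond(32, 384_000, 2)  // this DAC's PCM ceiling
    println("CD audio: %.2f Mbps".format(cd / 1e6))        // ~1.41 Mbps
    println("32-bit/384kHz: %.2f Mbps".format(max / 1e6))  // ~24.58 Mbps
}
```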
A portable cable DAC is new territory for Astell & Kern – the company is most renowned for its portable music players but also makes headphones and desktop audio systems. But digital-to-analogue conversion technology is something the company is well versed in. And that shows.
For the AK USB-C Dual DAC Cable, Astell & Kern says it developed a circuit on a six-layer PCB just 14 x 41mm in size, featuring bespoke capacitors found in its music players and optimised to prevent power fluctuations. The analogue amplifier (with a 2Vrms output level), meanwhile, is designed to drive even power-hungry and high-impedance headphones.
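That 2Vrms claim is easy to sanity-check with the standard P = V²/R relationship. Assuming the amplifier can actually supply the current (our assumption, not a measured figure), the available power works out like this:

```kotlin
// Ballpark output power into a headphone load from a 2Vrms source,
// using P = V^2 / R. Our own illustrative maths, not A&K measurements.
fun powerMilliwatts(vrms: Double, impedanceOhms: Double): Double =
    vrms * vrms / impedanceOhms * 1000

fun main() {
    println("32-ohm earbuds: %.0f mW".format(powerMilliwatts(2.0, 32.0)))      // 125 mW
    println("300-ohm headphones: %.1f mW".format(powerMilliwatts(2.0, 300.0))) // ~13.3 mW
}
```

Double-digit milliwatts into a 300-ohm load is comfortably more than most high-impedance headphones need for sensible listening levels, which squares with the claim.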
Sound
We use a range of headphones, from high-end Grados to more modest Beyerdynamic on-ears and Sennheiser Momentum earbuds – and the Astell & Kern doesn’t struggle to power any of them. However, we would be wary of your playback device’s volume output level when you first connect the DAC and plug in your headphones (especially if you’re using more than one pair) to avoid getting an unexpected earful. It’s something Astell & Kern advises in the manual, too.
Adding the AK USB-C Dual DAC Cable between these headphones and our source devices (which provide power to the DAC) makes a world of difference. As the likes of the Zorloo Ztella and Audioquest DragonFly Black have shown, even a modest outlay can make a significant improvement to your portable sound.
The Samsung Galaxy S21 is by no means the worst-sounding smartphone out there, and yet the Astell & Kern makes music come through our wired headphones much clearer, cleaner and punchier than with just a standard USB-C-to-3.5mm dongle. This little DAC doesn’t just do the basics by amplifying the sound and beefing up its tone, it also goes the extra mile to open up music and let you in on more of its detail.
Considering the increasing competition in the portable DAC market, you could say it’s a necessary mile. One of our favourite portable DACs, the Audioquest DragonFly Red, proves to be a notably more insightful and rhythmically entertaining performer – but then it is significantly pricier at £169 ($200, AU$280). For this modest amount of money, the AK USB-C Dual DAC Cable is a very attractive proposition indeed.
We play Lesley by Dave ft Ruelle and the rapper’s poignant storytelling is all the more compelling for the boost in clarity and vocal insight delivered by the DAC. The melodious synth chords, which twinkle with clarity against the contrasting backdrop, are planted with precision on either side.
It’s a similar story as we plug the Astell & Kern into our MacBook Pro and settle into Big Thief’s Shoulder, the presentation pleasantly opened up and generously populated with definition aplenty around Adrianne Lenker’s pleading vocal delivery and the warm textures of the band’s hallmark folksy guitar licks.
Build
So, it sounds good. But what's it like to live with? After all, this is an everyday device that's likely to sit in your pocket or on your desktop during the 9 to 5. Perhaps most crucially for a device of this nature, the AK USB-C Dual DAC Cable is compact, lightweight (27g) and well made – to the extent that, before long, we feel comfortable tossing it in a bag or shoving it into a trouser pocket.
The twisted cable between the USB-C output and main body – made up of Technora aramid fibre at its core, wrapped by copper layers and finished with shielding treatment – makes it easy to manipulate the device into a jeans pocket when connected to a phone, and feels built to last. It also helps absorb the shock of accidental knocks, unlike USB stick designs.
While we would expect a device like this to last years, in the weeks we spend in its company we feel confident of its durability. Even when we accidentally yank the device out of our playback source with the cable a number of times, it proves hardy enough to withstand it.
While made to fit nicely into a pocket, some consideration has also clearly been taken to make the AK USB-C Dual DAC Cable look nice when it’s not hidden away – when it’s on a desktop, for example.
The metal casing at the end of the cable – comparable in size with one of the more compact USB sticks in our collection – has a polished finish and an angled surface that resonate with the aesthetic of the company's premium music players. Design niceties on products like these are only ever going to be small touches, but they're here at least.
Verdict
Before Astell & Kern announced its AK USB-C Dual DAC Cable, it wouldn’t have been a stretch to imagine the company making such a product. It has been in the portable digital audio game for years and enjoyed much success.
That know-how has been put to good use in offering USB-C device owners an affordable, practical way to soup up their smartphone or desktop sound through wired headphones. It’s such an appealing option that we can almost forgive the unwieldy name.
We’ve gotten through the expert witnesses of Epic v. Apple, and as a reward, Phil Schiller — currently an “Apple Fellow,” whatever that is, and previously the senior vice president of worldwide marketing — took the stand like a twinkly App Store St. Nick. To hear him tell it, Apple is a wonderful partner to developers, selflessly improving dev tools and responding to their needs. At times the testimony feels like a prolonged ad for iOS.
The goal of the testimony is to paint the App Store as a part of the iPhone that can’t be removed or replaced by a competing alternative. To this end, we heard in exhaustive detail about the improvements made to the iPhone that benefit the developers in the App Store. The chips. The Retina display. The accelerometer. The wireless upgrades. It’s practically an Apple event on the stand.
Among the exhaustive list, Schiller identified Metal, one of the developer tools Apple created. (Metal is a play on "close to the metal," or writing code that's close to the computer's guts.) Apple's counsel says the lawyer version of "roll tape!" and we're treated to a 20-second clip of Tim Sweeney on stage at WWDC, praising Metal as a wonderful tool that will allow developers like Epic Games to create the next generation of improvements. Solid burn!
The overall impression I get from the list of improvements is, essentially, that Fortnite absolutely could not have launched on the first generation of iPhones — the hardware and chips couldn’t have handled the game. That is true! Also: Fortnite didn’t exist yet, so that’s a pretty good reason it couldn’t have been on those early iPhones. But there is another reason a time-traveling Fortnite couldn’t have launched on the first iPhones in 2007: the App Store didn’t exist yet, either. This particular fact is somewhat inconvenient for Apple’s argument that the iPhone and the App Store are inseparable.
Schiller’s testimony spends some quality time back in 2007, to explain the origin of the App Store. When the iPhone launched, the only apps on it were Apple’s own; all other apps were web apps. In response, there was a spate of “jailbreaking” — essentially, hacking the iPhone so you could put your own apps on it. This was the genesis of the App Store: Apple realizing that people were going to put their own apps on the iPhone no matter what it did. If it wanted control of the process, it was going to have to create an official route.
From the jump, security was going to be a concern, Schiller said. After all, the point of the phone was that you could carry it around — which involved collecting location data. So iOS was built from the ground up with this in mind, Schiller says. (This line of testimony is a rebuttal to Epic’s argument that MacOS allows side-loading, and it is therefore anticompetitive that iOS does not.) To put a stop to the jailbreaking, Apple did something unusual: rather than showing the world a finished product, it announced it was working on something. That something was the App Store.
The Steve Jobs line that Epic has touted — ”We don’t intend to make money off the App Store” — comes from these early days. At the time of this announcement, Apple didn’t know if it would make money, Schiller testified. He also suggests that the line was not a promise that Apple would not make money. The App Store was a “huge” risk, Schiller said. “We’re taking our hot new product and putting something we’ve never done before on it, and we have no apps yet! So we have no idea how this is going to do.” This is credible. What is less convincing is Schiller’s attempt to redefine what it means to “lock customers into our ecosystem,” a phrase that comes from a Jobs email entered into evidence earlier in the trial.
Look, “locked in” has an accepted meaning, and it’s not a very friendly one: prisoners, for example, are locked in. Schiller gives this the old college try anyhow, telling the court that the idea behind “locked in” was just to make services more attractive, so that customers wouldn’t want to leave. Later in the email, Jobs talks about making the ecosystem even more “sticky,” which is less menacing, but — glue traps are sticky. So are flystrips. When was the last time that being stuck to something was positive for you?
But hey, Schiller’s a marketer. He was Apple’s marketing guy for actual decades! Always be closing, baby. And so if it at times seemed like he presented Apple as though it were a selfless do-gooder, responding to devs’ requests for in-app payments — which was a then-nascent business — by building capability for that into the store, well, that’s his job. Still, presenting one of the most ruthlessly efficient cash machines in tech as a helpful friend of small developers is kind of like painting a whale shark orange and calling it a goldfish who feeds other goldfish.
Despite Schiller’s friendly demeanor, some of his testimony is a stretch. For instance, he says he doesn’t see mobile as a duopoly. He lists Samsung, Microsoft, Google, and Amazon as competition. The Amazon Fire phone was discontinued in 2015, as was the Windows Phone. Perhaps they haunt his dreams, but they certainly don’t haunt the market.
But Schiller mostly does what he needs to do for Apple — as I suppose he has for 30-odd years. He's cheerful, pleasant to listen to, and at times, very convincing. The question in this case, though, is whether marketing to a judge is as easy as marketing to Apple customers.
Blending Optane memory and QLC flash, Intel’s Optane Memory H20 is an innovative M.2 NVMe SSD that delivers a unique caching experience. It is ultra-responsive to most consumer workloads, especially repetitive tasks.
For
+ Optane caching improves system responsiveness
+ Rivals high-end NVMe SSDs in light and mixed workloads
Against
– Low endurance
– Optane caching not beneficial in all workloads
– Limited to specific systems and 1TB maximum capacity
– Slow sustained write performance after the SLC cache fills
– Lacks AES 256-bit encryption support
Features and Specifications
Not too long ago, Intel killed off most of its client Optane products to focus on one — the Optane Memory H20. The H20 marries the company’s latest QLC flash with Optane Memory to provide fast performance for most client workloads, but it can fall short in large sequential transfers. It’s a clever blend of innovation and technology, bringing improvements over the Optane Memory H10, but it still isn’t quite something that has fully won us over for day-to-day use. It also isn’t available for stand-alone purchases at retail, so it doesn’t make our list of Best SSDs.
Data caching isn’t anything new. For years, Intel has accelerated performance through data caching via various implementations. The company even has competition in the space from software vendors like Enmotus. My first experience with this technology started with the company’s Smart Response Technology a decade ago, which allowed you to use an SSD to cache data from an HDD for faster retrieval. But that was just the beginning.
Intel further refined the technology to leverage its very own Optane Memory, offering unparalleled response times compared to traditional SSDs. The company released products like the M10: low-density, fast-access Optane SSDs for use with supported systems. Shortly after, Intel progressed to new designs.
Two years ago, the company released the H10, a dual-controller hybrid SSD unlike anything we'd seen before. The idea was simple: combine the best of both worlds on a single, slim M.2 stick – high-density NAND flash for capacity along with bleeding-edge Optane Memory for speed. However, our initial impression of the NVMe SSD was rather underwhelming compared to the best SSDs available.
Since then, the company has tweaked and tuned the Rapid Storage Technology caching software with multiple optimizations and improvements. Concurrently, the company focused on advancing the H-series hardware, too. Today, we analyze the latest version of the software in use with the H10’s successor, the H20. Leveraging essentially what is an Intel SSD 670p in conjunction with a newer Optane controller, and of course, that sweet, sweet Optane Memory, Intel’s Optane Memory H20 is quite similar to the H10, only improved.
When Optane caching is enabled, the SSD aggregates the performance of both storage mediums for fast peak performance and quick access times. Intel’s H20 is an OEM-only product, however, meaning you are highly unlikely to find this SSD on sale at retail. Still, drives may trickle down to eBay or similar marketplaces, so purchasers of systems containing this unique SSD can upgrade their capacity.
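Intel doesn’t publish the exact heuristics its RST caching software uses, but the basic mechanism (a small, fast tier absorbing hot blocks in front of a large, slow one) can be sketched in a few lines of Python. Everything below, from the LRU policy to the write-through behavior, is an illustrative toy, not Intel’s actual algorithm:

```python
from collections import OrderedDict

class HybridDrive:
    """Toy model of a two-tier drive: a small, fast cache (Optane-like)
    in front of large, slow media (QLC-like). Illustrative only."""

    def __init__(self, cache_blocks=32):
        self.cache = OrderedDict()          # block -> data, kept in LRU order
        self.cache_blocks = cache_blocks
        self.backing = {}                   # the large, slow tier

    def write(self, block, data):
        self.backing[block] = data          # simplistic write-through policy
        if block in self.cache:
            self.cache[block] = data        # keep the cached copy coherent

    def read(self, block):
        if block in self.cache:             # fast path: cache hit
            self.cache.move_to_end(block)
            return self.cache[block], "fast"
        data = self.backing[block]          # slow path: fetch, then promote
        self.cache[block] = data
        if len(self.cache) > self.cache_blocks:
            self.cache.popitem(last=False)  # evict the least recently used block
        return data, "slow"
```

Run repeated reads against the same blocks and they come back from the fast tier after the first miss, which is exactly the “repetitive tasks” pattern where the H20 is at its best.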
Specifications
| Product | H20 512GB | H20 1TB |
| --- | --- | --- |
| Capacity (User / Raw) | 512GB / 512GB | 1024GB / 1024GB |
| Form Factor | M.2 2280 S3 | M.2 2280 S3 |
| Interface / Protocol | PCIe 3.0 x4 / NVMe 1.3 | PCIe 3.0 x4 / NVMe 1.3 |
| Optane Controller | SLMXT | SLMXT |
| Optane Media | 1st Gen 3D XPoint | 1st Gen 3D XPoint |
| Optane Capacity | 32GB | 32GB |
| NAND Controller | SM2265 | SM2265 |
| DRAM | DDR3 | DDR3 |
| NAND Flash | Intel 144L QLC | Intel 144L QLC |
| Sequential Read | 3,300 MBps | 3,300 MBps |
| Sequential Write | 2,100 MBps | 2,100 MBps |
| Random Read | 65,000 IOPS | 65,000 IOPS |
| Random Write | 40,000 IOPS | 40,000 IOPS |
| Security | Pyrite 2.0 | Pyrite 2.0 |
| Endurance (TBW) | 185 TB | 370 TB |
Intel’s H20 comes in limited capacities of just 512GB and 1TB, and both models come equipped with 32GB of Optane Memory. Intel rates both models for up to 3.3/2.1 GBps of sequential read/write throughput and up to 65,000/40,000 random read/write IOPS at a queue depth (QD) of 1. While the sequential figures aren’t groundbreaking, no flash-based SSD comes close to delivering the same random IOPS performance at low QDs.
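That low-queue-depth advantage is easier to appreciate as latency. At QD1 only one I/O is in flight at a time, so IOPS is simply the reciprocal of average latency; a quick back-of-the-envelope conversion using the spec-sheet numbers above:

```python
def qd1_latency_us(iops):
    # At queue depth 1, average latency = 1 second / IOPS.
    return 1_000_000 / iops

print(qd1_latency_us(65_000))  # ~15.4 microseconds per random read
print(qd1_latency_us(40_000))  # ~25.0 microseconds per random write
```

Flash-based drives typically land several times higher than that for QD1 random reads, which is why Optane’s low latency dominates in light, bursty desktop workloads.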
As an OEM-only product, the H20 doesn’t have clear pricing, but the company states it will ship in June in PCs priced at roughly $800 and up. The H20 also comes with tight hardware requirements: it is only compatible with 11th-Gen Intel Core series processors and Intel 500-series chipsets or newer. You will also need Windows and Intel’s RST driver 18.1 or newer, and there is no planned retroactive support for previous-gen systems.
The H20 supports Pyrite 2.0 security but lacks AES 256-bit hardware-accelerated encryption. It also supports S.M.A.R.T. data reporting and Trim, and it is rated to consume as little as 35mW at idle to keep power draw and heat generation down.
A Closer Look
Intel’s H20 comes in an M.2 2280 single-sided form factor. The drive interfaces with the host over four PCIe 3.0 lanes and communicates via the NVMe 1.3 protocol. The H20 pairs the same components found in the company’s 670p with a newer, faster Optane controller. As with the H10, each of the H20’s SSD controllers gets two lanes, meaning that sequential performance is limited if you don’t enable Optane Memory acceleration.
Silicon Motion developed the NAND controller specifically for Intel and the company’s 144L QLC. The SM2265 is a dual-core, four-channel NVMe controller that interfaces with the flash at fast speeds of up to 1,200 MTps, roughly double the speed of the previous-gen flash. The drive does have DRAM, but very little – our 1TB sample contains only 256MB of DDR3.
Intel’s 144L QLC uses a floating gate design with three 48-layer decks stacked atop each other. Each deck can operate as SLC or QLC. Each deck can also be erased without disturbing the data on other decks, which helps reduce latency spikes caused by garbage collection. It also has four planes for handling parallel data operations and a few new reading and writing techniques to improve responsiveness.
Intel made revisions under the hood of the Optane media controller, too. It features performance and power-management improvements that reduce overall power consumption, something that’s necessary when driving power-hungry Optane media in an M.2 form factor at all, and even more important when that media shares the stick with the secondary storage components. The Optane Memory itself is still first-gen media, however.
DLSS 2.0 off vs DLSS 2.0 on (Image credit: Nvidia)
DLSS stands for deep learning super sampling. It’s a type of video rendering technique that looks to boost framerates by rendering frames at a lower resolution than displayed and using deep learning, a type of AI, to upscale the frames so that they look as sharp as expected at the native resolution. For example, with DLSS, a game’s frames could be rendered at 1080p resolution, making higher framerates more attainable, then upscaled and output at 4K resolution, bringing sharper image quality over 1080p.
DLSS is an alternative to other rendering techniques, like temporal anti-aliasing (TAA), a post-processing algorithm, but it requires an RTX graphics card and game support (see the DLSS Games section below). Games that run at lower frame rates or higher resolutions benefit the most from DLSS.
According to Nvidia, DLSS 2.0, the most common version, can boost framerates by 200-300% (see the DLSS 2.0 section below for more). The original DLSS is in far fewer games and we’ve found it to be less effective, but Nvidia says it can boost framerates “by over 70%.” DLSS can really come in handy, even with the best graphics cards, when gaming at a high resolution or with ray tracing, both of which can cause framerates to drop substantially compared to 1080p.
In our experience, it’s difficult to spot the difference between a game rendered at native 4K and one rendered at 1080p and upscaled to 4K via DLSS 2.0 (that’s the ‘performance’ mode with 4x upscaling). In motion, it’s almost impossible to tell native 4K from DLSS 2.0 in quality mode (i.e., 1440p upscaled to 4K), though the performance gains there aren’t as great.
For a comparison of how DLSS impacts game performance with ray tracing, see: AMD vs Nvidia: Which GPUs Are Best for Ray Tracing?. In that testing, we only used DLSS 2.0 in quality mode (2x upscaling), and the gains are still quite large in the more demanding games.
When DLSS was first released, Nvidia claimed it showed more temporal stability and image clarity than TAA. While that might be technically true, it varies depending on the game, and we much prefer DLSS 2.0 over DLSS 1.0. An Nvidia rep confirmed to us that because DLSS requires a fixed amount of GPU time per frame to run the deep learning neural network, games running at high framerates or low resolutions may not have seen a performance boost with DLSS 1.0.
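Nvidia hasn’t published what that fixed cost actually is, but the break-even logic is simple to model. Here’s a sketch with made-up numbers; the per-frame overhead and the share of frame time that scales with resolution are both assumptions for illustration, not Nvidia’s measurements:

```python
def dlss_frame_time(native_ms, res_bound_share, pixel_fraction, overhead_ms):
    # Split frame time into a part that scales with pixels rendered and a
    # fixed part (game logic, post-processing), then add the neural
    # network's fixed per-frame cost. All numbers are illustrative.
    res_ms = native_ms * res_bound_share
    return res_ms * pixel_fraction + (native_ms - res_ms) + overhead_ms

# 4K at 30 fps, heavily resolution-bound: a large net win.
print(dlss_frame_time(33.3, 0.8, 0.25, 2.0))  # ~15.3 ms, roughly 65 fps

# 1080p at ~145 fps, mostly CPU-bound: the fixed cost erases the gain.
print(dlss_frame_time(6.9, 0.3, 0.25, 2.0))   # ~7.3 ms, i.e., *slower*
```

The pattern matches what Nvidia describes: the slower and more resolution-bound the frame, the more headroom there is for the network’s fixed cost to pay off.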
Below is a video from Nvidia (so take it with a grain of salt) comparing Cyberpunk 2077 gameplay at both 1440p resolution and 4K with DLSS 2.0 on versus DLSS 2.0 off.
DLSS is only available with RTX graphics cards, but AMD is working on its own alternative for Team Red graphics cards. AMD Fidelity FX Super Resolution (FSR) is supposed to debut in 2021. It will require separate support from games, and we haven’t seen it in action yet. But like other FidelityFX technologies, it’s supposed to be GPU agnostic, meaning it will work on Nvidia and even Intel GPUs that have the necessary hardware features. We’re also expecting the next Nintendo Switch to have DLSS via an integrated SoC designed by Nvidia.
DLSS Games
In order to use DLSS, you need an RTX graphics card and a game that supports the feature. You can find Nvidia’s full list of games supporting DLSS as of April below. Unreal Engine and Unity also both support DLSS 2.0, meaning games using those engines should be able to easily implement it.
Anthem
Battlefield V
Bright Memory
Call of Duty: Black Ops Cold War
Call of Duty: Modern Warfare
Call of Duty: Warzone
Control
CRSED: F.O.A.D. (Formerly Cuisine Royale)
Crysis Remastered
Cyberpunk 2077
Death Stranding
Deliver Us the Moon
Edge of Eternity
Enlisted
F1 2020
Final Fantasy XV
Fortnite
Ghostrunner
Gu Jian Qi Tan Online
Iron Conflict
Justice
Marvel’s Avengers
MechWarrior 5: Mercenaries
Metro Exodus
Metro Exodus PC Enhanced Edition
Minecraft With RTX For Windows 10
Monster Hunter: World
Moonlight Blade
Mortal Shell
Mount & Blade II: Bannerlord
Nioh 2 – The Complete Edition
Outriders
Pumpkin Jack
Shadow of the Tomb Raider
System Shock
The Fabled Woods
The Medium
War Thunder
Watch Dogs: Legion
Wolfenstein: Youngblood
Xuan-Yuan Sword VII
DLSS 2.0 and DLSS 2.1
In March 2020, Nvidia announced DLSS 2.0, an updated version of DLSS that uses a new deep learning neural network that’s supposed to be up to 2 times faster than DLSS 1.0 because it leverages RTX cards’ AI processors, called Tensor Cores, more efficiently. This faster network also allows the company to remove any restrictions on supported GPUs, settings and resolutions.
DLSS 2.0 is also supposed to offer better image quality while promising up to 2-3 times the framerate (in 4K Performance Mode) compared to the predecessor’s up to around 70% fps boost. Using DLSS 2.0’s 4K Performance Mode, Nvidia claims an RTX 2060 graphics card can run games at max settings at a playable framerate. Again, a game has to support DLSS 2.0, and you need an RTX graphics card to reap the benefits.
The original DLSS was apparently limited to about 2x upscaling (Nvidia hasn’t confirmed this directly), and many games limited how it could be used. For example, in Battlefield V, if you have an RTX 2080 Ti or faster GPU, you can only enable DLSS at 4K — not at 1080p or 1440p. That’s because the overhead of DLSS 1.0 often outweighed any potential benefit at lower resolutions and high framerates.
In September 2020, Nvidia released DLSS 2.1, which added an Ultra Performance Mode for super high-res gaming (9x upscaling), support for VR games, and dynamic resolution. The latter, an Nvidia rep told Tom’s Hardware, means that, “The input buffer can change dimensions from frame to frame while the output size remains fixed. If the rendering engine supports dynamic resolution, DLSS can be used to perform the required upscale to the display resolution.” Note that you’ll often hear people referring to both the original DLSS 2.0 and the 2.1 update as “DLSS 2.0.”
DLSS 2.0 Selectable Modes
One of the most notable changes between the original DLSS and the fancy DLSS 2.0 version is the introduction of selectable image quality modes: Quality, Balanced, or Performance — and Ultra Performance with 2.1. This affects the game’s rendering resolution, with improved performance but lower image quality as you go through that list.
With 2.0, Performance mode offered the biggest jump, upscaling games from 1080p to 4K. That’s 4x upscaling (2x width and 2x height). Balanced mode uses 3x upscaling, and Quality mode uses 2x upscaling. The Ultra Performance mode introduced with DLSS 2.1 uses 9x upscaling and is mostly intended for gaming at 8K resolution (7680 x 4320) with the RTX 3090. While it can technically be used at lower target resolutions, the upscaling artifacts are very noticeable, even at 4K (720p upscaled). Basically, DLSS looks better as it gets more pixels to work with, so while 720p to 1080p looks good, rendering at 1080p or higher resolutions will achieve a better end result.
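Translating those pixel ratios into render resolutions is straightforward: the per-axis scale is the square root of the pixel ratio. The factors below are derived from the ratios above and are approximate; Nvidia’s shipping values differ slightly (Balanced mode in particular), so treat this as a sketch rather than the official table:

```python
import math

# Approximate per-axis render scale for each DLSS 2.x mode, derived from
# the pixel-count ratios above (e.g., 4x total = 1/2 per axis).
MODES = {
    "Quality":           2 / 3,             # ~2.25x total, 1440p -> 4K
    "Balanced":          1 / math.sqrt(3),  # ~0.58 per axis, ~3x total
    "Performance":       1 / 2,             # 4x total
    "Ultra Performance": 1 / 3,             # 9x total
}

def render_resolution(out_w, out_h, mode):
    s = MODES[mode]
    return round(out_w * s), round(out_h * s)

for mode in MODES:
    print(mode, render_resolution(3840, 2160, mode))
# Quality (2560, 1440), Balanced (~2217, 1247),
# Performance (1920, 1080), Ultra Performance (1280, 720)
```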
How does all of that affect performance and quality compared to the original DLSS? For an idea, we can turn to Control, which launched with DLSS 1.0 and later received DLSS 2.0 support. (Remember, the following image comes from Nvidia, so it’d be wise to take it with a grain of salt too.)
One of the improvements DLSS 2.0 is supposed to bring is strong image quality in areas with moving objects. The updated rendering in the above fan image looks far better than the image using DLSS 1.0, which actually looked noticeably worse than having DLSS off.
DLSS 2.0 is also supposed to provide an improvement over standard DLSS in areas of the image where details are more subtle.
Nvidia promised that DLSS 2.0 would result in greater game adoption. That’s because the original DLSS required training the AI network for every new game that needed DLSS support. DLSS 2.0 uses a generalized network, meaning it works across all games and is trained by using “non-game-specific content,” as per Nvidia.
For a game to support the original DLSS, the developer had to implement it, and then the AI network had to be trained specifically for that game. With DLSS 2.0, that latter step is eliminated. The game developer still has to implement DLSS 2.0, but it should take a lot less work, since it’s a general AI network. It also means updates to the DLSS engine (in the drivers) can improve quality for existing games. Unreal Engine 4 and Unity have both also added DLSS 2.0 support, which means it’s trivial for games based on those engines to enable the feature.
How Does DLSS Work?
Both the original DLSS and DLSS 2.0 work with Nvidia’s NGX supercomputer for training of their respective AI networks, as well as RTX cards’ Tensor Cores, which are used for AI-based rendering.
For a game to get DLSS 1.0 support, Nvidia first had to train the DLSS AI neural network, a type of AI network called a convolutional autoencoder, with NGX. It started by showing the network thousands of screen captures from the game, each with 64x supersample anti-aliasing. Nvidia also showed the neural network images that didn’t use anti-aliasing. The network then compared the shots to learn how to “approximate the quality” of the 64x supersampled image using lower-quality source frames. The goal was higher image quality without hurting the framerate too much.
The AI network would then repeat this process, tweaking its algorithms along the way so that it could eventually come close to matching the 64x quality with the base quality images via inference. The end result was “anti-aliasing approaching the quality of [64x Super Sampled], whilst avoiding the issues associated with TAA, such as screen-wide blurring, motion-based blur, ghosting and artifacting on transparencies,” Nvidia explained in 2018.
DLSS also uses what Nvidia calls “temporal feedback techniques” to ensure sharp detail in the game’s images and “improved stability from frame to frame.” Temporal feedback is the process of applying motion vectors, which describe the directions objects in the image are moving in across frames, to the native/higher resolution output, so the appearance of the next frame can be estimated in advance.
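In code, the core of that reprojection step — warping the previous frame along per-pixel motion vectors to predict the current one — can be sketched with numpy. This is a toy nearest-neighbor gather; a real renderer does this on the GPU with filtering and history validation, and nothing here is Nvidia’s implementation:

```python
import numpy as np

def reproject(prev_frame, motion_vectors):
    """Estimate the current frame by moving each pixel of the previous
    frame along its motion vector. prev_frame: (H, W, 3) floats;
    motion_vectors: (H, W, 2) per-pixel (dy, dx) offsets in pixels."""
    h, w = prev_frame.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    # For each current pixel, fetch where it came from in the last frame.
    src_y = np.clip((ys - motion_vectors[..., 0]).round().astype(int), 0, h - 1)
    src_x = np.clip((xs - motion_vectors[..., 1]).round().astype(int), 0, w - 1)
    return prev_frame[src_y, src_x]

# Toy usage: a scene sliding 2 pixels to the right between frames.
prev = np.random.rand(4, 8, 3)
mv = np.zeros((4, 8, 2))
mv[..., 1] = 2.0
estimate = reproject(prev, mv)
```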
DLSS 2.0 gets its speed boost through its updated AI network that uses Tensor Cores more efficiently, allowing for better framerates and the elimination of limitations on GPUs, settings and resolutions. Team Green also says DLSS 2.0 renders just 25-50% of the pixels (and only 11% of the pixels for DLSS 2.1 Ultra Performance mode), and uses new temporal feedback techniques for even sharper details and better stability over the original DLSS.
Nvidia’s NGX supercomputer still has to train the DLSS 2.0 network, which is also a convolutional autoencoder. Two things go into it, as per Nvidia: “low resolution, aliased images rendered by the game engine” and “low resolution, motion vectors from the same images — also generated by the game engine.”
DLSS 2.0 uses those motion vectors for temporal feedback, which the convolutional autoencoder (or DLSS 2.0 network) performs by taking “the low resolution current frame and the high resolution previous frame to determine on a pixel-by-pixel basis how to generate a higher quality current frame,” as Nvidia puts it.
The training process for the DLSS 2.0 network also includes comparing the image output to an “ultra-high-quality” reference image rendered offline in 16K resolution (15360 x 8640). Differences between the images are sent to the AI network for learning and improvements. Nvidia’s supercomputer repeatedly runs this process, on potentially tens of thousands or even millions of reference images over time, yielding a trained AI network that can reliably produce images with satisfactory quality and resolution.
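As a heavily simplified sketch of that supervised loop: random tensors stand in for the game frames and the offline-rendered references, the network is a toy, and motion-vector inputs are omitted entirely, so nothing below reflects Nvidia’s actual architecture or data — only the shape of the train-against-a-reference process:

```python
import torch
import torch.nn as nn

# Stand-in for the convolutional autoencoder: extract features, then
# expand the image 2x. A real super-resolution network is far larger.
net = nn.Sequential(
    nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
    nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),  # 2x upscale
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(100):
    low_res = torch.rand(8, 3, 64, 64)      # stand-in: aliased game render
    reference = torch.rand(8, 3, 128, 128)  # stand-in: high-quality target
    out = net(low_res)
    # Differences between output and reference drive the learning.
    loss = nn.functional.mse_loss(out, reference)
    opt.zero_grad()
    loss.backward()
    opt.step()
```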
With both DLSS and DLSS 2.0, once the AI network’s training is complete, the NGX supercomputer sends the AI model to your Nvidia RTX graphics card through GeForce Game Ready drivers. From there, your GPU can use its Tensor Cores’ AI power to run DLSS in real time alongside the supported game.
Because DLSS 2.0 is a general approach rather than being trained on a single game, the quality of the DLSS 2.0 algorithm can also improve over time without a game needing to ship updates from Nvidia. The updates reside in the drivers and can benefit all games that utilize DLSS 2.0.
This article is part of the Tom’s Hardware Glossary.
Widespread flaws affecting Wi-Fi have been disclosed to the public by security researcher Mathy Vanhoef, nine months after he tipped off the Wi-Fi Alliance about the problem. The vulnerabilities, reported by Gizmodo from a site set up by Vanhoef, exploit mistakes in the implementation of Wi-Fi standards and can affect any Wi-Fi device, no matter how old, running any level of security, including WPA2 and WPA3.
The ‘fragmentation and aggregation attacks’ – FragAttacks for short – are 12 different vulnerabilities that see Wi-Fi devices leak user data if probed in the right way. Three of the flaws are baked into the Wi-Fi standard itself, while the others flow from programming errors in specific products. The flaws have likely been lurking since Wi-Fi was first released in 1997, as even the venerable WEP protocol is vulnerable – though you really should have moved on from WEP by now, as it’s easily broken.
By taking advantage of the way some routers accept plaintext during handshakes, for example, or the way some networks cache data, intruders could intercept personal data, or even direct users to fake websites. Vanhoef talks us through the attacks in this YouTube video, remotely controlling a smart plug and compromising an outdated Windows 7 PC.
“The biggest risk in practice,” Vanhoef writes, “is likely the ability to abuse the discovered flaws to attack devices in someone’s home network. For instance, many smart home and internet-of-things devices are rarely updated, and Wi-Fi security is the last line of defense that prevents someone from attacking these devices. Unfortunately, due to [these] vulnerabilities, this last line of defense can now be bypassed.”
There is some good news, however: most of the flaws are hard to exploit, and patches are available for many devices, including three from Microsoft going all the way back to Windows 7, and from all major router manufacturers (though not all models have received new firmware yet). At the time of writing, Vanhoef said he wasn’t aware of any attacks in the wild using the exploits. This could be a good time to ditch your service provider’s router for the latest and best routers.
Framework, which announced its modular, repairable laptop back in February, now has full specs and pricing and is opening pre-orders for the machine. It will come in three configurations, starting at $999.
That base model has an Intel Core i5-1135G7 processor, 8GB of DDR4 RAM, a 256GB NVMe SSD, Wi-Fi 6 and will run Windows 10 Home. A $1,399 performance configuration bumps the processor up to an i7-1165G7 and doubles the RAM and storage to 16GB and 512GB, respectively. A professional model, starting at $1,999, has an i7-1185G7, 32GB of RAM, a 1TB SSD, support for vPro and runs Windows 10 Pro.
There will also be a DIY edition, starting at $749 barebones, that you build yourself from a kit and customize with parts and modules.
Each laptop will also have a 3:2, 2256 x 1504 display, a 1080p webcam with a privacy switch, a 55Wh battery and a keyboard with 1.5 mm of key travel, all features you might find in one of the best ultrabooks. The entire motherboard is replaceable to allow upgrades to future generations of processors, which are typically soldered to the board on laptops.
Pre-orders are starting today in the United States, and Canada will come soon, with Asia and Europe coming before the end of the year.
But like most tech companies, Framework hasn’t been immune to supply chain issues, which it says will “limit the number of Framework Laptops we have available at launch.” The company will take its pre-orders in small batches to ensure it can fulfill orders. The first batch, Framework says, will ship in July, with more to come. A pre-order requires a $100 refundable deposit, and the balance will be paid when it’s ready to ship.
While the Framework Laptop sounds promising on paper, other small computer vendors have faced issues with fulfillment. Eve Devices, for instance, had issues fulfilling its Eve V, and some potential buyers have proven far more cautious around its Spectrum monitor and second-gen convertible. But Framework is acknowledging the difficulty in sourcing parts right now, so at least it’s being straightforward there.
The modular Framework Laptop is now available for preorders, with prices starting at $999 for a fully assembled computer or $749 for the DIY Edition, which lets you pay less upfront if you’re willing to bring your own RAM, SSD, charger, and Wi-Fi card.
For prebuilt systems, Framework is offering three starting configurations. There’s the entry level, $999 Base, which offers a Core i5-1135G7 processor, 8GB of RAM, a 256GB NVMe SSD, and Wi-Fi 6. The Performance model, starting at $1,399, ups the processor to a Core i7-1165G7 and doubles the RAM and SSD storage. And the Professional configuration, starting at $1,999, has a Core i7-1185G7, 32GB of RAM, 1TB of storage, and Windows 10 Pro.
Whether fully prebuilt or DIY, the Framework features a 13.5-inch 2256 x 1504 screen, a 1080p 60fps webcam, a 55Wh battery, and a 2.87-pound aluminum chassis. Ports are handled through a customizable expansion card system, which allows users to select up to four from a selection of USB-C, USB-A, HDMI, DisplayPort, and microSD slots. There’s also the option to add extra storage through removable USB-C SSDs, and Framework promises more port options in the future.
To preorder, Framework is only asking that users put down a $100 deposit, with the first laptops set to ship at the end of July. Preorders are only available in the US for now; Canada is promised in “the next few weeks,” and European and Asian availability is set for later this year. Additionally, Framework warns that due to the ongoing global supply shortages, it’ll have limited supply at launch but plans to do its preorders in batches that should ship throughout the year.
Framework is obviously a new company with an untested platform here. It is offering a 30-day return guarantee and a one-year limited warranty, though, which might help anyone deciding whether to plunk down the cash for a new computer.
Gabe Newell, the head of Valve Software, hinted to students in New Zealand that the company might expand its Steam platform, or at least some games, to the console space later this year. The comment is vague at best, but at least it shows that the owner of one of the largest game distribution platforms has not given up its living room gaming plans.
Earlier this week, Gabe Newell spoke to students at Sancta Maria College in Auckland, New Zealand, and was asked whether Steam would be “porting any games on consoles, or [would] it just stay on PC?” The response was imprecise, but we cannot really expect anyone to disclose business plans at an event like this.
“You will get a better idea of that by the end of this year… and it won’t be the answer you expect,” Newell said. “You’ll say, ‘Ah-ha! Now I get what he was talking about.'”
The whole conversation was recorded by a student who later uploaded it to Reddit, as Ars Technica first reported.
Valve’s track record with game consoles in particular, and living room gaming in general, has been bumpy at best. On the one hand, the company has successfully ported its games to consoles from Microsoft and Sony in the past, including the very successful The Orange Box on Xbox 360 and PS3. On the other hand, Valve’s Steam Machines initiative failed, as did its console-oriented SteamOS. Valve’s Steam Link, which let users stream games from a local PC to a TV, also never got popular enough for the company to keep selling the product.
For gamers, getting their Steam libraries on consoles would be a dream come true. It’s unclear how Steam would work on those systems, though, as Nintendo, Microsoft and Sony all run their own exclusive stores on their platforms. Furthermore, many Steam games have no console versions at all, and developing a Windows or Linux emulator for Xbox or PlayStation is one heck of a task. Streaming games to consoles might be a way into the living room for Valve, but controller options may be an obstacle there.
Newell has been living in New Zealand since March 2020, when he was stranded there as the Covid-19 pandemic hit. Since he doesn’t often make big appearances in the games industry, it makes some sense that a bunch of New Zealand students were the first to hear his thoughts on upcoming announcements.
The Aorus FV43U misses a couple of things as a TV replacement, but for gaming, it has few equals. A huge and accurate color gamut coupled with high contrast, 4K resolution and 144 Hz makes it a great choice for both PC and console gamers.
For
+ Class-leading contrast
+ Huge color gamut
+ Accurate out of the box
+ Excellent HDR
+ Solid gaming performance
Against
– No 24p support
– No Dolby Vision
Features and Specifications
If you’re looking for a jumbo-sized gaming monitor, there are plenty of routes you can take. There are multiple sizes of ultrawide 21:9 screens ranging from 34 to 38 inches diagonal. Then there’s the mega-wide 32:9, 49-inch genre. Or you can stick with flat panels in the 16:9 aspect ratio and go 32 inches or larger. Many simply opt for a TV, opening up the field to extra large displays that can top 80 inches.
If you want to stick with a desktop configuration though, the 43-inch category is a good choice. It’s large but not so big that you can’t sit close. It’s possible to play from 3 or 4 feet away, see the entire screen, and fill your peripheral vision with the image. And the 16:9 aspect ratio that 43-inch monitors come in means plenty of height, something that ultrawide and mega-wide monitors don’t have.
You can typically put a 43-inch gaming monitor on your desktop for around $1,500. That’s more than many 55-inch TVs but a computer monitor delivers a few things, like DisplayPort and high refresh rates, that consumer TVs do not. The Gigabyte Aorus FV43U makes the comparison a little easier, however, as it’s going for $1,000 as of writing.
The FV43U is a 16:9 VA panel competing with the best 4K gaming monitors with a 144 Hz refresh rate, AMD FreeSync, HDR and a quantum dot backlight that’s specced to reach 1,000 nits brightness. It also delivers decent sound from its built-in speakers, thanks to multiple sound modes. Let’s take a look.
Gigabyte Aorus FV43U Specs
| Specification | Gigabyte Aorus FV43U |
| --- | --- |
| Panel Type / Backlight | VA / W-LED, edge array |
| Screen Size & Aspect Ratio | 43 inches / 16:9 |
| Max Resolution & Refresh Rate | 3840x2160 @ 144 Hz; FreeSync: 48-144 Hz |
| Native Color Depth & Gamut | 10-bit (8-bit+FRC) / DCI-P3; DisplayHDR 1000, HDR10 |
| Response Time (GTG) | 1ms |
| Brightness | 1,000 nits |
| Contrast | 4,000:1 |
| Speakers | 2x 12W |
| Video Inputs | 1x DisplayPort 1.4 w/DSC; 2x HDMI 2.1; 1x USB-C |
| Audio | 2x 3.5mm headphone output |
| USB 3.0 | 1x up, 2x down |
| Power Consumption | 54.3W, brightness @ 200 nits |
| Panel Dimensions (WxHxD w/base) | 38.1 x 25.1 x 9.9 inches (967 x 638 x 251mm) |
| Panel Thickness | 3.5 inches (88mm) |
| Bezel Width | Top/sides: 0.4 inch (10mm); Bottom: 1 inch (25mm) |
| Weight | 33.8 pounds (15.4kg) |
| Warranty | 3 years |
By starting with a VA panel, the FV43U is already ahead of many premium gaming monitors that rely on lower-contrast IPS technology. Most IPS monitors are specced for around 1,000:1 contrast, while the FV43U boasts 4,000:1 on its spec sheet, a figure it topped dramatically in our SDR testing and even more so in HDR. HDR is aided by the monitor’s 1,000-nit backlight, enhanced by a quantum dot filter for greater color volume, as our testing also confirms.
Video processing leaves nothing on the table. The FV43U is one of the few 4K displays that can run at 144 Hz. It manages this over a single DisplayPort cable using Display Stream Compression (DSC). It can also process 10-bit color, though it uses Frame Rate Control (FRC) to achieve this. FreeSync operates from 48-144 Hz in SDR and HDR modes. G-Sync also works with the same signals, as verified by our tests, even though the monitor isn’t Nvidia-certified. (You can see how by checking out our How to Run G-Sync on a FreeSync Monitor tutorial.)
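The reason DSC is required is plain arithmetic: an uncompressed 3840x2160 stream at 144 Hz with 10 bits per color channel needs more bits than DisplayPort 1.4 can carry. A rough check, ignoring blanking overhead (which pushes the real requirement even higher):

```python
def uncompressed_gbps(w, h, hz, bits_per_channel, channels=3):
    # Raw pixel data rate for an uncompressed video stream, in Gbps.
    return w * h * hz * bits_per_channel * channels / 1e9

need = uncompressed_gbps(3840, 2160, 144, 10)  # ~35.8 Gbps of pixel data
dp14_payload = 25.92                           # DP 1.4 HBR3 payload, Gbps
print(need, dp14_payload, need / dp14_payload) # ~1.4x over budget
```

DSC’s visually lossless compression (roughly up to 3:1) brings that comfortably back within the link’s budget, which is how the FV43U runs 4K at 144 Hz over one cable.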
Peripheral features include two HDMI 2.1 ports, which support console operation, namely the PS5 and Xbox Series X, with variable refresh rates up to 120 Hz. The monitor’s USB-C port accepts Ultra HD signals up to 144 Hz. This is common among USB-C monitors as USB-C can replicate DisplayPort functions, but you’ll need a graphics card with USB-C, of course. The monitor’s USB-C port also allows for KVM switching (allowing you to control multiple PCs with a single keyboard, monitor and mouse) through additional USB 3.0 ports.
Assembly and Accessories
Two solid metal stand pieces bolt in place on the bottom if you set up on a desktop or entertainment center. Wall mounting is supported by a 200mm lug pattern in back. You’ll have to source your own bolts, which should come with any bracket kit.
Product 360
From the front, the FV43U looks like any modern television with a narrow bezel around the top and sides and larger bit of molded trim across the bottom. The Aorus logo and a power LED are visible in the center. The stand puts the panel a bit less than 3 inches from the table, so it’s a good height for a desktop if you plan to sit around 4 feet back. The anti-glare layer is more reflective than most smaller screens, so plan placement accordingly if you have windows in your room.
A joystick for controlling the on-screen display (OSD) menu is prominently situated at the panel’s bottom center, but the easiest way to control the FV43U is with its tiny remote. It only has a few keys, but they’re enough to zip through the OSD, change inputs and control the gaming features.
The back is where you’ll find most of the styling elements. Two slanted shapes are rendered in shiny plastic, along with an Aorus logo in the center. The rest of the finish is matte and features some brushed textures. Angles and straight lines are the order of the day with a generous grill at the top for heat dissipation. Speakers fire from the bottom vents and deliver 12W apiece (more on that in the Hands-on section).
The input panel is on the right side, which makes it easily accessible. You get two HDMI 2.1 ports, one DisplayPort 1.4 and a USB-C port, which also supports 144 Hz and Adaptive-Sync. The HDMI ports are limited to 120 Hz but support Adaptive-Sync and 4K resolution, making them fit for console gaming.
OSD Features
The OSD looks just like the menu found in all Aorus monitors, but you can make it larger so it’s legible from across the room.
The Gaming sub-menu has everything needed for competitive gameplay. At the top is Aim Stabilizer Sync, which is a backlight strobe for blur reduction. It’s one of the rare implementations that can work in concert with Adaptive-Sync, and it manages to do this without reducing brightness too much out of the box (of course, you can always turn the brightness up).
Black Equalizer makes shadow detail more visible; Super Resolution adds edge enhancement (not in a good way); Display Mode changes the aspect ratio; and Overdrive offers four options. Balance is the best one, as it has good blur reduction and no visible ghosting, and it allows you to toggle Adaptive-Sync on or off.
The Picture menu offers an extensive array of image modes (eight, plus three custom memories), along with color temp and gamma presets and something we normally see only on professional screens: selectable color gamuts. You can choose between Adobe RGB, DCI-P3, sRGB or Auto, but in our tests, Auto did not automatically switch the color gamut for different signal types. That means that if we wanted to watch SDR content in the sRGB gamut it’s mastered in, we had to select that gamut manually.
You also get Local Dimming, which increases contrast significantly. It makes the picture very bright as well, but highlight and shadow detail remain solid, so it is perfectly usable. However, we recommend leaving it off unless your room has a lot of ambient light because you can’t reduce brightness when it’s on. If you prefer a Low Blue Light mode for reading, that feature is in the OSD too.
A single press of the large button in the center of the remote’s nav pad brings up a quick menu. Pressing left opens the Aorus dashboard, which can display your PC component’s internal temperatures and fan speeds. You’ll need a USB connection for this, but most motherboards will transmit the information to the FV43U.
A right press brings up Game Assist, which offers timers, counters, refresh rate info and aiming points. You also get a single cross in the OSD and can create additional reticles if you download the Aorus desktop app. Additionally, the OSD offers alignment marks in case you plan to set up additional FV43Us in a multi-screen configuration. Now that would be super cool! We’re thinking ultimate desktop flight simulator.
Gigabyte Aorus FV43U Calibration Settings
The FV43U comes set to its Green (yes, that’s the term used in the OSD) picture mode. It has nothing to do with the color green but is fairly accurate out of the box – enough to make our Calibration Not Required list. But if you’re a perfectionist and want to tweak the image, choose the User Define color temp and adjust the RGB sliders. Gamma presets and color gamut options are also available. For the full native gamut, choose Auto or Adobe RGB. Either will deliver just over 100% of DCI-P3 coverage. sRGB is also very accurate, but we found it better to choose the sRGB picture mode rather than the sRGB gamut mode. Below are our recommended calibration settings for SDR on the Gigabyte Aorus FV43U.
| Setting | Value |
| --- | --- |
| Picture Mode | Green |
| Brightness 200 nits | 13 |
| Brightness 120 nits | 4 |
| Brightness 100 nits | 2 (min. 89 nits) |
| Contrast | 50 |
| Gamma | 2.2 |
| Color Space | Auto or Adobe RGB |
| Color Temp User | Red 100, Green 97, Blue 99 |
When HDR content is present, there are four additional picture modes available: HDR1000, HLG, Game and Movie. HDR1000 is the most accurate but locks out all image controls. Game and Movie allow for brightness and contrast adjustments and toggling of local dimming. We’ll explain that in more detail in the HDR tests.
Gaming and Hands-on
A question worth answering when considering a 43-inch gaming monitor: will it function as a TV? Since some FV43Us will wind up in living rooms or entertainment centers, it’s important to know whether it plays well with things like disc players or streaming boxes.
There is no internal tuner, so technically, the FV43U is not a TV. But its HDMI 2.1 inputs can accept a signal from any cable or satellite receiver, as well as a 4K disc player or streaming box like Apple TV. We tried a Philips BDP-7501 player and an Apple TV source. SDR and HDR10 signals were supported fine, with one omission: 24p. Film cadence is present on any Blu-ray and in many shows and movies from streaming services like Netflix and Amazon Prime (alongside 50 and 60 Hz content). The FV43U converted these 24p streams to 60 Hz, which caused a bit of stuttering here and there. It wasn’t pervasive, but we occasionally saw artifacts. Note that the FV43U, like most computer monitors, doesn’t support Dolby Vision; we’ve only seen a few pro screens that include it.
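The stutter comes down to cadence math: 60 isn’t an integer multiple of 24, so each film frame must be held for an uneven number of refreshes. A quick illustration (this only models frame-hold counts; the monitor’s actual conversion pipeline isn’t documented):

```python
def hold_counts(source_fps, refresh_hz, frames=8):
    # How many refreshes each source frame stays on screen when the
    # display runs at refresh_hz. Uneven counts read as judder on pans.
    edges = [round(i * refresh_hz / source_fps) for i in range(frames + 1)]
    return [b - a for a, b in zip(edges, edges[1:])]

print(hold_counts(24, 60))   # e.g. [2, 3, 3, 2, ...] -> uneven, judders
print(hold_counts(24, 120))  # [5, 5, 5, ...] -> even cadence, smooth
print(hold_counts(24, 144))  # [6, 6, 6, ...] -> 144 also divides 24p evenly
```

Ironically, the panel’s native 144 Hz divides 24p perfectly; the limitation is simply that the FV43U won’t accept a 24p signal as such.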
As a monitor for controlling Windows, the FV43U was a joy to use. With its vast area, we could clearly view four or five documents simultaneously. Sitting about 4 feet back, the pixel structure was invisible, but if we sat closer we were just able to see the dots. Color, meanwhile, was beautifully saturated, great for watching YouTube and browsing the web. If you want perfect accuracy for web browsing, the sRGB mode is available with a few clicks of the remote.
Gaming is also a blast with a screen this big. SDR games like Tomb Raider rendered in vivid hues with deep blacks, bright whites and superb contrast. The large dynamic range and accurate gamma mean that you’ll see all the detail present in the original content. That lends a realism seen on only the very best computer monitors.
HDR games, like Call of Duty: WWII, also showed tremendous depth on the FV43U. We played exclusively in the HDR1000 picture mode because of its very accurate luminance and grayscale tracking. The FV43U’s large color gamut was put to good use here; it was readily apparent in skin tones and natural earth shades, like brown and green. That, coupled with nearly 39,000:1 contrast, made surfaces and textures pop with a tactility we’ve only seen from premium screens like the Acer Predator CG437K or the Asus ROG Swift PG43UQ (both go for $1,500 as of writing). Without a full-array local dimming (FALD) backlight like the Acer Predator X27 and Asus ROG Swift PG27UQ have, the FV43U doesn’t quite make the very top tier of the best HDR monitors. But it comes awfully close to their image quality while delivering a lot more screen area.
The monitor’s two 12W speakers deliver sound that’s better than what you’ll hear from smaller monitors with much more bass and overall presence. Five audio modes help you tailor sound to your preference. If you’d rather use your best gaming headset, there’s a 3.5mm jack and an additional analog output for external systems.
The Xbox has gone through several visual periods during its life span, from an edgy and yet somehow dorky green alien thing, to a modern look that could be described as “I know how to use Excel but I can still have fun.” But like stumbling on a Facebook album from high school, you can still hold on to a bit of the past. As spotted by senior editor Tom Warren, the original Xbox background is now an option for the Xbox Series X / S.
The new (old) styling was added as a dynamic background as part of Tuesday’s system update, which notably also brought improvements to quick resume. Titled “The Original,” it looks like a higher-resolution version of the glowing green orb that was at the center of the first Xbox’s user interface. Please note: it’s not the interface itself (Microsoft wouldn’t abandon tiles like that), but it is a recognizable part of it.
My experience with the original Xbox is admittedly secondhand. To me, it was the loud box that lived at my friend’s house and let us play Halo: Combat Evolved. But I do think you can get a pretty solid hit of nostalgia just by looking at this background and remembering what used to be. A simpler, more green time, when consoles were consoles and not Metro-inspired (or I guess Fluent Design-inspired) pseudo-Windows machines.
Microsoft and the Xbox team have been through a lot since the 2001 launch of the Xbox — the Xbox One was briefly positioned as a sort of cable box — but there’s some charm missing in the current dashboard and user experience. That charm was exchanged for a mostly better, if more complicated experience overall, but the heart still remembers what the brain forgot.
For a longer trip down memory lane, check out our visual history of the Xbox Dashboard and ponder with me how the Xbox 360’s “Blades” could be crammed on the Series X and S.