Days Gone is an open-world action-adventure zombie shooter set in the Pacific Northwest of the United States. You’re Deacon, a military veteran who’s been caught in the middle of a global zombie apocalypse. The game takes place two years after the first outbreak, and the world as we knew it has ended. Hordes of zombies are scattered across the land, lurking and hunting the last survivors, who have teamed up to improve their odds of survival.
Naturally, the various factions fight over whatever resources are left in the world, and as a mercenary, you’re caught in the middle of it all. As the plot progresses, the National Emergency Restoration Organization (NERO) also plays a role in uncovering more details of what happened.
Like similar titles, Days Gone plays from a third-person perspective. Developer Bend Studio did a great job modeling the lush American wilderness, and placed hordes of zombies into it. As you make your way through the woods, you often have the choice of fighting your way through or using stealth to achieve your objective. Some missions are pure stealth, though, and in others you’ll encounter a zerg rush of zombies, which requires you to use the environment to survive—shooting your way through is not an option.
Originally launched in April 2019 as a PlayStation 4 exclusive, Days Gone uses Unreal Engine 4 in DirectX 11 mode on PC. Compared to the console version, a lot of polish has been added, bugs have been fixed, and the graphics menu has been improved. In this article, we test the performance and hardware requirements of Days Gone on 23 graphics cards at three resolutions.
Microsoft’s next major Windows 10 update is starting to roll out to devices today. The Windows 10 May 2021 Update focuses on improving remote work scenarios, with changes like being able to use multiple Windows Hello cameras on a single machine. That’s particularly useful for Surface devices that owners might want to connect to a monitor with an additional webcam while working from home.
Here are the full new features of the Windows 10 May 2021 Update (version 21H1):
Windows Hello multicamera support to set the default as the external camera when both external and internal Windows Hello cameras are present.
Windows Defender Application Guard performance improvements including optimizing document opening scenario times.
Windows Management Instrumentation (WMI) Group Policy Service (GPSVC) updating performance improvement to support remote work scenarios.
Microsoft typically delivers a major update of Windows 10 in the spring, with a smaller one in the fall. The company has reversed that cadence for 2021, so the update that will likely arrive in October will be full of changes.
The next major update will include new system icons, File Explorer improvements, and even the end of Windows 95-era icons. Microsoft has some even broader visual changes arriving in Windows 10, as part of a “sweeping visual rejuvenation of Windows.” The October update will also fix the rearranging apps issue on multiple monitors, add the Xbox Auto HDR feature, and even improve Bluetooth audio support.
Today’s May 2021 Update is so small that you’ll barely even notice it install. Microsoft has been using a special enablement package so that the features are simply hidden on your Windows 10 PC right now, and this update switches them on.
As always, the Windows 10 May 2021 Update will be available on Windows Update, but if you don’t see it yet, it’s because Microsoft is rolling it out in waves to ensure there are no compatibility issues. If you’re feeling brave, Microsoft does let people force the update through its installation media tool.
The Alienware m15 Ryzen Edition R5 is so good that it makes us wonder why Dell didn’t team up with AMD on a laptop sooner.
For
+ Strong gaming performance
+ Excellent productivity performance
+ Unique chassis
+ Not too costly for its power
Against
– Internals run hot
– Middling audio
– Bad webcam
It’s been 14 years since Alienware last used an AMD CPU in one of its laptops, but AMD’s recent Ryzen processors have proven to be powerhouses that have generated a strong gamer fanbase. It also doesn’t hurt that AMD-based laptops have frequently undercut Intel in price. Point being, times have changed, and now Team Red can easily compete with the best gaming laptops that Intel has to offer.
So it makes sense that Alienware has finally been granted permission to board Dell’s UFO. And with the Alienware m15 Ryzen Edition R5, it’s getting first-class treatment.
Alienware m15 Ryzen Edition R5 Specifications
CPU
AMD Ryzen 7 5800H
Graphics
Nvidia GeForce RTX 3060 6GB GDDR6, 1,702 MHz Boost Clock, 125 W Total Graphics Power
Memory
16GB DDR4-3200
Storage
512GB M.2 PCIe NVMe SSD
Display
15.6 inch, 1920 x 1080, 165Hz, IPS
Networking
802.11ax Killer Wi-Fi 6, Bluetooth 5.2
Ports
USB-A 3.2 Gen 1 x 3, HDMI 2.1, USB-C 3.2 Gen 2 x 1 (DisplayPort), RJ-45 Ethernet, 3.5mm combination headphone/microphone port
Camera
720p
Battery
86 WHr
Power Adapter
240W
Operating System
Windows 10 Home
Dimensions (W x D x H)
14.02 x 10.73 x 0.9 inches (356.2 x 275.2 x 22.85 mm)
Weight
5.34 pounds (2.42 kg)
Price (as configured)
$1,649
Design of the Alienware m15 Ryzen Edition R5
Unlike other recent Alienware laptops, the m15 Ryzen Edition R5 only comes in black. The “lunar light” white isn’t an option here. Still, it’s a bold design that puts the emphasis on the laptop’s build quality rather than on decoration, and it pays off. The m15 R5 feels sturdy in the hand, and its smooth edges give it a premium feel. It’s not too plain, either: lighting options for the Alienware logo on the lid plus a circular LED strip along the back rim add a touch of flair. On that note, the stylized “15” on the lid is stylish, though it can look a bit like a “13” from the wrong angle.
Hexagonal vents that sit above the keyboard and along the back also give the m15 R5 a bit of functional decoration and help make up for the small and well hidden side vents. The keyboard on this model has four-zone RGB, but it can be a little dim in well-lit areas.
This laptop veers toward the large and heavy end for systems with an RTX 3060. At 14.02 x 10.73 x 0.9 inches and 5.34 pounds, it’s generally bulkier than the Asus TUF Dash F15 we reviewed, which has a mobile RTX 3070, measures 14.17 x 9.92 x 0.78 inches and weighs 4.41 pounds. The Acer Predator Triton 300 SE, which manages to fit a mobile RTX 3060 into a 14-inch device, is also especially impressive next to the m15 R5. Granted, both of those use lower-power processors designed for thinner machines. Specifically, the Acer measures 12.7 x 8.97 x 0.70 inches and weighs 3.75 pounds.
The Alienware m15 R4, which has a 10th gen 45W Intel Core i7 processor and an RTX 3070, is 14.19 x 10.86 x 0.78 inches large and weighs 5.25 pounds. That leaves it not as bulky as the m15 Ryzen Edition R5, but about as heavy.
Port selection is varied, although distribution differs from my usual preferences. The left side of the laptop only has the Ethernet port and the 3.5mm headphone/microphone jack, which is a shame as that’s where I typically like to connect my mouse. The back of the laptop has a few more connections, including the DC-in, an HDMI 2.1 port, a USB 3.2 Gen 1 Type-A port and a USB 3.2 Gen 2 Type-C port that also supports DisplayPort. The right side of the laptop has two additional USB 3.2 Gen 2 Type-A ports.
Gaming Performance on the Alienware m15 Ryzen Edition R5
Our review configuration of the Alienware m15 Ryzen Edition R5 came equipped with an 8-core, 16-thread Ryzen 7 5800H CPU and an RTX 3060 laptop GPU. It’s the first time we’ve tested a 45W CPU with an RTX 3060 and, to that end, we’ve decided to compare it to one 35W laptop with an RTX 3070 GPU, the Asus TUF Dash F15 with an Intel Core i7-11370H, and one 35W laptop with an RTX 3060 GPU, the Acer Predator Triton 300 SE with an Intel Core i7-11375H. We’ve also thrown the Alienware m15 R4 into the mix, which has a 45W 10th Gen Intel CPU and an admittedly more powerful RTX 3070, plus a significantly higher price tag than any other competitor even on its cheapest configuration (the thing starts at $2,149).
I played Control on the Alienware laptop for a half hour to get a personal feel for gaming on the system. I tended to fall between 60 and 70 fps at high settings throughout, and turning ray tracing on using its high preset dropped that to 30 to 40 fps. The fans are certainly noticeable but aren’t ear-splitting, and the laptop never got hot to the touch, nor did it blow hot air on my hands.
In Shadow of the Tomb Raider’s benchmark running at highest settings, the m15 Ryzen Edition R5’s CPU seemed to do it a favor, as its 73 fps average only barely fell behind the m15 R4’s 77 fps average. The Acer laptop was next in line with 61 fps, while the Asus laptop was significantly behind all other options at 54 fps.
Scores were a bit more even in Far Cry: New Dawn’s benchmark running at ultra settings. While the m15 R4 hit 91 fps, everything else was in the 70s. The m15 Ryzen Edition R5 had an average of 79 fps, while the Asus scored 74 fps and the Acer reached 73 fps.
The m15 Ryzen Edition R5 fell to third place in the Grand Theft Auto V benchmark running at very high settings, where it hit an 82 fps average and the Asus laptop achieved an 87 fps average. The Acer laptop was significantly behind at 72 fps, while the m15 R4 was significantly ahead at 108 fps.
Red Dead Redemption 2’s benchmark running at medium settings saw the m15 Ryzen Edition R5 once again stay in third place, though by a more significant margin this time. The R5 achieved a 53 fps average, while the Asus scored 61 fps. The Acer was once again behind at 48 fps, while the m15 R4 stayed ahead at 69 fps.
We also ran the Alienware m15 Ryzen Edition R5 through the Metro Exodus RTX benchmark 15 times in a row to test how well it holds up to a sustained heavy load. During this benchmark, it hit an average of 56 fps. The CPU ran at an average clock speed of 3.63 GHz, while the GPU ran at an average clock speed of 1.82 GHz. The CPU’s average temperature was 90.36 degrees Celsius (194.65 degrees Fahrenheit) and the GPU’s average temperature was 82.02 degrees Celsius (179.64 degrees Fahrenheit).
Productivity Performance for the Alienware m15 Ryzen Edition R5
While Alienware is a gaming brand, the use of a 45W AMD chip does open the Alienware m15 Ryzen Edition R5 up to high productivity potential.
On Geekbench 5, which is a synthetic test for tracking general PC performance, the m15 Ryzen Edition R5 hit 1,427 points on single-core tests and 7,288 points on multi-core tests. While its single core score was on the lower end when compared to the Asus TUF Dash F15’s 1,576 points and the Acer Predator Triton 300 SE’s 1,483 points, the Alienware blew those laptops away on multi-core scores. The Asus’ multi-core score was 5,185, while the Acer’s multi-core score was 5,234.
The Alienware m15 R4 was a bit more even with its AMD cousin, scoring 1,209 on single-core Geekbench 5 tests and 7,636 on the program’s multi-core benchmarks.
Unfortunately, the m15 Ryzen Edition R5 couldn’t maintain that momentum in our 25GB file transfer benchmark. Here, it transferred files at 874.14 MBps, while the Asus hit 1,052.03 MBps and the Acer reached 993.13 MBps. The m15 R4 led the pack at 1,137.34 MBps.
The m15 Ryzen Edition R5 was the fastest contender in our Handbrake video encoding test, though, where we track how long it takes a computer to transcode a video from 4K down to FHD. The m15 Ryzen Edition R5 completed this task in 7:05, while the Asus took 10:41 and the Acer was even slower at 11:36. The m15 R4 almost caught up to its AMD cousin with a time of 7:07.
Display for the Alienware m15 Ryzen Edition R5
Our configuration of the Alienware m15 Ryzen Edition R5 came with a 15.6-inch, 1920 x 1080 IPS display with a 165Hz refresh rate. While the laptop boasted impressive gaming performance and strong benchmark results, the screen still proved problematic for viewing content.
I watched the trailers for Nomadland and Black Widow on the m15 Ryzen Edition R5, where I found the blacks to be shallow and the viewing angles to be restrictive. In my office during the daytime, I couldn’t easily see the screen’s picture unless I was sitting directly in front of it. Turning my lights off and closing my curtain only extended viewing angles to about 30 degrees. Glare also proved to be an issue in the light, although turning lights off did fix this problem.
Colors were bright enough to pop occasionally but not consistently, with bolder tones like reds and whites holding up better than more subdued ones. Here, Black Widow came across a bit more vividly than the naturalistic style of Nomadland, so this screen might be better suited for more colorful, heavily produced films.
Our testing put the m15 Ryzen Edition R5’s color range above its closest competitors, the Asus TUF Dash F15 and Acer Predator Triton 300 SE, though not by much. With an 87.3% DCI-P3 color gamut, it’s only slightly ahead of the Acer’s 80.6% DCI-P3 score. The Asus TUF Dash F15 trailed further behind, with a 78.5% DCI-P3 color gamut.
Our brightness testing saw the Alienware pull a more solid lead. With an average of 328 nits, it easily surpassed the Acer’s 292 nits and the Asus’ 265 nits.
The Alienware m15 R4 blew all of these systems out of the water, although the OLED screen our configuration had makes the comparison more than a bit unfair. Its DCI-P3 gamut registered at 150% while its average brightness was 460.2 nits.
To test the m15 Ryzen Edition R5’s 165Hz screen, I also played Overwatch on it. Here, I had a much more pleasant experience than I did when watching movie trailers. The game’s bright colors appeared quite vivid, and the fast refresh rate was perfectly able to keep up with the 165 fps I was hitting on Ultra settings.
Keyboard and Touchpad on the Alienware m15 Ryzen Edition R5
The Alienware m15 Ryzen Edition R5 configuration we received has a 4-zone RGB membrane keyboard, though other configurations do offer mechanical switches made in collaboration with Cherry. You can currently get that upgrade for an additional $98.
The membrane nature of this keyboard doesn’t keep it from impressing, though. Keys have a noticeable resistance when pressed, and 1.7mm of key travel gives you plenty of tactile feedback. I consistently scored around 83 words per minute on the 10fastfingers.com typing test, which is impressive as my average is usually around 75 wpm.
In an unusual choice, the Alienware’s audio control keys sit in a column on the keyboard’s far right rather than being mapped to the Fn row as secondary functions. Instead, the Page Up and Page Down keys that would normally be found there are secondary functions on the arrow keys.
The 4.1 x 2.4-inch touchpad doesn’t fare as well. While it has precision drivers and is perfectly smooth when scrolling with one finger, I felt too much friction when using multi-touch gestures to pull them off comfortably or consistently. For instance, when trying to switch apps with a three-fingered swipe, I would frequently accidentally pinch zoom instead.
Audio on the Alienware m15 Ryzen Edition R5
The Alienware m15 Ryzen Edition R5 has two bottom-firing speakers that are loud with surprisingly decent bass, but they tend to get tinny on higher notes.
I tested the m15 Ryzen Edition R5’s audio by listening to Save Your Tears by The Weeknd, which easily filled my whole two-bedroom apartment with sound. I was also surprised to be able to hear the strum of the song’s bass guitar, as it’s not uncommon for other laptops to either cut it out, make it quiet, or give it a more synth-like quality. Unfortunately, higher notes suffered from tinniness and echo.
Upgradeability of the Alienware m15 Ryzen Edition R5
The Alienware m15 Ryzen Edition R5 is easy to open and has plenty of user customizability. Just unscrew the four screws closest to the back of the laptop, then loosen the four screws on the front (we used a PH0 Phillips Head bit).
Gently pry the case off, and you’ll see the networking card, two swappable DIMMs of RAM, the M.2 SSD and a second, open M.2 SSD slot (if you don’t buy the laptop with dual SSDs).
The only tradeoff here is that the SSDs are in a smaller, less common M.2 2230 form factor (most are 2280), so you’ll probably need to buy a specialized drive for this laptop.
Battery Life on the Alienware m15 Ryzen Edition R5
The Alienware m15 Ryzen Edition R5 is a power hog, with half the non-gaming battery life of the RTX 3060 and RTX 3070 35W laptops we tested it against. This shouldn’t come as too much of a surprise, since it also has a 45W CPU, but don’t expect to be able to spend too much time away from an outlet.
In our non-gaming battery test, which continually streams video, browses the web and runs OpenGL tests over Wi-Fi at 150 nits of brightness, the m15 Ryzen Edition R5 held on for 3:29. That’s about 3 hours less than we got out of both the Asus TUF Dash F15, which had a 6:32 battery life, and the Acer Predator Triton 300 SE, which lasted for 6:40.
The Alienware m15 R4, with its 45W Intel chip, also had a shorter battery life than our 35W laptops, though it was slightly longer than the m15 Ryzen Edition R5’s. It lasted 4:01 on our non-gaming test.
Heat on the Alienware m15 Ryzen Edition R5
The Alienware m15 Ryzen Edition R5’s surface temperature was impressively cool during non-gaming use but could get toasty in select areas during our gaming benchmarks. For our tests, we measured its temperature both after 15 minutes of streaming video and during the sixth consecutive run of the Metro: Exodus extreme benchmark.
The laptop’s touchpad proved coolest during the video test, registering 81.1 degrees Fahrenheit. This was only slightly behind the center of the keyboard, which hit 85.5 degrees Fahrenheit between the G and H keys. The bottom of the laptop was warmer, hitting 90.9 degrees, although the center-left of the display hinge was the hottest spot, registering 101.1 degrees Fahrenheit.
Our gaming test saw a mild jump in temperatures in all areas except the bottom and the hinge, where numbers spiked much higher. The touchpad was 83.3 degrees Fahrenheit and the center of the keyboard was 90.9 degrees Fahrenheit. By contrast, the bottom of the laptop was now 121.5 degrees Fahrenheit and the hot zone on the hinge was now 136.1 degrees Fahrenheit.
Despite these higher numbers, the laptop never became too hot to touch while gaming; it just felt pleasantly warm.
Webcam on the Alienware m15 Ryzen Edition R5
The Alienware m15 Ryzen Edition R5’s 720p webcam is, like those of many premium gaming laptops, a bit of an afterthought. Regardless of lighting conditions, its shots always have a blocky and fuzzy appearance. Adding light also adds a distracting halo effect to silhouettes, while dimming your surroundings just brings down detail even further.
Software and Warranty on the Alienware m15 Ryzen Edition R5
The Alienware m15 Ryzen Edition R5 comes packed with software, although most of it serves a genuinely useful purpose.
Most of these are apps like Alienware Command Center, which lets you customize lighting and thermals as well as set up macros. Some are less useful than others — Alienware Customer Connect simply exists to get you to fill out surveys — but apps like Alienware Mobile Connect, which lets you easily mirror your phone’s screen, transfer its files or take phone calls from your laptop, are definite standouts. It might be easier to navigate these functions if they were all centralized into one hub app rather than being their own standalone programs, though. My Alienware tries to be this hub app, although it’s mostly just a redirect to Alienware Command Center with a bunch of ads on the side.
This laptop also comes with typical Windows pack-ins like Microsoft Solitaire Collection and Spotify. Its default warranty is limited to one year, although you can extend it at checkout.
Configurations for the Alienware m15 Ryzen Edition R5
Our configuration of the Alienware m15 Ryzen Edition R5 came with an AMD Ryzen 7 5800H CPU, an RTX 3060 laptop GPU, 16GB of RAM, a 512GB SSD and a 1920 x 1080, 165Hz display for $1,649. That actually puts it towards the lower end of what’s available.
You can upgrade this laptop’s CPU to the Ryzen 9 5900HX, which has the same thread count but boosts up to 4.6 GHz, and its GPU to an RTX 3070 laptop card. Memory options range from 8GB to 32GB, while storage options range from 256GB to 2TB. You can also add on an additional SSD with the same range of options, making for up to 4TB of total combined storage.
There’s also a 360Hz version of the FHD display available, as well as a QHD version with a 240Hz refresh rate and G-Sync support.
Perhaps the most interesting option that wasn’t included on our configuration is the mechanical keyboard, which features physical ultra-low-profile switches made in collaboration with Cherry MX.
These upgrades can raise your price up to $2,479, with the display and keyboard upgrades being the most costly components in Dell’s customization tool. The Cherry MX keyboard will add $98 to your price at checkout, while the QHD display costs $78. The FHD @ 360Hz display is only available on the highest preset option, which locks you into a Ryzen 9 5900HX chip and starts at $2,332.
By contrast, the low end of this laptop starts at $1,567.
Bottom Line
The Alienware m15 Ryzen Edition R5 proves that Team Red and Alienware make a strong pairing. While it’s not quite the beast that the minimum $2,149 Alienware m15 R4 is, it still manages performance that equals and sometimes beats peers in its price range on most titles, all while rocking Alienware’s unique premium looks. At $1,649 for our configuration, it’s an easy premium choice over the $1,450 Asus TUF Dash F15. And if you prefer power over size, it’s also a better option than the $1,400 Acer Predator Triton 300 SE.
While it’s certainly not the most portable contender and could do with more even port distribution and stronger audio, its 45W CPU lends it just enough of an edge on power to make it a solid first step into Dell’s flagship gaming brand.
Nvidia today announced that Deep Learning Super Sampling (DLSS) has made the jump to VR thanks to support from No Man’s Sky, Wrench, and Into The Radius.
The company said “DLSS doubles your VR performance at the Ultra graphics preset and maintains 90 FPS on an Oculus Quest 2 with a GeForce RTX 3080” in No Man’s Sky, boosts Wrench‘s performance by up to 80%, and “greatly reduces shimmering and stair-stepping on objects and foliage” in Into The Radius via improved anti-aliasing.
Here are Nvidia’s performance results for the VR edition of No Man’s Sky with and without DLSS enabled. Note that actual improvements will vary by system, driver version, and the whims of whichever higher being decides the exact number of frames per second a given system can provide in a particular title.
DLSS seems like a natural fit for VR. Nvidia developed the technology in an effort to improve performance and presentation, both of which are critical to games played on a pair of displays held mere inches away from the player’s eyeballs. Any improvement to either of those factors makes for a better VR experience.
Six other titles will also introduce DLSS support this month: Amid Evil, Metro Exodus: PC Enhanced Edition, Everspace 2, Aron’s Adventure, Scavengers, and Redout: Space Assault. Nvidia said the games saw performance bumps from 40% (Scavengers) to 100% (Metro Exodus) and that several will offer ray-tracing support as well.
These additions bring the total number of DLSS-compatible titles to 50 while simultaneously expanding the technology to VR. Its continued growth seems all but inevitable thanks to new integrations with Unreal Engine, Unity, and potentially the next Nintendo Switch making it more appealing than ever to game developers.
Nvidia should soon have some competition, in the form of AMD’s FidelityFX Super Resolution. That’s supposed to provide a cross-platform alternative to DLSS that works on hardware from AMD, Intel, and Nvidia alike rather than being exclusive to Nvidia hardware that features the company’s RTX Tensor Cores.
Nvidia is extending its cryptocurrency mining limits to newly manufactured GeForce RTX 3080, RTX 3070, and RTX 3060 Ti graphics cards. After nerfing the hash rates of the RTX 3060 for its launch in February, Nvidia is now starting to label new cards with a “Lite Hash Rate” or “LHR” identifier to let potential customers know the cards will be restricted for mining.
“This reduced hash rate only applies to newly manufactured cards with the LHR identifier and not to cards already purchased,” says Matt Wuebbling, Nvidia’s head of GeForce marketing. “We believe this additional step will get more GeForce cards at better prices into the hands of gamers everywhere.”
These new RTX 3060 Ti, RTX 3070, and RTX 3080 cards will start shipping later this month, and the LHR identifier will be displayed in retail product listings and on the box. Nvidia originally started hash limiting with the RTX 3060, and the company has already committed to not limiting the performance of GPUs already sold.
While Nvidia tried to nerf mining with the RTX 3060, the company also accidentally released a beta driver that unlocked hash rates and restored full mining performance. The limiter has been reinstated in more recent drivers, but the beta driver is out in the wild now.
Nvidia’s new LHR cards are part of a broader effort to make its latest 30-series GPUs less desirable to cryptocurrency miners. PC gamers have been trying and failing to get their hands on new graphics cards for months due to the great GPU shortage, and miners have been blamed for part of the shortages.
Nvidia offers a separate Cryptocurrency Mining Processor (CMP) line for Ethereum miners instead. These cards offer the best mining performance and efficiency, but they won’t handle graphics at all.
Nvidia’s move to nerf new cards will undoubtedly drive up prices for existing 30-series GPUs that don’t have these restrictions in place. It will also likely mean the rumored RTX 3080 Ti card will have similar cryptocurrency mining limits in place, as this card is expected to be announced later this month.
This is an iMac unlike any other iMac we’ve seen before, and it all comes down to the M1 chip.
Sure, there are some other differences between this 24-inch iMac and the 21.5-inch model from 2019 that it’s replacing. There are better microphones and better speakers. There are fewer ports, and some of them have moved around. The screen is bigger and better. The keyboard now has Touch ID. But the M1 is the star of the show.
It’s not just the performance increase. It’s not just the fact that you can run iOS and iPadOS apps natively on the system. It’s not just the new advanced image signal processor, which helps create better low-light images than I’ve ever seen from an integrated webcam. It’s also the groundbreaking efficiency with which this processor runs, which has enabled Apple to create a slim, sleek, and quite unique iMac chassis.
Whether you actually get every upgrade here depends on the configuration you choose. The entry-level iMac is $1,299 for 256GB of SSD storage, two Thunderbolt / USB 4 ports, 8GB of unified memory, and a seven-core GPU — but that’s only available in four colors and doesn’t come with Touch ID. The model I tested bumps the storage up to 512GB and the memory up to 16GB. It has two USB 3 ports in addition to the two Thunderbolt ports, an eight-core GPU, Touch ID, and a gigabit Ethernet port (which is in the power brick). I also received both the Magic Mouse and the Magic Trackpad with my model. You’d need to pay a total of $2,028 to get everything Apple sent me (and which I’ll be sending back, for the record).
In short, this device costs money. And it’s true that you’d get similar performance and save a few hundred bucks, if you just plugged a Mac Mini into an external display. But this iMac has almost everything that most people need in one package: processing power, sure, but also a camera, speakers, microphones, a keyboard, a mouse, a trackpad, and a display. And they’re all good. This is a computer you can plonk on your desk and never think about again. And for some of the iMac’s target audience, that’s probably worth the extra money. You’re paying for simplicity.
The M1 processor uses what’s called a “hybrid” configuration. The easiest way to conceive of this is that most competing Intel and AMD chips have a number of equally “okay” cores, whereas Apple’s M1 has four very fast cores and four lower-powered high-efficiency cores. This allows M1 devices to deliver arguably the best performance-per-watt in the world. It also means that they’re nearly unbeatable in single-core workloads.
That advantage bore out in our benchmark testing. This iMac model achieved a higher score on the Geekbench 5 single-core benchmark than any Mac we’ve ever seen before — even the iMac Pro. That means if you’re looking for a device for simpler everyday tasks that don’t scale to every available CPU core (and that largely seems to be the demographic that Apple is trying to sell this machine to), there has literally never been a better iMac for you to buy.
You can see the rest of our benchmarks below:
Apple iMac 24 (2021) benchmarks
Benchmark
Score
PugetBench for Premiere Pro
372
Cinebench R23 Multi
7782
Cinebench R23 Single
1505
Geekbench Multi
7668
Geekbench Single
1739
Geekbench OpenCL
19114
These results help illuminate where this iMac fits into Apple’s all-in-one lineup, and where its limitations are. The 24-incher is a significant improvement over the 21.5-inch iMac in both single-core and multi-core workloads. And it’s very comparable in graphics tasks — which is quite impressive, given that the 21.5-inch iMac has a discrete GPU and this one relies on what’s integrated with the M1.
On the other end, these results (with the exception of single-core performance) are not close to what we’d expect from the 27-inch Intel iMac with discrete graphics. In this comparison, multi-core results are more important. They indicate that the 27-inch iMac is going to do much better on the types of tasks that owners (or prospective buyers) are likely to be doing: intense multitasking, computations, design, video work, and other more complex loads that may leverage the GPU.
There are other limitations that may put some workloads out of reach. As is the case with the MacBook Pro and Mac Mini, you can’t configure the iMac with more than 16GB of memory or 2TB of storage; we wouldn’t recommend those specs to anyone who regularly edits 4K or 8K video, for example. The memory and storage are soldered, so you can’t upgrade them after purchase. Only one external display is supported (up to 6K resolution at 60Hz). Ports are also bizarrely limited; the base model has just two Thunderbolt / USB 4 ports and a headphone jack, while more expensive models have an additional two USB 3 ports and Gigabit Ethernet. These all may be reasons Apple is pushing this iMac as a “home and family” PC, even though its processor is clearly capable of all kinds of professional work.
Another way to interpret these numbers is that I was getting effectively the same performance out of this machine as we got from the M1 MacBook Pro and the Mac Mini. That’s completely unsurprising, since these devices all use the same processor. But it’s a good proxy for gauging whether the iMac can handle your work: if you expect you could get a task done with the M1 MacBook Pro, you should be able to do it on this.
More anecdotally, I was able to use my test unit for all kinds of daily tasks, from emailing to YouTube to amateur photo and video work. I was able to hop between over 25 Chrome tabs with Cinebench looping in the background, with no stutter or slowdown whatsoever. If you’re buying the iMac for this kind of thing, I can’t imagine you’ll see too many spinning wheels.
During this testing process, I also got a sense of just how well cooled this chassis is. On the thinner laptops I often test (including the fanless MacBook Air), you’ll see performance decrease if you run heavy tasks over and over again. None of that happened on this iMac: I looped Cinebench R23 as well as a Premiere Pro 4K video export several times over and never saw scores go down. It took a lot to get the fans going — they stayed silent during my daily office multitasking. When they did spin up, mostly while I was working in Premiere, I could barely hear them. They were quieter than the background hum of my refrigerator. That’s quite a quality-of-life improvement over prior Intel iMacs.
The M1’s advantage, after all, has never been raw power; it’s the combination of power and efficiency. We saw much better battery life in the MacBook Air and MacBook Pro than we did in their Intel predecessors. Battery life obviously isn’t a concern with the iMac, but efficiency certainly is. Chips are limited by two things: the power available and how well their systems can keep them cool. They vent almost all the energy they use as heat, and because the M1 has such incredibly high performance per watt, Apple doesn’t need a heavy-duty cooling system to keep it from frying itself. Because it doesn’t need that heavy-duty cooling system, Apple has finally been able to redesign the iMac from the ground up.
This iMac is sleek. Even though it has a 24-inch screen, it’s close in size to its 21.5-inch predecessor. Apple reduced the screen’s borders by close to 50 percent in order to squeeze the bigger screen into the compact chassis. This device is also 11.5 millimeters thick, or just under half an inch — which is quite thin as all-in-ones go. Next to the 27-inch iMac, it looks like a tablet on a stand.
Size isn’t everything; this iMac also comes in seven colors. There’s blue, green, pink, orange, purple, yellow, and the boring silver we know and love. I’m not quite convinced that the jazzier models will fit in outside of especially stylish homes and offices. But I will say: I’ve never seen so many of my friends, or so many people on TikTok, as excited about a tech product as they seem to be about the colored iMacs. The hues are a nice change, aren’t obnoxious, and are clearly a hit with certain crowds.
Some traditional iMac touches remain, of course. The bezels are still substantial compared to those of some modern monitors. You can’t raise or lower the display height — the built-in stand only allows tilt adjustments. (You can also buy it with a built-in VESA mount adapter.) And there’s still that pesky chin, though it’s no longer emblazoned with the Apple logo.
Pretty much every other notable part of the iMac has been upgraded in some way. There’s a 4.5K (4480 x 2520) Retina display, a step up from the predecessor’s 4096 x 2304 Retina display (though both have effectively the same pixel density). It has Apple’s True Tone technology, which automatically adjusts colors and intensity based on your surroundings.
But the screen is also another reminder that this iMac doesn’t have “Pro” in its name. Twenty-four inches is on the small side as screens go; most of the best external monitors are 27 inches or larger these days. Professionals on The Verge’s video team also noticed some vignetting on the sides of the screen, which caused issues with off-angle viewing — we had a similar issue with Apple’s Pro Display XDR. Of course, neither of these limitations were a problem for my untrained eye; I thought the display looked great, with sharp details and plenty of room for my Chrome tabs and apps.
Elsewhere, Apple has upgraded the camera, microphones, and speakers. The company claims that they’re the best camera, mic system, and speaker system that have ever appeared in a Mac. I’d believe it. The six-speaker sound system is easily on par with a good external speaker. I played some music in my kitchen, and it was audible all over the house. Percussion and bass were strong, and I felt very immersed in the songs. It also supports spatial audio when playing video with Dolby Atmos.
I don’t have too much to say about the three-mic array except that nobody on my Zoom calls had any trouble hearing me. But the webcam was a very pleasant surprise. The iMac has a 1080p FaceTime HD camera, which has a higher resolution than the 720p shooter that lives in the 21.5-inch iMac (as well as the MacBook Pro, MacBook Air, and many other AIOs). The M1 also lends a hand here: its built-in image signal processor and neural engines help optimize your picture in low-light settings.
I wouldn’t say I looked amazing on my Zoom calls — parts of my background were sometimes washed out, and the image looked processed in some dimmer areas. But I was visible and clear, which is better than you get from most webcams these days. And the difference between this webcam and the grainy mess the MacBook Pro has is night and day.
When I review a computer, my task is usually to figure out for whom that computer is made.
But all kinds of people use iMacs, from college students to accountants to podcast producers to retired grandparents. And this model has arguably the most widespread consumer appeal of any iMac that Apple has made in recent years. So it’s much easier to figure out for whom this iMac isn’t made.
It’s not for people who can’t handle dongles and docks; I kept a USB-C to USB-A dongle next to me on my desk while I was testing the iMac, and I used it very frequently. It’s not for people who already own a 27-inch iMac, because it would be a downgrade in display size and quality, port selection, upgradability, and raw power. And it’s not for people with serious performance needs.
It’s not for people who are looking for the very best value for their money. Most folks won’t need the specs and accessories that I tested here, but even $1,299, the base price, is certainly more than plenty of people want to spend on a computer. The base Mac Mini is $600 cheaper than the base iMac; plug that into a monitor and some speakers (you can find plenty of good ones for well under $600), and you’ll get the same M1 performance at a massive discount.
And that, right there, is the biggest reason that this iMac, despite its power, is primarily targeting the family market. Because it’s asking you to pay more in order to do less. You’re paying $600 not to have to research and budget out monitors, speakers, webcams, docks, keyboards, and mice. You’re paying not to have to arrange thousands of things on your desk. You’re paying for a device where everything, out of the box, works well. You’re paying to eliminate fuss.
Tech enthusiasts (especially those who want to pop their machines open and make their own upgrades) may see that as a waste of money. And for them, it probably is. But they’re not the target audience for this Mac — even if its specs might suit their needs.
Could Apple have done more with this iMac? Of course. I was hoping to see a 30-inch, 6K iMac with a powerhouse 12-core workstation chip this month as much as the next person. But I have faith that we’ll get one in the future — and in the meantime, I’m glad Apple released this. It’s not earth-shattering in its design; it doesn’t redefine its category. But it’s fun. It improves upon the 21.5-inch iMac to offer a simple, attractive, and very functional device for users across all kinds of categories. It’s not the iMac to beat — but it is the iMac for most people to buy.
Astell & Kern’s digital expertise comes good in this entertaining USB-C cable DAC
For
Notable improvement to audio
Clean, precise character
Nicely made
Against
No iOS device compatibility
No MQA support
For a relatively simple product, Astell & Kern’s first portable DAC has a rather convoluted moniker. ‘Astell & Kern AK USB-C Dual DAC Cable’ isn’t something you’d want to say out loud (or type) often but, to the company’s credit, it sums up the product perfectly: it’s a USB-C cable with two DACs inside.
Thankfully, the name doesn’t attempt to further explain its purpose, so let us fill in the gaps.
Features
Portable DACs – compact DACs that don’t rely on mains power – have arrived in force in recent years with the mission of conveniently improving the sound quality between your phone or computer and wired headphones. That’s because the digital-to-analogue converters and analogue output stages built into phones and computers are generally pretty poor.
Though wireless headphones connected to a device may be the portable audio preference of many nowadays, a wired set-up generally still offers the best performance-per-pound value, particularly if you want to play hi-res audio.
Astell & Kern AK USB-C Dual DAC Cable tech specs
Input USB-C
Output 3.5mm
Hi-res audio PCM 32-bit/384kHz, DSD256
Weight 27g
While there are a number of traditional box or USB stick portable DACs in existence, the AK USB-C Dual DAC Cable is one of an increasingly common group of DACs designed to enhance on-the-go or desktop sound quality in cable form. This Astell & Kern, like the Zorloo Ztella and THX Onyx, is essentially an extension of your headphones cable; the discreet middleman between them and your source device.
At one end is a 3.5mm output, and at the other is a USB-C connector for plugging into any device with such a port, such as an Android phone, Windows 10 PC, tablet or macOS computer. For the bulk of our testing, we use it with a Samsung Galaxy S21 and an Apple MacBook Pro.
Some portable DACs, such as the multi-Award-winning Audioquest DragonFly Red, have a USB-A connection instead, but now that USB-C is becoming more prevalent it makes sense for a portable DAC like this one to adopt it. You can always buy a USB-C-to-USB-A adapter to cater for devices with such ports.
Portable DACs can often be used with Apple’s camera adapter to make them compatible with iPhones and iPads, but Astell & Kern says that isn’t the case here “due to the dual DAC incompatibility and power restrictions of iOS devices”. So iPhone users will have to look elsewhere.
The dual DACs (specifically, two Cirrus Logic CS43198 MasterHIFi chips) support native high-resolution audio playback of PCM files up to 32-bit/384kHz and DSD256. However, due to the AK USB-C Dual DAC Cable’s lack of MQA file support, Tidal HiFi subscribers won’t be able to benefit from the (MQA-encoded) hi-res Tidal Masters that are part of the tier’s offering. It’s also worth noting that the DAC has been built for sound output only, so it won’t work with headphones with an in-line remote.
A portable cable DAC is new territory for Astell & Kern – the company is most renowned for its portable music players but also makes headphones and desktop audio systems. But digital-to-analogue conversion technology is something the company is well versed in. And that shows.
For the AK USB-C Dual DAC Cable, Astell & Kern says it developed a circuit chip on a six-layer PCB just 14 x 41mm in size, featuring bespoke capacitors found in its music players, and optimised to prevent power fluctuations. The analogue amplifier (with a 2Vrms output level), meanwhile, is designed to drive even power-hungry and high-impedance headphones.
Sound
We use a range of headphones, from high-end Grados to more modest Beyerdynamic on-ears and Sennheiser Momentum earbuds – and the Astell & Kern doesn’t struggle to power any of them. However, we would advise keeping an eye on your playback device’s volume level when you first connect the DAC and plug in your headphones (especially if you’re switching between pairs) to avoid getting an unexpected earful. It’s something Astell & Kern advises in the manual, too.
Adding the AK USB-C Dual DAC Cable between these headphones and our source devices (which provide power to the DAC) makes a world of difference. As the likes of the Zorloo Ztella and Audioquest DragonFly Black have shown, even a modest outlay can make a significant improvement to your portable sound.
The Samsung Galaxy S21 is by no means the worst-sounding smartphone out there, and yet the Astell & Kern makes music come through our wired headphones much clearer, cleaner and punchier than with just a standard USB-C-to-3.5mm dongle. This little DAC doesn’t just do the basics by amplifying the sound and beefing up its tone, it also goes the extra mile to open up music and let you in on more of its detail.
Considering the increasing competition in the portable DAC market, you could say it’s a necessary mile. One of our favourite portable DACs, the Audioquest DragonFly Red, proves to be a notably more insightful and rhythmically entertaining performer – but then it is significantly pricier at £169 ($200, AU$280). For this modest amount of money, the AK USB-C Dual DAC Cable is a very attractive proposition indeed.
We play Lesley by Dave ft Ruelle and the rapper’s poignant storytelling is all the more compelling for the boost in clarity and vocal insight delivered by the DAC. The melodious synth chords, which twinkle with clarity against the contrasting backdrop, are planted with precision on either side.
It’s a similar story as we plug the Astell & Kern into our MacBook Pro and settle into Big Thief’s Shoulder, the presentation pleasantly opened up and generously populated with definition aplenty around Adrianne Lenker’s pleading vocal delivery and the warm textures of the band’s hallmark folksy guitar licks.
Build
So, it sounds good. But what’s it like to live with? After all, this is an everyday device that’s likely to sit in your pocket or on your desktop during the 9 to 5. Perhaps most crucially for a device of this nature, the AK USB-C Dual DAC Cable is compact, lightweight (27g) and well made – to the extent that before long we feel comfortable tossing it in a bag or shoving it into a trouser pocket.
The twisted cable between the USB-C output and main body – made up of Technora aramid fibre at its core, wrapped by copper layers and finished with shielding treatment – makes it easy to manipulate the device into a jeans pocket when connected to a phone, and feels built to last. It also helps absorb the shock of accidental knocks, unlike USB stick designs.
While we would expect a device like this to last years, in the weeks we spend in its company we feel confident of its durability. Even when we accidentally yank the device out of our playback source with the cable a number of times, it proves hardy enough to withstand it.
While made to fit nicely into a pocket, some consideration has also clearly been taken to make the AK USB-C Dual DAC Cable look nice when it’s not hidden away – when it’s on a desktop, for example.
The metal casing at the end of the cable – comparable with one of the more compact USB sticks in our collection – has a polished finish and angled surface that resonate with the aesthetic of the company’s premium music players. Design niceties on products like these are only ever going to be the small touches, but they’re here at least.
Verdict
Before Astell & Kern announced its AK USB-C Dual DAC Cable, it wouldn’t have been a stretch to imagine the company making such a product. It has been in the portable digital audio game for years and enjoyed much success.
That know-how has been put to good use in offering USB-C device owners an affordable, practical way to soup up their smartphone or desktop sound through wired headphones. It’s such an appealing option that we can almost forgive the unwieldy name.
Oculus will soon roll out its v29 software for Quest and Quest 2 headsets, and this one adds more features and functionality to the lineup. The most exciting one is a new Live Overlay casting feature that will give you an easy way to capture a mixed reality view of you using VR superimposed over the content displayed in your headset. Any VR app that supports casting and recording will work with this feature, according to Oculus.
All you’ll need is an iPhone XS or newer, a Quest headset, and the Oculus app for iOS updated to a new version that’s coming to “a subset of users.” Then you turn on the feature and have someone aim the camera at you (or aim it at yourself).
There are other highlights to mention in this software version, but I want to focus a moment longer on how big of a technical achievement Live Overlay seems to be. The company’s previous solution, its Mixed Reality Capture Tool on PC for Quest and Rift S, required a bunch of expensive hardware, including a rig with 16GB of RAM, a decent graphics card, your own 1080p webcam, a 5GHz Wi-Fi router, and — of course — your own green screen. But for lucky iOS users who get the app update, Oculus’ improved feature has eliminated the need for almost all of those gadgets. You just need your headset and your phone.
In other casting news, Oculus is allowing you to capture what you’re saying into your headset’s built-in microphone during casting sessions or when you’re recording a video clip. It’s also extending multi-user support and the app sharing feature to owners of the original Quest, so multiple people can share games on a single headset. These features were originally limited to the Quest 2, but Oculus is making good on its pledge to bring them to the Quest.
There’s a Files app coming with this update, and it’ll be located within your app library. Oculus says in its blog post that you’ll be able to download and upload media files “to and from your favorite websites” through the browser built into the headset. In other words, it should be a lot easier to upload content to social media sites other than Facebook.
The last couple of additions include an Infinite Office update that lets iOS users (on an iPhone 7 or newer, with Android support coming soon) see lock screen notifications from within a Quest headset. Oculus is also adding a shortcut for its Passthrough command to the user interface in the Quick Settings menu, giving you an alternative method of triggering it instead of physically double-tapping the side of your headset.
Lastly, Oculus will begin showing ads for VR experiences within the mobile app with the intent to broaden discovery for both developers looking for an audience and users to find new content. This could be useful for smaller developers who have built quality experiences but don’t have the awareness of more popular apps. That said, if you’re a curious Quest owner looking for new experiences, I suggest checking out SideQuest on PC or from an Android phone.
DLSS 2.0 off vs DLSS 2.0 on (Image credit: Nvidia)
DLSS stands for deep learning super sampling. It’s a type of video rendering technique that looks to boost framerates by rendering frames at a lower resolution than displayed and using deep learning, a type of AI, to upscale the frames so that they look as sharp as expected at the native resolution. For example, with DLSS, a game’s frames could be rendered at 1080p resolution, making higher framerates more attainable, then upscaled and output at 4K resolution, bringing sharper image quality over 1080p.
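To make that resolution relationship concrete, here’s a minimal sketch in NumPy (our illustration, not Nvidia’s code). It uses crude nearest-neighbour repetition where DLSS would use a trained neural network, but the input/output shapes are the same: a low-resolution render goes in, a display-resolution frame comes out.

```python
import numpy as np

def naive_upscale(frame, factor):
    # Nearest-neighbour upscale: repeat each pixel `factor` times per axis.
    # DLSS swaps this crude filter for a neural network, but the shape
    # relationship is identical.
    return frame.repeat(factor, axis=0).repeat(factor, axis=1)

render = np.zeros((1080, 1920, 3), dtype=np.uint8)  # 1080p internal render
output = naive_upscale(render, 2)                   # 2x per axis = 4x the pixels
print(output.shape)  # (2160, 3840, 3), a 4K output frame
```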
DLSS is an alternative to other anti-aliasing and upscaling techniques — like temporal anti-aliasing (TAA), a post-processing algorithm — but it requires an RTX graphics card and game support (see the DLSS Games section below). Games that run at lower frame rates or higher resolutions benefit the most from DLSS.
According to Nvidia, DLSS 2.0, the most common version, can boost framerates by 200-300% (see the DLSS 2.0 section below for more). The original DLSS is in far fewer games and we’ve found it to be less effective, but Nvidia says it can boost framerates “by over 70%.” DLSS can really come in handy, even with the best graphics cards, when gaming at a high resolution or with ray tracing, both of which can cause framerates to drop substantially compared to 1080p.
In our experience, it’s difficult to spot the difference between a game rendered at native 4K and one rendered in 1080p and upscaled to 4K via DLSS 2.0 (that’s the ‘performance’ mode with 4x upscaling). In motion, it’s almost impossible to tell the difference between DLSS 2.0 in quality mode (i.e., 1440p upscaled to 4K), though the performance gains aren’t as great.
For a comparison of how DLSS impacts game performance with ray tracing, see AMD vs Nvidia: Which GPUs Are Best for Ray Tracing? In that testing we only used DLSS 2.0 in quality mode (2x upscaling), and the gains are still quite large in the more demanding games.
When DLSS was first released, Nvidia claimed it showed more temporal stability and image clarity than TAA. While that might be technically true, it varies depending on the game, and we much prefer DLSS 2.0 over DLSS 1.0. An Nvidia rep confirmed to us that because DLSS requires a fixed amount of GPU time per frame to run the deep learning neural network, games running at high framerates or low resolutions may not have seen a performance boost with DLSS 1.0.
Below is a video from Nvidia (so take it with a grain of salt), comparing Cyberpunk 2077 gameplay at both 1440p resolution and 4K with DLSS 2.0 on versus DLSS 2.0 off.
DLSS is only available with RTX graphics cards, but AMD is working on its own alternative for Team Red graphics cards. AMD FidelityFX Super Resolution (FSR) is supposed to debut in 2021. It will require separate support from games, and we haven’t seen it in action yet. But like other FidelityFX technologies, it’s supposed to be GPU agnostic, meaning it will work on Nvidia and even Intel GPUs that have the necessary hardware features. We’re also expecting the next Nintendo Switch to have DLSS via an integrated SoC designed by Nvidia.
DLSS Games
In order to use DLSS, you need an RTX graphics card and need to be playing a game that supports the feature. You can find a full list of games supporting DLSS as of April via Nvidia below. Unreal Engine and Unity Engine also both have support for DLSS 2.0, meaning games using those engines should be able to easily implement DLSS.
Anthem
Battlefield V
Bright Memory
Call of Duty: Black Ops Cold War
Call of Duty: Modern Warfare
Call of Duty: Warzone
Control
CRSED: F.O.A.D. (Formerly Cuisine Royale)
Crysis Remastered
Cyberpunk 2077
Death Stranding
Deliver Us the Moon
Edge of Eternity
Enlisted
F1 2020
Final Fantasy XV
Fortnite
Ghostrunner
Gu Jian Qi Tan Online
Iron Conflict
Justice
Marvel’s Avengers
MechWarrior 5: Mercenaries
Metro Exodus
Metro Exodus PC Enhanced Edition
Minecraft With RTX For Windows 10
Monster Hunter: World
Moonlight Blade
Mortal Shell
Mount & Blade II: Bannerlord
Nioh 2 – The Complete Edition
Outriders
Pumpkin Jack
Shadow of the Tomb Raider
System Shock
The Fabled Woods
The Medium
War Thunder
Watch Dogs: Legion
Wolfenstein: Youngblood
Xuan-Yuan Sword VII
DLSS 2.0 and DLSS 2.1
In March 2020, Nvidia announced DLSS 2.0, an updated version of DLSS that uses a new deep learning neural network that’s supposed to be up to 2 times faster than DLSS 1.0 because it leverages RTX cards’ AI processors, called Tensor Cores, more efficiently. This faster network also allows the company to remove any restrictions on supported GPUs, settings and resolutions.
DLSS 2.0 is also supposed to offer better image quality while promising up to 2-3 times the framerate (in 4K Performance Mode) compared to the predecessor’s up to around 70% fps boost. Using DLSS 2.0’s 4K Performance Mode, Nvidia claims an RTX 2060 graphics card can run games at max settings at a playable framerate. Again, a game has to support DLSS 2.0, and you need an RTX graphics card to reap the benefits.
The original DLSS was apparently limited to about 2x upscaling (Nvidia hasn’t confirmed this directly), and many games limited how it could be used. For example, in Battlefield V, if you have an RTX 2080 Ti or faster GPU, you can only enable DLSS at 4K — not at 1080p or 1440p. That’s because the overhead of DLSS 1.0 often outweighed any potential benefit at lower resolutions and high framerates.
In September 2020, Nvidia released DLSS 2.1, which added an Ultra Performance Mode for super high-res gaming (9x upscaling), support for VR games, and dynamic resolution. The latter, an Nvidia rep told Tom’s Hardware, means that, “The input buffer can change dimensions from frame to frame while the output size remains fixed. If the rendering engine supports dynamic resolution, DLSS can be used to perform the required upscale to the display resolution.” Note that you’ll often hear people referring to both the original DLSS 2.0 and the 2.1 update as “DLSS 2.0.”
DLSS 2.0 Selectable Modes
One of the most notable changes between the original DLSS and the fancy DLSS 2.0 version is the introduction of selectable image quality modes: Quality, Balanced, or Performance — and Ultra Performance with 2.1. This affects the game’s rendering resolution, with improved performance but lower image quality as you go through that list.
With 2.0, Performance mode offered the biggest jump, upscaling games from 1080p to 4K. That’s 4x upscaling (2x width and 2x height). Balanced mode uses 3x upscaling, and Quality mode uses 2x upscaling. The Ultra Performance mode introduced with DLSS 2.1 uses 9x upscaling and is mostly intended for gaming at 8K resolution (7680 x 4320) with the RTX 3090. While it can technically be used at lower target resolutions, the upscaling artifacts are very noticeable, even at 4K (720p upscaled). Basically, DLSS looks better as it gets more pixels to work with, so while 720p to 1080p looks good, rendering at 1080p or higher resolutions will achieve a better end result.
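If you want to sanity-check those figures, a quick shell snippet (our own illustration, not an Nvidia tool; the per-axis scale is the square root of the area factor) prints the approximate render resolution each mode uses for a 4K output:
$ for mode in Quality:2 Balanced:3 Performance:4 Ultra-Performance:9; do awk -v m="$mode" 'BEGIN { split(m, a, ":"); printf "%-17s %4.0f x %4.0f\n", a[1], 3840/sqrt(a[2]), 2160/sqrt(a[2]) }'; done
Performance mode works out to exactly 1920 x 1080, matching Nvidia’s 1080p-to-4K example, while Ultra Performance lands on 1280 x 720.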
How does all of that affect performance and quality compared to the original DLSS? For an idea, we can turn to Control, which shipped with DLSS 1.0 and later received DLSS 2.0 support when the update launched. (Remember, the following image comes from Nvidia, so it’d be wise to take it with a grain of salt too.)
One of the improvements DLSS 2.0 is supposed to bring is strong image quality in areas with moving objects. The updated rendering in the above fan image looks far better than the image using DLSS 1.0, which actually looked noticeably worse than having DLSS off.
DLSS 2.0 is also supposed to provide an improvement over standard DLSS in areas of the image where details are more subtle.
Nvidia promised that DLSS 2.0 would result in greater game adoption. That’s because the original DLSS required training the AI network separately for every new game that needed DLSS support. DLSS 2.0 uses a generalized network, meaning it works across all games and is trained using “non-game-specific content,” as per Nvidia.
For a game to support the original DLSS, the developer had to implement it, and then the AI network had to be trained specifically for that game. With DLSS 2.0, that latter step is eliminated. The game developer still has to implement DLSS 2.0, but it should take a lot less work, since it’s a general AI network. It also means updates to the DLSS engine (in the drivers) can improve quality for existing games. Unreal Engine 4 and Unity have both also added DLSS 2.0 support, which means it’s trivial for games based on those engines to enable the feature.
How Does DLSS Work?
Both the original DLSS and DLSS 2.0 work with Nvidia’s NGX supercomputer for training of their respective AI networks, as well as RTX cards’ Tensor Cores, which are used for AI-based rendering.
For a game to get DLSS 1.0 support, Nvidia first had to train the DLSS AI neural network, a type of AI network called a convolutional autoencoder, with NGX. It started by showing the network thousands of screen captures from the game, each with 64x supersample anti-aliasing. Nvidia also showed the neural network images that didn’t use anti-aliasing. The network then compared the shots to learn how to “approximate the quality” of the 64x supersampled image using lower-quality source frames. The goal was higher image quality without hurting the framerate too much.
The AI network would then repeat this process, tweaking its algorithms along the way so that it could eventually come close to matching the 64x quality with the base quality images via inference. The end result was “anti-aliasing approaching the quality of [64x Super Sampled], whilst avoiding the issues associated with TAA, such as screen-wide blurring, motion-based blur, ghosting and artifacting on transparencies,” Nvidia explained in 2018.
DLSS also uses what Nvidia calls “temporal feedback techniques” to ensure sharp detail in the game’s images and “improved stability from frame to frame.” Temporal feedback is the process of applying motion vectors, which describe the directions objects in the image are moving in across frames, to the native/higher resolution output, so the appearance of the next frame can be estimated in advance.
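As a rough sketch of the idea (our simplification, not Nvidia’s published math): for each pixel, the previous frame is re-sampled along that pixel’s motion vector, i.e. predicted(x, y) ≈ previous(x − mv_x, y − mv_y), and the result is combined with the newly rendered low-resolution frame to stabilize detail from one frame to the next.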
DLSS 2.0 gets its speed boost through its updated AI network that uses Tensor Cores more efficiently, allowing for better framerates and the elimination of limitations on GPUs, settings and resolutions. Team Green also says DLSS 2.0 renders just 25-50% of the pixels (and only 11% of the pixels for DLSS 2.1 Ultra Performance mode), and uses new temporal feedback techniques for even sharper details and better stability over the original DLSS.
Nvidia’s NGX supercomputer still has to train the DLSS 2.0 network, which is also a convolutional autoencoder. Two things go into it, as per Nvidia: “low resolution, aliased images rendered by the game engine” and “low resolution, motion vectors from the same images — also generated by the game engine.”
DLSS 2.0 uses those motion vectors for temporal feedback, which the convolutional autoencoder (or DLSS 2.0 network) performs by taking “the low resolution current frame and the high resolution previous frame to determine on a pixel-by-pixel basis how to generate a higher quality current frame,” as Nvidia puts it.
The training process for the DLSS 2.0 network also includes comparing the image output to an “ultra-high-quality” reference image rendered offline in 16K resolution (15360 x 8640). Differences between the images are sent to the AI network for learning and improvements. Nvidia’s supercomputer repeatedly runs this process, on potentially tens of thousands or even millions of reference images over time, yielding a trained AI network that can reliably produce images with satisfactory quality and resolution.
With both DLSS and DLSS 2.0, once the AI network’s training is complete, the NGX supercomputer delivers the AI model to Nvidia RTX graphics cards through GeForce Game Ready drivers. From there, your GPU can use its Tensor Cores’ AI power to run DLSS in real time alongside the supported game.
Because DLSS 2.0 is a general approach rather than one trained on a single game, the quality of the DLSS 2.0 algorithm can also improve over time without a game needing to include updates from Nvidia. The updates reside in the drivers and can affect all games that utilize DLSS 2.0.
This article is part of the Tom’s Hardware Glossary.
After you’ve gone through the process of building Chia Coin plots on a PC (see how to farm Chia Coin), there’s no need to waste the electricity and tie up expensive computer hardware keeping those plots connected to the Internet. Instead, it’s best to take the external drive or drives holding the plots and hook them up to a Raspberry Pi, where they can stay online without gulping down too much juice.
In this tutorial, we will create a custom Raspberry Pi Chia farming device powered by the latest Ubuntu 64-bit release for the Raspberry Pi. The unit is designed to be hidden away, farming Chia Coin silently while we go about our lives. As such, we chose to house the Raspberry Pi 4 inside of a passively cooled case. Our choice this time was the Akasa Gem Pro, which has great cooling for the SoC, PMIC and PCIe chip, and a rather tasteful, if unusual, design.
For This Project You Will Need
Raspberry Pi 4 4GB
Raspberry Pi case, perhaps one of the best Raspberry Pi cases, with cooling
An external USB storage drive or SSD / HDD with USB 3.0 caddy.
16GB Micro SD card or larger
Raspberry Pi Imager tool
Accessories to use your Raspberry Pi 4
Installing Chia On Raspberry Pi 4
1. Install Ubuntu 21.04 to a 16GB micro SD card using the official Raspberry Pi Imager tool. You can also try a headless installation.
2. Connect your keyboard, mouse, screen and Ethernet cable. If you did a headless install, you can skip the keyboard / mouse / screen. Boot your Raspberry Pi 4 and complete the Ubuntu 21.04 installation process. Reboot the Raspberry Pi for all the changes to take effect.
3. Open a terminal and update the software repositories, then upgrade the system software.
$ sudo apt update
$ sudo apt upgrade -y
4. Install the OpenSSH server to enable remote monitoring via an SSH session. After installation, this will automatically start the SSH server as a background service.
$ sudo apt install openssh-server
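If you want to confirm the service came up (a purely optional check; on Ubuntu the service is named ssh), you can query systemd.
$ systemctl status ssh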
5. Install and start Byobu, a terminal multiplexer that will enable us to log out of the Pi and leave our Chia farm running.
$ sudo apt install byobu
$ byobu
6. Make a note of your Raspberry Pi’s IP address and hostname.
$ hostname -I
$ hostname
7. Install the Git version control software, which we need to fetch the Chia application’s source code; the Chia install script will handle the remaining build dependencies.
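$ sudo apt install -y git
8. Clone the Chia blockchain source code from the project’s GitHub repository. (We’re assuming the official Chia-Network repository and its latest branch here; check the project’s own install notes if the branch layout has changed.)
$ git clone https://github.com/Chia-Network/chia-blockchain.git -b latest --recurse-submodules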
9. Change directory to chia-blockchain and run the installer.
$ cd chia-blockchain
$ sh install.sh
10. Activate the Chia virtual environment and create a new Chia config.
$ . ./activate
$ chia init
11. Connect your USB 3.0 hard drive containing your Chia plot to the blue USB 3.0 ports on the Raspberry Pi 4. The drive will mount to a directory inside of /media/.
12. In the terminal, change directory to your USB drive. Our test drive is at /media/les/Chia/
$ cd /media/YOUR USERNAME/DRIVE NAME
13. Add the plot from your USB drive to the Chia config using the 24-word key created when the plot was made. Enter the command, then type the 24 words with a space between each word.
$ chia keys add
14. Start farming the plot; this will also start services for your wallet. This command will only show that the process has started.
$ chia start farmer
15. Use this command to watch the progress of syncing our machine to the network and to confirm that farming has begun. The output updates every two seconds; press CTRL + C to stop it.
$ watch 'chia show -s -c'
16. Press F6 to detach from the current Byobu session. This releases us from the running session while keeping the farming progress command running in the background. Should you wish to go back to that Byobu session, type this command.
$ byobu
It will take some time for the Pi to sync with the Chia network, but it will continue to farm as it syncs. At this point you can, if you wish, unplug the monitor, keyboard and mouse, leaving just the power, network and USB 3.0 drive connected. Your Pi will happily farm Chia quietly in the corner of the room. But to access the Pi we now need to use SSH, a secure shell terminal, and for that we need to install a client on our computer.
Should you ever need to manually start the Chia farmer, for example after a reboot, start byobu and repeat steps 14 to 16.
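Assuming the repository was cloned into your home directory, the condensed restart sequence looks like this:
$ byobu
$ cd ~/chia-blockchain
$ . ./activate
$ chia start farmer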
How To Remotely Access Your Raspberry Pi Chia Coin Farm
1. Install PuTTY on your PC. PuTTY is a client used to make remote SSH connections to our Raspberry Pi 4.
2. Open PuTTY and in the Host Name or IP Address field enter the hostname or IP address of your Raspberry Pi 4. Click Open.
3. Enter your username and password to remotely login to the Raspberry Pi 4.
4. Open the Byobu session to see the current progress.
$ byobu
Auto Mount USB Drive on Boot
Should we ever power off our Pi, or should it lose power, we need the drive to mount automatically so it’s ready to farm Chia with little interaction. It is best to connect your keyboard, mouse and screen for this part of the project, but it can also be done remotely over an SSH connection.
1. With the USB drive connected, open a terminal and list all the attached devices. Our device is sda1, which is mounted at /media/les/Chia. Make a note of your mountpoint, as we will need it later.
$ lsblk
2. Run this command to identify the UUID of your drive. Copy the text which starts UUID=. For example, our UUID was UUID="b0984018-3d5e-4e53-89d9-6c1f371dbdee".
$ sudo blkid /dev/YOUR DRIVE
3. Open the filesystem configuration file, fstab with sudo.
$ sudo nano /etc/fstab
4. Add this line to the end of the file. Add your UUID after the =, and leave a space before entering the mountpoint. Here is how our drive is configured.
UUID=b0984018-3d5e-4e53-89d9-6c1f371dbdee /media/les/Chia/ auto nosuid,nodev,nofail,x-gvfs-show 0 0
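Save and exit (CTRL + X, then Y, in nano). To test the new entry without rebooting, ask the system to mount everything listed in fstab; if the command returns without errors, the drive will come up automatically on the next boot.
$ sudo mount -a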
Palit Microsystems, which owns Galax, Gainward and KFA2, is reportedly working on a range of GeForce RTX 30-series graphics cards that feature cryptocurrency mining limiters which severely degrade the financial viability of using the cards for mining. Last week, we learned that Palit was prepping GeForce RTX 3070 and 3080 graphics cards with limited hash rates (LHR), and this week VideoCardz reported that the GeForce RTX 3060/3060 Ti LHR boards are also on the way from Galax.
The new Galax GeForce RTX 3060/3060 Ti For Gamers boards will be based on Nvidia’s LHR GA106 and GA104 GPUs featuring new IDs and paired with appropriate firmware, according to VideoCardz, which cited Galax as its source; however, the cards aren’t currently listed on the vendor’s site.
Galax’s FG (For Gamers) series GeForce RTX 3060 will reportedly carry 12GB of GDDR6 memory with a 192-bit interface, whereas the GeForce RTX 3060 Ti will come with 8GB of GDDR6 memory and a 256-bit interface. Both will be equipped with custom triple-fan cooling systems, but their clock rates will be in line with Nvidia’s recommendations: up to 1,777 MHz for the RTX 3060 and up to 1,665 MHz for the RTX 3060 Ti.
Unofficial reports claim Nvidia is quietly rolling out its existing graphics processors with cryptomining limiters enabled by a combination of a new GPU ID, firmware and driver. In particular, it is expected that Nvidia’s lineup of crypto-limited graphics cards will include both existing and new models, such as the RTX 3090 (GA102-302), RTX 3080 Ti (GA102-225), RTX 3080 (GA102-202), RTX 3070 Ti (GA104-400), RTX 3070 (GA104-302), RTX 3060 Ti (GA104-200), RTX 3060 (GA106-302), and GeForce RTX 3050/3050 Ti.
There is a good reason why Nvidia and its graphics card partners reportedly want to discourage the use of GeForce GPUs, some of the best graphics cards for gaming, for cryptomining. Mining runs a GPU at around 100% load 24/7, something a chip designed for a client PC is not meant to sustain. As a result, failure rates attributable to mining are almost guaranteed to grow.
Keeping in mind that Nvidia controls over 80% of the market of discrete desktop graphics cards selling around 9 million GPUs per quarter, increased failure rates would clearly hit the company and partners badly. Just 1% of 9,000,000 is 90,000 RMA cases, and makers of graphics cards (and possibly Nvidia itself) may not be ready to process an overwhelming number of RMA cases in a timely manner. Still, very few GeForce RTX LHR graphics cards have been officially announced so far.
Widespread flaws affecting Wi-Fi have been disclosed to the public by security researcher Mathy Vanhoef, nine months after he tipped off the Wi-Fi Alliance about the problem. The vulnerabilities, reported by Gizmodo from a site set up by Vanhoef, exploit mistakes in the implementation of Wi-Fi standards and can affect any Wi-Fi device, no matter how old, running any level of security, including WPA2 and WPA3.
The ‘fragmentation and aggregation attacks’ – FragAttacks for short – are 12 different vulnerabilities that see Wi-Fi devices leak user data if probed in the right way. Three of the flaws are baked into the Wi-Fi standard itself, while the others flow from programming errors in specific products. The flaws have likely been lurking since Wi-Fi was first released in 1997, as even the venerable WEP protocol is vulnerable – though you really should have moved on from WEP by now, as it’s easily broken.
By taking advantage of the way some routers accept plaintext during handshakes, for example, or the way some networks cache data, intruders could intercept personal data, or even direct users to fake websites. Vanhoef talks us through the attacks in this YouTube video, remotely controlling a smart plug and compromising an outdated Windows 7 PC.
“The biggest risk in practice,” Vanhoef writes, “is likely the ability to abuse the discovered flaws to attack devices in someone’s home network. For instance, many smart home and internet-of-things devices are rarely updated, and Wi-Fi security is the last line of defense that prevents someone from attacking these devices. Unfortunately, due to [these] vulnerabilities, this last line of defense can now be bypassed.”
There is some good news, however: most of the flaws are hard to exploit, and patches are available for many devices, including three from Microsoft going all the way back to Windows 7, and from all major router manufacturers (though not all models have received new firmware yet). At the time of writing, Vanhoef said he wasn’t aware of any attacks in the wild using the exploits. This could be a good time to ditch your service provider’s router for the latest and best routers.
Icy Dock has developed the industry’s only U.2 to USB 3.2 Gen 2 adapter, which lets you connect an enterprise-grade U.2 SSD to any desktop or laptop with a USB Type-A or Type-C port. The EZ-Adapter Ex MB931U-1VB targets people who need to transfer data from an enterprise-grade SSD to a PC or those who use U.2 drives as recording medium and need to transfer videos to a computer. But PC builders may be attracted to the adapter too.
The U.2 form factor (SFF-8639) was developed primarily for business and mission-critical server and workstation applications that have very strict requirements for connectivity, thermals, reliability and hot-plug capability. Today, U.2 drives are used in servers, workstations and more. For example, select Blackmagic cameras with the Ursa Mini Recorder attached can use U.2 SSDs as a storage medium.
A big market for the EZ-Adapter Ex MB931U-1VB is content creators who have to transfer loads of data from one PC to another (or from a camera to a PC). 10GbE networks used in studios are fast, yet the PCIe 3.0 x4 interface of U.2 SSDs is a lot faster, so it makes sense to use U.2 SSDs like flash drives. There are also people who might prefer to use enterprise-grade SSDs as their direct attached storage (DAS) due to their higher endurance and reliability.
But another potential market comes from PC DIYers. U.2 SSDs tend to be very expensive when bought through the IT channel, but they can also be found on sites like eBay for considerably less. Depending on the model, U.2 drives are designed for read-intensive, write-intensive or mixed workloads. Even after some time in service, most U.2 SSDs will have plenty of endurance left. Furthermore, such drives are tailored for sustained, rather than burst, performance. As a result, even used U.2 SSDs may be faster and more durable than cheap consumer-grade drives rated for 0.2 DWPD over a three-year period. Hence, it makes sense to consider U.2 SSDs for DIY DAS applications.
Yet connecting such drives to PCs is complicated: only select desktop workstations have U.2 ports (or M.2 to U.2 adapters), not all of them have bays that can house a U.2 drive, and there are no laptops with U.2 slots at all.
Icy Dock’s EZ-Adapter Ex MB931U-1VB is based on the ASMedia ASM2362 controller. It can house any U.2 SSD and connect it to a PC with a USB 3.2 Gen 2 Type-A or Type-C connector.
The EZ-Adapter Ex MB931U-1VB adapter is available now for $150 from Amazon. A power adapter and USB-A and USB-C cables are included.
It seems like Adata’s decided to lace up its boots, put on its cowboy hat, and ride the bullish Chia Coin market until it finally stops bucking. The company published a press release today extolling the virtues of its latest high-capacity SSDs and memory products in a bid to convince miners to choose its products over the competition.
Adata said in the release that its XPG SX8200 Pro Gen3 x4, SX8100 Gen3 x4, and GAMMIX S50 Lite Gen4 x4 SSDs as well as its DDR4-3200 and XPG GAMMIX D45 3600 32GB modules “have been tested for Chia mining by ADATA engineers on the latest Intel Z590 and AMD X570 platforms to guarantee seamless performance.”
We reported yesterday that Adata’s SSD and DRAM orders increased 500% month-over-month in April due to the Chia boom. The cryptocurrency, which has been pitched as a more environmentally friendly alternative to Bitcoin, prizes storage and memory performance over sheer computational power.
“While the sudden increase in SSD demand is quite unexpected,” Adata president Shalley Chen said, “we are not at all surprised that Chia miners are turning to our high-capacity storage products for their unique needs. At ADATA, we have always been focused on developing and offering high-quality, high-performance, and high-capacity devices to meet the needs of discerning users, whether they be PC enthusiasts, gamers, creators, or crypto miners.”
With the rise of Chia, PC builders have to worry about sourcing yet another part because of cryptocurrency. At this point, we’ll be lucky if mining doesn’t make its way to headsets, keyboards, and mice.
Phison said earlier this week that SSD prices would rise because the cryptocurrency’s boom has coincided with supply issues. We’ve known for weeks that Chia is likely to have an outsized influence on the storage market, especially since farming can quickly wear out some drives.
Until the price of Chia falls, however, that’s unlikely to dissuade miners. CoinGecko said the cryptocurrency’s value had risen nearly 50% over the last week, and it doesn’t show any signs of dropping soon. It makes sense for miners to buy as many drives as possible to maximize their potential profits in the short term.
Minisforum is known for its ultra-compact form-factor (UCFF), highly integrated PCs primarily designed for offices or living rooms. Apparently, the company does not want to stop there. This week, it introduced a rather extraordinary product for itself. Not only is the Minisforum GameMini an attempt at cracking our best gaming PCs list, but it’s also an open-case gaming PC aimed at enthusiasts.
The Minisforum GameMini appears to be quite a powerful rig. It uses Gigabyte’s Aorus B550I Pro AX motherboard carrying an AMD Ryzen 5 5600X (6 cores, 12 threads, clock speeds of 3.70 GHz – 4.60 GHz, 32MB of L3 cache, 65W). The board also packs 32GB of dual-channel DDR4-3200 RAM and a 1TB Kingston KC2500 M.2 SSD. The system is also equipped with an AMD Radeon RX 6700 XT graphics card sitting on the opposite side to the motherboard. The PC is powered by SilverStone’s SX650-G, a 650W SFX power supply, which leaves headroom for fairly easy upgrades.
Since the GameMini is an open system, Minisforum doesn’t have to worry too much about cooling the Ryzen 5 5600X, a 65W CPU that can run pretty hot, or the rather power-hungry Radeon RX 6700 XT graphics card that’s rated for up to 230W TGP. Both components use rather modest air coolers.
As far as connectivity is concerned, Minisforum’s GameMini has everything we’ve come to expect from a Mini-ITX PC these days. The Aorus B550I Pro AX motherboard has a Wi-Fi 6 and Bluetooth 5.1 module, a GbE port and multiple USB connectors.
Minisforum traditionally funds development of its PCs using a crowdfunding platform, so it hasn’t yet discussed pricing of its GameMini or a final release date.