Nvidia has just celebrated GeForce Now’s one-year anniversary with a blog post showcasing the service’s achievements over its lifetime. Nvidia says GeForce Now has over 6 million members, who have streamed more than 175 million hours of gameplay, including 100 million hours of multiplayer and 16 million hours of indie titles alone. The GeForce Now library has also expanded to over 800 titles, ranging from RTX-enabled AAA games to small indie releases.
That number of games on the service continues to grow – Nvidia also just announced thirty new games coming to GeForce Now this month, including Apex Legends Season 8, Valheim, Werewolf: The Apocalypse – Earthblood, and more.
GeForce Now has expanded to support a wide range of devices and platforms, including Windows, Mac, Android, Android TV, iPhone and iPad (via Safari). Support for Google Chrome is now in beta on both Windows and Mac.
GeForce Now is competing with a series of other streaming services, including Google Stadia, Amazon’s Luna and, perhaps its biggest rival, Microsoft’s xCloud, which is built into Xbox Game Pass. GeForce Now has drawn praise for its model: you buy games yourself from existing storefronts and stream them from a remote rig, whereas Stadia requires you to purchase each game individually and keep it on that service. Microsoft’s model is more akin to Netflix, with games coming and going each month.
GeForce Now’s first year has seen some serious growth since the closed beta – especially while GPUs are hard to find.
While we still don’t have an Intel Rocket Lake-S Core i9-11900K CPU to use for testing, the Intel Z590 motherboards are arriving in our labs and on store shelves. So while we await the ability to talk benchmarks, we’ll be walking in detail through the features of these brand-new boards. First up on our bench was the ASRock Z590 Steel Legend WiFi 6E, and now we have the Gigabyte Z590 Aorus Master to dive into.
The latest version of this premium motherboard line includes an incredibly robust VRM, ultra-fast Wi-Fi and wired networking, premium audio, and more. While we don’t have exact pricing information at the time of this writing, the Z490 version came in just under $400, which is around where we expect the Z590 version to land, if not slightly higher.
Gigabyte’s current Z590 product stack consists of 13 models. There are familiar SKUs and a couple of new ones. Starting with the Aorus line, we have the Aorus Xtreme (and potentially a Waterforce version), Aorus Master, Aorus Ultra, and the Aorus Elite. Gigabyte brings back the Vision boards (for creators) and their familiar white shrouds. The Z590 Gaming X and a couple of boards from the budget Ultra Durable (UD) series are also listed. New for Z590 is the Pro AX board, which looks to slot somewhere in the mid-range. Gigabyte will also release the Z590 Aorus Tachyon, an overbuilt motherboard designed for extreme overclocking.
We’re not allowed to list any performance metrics for Rocket Lake (not that we have a CPU at this time) as the embargo wasn’t up when we wrote this article. All we’ve seen at this point are rumors and a claim from Intel of a significant increase to IPC, but the core count was lowered from 10 cores/20 threads in Comet Lake (i9-10900K) to 8 cores/16 threads in the yet-to-be-released i9-11900K. To that end, we’ll stick with specifications and features, adding a full review that includes benchmarking, overclocking and power consumption shortly.
The Z590 Aorus Master looks the part of a premium motherboard, with brushed-aluminum shrouds covering the PCIe/M.2/chipset area. The VRM heatsink and its NanoCarbon Fin-Array II provide a nice contrast against the smooth finish on the board’s bottom. Along with Wi-Fi 6E integration, it also includes an Aquantia-based 10 GbE port, while most other boards use 2.5 GbE. The Aorus Master includes a premium Realtek ALC1220 audio solution with an integrated DAC, three M.2 sockets, reinforced PCIe and memory slots and 10 total USB ports, including a rear USB 3.2 Gen2x2 Type-C port. We’ll cover those features and much more in detail below. But first, here are the full specs from Gigabyte.
Specifications – Gigabyte Z590 Aorus Master
Socket
LGA 1200
Chipset
Z590
Form Factor
ATX
Voltage Regulator
19 Phase (18+1, 90A MOSFETs)
Video Ports
(1) DisplayPort v1.2
USB Ports
(1) USB 3.2 Gen 2×2, Type-C (20 Gbps)
(5) USB 3.2 Gen 2, Type-A (10 Gbps)
(4) USB 3.2 Gen 1, Type-A (5 Gbps)
Network Jacks
(1) 10 GbE
Audio Jacks
(5) Analog + SPDIF
Legacy Ports/Jacks
✗
Other Ports/Jack
✗
PCIe x16
(2) v4.0 (x16/x0 or x8/x8)
(1) v3.0 x4
PCIe x8
✗
PCIe x4
✗
PCIe x1
✗
CrossFire/SLI
AMD Quad-GPU CrossFire and 2-Way CrossFire
DIMM slots
(4) DDR4 5000+, 128GB Capacity
M.2 slots
(1) PCIe 4.0 x4 (up to 110mm)
(2) PCIe 3.0 x4 + SATA (up to 110mm)
U.2 Ports
✗
SATA Ports
(6) SATA3 6 Gbps (RAID 0, 1, 5 and 10)
USB Headers
(1) USB v3.2 Gen 2 (Front Panel Type-C)
(2) USB v3.2 Gen 1
(2) USB v2.0
Fan/Pump Headers
(10) 4-Pin
RGB Headers
(2) aRGB (3-pin)
(2) RGB (4-pin)
Legacy Interfaces
✗
Other Interfaces
FP-Audio, TPM
Diagnostics Panel
Yes, 2-character debug LED, and 4-LED ‘Status LED’ display
Opening up the retail packaging, along with the board, you’re greeted by a slew of included accessories. The Aorus Master contains the basics (guides, driver CD, SATA cables) and a few other things that make this board a complete package. Below is a full list of all included accessories.
Installation Guide
User’s Manual
G-connector
Sticker sheet / Aorus badge
Wi-Fi Antenna
(4) SATA cables
(3) Screws for M.2 sockets
(2) Temperature probes
Microphone
RGB extension cable
After taking the Z590 Aorus Master out of the box, its weight was immediately apparent, with the shrouds, heatsinks and backplate making up the majority of that weight. The board sports a matte-black PCB, with black and grey shrouds covering the PCIe/M.2 area and two VRM heatsinks with fins connected by a heatpipe. The chipset heatsink has the Aorus Eagle branding lit up, while the rear IO shroud arches over the left VRM bank with more RGB LED lighting. The Gigabyte RGB Fusion 2.0 application handles RGB control. Overall, the Aorus Master has a premium appearance and shouldn’t have much issue fitting in with most build themes.
Looking at the board’s top half, we’ll first focus on the VRM heatsinks. They’re physically small compared to those on most boards, but don’t let that fool you. The fin array uses a louvered stacked-fin design that Gigabyte says increases surface area by 300% and improves thermal efficiency through better airflow and heat exchange. An 8mm heat pipe also connects the two heatsinks to share the load. Additionally, a small fan located under the rear IO shroud actively keeps the VRMs cool. The fan here wasn’t loud, but it was undoubtedly audible at default settings.
We saw a similar configuration in the previous generation, which worked out well with an i9-10900K, so it should do well with the Rocket Lake flagship, too. We’ve already seen reports indicating the i9-11900K has a similar power profile to its predecessor. Feeding power to the VRMs are two reinforced 8-pin EPS connectors (one required).
To the right of the socket, things start to get busy. There are four reinforced DRAM slots supporting up to 128GB of RAM. Oddly, the specifications list support only up to DDR4 3200, the platform’s official limit, yet further down the same webpage Gigabyte lists DDR4 5000. It’s an odd way to present it, though it does set the expectation that anything above DDR4 3200 is overclocking and not guaranteed to work.
Above the DRAM slots are eight voltage read points covering various relevant voltages. This includes read points for the CPU Vcore, VccSA, VccIO, DRAM, and a few others. When you’re pushing the limits and using sub-ambient cooling methods, knowing exactly what voltage the component is getting (software can be inaccurate) is quite helpful.
Along the top edge are four of the board’s 10 fan headers (a fifth sits next to the EPS connectors). According to the manual, all fan and pump headers support 2A/24W each, so you shouldn’t have any issues powering fans and a water-cooling pump. Gigabyte doesn’t say whether these headers auto-sense DC or PWM control, but both a PWM- and a DC-controlled fan worked without intervention when set to ‘auto’ in the BIOS.
The first two of four RGB LED headers sit to the right of the fan headers. The Z590 Aorus Master includes two 3-pin ARGB headers and two 4-pin RGB headers. Since this board takes a minimal approach to RGB lighting, you’ll need to use these to add more bling to your rig.
We find the power button and 2-character debug LED for troubleshooting POST issues on the right edge. Below is a reinforced 24-pin ATX connector for power to the board, another fan header and a 2-pin temperature probe header. Just below all of that are two USB 3.2 Gen1 headers and a single USB 3.2 Gen2x2 Type-C front-panel header for additional USB ports.
Gigabyte chose a 19-phase setup (18 Vcore, one SOC) on the power delivery front. Power is managed by an Intersil ISL6929 buck controller that handles up to 12 discrete channels, feeding ISL6617A phase doublers and the 90A ISL99390B MOSFETs. With 18 phases dedicated to the CPU, that’s a whopping 1,620A available – one of the more robust VRMs we’ve seen in this segment. You won’t have any trouble running any compatible CPU, even with sub-ambient overclocking.
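As a quick sanity check on that headroom figure, the per-phase ratings multiply out as follows (a minimal sketch; the 18+1 phase split and 90A stage rating come from Gigabyte’s spec table above):

```python
# Rough VRM headroom arithmetic for the Z590 Aorus Master.
# 18 of the 19 phases feed Vcore; each ISL99390B power stage is rated at 90A.
VCORE_PHASES = 18
AMPS_PER_STAGE = 90

total_vcore_amps = VCORE_PHASES * AMPS_PER_STAGE
print(total_vcore_amps)  # 1620 – the ~1,620A figure quoted above
```

Note this is a theoretical ceiling: real sustained output depends on cooling and transient behavior, which is why VRM temperature testing still matters.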
The bottom half of the board is mostly covered in shrouds hiding all the unsightly but necessary bits. On the far left side, under the shrouds, you’ll find the Realtek ALC1220-VB codec along with an ESS Sabre 9118 DAC and audiophile-grade WIMA and Nichicon Fine Gold capacitors. With the premium audio codec and DAC, an overwhelming majority of users will find the audio perfectly acceptable.
We’ll find the PCIe slots and M.2 sockets in the middle of the board. There are three full-length PCIe slots (all reinforced). The first and second slots are wired for PCIe 4.0, with the primary (top) slot running at x16 and the second at up to x8. Gigabyte says this configuration supports AMD Quad-GPU CrossFire and 2-Way CrossFire; we didn’t see any mention of SLI support, even though the lane count allows it. The bottom full-length slot is fed from the chipset and runs at PCIe 3.0 x4 speeds. Since the board does without x1 slots, this is the only expansion slot available if you’re using a triple-slot video card. Anything smaller lets you use the second slot as well.
Hidden under the shrouds around the PCIe slots are three M.2 sockets. Unique to this setup is the Aorus M.2 Thermal Guard II, which uses a double-sided heatsink design to help cool M.2 SSD devices with double-sided flash. With these devices’ capacities rising and more using flash on both sides, this is a good value-add.
The top socket (M2A_CPU) supports PCIe 4.0 x4 devices up to 110mm long. The second and third sockets, M2P_SB and M2M_SB, support both SATA and PCIe 3.0 x4 modules up to 110mm long. When using a SATA-based SSD in M2P_SB, SATA port 1 is disabled. When M2M_SB (the bottom socket) is in use, SATA ports 4 and 5 are disabled.
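The lane-sharing rules above can be summarized in a small lookup. This is only a sketch of the behavior described here (the socket labels are Gigabyte’s; the simplification that M2P_SB always blocks port 1 ignores that it only does so with a SATA-mode SSD):

```python
# SATA ports disabled when each M.2 socket is populated, per the rules above.
# Simplified: M2P_SB only blocks port 1 when a SATA-based SSD is installed.
DISABLED_SATA_PORTS = {
    "M2A_CPU": [],      # CPU-attached PCIe 4.0 x4, shares nothing with SATA
    "M2P_SB": [1],      # SATA-mode SSD here disables SATA port 1
    "M2M_SB": [4, 5],   # any drive here disables SATA ports 4 and 5
}

def usable_sata_ports(populated_sockets):
    """Return the SATA ports (1-6) still usable given populated M.2 sockets."""
    blocked = {p for s in populated_sockets for p in DISABLED_SATA_PORTS[s]}
    return [p for p in range(1, 7) if p not in blocked]

print(usable_sata_ports(["M2A_CPU", "M2M_SB"]))  # [1, 2, 3, 6]
```

The practical takeaway: fill M2A_CPU first if you want to keep all six SATA ports available.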
To the right of the PCIe area is the chipset heatsink, with the Aorus falcon lit up with RGB LEDs from below. There are a total of six SATA ports supporting RAID 0, 1, 5 and 10. Sitting on the right edge are two Thunderbolt headers (5-pin and 3-pin) to connect to a Gigabyte Thunderbolt add-in card. Finally, in the bottom-right corner is the Status LED display. The four LEDs, labeled CPU, DRAM, BOOT and VGA, light up during the POST process. If something hangs during that time, the LED for the problem area stays lit. This is good to have, even with the debug LED at the top of the board.
Across the board’s bottom are several headers, including more USB ports, fan headers and more. Below is the full list, from left to right:
Front-panel audio
BIOS switch
Dual/Single BIOS switch
ARGB header
RGB header
TPM header
(2) USB 2.0 headers
Noise sensor header
Reset button
(3) Fan headers
Front panel header
Clear CMOS button
The Z590 Aorus Master comes with a pre-installed rear IO panel full of ports and buttons. To start, there are a total of 10 USB ports out back, which should be plenty for most users. You have a USB 3.2 Gen2x2 Type-C port, five USB 3.2 Gen2 Type-A ports and four USB 3.2 Gen1 Type-A ports. There is a single DisplayPort output for those who would like to use the CPU’s integrated graphics. The audio stack consists of five gold-plated analog jacks and a SPDIF out. On the networking side is the Aquantia 10 GbE port and the Wi-Fi antenna. Last but not least is a Clear CMOS button and a Q-Flash button, the latter designed for flashing the BIOS without a CPU.
Firmware
The Z590 Aorus Master BIOS theme doesn’t look any different from the Z490 versions. The Aorus board still uses the black and orange theme we’re familiar with. We’ve captured a majority of the BIOS screens to share with you. Like other board partners, Gigabyte includes an Easy Mode for high-level monitoring and adjustments, along with an Advanced section. The BIOS is well organized, with many of the more commonly used functions easily accessible without drilling down multiple levels to find them. In the end, the BIOS works well and is easy to navigate and read.
Software
Gigabyte includes a few applications designed for various functions, including RGB lighting control, audio, system monitoring, and overclocking. Below, we’ve captured several screenshots of the App Center, @BIOS, SIV, RGB Fusion and Easy Tune.
Future Tests and Final Thoughts
With the release of Z590, we’re in a bit of a pickle in that we have boards in our hands, but not the Rocket Lake CPU designed for it. We know most of these boards should perform similarly to our previous Z490 motherboard reviews. And while there are exceptions, they are mostly at the bottom of the product stack. To that end, we’re posting these as detailed previews until we get data using a Rocket Lake processor.
Once we receive a Rocket Lake CPU, and as soon as any embargoes have expired, we’ll fill in the data points, including benchmarking/performance results as well as overclocking, power and VRM temperatures.
We’ll also be updating our test system hardware to include a PCIe 4.0 video card and storage, so we can utilize the platform to its fullest using the fastest protocols it supports. We will also update to the latest Windows 10 64-bit OS (20H2) with all threat mitigations applied, and update to the newest video card driver when we start this testing. We use the latest non-beta motherboard BIOS available to the public unless otherwise noted.
While we do not have performance results from the yet-to-be-released Rocket Lake CPU, we’re sure the 90A VRMs will handle the i9-11900K processor without issue. We quickly tested the i9-10900K and found the board quite capable with that CPU, easily allowing the 5.2 GHz overclock we set. For now, we’ll focus on features, price, and appearance until we gather performance data from the new CPU.
The Gigabyte Z590 Aorus Master is a well-rounded solution, bringing a lot of premium features to the table. Baked into the chipset is USB 3.2 Gen2x2 support, and on the networking side, the 10 GbE port and Intel’s Wi-Fi 6E AX210 card are basically the best you can get out of the box. The 90A 18-phase VRM had no issues with an overclocked Comet Lake CPU, so the new Rocket Lake CPUs at the same TDP shouldn’t be a problem either. This board can even be used for sub-ambient overclocking (though the Z590 Aorus Tachyon is Gigabyte’s purpose-built board for that).
Since Z590 adds native PCIe 4.0 support (with Rocket Lake CPUs only) and additional PCIe lanes, we’ll see more boards with up to three M.2 sockets, just like the less-expensive Steel Legend has. The Aorus Master sports one PCIe 4.0 x4 (64 Gbps) socket and two PCIe 3.0 x4 (32 Gbps) sockets. Add the six SATA ports, and nearly everyone’s storage needs should be covered. The 10 USB ports on the rear IO include a USB 3.2 Gen2x2 Type-C port and should be plenty for most users.
If I had to pick something that needs improvement, I’d like to see more expansion slots; with a triple-slot video card installed, only one other slot remains usable. The $400-plus price tag will also likely put off budget users. While Gigabyte hasn’t listed an exact price for the Aorus Master, the Z490 version came in at just under $400, and we expect the Z590 version to land at that point or a little higher.
Compared to similarly priced peers (think ASRock Z590 Taichi, MSI MEG Z590 Unify and Asus ROG Strix Z590-E Gaming WiFi), the Gigabyte Z590 Aorus Master covers all the bases. If you prefer the latest audio codec and four M.2 sockets instead of three, the Asus Z590-E Gaming has you taken care of. If you need ultra-fast networking, Gigabyte has you covered with its 10 GbE. All of these boards are certainly capable and pack quite a few features at this price point, so it comes down to price, appearance and the features you need.
In the end, the Gigabyte Z590 Aorus Master is, like most Z590 motherboards, an iterative update over its Z490 predecessor. You get Rocket Lake support out of the box, superior power delivery, ultra-fast networking and a premium appearance. If you’re looking for a Z590 motherboard around the $400 price point, the Z590 Aorus Master should be on your short list. Stay tuned for benchmarking, overclocking and power results using the new Rocket Lake CPU.
Mechanical keyboard enthusiasts can spend hundreds of dollars on boards with limited-run backplates, obscure switches and bespoke keycaps. Some high-end pre-built models, including the best gaming keyboards and best wireless keyboards, can also pass the $100 mark pretty quickly. But that doesn’t mean the joys of mechanical switches are exclusive to the deep-pocketed – there are some budget mechanical keyboards worth buying.
Before we get to our picks for the best budget mechanical keyboards, though, here are a few tips for shopping around.
Choose your form factor. When looking for the best budget mechanical keyboard, many will be either full-sized with a number pad or tenkeyless (TKL) without one. Those who spend a lot of time working in spreadsheets probably can’t imagine using a keyboard without a number pad, but those who are more likely to spend their time gaming might appreciate the extra space afforded by a TKL design. There are also 60% keyboards that omit the arrow and navigational keys, giving you the most desk space but appealing to a smaller crowd, due to the more limited functionality.
Choose your switch type. Mechanical switches aren’t all created equal. Different switches feature varying actuation points, travel distances and types of feedback. There are three main categories of mechanical switches you should know:
Linear switches are easy to depress because there’s no tactile bump along the way to bottoming out. Many gamers prefer linear switches because they’re easy to press repeatedly quickly and tend to be quiet. These are often Red or Black.
Tactile switches feature a noticeable bump that offers clear feedback before bottoming out and, in many cases, increase the required actuation force. Many typists prefer tactile keyboards because they make it easier to feel each keypress. Common examples include Brown and Clear switches.
Clicky switches are tactile switches but also make noise when they hit the bump in the keypress. These are preferred by people who A) work alone or B) want to subject everyone around them to a cacophony of click-clacks for some reason. These are often Blue, Green or White.
There’s a wide range of options within each category, but these categories cover the vast majority of mechanical switches. Manufacturers usually break their switches down along these lines, and they’re typically color-coded as well.
Consider a switch tester. The wide variety of mechanical switches available can make buying a keyboard seem daunting. Although it’s possible to change mechanical keyboard switches, it’s a notable hassle. Luckily, there are many switch testers on the market that make it easier to experiment with a variety of switches–the exact mix depends on the tester–before committing to a specific one. It’s an added up-front cost but cheaper than replacing a keyboard that features switches you don’t like.
Best Budget Mechanical Keyboard You Can Buy Today
1. Cooler Master CK552
Best Budget Mechanical Keyboard
Switches: Gateron Red, Blue or Brown | Backlight: Per-key RGB | Type: Full-sized | Size: 18.1 x 5.3 x 1.6 inches (460 x 135 x 41mm) | Weight: 1.9 pounds (850g)
Beautiful RGB backlighting
Solid aluminum top plate
Variety of available switch types
Red switches can be easy to mis-press
The Cooler Master CK552 is the best budget mechanical keyboard for most. It’s a full-sized gaming keyboard with RGB backlighting, an aluminum top plate, and a 5.9-foot (1.8m) USB 2.0 cable. It includes a good selection of switch types: Gateron Brown (tactile), Blue (clicky) and Red (linear), which are all said to be able to withstand up to 50 million key presses without failure.
My review unit came equipped with Gateron Red switches that proved responsive during gaming–so much so that I occasionally pressed a key I didn’t mean to. That also means it stumbled a bit during heavy typing sessions. Choosing a different switch type could help, and Cooler Master makes the keyboard with tactile Brown or clicky Blue switches too, but they aren’t as easy to find online as of writing.
Good for gamers and enthusiasts, the CK552 features on-board memory that can store up to four profiles and on-the-fly controls that make it easy to record macros and control the backlighting. It’s also compatible with the Cooler Master Portal utility, which offers more granular controls over many of the same areas but isn’t as robust as rivals, such as Razer Synapse. Still, this dual approach should appeal to people who don’t like to install a bunch of software and those who want greater control over their peripherals.
2. HyperX Alloy FPS Pro
Best Budget TKL Mechanical Keyboard for Gaming
Switches: Cherry MX Red or Blue | Backlight: Red | Type: Tenkeyless | Size: 14.1 x 5.1 x 1.4 inches (358 x 130 x 36mm) | Weight: 1.8 pounds (816g)
Compact shape, detachable cable make for easy transport
Cherry MX switches are slightly better than rivals
Solid steel construction should withstand heated gaming sessions
Red-only backlighting
Keycaps show noticeable shine after limited use
The HyperX Alloy FPS Pro (currently available for $70) was made with esports in mind, making it one of the best budget mechanical keyboards for gamers. It features a compact TKL design, a detachable USB cable and solid steel construction that should allow it to travel well. Those features alone would allow it to stand out from other keyboards on this list, but HyperX didn’t stop at the Alloy FPS Pro’s portability.
The Alloy FPS Pro is available with Cherry MX-branded linear Red or clicky Blue switches, and that also helps it stand out from other budget mechanical keyboards. There’s nothing wrong with most manufacturers’ switches, but Cherry’s are still seen as the best of the best – at least in the mainstream consumer market. (Don’t worry, enthusiasts, we wouldn’t besmirch your Zealios or the new Panda switches from the Glorious PC Gaming Race.)
HyperX also decked out the Alloy FPS Pro with n-key rollover and 100% anti-ghosting as well as red backlighting with a variety of effects. All of these features combined led to some of the best and most comfortable gaming sessions I had during the course of preparing this round-up. The Alloy FPS Pro was responsive, fit perfectly with the rest of my setup and felt like the natural choice for gaming.
It would be nice to see RGB backlighting and some dedicated media keys, but from a pure gaming standpoint, it’s hard to beat the Alloy FPS Pro at this price point.
3. Logitech K840 Mechanical
Best Budget Mechanical Keyboard for Productivity
Switches: Romer-G | Backlight: None | Type: Full-sized | Size: 17.5 x 5.2 x 1.4 inches (445 x 132 x 34.3mm) | Weight: 2 pounds (910g)
Comfortable Romer-G switches rated for 70 million clicks
Minimalist design suitable for any desk
Quieter than most keyboards with tactile mechanical switches
No backlighting
On the larger, heavier side
The Logitech K840 Mechanical keyboard features the company’s proprietary Romer-G mechanical switches, which have proven fairly divisive among consumers. Some people appreciate the switch’s tactile feel, which is quite distinct from other tactile switches, while others think it feels a little “mushy.” The switches have 3.2mm total travel, actuating at 1.5mm with 45 grams of force, compared to the more traditional Cherry MX Brown’s 4mm total travel, 2mm actuation point and 55 grams of force. I’ve used Romer-G switches almost daily for the last several years, and it’s safe to say that I’m in the former camp.
Little else about the K840 should prove divisive. The keyboard features an anodized aluminum top plate, which adds durability and a more premium look, plus a 5.9-foot (1.8m) braided cable. Flip-out feet make it easy to adjust the typing angle. Logitech also threw in 26-key rollover, which should eliminate registration errors, along with customizable function keys, and the keyboard is compatible with the Logitech Options configuration software on Windows and macOS.
Unfortunately, the K840 doesn’t feature any backlighting, which will disappoint anyone who wants to click-clack late into the night, but that’s the only true disappointment here. Plus, the lack of RGB, combined with its muted design, makes it fitting for the workplace. With a numpad, navigational keys and quiet tactile switches, there’s no excuse not to get work done. This is a solidly built keyboard that I found comfortable to type on, even though it’s a bit larger than what I’m used to.
The Logitech K840 Mechanical also works for gaming in a pinch, although I found it too wide to use comfortably with my large mousepad. Someone with a smaller mousepad–or, perhaps, just wider shoulders–might be okay. Everyone else would probably be better served with a more svelte option.
4. Aukey KM-G14
Best Budget TKL Mechanical Keyboard for Typing
Switches: Outemu Blue | Backlight: RGB | Type: Tenkeyless | Size: 14.1 x 5.4 x 1.4 inches (357 x 138 x 36mm) | Weight: 2.3 pounds (1.1kg)
Satisfying–but not deafening–clicky switches
Ergonomic design well-suited to touch typing
RGB backlighting
Still a bit loud for intense gameplay
Heavy for a TKL board
Not as durable as other options
The Aukey KM-G14 was the only clicky keyboard I tested that didn’t make me want to “accidentally” spill something on top of it. The clicks are still pronounced, sure, but they’re closer to the pleasant pinging end of the spectrum than the “unbearable clacking” end. Pretty much everything else about the keyboard was also a welcome surprise, given its price.
Aukey equipped the KM-G14 with RGB backlighting and full n-key rollover. It also used double-shot ABS keycaps, which is surprising given how cheap this mechanical keyboard is. Still, the KM-G14 comes with a keycap puller to make it easier to swap out the keycaps with something a little snazzier, which was a nice touch and something I wish some other manufacturers had thought to include with their more-unfortunate-looking keycaps.
While this is marketed as a gaming keyboard, I’m recommending it for typing, simply because the clicks proved to be distracting to me and my teammates while I played games like Valorant and Call of Duty: Modern Warfare. There are no tactile or linear switch options here that would be less distracting. People who prefer single-player games–or who don’t mind griefing their teammates every time they press a key–will find a fine gaming keyboard here as well.
5. Havit KB487L
Best Budget Mechanical Keyboard for Mixed Use
Switches: Outemu Red | Backlight: None | Type: Tenkeyless | Size: 15.7 x 7 x 1.9 inches (398 x 177 x 48mm) | Weight: 2.3 pounds (1kg)
Attractive design with distinctive keycaps
Unique layout that bridges the TKL and full form factors
Responsive linear switches that are great for gaming
Lack of switch options
Unique layout won’t be for everyone
The Havit KB487L doesn’t fit neatly into any other category, but it’s such an interesting keyboard that we had to include it here. It’s a standard TKL shape, but instead of having the usual cluster of shortcut keys along the right-hand side, it has a number pad. This leads to a ‘have your cake and eat it too’ design that offers the space-saving advantages of a TKL keyboard but doesn’t actually consign spreadsheet lovers to using the number row. I didn’t notice the difference during everyday use–I rarely use either the shortcut cluster or the number pad–but it’s almost certain to throw off anyone who’s used to a more traditional layout.
Havit also equipped the KB487L with durable PBT keycaps that felt nicer than any of the other keycaps I poked, prodded and pressed in the course of preparing this round-up. They also boast a unique black, white and orange color scheme that allows the KB487L to stand out among the sea of monochromatic keyboards currently available. This doesn’t look or feel like a budget mechanical keyboard.
Luckily, the KB487L’s beauty is more than skin deep. I didn’t notice any mis-pressed keys throughout multiple days of playing Valorant or Counter-Strike: Global Offensive, and those games make it pretty easy to tell when something is off with the keyboard, namely by making what should have clearly been a headshot fly off somewhere between the lost cities of Atlantis and Narnia. It still wasn’t my preferred experience, but I liked it more than most linear-switch-equipped options.
Corellium, a software company specializing in virtualization solutions, has managed to port Linux to an Apple M1-based PC and even succeeded in making almost all the system peripherals work. In the process, Corellium discovered several interesting details about Apple’s M1 processor and the system architecture.
A couple of weeks ago, we reported that a startup called Corellium had managed to run Linux on an Apple M1-based computer. Back then, the operating system booted, but it lacked support for many components, leaving the machine largely unusable. The company has now managed to get most things (including Wi-Fi) working, which means Linux can genuinely be used on the latest Macs. The whole project of running a non-Apple OS on these computers also has an interesting side effect: it reveals how different Apple’s SoCs are from other Arm-based designs.
Loads of Proprietary Technologies
It’s no secret that Apple has focused on building its own Arm-based microarchitectures to offer unbeatable performance with its iPhones and iPads for quite a while now. Unlike its rivals, the company did not throw in more cores, instead improving its cores’ single-core/single-thread performance. In addition to custom cores, Apple apparently uses a highly custom system architecture too, according to Corellium.
When virtually all 64-bit Arm-based systems boot up, they call firmware through an interface called PSCI; on the M1, however, the CPU cores simply start at an address specified by an MMIO register and then begin running the kernel. Furthermore, Apple systems use a proprietary Apple Interrupt Controller (AIC) that is not compatible with Arm’s standard designs. Meanwhile, the timer interrupts are connected to the FIQ, an obscure architectural feature primarily used on 32-bit Arm systems and not supported by Linux.
To make various processors in an M1-powered PC interact with each other, the OS has to provide a set of inter-processor interrupts (IPIs). Previously IPIs were handled just like traditional IRQs using MMIO accesses to the AIC, but in the case of the M1, Apple uses processor core registers to dispatch and acknowledge IPIs, as they rely on FIQs.
Apple’s oddities do not end there. For example, Apple’s Wi-Fi/Bluetooth controller connects to the SoC using a non-standard PCIe-based protocol (which fortunately was supported by Corellium’s virtualization software). To make matters more complicated, Apple’s PCIe and the integrated Synopsys DWC3 USB controller use the company’s proprietary input–output memory management unit (IOMMU), called a device address resolution table (DART). Furthermore, Apple’s I2C has custom firmware that uses an exclusive protocol, which initially prevented use of the USB Type-A ports.
Complications
Using a proprietary system architecture is not new for Apple, but it will make it much harder to port other operating systems to its platforms, as well as to run those OSes under virtualization. Recently a developer managed to make Microsoft’s upcoming Windows 10X run on an Apple M1-based system using QEMU virtualization, but this OS is not yet final, and it is unclear how stable it is. Furthermore, Windows 10X does not run Win32 apps, making it less valuable for some users.
Running Windows 10 or Linux on an Apple Mac may not be crucially important for most Mac owners. But a complicated system architecture featuring multiple proprietary technologies will likely make it harder to develop certain kinds of software and hardware for Arm-based Macs.
Mass Effect’s Legendary Edition remaster just got its release date set for May 14th, and the community is hard at work pulling every detail out of EA that the studio will let out into the wild. Among those uncovered details are the trilogy’s system requirements, and suffice it to say, it’s the break our systems need.
EA published the following system requirements:
Mass Effect Legendary Edition Minimum PC Requirements:
Operating System: 64-bit Windows 10
CPU: Intel Core i5 3570 or AMD FX-8350
Memory: 8 GB System Memory
GPU: NVIDIA GTX 760, AMD Radeon HD 7970 / R9 280X
GPU Memory: 2 GB Video Memory
Storage: At least 120 GB of free space
Mass Effect Legendary Edition Recommended PC Requirements:
Operating System: 64-bit Windows 10
CPU: Intel Core i7-7700 or AMD Ryzen 7 3700X
Memory: 16 GB System Memory
GPU: NVIDIA GTX 1070, AMD Radeon Vega 56
GPU Memory: 4 GB Video Memory
Storage: At least 120 GB of free space
These requirements aren’t steep, which is much appreciated at a time when games are becoming increasingly taxing on our systems and it’s nearly impossible to get your hands on a powerful graphics card for any reasonable amount of money.
Although nobody gets away with less than 120GB of free space, a good experience can be had with modest graphics cards and old CPUs. Better hardware will, of course, help you make the most out of the unlocked framerate, though. EA is upping the textures to be 4K ready, and 21:9 support is also being added for fans of ultrawide displays.
Of course, none of this is all too surprising. The studio decided that Mass Effect Legendary Edition was best remastered on the Unreal Engine 3 the original games were built on; using UE4 would have required a full remake rather than a polish-up, which would have been far too big a task. As a result, it’s Mass Effect 1 that will benefit most from the remastering process.
Meanwhile, although the intention was for all DLC to be included with the Legendary Edition trilogy, Mass Effect 1’s Pinnacle Station DLC won’t make the cut. The reason here is simple: the original source code wasn’t backed up properly and is now corrupted, and remaking the DLC isn’t within the scope of work EA was able to put into the project.
Intel’s 12th-Gen Alder Lake chip will bring the company’s hybrid architecture, which combines a mix of larger high-performance cores paired with smaller high-efficiency cores, to desktop x86 PCs for the first time. That represents a massive strategic shift as Intel looks to regain the uncontested performance lead against AMD’s Ryzen 5000 series processors. AMD’s Zen 3 architecture has taken the lead in our Best CPUs and CPU Benchmarks hierarchy, partly on the strength of its higher core counts. That’s not to mention Apple’s M1 processors that feature a similar hybrid design and come with explosive performance improvements of their own.
Intel’s Alder Lake brings disruptive new architectures and reportedly supports features like PCIe 5.0 and DDR5 that leapfrog AMD and Apple in connectivity technology, but the new chips come with significant risks. It all starts with a new way of thinking, at least as far as x86 chips are concerned, of pairing high-performance and high-efficiency cores within a single chip. That well-traveled design philosophy powers billions of Arm chips, often referred to as big.LITTLE (Intel calls its implementation Big-Bigger), but it’s a first for x86 desktop PCs.
Intel has confirmed that its Golden Cove architecture powers Alder Lake’s ‘big’ high-performance cores, while the ‘small’ Atom efficiency cores come with the Gracemont architecture, making for a dizzying number of possible processor configurations. Intel will etch the cores on its 10nm Enhanced SuperFin process, marking the company’s first truly new node for the desktop since 14nm debuted six long years ago.
As with the launch of any new processor, Intel has a lot riding on Alder Lake. However, the move to a hybrid architecture is unquestionably riskier than prior technology transitions because it requires operating system and software optimizations to achieve maximum performance and efficiency. It’s unclear how unoptimized code will impact performance.
In either case, Intel is going all-in: Intel will reunify its desktop and mobile lines with Alder Lake, and we could even see the design come to the company’s high-end desktop (HEDT) lineup.
Intel might have a few tricks up its sleeve, though. Intel paved the way for hybrid x86 designs with its Lakefield chips, the first such chips to come to market, and established a beachhead in terms of both Windows and software support. Lakefield really wasn’t a performance stunner, though, due to a focus on lower-end mobile devices where power efficiency is key. In contrast, Intel says it will tune Alder Lake for high-performance, a must for desktop PCs and high-end notebooks. There are also signs that some models will come with only the big cores active, which should perform exceedingly well in gaming.
Meanwhile, Apple’s potent M1 processors with their Arm-based design have brought a step function improvement in both performance and power consumption over competing x86 chips. Much of that success comes from Arm’s long-standing support for hybrid architectures and the requisite software optimizations. Comparatively, Intel’s efforts to enable the same tightly-knit level of support are still in the opening stages.
Potent adversaries challenge Intel on both sides. Apple’s M1 processors have set a high bar for hybrid designs, outperforming all other processors in their class with the promise of more powerful designs to come. Meanwhile, AMD’s Ryzen 5000 chips have taken the lead in every metric that matters over Intel’s aging Skylake derivatives.
Intel certainly needs a come-from-behind design to thoroughly unseat its competitors, turning the tables back in its favor like the Conroe chips did back in 2006, when the Core architecture debuted with a ~40% performance advantage that cemented Intel’s dominance for a decade. Intel’s Raja Koduri has already likened the transition to Alder Lake to the debut of Core, suggesting that Alder Lake could indeed be a Conroe-esque moment.
In the meantime, Intel’s Rocket Lake will arrive later this month, and all signs point to the new chips overtaking AMD in single-threaded performance. However, they’ll still trail in multi-core workloads due to Rocket Lake’s maximum of eight cores, while AMD has 16-core models for the mainstream desktop. That makes Alder Lake exceedingly important as Intel looks to regain its performance lead in the desktop PC and laptop markets.
While Intel hasn’t shared many of the details on the new chip, plenty of unofficial details have come to light over the last few months, giving us a broad indication of Intel’s vision for the future. Let’s dive in.
Intel’s 12th-Gen Alder Lake At a Glance
Qualification and production in the second half of 2021
Hybrid x86 design with a mix of big and small cores (Golden Cove/Gracemont)
10nm Enhanced SuperFin process
LGA1700 socket requires new motherboards
PCIe 5.0 and DDR5 support rumored
Four variants: -S for desktop PCs, -P for mobile, -M for low-power devices, -L Atom replacement
Gen12 Xe integrated graphics
New hardware-guided operating system scheduler tuned for high performance
Intel Alder Lake Release Date
Intel hasn’t given a specific date for Alder Lake’s debut, but it has said that the chips will be validated for production for desktop PCs and notebooks with the volume production ramp beginning in the second half of the year. That means the first salvo of chips could land in late 2021, though it might also end up being early 2022. Given the slew of benchmark submissions and operating system patches we’ve seen, early silicon is obviously already in the hands of OEMs and various ecosystem partners.
Intel and its partners also have plenty of incentive to get the new platform and CPUs out as soon as possible, and we could have a similar situation to 2015’s short-lived Broadwell desktop CPUs that were almost immediately replaced by Skylake. Rocket Lake seems competitive on performance, but the existing Comet Lake chips (e.g. the i9-10900K) already use a lot of power, and the i9-11900K doesn’t look to change that. With Enhanced SuperFin, Intel could dramatically cut power requirements while improving performance.
Intel Alder Lake Specifications and Families
Intel hasn’t released the official specifications of the Alder Lake processors, but a recent update to the SiSoft Sandra benchmark software, along with listings to the open-source Coreboot (a lightweight motherboard firmware option), have given us plenty of clues to work with.
The Coreboot listing outlines various combinations of the big and little cores in different chip models, with some models even using only the larger cores (possibly for high-performance gaming models). The information suggests four configurations with -S, -P, and -M designators, and an -L variant has also emerged:
Alder Lake-S: Desktop PCs
Alder Lake-P: High-performance notebooks
Alder Lake-M: Low-power devices
Alder Lake-L: Listed as “Small Core” Processors (Atom)
Intel Alder Lake-S Desktop PC Specifications
Alder Lake-S*

Big + Small Cores | Cores / Threads | GPU
8 + 8 | 16 / 24 | GT1 – Gen12 32EU
8 + 6 | 14 / 22 | GT1 – Gen12 32EU
8 + 4 | 12 / 20 | GT1 – Gen12 32EU
8 + 2 | 10 / 18 | GT1 – Gen12 32EU
8 + 0 | 8 / 16 | GT1 – Gen12 32EU
6 + 8 | 14 / 20 | GT1 – Gen12 32EU
6 + 6 | 12 / 18 | GT1 – Gen12 32EU
6 + 4 | 10 / 16 | GT1 – Gen12 32EU
6 + 2 | 8 / 14 | GT1 – Gen12 32EU
6 + 0 | 6 / 12 | GT1 – Gen12 32EU
4 + 0 | 4 / 8 | GT1 – Gen12 32EU
2 + 0 | 2 / 4 | GT1 – Gen12 32EU
*Intel has not officially confirmed these configurations. Not all models may come to market. Listings assume all models have Hyper-Threading enabled on the large cores.
Intel’s 10nm Alder Lake combines large Golden Cove cores that support Hyper-Threading (Intel’s branded version of SMT, simultaneous multi-threading, which allows two threads to run on a single core) with smaller single-threaded Atom cores. That means some models could come with seemingly odd distributions of cores and threads. We’ll jump into the process technology a bit later.
As we can see above, a potential flagship model would come with eight Hyper-Threading enabled ‘big’ cores and eight single-threaded ‘small’ cores, for a total of 24 threads. Logically we could expect the 8 + 8 configuration to fall into the Core i9 classification, while 8 + 4 could land as Core i7, and 6 + 8 and 4 + 0 could fall into Core i5 and i3 families, respectively. Naturally, it’s impossible to know how Intel will carve up its product stack due to the completely new paradigm of the hybrid x86 design.
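Given that scheme, the core/thread arithmetic is simple: big cores contribute two threads each and small cores one. Here is a minimal sketch that reproduces the rumored Alder Lake-S counts (the function name and config list are illustrative, not from Intel):

```python
# Core/thread arithmetic for a hybrid CPU, assuming SMT (2 threads per core)
# on the big cores only and single-threaded small cores, per the leaked
# Coreboot listings. All names here are illustrative.

def core_thread_count(big, small):
    """Return (total cores, total threads) for a big + small configuration."""
    cores = big + small
    threads = big * 2 + small  # big cores run 2 threads each, small cores 1
    return cores, threads

rumored_s_configs = [(8, 8), (8, 6), (8, 4), (8, 2), (8, 0), (6, 8)]
for big, small in rumored_s_configs:
    cores, threads = core_thread_count(big, small)
    print(f"{big} + {small} -> {cores} cores / {threads} threads")
```

Running this over the leaked configurations reproduces the table above, e.g. 8 + 8 yields 16 cores / 24 threads.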
We’re still quite far from knowing particular model names, as recent submissions to public-facing benchmark databases list the chips as “Intel Corporation Alder Lake Client Platform” but use ‘0000’ identifier strings in place of the model name and number. This indicates the silicon is still in the early phases of testing, and newer steppings will eventually progress to production-class processors with identifiable model names.
Given that these engineering samples (ES) chips are still in the qualification stage, we can expect drastic alterations to clock rates and overall performance as Intel dials in the silicon. It’s best to use the test submissions for general information only, as they rarely represent final performance.
The 16-core desktop model has been spotted in benchmarks with a 1.8 GHz base and 4.0 GHz boost clock speed, but we can expect that to increase in the future. For example, a 14-core 20-thread Alder Lake-P model was recently spotted at 4.7 GHz. We would expect clock rates to be even higher for the desktop models, possibly even reaching or exceeding 5.0 GHz on the ‘big’ cores due to a higher thermal budget.
Meanwhile, it’s widely thought that the smaller efficiency cores will come with lower clock rates, but current benchmarks and utilities don’t enumerate the second set of cores with a separate frequency domain, meaning we’ll have to wait for proper software support before we can learn clock rates for the efficiency cores.
We do know from Coreboot patches that Alder Lake-S supports two eight-lane PCIe 5.0 connections and two four-lane PCIe 4.0 connections, for a total of 24 lanes. Conversely, Alder Lake-P dials back connectivity due to its more mobile-centric nature and has a single eight-lane PCIe 5.0 connection along with two four-lane PCIe 4.0 interfaces. There have also been concrete signs of support for DDR5 memory. There are some caveats, though, which you can read about in the motherboard section.
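As a quick sanity check on those lane totals (this is just arithmetic over the Coreboot descriptions, not official Intel documentation):

```python
# CPU-attached PCIe lane totals as described in the Coreboot patches.
# Each entry is (generation, lanes per link, link count); variable names
# are illustrative.

def total_lanes(links):
    return sum(lanes * count for _gen, lanes, count in links)

alder_lake_s = [("5.0", 8, 2), ("4.0", 4, 2)]  # two x8 Gen5 + two x4 Gen4
alder_lake_p = [("5.0", 8, 1), ("4.0", 4, 2)]  # one x8 Gen5 + two x4 Gen4

print(total_lanes(alder_lake_s))  # 24 lanes for the desktop chip
print(total_lanes(alder_lake_p))  # 16 lanes for the mobile chip
```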
Intel Alder Lake-P and Alder Lake-M Mobile Processor Specifications
Alder Lake-P* / Alder Lake-M*

Big + Small Cores | Cores / Threads | GPU
6 + 8 | 14 / 20 | GT2 Gen12 96EU
6 + 4 | 10 / 14 | GT2 Gen12 96EU
4 + 8 | 12 / 16 | GT2 Gen12 96EU
2 + 8 | 10 / 12 | GT2 Gen12 96EU
2 + 4 | 6 / 8 | GT2 Gen12 96EU
2 + 0 | 2 / 4 | GT2 Gen12 96EU
*Intel has not officially confirmed these configurations. Not all models may come to market. Listings assume all models have Hyper-Threading enabled on the large cores.
The Alder Lake-P processors are listed as laptop chips, so we’ll probably see those debut in a wide range of notebooks that range from thin-and-light form factors up to high-end gaming notebooks. As you’ll notice above, all of these processors purportedly come armed with Intel’s Gen 12 Xe architecture in a GT2 configuration, imparting 96 EUs across the range of chips. That’s a doubling of execution units over the desktop chips and could indicate a focus on reducing the need for discrete graphics chips.
There is precious little information available for the -M variants, but they’re thought to be destined for lower-power devices and serve as a replacement for Lakefield chips. We do know from recent patches that Alder Lake-M comes with reduced I/O support, which we’ll cover below.
Finally, an Alder Lake-L version has been added to the Linux kernel, classifying the chips as ‘”Small Core” Processors (Atom),’ but we haven’t seen other mentions of this configuration elsewhere.
Intel Alder Lake 600-Series Motherboards, LGA 1700 Socket, DDR5 and PCIe 5.0
Intel’s incessant motherboard upgrades, which require new sockets or restrict support within existing sockets, have earned the company plenty of criticism from the enthusiast community – especially given AMD’s long line of AM4-compatible processors. That trend will continue with a new requirement for the LGA 1700 socket and the 600-series chipset for Alder Lake. Still, if rumors hold true, Intel will stick with the new socket for at least the next generation of processors (7nm Meteor Lake) and possibly for an additional generation beyond that, rivaling AMD’s AM4 longevity.
Last year, an Intel document revealed an LGA 1700 interposer for its Alder Lake-S test platform, confirming that the rumored socket will likely house the new chips. Months later, an image surfaced at VideoCardz, showing an Alder Lake-S chip and the 37.5 x 45.0mm socket dimensions. That’s noticeably larger than the current-gen LGA 1200’s 37.5 x 37.5mm.
Because the LGA 1700 socket is bigger than the sockets used in current LGA 1151/LGA 1200 motherboards, existing coolers will be incompatible, but we expect that cooler conversion kits could accommodate the larger socket. Naturally, the larger socket is needed to accommodate 500 more pins than the LGA 1200 socket. Those pins support newer interfaces, like PCIe 5.0 and DDR5, among other purposes, like power delivery.
PCIe 5.0 and DDR5 support are both listed in patch notes, possibly giving Intel a connectivity advantage over competing chips, but there are a lot of considerations involved with these big technology transitions. As we saw with the move from PCIe 3.0 to 4.0, a step up to a faster PCIe interface requires thicker motherboards (more layers) to accommodate wider lane spacing, more robust materials, and retimers due to stricter trace length requirements. All of these factors conspire to increase cost.
We recently spoke with Microchip, which develops PCIe 5.0 switches, and the company tells us that, as a general statement, we can expect those same PCIe 4.0 requirements to become more arduous for motherboards with a PCIe 5.0 interface, particularly because they will require retimers for even shorter lane lengths and even thicker motherboards. That means we could see yet another jump in motherboard pricing over what the industry already absorbed with the move to PCIe 4.0. Additionally, PCIe 5.0 also consumes more power, which will present challenges in mobile form factors.
Both Microchip and the PCI-SIG standards body tell us that PCIe 5.0 adoption is expected to come to the high-performance server market and workstations first, largely because of the increased cost and power consumption. That isn’t a good fit for consumer devices considering the slim performance advantages in lighter workloads. That means that while Alder Lake may support PCIe 5.0, it’s possible that we could see the first implementations run at standard PCIe 4.0 signaling rates.
Intel took a similar tactic with its Tiger Lake processors – while the chips’ internal pathways are designed to accommodate the increased throughput of the DDR5 interface via a dual ring bus, they came to market with DDR4 memory controllers, with the option of swapping in new DDR5 controllers in the future. We could see a similar approach with PCIe 5.0, with the first devices using existing PCIe 4.0 controller tech, or the PCIe 5.0 controllers merely defaulting to PCIe 4.0 rates.
Benchmarks have surfaced indicating that Alder Lake supports DDR5 memory but, as with the PCIe 5.0 interface, it remains to be seen whether Intel will enable it on the leading wave of processors. Notably, every transition to a newer memory interface has resulted in higher up-front DIMM pricing, which is concerning in the price-sensitive desktop PC market.
DDR5 is in the opening stages; some vendors, like Adata, TeamGroup, and Micron, have already begun shipping modules. The inaugural modules are expected to run in the DDR5-4800 to DDR5-6400 range. The JEDEC spec tops out at DDR5-8400, but as with DDR4, it will take some time before we see those peak speeds. Notably, several of these vendors have reported that they don’t expect the transition to DDR5 to happen until early 2022.
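For context on those speed grades: the DDR5-XXXX designation encodes megatransfers per second, and a standard 64-bit channel moves 8 bytes per transfer, so peak per-channel bandwidth is a simple multiplication. A back-of-the-envelope sketch (not a vendor spec):

```python
# Peak theoretical bandwidth of one 64-bit DDR channel:
# (megatransfers/s) x (8 bytes per transfer) / 1000 = GB/s (decimal).

def peak_bandwidth_gbps(mt_per_s, bus_bytes=8):
    """Peak channel bandwidth in GB/s for a given DDR speed grade."""
    return mt_per_s * bus_bytes / 1000

for grade in (4800, 6400, 8400):
    print(f"DDR5-{grade}: {peak_bandwidth_gbps(grade):.1f} GB/s per channel")
```

So DDR5-4800 works out to 38.4 GB/s per channel, and the JEDEC ceiling of DDR5-8400 to 67.2 GB/s.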
While the details are hazy around the separation of the Alder Lake-S, -P, -M, and -L variants, some details have emerged about the I/O allocations via Coreboot patches:
 | Alder Lake-P | Alder Lake-M | Alder Lake-S
CPU PCIe | One PCIe 5.0 x8 / Two PCIe 4.0 x4 | Unknown | Two PCIe 5.0 x8 / Two PCIe 4.0 x4
PCH | ADP_P | ADP_M | ADP_S
PCH PCIe Ports | 12 | 10 | 28
SATA Ports | 6 | 3 | 6
We don’t have any information for the Alder Lake-L configuration, so it remains shrouded in mystery. However, as we can see above, the PCIe, PCH, and SATA allocations vary by the model, based on the target market. Notably, the Alder Lake-P configuration is destined for mobile devices.
Intel 12th-Gen Alder Lake Xe LP Integrated Graphics
A series of Geekbench test submissions have given us a rough outline of the graphics accommodations for a few of the Alder Lake chips. Recent Linux patches indicate the chips feature the same Gen12 Xe LP architecture as Tiger Lake, though there is a distinct possibility of a change to the sub-architecture (12.1, 12.2, etc.). Also, there are listings for a GT0.5 configuration in Intel’s media driver, but that is a new paradigm in Intel’s naming convention so we aren’t sure of the details yet.
The Alder Lake-S processors come armed with 32 EUs (256 shaders) in a GT1 configuration, and the iGPU on early samples runs at 1.5 GHz. We’ve also seen Alder Lake-P benchmarks with the GT2 configuration, which means they come with 96 EUs (768 shaders). The early Xe LP iGPU silicon on the -P model runs at 1.15 GHz, but as with all engineering samples, that could change with shipping models.
Alder Lake’s integrated GPUs support up to five display outputs (eDP, dual HDMI, and Dual DP++), and support the same encoding/decoding features as both Rocket Lake and Tiger Lake, including AV1 8-bit and 10-bit decode, 12-bit VP9, and 12-bit HEVC.
Intel Alder Lake CPU Architecture and 10nm Enhanced SuperFin Process
Intel pioneered the x86 hybrid architecture with its Lakefield chips, with those inaugural models coming with one Sunny Cove core paired with four Atom Tremont cores.
Compared to Lakefield, both the high- and low-performance Alder Lake-S cores take a step forward to newer microarchitectures. Alder Lake-S actually jumps forward two ‘Cove’ generations compared to the ‘big’ Sunny Cove cores found in Lakefield. The big Golden Cove cores come with increased single-threaded performance, AI performance, Network and 5G performance, and improved security features compared to the Willow Cove cores that debuted with Tiger Lake.
Alder Lake’s smaller Gracemont cores jump forward a single Atom generation and offer the benefit of being more power and area efficient (perf/mm^2) than the larger Golden Cove cores. Gracemont also comes with increased vector performance, a nod to an obvious addition of some level of AVX support (likely AVX2). Intel also lists improved single-threaded performance for the Gracemont cores.
It’s unclear whether Intel will use its Foveros 3D packaging for the chips. This 3D chip-stacking technique reduces the footprint of the chip package, as seen with the Lakefield chips. However, given the large LGA 1700 socket, that type of packaging seems unlikely for the desktop PC variants. We could see some Alder Lake-P, -M, or -L chips employ Foveros packaging, but that remains to be seen.
Lakefield served as a proving ground not only for Intel’s 3D Foveros packaging tech but also for the software and operating system ecosystem. At its Architecture Day, Intel outlined the performance gains above for the Lakefield chips to highlight the promise of hybrid design. Still, the results come with an important caveat: These types of performance improvements are only available through both hardware and operating system optimizations.
Due to the use of both faster and slower cores that are both optimized for different voltage/frequency profiles, unlocking the maximum performance and efficiency requires the operating system and applications to have an awareness of the chip topology to ensure workloads (threads) land in the correct core based upon the type of application.
For instance, if a latency-sensitive workload like web browsing lands in a slower core, performance will suffer. Likewise, if a background task is scheduled into the fast core, some of the potential power efficiency gains are lost. There’s already work underway in both Windows and various applications to support that technique via a hardware-guided OS scheduler.
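A topology-aware scheduler of the sort described might, in a very rough sketch, classify work and steer it between core types. Everything below (the function name, the QoS labels, the task list) is illustrative, not the actual Windows scheduler or Intel interface:

```python
# Toy model of topology-aware scheduling on a hybrid CPU: latency-sensitive
# "foreground" work lands on big cores, background work on small cores.
# Real schedulers use hardware feedback, not static labels like these.

def assign_core(task_qos):
    """Pick a core class for a task based on a coarse QoS hint."""
    return "big" if task_qos == "foreground" else "small"

tasks = {
    "web_browser_ui": "foreground",  # latency-sensitive -> big core
    "video_call": "foreground",
    "virus_scan": "background",      # throughput/efficiency -> small core
    "file_indexer": "background",
}

placement = {name: assign_core(qos) for name, qos in tasks.items()}
print(placement)
```

The point of the hardware-guided approach is precisely that these hints come from the silicon itself rather than from application labels.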
Intel’s current Lakefield format relies upon both core types supporting the same instruction set. Alder Lake’s larger Golden Cove cores support AVX-512, but it appears those instructions will be disabled to accommodate the fact that the Atom Gracemont cores do not support them. One notable caveat: any SKUs that come with only big cores might still support the instructions.
Intel Chief Architect Raja Koduri mentioned that a new “next-generation” hardware-guided OS scheduler that’s optimized for performance would debut with Alder Lake, but didn’t provide further details. This next-gen OS scheduler could add in support for targeting cores with specific instruction sets to support a split implementation, but that remains to be seen.
Intel fabs Alder Lake on its Enhanced 10nm SuperFin process. This is the second-generation of Intel’s SuperFin process, which you can learn more about in our deep-dive coverage.
Intel says the first 10nm SuperFin process provides the largest intra-node performance improvement in the company’s history, unlocking higher frequencies and lower power consumption than the first version of its 10nm node. Intel says the net effect is the same amount of performance uplift that the company would normally expect from a whole series of intra-node “+” revisions, but in just one shot.
The 10nm SuperFin transistors have what Intel calls breakthrough technology that includes a new thin barrier that reduces interconnect resistance by 30%, improved gate pitch so the transistor can drive higher current, and enhanced source/drain elements that lower resistance and improve strain. Intel also added a Super MIM capacitor that drives a 5X increase in capacitance, reducing vDroop. That’s important, particularly to avoid localized brownouts during heavy vectorized workloads and also to maintain higher clock speeds.
During its Architecture Day, Intel teased the next-gen variant of SuperFin, dubbed ’10nm Enhanced SuperFin,’ saying that this new process was tweaked to increase interconnect and general performance, particularly for data center parts (technically, this is 10nm+++, but we won’t quibble over an arguably clearer naming convention). This is the process used for Alder Lake, but unfortunately, Intel’s descriptions were vague, so we’ll have to wait to learn more.
We know that the 16-core models come armed with 30MB of L3 cache, while the 14-core / 20-thread chip has 24MB of L3 cache and 2.5MB of L2 cache. However, it is unclear how this cache is partitioned between the two types of cores, which leaves many questions unanswered.
Alder Lake also supports new instructions, like Architectural LBRs, HLAT, and SERIALIZE commands, which you can read more about here. Alder Lake also purportedly supports AVX2 VNNI, which “replicates existing AVX512 computational SP (FP32) instructions using FP16 instead of FP32 for ~2X performance gain.” This rapid math support could be part of Intel’s solution for the lack of AVX-512 support for chips with both big and small cores, but it hasn’t been officially confirmed.
Intel 12th-Generation Alder Lake Price
Intel’s Alder Lake is at least ten months away, so pricing is the wild card. Intel has boosted its 10nm production capacity tremendously over the course of 2020 and hasn’t suffered any recent shortages of its 10nm processors. That means that Intel should have enough production capacity to keep costs within reasonable expectations, but predicting Intel’s 10nm supply simply isn’t reasonable given the complete lack of substantive information on the matter.
However, Intel has proven with its Comet Lake, Ice Lake, and Cooper Lake processors that it is willing to lose margin in order to preserve its market share, and surprisingly, Intel’s recent price adjustments have given Comet Lake a solid value proposition compared to AMD’s Ryzen 5000 chips.
We can only hope that trend continues, but if Alder Lake brings forth both PCIe 5.0 and DDR5 support as expected, we could be looking at exceptionally pricey memory and motherboard accommodations.
I have used a heck of a lot of laptops in the past year, and some of them are quite nice. MacBooks have nailed the “premium” look and feel for years, and I’ll never waste an opportunity to gush about the build quality of Dell’s XPS line.
But I’ve never touched a consumer laptop as gorgeous as the Spectre x360 14. The new Spectre’s sturdy black body, lustrous accents, and boldly sharp edges would make it a standout among convertible laptops across the board, even if it didn’t have a slew of other excellent qualities — which, from its 3:2 screen and packaged stylus to its stellar performance and battery life, it absolutely does.
With a starting MSRP of $1,299.99 ($1,589.99 as tested), the Spectre x360 is easily my new favorite 2-in-1 laptop. Today’s market is full of capable convertibles that look good and do certain things really well. But while the Spectre x360 14 isn’t a perfect laptop, it tops the pack in almost every area. It combines a stylish chassis, premium panel options, stylus support, a powerful processor, and fantastic battery life in one package. It’s proof that you can have it all — for a price.
The HP Spectre line is second to none when it comes to design, and this latest model is no exception. Like its 13-inch predecessor, the Spectre x360 14 is made of CNC-machined aluminum. Also like its siblings, you can get the 14 in “nightfall black,” “Poseidon blue,” or “natural silver.” Take a look at some pictures before selecting your color because they each have pretty different vibes. The nightfall black option has a sophisticated, svelte aesthetic that looks tailor-made for a boardroom. Poseidon blue is friendlier and probably the one I’d go for myself.
The accents, though, are what make the Spectre stand out from the legions of other black laptops out there. Lustrous trim borders the lid, the touchpad, and the deck. The hinges share its color, as does the HP logo on its lid. It’s bold without being obnoxious. The two rear corners are diamond-shaped, and one of them houses a Thunderbolt 4 port on its flat edge. (On the sides live an audio jack, a USB-A, a microSD slot, and an additional Thunderbolt 4, which is a decent selection — gone is the trapdoor that covered the USB-A port on the 13-inch model.) And the edges are all beveled, making the notebook appear thinner than it actually is (it’s 0.67 inches thick). Careful craftsmanship is evident here — I’m not exaggerating when I say this Spectre feels like artwork.
And, as the “x360” moniker implies, the Spectre is a 2-in-1. At 2.95 pounds, it’s a bit heavy to use as a tablet for long periods, but it’s smooth and easy to fold and the hinges are quite sturdy. Unlike with many convertibles, there’s barely any wobble when you use the touchscreen. The display is also stylus-compatible; the Spectre ships with HP’s MPP2.0 pen, which attaches magnetically to the side of the chassis.
Despite its design similarities, this Spectre looks noticeably different from its ancestors, and that’s because of the screen. The new model has a 3:2 display, which is 13 percent taller than the 16:9 panel on last year’s device. (It’s kept the same 90 percent screen-to-body ratio.)
Microsoft’s Surface devices have been using the 3:2 aspect ratio for years, and I’m glad that the Spectre line is finally making the switch. If you’re used to using a 16:9 display (which many modern Windows laptops have) and you give a 3:2 a shot, you’ll see what I mean. You have significantly more vertical space, which means less scrolling up and down and less zooming out to fit everything you want to see. It makes multitasking significantly easier without adding much size to the chassis.
This 3:2 panel can come in a few different forms. My test unit has an FHD option that HP says should reach 400 nits of brightness. I measured it multiple times, but it only reached 285 in my testing — which is dimmer than I’d hope to see from a device at this price point. I’ve reached out to HP to see what’s up and will update this review if it turns out to be a bug. (Of course, 285 nits is still more than enough for indoor office work.)
In addition to the FHD display, you can opt for a 3000 x 2000 OLED panel (HP didn’t provide a brightness estimate for this one; LaptopMag measured it at 339 nits) or a 1,000-nit option with HP’s Sure View Reflect technology, which makes the screen difficult to read from the sides. This will mostly be a benefit for business users.
In terms of other specs, the base model pairs the 400-nit screen with a Core i5-1135G7, 8GB of memory, and 256GB of storage (plus 16GB of Intel Optane). Then, there are a few upgrades you can go for. My test unit, priced at $1,589.99, keeps the base model’s screen but has a heftier processor (the quad-core Core i7-1165G7) and double its RAM and storage. I think this model is a good option for most people — it gets you a top processor and a good amount of storage without too stratospheric of a price tag. If you want to get fancier, you can get the OLED screen and 1TB of storage (plus 32GB of Intel Optane) for $1,699, or the Sure View screen and 2TB of storage for $1,959.99.
Of course, laptops aren’t just for looking at, but you’re not compromising on performance to get this build quality. The Spectre is verified through Intel’s Evo platform, which means that it offers a number of Intel-selected benefits including Thunderbolt 4, Wi-Fi 6, all-day battery life, quick boot time, fast charging, and reliable performance. In my testing, it more than surpassed those standards.
The system handled my heavy workload of Chrome tabs, downloads, and streams speedily with no issues. Battery life was excellent; I averaged 10 hours of continuous use with the screen around 200 nits of brightness. That means if your daily tasks are similar to mine, the Spectre should make it through your workday with no problem. (You’ll likely get less if you opt for the OLED panel.) The processor also includes Intel’s Iris Xe integrated graphics. While you wouldn’t want to use those for serious gaming, they’re capable of running lighter fare.
Elsewhere, I have almost no complaints. The backlit keyboard is snappy with a solid click — it’s easily one of my favorites. The speakers sound good, with very audible bass and percussion. There’s a fingerprint sensor to the left of the arrow keys and a Windows Hello camera, neither of which gave me any trouble.
Apart from the dimness, there are only two things about this laptop that I’m not in love with. They’re both minor; the fact that I’m even mentioning either of them in this review is a testament to how excellent this device is.
The first is the touchpad. It’s quite smooth and roomy (16.6 percent larger than that of last year’s Spectre x360 13) and handles scrolling and gestures just fine. But it’s noticeably stiffer than some of the best touchpads on the market. The press required to physically click is firm enough that I ended up doing it with my thumb most of the time. On the likes of the Dell XPS 13 and the MacBook, clicking with a finger is much less of a chore. When I first clicked with the integrated buttons, I also had to overcome some initial resistance to hit the actuation point (put plainly, every click felt like two clicks). This issue resolved itself during my second day of testing, but it’s still a hiccup I generally only see with cheaper items.
Secondly, bloatware. There are a number of junk programs preloaded onto the Spectre and several pinned to the taskbar. Dropbox, ExpressVPN, McAfee, and Netflix are all on here, and I got all kinds of notifications from them. This is an oddity at this price point, and seeing cheap McAfee alerts popping up on the Spectre is like seeing really ugly bumper stickers on a Ferrari. This software doesn’t take too long to uninstall, but I’m disappointed to see it nonetheless.
But those are really the only two complaints I have, and neither of them should stop you from buying this laptop. It’s beautiful to look at and a dream to use. I found myself using it in my free time instead of my personal device (which almost never happens with review units — I really like my products).
When we’re evaluating a convertible laptop at the Spectre’s price point, the big question is how it compares to the gold standard of Windows convertibles, the Dell XPS 13 2-in-1. The XPS has a few advantages: it’s a bit thinner and lighter, its touchpad is less stiff, and it has a more modest look that some users might prefer.
But for me, the ball game is close but clear. The Spectre x360’s meticulous craftsmanship, classy aesthetic, and 3:2 screen put it over the top. It also edges out the XPS in a few key areas: the keyboard is more comfortable, the battery life is better, and Dell’s closest-priced configuration to this unit only has half its storage. The Spectre’s smaller amenities that the XPS lacks — like the bundled stylus, the USB-A port, the blue color, and the OLED option — are icing on the cake.
If you’re looking for a premium Windows convertible with a classy aesthetic, that makes the Spectre a no-brainer purchase. This is HP at its best; it’s a luxury laptop in pretty much every area. I can’t imagine that it won’t be the next laptop I buy.
The $4 Raspberry Pi Pico just keeps on giving! The latest project as reported by Hackaday sees the team behind the popular ArduCAM create a library enabling the Raspberry Pi Pico to work with an SPI camera for video streaming and person detection.
As you have probably guessed, ArduCAM is the name for a range of cameras compatible with Arduino and now the Raspberry Pi Pico. In the demo video we can see the team using an ArduCAM 2MP Plus, a camera with an OV2640 2MP image sensor which has built in support for JPEG. ArduCAM does most of the heavy lifting, reducing the workload of the Raspberry Pi Pico.
A host application, shown running on a Windows 10 computer, indicates that the continuous video streaming demo runs at 320 x 240 and connects over a USB serial port via a USB-to-TTL adaptor such as the CH340 or CP2102. The demo is compatible with Windows, Mac and Linux devices, including the Raspberry Pi.
The ArduCAM team have provided a GitHub repository with full instructions on how to recreate two demo applications: a simple video streaming test, and a demo with basic person detection. If you haven't got the time, or just want to try out the demos for yourself, then ArduCAM also provide the demos as UF2 files, which can be flashed to the Raspberry Pi Pico in seconds. Right now the demos are built in C, but we spotted that ArduCAM are working on a port to MicroPython.
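If you're curious how a host application can pull video frames off a serial link like this, the core trick is that the OV2640 emits ready-made JPEG images, which always start with the SOI marker (FF D8) and end with the EOI marker (FF D9). A host script just has to scan the incoming byte stream for those markers. Below is a minimal, hypothetical sketch of that frame-splitting step — it is not ArduCAM's actual protocol (their GitHub repository defines that), and the function name is ours:

```python
def extract_jpeg_frames(buffer: bytes):
    """Split a raw byte stream into complete JPEG frames.

    JPEG images begin with the SOI marker (FF D8) and end with the
    EOI marker (FF D9). Returns (frames, leftover), where leftover
    is any incomplete frame still waiting for more serial data.
    """
    SOI, EOI = b"\xff\xd8", b"\xff\xd9"
    frames = []
    while True:
        start = buffer.find(SOI)
        if start < 0:
            # No frame start in the remaining data; discard it.
            return frames, b""
        end = buffer.find(EOI, start + 2)
        if end < 0:
            # Frame started but hasn't finished arriving yet.
            return frames, buffer[start:]
        frames.append(buffer[start:end + 2])
        buffer = buffer[end + 2:]
```

In a real host app you would feed this function from a serial port (e.g. pyserial reading the CH340/CP2102 adaptor) and hand each completed frame to a JPEG decoder for display.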
For more projects and to learn more about the Raspberry Pi Pico, take a look at our Everything You Need to Know About Raspberry Pi Pico.
Epson’s EH-LS300 UST projector is practical for everyday use, with some limitations, as well as for truly immersive bigscreen movie nights.
For
Powerful bright image
Free-to-air TV catch-up services
Convenient way to get a really large picture
Against
No Netflix or Prime support
No Ethernet
1080i/50 and 576i/50 issues
Sound+Image mag review
This review originally appeared in Sound+Image magazine, one of What Hi-Fi?’s Australian sister publications. Click here for more information on Sound+Image, including digital editions and details on how you can subscribe.
Ultra-short-throw projectors are going great guns. Even mainstream companies not previously or not recently involved in projection (LG, Samsung, Hisense) have realised that a projector sitting on a bench can deliver a large-screen ‘TV-like’ experience with a convenience that a conventional projector cannot when it has to be pushed back in the room or hung on a ceiling.
As a consequence the ultra-short-throw is rapidly evolving. Since it’s going to work like a TV, shouldn’t it have speakers? Shouldn’t it be smart, like a TV? Yes it should. So here comes Epson, a company which declares itself the world no.1 in projection (on the reasonable basis it has been declared so by Futuresource Consulting for the last 17 consecutive years), with models for just this space. The ‘4K Pro UHD’ EH-LS500 arrived first, and now this EH-LS300, which offers Full-HD resolution of 1920 x 1080. With such underlying expertise, but a lower resolution, does it deliver the promised new age of UST?
Build
Ultra-short-throw projectors use a combination of lenses and mirrors to cast the picture up onto a screen almost immediately above them. The Epson EpiqVision EH-LS300B is a fairly compact specimen at 467mm wide and 400mm deep. Inside there are three 15.5mm LCD projector panels using Epson’s C2 Fine technology, and a laser diode. That kind of light engine works by firing the laser into some phosphor, which then produces a bright white light.
The use of this kind of light engine means several good things. Firstly, long life. The projector is rated at 20,000 hours of lamp life. Secondly, the projector turns on fast – 5.5 seconds when in standby, according to our stopwatch, or 6.5 seconds if it has been disconnected from power. It goes off fast as well.
The laser's light level can also be dynamically controlled to help darken dark scenes. Epson says that the dynamic contrast ratio is as much as 2.5 million-to-1 (on the datasheet) or 1.5 million-to-1 (in the US manual we found online).
There are two HDMI inputs, a USB socket for playing back video content from a USB stick, and Wi-Fi for the smart TV stuff. Surprisingly, there’s no Ethernet connection – we’d prefer that option rather than having to rely on Wi-Fi in our somewhat RF-congested modern environments. But it is dual-band Wi-Fi – 2.4GHz and 5GHz – supporting the 802.11ac standard, so it may well outperform the usually-installed 100Mbps Ethernet of many devices. (Our 100Mbps test clip was pretty choppy… but we do have an extremely busy Wi-Fi environment.)
Sound
A nice touch: built-in sound that is quite good. As always, we believe that the sound of your system should match the scale of the video. Since the Epson EpiqVision EH-LS300B delivers big vision, a decent external sound system should be in order. But absent that, we were quite impressed with the audio built into the projector. Epson relied on the audio expertise of Yamaha for this: a 2.1-channel forwards-firing audio system with 20W of power, better than any actual TV that we’ve used – and we’ve used plenty. We also checked the Audio Return Channel capability via HDMI to a connected home theatre receiver, and it worked perfectly well.
Setting up
The projector is available on its own at £2500 ($2000, AU$4000). But in Australia you can also get it as a package with an ‘Ambient Light Rejection’ screen in 100 inches (AU$5099 package) or 120 inches (AU$5699). These employ a surface treatment which reduces the reflection of light coming from above or directly in front, increasing their contrast ratio when there’s light in the room. These screens weren’t available to us at the time of review, so we used our regular viewing screen.
Epson EH-LS300 tech specs
Projection technology: 3 x 15.5mm C2 Fine LCD panels
Resolution: 1920 x 1080 pixels
Aspect ratio: 16:9
Lamp: Laser diode
Lamp life: 20,000 hours (Normal and Quiet modes)
Brightness: 3600 lumens (both white and colour); 1800 lumens (ECO mode)
Inputs: 2 x HDMI, 1 x USB, Wi-Fi
Outputs: Optical digital audio
Control/other: Mini-USB (Service)
Dimensions (whd): 467 x 133 x 400mm
Weight: 7.2kg
In a couple of ways an ultra-short-throw projector is a little trickier to set up than a regular projector. There is no zoom lens, for example, so the size of the picture is determined entirely by the distance between the projector and the screen – far less than a conventional projector, but still significant for the largest screen sizes. For a 100-inch screen, the distance from the wall to the back of the projector (the side facing the wall) should be 26.6cm. With the largest recommended screen size of 120 inches, the distance is just 38.6cm. The projector itself is getting on for 40cm deep, so its front will end up more than 75cm from the wall – which is a lot of bench depth.
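For back-of-envelope planning, the two figures Epson quotes (26.6cm for 100 inches, 38.6cm for 120 inches) let you interpolate a rough wall-to-projector distance for sizes in between. This assumes the relationship is approximately linear over that range, which Epson's documentation should be consulted to confirm — treat this as a sketch, not a spec:

```python
def throw_distance_cm(diagonal_in: float) -> float:
    """Estimate wall-to-back-of-projector distance for a given screen size.

    Linear interpolation between Epson's two published points:
    100 in -> 26.6 cm and 120 in -> 38.6 cm.
    """
    slope = (38.6 - 26.6) / (120 - 100)  # ~0.6 cm per diagonal inch
    return 26.6 + slope * (diagonal_in - 100)
```

So a 110-inch image would need roughly 32.6cm of clearance behind the unit — before adding the projector's own ~40cm depth to work out total bench space.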
The other slightly tricky thing is that the image is cast at such an extreme angle you have to adjust everything with better-than-millimetre precision. The slightest angle to one side and you have a marked trapezoid of a picture. You really don’t want to be nudging the projector when you’re dusting.
But there is a clever adjustment system for picture geometry. There’s even a dedicated button on the remote. With this you can drag the corners of a box to make sure everything’s square. It’s kind of like a supercharged keystone correction system. But like keystone correction, it’s done digitally, which means that Full-HD input signals are no longer directly mapped onto the display pixels, so detail is lost. So it’s best avoided if you can. Get the physical placement right instead.
Last thing on physical set-up: we found the focus adjust lever quite spongy, so a bit tricky to get the focus exactly right. With perseverance we did succeed. (One of the oddities about this kind of projector: focus adjustment has basically no effect at all at the bottom of the screen, which is always in focus, and a massive effect at the top.)
As for the smart set-up, Android TV has this well under control. You just use the Google Home app on your phone – it works on iPhones as well – which talks you through with very little fuss. We had the unit connected within a couple of minutes.
Performance
It turns out that 3600 lumens allows a surprisingly viewable picture even under the full glare of our room’s fluorescent tubes! Not that we did any kind of critical viewing that way, we hasten to add. But we suspect that kids would be perfectly satisfied with afternoon cartoons served this way, especially onto an Epson directional screen. And it’s fine for the news and such. So consider this not just a home cinema device, but a (kind of) everyday TV.
Then, when night fell, we started to use it as a real home cinema projector. And we were impressed. What impressed us? The colour and the black levels. The subjective black levels were entirely satisfying. Which is to say that with all the material we viewed – including HDR content from 4K Blu-ray – the black elements of the picture seemed, well, entirely black. And yes, 4K Blu-ray, because even though the projector delivers 1080p output, it supports Ultra-HD inputs including HDR (and Dolby Vision, which it treats as HDR).
We jammed in as much night-time viewing as we could in the time available, and we must say that we found it entirely satisfying: bold, accurate colours; sharp, detailed image.
The EpiqVision quickly demonstrated that it’s not really one for interlaced video in Australia. With both 576i/50 and 1080i/50, it seemed simply to assume that all such signals were video-sourced. So it applied motion-adaptive deinterlacing, rather than checking to see if a simple weave might be better for any given bit of video. Most of the time you won’t notice this, but occasionally there’ll be some lines or grid in the picture which adversely interact with the processing and develop distracting moire patterns. The solution, as always: use a source with good quality deinterlacing.
Network streaming
When you first set the Epson up, you’re offered a default bunch of apps, including (in Australia) TV station apps such as iView, SBS On Demand and so on. There’s YouTube, Google Play Movies & TV, Disney+ and Stan, Vimeo, Spotify and Tidal and a whole lot more. Even after we’d finished the setting up, it took a little while to download and install them one by one. It was kind of fun watching them pop up in the list of apps as they were loaded. When it had finished, we went to fire up Netflix – one of the paid video services to which we subscribe. We couldn’t find it.
So we figured we’d try Google Assistant to help us find it. A press of the microphone button on the remote and the utterance of the word Netflix, and the projector announced in the familiar Google Assistant voice, ‘Here’s Netflix on the Google Play Store’. And there was, indeed, the Play Store entry for Netflix. Only problem was, at the bottom of the screen it said, ‘Your device isn’t compatible with this version.’ This was also the case for Amazon’s Prime Video.
We checked the projector’s website and, sure enough, there it was in the fine print: “Not all streaming apps are natively available on the EpiqVision EH-LS300B. An external streaming media device is required to stream some services, including Netflix. Netflix cannot be streamed using Chromecast from Android TV, iOS, Mac or Windows devices.” (We assume that by ‘Android TV’ Epson actually means Android, as in an Android phone.)
In other words, not only does this projector not support Netflix, apparently you can’t stream Netflix from your phone to the projector via Chromecast. We tried. In fact, we could kind of stream Netflix from our phone, but only in the slowest, choppiest most broken-up way. To compare, we plugged an actual Chromecast with Google TV device into one of the EpiqVision’s HDMI inputs, and found that Netflix would stream fairly smoothly via that route.
YouTube, by contrast, streamed smoothly via either route, as did iView and SBS On Demand and Stan. We did find it quite puzzling why any version of Android TV would not support Netflix and Prime, the second and fourth most popular streaming services (that’s counting YouTube as being number one).
Apparently the projector is also able to support video calls using the Epson Online Meeting app – powered by Zoom. You’ll need to plug a camera and microphone into the USB socket for that.
Verdict
Aside from the inability to play Netflix and Prime Video direct, we were impressed. The Epson EpiqVision EH-LS300B delivers a bright and impressive image to a large screen – and we would indeed recommend using a screen such as one of Epson’s, rather than a non-flat non-reflecting bit of wall, because we assure you, walls will never get the image truly flat. Plus the good quality sound and the extremely long-life light source makes this unit practical for everyday use, with some limitations, as well as for truly immersive bigscreen movie nights.
Apple’s rocky rollout of iCloud password sync on Windows has hit another bump: you can no longer download the version of the app that allowed syncing passwords from your Apple devices with your Chrome web browser on a Windows PC (via Windows Central). Version 12 of Apple’s iCloud for Windows app released last week with supposed support for the feature, but the Chrome extension to make it work wasn’t available until two days ago. Now, version 12 is gone from the Windows Store, with version 11.6 taking its place.
Weirdly, the changelog on the Windows Store still says that the password syncing functionality is present. However, upon downloading, the app shows the version as 11.6 in the top right. We verified this for ourselves, and you can see there’s no password sync option in our first screenshot below.
As The 8-Bit notes, the late-to-the-party Chrome Extension is still available on the Chrome Web Store, at least for now. Anyone still on version 11 of the iCloud for Windows app, though, won’t be able to upgrade and take advantage of the extension anyhow. But if you currently have version 12 on your computer, Windows won’t make you downgrade to the previous version.
It’s unclear why Apple pulled the update — the reviews for the Chrome extension aren’t good, but many of them are users reporting that it doesn’t work on Mac, which it’s not designed for. Whatever the reason, this has been one of the messier rollouts I’ve seen from Apple in recent memory. The story, it seems, is the same as the last time I wrote about iCloud Passwords for Windows: if you want them, you’ll have to wait a bit longer.
Apple didn’t immediately respond to a request for comment.
A speed demon that prioritizes raw performance, the Alienware m17 R4 puts plenty of pop into a sleek but bulky chassis.
For
Unrivaled performance
Snappy keyboard
Attractive design
At present, the RTX 3080 is the fastest laptop graphics card around, but not all RTX 3080-powered laptops are created equal. Many vendors use Nvidia’s Max-Q technology, which prioritizes power efficiency and low fan noise over high performance. Alienware’s m17 R4, however, seeks to pump out every possible frame, deploying a special cooling system and eschewing Max-Q to make its top-of-the-line configuration one of the best gaming laptops.
But the Alienware m17 R4 is not just a speed demon. Starting at $2,106 ($3,586 as tested), this laptop has a snappy keyboard, a sleek sci-fi inspired design with plenty of RGB and an optional 360 Hz screen. You just have to live with a heavy chassis and the occasional bout of fan noise.
Editor’s Note: The Alienware m17 R4 review unit we tested came with a 512GB boot drive and 2TB RAID 0 storage drive. While this hardware is for sale, it is normally shipped to consumers with the 2TB RAID 0 drive as boot drive.
Ports: 3x USB Type-A 3.2, 1x HDMI 2.1, 1x mini DisplayPort 1.4, 1x Thunderbolt 3, 1x microSD card reader
Camera: 1280 x 720
Battery: 86 WHr
Power Adapter: 330W
Dimensions (WxDxH): 15.74 x 11.56 x 0.87 inches
Weight: 6.6 pounds
Price (as configured): $3,586
Design of the Alienware m17 R4
The Alienware m17 R4 has the same sci-fi inspired “Legend” design as both its immediate predecessor, the m17 R3, and its sibling, the Alienware m15 R4. Available in “lunar light” (white) or “dark side of the moon” (black), the m17 R4 looks like a giant starship, rocketing through space. The body (ours was white) has a black rear end that juts out like the jet engine on the back of an imperial cruiser. The number 17 on the lid appears in a sci-fi font that you might find adorning a secret warehouse at Area 51.
There’s a honeycomb pattern for the vents on the back, above the keyboard and on the bottom surface. We can only assume that Alienware aliens live in some kind of hive where they are all doing CUDA core calculations.
And, of course, there’s lots of RGB lights to brighten the mood in outer space. The keyboard has four-zone RGB and there are customizable lights on the back edge and in the alien heads on the back of the lid and the power button.
The chassis is made from premium materials: a magnesium alloy with matte white or black paint, covered by a clear coat for extra durability. The interior uses Alienware’s cryo-tech cooling technology which has 12-phase graphics voltage regulation, 6-phase CPU voltage regulation and a CPU vapor chamber.
At 6.6 pounds and 15.74 x 11.56 x 0.87 inches, the Alienware m17 R4 is not exactly light or thin, not that you would expect that from a 17-inch laptop with a Core i9 CPU and RTX 3080 graphics. By comparison, the Gigabyte Aorus 17G (5.95 pounds, 15.9 x 10.8 x 1.0 inches) and Razer Blade Pro 17 (6.1 pounds, 15.6 x 10.2 x 0.8 inches) are both significantly lighter, though the Aorus is thicker. The Asus ROG Flow X13, which we’re also comparing to the m17, is much thinner and lighter (2.87 pounds, 11.77 x 8.74 x 0.62 inches), because it’s a 13-inch laptop that gets its RTX 3080 graphics via an external dock.
The Alienware m17 R4 has plenty of room for ports. On the right side, there are two USB 3.2 Type-A ports, along with a micro SD card reader. The left side contains a Killer RJ-45 Ethernet 2.5 Gbps port, a 3.5mm audio jack and another USB Type-A port. The back holds a Thunderbolt 3 port, a mini DisplayPort 1.4, an HDMI 2.1 connection, Alienware’s proprietary graphics amplifier port and the power connector.
Gaming Performance on the Alienware m17 R4
Sporting an Nvidia RTX 3080 GPU and an Intel Core i9-10980HK CPU, our review configuration of the Alienware m17 R4 is as fast a gaming laptop as you can get right now, thanks to Alienware’s strong cryo-tech cooling solution and the company’s willingness to include a full version of the RTX 3080 rather than the Max-Q variants found in some thinner notebooks.
When I played Cyberpunk 2077 at Ultra RTX settings, the game ranged between 61 and 72 frames per second, depending on how intense the action was at any given time. The frame rate improved to between 85 and 94 fps after I changed to Ultra settings with no RTX. In both cases, the fan noise was really loud by default. Changing the fan profile to quiet improved this somewhat while shaving only a couple of fps off, and only in intense scenes.
The Alienware m17 R4 hit a rate of 120 fps in Grand Theft Auto V at very high settings (1080p), eclipsing the Gigabyte Aorus 17G and its Max-Q-enabled RTX 3080 and Core i7-10870H CPU by 20%. The Asus ROG Flow X13, with its Ryzen 9 5980HS CPU and external RTX 3080 dock, was also a good 13% behind, while the RTX 2080 Super-powered Razer Blade Pro 17 brought up the rear.
On the very demanding Red Dead Redemption 2 at medium settings, the m17 R4 achieved an impressive rate of 79.7 fps, besting the Aorus 17G and ROG Flow X13 by more than 20%. Saddled with last year’s card, the Razer Blade Pro 17 was a full 29% behind.
Alienware’s behemoth exceeded 100 fps again in Shadow of the Tomb Raider, hitting 103 while the Aorus 17G and the ROG Flow X13 hovered in the mid 80s and 60s. On this test, surprisingly, the Razer Blade Pro 17 came close to matching the m17 R4.
Far Cry New Dawn at Ultra settings also provided a great example of the Alienware m17 R4’s dominance. It hit a full 105 fps, where its nearest competitor, the Gigabyte Aorus 17G, could only manage 92 fps, while the Asus ROG Flow X13 and Razer Blade Pro 17 were both in the 80s.
To see how well the Alienware m17 R4 performs over the long haul, we ran the Metro Exodus benchmark at RTX, the highest settings level, 15 times at 1080p. The laptop was remarkably consistent, averaging 75.6 fps with a high of 76.2 and a low of 75.4. During that time, the average CPU speed was 4.19 GHz with a peak of 5.088 GHz. By comparison, the Gigabyte Aorus 17G got an average frame rate of just 59.6 fps with an average CPU speed of 3.47 GHz, and the Asus ROG Flow X13 managed a slightly higher 65.2 fps with an average CPU speed of 3.89 GHz.
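The consistency check above boils down to simple summary statistics over repeated runs: average, best and worst frame rates. A tiny sketch of that reduction, using illustrative numbers rather than our full 15-run dataset:

```python
def summarize_fps(runs):
    """Return (average, highest, lowest) frame rate across benchmark runs."""
    return (sum(runs) / len(runs), max(runs), min(runs))

# Illustrative sample of repeated Metro Exodus runs (not our full dataset):
avg, high, low = summarize_fps([75.4, 76.2, 75.6])
```

A narrow gap between the high and low runs is what indicates the cooling solution isn't throttling the chips over sustained load.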
Productivity Performance of Alienware m17 R4
With its Core i9-10980HK CPU, 32GB of RAM and dual storage drives, which include both a 2TB RAID 0 PCIe SSD (2 x 1TB) and a 512GB SSD, and that RTX 3080, our review configuration of the Alienware m17 R4 can be a powerful work tool.
On Geekbench 5, a synthetic benchmark that measures overall performance, the m17 R4 got a single-core score of 1,318 and a multi-core score of 8,051. That's slightly ahead of the Core i7-10870H-powered Gigabyte Aorus 17G on both counts but behind the Asus ROG Flow X13 and its Ryzen 9 5980HS on single-core performance, while creaming the Razer Blade Pro 17, which we tested with a Core i7-10875H.
The storage in our review unit came misconfigured slightly, with a 512GB NVMe PCIe SSD as boot drive and a significantly faster 2TB RAID 0 drive made from two 1TB NVMe PCIe SSDs. Dell sells this hardware, but consumers receive units with the 2TB as boot and the 512GB SSD as a secondary, storage drive.
In our tests, copying about 25GB of files, the 512GB drive managed a mediocre 379.7 MBps, but the 2TB drive hit an impressive 1305.5 MBps, which beats the Aorus 17G (869 MBps), the ROG Flow X13 (779.5 MBps) and the Blade Pro 17 (925.2 MBps).
The Alienware m17 R4 took just 6 minutes and 44 seconds to transcode a 4K video to 1080p in Handbrake. That time is 21% faster than the Aorus 17G, 18% quicker than the Flow X13 and a full 29% ahead of the Blade Pro 17.
Display on Alienware m17 R4
The Alienware m17 R4 comes with a choice of three different 17-inch display panels: a 1080p panel with 144 Hz refresh rate, a 4K, 60 Hz panel and the 1080p, 360 Hz panel in our review unit. Our panel provided sharp images and accurate but mostly unexciting colors, along with smooth, tear-free gaming.
When I watched a trailer for upcoming volcano-disaster-flick Skyfire, the red-orange of lava bursts was lively and the green trees in a forest seemed true-to-life. Fine details like the wrinkles in actor Jason Isaacs’ forehead also stood out.
In a 4K nature video of a Costa Rican jungle, details like the scales on a snake and colors like the red on a parrot’s feathers were also strong, but not nearly as strong as when I viewed it on the 4K, OLED panel from the Alienware m15 R4 I tested recently. On both videos, viewing angles on the matte display were strong as colors didn’t fade even at 90 degrees to the left or right.
In Cyberpunk 2077, details like the threads on a rug or the barrel of a gun were prominent and colors like the red and yellow in the UI seemed accurate but didn’t pop.
The Alienware m17 R4’s display registered a strong 316.2 nits of brightness on our light meter, outpacing the Aorus 17G (299.6), the Razer Blade Pro 17 (304.4) and the Asus ROG Flow X13 (281.6). According to our colorimeter, the screen can reproduce a solid 80.6% of the DCI-P3 color gamut, which is about on par with the Aorus 17G and slightly behind the Razer Blade Pro 17, but miles ahead of the ROG Flow X13.
Keyboard and Touchpad on Alienware m17 R4
With a deep 1.7mm of travel, great tactile feedback and a full numeric keypad, the Alienware m17 R4 offers a fantastic typing experience. On the 10fastfingers.com typing test, I scored a strong 102 words per minute with a 3% error rate, which is a little better than my typical 95 to 100 wpm and 3 to 5% rate.
Not only does the keyboard have a full numeric keypad, but it also sports four customizable macro keys above the pad on the top row. The Alienware Command Center software allows you to set these to launch a program, enter text or play back a pre-recorded set of keystrokes when you hit them. I found programming them very unintuitive, however. Alienware Command Center also allows you to set RGB colors or lighting effects for four different zones on the keyboard.
The 3.1 x 4.1 glass touchpad, which uses Windows precision drivers, offers great navigation with just the right amount of friction. Whether I was navigating around the desktop or using multitouch gestures such as pinch-zoom or three-finger swipe, the pad was always accurate and responsive.
Audio on Alienware m17 R4
The Alienware m17 R4’s audio system outputs sound that’s loud enough to fill a mid-sized room and rich enough to dance to. When I played AC/DC’s “Back in Black” with the volume all the way up, the sound was mostly accurate, but some of the high-pitched percussion sounds were a little harsh. Earth, Wind and Fire’s bass-heavy “September” sounded great, with a clear separation of sound where instruments such as the horns section appeared to come from a different side of the notebook than, for example, the drums.
Gunshots and the sound of my NPC friend Jackie yelling at me to stay down sounded sharp and clear in Cyberpunk 2077. However, I had to turn the volume way up to compensate for the fan noise when the system was on high performance settings. Even on the “quiet” thermal setting, fan noise was quite prominent.
The preloaded Alienware Command Center app has an audio section that lets you tweak the sound settings and choose among profiles such as Music, Movie, Shooter and Role Play. I found that the default “Alienware” profile sounded about the same as the Music one, but disabling the audio enhancement definitely made the sound flatter.
Upgradeability of the Alienware m17 R4
The Alienware m17 R4 has three different M.2 SSD slots, all of which are accessible and user upgradeable. The first slot is a short 2230 length and the other two are both the standard 2280 size. Unfortunately, the RAM is soldered onto the motherboard and therefore not replaceable.
Opening the Alienware m17 R4 should be easy: there are eight Phillips-head screws on the bottom panel, some of which come out and the rest of which you just loosen. In our testing, loosening the screws was easy, but prying off the bottom panel was challenging and required several minutes with a spudger. Once the panel is off, all three SSDs are visible, but they’re covered by copper heatsinks you can easily unscrew.
Battery Life on Alienware m17 R4
Forget about using the Alienware m17 R4 without a power outlet for any length of time. The laptop lasted just 2 hours and 5 minutes on our battery test, which involves surfing the web over Wi-Fi at 150 nits of brightness. That’s awful compared to its competitors: both the Gigabyte Aorus 17G and Razer Blade Pro 17 lasted an identical 4 hours and 41 minutes. Then again, this is a 17-inch, 6.6-pound laptop, so portability isn’t a primary concern.
Heat on Alienware m17 R4
The main touchpoints on the Alienware m17 R4 stay relatively cool when you’re not gaming and remain warm but tolerable when you are. After we streamed a YouTube video for 15 minutes, the keyboard hit a reasonable 35.5 degrees Celsius (95.9 degrees Fahrenheit), the touchpad was a chilly 26.2 degrees Celsius (79.3 degrees Fahrenheit) and the underside was just 36.6 degrees Celsius (97.9 degrees Fahrenheit).
After running the Metro Exodus benchmark for 15 minutes to simulate gaming, those temperatures were obviously higher. The keyboard hit 44.4 degrees Celsius (112 degrees Fahrenheit), the touchpad measured 35 degrees Celsius (95 degrees Fahrenheit) and the bottom hit 50 degrees Celsius (122 degrees Fahrenheit).
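For reference, the Celsius/Fahrenheit pairs in this section follow the standard conversion F = C × 9/5 + 32; a quick sketch (the helper name is our own):

```python
# Standard Celsius-to-Fahrenheit conversion.
def c_to_f(celsius: float) -> float:
    return celsius * 9 / 5 + 32

# The gaming-load readings, converted:
print(c_to_f(35.0))  # touchpad: 95.0 °F
print(c_to_f(50.0))  # bottom panel: 122.0 °F
```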
When I played Cyberpunk 2077, the area around the WASD keys measured about 40 degrees Celsius (104 degrees Fahrenheit), but the key caps themselves didn’t feel uncomfortably warm to the touch. At performance settings, the fan noise was extremely loud.
Webcam on Alienware m17 R4
The Alienware m17 R4’s 720p webcam is nothing special. Even when I shot it in a well-lit room, an image of my face was filled with visual noise and fine details like the hairs in my beard were blurry while colors such as the blue in my shirt and the green on the walls were muted. You’ll get by with this built-in camera if you need to, but you’d be better off springing for one of the best webcams.
Software and Warranty on Alienware m17 R4
The Alienware m17 R4 comes preloaded with a handful of useful first-party utilities.
Alienware Mobile Connect allows you to control your Android handset or iPhone from your laptop, taking calls and texts from the desktop.
Alienware Command Center lets you control all the RGB lighting effects, set keyboard macros, tweak audio settings and even modify the performance settings and thermals to go for better performance or quieter and cooler temps. You can even change the max frequency, voltage and voltage offset for the CPU manually if you have an unlocked CPU and want to try overclocking.
As with any Windows laptop, there’s also a small amount of preloaded bloatware, including a trial of Microsoft Office, links to download Photoshop Express and Hulu, and free-to-play games like Roblox.
Alienware backs the m17 R4 with a standard one-year warranty on parts and labor that includes in-home service (after remote diagnosis). You can pay extra to extend the warranty up to five years, and you can add accidental damage protection with no deductible.
Configurations of Alienware m17 R4
When you purchase the Alienware m17 R4 from Dell.com, you can custom configure it with your choice of a Core i7 or Core i9 CPU, RTX 3070 or 3080 GPU, up to 32GB of RAM and up to 4TB of storage. You can choose white or black color options, and you can also pay extra to get per-key RGB lighting instead of the standard 4-zone lighting we tested.
You also get a choice of screens that includes 144 Hz and 360 Hz 1080p panels, along with a 4K, 60 Hz panel that promises to hit 100% of the Adobe RGB color gamut. If you value image quality over frame rates, we recommend the latter, because the color on our 360 Hz panel was OK, but not exciting.
Our review configuration of the Alienware m17 R4 currently goes for $3,586.79. For that price, you get the Core i9-10980HK, RTX 3080 graphics, the 360 Hz display, 32GB of RAM and a combination of storage drives that includes two 1TB M.2 PCIe SSDs in RAID 0 plus a separate 512GB M.2 SSD, for a total of 2.5TB of storage. Dell lists the RAID array as the boot drive in its store, but our review model came with the 512GB drive as the boot drive and the 2TB RAID array as storage, which seems odd.
Bottom Line
At this point, it’s hard to imagine someone making a gaming laptop that’s significantly more powerful than the Alienware m17 R4 we tested unless they use desktop parts. The RTX 3080 is currently the fastest mobile GPU around, especially since Alienware didn’t opt for Nvidia’s more power-efficient Max-Q technologies. Add a strong cooling system, pair it with a Core i9-10980HK, and you have performance that’s often 20% faster than competitors that also use RTX 3080s.
In addition to its strong performance, the Alienware m17 R4 offers a deep, tactile keyboard and a unique, attractive design that’s all its own. The 360 Hz screen is more than capable, but unless you’re a competitive gamer, you can go with the default screen or, better yet, go for the 4K panel which promises much richer colors.
The biggest drawbacks of this epic laptop are mostly inherent to any 17-inch laptop that turns the performance volume up to 11. It’s heavy, has short battery life and emits plenty of fan noise. It’s also quite expensive. It would be nice if, for this price, you got a better-than-awful webcam, but most laptop webcams are terrible.
If you want to save a few dollars or you need a little more battery life, consider the Gigabyte Aorus 17G, which goes for $2,699 with similar specs (but just 1TB of storage) to our Alienware m17 R4. The 17G lasts more than twice as long on a charge and weighs 0.65 pounds less than the m17, but its gaming performance isn’t as good.
If you don’t feel attached to the 17-inch form factor, consider the Alienware m15 R4, which has the same design and keyboard but is much more portable, albeit hotter. It also has an optional, 4K OLED panel which has incredibly vibrant output. However, if you want the ultimate 17-inch gaming rig right now, the Alienware m17 R4 is your best choice.
Linux might only make up a very small portion of the operating system market share (about 1-2% depending on who you ask), but it’s a dedicated minority, filled with loyalists who are willing to stick with the OS despite compatibility issues thanks to its heavy customizability and open-source nature. In other words, people who use Linux tend to be pretty versed in tech. That’s probably why Boiling Steam, a site dedicated to gaming on Linux, thought it might be illuminating to do some research into what type of hardware Linux users prefer. In the AMD vs Nvidia debate, we finally have some numbers on what the most discerning tech wizards prefer…as well as which manufacturers better serve their unique needs.
These numbers come from ProtonDB, which is a site that tracks game compatibility with Proton, Steam’s built-in solution for running games on Linux. The idea of the site is to report how well certain games run on Linux or what features might need tweaking to work properly on the OS. But because users report their system specs as part of their reports, it’s also possible to use the site to glean which components are most popular among gaming-minded Linux users.
Boiling Steam admits that its data is limited: ProtonDB is really the only reliable source here, as other options like the Steam Hardware Survey aren’t granular enough. But still, the site’s analysis (compiled from over 111,000 reports) is an interesting bird’s-eye view of the Linux gaming scene.
Before getting to the AMD vs Nvidia debate, let’s start with Boiling Steam’s CPU charts. The site’s analysis goes back 2 years, and in that time, it seems like AMD’s grown from being the CPU choice for roughly 27% of the ProtonDB user base to the CPU choice for almost 50%. Intel takes up the other 50%, of course, though Boiling Steam recognizes that AMD’s penchant for matching Intel’s single-core performance and providing more cores at a lower cost might soon tip the scales in its favor.
As for GPUs, AMD is also on the rise there, but Nvidia still holds a noticeable majority. In January of 2019, AMD was the GPU choice for 25% of the ProtonDB user base, with that percentage rising to 37.5% in January of this year. Nvidia still dominates the rest, which Boiling Steam credits to the company’s early successes in ray tracing and DLSS compared to AMD.
Perhaps most interesting, however, is how this data comes together. According to Boiling Steam, “If you own an Intel CPU, you are far, far more likely to have a[n] Nvidia GPU as well.” Meanwhile, “If you own an AMD CPU, there’s a good chance (50/50) you have an AMD GPU as well, and even more so if you use very specific [Linux] distros (Arch, Gentoo, Slack, KdeNeon, Fedora, Deepin, Void…)”
In other words, AMD fans are likely to go all-in with AMD on their systems. And, well…it seems like the geekier someone is, the more likely they are to prefer AMD.
None of this is too surprising going off general fan community conversation — you don’t have many gifs featuring Nvidia’s Jensen Huang transforming into a giant robot floating around online, but you do for AMD’s Lisa Su. Still, having concrete numbers to point to is great, plus this also shows a promising trend for AMD on Linux going forward.
“For Linux, if you were ever interested in gaming with decent performance, an Intel + Nvidia combo was pretty much required until recently,” Boiling Steam writes. “Intel, for the best single thread performance on CPUs, and Nvidia both for their excellent proprietary drivers and better hardware/pricing overall.”
However, as AMD’s made strides in recent years to provide comparable performance to Intel and Nvidia at a lower cost, that’s no longer the case, which promises a strong future for Linux users who are also AMD fans. It’s a future that’s only made stronger by AMD’s recent efforts to have its CPUs and GPUs work together using “Smart Access Memory.” So far, Smart Access Memory has maintained feature parity on Linux and Windows, which is always a strong sign of a company’s commitment to Linux users.
Going forward, then, it’s possible that AMD might steal even more of Intel and Nvidia’s Linux market share, as it both continues to improve its technology overall as well as support its unique features beyond just Windows.
For more of Boiling Steam’s analysis, and to see its charts, visit Boilingsteam.com.
What’s the first thing you think of when someone says the word ‘multi-room’?
For most, we imagine it’s a dedicated set-up from a single manufacturer such as Sonos or Bluesound, with its connected ecosystem of speakers, soundbars, soundbases and hi-fi components. Or perhaps you’d think of AirPlay 2, a gateway for an iOS source (Apple device) to stream music to multiple compatible products.
If you want to mix and match from multiple manufacturers, cherry-picking devices in order to get the best performance and fit for each room, DTS Play-Fi could be an option worth looking into. The app-controlled, wireless, multi-room platform has been licensed to several hi-fi brands and consequently sits at the functionality core of excellent streaming products such as the What Hi-Fi? Award winning Audiolab 6000N Play (pictured below) and five-star Arcam rPlay music streamers.
It claims to provide “premium wireless audio for every room of your house”, and works across a wide array of products. So, let’s take a look at what it does, how it does it and what products utilise it…
Multi-room audio: everything you need to know
What is DTS Play-Fi?
DTS Play-Fi is, at the most fundamental level, a platform and app that lets you connect and control various hi-fi devices together in order to stream audio from one to another. This can be within one room for a multi-channel set-up, or across multiple rooms in your home. Multi-room aside, it can just be used to facilitate network streaming in one standalone product.
The range of compatible devices includes portable wireless speakers, stereo systems, A/V tuners, preamps, amplifiers, music streamers and media servers – all of which can be managed on your smartphone, tablet, PC or TV via the dedicated app. DTS Play-Fi launched in 2012, and originally its app was available only on Android. Now its dedicated control app is available on iOS, Kindle Fire (the operating system on Amazon’s Fire tablets) and Windows PCs too, as well as on TVs.
On Windows, however, there are two varieties of Play-Fi app: a free version and a “Play-Fi HD” version. The latter costs $14.95 (approximately £10), which buys you a code from the DTS online store and ultimately gives you greater control over your audio as well as higher quality playback.
Best multi-room systems 2021
Best multi-room speakers 2021
How does it work?
Download the app and you should quickly see a list of available devices. Tap one to select it, then choose audio from a number of sources including Amazon Music (including Amazon Music HD), Tidal, Deezer, Spotify and Qobuz, as well as internet radio stations via iHeartRadio, SiriusXM and Radio.com. Apple Music is not available on the service at the moment.
Some DTS Play-Fi-compatible products will also support AirPlay and AirPlay 2, Apple’s simple method of streaming audio and video directly from iPhones, iPads and other devices. However, this isn’t available in every DTS Play-Fi product; it’s the manufacturer’s decision whether to implement it or not. Similarly, that’s the case with Google Chromecast and Spotify Connect.
You can configure two separate speakers into a stereo pairing – one playing the left channel audio, the other playing the right – using the app, or (if you have at least six DTS Play-Fi compatible products) create a 5.1 surround-sound system.
Those speaker groups can then be designated as ‘Zones’, which allows you to delegate music to different rooms of your house. For example, you could have a Tidal stream going to the speakers in your living room while a Deezer stream plays in the bedroom.
A new companion app, called DTS Play-Fi Headphones, also lets you stream audio from select DTS Play-Fi-connected products (soundbars, stereo amps and speakers) to a pair of headphones over wi-fi. DTS claims the wi-fi connection is better than Bluetooth headphones (which can introduce latency issues) and its AV synchronisation technology means there shouldn’t be any syncing issues between picture and audio when watching TV.
To use it, you connect your wired headphones to the smartphone or tablet running the app. Up to four people can be connected to one stream, and users can even listen at different volumes. The free app is available on iOS and Android.
Best music streaming services 2021
Does DTS Play-Fi support hi-res music?
DTS Play-Fi will play MP3, M4A, AAC, FLAC, WAV and AIFF files. They can be streamed up to a 16-bit/48kHz resolution limit without compression – anything bigger will be compressed by default.
However, this can be changed using the service’s Critical Listening mode, which lets you stream 24-bit/192kHz music across your wi-fi network. A word of warning, though: since those files are generally pretty large (many of our hi-res music files are between 30MB and 70MB per track, compared to 3.5MB for the average MP3 file) and thus require greater bandwidth to stream, DTS advises using a wired connection for more reliable performance.
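The bandwidth gap DTS is warning about follows directly from raw PCM arithmetic: bits per sample × sample rate × channels. A quick sketch (the helper name is our own):

```python
# Uncompressed stereo PCM bitrate in megabits per second:
# bits per sample × samples per second × channels.
def pcm_bitrate_mbps(bit_depth: int, sample_rate_hz: int, channels: int = 2) -> float:
    return bit_depth * sample_rate_hz * channels / 1_000_000

print(pcm_bitrate_mbps(16, 48_000))   # default uncompressed ceiling: 1.536 Mbps
print(pcm_bitrate_mbps(24, 192_000))  # Critical Listening mode: 9.216 Mbps
```

Six times the bitrate per stream, multiplied across several zones, is why a wired connection is the safer bet for Critical Listening mode.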
High-resolution audio: everything you need to know
Which products support DTS Play-Fi?
Among the vast array of products supporting DTS Play-Fi are soundbars, systems and speakers from – deep breath – Audiolab, Anthem, Arcam, Aerix, Definitive Technology, Integra, Klipsch, MartinLogan, McIntosh, Onkyo, Paradigm, Philips, Polk, Quad, Rotel, Phorus, Pioneer and Sonus faber.
The most recent product announcements include the Quad Artera Solus Play streaming system, a range of Philips audio kit and TVs, and kit by Porsche Design.
You can see the full list of supported brands here.
Best music streamers 2021
How to add a streamer to your hi-fi system
How many products can you connect using DTS Play-Fi?
In the words of The Notorious B.I.G., the sky’s the limit. You could theoretically connect as many products as you like together, but DTS recommends a maximum of 32 Play-Fi devices per home network, lest performance start to suffer.
Up to 16 Play-Fi products can stream the same song, whether that’s from your phone, tablet, or Windows PC. You can designate a maximum of four zones, with each zone streaming audio from a different source.
Up to eight people can use the same wi-fi network to stream songs using DTS Play-Fi (provided they’re all using different devices).
Adding a music streamer to my hi-fi system has been a revelation (and a frustration)
Does DTS Play-Fi support voice control?
In a word, yes. DTS Play-Fi supports voice control most exhaustively via Amazon’s Alexa voice assistant. This functionality is available in two ways: “integration of Alexa Voice Services” (which is for products with the Alexa voice assistant built-in) or, most commonly, “Works with Alexa” (which means you’ll be able to control Alexa-based music streams on DTS Play-Fi products via an Echo device).
Play-Fi-connected smart speakers such as the Onkyo P3, Pioneer F4 and Phorus PS10 have integrated Alexa, so you can shout commands at them just as you would at Amazon’s own Echo speakers.
To use Alexa with other Play-Fi products (such as those from McIntosh or Martin Logan – basically anything without microphones built in) you’ll need to ask any of the above smart speakers or an Echo or Dot to play music, which it will then do across the rest of your Play-Fi products. There’s a long list of Works with Alexa-supporting DTS Play-Fi products, which can be found here.
Speaking of Alexa, DTS Play-Fi products now support Alexa Cast, allowing users to send and control music directly from the Amazon Music app to them. See compatible products here.
How about Apple’s Siri voice assistant? Music playback on Play-Fi products that support AirPlay 2 can be started via voice commands to Siri or to a HomePod. Alternatively, users can use spoken commands to add AirPlay 2 speakers into a group with a HomePod or even transfer music playback from the HomePod to other AirPlay 2 products in the home.
Amazon Echo smart speakers: which Alexa speaker should you buy?
Best smart speakers 2021
What are the DTS Play-Fi alternatives?
Many other companies offer their own services to connect up audio kit – controlled through bespoke apps. Sonos is the obvious standout in that respect – its multi-room speakers routinely win What Hi-Fi? Awards for ease of use and excellent support of numerous streaming services, plus exclusive features such as Sonos Trueplay.
Streaming products from the likes of Naim, Linn, Audio Pro, B&O and Bose (and many more) use their own streaming apps, while Bluesound, NAD and Dali use the shared BluOS platform, and Denon and Marantz use the shared HEOS platform.
Then there are arguably the most ubiquitous platforms: Apple AirPlay 2 and Google Chromecast, which have been integrated across a huge variety of wireless products, allowing you to mix and match products in your streaming household.
One thing’s for sure: it’s no longer difficult to make a multi-room audio system. If anything, it’s just harder to choose which of the many paths to go down.
Nvidia’s cloud-based gaming service, GeForce Now, has just received a new update that adds support for the Google Chrome web browser and for Apple’s M1-based Macs (through the native macOS app).
With the addition of Google Chrome, any device or computer capable of running Chrome should now be capable of running GeForce Now. However, Nvidia says it doesn’t guarantee support on operating systems other than macOS and Windows. Specifically, that means Nvidia still doesn’t officially support Linux.
We conducted some cursory testing with the Chrome-based GeForce Now app on a Windows 10 machine and found it was very close to the native app experience. However, several options are missing in the settings menu compared to the native app: There is no 30 fps option (60 fps only), no option to change VSync, and the toggle to “adjust for poor network conditions” is missing as well.
Besides the missing options, the gameplay experience was good. Testing with Shadow of the Tomb Raider yielded excellent results; image quality, smoothness and frame rates were great (with help from a wired 300 Mbps down/30 Mbps up ISP connection). The only difference we spotted in our limited selection of tests was in Apex Legends, where switching from the Chrome app to the native app yielded a barely noticeable decrease in input lag. If you play games casually rather than competitively, this should be a non-issue.
Overall the Chrome version works well, but if you can run the native app, it would be best to do so to get the best experience possible. Nvidia’s Chrome implementation is mostly aimed towards devices that aren’t capable of running the native GeForce Now apps in the first place, like Windows 10 ARM-based devices. Unfortunately, we were not able to test the M1 Mac update at this time.
According to analysts, Chromebooks had an incredible 2020, with the last quarter being the strongest ever for Google’s laptops (via 9to5Google). According to research firms Gartner and Canalys, over 30 million of the devices shipped last year, with somewhere between 11 and 11.7 million shipping in Q4 alone. Canalys says that’s a staggering 287 percent more than were shipped in Q4 2019.
The firms disagree on how much growth this is when compared to 2019, but both estimate that it’s a lot: Gartner estimates that Chromebook shipments are up 80 percent, while Canalys says it’s 109 percent. For comparison, PC sales were up by 11 percent last year, the biggest growth the category had seen in a decade. This isn’t to say that Chromebooks outsold PCs (Gartner estimates there were 275 million traditional computers sold last year), but it does show that one category is experiencing an explosion in growth, while another is recovering after a dip.
Education has a lot to do with it, according to Canalys. Chromebooks are big sellers in the education market, and schools had to find a way to provide at-home students with computers for remote learning during COVID. Even the Vice President of Microsoft Education agrees that Chromebooks are great for schools — according to audio from a leaked meeting obtained by Business Insider, he told his employees that “[i]n many cases, when schools are buying Chromebooks or Windows PCs, Chromebooks are still faster and cheaper, they are easier to deploy and manage.”
As students attend school from home, they’re obviously not going to be coming home to a different PC. A lot of kids I know have never used a traditional computer: they spend their time on Chromebooks and tablets instead.