Think Raspberry Pi, and we instantly think of Raspberry Pi OS, a branch of Debian Linux tweaked to run on the board. But the recently-released Raspberry Pi Pico eschews the traditional Raspberry Pi setup in favor of an Arm-based microcontroller. This has not deterred David Given, who has ported a Unix-like operating system to the $4 board.
A Raspberry Pi Pico running Fuzix will not be replacing your Raspberry Pi any time soon, but it is amazing to see this Unix-like OS running on such low-power hardware. Given’s port of Fuzix is based upon Alan Cox’s original project, which can run on hardware almost 40 years old. Fuzix provides us with a proper Unix filesystem, support for SD cards via the SPI interface, a full set of Fuzix binaries, and all of this is available via a serial console to UART0.
Right now, there is no support for the onboard flash memory, hence the requirement for an SD card. Given says that the code for this is complete, but a corrupt filesystem will crash the dhara FTL library, and the onboard flash is too small even for the 32MB system image.
Given provides a readme full of installation instructions, which are written for experienced Unix / Linux users to follow. The easiest way to get started is to download the Fuzix binaries and format an SD card as per the readme instructions. Then flash the UF2 file to your Pico, connect up an SD card reader via the SPI interface, connect to UART0 and start using Fuzix on a $4 microcontroller.
Last year, AMD released the Ryzen 5000 series desktop processors in one of the most monumental hardware launches of the modern era. This final step completed the Red brand’s ascent back into the forefront of the desktop processor market that began with the launch of the first generation of Ryzen CPUs. Now, while Intel prepares to fire back with the launch of the 11th Generation Intel Core processors, we take a look at a less common specification of the forward-compatible 400 series.
While the 10th Gen Intel Core processors did not support PCIe 4.0 connectivity due to signal integrity issues, many 400 series motherboards are designed to support the PCIe 4.0 specification. This is accomplished by adding clock generators to help clean up the signal. Generally speaking, when it comes to long-term platform support, AMD has been the trendsetter. Has AMD's long support of the AM4 socket across its newer generations of processors inspired Intel to take similar steps?
So what is the 400 series offering right now? For starters, there has been a large focus on VRM and VRM cooling design. With the top-level Intel Core i9-10900K featuring ten cores along with HyperThreading, the ability to deliver clean, continuous power is going to be one of the primary factors that separates a good 400 series board from the competition. With the introduction of HyperThreading on Intel’s mid-range line-up, power delivery is going to be vital in all segments.
The Vision line from Gigabyte targets content creators with a focus on connectivity, performance, and durability. The Gigabyte W480 Vision D features a direct 12-phase VRM for clean power delivery, as well as a direct-touch heatpipe for optimal cooling. An ample helping of USB ports, two Thunderbolt 3 ports, and 2.5 Gb/s Ethernet promote superior connectivity, while the white aesthetic lends visual flair. The W480 chipset supports Xeon W processors and ECC memory at the cost of locked-down CPU overclocking.
Let’s take a look at the Gigabyte W480 Vision D and see how it stacks up against its Z490 counterparts!
Specifications
CPU Support:
Intel 10th Gen or later Core processors
Intel Xeon W series processors
Rear I/O:
1x DisplayPort In port
2x Thunderbolt™ 3 connectors
1x HDMI port
2x SMA antenna connectors
2x USB 3.2 Gen 2 Type-A ports (red)
4x USB 3.2 Gen 1 ports
2x USB 2.0/1.1 ports
2x RJ-45 ports
1x optical S/PDIF Out connector
5x audio jacks
With Rocket Lake’s release date approaching, testers are getting their hands on more and more SKUs from Intel’s upcoming lineup; this time, we have benchmark results for Intel’s future Core i5-11600K (thanks to @Leakbench). The 11600K was found running the Geekbench 5 benchmark with mediocre performance at best, though, as usual, pricing will determine whether it lands on our list of Best CPUs.
According to the spec sheet found in Geekbench 5’s browser, the Core i5-11600K packs 6 cores and 12 threads with a 3.9GHz base frequency and a 4.9GHz max turbo frequency. Nothing is unusual here; this is where we would expect an 11600K to land. Excluding the rare unlocked Core i3 and Pentium chips, the unlocked Core i5s have traditionally been the lowest-clocked of all the “K” SKUs.
That’s not all that will be slowing down Intel’s 11600K, unfortunately. The system configuration for the 11600K shows it being paired with super-slow DDR4-2133 memory. This will noticeably hamper performance, so take the upcoming benchmark results with another dose of salt — they certainly won’t represent what we’ll see in our CPU benchmark hierarchy when these chips come to market.
In the Geekbench 5 results, the Core i5-11600K scores 1565 points in the single-threaded test and 6220 points in the multi-threaded benchmark. These results are quite underwhelming, especially in the multi-core department, where even AMD’s older Ryzen 5 3600 beat the 11600K by 7.6% (or roughly 400 points).
When it comes to single-core performance, the 11600K fares better, but it’s still the slowest CPU out of all known Rocket Lake SKUs and AMD Zen 3 CPUs to date. Luckily, the 11600K does take a major win against Comet Lake-S parts like the 10900K, beating that chip by 11%.
Again though, take these results with a huge grain of salt. Geekbench 5 already has a poor reputation for translating well to real-world results, and adding in slow memory complicates the findings.
Rocket Lake is slated to launch next month, so hopefully, by that time we’ll have a review sample of the 11600K to test for ourselves and give you an in-depth look into how this chip really performs against our best gaming CPUs.
We recently reported about Expanscape, a startup developing battlestation laptops featuring up to seven displays and offering their prototypes to interested parties at undisclosed prices. Today we can share some more information about pricing, which tops out at an eye-popping $20,952 (after conversion) for the seven-screen model’s base configuration.
Expanscape’s Aurora laptops with five or seven screens are a work in progress, so every unit is unique to a large degree. The manufacturer says that it is getting closer to finalized pricing for its A7 prototype now that it has standardized on specific hardware, but until now, the company hadn’t announced official prices for its multi-monitor laptops.
Expanscape currently has two types of laptop prototypes in three configurations. The ‘basic’ Aurora A5 notebook comes with five monitors: four 15.6-inch 4K displays and one 7-inch touchscreen integrated into its palm rest. This system packs an eight-core AMD Ryzen 7 4800U processor paired with 64GB of DDR4 memory, a 2TB PCIe/NVMe SSD, and a 2TB SATA SSD. This entry-level machine currently costs approximately £4,500, or $6,286 USD after conversion.
The ‘full’ Aurora A7 laptops are equipped with seven displays and come in two configurations. One model features four 17.3-inch 4K monitors, two 9.7-inch 1536p monitors, and one 7-inch touchscreen. Another does not have the 7-inch touchscreen but comes with a 17.3-inch touch-enabled LCD in its base to replace the keyboard, which now extends from under the chassis.
Expanscape’s Aurora A7 notebooks with seven screens are naturally more expensive than their smaller A5 brethren. At present, these machines cost £15,000 ($20,952 USD after conversion) for a standard model, but the pricing goes up with all of the customizations required for the professional built-to-order systems.
All Aurora A7 machines come equipped with 128GB of DDR4 memory as well as 8TB of PCIe/NVMe and SATA storage.
Notably, the current pricing of Expanscape’s Aurora machines is somewhat higher than several months ago because of component shortages and new policies at the UK border.
It is noteworthy that Expanscape’s Aurora notebooks are still prototypes, so they are rather clumsy and heavy. Nonetheless, according to the manufacturer, demand for these systems is fairly significant. Customers who bought the systems reportedly said that they needed them ‘yesterday.’
What remains to be seen is whether high demand for Expanscape’s Aurora A5 and Aurora A7 will enable the company to make them look and feel like commercial products rather than prototypes. Presumably, a more solid build would make these systems considerably more popular among interested customers.
As spotted by HotHardware, Anthony, a hardware enthusiast from the Mod Labs forums, has recreated 3dfx Interactive’s renowned Voodoo 5 6000. The company never released the Voodoo 5 6000 to the public, but the skillful enthusiast managed to revive the fallen graphics card through some exquisite reverse engineering work.
You might not have even heard of 3dfx, and we don’t blame you because it has been ages since we’ve heard that name. For the uninitiated, 3dfx was one of the key players in the graphics card market, next to Nvidia and ATI. The company closed its doors in 2002, but it’s still widely regarded as one of the pioneers of the graphics card market.
Voodoo 5 6000 was a single-slot graphics card based on 3dfx’s VSA-100 (Napalm 30) die, or rather four of them, and it was quite the GPU back then. The VSA-100 chips, which measured 112mm², housed up to 14 million transistors. TSMC produced the VSA-100 for 3dfx on the foundry’s 250nm process node. Each VSA-100 had two pixel pipelines; therefore, the Voodoo 5 6000 had eight in total, running at 166 MHz. On the memory side, the Voodoo 5 6000 featured 128MB (4x32MB) of SDRAM clocked at 166 MHz. Across a 128-bit memory interface, the Voodoo 5 6000 provided up to 2.656 GBps of memory bandwidth.
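That bandwidth figure follows directly from the bus width and memory clock. As a quick sanity check (our own arithmetic, not a figure from 3dfx documentation):

```python
# Memory bandwidth = bus width in bytes * effective clock in Hz.
# The Voodoo 5 6000 used single-data-rate SDRAM, so the effective
# clock equals the 166 MHz memory clock.
bus_width_bits = 128
clock_hz = 166_000_000

bandwidth_bytes_per_s = (bus_width_bits // 8) * clock_hz
print(bandwidth_bytes_per_s / 1e9)  # 2.656 GB/s, matching the quoted figure
```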
Anthony’s creation isn’t exactly a faithful copy of the original Voodoo 5 6000, but some may argue that it looks even better. Instead of the old-school green PCB, he created his own black PCB, which adds a bit of a modern look to the graphics card. He equipped it with four VSA-100 dies, which cost $18.95 apiece, with their corresponding heatsinks and cooling fans.
The Voodoo 5 6000 was rated for a TDP of 60W, which is more than the AGP slot could provide. Therefore, it drew its power through an external 250W power brick. Anthony artfully added a standard 4-pin Molex power connector to his rendition of the Voodoo 5 6000 so that the graphics card can run from a common computer power supply. Small changes aside, Anthony maintains that his Voodoo 5 6000 performs the same as the original because he replicated the same BIOS, drivers, and even the same bugs.
Beyond the high production cost, 3dfx never launched the Voodoo 5 6000 because of several bugs with the AGP x4 slot on some motherboards, which forced the graphics card to run at x2. Some of the reported issues included color distortion and artifacts in games, translucent stripes on the screen, and other miscellaneous bugs.
Vaio is known for making laptops that pack a surprising amount of power into unbelievably thin form factors. The Vaio Z may be the company’s most ambitious product yet. It contains up to Intel’s four-core Core i7-11375H, and at a starting weight of 2.11 pounds, it’ll be the lightest laptop ever to house an Intel H-series processor. (Though models you can buy in the US are 2.32 pounds.)
Part of the reason the Vaio Z is so light is that it’s the first laptop ever to be made of “contoured carbon fiber.” You’ll find carbon fiber in some of the nicest lightweight laptops on the market, including the Dell XPS line — it’s a sturdy and lightweight material. But those laptops utilize sheets of carbon fiber that are held together with metal or plastic parts. Vaio has actually contoured the material around the edges of the Z’s chassis, so it’s carbon fiber all around.
Vaio says the device has passed 26 “surface drop” tests, and will deliver up to 13 and a half hours of battery life. In terms of other specs, you can get up to 2TB of storage, 32GB of memory, Iris Xe integrated graphics, and either an FHD or a 4K 14-inch display. There’s a backlit keyboard, a webcam with a physical shutter, a full-size HDMI port, and two USB-C ports as well. The chassis is a clamshell, though you can fold the screen down to 180 degrees.
Of course, this all doesn’t come cheap. The Vaio Z starts at — I’m not joking — $3,579. So it won’t be a practical purchase for most people, but it’s still an impressive achievement and an interesting proof-of-concept. Keep an eye out for our full review in a few days, where we’ll dive into the performance you can expect for that price. You can preorder units now on Vaio’s website.
In a bid to support development of its scientific, economic, and allegedly military-bound projects, China has been building leading-edge supercomputers for about two decades. Initially, China used hardware developed in the U.S., but as tensions between the country and its main economic rival intensified, China had to build its own high-performance computing (HPC) hardware. As the era of exascale supercomputing looms, Chinese scientists are proposing various architectures for such systems.
One of the exascale supercomputer proposals includes scaling of the Sunway HPC architecture as well as the Shenwei (SW) many-core hybrid CPU architecture, reports NextPlatform citing a document from the National Research Center of Parallel Computer Engineering and Technology (NRCPC).
The Supercomputing Trends: More Cores
As part of its preparations for the exascale era, the NRCPC conducted a study of general supercomputer trends in recent years.
The organization found that the slowdown of both Moore’s law and Dennard scaling has made it incredibly difficult to increase the performance of supercomputers without increasing their power consumption, and therefore without increasing the complexity of the whole system architecture exponentially.
According to these findings, the performance of leading-edge supercomputers from 2008 to 2019 increased mainly due to growth in the number of compute cores, which rose by 44 times, and only marginally because of per-core compute capability, which increased by three times. To that end, the NRCPC believes it makes sense to scale its existing Sunway supercomputer architecture and Shenwei CPU design rather than invent something brand new. In particular, supercomputers featuring tens of millions of cores are being considered.
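Taken together, those two factors imply roughly a 130-fold performance gain over the period, with core count doing almost all of the work. A back-of-the-envelope check (our arithmetic, not a figure from the NRCPC document):

```python
core_count_growth = 44   # compute cores grew ~44x between 2008 and 2019
per_core_growth = 3      # per-core throughput grew only ~3x

total_growth = core_count_growth * per_core_growth
print(total_growth)  # 132 -- core count, not per-core speed, drove the gains
```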
Exploring the Shenwei SW26010 Architecture
The latest Sunway TaihuLight supercomputer launched in 2016 uses 40,960 homegrown manycore Sunway SW26010 processors featuring a hybrid architecture. The system offers Linpack performance (Rmax) of 93,014.6 TFLOPS as well as (Rpeak) performance of 125,436 TFLOPS. The current exascale proposal includes scaling of the SW26010 CPU as well as the TaihuLight system, so it makes sense to learn some more details about the CPU architecture.
The SW26010 processor is based on an in-house 64-bit RISC architecture and features four clusters, or core groups (CGs), plus a protocol processing unit (PPU). Each cluster has one management processing element (MPE), a superscalar out-of-order core with a 256-bit vector engine, 32 KB/32 KB of L1 instruction/data cache, and 256 KB of L2 cache. Each cluster also integrates 64 compute processing elements (CPEs) featuring the same 256-bit vector engine along with 64 KB of fast local store for data and 16 KB for instructions. The CPEs are organized as an 8×8 array and interconnected using a mesh network. It is important to note that the MPEs and CPEs support cache coherence via a directory-based protocol, which reduces data movement between cores and supports fine-grained interactions between them, something particularly vital for applications with irregular data access patterns.
Each CG has its own DDR3 memory controller with its own address space that supports 8 GB of memory using nine memory modules for a proprietary ECC implementation. CGs are interconnected using a ringbus-like network-on-chip (NoC) link and the processor itself connects to the rest of the system using the System Interconnect (SI) bus. The SW26010 CPU used in the Sunway TaihuLight supercomputer operated at 1.45GHz. The NRCPC does not disclose which process technology it used to make the SW26010, but since the TaihuLight first appeared in the Top 500 list in mid-2016, it is logical to assume that its CPU was made using TSMC’s 28 nm fabrication process.
Each processor thus delivers peak performance of around 3.06 TFLOPS (Rpeak), which is simply the system’s Rpeak divided by the number of CPUs, as well as memory bandwidth of approximately 136 GB/s.
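The per-chip peak can be cross-checked two ways: dividing the system Rpeak by the CPU count, or building it up from the vector width and clock. The FLOP-per-cycle assumptions below (4 FP64 lanes per 256-bit vector, times 2 for fused multiply-add) are ours, not from the NRCPC paper:

```python
# Top-down: system Rpeak divided by the number of SW26010 chips.
system_rpeak_tflops = 125_436
num_cpus = 40_960
per_chip = system_rpeak_tflops / num_cpus
print(round(per_chip, 2))  # ~3.06 TFLOPS per chip

# Bottom-up: 260 cores, each with a 256-bit FP64 vector unit at 1.45 GHz.
cores = 4 * (1 + 64)               # 4 core groups of 1 MPE + 64 CPEs
flops_per_cycle = (256 // 64) * 2  # 4 FP64 lanes * 2 for fused multiply-add
clock_ghz = 1.45
bottom_up = cores * flops_per_cycle * clock_ghz / 1000  # TFLOPS
print(round(bottom_up, 2))  # ~3.02 TFLOPS, in line with the top-down figure
```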
The SW26010 is essentially a hybrid processor with 260 cores that share the same microarchitecture, but feature different capabilities. Since the SW26010 is a single chip that can exploit thread-level parallelism with its 256 CPE cores, it is believed to be more efficient than CPUs equipped with compute accelerators (such as GPUs or FPGAs) since it does not have to make loads of memory transactions between serial (MPE) and parallel (CPE) cores. Meanwhile, modern x86-based supercomputers use CPUs with more than four ‘big’ cores, which adds quite some flexibility.
NRCPC’s Approach to Exascale: Scale Everything
From NRCPC’s point of view, it is possible to scale both the Sunway system as well as the Shenwei CPU architecture to build a supercomputer featuring performance of around 1 ExaFLOPS.
To build such a system, the NRCPC proposes to enhance the SW26010 CPU and increase the number of processors. The new Shenwei CPU for exascale machines will have eight CG clusters instead of four. The CG architecture will remain the same: one MPE and 64 CPEs. Meanwhile, the CPEs will support 512-bit vector instructions (presumably the MPE will too, but the document does not state that explicitly). Based on NRCPC’s estimates, such a processor will provide over 12 FP64 TFLOPS. The exascale supercomputer will also nearly double the number of CPUs per system, to over 80,000.
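Under the same per-core assumptions as the SW26010 (2 FP64 FLOPs per vector lane per cycle) and the TaihuLight’s 1.45 GHz clock, which the document does not confirm for the new chip, the quoted numbers line up:

```python
cores = 8 * (1 + 64)               # 8 core groups of 1 MPE + 64 CPEs = 520 cores
flops_per_cycle = (512 // 64) * 2  # 8 FP64 lanes per 512-bit vector * 2 for FMA
clock_ghz = 1.45                   # assumed carry-over from the SW26010

per_chip_tflops = cores * flops_per_cycle * clock_ghz / 1000
print(round(per_chip_tflops, 1))   # ~12.1 TFLOPS, i.e. "over 12 FP64 TFLOPS"

system_pflops = per_chip_tflops * 80_000 / 1000
print(round(system_pflops))        # ~965 PFLOPS, roughly 1 ExaFLOPS
```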
The NRCPC says that an exascale Sunway supercomputer based on the next-generation Shenwei CPU architecture will offer around 1 FP64 ExaFLOPS, 2 FP32 ExaFLOPS, and 4 FP16 ExaFLOPS of peak performance. According to the organization’s estimates, real-world performance of the exascale Sunway system will be around 700 PFLOPS (i.e., its efficiency will be at ~70%), so it will be 7.5 times faster than the TaihuLight. In addition, the supercomputer will offer about seven times higher memory bandwidth and about two times higher network bandwidth.
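The 7.5x claim is consistent with the TaihuLight’s measured Linpack score quoted earlier (our arithmetic, using the NRCPC’s ~70% efficiency estimate):

```python
taihulight_rmax_pflops = 93.0146   # TaihuLight's measured Linpack (Rmax) score
exascale_rpeak_pflops = 1000       # 1 FP64 ExaFLOPS peak
efficiency = 0.70                  # NRCPC's ~70% efficiency estimate

exascale_rmax = exascale_rpeak_pflops * efficiency
print(exascale_rmax)                                     # 700 PFLOPS real-world
print(round(exascale_rmax / taihulight_rmax_pflops, 1))  # ~7.5x the TaihuLight
```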
The Sunway TaihuLight supercomputer consumes 15,371 kilowatts of power. By contrast, the Fugaku supercomputer, the world’s most powerful machine, consumes 29,899 kW, about two times more. Frontier, which is expected to be the first system to offer ~1.5 ExaFLOPS of performance sometime later this year, is projected to consume ~30,000 kW. While NRCPC’s study gives some idea of the performance expected from the Chinese exascale supercomputer, one thing the document lacks is the system’s expected power consumption.
The paper acknowledges that enhancing the CPU architecture will require major internal redesigns of the interconnects and caches, which implies increased power consumption. Furthermore, the whole supercomputer will have to be redesigned to take advantage of the extra per-CPU performance as well as the greater number of CPUs. The NRCPC says it will address the challenges of other supercomputer subsystems in upcoming documents.
New Process Technologies Needed
Building a hybrid CPU with 520 cores (8 MPEs, 512 CPEs) is possible from an engineering standpoint. However, doubling the number of cores and increasing their complexity with 512-bit vector units, which require internal interconnects that are twice as fast, will inevitably lead to a significant increase in transistor count.
Doubling transistor count is hardly an insurmountable challenge. At the end of the day, companies like AMD, Intel, and Nvidia know how to build large CPUs and GPUs for datacenters and supercomputers. But all of these companies have access to leading-edge process technologies and semiconductor production facilities. By contrast, since China wants to build its technology prowess independently, it is not clear whether it is inclined to contract TSMC or Samsung Foundry to make its hybrid supercomputer CPUs, knowing that the U.S. might add the NRCPC to the Entity List and forbid chipmakers from supplying silicon to the company.
Without knowing exactly which process technology is used to make the SW26010 and which node the NRCPC plans to use to make its 520-core chip, we can only make guesses and speculations about the organization’s exascale plans.
At present, China-based Semiconductor Manufacturing International Corp. has two FinFET manufacturing technologies: its 14 nm node as well as its N+1 node for inexpensive chips. Assuming that the SW26010 is made using TSMC’s 28 nm process technology, using SMIC’s 14 nm process for a considerably more complex CPU makes a lot of sense. It of course remains to be seen whether SMIC can indeed mass produce fairly complex chips using its 14 nm node (which so far has only been used for mobile SoCs and other relatively small components) and hit the right yields at the right frequency. Keeping in mind that SMIC is on the U.S. Department of Commerce’s Entity List and it is increasingly hard for the company to obtain necessary chemicals and spare parts, the foundry is refocusing on mature process technologies, so it is unclear whether it is even inclined to produce any new 14 nm designs, even for ‘VIP’ customers like the NRCPC.
That said, it is possible that the NRCPC might have to take a risk and use TSMC’s services for its next-generation supercomputer. As an added bonus, using TSMC’s 7 nm node would enable the National Research Center of Parallel Computer Engineering and Technology not only to increase the transistor count of its CPU, but also to increase frequency while keeping power consumption in check.
Summary
One of the first Chinese exascale supercomputers will leverage the existing Sunway supercomputer and Shenwei hybrid CPU architectures developed by the National Research Center of Parallel Computer Engineering and Technology. To achieve 1 FP64 ExaFLOPS of Rpeak performance, the NRCPC will increase the number of execution units in its processor, add support for 512-bit vector instructions, and double the number of CPUs per system.
The CPU that will power NRCPC’s proposed exascale system will feature 520 cores (8 high-performance cores and 512 simplified cores), and an all-new memory subsystem. What is unclear is whether the new Shenwei CPU will be made in China and which fabrication process will be used to produce it. On the one hand, China-based SMIC has successfully used its 14 nm node to make SoCs for Qualcomm and some other partners, but it is unclear whether the technology is good enough for highly complex supercomputer processors and whether SMIC can actually use it given the fact that it is in the U.S. DoC’s Entity List. On the other hand, while TSMC can offer the NRCPC one of its competitive N7 or N6 nodes, it is unclear whether the Chinese supercomputer specialist is inclined to use services of a Taiwanese company.
While Chinese engineers can develop a leading-edge supercomputer, including its CPU, DRAM, NAND, and other components, competitiveness of the proposed NRCPC exascale system will depend on semiconductor process technologies available to CPU designers.
Russia’s MCST Elbrus microprocessors made a splash last year, but it takes a lot more than a microprocessor to develop a completely self-sufficient computing platform. Among the other things such nationally-oriented platforms need is a proprietary SSD controller, and apparently server maker Kraftway has developed one and demonstrated it at a conference this week. The chip will enable building encrypted SSDs featuring a proprietary encryption technology.
The Kraftway K1942BK018 is an Arm Cortex-R5-based NVMe 1.2.1-compliant controller with eight NAND channels that supports up to 2TB of flash memory as well as up to 2GB of DDR3 SDRAM cache. The chip features an ONFI 200MHz interface and is compatible with NAND chips produced by Micron and Toshiba and then packaged in Russia by GS Nanotech. The controller connects to the host using a PCIe 2.0 x4 interface and enables building SSDs in an HHHL or U.2 form-factor. The chip is made using TSMC’s 40 nm process technology and comes in a BGA676 package. The developer claims that it consumes from 3.5W to 4W under load.
The manufacturer claims that drives powered by the new controller will deliver up to an 828 MB/s read speed as well as up to a 659 MB/s write speed.
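Those figures sit well below the host link’s ceiling. PCIe 2.0 runs at 5 GT/s per lane with 8b/10b encoding, so a quick check (our arithmetic, not from Kraftway’s materials) shows the bottleneck is on the controller/NAND side rather than the PCIe 2.0 x4 interface:

```python
# PCIe 2.0: 5 GT/s per lane, 8b/10b encoding -> 500 MB/s usable per lane.
lanes = 4
per_lane_mb_s = 5_000 * 8 / 10 / 8   # 5 GT/s * 0.8 encoding efficiency / 8 bits
host_link_mb_s = lanes * per_lane_mb_s
print(host_link_mb_s)                # 2000.0 MB/s total link budget

# The claimed 828 MB/s peak read uses well under half of that budget.
print(round(828 / host_link_mb_s, 2))  # ~0.41
```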
Surprisingly, the Kraftway K1942BK018 supports a rather outdated BCH 96-bit/1K ECC technology, which means that it may not support all modern types of 3D NAND.
The key feature of the Kraftway K1942BK018 is that it was fully developed in Russia and uses proprietary management algorithms as well as cryptography standards. The primary customers that will use the controller are various government agencies, the ministry of defense, state-controlled companies and other entities interested in maximum security and proprietary algorithms.
Kraftway plans to produce 10,000 SSDs based on its K1942BK018 controller in the coming months in a bid to use them with its PCs aimed at those markets.
Interestingly, in addition to the Arm Cortex-R5-based K1942BK018 controller, there are also two RISC-V-based SSD and USB drive controllers designed in Russia, based on cores developed in Saint Petersburg.
Nvidia today announced its new Cryptocurrency Mining Processor (CMP) line of GPUs to “address the specific needs of Ethereum mining” and, hopefully, improve the availability of the best graphics cards in the GeForce product line in the process. The company also limited the mining performance of the soon-to-be-launched RTX 3060 cards to roughly 50% of the normal performance. It sounds like a good move, but there’s a lot going on.
Cryptocurrency miners previously had to purchase graphics cards originally intended for PC gaming if they wanted to maximize their operation’s profitability. This can lead to—or exacerbate—GPU shortages whenever a particular coin’s value skyrockets. Right now that coin is Ethereum. CoinDesk’s figures put the cryptocurrency’s price at $281 in February 2020, but at time of writing it’s priced at $1,920. The scramble to join this digital gold rush has worsened the GPU shortage caused by COVID-19.
Nvidia’s solution? Give miners their own GPUs. The company said in today’s announcement that the CMP line was developed to “help miners build the most efficient data centers while preserving GeForce RTX GPUs for gamers.” The resulting Nvidia CMP HX “allows a fully open, airflow-optimized bracket and is configured to allow a greater number of GPUs to be controlled by one CPU.” That airflow optimization was partly enabled by ditching unnecessary display ports.
Nvidia said the CMP line also features lower core voltages and frequencies to improve mining efficiency. The new GPUs will be sold through authorized resellers including “ASUS, Colorful, EVGA, Gigabyte, MSI, Palit, and PC Partner.” Those optimizations should be decent incentives for cryptocurrency miners to buy CMP products instead of their GeForce counterparts. The proverbial carrot, if you will. And the stick? Deliberately making gaming-focused GPUs worse at mining.
Nvidia explained in today’s announcement: “RTX 3060 software drivers are designed to detect specific attributes of the Ethereum cryptocurrency mining algorithm, and limit the hash rate, or cryptocurrency mining efficiency, by around 50 percent.” The company didn’t say whether it plans to impose similar limits on other GeForce cards (we’ll get to that in a moment), but Nvidia’s latest RTX offerings are currently at the top of our GPU benchmarks hierarchy, and it would be nice if more of them were actually used to play games.
Model                   30HX        40HX        50HX        90HX
Ethereum Hash Rate      26 MH/s     36 MH/s     45 MH/s     86 MH/s
Rated Power             125W        185W        250W        320W
Power Connectors        1x 8-pin    1x 8-pin    2x 8-pin    2x 8-pin
Memory Size             6GB         8GB         10GB        10GB
Starting Availability   Q1          Q1          Q2          Q2
Is this really good news, or is this just Nvidia playing both sides? To be clear, these CMP cards are still the same exact silicon that goes into GeForce and Quadro graphics cards. They don’t have video outputs, cooling should be improved (for large-scale data center mining operations), and they’re better tuned for efficiency. But every single GPU sold as a CMP card means one less GPU sold as a graphics card. What’s perhaps worse is that while miners can still use consumer cards for mining (maybe not the upcoming RTX 3060, depending on how well Nvidia’s throttling works), gamers can’t use these mining cards for gaming.
Nvidia does state that these GPUs “don’t meet the specifications required of a GeForce GPU and, thus, don’t impact the availability of GeForce GPUs to gamers.” Frankly, that doesn’t mean much. What does Nvidia do with a GPU that normally can’t be sold as an RTX 3090? They bin it as a 3080, and GA102 chips that can’t meet the 3080 requirements can end up in a future 3070 (or maybe a 3070 Ti). The same goes for the rest of the line. Make no mistake: These are GPUs that could have gone into a graphics card. Maybe not a reference 3060 Ti, 3070, 3080, or 3090, but we’ve seen TU104 chips in RTX 2060 cards, so anything is possible.
Which brings up another big question: What specific GPUs are being used for CMP? The 90HX is almost certainly an Ampere GA102 chip, because it’s probably the only one that can reach the 86MH/s target speed. The rest, though, who knows? Turing TU104, TU106, and TU116 GPUs can easily reach those performance figures, and this could be a way to clear out a bunch of older GPUs at premium prices. It could even be a “flavor of the month” approach where Nvidia uses a variety of GPUs that couldn’t qualify for use in a GeForce card and sells them as a CMP.
These CMP cards also shouldn’t have any use for the RT cores or maybe even Tensor cores in Turing and Ampere, which would be a good way of selling off otherwise ‘dud’ chips. Look at the 30HX, with 6GB of memory and a 125W TGP. That matches up almost perfectly with a GTX 1660 GDDR6 card.
That brings us to the relative performance and specs. Note that the 90HX lists an Ethereum hash rate of just 86MH/s and a 320W TGP. After a bit of tuning, an RTX 3080 can usually do 94MH/s at 250W or less, so these cards (at least out of the box) aren’t any better. That’s probably because Nvidia knows running GPUs at high fan speeds and temperatures for 24/7 use leads to component failures. It’s why the data center and workstation lines are normally clocked far more conservatively than the consumer line.
It gets worse as you go down the line, though. The 50HX only does 45 MH/s at 250W, which basically matches the tuned performance of the RTX 2060 Super through RTX 2080 Super, with a TGP that’s still twice as high as what we measured. It’s also half the speed of an RTX 3080 while potentially still using the same GPU (10GB VRAM). Or maybe it’s a TU102 that couldn’t work with 11 memory channels, so it’s been binned with 10 channels. Either way, who’s going to want this? The 40HX at 36 MH/s and 185W and the 30HX at 26 MH/s and 125W are equally questionable options.
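Putting the rated numbers in MH/s-per-watt terms makes the comparison concrete. The CMP figures come from Nvidia’s table above; the tuned RTX 3080 figure is the one we cited earlier:

```python
# Rated Ethereum hash rate (MH/s) and rated power (W) per card,
# plus the tuned RTX 3080 result cited above for comparison.
cards = {
    "30HX": (26, 125),
    "40HX": (36, 185),
    "50HX": (45, 250),
    "90HX": (86, 320),
    "RTX 3080 (tuned)": (94, 250),
}

for name, (mhs, watts) in cards.items():
    print(f"{name}: {mhs / watts:.3f} MH/s per watt")
# The tuned 3080 (~0.376 MH/s/W) beats every CMP card's rated figure;
# the flagship 90HX manages only ~0.269 MH/s/W out of the box.
```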
Of course, we don’t have pricing information yet. That’s going to be a critical factor. Maybe this is just a way to more easily sell GPUs to miners at inflated prices. Or maybe it’s a way to sell otherwise junk silicon to miners at reasonable prices (doubtful). Certainly, miners are paying exorbitant prices on eBay right now. If the CMP cards cost more than graphics cards using the same GPU, they’re not going to sell well.
The driver limiting of mining performance for the upcoming RTX 3060 sounds far more interesting. Nvidia probably can’t implement the same restrictions on existing GPUs without facing a class action lawsuit (not to mention miners could just use older drivers), but making future GPUs less attractive to miners should help push them to other options. Maybe that’s CMP, maybe it’s AMD GPUs, or maybe it’s custom ASICs.
Still, there’s only a limited number of leading edge wafer starts available, and they’re in high demand, so this isn’t going to radically improve the situation with graphics card shortages any time soon. But maybe — hopefully! — it will be enough by the time Hopper rolls out in 2022/2023.
The BenQ Zowie XL2546K leaves out HDR and extended color but has DyAc+, which is the best blur reduction feature we’ve ever seen. The monitor delivers smooth and responsive gameplay. With a few tweaks, it delivers excellent color too. It’s definitely worth a look if a 240 Hz monitor is on your radar.
For
Saturated color with calibration
Low input lag
Excellent blur reduction
Against
Below-average contrast
Poor color and gamma out of the box
No HDR
No extended color
Features and Specifications
In the early days of video gaming, competition took place in computer labs, and the prizes were things like magazine subscriptions or special parking privileges at the local university. Today, eSports is a major spectator sport with millions of loyal fans and professional players who earn a living competing in virtual arenas. With that meteoric rise in skill level comes a need for better tools and that’s where the best gaming monitors come in.
Once, 144 Hz was enough to earn a monitor eSports status, but 240 Hz is quickly becoming the new standard for gaming monitors and is no longer an exclusive refresh rate. You’ll still pay a premium to go that fast, though. Case in point: BenQ’s Zowie XL2546K. It sells for around $500, which is a median price in this category.
For that price, you get a 25-inch (24.5-inch viewable) TN panel with 1080p resolution and AMD FreeSync Premium. Though its out-of-the-box image quality could be better, the BenQ Zowie XL2546K offers a strong gaming experience with minimal input lag and fantastic blur reduction.
BenQ Zowie XL2546K Specs
Brand & Model
BenQ Zowie XL2546K
Panel Type & Backlight
TN / W-LED, edge array
Screen Size & Aspect Ratio
24.5 inches / 16:9
Max Resolution & Refresh
1920×1080 @ 240 Hz
FreeSync: 48-240Hz
G-Sync compatible
Native Color Depth & Gamut
8-bit (6-bit+FRC) / sRGB
Response Time (GTG)
0.5 ms
Brightness (mfr)
320 nits
Contrast (mfr)
1000:1
Speakers
–
Video Inputs
1x DisplayPort 1.2
3x HDMI 2.0
Audio
3.5mm headphone output
USB 3.0
–
Power Consumption
19.4W, brightness @ 200 nits
Panel Dimensions WxHxD w/base
22.5 x 14.5-20.7 x 7.9 inches (572 x 368-526 x 191mm)
Panel Thickness
2.2 inches (55mm)
Bezel Width
Top/sides: 0.5 inch (13mm)
Bottom: 0.7 inch (17mm)
Weight
13.7lbs (6.2kg)
Warranty
Three years
The BenQ Zowie XL2546K is somewhat old school with a TN panel running at FHD resolution. The pixel count isn’t unusual for this class, but the TN screen is. It’s no longer necessary for a fast monitor to be TN. IPS has evolved to 240 Hz and beyond. Witness the two 360 Hz IPS monitors we recently covered, Asus’ ROG Swift PG259QN and Alienware’s AW2521H. While they both sell for over $700, they’re proof that you don’t need TN to go fast.
BenQ offers the XL2546K as a no-frills gaming monitor by leaving out HDR and extended color. While these things are not necessary in a competition gaming tool, they are nice to have for the rest of us. Granted, this category doesn’t see a lot of DCI-P3 color gamuts, but our recent experience with the AW2521H also demonstrated that good HDR is possible with a fast display.
AMD FreeSync Premium is the featured Adaptive-Sync tech. Compared to standard FreeSync, it includes low framerate compensation (LFC). The XL2546K isn’t Nvidia-certified, but we got it to run Nvidia G-Sync too. See our How to Run G-Sync on a FreeSync Monitor article for instructions.
Assembly and Accessories of BenQ Zowie XL2546K
After bolting the upright and base together, the XL2546K’s panel snaps in place. If you’d rather use a monitor arm, a 100mm VESA pattern is included with large-head bolts already installed.
The stand is completely wobble-free once assembled. Rigid shades click in place on the sides, but there is no light blocking piece for the top. The controller for the on-screen display (OSD) comes out of its own little box and connects to a special Mini-USB port. You also get a DisplayPort cable and an IEC power cord. Everything is neatly and carefully packed as a premium product should be.
Product 360
BenQ bakes in its usual solid build quality and functionality with a wired OSD controller and light blocking shades for the panel’s sides. Along with a beefy stand, the XL2546K is ready for competition or just to satisfy a casual enthusiast’s lust for speed.
The XL2546K is the first monitor in our recent memory to be devoid of any logos or graphics on the front. The base and upright are similarly unadorned, but around back, you’ll find a Zowie logo in red. The same symbol is molded into the hinged light shutters. Red trim lines a large hole in the upright through which you can pass cables.
The OSD puck controller, which BenQ calls the S Switch, has five buttons and a scroll wheel that makes menu navigation quick and intuitive. The bezels are always visible but aren’t too thick at 13mm on the top and sides and 17mm at the bottom.
In case you need to lug the screen about, the XL2546K features a metal handle that’s more than up to the task. To the side is a flip-out headphone hook, and at the bottom are OSD controls, namely a joystick and two buttons. The third key there is a power toggle.
From the side, you can see that there are no USB ports. The input panel underneath doesn’t have them either. The stand has a small red arrow that you slide into your preferred position to recall the height setting. A similar feature is in evidence on the base via tick marks indicating swivel angle. Adjustments include 6 inches of height, 45-degree swivel to either side, -5/23-degree tilt and a portrait mode. Movements exude the quality of a premium display.
The input panel features three HDMI 2.0 ports and a single DisplayPort 1.2. A 3.5mm jack accommodates headphones or external audio. There are no internal speakers, but you can adjust volume in the OSD. The HDMI ports will accommodate the 120 Hz refresh rate from the new Xbox Series X and PS5 consoles.
OSD Features of BenQ Zowie XL2546K
A quick menu appears when you press any key on the BenQ Zowie XL2546K’s panel or on the S Switch puck controller. The S Switch is very handy, particularly since you can program four of its functions. This means you can change settings quickly and conveniently without going through the OSD’s full menu.
Once you get into the OSD, you’ll find many options to tailor both image and performance. There are eight picture modes, all of which are fully adjustable. Settings save to each mode individually and by input. The number of possible combinations is, therefore, vast. The default mode is FPS1, which takes some less than attractive liberties with color and gamma. We’ll show you its effects in the image tests. Standard is the better choice, as it comes close to the mark without calibration.
To tweak the Zowie XL2546K’s image, BenQ provides three color temps, plus a user mode with RGB sliders. They work extremely well and deliver very accurate color in the sRGB gamut. You also get five gamma presets, black equalizer for enhancing shadow detail, color vibrance, which adjusts overall saturation, low blue light for reading and a color weakness feature for color blind users deficient in either red or green.
The Picture menu has the brightness and contrast sliders along with DyAc+ (more on this in the Hands-on section), BenQ’s name for its blur reducing backlight strobe. DyAc+ has two settings, which vary the LED pulse width. The lesser of the two is called High and is enough to remove any visible blur.
BenQ also offers overdrive, which it calls AMA. This option is best left turned off because it produced visible ringing when we played games using Adaptive-Sync. The artifact isn’t as obvious with DyAc+ but doesn’t improve the image either.
BenQ Zowie XL2546K Calibration Settings
If you do nothing else, we strongly recommend switching your BenQ Zowie XL2546K to Standard mode. The default, FPS1, alters color and gamma unattractively. Accurate color is always the best choice.
You don’t absolutely need to calibrate the Standard mode, but a few changes resulted in a visible improvement. We improved grayscale with adjustments to the RGB sliders. Perceived contrast also increased with a change from gamma 3 to gamma 4, and we reduced the contrast control by 18 steps to fix a color clipping issue which bumped up the color saturation. We’ll talk about all of that on page three.
Our recommended settings for the BenQ Zowie XL2546K are below.
Picture Mode
Standard
Brightness 200 nits
67
Brightness 120 nits
32
Brightness 100 nits
23
Brightness 80 nits
15
Brightness 50 nits
3 (min. 45 nits)
Contrast
32
Gamma
4
Color Temp User
Red 96, Green 100, Blue 97
Gaming and Hands-on with BenQ Zowie XL2546K
The BenQ Zowie XL2546K gave us a few surprises when we sat down for some gaming. After our calibration (see our recommended settings above), we wondered how our contrast setting, which seemed extreme, would look. The answer is very good. Though the panel doesn’t show great native contrast, changing the gamma from 3 to 4 and lowering the contrast slider makes a huge difference in color saturation and shadow depth. Those tweaks made the BenQ equal to the better IPS screens we’ve reviewed.
The second, and greater, surprise came via the XL2546K’s blur reduction feature that BenQ calls DyAc+. Blur reduction usually means a brightness reduction, but BenQ managed to avoid this pitfall with some clever engineering. We measured the two DyAc+ settings (High and Premium) with the brightness control set to the same value, and light output did not change. This is a first in our experience.
This is the first monitor we’ve played on where the backlight strobe produced better motion resolution and video processing quality than Adaptive-Sync. FreeSync and G-Sync both worked perfectly with two systems: one equipped with a GeForce RTX 3090 and the other a Radeon RX 5700 XT. Frame rates were maxed at 240 frames per second (fps) in all the games we played, so tearing did not occur, even with Adaptive-Sync off. Since there’s no reduction in brightness, we recommend using DyAc+ instead of Adaptive-Sync. And that’s something we thought we’d never say.
In either case, input lag was a complete non-issue. There are gamers who prefer using backlight strobes instead of Adaptive-Sync because they believe input lag is lower. We can’t confirm this with testing, but at 240 Hz, no one is going to perceive a 1ms or 2ms difference. The moment you think about a control input, the BenQ Zowie XL2546K responds. It is certainly fast and responsive enough for competitive gaming. And DyAc+ is the best implemented backlight strobe we’ve seen yet.
Color and contrast are excellent for gaming. With a little bonus saturation in the primary colors, on-screen environments are vibrant and three-dimensional. There is plenty of light output to complement the darker gamma we chose, and the resulting picture is much better than the test numbers suggest. This is also an unusual thing in our experience, but there’s no denying that the BenQ Zowie XL2546K plays games well and looks great doing it.
It also looks great performing workday tasks. Some might prefer higher pixel density, but at this size, 1080p works out to roughly 90 pixels per inch, which is enough to resolve small fonts and details. Photo editing isn’t this monitor’s strong suit, but its accuracy is sufficient for the demands of color-critical work. The XL2546K is a solid all-around display.
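That pixel density figure is easy to check: linear PPI is the diagonal pixel count divided by the diagonal screen size in inches. A minimal sketch:

```python
import math

def pixels_per_inch(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Linear pixel density: diagonal resolution over diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# 1920x1080 on a 24.5-inch viewable panel works out to roughly 90 PPI.
print(round(pixels_per_inch(1920, 1080, 24.5), 1))  # 89.9
```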
Whether you’re a student, a professional or just want to stay connected and productive, a laptop is one of the most important tools of the trade. But some are better than others, with wide differences in keyboards, battery life, displays and design. If you’re looking for a powerful laptop that easily fits in your bag and doesn’t break your back, you want an ultrabook.
The “ultrabook” moniker was originally coined by Intel in 2012 and used to refer to a set of premium, super-thin laptops that met the chipmaker’s predefined standards. However, just as many folks refer to tissues as Kleenexes or web searching as Googling, the term ultrabook commonly refers to any premium ultraportable laptop, whether it carries Intel’s seal of approval or not.
Of course, there’s always new tech coming down the pipe. Intel has announced its 11th Gen Core “Tiger Lake” processors with Iris Xe graphics and Thunderbolt 4, with laptops shipping in time for the holiday season. And it’s likely that an AMD Ryzen refresh won’t be far behind, bringing USB 4 to laptops. That’s in addition to the possibility of Apple’s first Arm-powered MacBook coming this fall.
Get a good keyboard: Whether you’re using an ultrabook to browse the web, send emails, code, write or do other productivity work, the keyboard is one of your primary ways of interacting. Get something with responsive keys that aren’t mushy. Low-travel is ok if the keys have the right feel to them, but the last thing you want to do is “bottom out” while typing.
Consider what you need in a screen: At a minimum, your laptop should have a 1920 x 1080 screen. Some laptops offer 4K options, though it’s sometimes harder to see the difference at 13 inches or below. While 4K may be more detailed, 1080p screens give you much longer battery life.
Some laptops can be upgraded: While CPUs and GPUs are almost always soldered down, some laptops let you replace the RAM and storage, so you can buy cheaper now and add more memory and a bigger hard drive or SSD down the road. But the thinnest laptops may not have that option.
Battery life is important: Aim for something that lasts for 8 hours or longer on a charge (gaming is an exception). For productivity, many laptops easily surpass this number. But be wary of manufacturer claims, which don’t always use strenuous tests. Some laptops are starting to add fast charging, which is a nice bonus.
The HP Spectre x360 14 is everything a modern ultrabook should be. This laptop has an attractive design, but isn’t about form over function. It has both Thunderbolt 4 over USB Type-C, as well as a microSD card reader, all in a thin chassis.
But what really wows is the display. The 3:2 aspect ratio is tall and shows more of your work or web pages, and is also more natural for tablet mode. The OLED model we reviewed also offered vivid colors, though you would likely get longer battery life with the non-OLED, lower resolution panel.
The other big plus is the Spectre x360’s keyboard, which is clicky and comfortable. Sure, it’s no desktop mechanical keyboard, but for a laptop, it’s very responsive and feels great to use.
The Dell XPS 13 has long been celebrated for both its form and function. The laptop is tiny, but packs a punch with Intel’s Tiger Lake processors and adds some extra screen real estate with a tall, 16:10 display (many laptops have a 16:9 screen).
We also like the XPS 13’s keyboard, with a snappy press and slightly larger keycaps than previous designs. The screen is bright, and we shouldn’t take its thin bezels for granted, as Dell continues to lead on that front.
Admittedly, the XPS 13 is short on ports, opting for a pair of Thunderbolt 4 ports for both charging and accessories. Its performance, portability and long battery life are likely to make up for that for those on the go.
Read: Dell XPS 13 (9310) review
3. MacBook Pro 13-inch (M1)
The Best Mac
CPU: Apple M1 | GPU: 8-core GPU on SOC | Display: 13.3-inch, 2560 x 1600, True Tone | Weight: 3.0 pounds / 1.4 kg
M1 is powerful and fast
Runs cool and quiet
Apps just work, even if emulated
Long-lasting battery life
Strong audio
Limited ports and RAM options
Touch Bar isn’t very useful
Poor webcam
While some people may still want the power, large display and port selection of the 16-inch MacBook Pro, Apple has proved with the 13-inch version that its own home-grown M1 chip is capable of meeting the needs of plenty of people. This is Apple’s first step in breaking away from Intel, and it is extremely impressive.
The 13-inch MacBook Pro runs cool and quiet, while the chip is faster than its competition in most cases. It’s also efficient and ran for more than 16 and a half hours on our battery test.
Many apps run natively on the Arm processor, and those that don’t use Apple’s Rosetta 2 software for emulation. Even then, users will barely know that emulation is being used at all. Everything just works.
The big difference between the Pro and the Air, which also uses M1, is that the Pro has a fan. Those who aren’t doing intensive work may be able to save a bit and get a very similar machine by going with the Air, and they will get function keys instead of the MacBook Pro’s Touch Bar.
Read: Apple MacBook Pro 13-inch (M1) review
4. MSI GE66 Raider
The Best Overall Gaming Laptop
CPU: Intel Core i9-10980HK | GPU: Nvidia GeForce RTX 2080 Super Max-Q | Display: 15.6 inches, 1920 x 1080, 300 Hz | Weight: 5.3 pounds (2.4 kg)
Great gaming performance
300 Hz display
Well-executed RGB light bar
High-end build
Cramped keyboard
Tinny audio
The MSI GE66 Raider is a gaming laptop, and it’s saying it loud with a massive RGB light bar. Its new look is aggressive, but it’s not just talk, with options going up to an Intel Core i9-10980HK and Nvidia GeForce RTX 2080 Super Max-Q.
For those looking for esports-level performance in games like League of Legends or Overwatch, there’s an option for a 300 Hz display.
And while it’s not the slimmest laptop around (or even MSI’s thinnest), it does feel remarkably portable considering the power inside, and we can’t help but appreciate high-end build quality.
Lenovo’s ThinkPads have always been favorites, and the ThinkPad X1 Carbon (Gen 8) continues that trend with a slim design, an excellent keyboard and a generous selection of ports to keep you connected to all of your peripherals.
If you get the 1080p option, you can count on all-day battery life (the 4K model we tested didn’t fare as well, but that’s often the tradeoff for higher resolution among ultrabooks).
Of course, the ThinkPad X1 Carbon also attracts one other audience: fans of the TrackPoint nub in the center of the keyboard.
Read: Lenovo ThinkPad X1 Carbon (Gen 8) review
6. Asus ZenBook Duo 14 UX482
Best Dual Screen Laptop
CPU: Intel Core i7-1165G7 | GPU: Intel Iris Xe | Display: 14-inch 1080p (1920 x 1080) touchscreen, 12.6-inch (1920 x 515) ScreenPad Plus | Weight: 3.5 pounds / 1.6 kg
$999 starting price with an i5
Very good battery life
Loud speakers
Improved hinge mechanism and keyboard layout
Keyboard/touchpad are awkward
8GB of RAM in lower configurations
Asus has begun to refine the dual screen laptop. Sure, there’s a more powerful version, but for a laptop with two screens, this one is fairly light, and ran for over 10 and a half hours on a charge.
Windows 10 doesn’t yet natively support dual-screen software, but Asus’s ScreenPad Plus launcher has improved since launch, with easy flicks and drags to move apps around the display. For Adobe apps, there’s custom dial-based software.
The keyboard and mouse placement are the big compromises, as there isn’t a wrist rest and they can feel cramped. But if you want two screens, this is as good as it gets for now.
If you’re going for a big screen, the Dell XPS 17 shines. The display on the laptop is bright and colorful, especially on the 4K+ option that we tested, and with minimal bezels around it, your work (or play) is all that’s in focus.
With up to an Intel Core i7 and an Nvidia GeForce RTX 2060 Max-Q, there’s plenty of power here. While it’s not on our list of best gaming laptops, you can definitely play video games on it, including intensive games that use ray tracing.
All of that comes in an attractive design similar to the XPS 13 and XPS 15, though the trackpad takes advantage of the extra space. It’s a luxurious amount of room to navigate and perform gestures.
Read: Dell XPS 17 (9700) review
HP Spectre x360 14
CPU: Up to Intel Core i7-1165G7 | GPU: Intel Iris Xe (integrated) | RAM: Up to 16GB LPDDR4-3733 | Storage: Up to 2TB M.2 PCIe NVMe SSD | Display: 13.5-inch touchscreen, up to 3000 x 2000 resolution, OLED
Dell XPS 13 (9310)
CPU: Up to Intel Core i7-1165G7 | GPU: Intel Iris Xe (integrated) | RAM: Up to 16GB LPDDR4x-4276 | Storage: Up to 512GB M.2 PCIe NVMe SSD | Display: 13.4-inch touchscreen, 1920 x 1200 resolution
MacBook Pro (16-inch)
CPU: Up to Intel Core i9-9980HK | GPU: Up to AMD Radeon Pro 5500M | RAM: Up to 64GB DDR4 | Storage: Up to 8TB SSD | Display: 16 inches, 3072 x 1920
Asus ROG Zephyrus G14
CPU: Up to AMD Ryzen 4900HS | GPU: Nvidia GeForce RTX 2060 with ROG Boost | RAM: Up to 16GB DDR4-3200 (8GB on-board, 8GB SODIMM) | Storage: 1TB PCIe 3.0 M.2 NVMe | Display: 14 inches, 1920 x 1080, 120 Hz
Lenovo ThinkPad X1 Carbon (Gen 8)
CPU: Up to Intel Core i7-10610U | GPU: Intel UHD Graphics | RAM: Up to 16GB LPDDR3 | Storage: Up to 1TB PCIe NVMe SSD | Display: 14 inches, up to 4K with Dolby Vision and HDR400
Asus ZenBook Duo UX481
CPU: Up to Intel Core i7-10510U | GPU: Nvidia GeForce MX250 | RAM: Up to 16GB DDR3 | Storage: 1TB PCIe NVMe SSD | Display: 14-inch 1080p (1920 x 1080) touchscreen, 12.6-inch (1920 x 515) ScreenPad Plus
MSI dropped some bombs today during the transmission of its MSI Insider Show. In addition to revealing the pricing for its B560 and H510 motherboards, the company also teased its upcoming MEG Z590 Unify/Unify-X motherboards for Rocket Lake-S processors.
MSI’s Unify series of motherboards are recognized for two main traits. They arrive with a pure black design that lacks RGB lighting (for those that hate RGB), and they’re also heavy on overclocking features. As anticipated, MSI will be bringing the MEG Z590 Unify and Unify-X motherboards to exploit Intel’s Rocket Lake-S chips. The pair of motherboards are like manna from heaven for enthusiasts that require more connectivity than what the MEG Z590I Unify has to offer.
Adhering to the standard ATX form factor, the MEG Z590 Unify and Unify-X share identical specifications, except for the number of memory slots. The Unify-X will only come with two DDR4 memory slots that will ultimately help with memory overclocking, given the shorter traces.
Made to compete with the best motherboards, both Unify motherboards employ a 16-phase power delivery subsystem with power stages rated for 90A each. A pair of 8-pin EPS power connectors are present to feed the processor with more juice than it can handle. MSI didn’t touch too much on the memory slots, but we expect the motherboards to easily support all of the best RAM, including memory modules faster than DDR4-5000.
Despite being overclocking-oriented, the MEG Z590 Unify and Unify-X aren’t short on other features either. The storage options include six normal SATA III connectors, three PCIe 4.0 x4 slots and one PCIe 3.0 x4 slot. Since multi-GPU setups are a thing of the past, the Unify motherboards only come with one PCIe 4.0 x16 expansion slot.
With Rocket Lake-S, you basically have 20 high-speed PCIe 4.0 lanes at your disposal. The Unify motherboards’ layout allows you to manage the PCIe 4.0 x16 expansion slot in two ways. If you decide to limit the PCIe 4.0 x16 expansion slot to x8, it opens up the opportunity for you to run the three M.2 slots at PCIe 4.0 x4. On the flip side, if you’d rather have your expansion slot at x16, you’ll be limited to one M.2 PCIe 4.0 x4 slot.
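The trade-off above is simple lane arithmetic against the CPU’s 20-lane budget. A minimal sketch (the slot labels here are illustrative, not MSI’s naming):

```python
# Rocket Lake-S exposes 20 PCIe 4.0 lanes from the CPU.
CPU_LANES = 20

# Option A: full-bandwidth x16 GPU slot plus one CPU-attached M.2 drive.
option_a = {"x16 slot": 16, "M.2 #1": 4}

# Option B: bifurcating the GPU slot to x8 frees lanes for three M.2 drives.
option_b = {"x16 slot (at x8)": 8, "M.2 #1": 4, "M.2 #2": 4, "M.2 #3": 4}

for name, layout in [("Option A", option_a), ("Option B", option_b)]:
    used = sum(layout.values())
    print(f"{name}: {used}/{CPU_LANES} lanes used")
```

Both layouts land exactly on the 20-lane budget, which is why enabling all three PCIe 4.0 M.2 slots forces the expansion slot down to x8.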
The Unify motherboards’ other attributes include 2.5 Gigabit Ethernet networking, Wi-Fi 6E connectivity, Lightning USB 20G (USB 3.2 Gen 2×2) port and MSI’s Audio Boost 5 technology.
Hyperkin is best known for making gaming peripherals, but it also creates clones of consoles that allow you to play retro games in modern resolutions. Now it is releasing a new retro console that will allow you to play Game Boy games on your TV. Hyperkin’s RetroN Sq (Square) is a console that will allow you to play Game Boy, GBC, and GBA cartridges.
The RetroN Square includes one wired USB “Scout” controller with a shape similar to a SNES controller. Instead of composite video hook-ups, the RetroN Sq connects to a TV via HDMI with games upscaled to 720p resolution; there’s also a switch that will allow you to switch the aspect ratio to either 4:3 or 16:9, depending on your preference.
The console will allow you to natively play Game Boy and GBC cartridges, while GBA games are listed as a “beta feature,” but it does not note which GBA games are compatible. The back of the console includes a memory card slot, allowing you to store firmware for the system.
I know the official product name hints that it’s supposed to be a square-shaped console, yet the design looks more like a compact, colorful Nintendo GameCube that strictly plays Game Boy cartridges, but I’ll let you be the judge of that.
Image: Hyperkin
Hyperkin’s RetroN Sq will release on March 25th for $75. But if you already know you want to buy one, you can preorder your unit at Hyperkin’s website. The gadget comes in two colors: “black gold” and “hyper beach,” which to me looks more like a turquoise color.
In its latest installment of the MSI Insider Show, MSI has shared the pricing for its complete stack of B560 and H510 motherboards. The budget-friendly offerings are designed for consumers to squeeze every bit of performance out of Intel’s imminent 11th Generation Rocket Lake-S processors.
MSI is pricing the new B560 and H510 motherboards very closely to their counterparts from the previous generation. Rocket Lake-S will bring PCIe 4.0 support to a mainstream Intel desktop platform. The processors are backwards compatible with 400-series motherboards, but you will miss out on the PCIe 4.0 feature, which might be the only reason users upgrade to Rocket Lake. At any rate, it’s good to see that the PCIe 4.0 tax doesn’t have a huge impact on MSI’s budget motherboards.
Designed to compete with the best motherboards, the MAG B560 Tomahawk WiFi will retail for $189, which is the same price tag that’s on the current MAG B460 Tomahawk. Other motherboards, such as the MAG B560M Mortar WiFi, B560M Pro-VDH WiFi or MAG B560M Mortar will even be $10 less expensive than the current models.
MSI B560, H510 Motherboard Pricing
Motherboard | MSRP in $ (excl. VAT) | MSRP in € (incl. VAT)
MPG B560I Gaming Edge WiFi | 159 | 159
MAG B560 Tomahawk WiFi | 189 | 189
MAG B560 Torpedo | 169 | 169
MAG B560M Mortar WiFi | 179 | 179
MAG B560M Mortar | 159 | 159
MAG B560M Bazooka | 139 | 139
B560M Pro-VDH WiFi | 149 | 149
B560M Pro-VDH | 129 | 129
B560M Pro WiFi | 129 | 129
B560M Pro | 109 | 109
B560M-A Pro | 99 | 99
H510M Pro | 95 | 95
H510M-A Pro | 89 | 89
The MPG B560I Gaming Edge WiFi, which costs $159, will likely be a very enticing option for SFF builders who don’t have to make the jump to a Z590 motherboard. The Tomahawk series has always been popular with budget performance seekers, and we don’t expect that to change for this generation.
Borrowing the power delivery subsystem from the MAG Z490 Tomahawk, the MAG B560 Tomahawk WiFi leverages the same 12+2+1 design. That’s only two fewer CPU phases than on the premium Z590 Tomahawk WiFi. The MAG B560 Tomahawk WiFi also comes with support for memory speeds up to DDR4-5066, three M.2 slots, 2.5 Gigabit Ethernet and Wi-Fi 6E networking.
At $89, the H510M-A Pro arrives with only the strictly necessary features for really tight budgets. You’ll still get access to the PCIe 4.0 goodness through the motherboard’s sole PCIe 4.0 x16 expansion slot. Unfortunately, the M.2 slot is still locked to PCIe 3.0 x4. To get access to a PCIe 4.0 x4 slot, you’ll have to upgrade to the B560M-A Pro, which commands a $10 higher price tag.
The HP Spectre x360 14 is a beautifully constructed 2-in-1 laptop with a vibrant 3:2 OLED touch screen to showcase your work. It has an excellent keyboard and a variety of ports for all of your accessories. Those who prioritize battery life may want to consider a non-OLED configuration, however.
For
Sleek, attractive design
Vivid 3:2 display shows more of your work
Clicky, responsive keyboard
Thunderbolt 4 and USB Type-A ports
Against
OLED model doesn’t last all day
Difficult to upgrade SSD
There’s no need to beat around the bush: the HP Spectre x360 14 ($1,219.99 to start; $1,699.99 as tested) is one of the best ultrabooks we’ve tested in the last several months. It’s exquisitely designed with a 13.5-inch, 3:2 display that showcases more of your work, whether it be words, numbers, or code.
You’ll pay a premium price for it, but it sure feels premium, with a sleek chassis, clicky keyboard and both USB Type-C and Type-A ports, as well as a microSD card reader.
The model we reviewed had an impressive OLED screen with a 3,000 x 2,000 resolution. It looks great, but if you want all-day battery life, you may consider alternative configurations.
Design of the HP Spectre x360 14
HP makes a handsome laptop. The Spectre x360 doesn’t make a ton of changes to what has largely become a tried and true design. It’s an aluminum notebook with solid construction. Ours came in “nightfall black” with copper accents, which is a bit showy for my tastes these days, but you can also get it in “Poseidon blue” or my likely choice, “natural silver.”
The back two edges near the 360-degree hinge are chopped off, one of which makes room for a Thunderbolt 4 port. It’s a divisive choice, but it’s grown on me. That placement lets you flip from a laptop into a tablet while it’s charging and barely move the cable at all.
When you unfold the laptop for the first time, you’ll notice the big difference with this Spectre: a 13.5-inch, 3:2 display that feels incredibly luxurious compared to the 16:9 screen on the smaller Spectre x360 13 that we recently reviewed. There’s minimal bezel around it, putting the focus on your work. It also creates a slightly longer profile for the whole device.
Unlike many 2-in-1s, the power button is on the keyboard, rather than the side of the device. As a person using it primarily as a laptop, I prefer this choice, though tablet-heavy users might be annoyed. There’s also a fingerprint reader next to the arrow keys; this, combined with the IR camera, allows for security options beyond a password in either tablet or laptop mode, which I appreciate. The speaker grilles above the function keys make for a nice accent.
There aren’t a ton of ports on the Spectre x360 14, but there’s enough for most people’s everyday use. Most of the action is on the right side, where you’ll find two Thunderbolt 4 ports (one on the right corner), a 3.5 mm headphone jack and a microSD card reader. On the left, there is one USB 3.2 Gen 1 Type-A port. The rest of that side of the notebook is magnetized to fit the included HP Tilt Pen.
At 2.95 pounds with an 11.75 x 8.67 x 0.67-inch footprint, the Spectre is fairly compact. The Dell XPS 13 2-in-1 9310 is 2.9 pounds and 11.69 x 8.15 x 0.56 inches — a bit smaller — but also has a 13.4-inch screen in a 16:10 aspect ratio. The MacBook Pro is a 3 pound clamshell and measures 11.95 x 8.36 x 0.61 inches, and the Asus ZenBook Flip S UX371 is 2.7 pounds and 12 x 8.3 x 0.6 inches.
HP Spectre x360 14 Specifications
CPU: Intel Core i7-1165G7
Graphics: Intel Iris Xe Graphics
Memory: 16GB LPDDR4-3733
Storage: 1TB PCIe NVMe SSD with 32GB Intel Optane
Display: 13.5-inch, 3000 x 2000 OLED touchscreen
Networking: Intel Wi-Fi 6 AX201 (2x2) and Bluetooth 5
Ports: 2x Thunderbolt 4, USB 3.2 Gen 1 Type-A, headphone/microphone jack, microSD card reader
Camera: 720p IR
Battery: 66 WHr
Power Adapter: 65 W
Operating System: Windows 10 Home
Other: HP Rechargeable MPP2.0 Tilt Pen
Dimensions (WxDxH): 11.75 x 8.67 x 0.67 inches / 298.45 x 220.22 x 17.02 mm
Weight: 2.95 pounds / 1.34 kg
Price (as configured): $1,699.99
Productivity Performance on the HP Spectre x360 14
Our HP Spectre x360 14 review unit came with an Intel Core i7-1165G7, 16GB of LPDDR4 RAM and a 1TB PCIe NVMe SSD with 32GB of Intel Optane memory. In my use, it could handle plenty of browser tabs and streaming video without an issue.
On the Geekbench 5 overall performance benchmark, the Spectre earned a single-core score of 1,462 and a multi-core score of 4,904. The ZenBook Flip S was in a similar range. The Dell XPS 13 2-in-1 had a higher score in multi-core performance (5,639). The MacBook Pro, too, had a higher multi-core score when emulated through Rosetta 2 to run the same version of the test (5,925).
The Spectre transferred 25GB of files at a rate of 533.61 MBps, faster than the XPS 13 2-in-1, but slower than the ZenBook Flip S (979.37 MBps).
In our Handbrake test, which transcodes a 4K video to 1080p, the Spectre x360 14 finished the task in 18 minutes and 5 seconds. While this was four minutes faster than the ZenBook, the XPS 13 2-in-1 was speedier and the MacBook Pro led the whole pack, even while emulating x86 instructions.
To stress the Spectre, we ran it through 20 runs of Cinebench R23. It was fairly consistent in the low 4,000s, though there were some peaks up to around 4,300. The CPU ran at an average of 2.61 GHz and an average temperature of 74.07 degrees Celsius (165.33 degrees Fahrenheit).
Display on the HP Spectre x360 14
The 13.5-inch touchscreen on the Spectre x360 has a 3:2 aspect ratio, making it noticeably taller than a typical 16:9 panel. It’s an opulent amount of space, especially for doing work. You’ll see more text, code, spreadsheet cells or whatever else you’re working on because the screen is taller. It’s a big improvement over 16:9 displays, and it makes for a more natural shape as a tablet, since it’s closer to the proportions of a piece of paper.
Our main review configuration was an OLED model with a 3,000 x 2,000 resolution. It looked incredible, with deep blacks and vibrant colors, as has been the case on most OLED monitors we’ve seen to date. Of course, most videos are still 16:9, so when I watched the trailer for Godzilla vs. Kong, it was letterboxed on the top and bottom. The beginning of the trailer features the titular ape on a barge during a sunset, and its blue and orange hues were beautiful as jets flew overhead.
The OLED screen covers 139.7% of the DCI-P3 color gamut (the non-OLED, 1920 x 1280 screen covered 74.6%). The next best was the ZenBook Flip S, also with an OLED display, at 113.1%. The MacBook Pro measured 78.3% and the XPS 13 2-in-1 covered 70%.
The Spectre’s display measured an average of 339 nits on our light meter. This never seemed like an issue in regular use, though the ZenBook, XPS 13 2-in-1 and MacBook Pro all got far brighter.
Keyboard, Touchpad and Stylus on the HP Spectre x360 14
The keyboard on the Spectre takes up as much room as possible, running from edge to edge of the chassis. This gave HP room to include a full keyboard, including an extra column for Home, Page Up, Page Down and End keys. The tilde key is a little squeezed, but not enough to inconvenience me.
The keys are clicky (they even have a bit of a clicky sound!), and I really enjoyed typing on them. On the 10fastfingers typing test, I reached 105 words per minute with my usual 2% error rate.
There’s a fingerprint reader built into the keyboard on the right side, next to the arrow keys. On the function row, there’s a key to kill the camera. The F1 key is sort of wasted, though, in that it is programmed exclusively to open the browser and search for “how to get help in Windows 10.”
HP has equipped the Spectre x360 with a 4.5 x 2.8-inch touchpad. It’s slightly smaller than the MacBook Pro’s (5.1 x 3.2 inches), but is still plenty spacious. With Windows 10 precision drivers, it responded immediately to every gesture.
A rechargeable stylus is included with the laptop, the “HP Rechargeable MPP2.0 Tilt Pen.” (MPP is short for Microsoft Pen Protocol.) It’s round with one flat edge that connects to the left side of the laptop with magnets. That flat side also has two customizable buttons.
The Spectre’s palm rejection worked pretty well, and the stylus worked well with both tilting and shading in supported applications. In Paint 3D, using the crayon tool required extra pressure for a deep hue, just like the real thing. I do wish, like some of Microsoft’s styluses, that HP would add an eraser to the end.
HP claims it lasts 30 hours on a charge. When you slide up the top of the stylus, a USB-C port is revealed, which is a neat addition. A ring light on the very top tells you its charging status.
Audio on the HP Spectre x360 14
HP’s collaboration with Bang & Olufsen has produced winning laptop audio for a while now, and the Spectre x360 14 is no exception.
These speakers get loud — too loud, even. As I listened to Spotify, I turned the volume down as Fall Out Boy’s “Bob Dylan” stormed through my apartment. The drums, vocals and guitars were clear. Bass was a bit quiet. I tried changing that manually in the Bang & Olufsen control center, but to little effect. There are presets in that app, but I found most of them to be overkill.
Upgrading the HP Spectre x360 14
Unfortunately, HP has made upgrades and repairs to the Spectre x360 14 more difficult for the average person than they need to be.
There are two visible Torx screws on the underside of the laptop, but underneath the rear rubber foot, there are four more Phillips head screws. The feet are applied with adhesive and could rip when you remove them. HP makes extras available to authorized repair shops.
If you did get into the laptop, per the maintenance manual, you would find that while the RAM is soldered down, the SSD, Wi-Fi module and battery are user replaceable.
Battery Life on the HP Spectre x360 14
Like most laptops with OLED screens, the Spectre x360 14’s battery life isn’t exceptional. It will last you most of the day, but you’ll want to bring the braided USB Type-C charger with you.
On our test, which continuously has laptops browse the web, run OpenGL tests and stream video over Wi-Fi at 150 nits, the Spectre ran for 7 hours and 14 minutes. A non-OLED version with a 1920 x 1280 screen ran for 12:11, should you value battery life over image quality.
The Dell XPS 13 2-in-1 lasted 10:52, while the ZenBook Flip S (also with OLED) ran for 8:11. The MacBook Pro with Apple’s M1 processor lasted the longest at a whopping 16:32.
Heat on the HP Spectre x360 14
We took skin temperature measurements on the 14-inch Spectre x360 while running our Cinebench R23 stress test.
The center of the keyboard measured 34.8 degrees Celsius (94.64 degrees Fahrenheit), while the touchpad was a cooler 29.4 degrees Celsius (84.92 degrees Fahrenheit).
The hottest point on the bottom was 47.1 degrees Celsius (116.78 degrees Fahrenheit).
Webcam on the HP Spectre x360 14
It’s a shame this beautiful, high-resolution screen wasn’t paired with a beautiful, high-resolution webcam. Like most laptop cameras, the Spectre x360’s is still stuck at 720p.
An image I took at my well-lit desk was color accurate, catching my navy shirt, blue eyes and the mixed shades of brown in my hair. But overall, the picture was grainy, and light coming in from some nearby windows was blown out.
On the bright side, the camera works with Windows Hello for facial login. While there’s also a fingerprint reader on the keyboard, the IR camera is the easier way to log in when the device is in tablet mode.
Software and Warranty on the HP Spectre x360 14
While the Spectre x360 is undoubtedly a premium device, it has the kind of bloat you would expect from some budget machines.
HP has a lot of its own software. I wish it would combine more of these disparate programs into the main app, HP Command Center, which lets you make performance adjustments based on temperature and sound and also lets you decide which software gets network priority.
There are separate pieces of software for choosing among different display modes, switching between headphone and speakers, changing HP telemetry settings and adjusting the buttons on the stylus. There’s also HP Quick Drop to move files between your phone and the laptop.
On top of all that, there is MyHP, which gives you your serial number and is otherwise filled in with some fairly vapid tips for using your PC. HP has also added LastPass, ExpressVPN, Netflix, trials of Adobe software and a promotion with Dropbox for new users to get 25GB of free space. There’s also a suite of McAfee software, including McAfee LiveSafe, Personal Security and File Lock.
Amazon Alexa is also preinstalled, which may be divisive. It sure is more useful than Cortana. Either way, it’s not actively listening. Instead, you have to sign in to your Amazon account.
Of course, there’s also some bloatware that’s included in most Windows 10 installs, like Hulu, Roblox and Hidden City: Hidden Object Adventure.
HP sells the Spectre x360 14 with a 1-year warranty.
HP Spectre x360 14 Configurations
We tested the Spectre x360 14 with an Intel Core i7-1165G7, 16GB of RAM, a 1TB SSD with 32GB of Intel Optane memory and a 3000 x 2000 OLED display. It comes in black and costs $1,699.99 at Best Buy as of this writing.
HP sells other configurations on its own website, starting at $1,219.99 with an Intel Core i5-1135G7, 8GB of RAM, a 256GB SSD with 16GB of Intel Optane memory and a 1920 x 1280 touchscreen. Changing to black or blue adds $10 to the price, and for more money, you can go up to 2TB of storage (up to an extra $320).
Bottom Line
The HP Spectre x360 14 is the best 2-in-1 laptop you can get right now. The 3:2 display highlights your work in laptop mode and is more natural than 16:9 or 16:10 screens in tablet mode. It offers solid performance, has a variety of ports, includes a stylus and has an excellent keyboard.
If battery life is your priority, the OLED screen won’t do you any favors, but the 1920 x 1280 model might be more your speed. The MacBook Pro with M1, a clamshell alternative, is top of the class in endurance. If you prefer a smaller footprint, the Dell XPS 13 2-in-1 9310 is still very good, though it has fewer ports and a 16:10 screen rather than 3:2.
But the Spectre x360 14 largely has it all, making this one easy to recommend if you’re willing to pay a premium price.