Arm this week announced Armv9, its latest instruction set architecture, which will power a broad range of processors and systems-on-chip launching in the coming years. The new ISA promises to enable designers to build SoCs with multiple special-purpose accelerators for artificial intelligence (AI), machine learning (ML), digital signal processing (DSP), and security. Arm believes that SoCs and CPUs with specialized accelerators will better address existing and upcoming computing challenges.
Arm’s Armv8 ISA, unveiled a decade ago, introduced 64-bit instructions, advanced SIMD instructions, cryptography extensions, virtualization, AMBA 5 CHI (coherent hub interface), and a number of other features. To a large degree, Armv8 was a development of the general-purpose processor paradigm, which implies that a CPU should perform all the tasks a system needs. While this concept has been around for a long time and will continue to exist for many reasons, special-purpose accelerators such as those used for AI/ML, graphics processing, IoT, and DSP are not only faster, but also more energy efficient. To that end, Arm decided to build an ISA that is not only generally fast, but also better suited to heterogeneous SoCs with accelerators aimed at all types of applications, from IoT to servers.
It is noteworthy that Arm expects CPUs based on its Armv9 instruction set architecture to offer a more than 30% performance increase over the next two generations of mobile and infrastructure processors (codenamed Matterhorn and Makalu), so performance of PCs, servers, smartphones, and other Arm-powered devices will continue to increase at a rather rapid pace.
But the addition of heterogeneous processing capabilities will offer even more substantial gains, according to Arm. To make heterogeneous SoCs more robust, Arm promises new methods to increase frequency, bandwidth, and cache size, and reduce memory latency to amplify the performance of Armv9-based CPUs.
“The increasing complexity of use cases from edge to cloud cannot be addressed with a one-size-fits-all solution,” said Henry Sanders, corporate vice president and chief technology officer, Azure Edge and Platforms at Microsoft. “As a result, heterogeneous compute is becoming more ubiquitous, requiring greater synergy among hardware and software developers.”
One of the key features of Arm’s v9 ISA is the Confidential Compute Architecture (CCA), which protects portions of code and data from access or modification while in use by performing computations in a hardware-based secure environment. Arm’s CCA relies on so-called Realms that are separated from both the secure and non-secure worlds. To some degree, Realms can be compared to the sandboxes used by software, though Realms use both software and hardware resources. Realms will be useful not only for client devices such as PCs and smartphones, but for servers and edge computing devices as well.
“A good example of this synergy between hardware and software are the ArmV9 confidential compute features which were developed in close collaboration with Microsoft,” added Sanders. “Arm is in a unique position to accelerate heterogeneous computing at the heart of an ecosystem, fostering open innovation on an architecture powering billions of devices.”
In addition, to address demanding AI/ML and DSP workloads, Arm teamed up with Fujitsu to design Scalable Vector Extension 2 (SVE2) technology for Armv9. Fujitsu’s custom Arm processors used in the Fugaku supercomputer already support SVE instructions. Going forward, Arm intends to add ‘substantial enhancements in matrix multiplication within the CPU,’ a similar approach to the one Intel will take with the AMX technology featured in its upcoming Sapphire Rapids CPUs.
“The launch of the Armv9 architecture signals a new era for our company; a globally-pervasive platform driving secure AI-driven computing that will enable our ecosystem of more than 1,000 partners well into the 2030s,” said Simon Segars, CEO of Arm. “The Armv9 roadmap contains multiple new elements addressing the need for specialized compute from the smallest sensor to the largest supercomputer, but none are as important as the need to secure the world’s data.”
To date, numerous companies have already announced support for Armv9, including Google, Foxconn, Microsoft, Nvidia, NXP, Marvell, Renesas, Unity, Samsung, Siemens, Volkswagen, Zoom, and others.
Microsoft has started testing a new version of its Edge browser on Xbox consoles. The software giant provided access to the Chromium version of Edge to Xbox Insiders earlier this month, offering an early look at the improved browsing capabilities coming to the Xbox One and Xbox Series X / S. I’ve had a chance to try out this early version over the past few days, and I’ve been able to test Discord, Stadia, and other web services running inside Edge on the Xbox. It’s like having the full version of Edge from PC running on your TV.
The Xbox version of Edge looks almost identical to the one you can find on PC or Mac right now. It even includes features like vertical tabs and Collections. Like Edge on PC and mobile, the Xbox version also syncs all your settings, favorites, tabs, and web history.
Extension support is the only big feature that’s really missing right now. I’m not sure if this is a general restriction with the Xbox version, or whether Microsoft might implement it once this Chromium version is ready to release. Either way, if you try to add a Chrome or Edge extension it will fail.
The big reason you might want to use this new version of Edge on the Xbox is the greatly improved web compatibility, which allows services like Discord, Skype, or even Google Stadia to run on the Xbox version of Edge. Discord will let you join voice calls and participate in text channel chats, but microphone support isn’t there just yet. This is a really early version, so it’s likely that it will be supported eventually. Likewise, if you switch to another game or app, Discord calls in the Edge browser do not continue in the background. This may also change before this Edge update is broadly available.
Google Stadia works really smoothly. I’ve been able to stream multiple games using the service, and the Xbox controller is automatically detected and supported in games. I’ve also tried to use Nvidia’s GeForce Now streaming service, but Nvidia appears to be blocking the Edge user agent string, and there are no developer tools or extensions that will allow me to spoof the Chrome user agent.
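A service can gate browsers on the User-Agent header the browser sends with each request. The sketch below is hypothetical — it is not Nvidia’s actual check — but it illustrates the mechanism: Chromium-based Edge sends a UA string containing both a `Chrome/` token and a distinguishing `Edg/` token, so a server can reject Edge while accepting plain Chrome, and only spoofing the UA would get around it.

```python
# Hypothetical sketch of user-agent gating. Edge's UA string carries an
# extra "Edg/" token alongside the usual "Chrome/" token, which makes it
# easy for a server to single out and block.

def is_browser_allowed(user_agent: str) -> bool:
    """Allow Chrome but reject Edge, based purely on the UA string."""
    if "Edg/" in user_agent:        # Edge's distinguishing token
        return False
    return "Chrome/" in user_agent  # plain Chrome/Chromium passes

edge_ua = ("Mozilla/5.0 (Windows NT 10.0) AppleWebKit/537.36 "
           "(KHTML, like Gecko) Chrome/91.0.4472.77 Safari/537.36 "
           "Edg/91.0.864.41")
chrome_ua = ("Mozilla/5.0 (Windows NT 10.0) AppleWebKit/537.36 "
             "(KHTML, like Gecko) Chrome/91.0.4472.77 Safari/537.36")

print(is_browser_allowed(edge_ua))    # False
print(is_browser_allowed(chrome_ua))  # True
```

Desktop browsers can sidestep this kind of check by overriding their UA string in developer tools, which is exactly what the Xbox version of Edge currently lacks.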
Elsewhere, I’ve also tested out Office web apps in this Xbox version of Edge. They work as reliably as you’d expect, and you can even hook up a keyboard to the Xbox and type away. Unfortunately, mouse support isn’t available in this Edge browser yet. That appears to be part of a broader restriction on Xbox apps accessing a mouse on Microsoft’s consoles, so it’s not clear if this will be fully supported in the future.
Edge on Xbox is currently based on Chromium 91, which is expected to debut on desktop versions of Edge in May. Microsoft hasn’t revealed when it plans to release this Xbox version, though.
This Edge browser is already a big improvement over the legacy version that exists on Xbox today. Full sync support, web compatibility, and just the general interface is greatly improved. While Xbox typically gets dedicated streaming apps for most services, this Edge update will be useful for many who want to access everything the web has to offer.
Apple’s built-in web content filter for iOS and macOS has apparently been blocking any searches containing the word “Asian” for almost a year, and the error is finally getting fixed, according to a new report from Mashable. The issue shows up when you’re using a device that has the “Limit Adult Websites” content restriction turned on. But it’s not just blocking adult websites; it’s blocking almost any searches relating to Asian culture, including “Asian market,” “Asian culture,” and even “stop Asian hate.”
The Verge was able to confirm that the above searches were blocked on devices running current versions of Apple’s software (iPadOS 14.4.1 and macOS 11.2.3) with the content restrictions turned on. The searches are blocked on both Safari and Chrome on iOS devices, but Chrome was able to perform them on macOS. Devices without the content restrictions turned on can perform the searches and receive results.
The good news is that the issue should soon be resolved. The searches work on devices running the iOS 14.5 beta, Mashable reports. Still, it’s ridiculous that it took this long to fix: there are reports of the issue from February 2020, and it shouldn’t have been an issue in the first place. Of course, creating a filter for adult content is extremely difficult, but almost entirely blocking a word used to describe 60 percent of the world’s population is obviously incorrect.
In current, non-beta software, the filter behaves very strangely. “Asian restaurants” is blocked on macOS but not iOS, while “Asian food” is blocked on both. And while the issue has been around for a while, it’s especially embarrassing for Apple in the current moment, when anti-Asian sentiment and violence have been on the rise. It’s good to see it should be solved in the next versions of iOS, but it shouldn’t have taken this long.
Amazon is reportedly working on custom silicon chips for its hardware network switches, according to The Information. The chips, which could help Amazon improve its internal infrastructure as well as AWS, are said to be the result of Amazon’s $350 million acquisition of Israeli chipmaking firm Annapurna Labs back in 2015.
Amazon building the silicon for its switches could help it solve bottlenecks and issues in its own infrastructure, especially if it’s also custom-building the software that runs on them. Amazon already builds its own switches, but it’s reliant on Broadcom for the silicon that powers them. It makes sense for the company to want to completely control those machines, especially given how important its web services business is. It’s possible Amazon could even offer some services it wasn’t able to offer before, powered by the new switches, The Information suggests, citing the machine learning software that Amazon offers that currently runs on Annapurna chips.
The reported chips aren’t Amazon’s first custom silicon product. The company has previously worked with MediaTek to create a chip for its Echo smart speaker products, designed to make Alexa respond faster. It also has its custom machine learning chips it calls Trainium, which will become available to AWS customers soon. It makes sense that the company would also want its own chips to power the backbone of its network.
Custom chip design is also quickly becoming table stakes for big tech companies. We’ve recently seen reports that Google and Microsoft are building custom processors for their devices, and Apple has been doing it for years now, with the M1 Macs being an example of how much more potent a company’s products can be when it controls both the hardware and software. While Amazon may not be a titan in the smartphone or PC space, it’s a giant in the cloud, and that’s where it seems to be focusing on putting new chips to work.
Chip designer Arm has announced Armv9, its first new chip architecture in a decade following Armv8 way back in 2011. According to Arm, Armv9 offers three major improvements over the previous architecture: security, better AI performance, and faster performance in general. These benefits should eventually trickle down to devices with processors based on Arm’s designs.
It’s an important milestone for the company, whose designs power almost every smartphone sold today, as well as increasing numbers of laptops and even servers. Apple announced its Mac computers’ transition to its own Arm-based processors last year, and its first Apple Silicon Macs released later in the year. Other manufacturers like Microsoft have also released Arm-based laptops in recent years.
First of the big three improvements coming with Armv9 is security. The new Arm Confidential Compute Architecture (CCA) attempts to protect sensitive data with a secure, hardware-based environment. These so-called “Realms” can be dynamically created to protect important data and code from the rest of the system.
Next up is AI processing. Armv9 will include Scalable Vector Extension 2 (SVE2), a technology that is designed to help with machine learning and digital signal processing tasks. This should benefit everything from 5G systems to virtual and augmented reality and machine learning workloads like image processing and voice recognition. AI applications like these are said to be a key reason why Nvidia is currently in the process of buying Arm for $40 billion.
But away from these more specific improvements, Arm also promises more general performance increases from Armv9. It expects CPU performance to increase by over 30 percent across the next two generations, with further performance boosts coming from software and hardware optimizations. Arm says all existing software will run on Armv9-based processors without any problems.
With the architecture announced, the big question is when the processors using the architecture might release and find their way into consumer products. Arm says it expects the first Armv9-based silicon to ship before the end of the year.
(Pocket-lint) – Huawei’s second generation of its foldable smartphone comes in the form of the Mate X2, while Xiaomi’s foldable phone is called the Mi Mix Fold.
Both follow a similar design to the Samsung Galaxy Z Fold series after Huawei changed the format of the folding device from its predecessors – the Mate X and Xs – moving from a foldable display on the outside to an inward-folding display.
If you’re in the market for a vertically folding smartphone, here is how the Xiaomi Mi Mix Fold, Huawei Mate X2 and the Samsung Galaxy Z Fold 2 compare.
Design
Mi Mix Fold: 173.3 x 133.4 x 7.6mm unfolded / 173.3 x 69.8 x 17.2mm folded / 317g (Black), 332g (Ceramic)
Mate X2: 161.8 x 145.8 x 4.4-8.2mm unfolded / 161.8 x 74.6 x 13.6-14.7mm folded / 295g
Z Fold 2: 159.2 x 128.2 x 6.9mm unfolded / 159.2 x 68 x 16.8mm folded / 282g
The Xiaomi Mi Mix Fold has a vertical folding display in a book-style design, like the Huawei Mate X2 and Galaxy Z Fold 2. It comes with a glass or special edition ceramic back and there is a prominent camera housing in the top left corner. There’s a metal frame, large display and a single punch hole camera in the top right corner when folded.
When unfolded, the Mi Mix Fold has an 8.01-inch display and a precision based hinge. There is a small gap when folded though, like the Samsung Galaxy Z Fold 2. It comes in Black or Special Edition Ceramic finishes.
The Huawei Mate X2 meanwhile, also features a vertical folding display. There’s a glass rear with a prominent rectangular camera housing in the top left corner, a metal frame and a full display with dual cut-out cameras on the front when folded.
When unfolded, the Mate X2 has a large 8-inch screen. The hinge is multi-dimensional, creating a water-drop-shaped cavity for the display when the phone is folded, allowing for no gap at all when shut – a different design to the Mi Mix Fold and Galaxy Z Fold 2. There’s also a wedge-like profile that is just 4.4mm at its slimmest point. It comes in White, Black, Crystal Blue and Crystal Pink colours.
The Samsung Galaxy Z Fold 2 has a similar form to the Huawei Mate X2 and Xiaomi Mi Mix Fold in that it offers a vertical fold in a book-style design. It too has a glass rear, with a rectangular camera housing in the top left corner, as well as a metal frame. Like the Mi Mix Fold, it has a single, punch hole camera on the front when folded but it is centralised rather than positioned to the right.
When unfolded, the Galaxy Z Fold 2 has a slightly smaller 7.6-inch display than the Huawei Mate X2 and Xiaomi Mi Mix Fold. Its hinge allows for multiple viewing angles but there is a slight gap at the fold when the device is closed. It comes in Mystic Bronze and Mystic Black colours.
Display
The Xiaomi Mi Mix Fold has a 6.5-inch AMOLED display on the front when folded, with a 2520 x 840 resolution and a pixel density of 409ppi. It has a 27:9 screen ratio and a 90Hz refresh rate. There’s a punch-hole camera in the top right, and the bezels are slightly larger than those of the Huawei and Samsung alternatives.
When unfolded, the Mi Mix Fold has an 8.01-inch display with a WQHD+ resolution, 1 billion colours and a 4:3 aspect ratio. It has a 60Hz refresh rate.
The Huawei Mate X2 has a 6.45-inch OLED display with a resolution of 2700 x 1160 and a pixel density of 456ppi on the front when folded, making it fractionally smaller than the Mi Mix Fold, though with slimmer bezels. It features a 21:9 aspect ratio and a 90Hz refresh rate. As mentioned above, the X2 has dual punch-hole front cameras in the top left of the display.
When unfolded, the Mate X2 has an 8-inch OLED display with a 2480 x 2200 resolution, which results in a pixel density of 413ppi. The unfolded display has a ratio of 8:7.1. It too has a 90Hz refresh rate.
The Samsung Galaxy Z Fold 2 has a 6.23-inch external display, making it slightly smaller than the Mate X2 and Mi Mix Fold. It too is an AMOLED panel and it offers a resolution of 2260 x 816 pixels and an aspect ratio of 25:9.
When unfolded, the Galaxy Z Fold 2 has a 7.6-inch internal display – the smallest of the three devices compared here. It’s a Dynamic AMOLED panel with a 2208 x 1768 resolution, resulting in a pixel density of 372ppi. It also has a 120Hz refresh rate and supports HDR10+.
Cameras
Mi Mix Fold: Triple rear camera (108MP + Liquid Lens 8MP + 13MP), 20MP front
Mate X2: Quad rear camera (50MP+16MP+12MP+8MP), 16MP front
Z Fold 2: Triple rear (12MP+12MP+12MP), 10MP front
The Xiaomi Mi Mix Fold has three lenses on its rear, with a 108-megapixel main sensor with 2.1µm pixels and a 7P lens, along with a 13-megapixel ultra wide angle lens with a 123-degree field of view.
There is also an 8-megapixel liquid lens on the rear that uses the principle of human eye bionics and a special chip created by Xiaomi to change the radius of curvature of the spherical surface. It allows the one lens to cover two functions, enabling 3x optical zoom, up to 30x digital zoom and a minimum focus distance of 3cm. The front camera on the Mi Mix Fold is a 20-megapixel snapper.
The Huawei Mate X2 has a quad camera on the rear, which features Leica technology, like Huawei’s other flagship smartphones. The camera setup includes a 50-megapixel main sensor, 16-megapixel ultra-wide angle sensor, 12-megapixel telephoto sensor, and an 8-megapixel SuperZoom sensor.
The main sensor has an f/1.9 aperture and OIS, the ultra-wide sensor has an f/2.2 aperture, the telephoto sensor has an f/2.4 aperture and OIS with 3x optical zoom, while the SuperZoom sensor has an f/4.4 aperture, OIS and 10x optical zoom. The front camera is a 16-megapixel wide-angle unit with an f/2.2 aperture.
The Samsung Galaxy Z Fold 2 has a triple rear camera, comprised of a 12-megapixel main camera, 12-megapixel telephoto sensor and 12-megapixel Ultra-wide sensor.
The main camera has an f/1.6 aperture, dual pixel phase-detection autofocus and OIS, the telephoto lens has an aperture of f/2.4 and OIS and the ultra-wide sensor has an aperture of f/2.2. There is also a 10-megapixel front camera.
Hardware
The Xiaomi Mi Mix Fold runs on Qualcomm’s Snapdragon 888 chipset. It’s a 5G handset and it comes with 12GB of RAM and either 256GB or 512GB of storage. The Special Edition Ceramic model has 16GB of RAM and 512GB of storage though.
There’s a 5020mAh battery under its hood that supports Xiaomi’s 67W fast charging. The software is MIUI 12, based on Android 10, and there are features like a one-click option to shut down things like GPS for security, as well as a Desktop Mode activated with a three-finger swipe.
The Huawei Mate X2 runs on Huawei’s own 5nm Kirin 9000 platform. It too is a 5G device. It is supported by 8GB of RAM and it comes in 256GB and 512GB storage variants.
It has a 4500mAh battery under the hood that supports Huawei’s 25W SuperCharge. Huawei’s own Harmony OS can be installed over the company’s usual EMUI interface running on top of Android.
The Samsung Galaxy Z Fold 2 runs on Qualcomm’s Snapdragon 865 Plus chipset, supported by 12GB of RAM and it comes in 256GB and 512GB storage options.
There’s a 4500mAh battery running the Fold 2, which supports 25W wired charging, 11W wireless charging and 4.5W reverse wireless charging. It runs Android with Samsung’s One UI over the top and there are some great multi-tasking features that make great use of the screen when unfolded.
Price
Mi Mix Fold: Equivalent of £1105/$1550, China
Mate X2: Equivalent of £1985/$2785, China
Z Fold 2: £1799, $1999
The Xiaomi Mi Mix Fold costs RMB 9,999, 10,999 or 12,999, starting at the equivalent of $1550 or £1105. It will be available in China from 15 April.
The Huawei Mate X2 costs RMB 17,999 or 18,999, starting at the equivalent of $2785 or £1985. It is available in China only for now.
The Samsung Galaxy Z Fold 2 costs £1799 in the UK and $1999 in the US.
Conclusion
The Xiaomi Mi Mix Fold and Huawei Mate X2 are only available in the Chinese market at the moment, and while the Mate X2 is a little more expensive than the Samsung Galaxy Z Fold 2, the Mi Mix Fold is cheaper.
On a spec-by-spec comparison, these three devices are similar and while the Xiaomi Mi Mix Fold tips the scales in several areas, the other two devices have their fair share of wins too.
The Xiaomi Mi Mix Fold has larger interior and exterior displays than both the Huawei and the Samsung. It also offers the most advanced Qualcomm chipset, the largest battery of the three devices and the fastest wired charging, and it has an interesting camera setup with its liquid lens.
The Mate X2 arguably has a more streamlined design, an extra camera on the rear and a wide-angle front camera. It also has only fractionally smaller displays than the Mi Mix Fold.
The Galaxy Z Fold 2 has more RAM than the Mate X2 but the same as the Mi Mix Fold. It has a higher refresh rate on the internal display, though, and it offers a more user-friendly software experience, as well as supporting Google services – something the Mate X2 does not offer. The Z Fold 2 has fewer cameras than the Huawei, but its triple rear camera delivers great results.
The Samsung is more widely available than both the Mate X2 and the Mi Mix Fold, so while the Mi Mix Fold and Mate X2 might win in some specification areas, you’ll need to live in China to get your hands on them for now. It’s also worth remembering the Galaxy Z Fold 3 is tipped for a July 2021 launch.
Yesterday, AMD released a new Adrenalin driver to the public, version 21.3.2, with support for several new titles including Dirt 5, along with several bug fixes. Specifically, driver 21.3.2 adds support for Dirt 5’s new DirectX Raytracing (DXR) update.
Dirt 5 originally launched late last year, and Codemasters worked with AMD on the title. Not long after launch, AMD provided the press with early access to a beta DXR branch of the game, with the promise that DXR support would eventually get rolled into the public build. It took longer than expected, but with the latest update you can now try Dirt 5’s ray tracing feature on AMD’s current RX 6000 series GPUs. (It also works with Nvidia RTX GPUs.) We’re planning a more extensive look at the state of ray tracing in games in the coming weeks, both to see how much DXR impacts performance and to see how much ray tracing improves the look of various games.
AMD added support for the new Outriders RPG and Evil Genius 2: World Domination as well. There’s no indication of major performance improvements or bug fixes for those games, but the latest drivers are game ready.
Bug Fixes
Besides the above, here are the five bugs squashed in this update:
The Radeon RX 6700 will no longer report incorrect clock values in AMD’s software.
Shadow corruption is fixed in Insurgency: Sandstorm when running on RX 6000 series hardware.
There is no longer an issue where the desktop resolution in Windows may change when turning a monitor off then back on again.
The start and cancel buttons should no longer disappear when resizing the Radeon Software.
You should no longer get a black screen when enabling Radeon FreeSync and setting a game to borderless fullscreen/windowed mode on RX 6000 series GPUs.
AMD (via Kepler_L2) released a new Linux patch that exposes the cache configuration for its Navi 21, Navi 22 and Navi 23 silicon. The last is rumored to power the chipmaker’s upcoming Radeon RX 6600 series (or maybe RX 6500 series).
The description for the patch reads: “The L1 cache information has been updated and the L2/L3 information has been added. The changes have been made for Vega10 and newer ASICs. There are no changes for the older ASICs before Vega10.” Therefore, it holds a ton of valuable information on both existing and future AMD products.
Introduced with RDNA 2, Infinity Cache basically acts as a big L3 cache that’s accessible by the GPU. It’s there to help improve performance since AMD’s RDNA 2 graphics cards employ relatively narrow memory interfaces. The Radeon RX 6800 XT for example uses a 256-bit bus, but manages to mostly keep pace with the GeForce RTX 3080’s 320-bit bus that also includes higher clocked GDDR6X memory.
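A back-of-envelope calculation shows the raw bandwidth gap that Infinity Cache helps bridge. This is a sketch using the commonly quoted per-pin data rates for these cards — 16 Gbps GDDR6 and 19 Gbps GDDR6X — which are not stated in the article:

```python
# Peak memory bandwidth: bus width in bits, divided by 8 to get bytes,
# multiplied by the per-pin data rate in Gbps.

def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

rx_6800_xt = peak_bandwidth_gbs(256, 16.0)  # GDDR6 at 16 Gbps/pin
rtx_3080   = peak_bandwidth_gbs(320, 19.0)  # GDDR6X at 19 Gbps/pin
print(rx_6800_xt)  # 512.0 GB/s
print(rtx_3080)    # 760.0 GB/s
```

On raw bandwidth alone the RTX 3080 has roughly a 50% advantage, which is the deficit the large on-die cache is there to absorb by keeping frequently used data off the memory bus entirely.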
Navi 21 (Sienna Cichlid) and Navi 22 (Navy Flounder) sport 128MB and 96MB of Infinity Cache, respectively. According to the new information, Navi 23 will wield 32MB of Infinity Cache. In comparison to Navi 22, we’re looking at a 66.7% reduction on Navi 23. That should also help cut down the die size, though at the cost of performance.
The jury is still out on whether AMD will use Navi 23 for the Radeon RX 6600 series, though. Some think that Navi 23 could find its way into the Radeon RX 6500 series instead. Regardless, AIDA64, a popular monitoring and diagnostics tool, recently received support for the Radeon RX 6600 series. Assuming that the software’s release notes are accurate, the Radeon RX 6600 XT and RX 6600 will indeed be based around the Navi 23 die.
ASRock registered a couple of Radeon RX 6600 XT models with the Eurasian Economic Commission (EEC) not so long ago. It’s important to highlight that not every product makes it to the market, but if what ASRock submitted is true, the Radeon RX 6600 XT may feature 12GB of GDDR6 memory. Realistically, it makes more sense for the Radeon RX 6600 XT to have 8GB of GDDR6 memory across a 128-bit memory interface.
The fact that AIDA64 already supports the Radeon RX 6600 series hints that a potential launch may not be too far around the corner. We’re still waiting for a trimmed down Radeon RX 6700 using Navi 22, which we expect to see some time in April.
Despite still being in its relative infancy, Sony’s PlayStation 5 games console is already serving up some stunning gaming experiences. The shift up to more consistent true 4K graphics at both 60Hz and, remarkably, 120Hz is joining forces with wider, better use of high dynamic range and the impressive efforts of Sony’s new 3D Audio sound system to make gaming worlds more immersive and beautiful than ever before.
However, getting the most out of this next-gen console isn’t just a case of plugging the PS5 into your TV and expecting everything to just turn out fine. In fact, between the secondary kit you might need and some of the PS5 set-up tricks you need to familiarise yourself with, getting the maximum impact out of your new console is anything but straightforward.
With this in mind, we’ve put together a comprehensive checklist of everything you need to do if you want to be sure you’re getting the full value from Sony’s new gaming beast. Starting with potentially the most expensive…
Get the right television
The single biggest source of trouble when it comes to the PS5’s new graphics capabilities is the currently messy television market – or more precisely, the confusing world of HDMI connections.
Getting the best picture quality (4K resolution at 120Hz refresh rates with HDR and, following an upcoming update, support for variable refresh rates) out of the PS5 requires a TV’s HDMI ports to support data rates of at least 32Gbps, and that’s something the vast majority of current TVs cannot do.
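To see why high-refresh HDR modes need so much link bandwidth, multiply pixels per frame by refresh rate by bits per pixel. This is standard video arithmetic rather than anything from Sony’s spec sheet, and it ignores blanking intervals and link-encoding overhead, both of which raise the real on-wire figure:

```python
# Rough uncompressed data rate for a video mode. 8-bit RGB is 24 bits
# per pixel; 10-bit HDR RGB is 30. Blanking and encoding overhead are
# ignored here, so real HDMI requirements come out somewhat higher.

def video_gbps(width: int, height: int, hz: int, bits_per_pixel: int) -> float:
    return width * height * hz * bits_per_pixel / 1e9

print(round(video_gbps(3840, 2160, 60, 24), 1))   # ~11.9 Gbps (4K/60, 8-bit)
print(round(video_gbps(3840, 2160, 120, 30), 1))  # ~29.9 Gbps (4K/120, 10-bit HDR)
```

The jump from 4K/60 SDR to 4K/120 with 10-bit HDR colour is roughly 2.5x, which is why the latter needs HDMI 2.1-class ports rather than the 18Gbps ports fitted to most existing TVs.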
What’s more, there’s currently no easy labelling system to help you spot TVs that might be compatible with all of the latest gaming features. Even if a TV claims to be compatible with the latest 2.1 version of the HDMI input, that doesn’t guarantee 4K/120Hz/VRR compatibility. All you can do is try and trawl through a TV’s small print/detailed specs to see if 4K/120/VRR are included.
We can get the ball rolling, though, with some sets we already know support all the latest gaming features. For starters, all of LG’s OLED CX, GX, WX, ZX and upcoming C1, G1 and Z1 models feature four HDMIs with full PS5 compatibility. Samsung’s QLED models from 2020 and QLED and Neo QLED TVs for 2021 all have one or two HDMI ports that support all the PS5 features, with 2021 models from the Q95A series upwards carrying four PS5-friendly HDMIs.
At the time of writing, Samsung’s 2020 TVs aren’t able to play PS5 games in 4K 120Hz while retaining HDR. Samsung has stated, though, that this apparent ‘bug’ will be fixed by a PS5 firmware update.
Sony, ironically, has just one series in its 2020 range, the XH9005s, that supports all of the PS5’s graphics features, via a single HDMI port. Thankfully, more Sony models will carry the requisite HDMI support in 2021.
Philips and Panasonic haven’t so far launched any TVs with next-gen gaming features, but both brands are set to do so in 2021. Cheaper TVs (and brands) in the UK have so far not embraced next-gen gaming features, but hopefully some will do so this year.
One final point here is that, in theory, the PS5 can support 8K. So if you want to be ready for that, you will need an 8K TV. These are relatively expensive right now, and it doesn’t seem as though 8K PlayStation content is going to become common any time soon.
For more guidance here, check out our rundown of the best gaming TVs you can currently buy.
Make sure you use the right input on your TV
As noted in the previous section, on some TVs only one or two HDMIs have enough bandwidth to support all of the PS5’s graphics features. So make sure you have your PS5 connected to one that does.
Some TVs help with this by labelling the relevant HDMI(s) as Game or 4K/120, but otherwise, you will need to refer to your TV’s manual.
Use the provided PS5 HDMI cable (or pick a replacement carefully)
It’s not just HDMI sockets that need to be able to handle enough data to unlock all of the PS5’s features. HDMI cables also vary in how much data they can carry. So you should stick with the HDMI cable provided with the PS5 where possible, as this is designed to carry all the data the console needs for its maximum performance.
If you really must use a different cable – because the official cable isn’t long enough, for example – look for one that carries the official Ultra High Speed HDMI Cable certification that you can see in the image above.
Make sure your TV HDMI port is set up for high data rates
Most TVs now will automatically switch their HDMI ports to so-called ‘enhanced’ modes for high data rates when a 4K HDR source is detected. There are still some budget brands, though (Hisense, for instance) where you need to manually switch HDMIs from Standard to Enhanced in the TV’s menus. It’s certainly worth checking the settings on your TV for the HDMI that your PS5 is connected to.
Set your TV to Game mode
Almost all TVs have a special Game mode setting that reduces the time a TV takes to produce its images. This can make as much as 100ms of difference, which can feel like a lifetime in gaming terms. Your TV might automatically switch into Game mode when the PS5 is detected, but if response times matter with the game you’re playing, you should check that it has.
Note that Game mode settings can reduce some aspects of picture quality with some TVs. So if you’re playing a less reaction-based title, such as an RPG, you may prefer the overall picture quality with Game mode turned off.
Check your PS5’s Video Output screen
In the System Software section of the PS5’s System menu, there’s an option called Video Output Information. This brings up a screen telling you what graphics capabilities the console thinks your TV is capable of handling, based on its ‘handshake’ with your TV’s HDMI port. This screen is handy for checking that your console and TV are talking to each other as you’d expect.
This Video Output Info can be particularly useful if you’re trying to feed your PS5 through an intermediary audio device, such as a soundbar or an AV receiver, and on from that to your TV. Many people forget that the PS5 will read the capabilities of the intermediary device’s HDMIs and determine supported graphics output based on that, rather than reading what your TV is capable of. So unless your audio device has full HDMI 2.1 4K/120/VRR pass-through support (which is currently very rare), it could limit the graphics you experience.
The best way around this is to connect your PS5 directly to your TV, and then use your TV’s ARC/eARC HDMI jack (if it has one) to output digital sound from the TV to your audio equipment.
Setting up your PS5
The PS5 is proactive about HDR, prompting you to run through a trio of simple HDR set-up screens whenever you attach it to a new TV. The way the screens work, though, is rather questionable.
Before going through this HDR set-up, it’s worth checking whether your TV has a menu option called HGIG (HDR Gaming Interest Group) – if so, turn it on. This will make sure that your TV doesn’t try and apply its own automatic HDR optimisation (dynamic tone mapping) processes to pictures that you have already optimised via the PS5’s HDR set up system.
Once done, you can crack on with the console’s calibration, but you shouldn’t do exactly as you’re told. Two of the screens ask you to increase the console’s brightness/peak light levels to a point where you can only just see a relatively dark symbol against a white background. The other asks you to adjust the console’s black level to a point where a lighter symbol against a dark background remains only just visible. In fact, you should adjust each of these screens to the point where the visible symbol just disappears. In other words, the points at which the first square goes completely white and the second completely black are where you want to set the console.
Even then, not all games are designed to work with the PS5 console’s HDR set-up system, preferring instead to use their own internal HDR calibration screens. Examples of these titles include Dirt 5 and Assassin’s Creed: Valhalla. You should absolutely go through these game-specific calibration processes and it’s worth checking in these cases whether your TV’s HGIG setting (if it has one) is better switched on or off.
Another key aspect of gaming performance that requires care is frame rates. As with HDR, the PS5’s process for adjusting the frame rate a game uses varies from title to title. So with Dirt 5, the game’s own internal graphics options allow you to select whether you prefer to prioritise resolution or frame rates (there’s always a graphical trade-off associated with switching from 60Hz to 120Hz). With Call Of Duty: Black Ops – Cold War, however, you have to choose in the console’s menus whether you want to prioritise ‘Resolution’ or ‘Performance’ (frame rates) before booting the game if you want to get 120Hz.
This ‘Performance Mode or Resolution Mode’ option, confusingly, is found in the Game Presets section of the Saved Data and Game/App Settings submenu of the PS5 itself.
A further refresh rate option of some sort will likely be added when the PS5 is finally enabled for variable refresh rates.
Choose the right audio options
We’ll discuss the PS5’s 3D Audio gaming system shortly. First, though, we should note the mess concerning the PS5’s Dolby Atmos activation options. Specifically, there are two of them: one for streaming apps and a separate one for the built-in Blu-ray/4K Blu-ray player. The PS5 does not support Dolby Atmos for games.
The first Dolby Atmos option appears in the System/Sound menu, under Audio Output. Scroll right to the bottom of this page and you’ll see an Audio Format (Priority) option, which will be set to Linear PCM by default. You can choose Bitstream (Dolby) or Bitstream (DTS) instead if you prefer.
However, when you try to play a 4K or HD Blu-ray disc with a Dolby Atmos soundtrack, the console still does not output Dolby Atmos. To make it work, you need to press the Options button on your PS5 controller while playing a film disc, then click the ‘three dots’ icon and choose the Bitstream option under Audio Format.
Unlike Microsoft with its latest Xbox consoles, Sony has decided not to use Dolby Atmos for its premium game audio experience. Instead, it has developed its own ‘Tempest’ 3D Audio system. It’s up to individual developers whether and how they deploy 3D Audio, but notable titles to use ‘full-on’ versions of it include Spider-Man: Miles Morales and Demon’s Souls.
At the time of writing, the new 3D Audio system can’t be output to any external multi-channel home theatre speaker/AVR system. Currently, it only works via headphones, though Sony has suggested that this will change at some point in the future.
To try out 3D Audio with headphones, first make sure that you have the Enable 3D Audio option in the Audio Output part of the Sound menu activated. Also, when you first use headphones with the PS5, be sure to check out the Adjust 3D Audio Profile option. This plays a ‘babbling brook’ test signal and asks you to pick which of five settings makes the sound feel most at ear level.
You don’t need special headphones to experience the 3D Audio effect – any wired pair will do the job once connected to the DualSense controller – but the quality of the headphones you use certainly impacts how effective 3D Audio sounds.
As you might expect, Sony’s own Pulse 3D wireless gaming headset, which has been designed for the PS5, is particularly effective – though at £90 ($100, AU$150) it certainly isn’t cheap. However, once you start using it that price actually starts to sound more than fair.
For starters, it’s able to deliver the 3D audio effect wirelessly; you don’t need to be tethered to the DualSense controller. It also carries nifty high-sensitivity microphones complete with noise-cancelling technology built into the main headset, rather than in the usual mic ‘arm’, as well as providing buttons for mixing the game sound and chat sound, and for monitoring your own voice.
The Pulse 3D is lightweight and reasonably comfortable, and it does an excellent job of getting both a precise and strikingly large sense of space from the 3D Audio system.
If you want a more luxuriously built wireless headphone option and you’d prefer a dedicated mic arm, Turtle Beach’s Stealth 700 Gen 2 (£130, $150, AU$250) could be up your street. Just bear in mind that while good-looking, great for chatting and more comfortable to wear for epic gaming sessions, they don’t sound quite as punchy as the Sony Pulse 3D models. They can’t be jacked into the DualSense controller when they run out of juice, either, but with an impressive 20 hours of battery life, that shouldn’t be a big problem. Plus you can use them while they’re charging.
If you’re on a tight budget, meanwhile, and don’t mind a wired rather than wireless headset, then the Xiberia V20D (£30) are good value.
For a few other possibilities, check out our Best Gaming Headsets 2021 feature.
Brace yourself
The number of things you need to think about and potentially invest in if you want to unlock the full capabilities of your PS5 is pretty intimidating. Rest assured, though, that Sony’s new console is more than capable of rewarding your effort and expense with truly next-gen thrills. Once you’ve experienced it in full, you’ll wonder how you ever lived without it.
MORE:
Read our full PlayStation 5 review
Considering your next-gen options? Here’s our Xbox Series X review
Check out our list of the best gaming TVs you can currently buy
Google is releasing a new experimental app today called Stack that borrows the technology underlying its powerful DocAI enterprise tool for document analysis. The end result is a consumer document scanner app for Android that Google says vastly improves over your average mobile scanner by auto-categorizing documents into the titular stacks and enabling full text search through the contents of the documents and not just the title.
“I joined Google a couple of years ago when my education startup, Socratic, was acquired. At Socratic, we used Google’s computer vision and language understanding to make learning easier for high school students. I wondered if we could apply the same technologies to make organizing documents easier,” said Christopher Pedregal, the team lead on Stack, in a statement.
Pedregal and his colleague Matthew Cowan joined Google’s Area 120 incubator program, where they came up with an app that could use DocAI and its artificial intelligence technology to improve the process of scanning receipts, bills, and other important documents. The app uses Google’s biometric authentication on Android, so you can secure sensitive documents behind face or fingerprint scanning to unlock the software. It also automatically creates fields for scanned bills so you can fill in due dates and other important info.
The app, like so many of Google’s experimental (and sometimes even not-so-experimental) efforts, is getting released in the hopes it catches on and not with any real concrete business model attached or a definite roadmap. Pedregal stresses that “it’s early days” for Stack, which means the app can “still get things wrong.” It also means it could, at a moment’s notice, get sent to the Google Graveyard — though, presumably, with the option to export your documents if that does happen at some point in the future.
That said, it seems like a powerful alternative to a normal document scanner, and few companies are better than Google at understanding text and recognizing images. So it’s certainly worth a shot if you’re in the market for a better way to organize your real-world paper.
The ASRock Z590 Steel Legend WiFi 6E is an inexpensive yet capable Rocket Lake board that should handle any ambiently cooled CPU you can throw at it. It packs integrated Wi-Fi 6E, three M.2 sockets and six SATA ports. This roughly $210 board looks to be a well-rounded option to jump into Intel’s latest platform.
For
+ Wi-Fi 6E
+ Three M.2 sockets
+ Capable 14-Phase, 50A Power Delivery
+ Reasonable price for Z590
Against
– Only six rear USB ports
– Mediocre audio codec
– Appearance may not be for everyone
Features and Specifications
Editor’s Note: A version of this article appeared as a preview before we had a Rocket Lake CPU to test with Z590 motherboards. Now that we do (and Intel’s performance embargo has passed), we have completed testing (presented on page 3) with a Core i9-11900K and have added a score and other elements (as well as removing some now-redundant sentences and paragraphs) to make this a full review.
In our first close look at Z590 motherboards, ASRock’s Z590 Steel Legend leads the way. The Steel Legend SKUs have been around for a couple of generations now and are typically a lower-priced option in ASRock’s product stack. But just because the price is lower doesn’t mean the features are sparse. The new Z590 Steel Legend WiFi 6E brings the latest in Intel Wi-Fi, solid power delivery, 2.5 Gb Ethernet and more, all for around $210. If you’re after a reasonably affordable Z590 option, it may just be one of the best motherboards for your next build.
ASRock’s Z590 lineup is similar to the previous-generation Z490 stack. At the time we wrote this article, the ASRock site had 12 Z590 motherboards listed. At the top is the Z590 Taichi, followed by the PG Velocita and three Phantom Gaming boards, including a Micro-ATX option. Additionally, there are two professional boards in the Z590 Pro4 and Z590M Pro4, two Steel Legend boards, two Extreme boards (also on the budget end), and a Mini-ITX board rounding out the product stack. Between price, size, looks and features, ASRock should have a board that works for everyone looking to dive into Rocket Lake.
Now that we can talk about performance using Rocket Lake-based CPUs (in this case, the Core i9-11900K), the Steel Legend held its own in most of our tests. Where it lagged behind was in the long-running tests. By default, the Steel Legend follows Intel specifications, so you’ll see the board throttle back clock speeds as the PL1 and PL2 time limits expire. Simply raising the power limits allows it to compete with other boards that exceed Intel specifications out of the box.
On the overclocking front, after disabling AVX-512 (as we do with all other boards for overclocked testing) and raising all the power limits, the Z590 Steel Legend was able to run our i9-11900K at 5.1 GHz without issue. VRM temperatures were the hottest we’ve seen so far; however, they stayed well within the MOSFETs’ operating parameters.
The budget-friendly Steel Legend comes in two flavors: the base Steel Legend and the Steel Legend WiFi 6E that includes the latest Wi-Fi. The 6E version includes WiFi that uses the new 6 GHz band (as well as the existing 2.4 and 5 GHz bands) for faster performance on an uncluttered wavelength. Note you’ll need a 6E-capable router to utilize the additional bandwidth. The board also comes with 2.5 GbE, a 14-phase VRM, a USB 3.2 Gen2x2 Type-C port, reinforced slots, the ASRock graphics card holder and more. We’ll cover those features in detail below.
Opening up the retail packaging, we find the typical array of SATA cables, support DVD, screws and more. ASRock also includes an adjustable graphics card holder that connects to the motherboard and chassis. This is a sight for sore eyes as some of the latest generation video cards are bigger and heavier than previous versions and could use a little support. Below is a complete list of all extras inside the box.
Quick Installation Guide
Support CD
(2) SATA cables
(4) Screws for M.2 sockets
(2) Standoffs for M.2 sockets
Graphics card holder
When removing the ASRock Z590 Steel Legend WiFi 6E from the box, we’re greeted by a black PCB with grey and white patterns stenciled across its entirety. The heatsinks and shrouds are all grey/silver, providing a stark contrast against the dark board. I’m personally not a fan of all the patterns, but beauty is in the eye of the beholder. That said, the Steel Legend will fit into most build themes without issue.
No board is complete these days without RGB lighting and the Steel Legend continues this trend. You’ll find an “S” lit up on the IO shroud, while the chipset heatsink lights up the words “Steel Legend.” From below, the right-hand edge (along with “Steel Legend” again on that same edge) is lit up by several RGB LEDs, which gives it a nice glow from underneath. The integrated RGB lighting was saturated and bright, with control handled through ASRock’s Polychrome RGB software.
Focusing on the top half of the board, you get a better look at the large silver heatsinks, along with a shroud that covers the rear I/O bits. In the upper-left corner are two 8-pin EPS connectors (one required) that send power to the CPU. The socket area is relatively busy, with many caps dotting the space around the socket. To the right are four DRAM slots capable of supporting up to 128GB of DDR4 RAM at speeds listed up to DDR4 4800+(OC).
The first of seven 4-pin fan/pump headers is located just above the DRAM slots. You can find the rest scattered around the bottom half of the board. As far as power goes, the CPU fan connector supports up to 1A/12W, while the CPU/Water Pump and Chassis/Water Pump headers support a maximum of 2A/24W. All headers except for the CPU header auto-detect whether a 3-pin or 4-pin spinner is connected.
We find the first two (of four) RGB headers in the same area. On top, in grey, is the 3-pin ARGB header, and the 4-pin in white below it is for RGB. The 4-pin headers support 12V/3A (36W) strips, while the ARGB is 5V/3A (15W). Both values are standard. Also located in this area is an RGB feature where LEDs below shine through the 6-layer PCB, showing off the Steel Legend branding. Continuing down the right edge are the 24-pin ATX connector feeding power to the motherboard, a front panel USB 3.2 Gen2x2 (20 Gbps) Type-C header and, finally, a USB 3.2 Gen1 front panel header.
ASRock lists the Steel Legend as a 14-phase Dr.MOS VRM, which breaks down to a 12+2 configuration for the Vcore and SOC. A Richtek RT3609BE 6-channel controller handles the CPU while a Renesas RAA229001 controls the SOC. The six-channel controller feeds 12 Vishay Sic654 50A MOSFETs for CPU Vcore in a teamed/parallel configuration. In other words, ASRock does not use phase doublers on this board. This configuration is plenty for both 10th and 11th generation CPUs intended for this platform.
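As a rough sanity check on those numbers, here’s a sketch of the teamed Vcore stage’s theoretical capacity. The 1.3 V load voltage is an illustrative assumption on our part, not an ASRock spec:

```python
# Back-of-the-envelope capacity of the teamed (no-doubler) Vcore VRM:
# twelve Vishay SiC654 stages rated at 50A each.

VCORE_STAGES = 12
STAGE_RATING_A = 50
LOAD_VOLTAGE = 1.3  # illustrative CPU load voltage, not a board spec

total_current_a = VCORE_STAGES * STAGE_RATING_A  # 600 A combined
theoretical_w = total_current_a * LOAD_VOLTAGE   # ~780 W ceiling

print(total_current_a, theoretical_w)
```

Even derated heavily for thermals, that ceiling sits far above anything a stock or mildly overclocked Rocket Lake chip will draw.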
Moving down to the bottom half of the motherboard, we’ll start on the left side with the audio. Here we see a fully exposed Realtek ALC897 codec and four Nichicon audio caps. The ALC897 codec is from the budget side of things, though most should still find it sufficient.
In the middle of the board are five PCIe slots and three M.2 sockets. On the PCIe front, we welcome native support for PCIe 4.0 when using a Rocket Lake processor. The primary PCIe slot and an M.2 socket receive the extra bandwidth. The Z590 Steel Legend includes two full-length slots, with the top one reinforced to prevent shearing and reduce EMI (ASRock calls this Steel Slot). The top full-length slot is PCIe 4.0 x16, sourcing its lanes from the CPU, while the other is PCIe 3.0 x4 and from the chipset. This configuration supports AMD CrossfireX, but not Nvidia SLI (which requires an x8 slot). The three small x1 slots support PCIe 3.0 x1 and are fed from the chipset.
Around and between the PCIe slots are three M.2 sockets, the top and bottom with heatsinks. There is technically a fourth M.2 socket, but it’s Key-E and already populated with the Intel Wi-Fi 6E adapter. The 6E portion brings users up to 14 additional 80 MHz channels or seven 160 MHz channels in the 6 GHz space and increased bandwidth. In essence, you can maintain faster high-speed connections, and more of them, without having to scan for the least-congested channels.
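Those channel counts fall straight out of the width of the new band. A quick sketch: the band edges below are the standard 5925-7125 MHz allocation, and the usable 80 MHz count is 14 rather than the raw 15 because the regulatory channel plan doesn’t fill every last slot:

```python
# The 6 GHz band opened up for Wi-Fi 6E spans 5925-7125 MHz.
BAND_MHZ = 7125 - 5925            # 1200 MHz of new spectrum

raw_80mhz_slots = BAND_MHZ // 80  # 15 raw slots; the channel plan defines 14
channels_160mhz = BAND_MHZ // 160 # 7, matching the figure quoted above

print(BAND_MHZ, raw_80mhz_slots, channels_160mhz)
```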
On the storage side, the top socket, M2_1, is dubbed Hyper M.2 and runs at PCIe 4.0 x4 (64 Gbps) speeds. It supports PCIe-based modules only, up to 80mm in length. The second socket down, M2_2, is PCIe 3.0 x4 (32 Gbps) and supports both PCIe and SATA modules up to 80mm. This socket shares lanes with SATA port 1; when using a SATA-based module, SATA 1 is disabled. The bottom socket, M2_3, is also PCIe 3.0 x4 and supports both PCIe and SATA drives, but this one holds modules up to 110mm. With M2_3, SATA port 5 will be disabled when using a SATA drive in this socket.
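Those sharing rules are easy to trip over when planning a drive layout, so here is the same information as a small lookup sketch. The socket and port names follow the board labels above, but the helper itself is hypothetical, not ASRock software:

```python
# Which SATA port each M.2 socket steals when a SATA-mode drive is fitted.
M2_SATA_CONFLICT = {
    "M2_1": None,      # PCIe 4.0 x4 from the CPU; no SATA sharing
    "M2_2": "SATA_1",  # SATA 1 disabled by a SATA drive in M2_2
    "M2_3": "SATA_5",  # SATA 5 disabled by a SATA drive in M2_3
}

def disabled_sata_ports(sata_mode_sockets):
    """Return the SATA ports lost for a set of SATA-mode M.2 sockets."""
    return {M2_SATA_CONFLICT[s] for s in sata_mode_sockets
            if M2_SATA_CONFLICT[s]}

print(sorted(disabled_sata_ports({"M2_2", "M2_3"})))  # ['SATA_1', 'SATA_5']
```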
To the right of the PCIe area, we see the large chipset heatsink, and to the right of that are four of the six SATA ports. This board supports RAID 0, 1, 5 and 10. Below is the POST status checker. The four LEDs, labeled CPU, Boot, RAM and VGA, correspond to POST activities. If something goes wrong at any of those points, the LED where the POST stopped stays lit, showing you where the problem is.
Across the board’s bottom are several headers and even a few SATA ports. You won’t find any buttons here. Below is the full list, from left to right:
Front panel audio
ARGB header
RGB header
USB 3.2 Gen1 header
Clear CMOS jumper
(2) USB 2.0 headers
(2) Chassis/Water Pump fan headers
System panel header
(2) SATA ports
Chassis/Water Pump fan header
TPM header
The Z590 Steel Legend’s rear ports use a preinstalled and adjustable IO plate that matches the board’s white/grey pattern. There are a total of six USB ports out back: two USB 3.2 Gen2 ports (one Type-A, one Type-C), two USB 3.2 Gen1 ports and two USB 2.0 ports, all of which support ESD protection. I would like to see more than six USB ports here, as they can all get used up quickly. Video outputs consist of an HDMI (v2.0) port and a DisplayPort (v1.4). The Realtek Dragon 2.5 GbE port sits above the USB 3.2 Gen1 ports, and just to the right of that is the 5-plug plus SPDIF audio stack. Outside of that are a legacy PS/2 port for a keyboard/mouse and the Wi-Fi antenna connectors.
The MSI Aegis RS 11th is a powerful gaming desktop with the latest parts from Intel and Nvidia and off-the-shelf components that allow for easy upgrades.
For
+ Off the shelf parts
+ Powerful gaming performance
+ Decent pack-in peripherals
Against
– MSI Center software is clunky
– Middling file transfer speeds
It’s hard to build a computer right now, because many of the key parts are sold out everywhere you look. If you can get a quality prebuilt desktop, it may be worth springing for it just to get the components you want. The MSI Aegis RS 11th ($1,999 to start, $2,499 as tested) delivers the latest with Intel’s 11th Gen Rocket Lake and an Nvidia GeForce RTX 3080. If those are the parts you’re looking for, this PC should be in your consideration.
MSI isn’t using a weird, proprietary chassis that’s hard to open; this is made of standardized parts, just mostly MSI-branded ones. That does mean that when parts are easier to buy, this is a PC you’ll be able to upgrade and grow with.

The MSI Aegis RS 11th’s gaming performance is strong, which makes it one of the best gaming PCs, but Rocket Lake’s modest core count holds it back in productivity workloads.
Design of the MSI Aegis RS 11th
How much you like the design of the Aegis RS will rely heavily on how much you like one of MSI’s existing PC cases, the MPG Gungnir 110R, a mid-tower chassis with both tempered glass and black aluminum. The front is split between the two, making it look like the Two-Face of computer cases. Considering our review unit had three RGB fans up front, I didn’t love that they were half covered up. It’s a weird design choice.

The rest of it, however, is far more conservative. The left side panel is tempered glass, which lets you see your components, while the right side is opaque and covers up the cable management. There are two dust filters: a magnetic one on top of the case, and a second one in front of the intake fans.
There are three 120mm intake fans on the front. There’s another on the rear, as exhaust, but it also cools the radiator on the MSI Coreliquid 120R liquid cooler for the CPU. I would like to see the radiator mounted up top, where there is room for one up to 240mm in size, with a regular exhaust fan in the back, since there is no obstacle to that with this case. (In fact, I wouldn’t mind a beefier cooler for this processor, too.) Still, unlike many custom chassis we’ve seen lately, this one doesn’t seem to have particular issues with where to put fans.
The front three fans and CPU cooler have RGB lighting, which can be controlled with a button labeled “LED” on the top of the case, or with a module in the MSI Center software.
At 17.72 x 16.93 x 8.46 inches, the Aegis RS is smaller than the Alienware Aurora R11 (18.9 x 17 x 8.8) and iBuypower Gaming RDY IWBG207 (18.9 x 19.2 x 8.5 inches). The HP Omen 30L, however, is slightly smaller at 17.7 x 16.8 x 6.6 inches.
Top Ports: 2x USB 3.2 Gen 1 Type-A, USB 3.2 Gen 2 Type-C, 3.5 mm headphone and microphone jacks
Rear Ports (Motherboard): 4x USB 2.0, 2x USB 3.2 Gen 1 Type-A, USB 3.2 Gen 2 Type-A, USB 3.2 Gen 2x2 Type-C, PS/2, DisplayPort, HDMI, audio connectors
Video Output (GPU): 3x DisplayPort 1.4a, HDMI 2.1
Power Supply: MSI MPG A750GF, 750W
Cooling: MSI Coreliquid 120R liquid cooler, 3x 120mm case fans
Operating System: Windows 10 Home
Dimensions: 17.72 x 16.93 x 8.46 inches
Price as Configured: $2,499
Ports and Upgradeability on the MSI Aegis RS 11th
There are five ports on the top of the Aegis RS chassis: a USB 3.2 Gen 2 Type-C port, a pair of USB 3.2 Gen 1 Type-A ports, and separate 3.5 mm headphone and microphone jacks.
The rear ports are from the MSI Z590 Pro Wi-Fi motherboard, and include four USB 2.0 Type-A ports, two USB 3.2 Gen 1 Type-A ports, two USB 3.2 Gen 2 Type-A ports, USB 3.2 Gen 2×2 Type-C, as well as audio connectors and PS/2 for legacy peripherals. There’s also DisplayPort and HDMI, though you’ll likely use the options on the graphics card.
Internally, the Aegis is easy to update or repair, because it’s built just like a PC you might put together yourself. There aren’t any weird custom chassis tricks or hidden parts. MSI makes the case, power supply, graphics card, motherboard and liquid cooler as separate components. There’s nothing proprietary about this that you couldn’t change or update later.
You can get to most of the parts by removing the tempered glass side panel. It’s held into the back of the chassis with two thumb screws, so no tools are needed. There’s a handle to pull it straight back from the case. One highlight here is a sled for a 2.5-inch drive for easy extra storage. The RAM and M.2 SSD are easy enough to access without moving anything.
The right side panel comes off the same way as the glass door, and it’s the easiest (only, really) way to access the HDD and the PSU, which are hidden beneath a shroud. The case’s RGB controller is also back here, and there’s another 2.5-inch drive sled.
I’m not going to say the cable management is beautiful compared to some other prebuilts, but it’s functional enough (I honestly probably don’t have the patience to do any better) and, unlike some cases, you can easily access it.
Gaming and Graphics on the MSI Aegis RS 11th
The Nvidia GeForce RTX 3080 and Intel Core i7-11700K proved potent for gaming.
I played a bit of Control on the Aegis RS, which I like to try because of how well it integrates ray tracing and stresses even the most powerful components. I ran it at 4K with the high preset and medium ray tracing.
In the beginning of the game, which features exploration sequences, combat with Hiss guards in the Oldest House and fights on the Astral Plane, the game typically ran at around 57 frames per second. During fights inside the house, the rate dropped as low as 37 fps when I used lots of Jesse’s melee attacks, which bring about large telekinetic explosions with lots of objects moving. In the Astral Plane, which is rendered on a largely white background, it often stayed in the low 70s, even during combat. With a slightly lower resolution or a few tweaks, you could be at a steady 60 fps fairly easily.
On the Shadow of the Tomb Raider benchmark (highest settings), the game ran at 147 fps at 1080p and 57 fps at 4K. It was beaten in both only by the Alienware Aurora R11 (149 fps at 1080p, 64 fps at 4K) with an RTX 3090.
In Grand Theft Auto V (very high settings), the Aegis RS had superior 1080p performance at 163 fps and ran in 4K at 54 fps. The Aurora won out in 4K, while the Omen and iBuypower both had identical 4K performance to the Aegis.
On the Far Cry New Dawn benchmark, the Aegis dominated again at FHD, running at 134 fps. In 4K, it ran at 94 fps, behind the iBuypower and the Aurora by a few frames but tied with the Omen.
The Aegis came just behind the Alienware in Red Dead Redemption 2 (medium settings) at 113 fps, but had the highest 4K score at 40 fps. It beat the Omen by 10 frames, though the iBuypower was closer.
On Borderlands 3 (badass settings), the Aegis RS hit 136 fps at FHD and 58 fps in 4K. That’s the worst of the 4K showings, but on par with the HP Omen 30L for 1080p. The Aurora, with its RTX 3090, did the heavy lifting, winning at both resolutions.
Productivity Performance of the MSI Aegis RS 11th
This is the first pre-built desktop we’ve reviewed with the Intel Core i7-11700K. MSI has paired it with 16GB of RAM, a 1TB PCIe Gen 4 SSD and a 2TB HDD. The CPU has 8 cores and 16 threads. Its competitors, though, pose a threat, as many high-end PCs come with Core i9 processors that have more cores. It should be noted that even the Core i9 Rocket Lake has just 8 cores, so this isn’t a limitation of testing the Core i7.
On Geekbench 5, an overall performance benchmark, the Aegis RS 11th notched a single-core score of 1,676 and a multi-core score of 10,102. While that was the highest single-core score, the other three desktops had higher multi-core scores. All of those used Intel Core i9-10900K chips, which have 10 cores rather than the 8 cores in the 11700K.
The PCIe Gen 4 SSD in the Aegis didn’t show off. It transferred 25GB of files at a rate of 635.3 MBps, just edging past the iBuypower and falling far short of the Omen (978 MBps) and Aurora (1,201.87 MBps).
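To translate those rates into wall-clock terms for the 25GB test folder, here’s a quick sketch (it treats 1GB as 1,000MB for simplicity):

```python
# Time to move the 25GB test folder at each system's measured rate.
def transfer_seconds(size_gb: float, rate_mb_per_s: float) -> float:
    return size_gb * 1000 / rate_mb_per_s

for name, rate in [("Aegis RS 11th", 635.3),
                   ("Omen 30L", 978.0),
                   ("Aurora R11", 1201.87)]:
    print(f"{name}: {transfer_seconds(25, rate):.0f}s")
```

That works out to roughly 39 seconds on the Aegis versus about 21 on the Aurora, a gap you would actually notice when shuffling game libraries around.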
On our Handbrake video editing test, the MSI Aegis RS 11th transcoded a 4K video to 1080p in 5 minutes and 19 seconds. That’s faster than the iBuypower, but the Aurora and Omen both beat the Aegis’ time by five seconds.
MSI Vigor GK30 Keyboard and Clutch GM08 Mouse
MSI includes a keyboard and mouse in the box, which are good enough to use if you don’t have a lot else lying around, but that you may want to replace if you already have favorite peripherals.
The mouse, the Clutch GM08, has rubberized grips on the side, though it felt a bit narrow for my wide claw grip. Still, there are some higher-end features here, including adjustable weights (two 3-gram weights and one 5-gram weight) to make the mouse lighter or heavier. It has a PixArt PAW351 sensor that goes up to 4,200 DPI. The mouse has a DPI switch button that lets you adjust sensitivity, and there are two buttons on the left side of the mouse, but they aren’t programmable in MSI Center. It typically sells separately for around $20, so don’t get your hopes up too much, but it gets the job done in a pinch. The red LED light can’t be changed in MSI’s software, either.
The keyboard, the Vigor GK30, is the same one that came with the MSI Aegis Ti5 I recently reviewed. It’s just OK. The keyboard, which MSI suggests is “mechanical-like,” has keys that are stiff and not quite clicky. There’s perhaps too much RGB lighting in a sea around the keys. The lighting can’t be controlled in MSI’s software, either, but can be customized with buttons on the keyboard.
MSI Center, Software and Warranty on the MSI Aegis RS 11th
This is the first MSI PC that’s crossed my desk with MSI Center, the company’s replacement for its two previous Swiss Army knife applications, Dragon Center (for gaming) and Creator Center (for, well, creating).
MSI Center, though, seems barebones. Sure, it has an optional light/dark mode switcher, which is nice, you can still see CPU and GPU temperatures and usage, and there are still different usage scenarios to choose from, though they’re buried behind menus. But some features from Dragon Center, like one-click optimization for games, are nowhere to be found, while Mystic Light and the LAN manager are optional modules to add on. This feels like it’s in beta; there’s an area to “downlaod, update or uninstall” modules (MSI’s typo, not mine).
But MSI still includes its share of bloat, including MSI App Player, its version of BlueStacks, which runs Android apps, as well as LinkedIn. It notably doesn’t have the Cyberlink suite that I’ve complained about on previous systems, though no one can escape the bloat that comes with Windows 10, like Facebook Messenger, Hulu and Roblox.
MSI sells the Aegis RS 11th with a one-year warranty.
MSI Aegis RS 11th Configurations
We reviewed the Aegis RS with a new Intel Core i7-11700K “Rocket Lake” processor, 16GB of RAM, an MSI RTX 3080 Ventus 3X OC GPU, a 1TB PCIe Gen 4 SSD and a 2TB, 7,200-rpm HDD. When the system becomes widely available in mid-April, it will run for $2,499.
When we were reviewing this model, MSI told us that the RS 11th series would start at $1,999. It didn’t have completely finalized specs, but suggested the base model would have an RTX 3070 and a 650W power supply and ditch the HDD. Several configurations may continue to utilize a Z490 motherboard and then transition to Z590 as supply levels out. The Aegis RS series is expected to top out at a Core i9-11900K, RTX 3090, 32GB of RAM, an 850W power supply and a 240mm CPU cooler, going possibly as high as $3,899.
Bottom Line
If you’re looking for the latest and greatest, the MSI Aegis RS 11th delivers the most recent parts from Intel and Nvidia (at least, as long as it’s in stock).
Unlike some other prebuilts, there’s nothing proprietary here. It’s all standardized parts, mostly from MSI, that you can easily upgrade down the line.
Intel’s Core i7 Rocket Lake and the Nvidia GeForce RTX 3080 in our review configuration make a potent pairing. In productivity, though, Rocket Lake’s lower core count trailed some of the competition in multi-threaded workloads.
MSI needs to add polish to its MSI Center utility. If you use the app to monitor CPU usage, check temperatures or change RGB colors, it will feel a bit like a beta. If you prefer other applications, you may not notice.
As a whole package, the Aegis RS 11th is a powerful gaming rig with few frills. If you need a PC to play games, this will stand up, even in 4K with the right settings.
Intel today launched its 11th Generation Core “Rocket Lake” desktop processor family led by the Core i9-11900K—this is our long-awaited review. With the Core i9-11900K, Intel wants to respond to the AMD Ryzen 5000 series, which snatched overall performance leadership away from the company. Rocket Lake is Intel’s first attempt at improving per-core (single-threaded) performance in several years, through the introduction of the new “Cypress Cove” CPU core. Intel claims an IPC gain of up to 19% over the previous generation. The i9-11900K is an 8-core/16-thread processor, a step backward from its 10-core/20-thread predecessor, the i9-10900K, but Intel believes that the IPC gain and enhancements to the multi-core boosting algorithm should help recover some of the multi-threaded performance despite the two-core deficit. It is also Intel’s hint to the market and to software developers that eight cores should be plenty for cutting-edge gaming and client desktop tasks.
The reason Intel had to stop at eight cores for Rocket Lake has more to do with the fact that the processor is still manufactured on the 14 nm silicon fabrication node Intel has been lugging along for six years now. The Core i9-11900K is built on the same Socket LGA1200 package as its predecessor, and the package is physically the same size as that of the i7-860 from 2009. The new Cypress Cove CPU cores are significantly larger than the “Skylake” cores on “Comet Lake,” and the new Gen12 Xe LP iGPU is larger than the Gen 9.5 unit, too. As a result, elongating the die to cram in more cores wasn’t an option. Add to this that the 14 nm node limits the power budget, and the 10-core Comet Lake was already flirting with 250 W package power draw. Physically removing the iGPU to make room for the extra two cores wasn’t an option either, as Intel relies on the iGPU to sell these chips to the vast majority of desktop users who don’t need discrete graphics. Intel plans to significantly change its mainstream desktop socket with the future-generation “Alder Lake,” however.
Why Intel stuck with 14 nm is another mystery. Intel’s position is that to accomplish the performance target of Rocket Lake on the desktop platform, 14 nm was sufficient. Intel already has a more advanced silicon fabrication node, the 10 nm SuperFin, which it’s using to make 11th Gen “Tiger Lake-U” mobile processors with plans to launch a new 8-core “Tiger Lake-H” mobile chip later this year. Mobile processors make up a major share of Intel’s client CPU sales, and with the recent surge in notebook sales, the company wants to maximize its 10 nm foundry utilization for mobile chips. The desktop platform has a relatively “unlimited” power budget compared to mobile, and with 10th Gen “Comet Lake-S,” Intel seems to have decided that it’s willing to take the heat for selling a hot and inefficient desktop chip as long as it’s competitive.
We’ll go into the nuts and bolts of Rocket Lake on the following pages, but put briefly, the chip combines eight new Cypress Cove CPU cores with a Gen12 Xe LP integrated graphics core and updated platform I/O that includes PCI-Express Gen 4. The chip also puts out eight more PCIe lanes than the previous generation. These go toward a CPU-attached NVMe interface, much like that of AMD Ryzen chips, and a double-width DMI x8 chipset bus. The general-purpose PCIe connectivity put out by the new Intel 500-series chipsets continues to be PCIe Gen 3.
With this generation, Intel has an ace up its sleeve—DLBoost, or hardware acceleration of AI deep-learning neural net building and training. Intel claims DLBoost accelerates DNN training performance by up to six times compared to normal x86 execution. DLBoost made its debut with the company’s 10th Gen “Ice Lake” mobile processors, and Intel sees huge potential for AI in several client-relevant media tasks, such as quick image and video manipulation—just like on the latest smartphones. The company also put out plenty of developer documentation and is working with ISVs to promote DLBoost. Another feature making its desktop debut is the new AVX-512 instruction set, or at least a truncated version of it, with only client-relevant instructions.
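DLBoost on client chips is built on the AVX-512 VNNI extension, whose key instruction, vpdpbusd, multiplies unsigned 8-bit activations by signed 8-bit weights and accumulates four products at a time into each 32-bit lane, fusing what would otherwise be several multiply/widen/add steps. A rough pure-Python sketch of what a single 32-bit lane computes (for illustration only, not Intel's implementation):

```python
def vpdpbusd_lane(acc: int, a_bytes: list[int], w_bytes: list[int]) -> int:
    """Emulate one 32-bit lane of AVX-512 VNNI's vpdpbusd:
    multiply 4 unsigned 8-bit values by 4 signed 8-bit values
    and add all four products to a 32-bit accumulator."""
    assert len(a_bytes) == len(w_bytes) == 4
    for a, w in zip(a_bytes, w_bytes):
        assert 0 <= a <= 255        # unsigned 8-bit activation
        assert -128 <= w <= 127     # signed 8-bit weight
        acc += a * w
    return acc

# Example: one 4-byte chunk of a quantized dot product,
# the core operation of int8 neural-net inference.
print(vpdpbusd_lane(0, [10, 20, 30, 40], [1, -2, 3, -4]))  # 10 - 40 + 90 - 160 = -100
```

A 512-bit register holds 16 such lanes, which is where the claimed speedup over plain x86 integer code comes from.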
The Core i9-11900K 8-core processor is clocked at 3.50 GHz, with a maximum Turbo frequency of 5.30 GHz using Thermal Velocity Boost and an all-core boost frequency of 4.70 GHz. Each of the eight Cypress Cove cores comes with 512 KB of dedicated L2 cache, and the chip has 16 MB of shared L3 cache. The i9-11900K is unlocked and ready for overclocking. Intel has introduced several new features for overclockers, which we’ll detail on the following pages. The i9-11900K is priced at US$539 in 1,000-unit tray quantities, which should put its retail starting price at around $550, the same pricing territory as AMD’s 12-core Ryzen 9 5900X. In this review, we put the Core i9-11900K through an exhaustive new set of CPU and gaming tests to show you if Intel has managed to take back the crown from AMD.
Founded in 1999, EVGA is a US-based computer hardware company. After a long hiatus, EVGA returns to the gaming mouse world with the X15, X17, and X20. The X17 is a right-handed ergonomic mouse with two main draws: true 8,000 Hz polling and an ultra-low lift-off distance owing to two additional, independent LOD sensors. PixArt’s PMW3389 sensor capable of 16,000 CPI is used, along with pre-tensioned main buttons. The default weight of 106 g can be further adjusted with five 5 g weights. In total, ten buttons are available, all of which can be rebound within EVGA’s Unleash software, which also has the usual options for RGB lighting, among others.
It’s hard to fault a 32-inch VA 1440p monitor with 165 Hz that sells for so little. There are a few flaws, but taken purely as a gaming display, the Pixio PXC327 delivers excellent performance where it counts. With excellent SDR picture quality and smooth video processing, it’s a great way to put a big screen on your desk for not a lot of money.
For
Good contrast
Excellent, large color gamut after calibration
Good value
Against
No sRGB mode
Undersaturated HDR color
Smearing effect with backlight strobe
Features and Specifications
Gaming monitors come in a huge variety of shapes and sizes, and we’ve covered just about every category currently available. Whether your preference runs flat or curved, there’s a screen for you. For the best experience, a panel of at least 25 inches diagonal is a good choice, but if you have the space and budget, well, bigger is usually better. One of the more unusual form factors is 32-inch curved. We’ve looked at a number of these over the past two years and found them very worthy of consideration. A 32-inch screen in the 16:9 aspect ratio provides plenty of width and height to immerse gamers while still being well-suited for productivity. Adding a curve brings the user a little closer to virtual reality.
Some of these screens are premium priced, but the Pixio PXC327 manages to break the cost barrier. This 32-inch VA panel with 1440p resolution and an 1800R curve sells for just $310, cheaper than most of the best gaming monitors, either direct from the manufacturer or several popular online outlets. Though it doesn’t offer a lot of bells and whistles, it does have AMD FreeSync Premium, along with a speedy 165 Hz refresh rate and HDR.
Pixio PXC327 Specs
Panel Type / Backlight: VA / W-LED, edge array
Screen Size, Aspect Ratio & Curve: 31.5 inches / 16:9 / 1800mm curve radius
Max Resolution & Refresh Rate: 2560 x 1440 @ 165 Hz; AMD FreeSync Premium
Native Color Depth & Gamut: 8-bit / DCI-P3
Response Time (MPRT): 1ms
Brightness (mfr): 350 nits
Contrast (mfr): 3,000:1
Speakers: None
Video Inputs: 2x DisplayPort 1.2, 1x HDMI 2.0
Audio: 3.5mm headphone output
USB 3.0: None
Power Consumption: 29.3w, brightness @ 200 nits
Panel Dimensions WxHxD w/base: 28.5 x 16.7 x 7.5 inches (724 x 424 x 191mm)
Panel Thickness: 1.7 inches (43mm)
Bezel Width: 0.4 inch (9mm) top/sides; 0.7 inch (17mm) bottom
Weight: 12.1 pounds (5.5kg)
Warranty: 3 years
Backlit by a white LED edge array, the PXC327 is specced for 350 nits maximum brightness and coverage of about 83% of the DCI-P3 color space. It accepts HDR10 signals and runs at 165 Hz without overclocking. AMD FreeSync Premium is the native adaptive tech. It’s AMD’s mid-tier screen-tear-fighting offering and adds low framerate compensation (LFC) over standard FreeSync. The PXC327 isn’t G-Sync Compatible-certified, but we got it to run Nvidia’s anti-screen-tear feature anyway (see: How to Run G-Sync on a FreeSync Monitor). Both flavors of Adaptive-Sync and HDR work in concert over DisplayPort at 165 Hz and over HDMI at 144 Hz.
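Low framerate compensation works by repeating frames when the game's framerate falls below the panel's minimum variable-refresh rate, so the effective refresh stays inside the VRR window instead of falling back to tearing or stutter. A simplified sketch of the idea in Python (the window bounds are typical values, and the real algorithm lives in the monitor's scaler, unpublished by Pixio):

```python
import math

def lfc_refresh(fps: float, vrr_min: float = 48.0, vrr_max: float = 165.0) -> float:
    """Simplified model of low framerate compensation: return the
    refresh rate the scaler would run at for a given framerate,
    duplicating frames when fps drops below the VRR floor."""
    if fps >= vrr_min:
        return min(fps, vrr_max)           # inside the window: refresh tracks fps
    multiplier = math.ceil(vrr_min / fps)  # show each frame this many times
    return min(fps * multiplier, vrr_max)

print(lfc_refresh(60))  # 60.0 -> within range, no duplication
print(lfc_refresh(30))  # 60.0 -> each frame displayed twice
print(lfc_refresh(20))  # 60.0 -> each frame displayed three times
```

The practical upshot is that even demanding games that dip into the 20-40 fps range still get tear-free presentation on an LFC-capable panel like this one.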
The PXC327 has little else in the way of features, but as a basic gaming monitor, it offers a lot of positives for the price.
Assembly and Accessories
You’ll need to break out the Phillips-head screwdriver to assemble the PXC327’s metal base and upright. From there, just snap the panel in place. The external power supply is a small brick, and you also get a DisplayPort cable.
If you prefer to use a monitor arm, the panel has 100mm VESA lugs in back, but you’ll have to source your own bolts to use them.
Product 360
The PXC327 maintains the current styling trend of thin flush-mounted bezels with a 9mm frame around the top and sides and a 17mm wide strip across the bottom. Only a small Pixio logo is visible in the center. In back, there’s a larger Pixio logo molded in a shiny finish with the surrounding plastic done in smooth matte. A large chevron, also finished in gloss, is deeply cut into the back cover. Underneath that are geometric shapes that resemble the side of a sci-fi movie spaceship.
The input panel is clearly labeled with white lettering visible from the back. You get two DisplayPort 1.2 inputs that support HDR and Adaptive-Sync up to 165 Hz. A single HDMI 2.0 can do up to 144 Hz, along with HDR and FreeSync. You can also plug your best gaming headset into the 3.5mm audio jack. There’s a USB port for service and firmware updates, but it does not support peripherals.
The all-metal stand is much more solid than its thin appearance would suggest. It allows for 25 degrees of tilt, but there are no swivel or height adjustments; the PXC327 is light enough to move around easily, though. The screen sits a little low for a perfectly vertical position, so we had to tilt it up a bit.
OSD Features
You control all of the PXC327’s functions by a single joystick/button on the back right side of the panel. It moves through the simple and efficiently designed on-screen display (OSD) menu with ease and doubles as a power toggle. Clicking the four directions outside the OSD brings up quick menus for picture mode, input selection, brightness and Game Assist’s aiming point, timer and frames per second (fps) counter.
There are seven picture modes which correspond to different game types. User is the default mode and makes all image controls available. Additionally, there are three user memories in the User Data submenu that can be saved and recalled. Black Equalizer raises the black level to make shadow detail easier to see. We left that option alone as the PXC327 has very good gamma accuracy and deep black levels. We had no problems seeing fine detail in dark places.
The PXC327 packs three Color Temperature presets, along with sRGB and a user mode. The RGB sliders allow for precise and accurate results during calibration, which we recommend because the preset temps are all either too cool or too warm.
You also get five gamma options. 2.2 is the default and best one. If you want to tweak color saturation and hue, there are sliders for that. And you can dial in a low blue light mode for fatigue-free reading. Note that the sRGB mode here does not reduce the color gamut from the native DCI-P3. DCI-P3 is the one and only choice.
Gaming Setup is where you’ll find the FreeSync toggle, three-level overdrive, Game Assist, HDR, dynamic contrast and MPRT, which is Pixio’s term for blur reduction, a backlight strobe.
Pixio PXC327 Calibration Settings
The PXC327’s default User mode has a few flaws that calibration can correct. For one, grayscale runs cool, but that’s easily fixed with adjustments to the RGB sliders in the User color temp. These tweaks also improved gamma, which is a bit too dark by default. No change to the gamma preset is necessary. We also recommend lowering the contrast slider to make highlights pop a bit more.
Once these changes are made, measured contrast is improved, and color is very accurate in the DCI-P3 realm. Again, there is no sRGB option available. The sRGB color temp setting does not reduce gamut volume at all.
Please try our recommended settings for the Pixio PXC327 below:
Picture Mode: User
Brightness 200 nits: 51
Brightness 120 nits: 25
Brightness 100 nits: 19
Brightness 80 nits: 13
Brightness 50 nits: 5 (min. 34 nits)
Contrast: 46
Gamma: 2.2
Color Temp User: Red 55, Green 53, Blue 47
Once you switch over to HDR mode, which you must do manually, there are errors in color and luminance tracking that cannot be corrected. We’ll tell you more about that on page 4.
Gaming and Hands-on
The 32-inch form factor is great for just about any use. It provides enough image area to line up two word processing documents side by side. It’s great for editing music scores or photos. And you’ll see a huge portion of any spreadsheet. In the PXC327’s case, QHD resolution provides 93 pixels per inch, enough to render small details sharply without obvious jaggies. You can sit up close to this monitor without seeing the pixel structure in photos and videos.
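The 93 ppi figure comes straight from the resolution and the diagonal: divide the diagonal pixel count by the diagonal size in inches. A quick check of the math:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count over diagonal inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# Pixio PXC327: 2560 x 1440 pixels on a 31.5-inch diagonal
print(round(ppi(2560, 1440, 31.5)))  # 93
```

For comparison, the same QHD resolution on a 27-inch panel works out to about 109 ppi, which is why the 32-inch version demands a little more viewing distance for the same sharpness.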
The PXC327’s curve is subtle, which is typical of curved monitors with a 16:9 aspect ratio. In the past, we’ve wondered if this aspect ratio benefits from curvature at all, and after reviewing many panels in both 27 and 32-inch sizes, we say yes. Though we wouldn’t call it a deciding factor, the PXC327’s 1800R curve clearly enhances gameplay while making no real difference to the Windows desktop, which is a good thing. If gaming is the only intended use for a monitor, it’s hard to have too much curve. But if you must spend part of your day working, an extreme radius can be distracting.
With the PXC327, the Windows desktop looked bright and colorful, thanks to the large color gamut. If you need sRGB for Photoshop or accuracy otherwise, you’ll need to use a software profile. There’s no usable sRGB option in the OSD.
At our reference setting of 200 nits brightness, there was more than enough light output for a brightly lit office. In fact, we turned it down a bit when browsing the web. White backgrounds coupled with a large screen mean less light is necessary.
We weren’t as enamored with HDR operation in Windows. Brightness is locked to the maximum, which made the image very harsh. Color was also more muted than it is in SDR (you’ll see why in our HDR gamut test on page 4). Ultimately, there was no benefit to working in HDR mode.
Booting up our SDR copy of Tomb Raider, we were impressed by deep blacks full of detail and the rich, saturated color palette. Though we saw a bit more color than the game’s creators intended, it looked natural thanks to the PXC327’s accuracy after calibration. Greens and reds were particularly vivid, and fleshtones looked slightly ruddy. With a huge dynamic range, over 3400:1, image depth is superb.
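A contrast (dynamic range) figure like 3400:1 is simply peak white luminance divided by black luminance, so at our 200-nit reference it implies a black level of roughly 0.06 nits. The luminance values below are illustrative placeholders, not our measured numbers:

```python
def contrast_ratio(white_nits: float, black_nits: float) -> float:
    """On/off contrast: full-white luminance over full-black luminance."""
    return white_nits / black_nits

# Illustrative: a ~200-nit white over a ~0.058-nit black lands
# in the same ballpark as the review's 3400:1 figure.
print(round(contrast_ratio(200.0, 0.058)))  # 3448
```

This is why VA panels, with black levels several times deeper than typical IPS, deliver the image depth described here.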
We had the same experience playing Call of Duty: WWII in SDR mode. The PXC327 has contrast to spare. This is a good thing because switching to HDR mode showed some color flaws in terms of saturation. The game just didn’t have the expected pop. Instead, it looked a little more muted than the SDR version of the game. You’ll see why this is so on page four. Meanwhile, HDR contrast looked about the same as SDR, which jibes with our measurements, but the image was extremely bright. We weren’t able to play in HDR mode for long before fatigue set in.
There were no such issues with video processing. Our GeForce RTX 3090 graphics card was able to hit 165 fps no matter how intense the action became. That’s a benefit of the PXC327’s QHD resolution. Would 4K resolution look better? Probably a little, but at a much greater hit to the wallet. This is one great-looking monitor for the price. Our machine with a Radeon RX 5700 XT card ran the same games at 120-140 fps, still very smooth and responsive. In all cases, we used the High overdrive setting. There was no visible ghosting, and motion resolution remained clear and sharp.
The backlight strobe was unusable in our tests. In addition to reducing brightness by around 20%, it created a smearing effect that made some objects appear doubled around the edges. We stuck with Adaptive-Sync for all gameplay and were more than satisfied.