The launches of AMD Ryzen 5000 processors and AMD Radeon RX 6000 graphics cards turned out to be quite successful; we finally have tough competition in both segments, so the situation looks more comfortable than ever. You can find the results for the new AMD products in our article database. However, I decided to extend the measurement procedure a bit and run the first pairing of the AMD Ryzen 7 5800X with the Radeon RX 6800 XT, checking the performance of such a set in strictly CPU-bound scenes. The results I obtained left me quite stunned, because I expected an increase in performance but instead saw a series of huge drops…
The AMD Ryzen 7 5800X suffers performance degradation when paired with the AMD Radeon RX 6800 XT graphics card, yet with the NVIDIA GeForce RTX 3080 and RTX 2080 everything works fine. Has the driver overhead come back?
Anticipating the questions: I did all the tests on a fresh 64-bit Windows 10 build 2004 (I also checked versions 1909 and 20H2), with the latest drivers and an updated motherboard UEFI (2502). Previously, I obtained identical results on the drives used for the CPU and GPU test platforms, so out of curiosity I reinstalled everything from scratch to rule out possible driver conflicts. The course of measurements and the settings were, of course, the same as during the launch tests of the processors, and the only difference in hardware was swapping in the Radeon RX 6800 XT in place of the GeForce RTX 2080 Ti.
Because I got the same results three times, I decided to share my conclusions right away. Other games that I use in processor tests also usually show drops, which will be the subject of a separate article. I originally planned to test two overclocked platforms, the AMD Ryzen 7 5800X and the Intel Core i7-10700KF, seasoned with fast DDR4 memory in a 4x 8 GB configuration, accompanied alternately by the AMD Radeon RX 6800 XT and the GeForce RTX 3080. The results on the NVIDIA card brought the expected gains compared to the standard set on which I test processors, while the Radeon RX 6800 XT caused me a lot of problems, resulting in the following performance drops:
As you can see, the AMD Ryzen 7 5800X shows performance drops in four out of five tested titles when paired with the AMD Radeon RX 6800 XT graphics card. Some cases are downright devastating (Watch Dogs 2, Kingdom Come: Deliverance), others are only slightly hit (Far Cry: New Dawn, The Witcher 3: Wild Hunt), but there are also gains (Shadow of the Tomb Raider). I would add that the tests performed on the overclocked AMD Ryzen 7 5800X and Radeon RX 6800 XT included SAM (Smart Access Memory), which is disabled by default. The AMD Ryzen 7 5800X paired with the NVIDIA GeForce RTX 2080 Ti or GeForce RTX 3080 works without any anomalies. Let me remind you: everything was checked on three systems, two motherboards and DDR4 kits.
The Witcher 3: Wild Hunt is the most interesting case to analyze, because I included both a CPU-bound and a GPU-bound scene in the tests. In the first scenario, the AMD+AMD combination performs significantly worse with the Radeon RX 6800 XT than with the GeForce RTX 3080. However, when we move to the GPU-bound scene, it suddenly turns out that the Radeon performs exactly as it did in the launch tests and stays ahead of the GeForce RTX 3080. As the problem mainly affects DirectX 11 titles, and the only one using DirectX 12 is Shadow of the Tomb Raider, this leads to the conclusion that a driver issue is recurring: strictly speaking, the so-called CPU driver overhead that reduces the performance of AMD+AMD sets. This can be seen in the screenshot below from the Task Manager, showing the load on individual components; it is not definitive proof, but it illustrates certain trends. I have been pointing out the problem of AMD driver overhead for a long time (LINK and LINK), with some tests reaching back to 2015, but Big Navi emphasizes it further: as GPU performance has increased, the gap to NVIDIA automatically becomes more apparent. More results coming soon.
In early September, ASUS unveiled new ZenBook laptop models featuring 11th-generation Intel Tiger Lake processors. Attention was also focused on the screens, thanks to which we will get more models with OLED matrices or IPS panels with a 3:2 aspect ratio. One of the most interesting new products is the ASUS ZenBook Flip S UX371, which launched in Poland a few days ago. Meanwhile, we rush in with our launch-day review of this device, which looks great visually: black and gold colors, an aluminum housing, a quite good processor and a very good integrated graphics chip. The whole is seasoned with an OLED screen characterized by excellent blacks and rich colors. The icing on the cake is support for the ASUS Pen. Is it worth spending close to 7,500 zlotys for such a configuration? Damian Marusiak
One of the Taiwanese manufacturer's novelties slowly going on sale in Poland is the ASUS ZenBook Flip S (UX371), which uses Intel Tiger Lake processors, Xe Graphics and an OLED matrix (Ultra HD 4K) with 100% coverage of the wide DCI-P3 color gamut and 500 nits of brightness. For people who prefer a Full HD screen, the manufacturer has prepared the ZenBook Flip S UX363, which will be equipped with a Full HD 1 W low-power panel. In both cases the panel has a 13.3-inch diagonal. The version I tested (as the only reviewer in Poland!) was equipped with an Intel Core i7-1165G7 processor and a 4K OLED matrix with HDR10 support.
The ASUS ZenBook Flip S UX371 is a brand new laptop that uses both Intel Tiger Lake processors and a 4K OLED matrix.
One of the new features of the ASUS ZenBook Flip S laptop is undoubtedly the use of Intel Tiger Lake processors (11th-generation Intel Core). There was a lot of talk about these processors long before their final debut (September 2). First of all, they use the redesigned Willow Cove architecture, based on last year’s Sunny Cove. The cache subsystem in particular was changed, increasing the amount of L1, L2 and L3 cache. Moreover, they are made on a noticeably improved 10 nm process, named 10 nm SuperFin. The name comes primarily from the new SuperFin transistors, which are meant to allow higher clocks at lower voltages. The tested ASUS configuration has an Intel Core i7-1165G7, equipped with 4 cores and 8 threads. The base clock of the processor is 2.8 GHz, with the possibility of increasing in Turbo Boost 2.0 mode to a maximum of 4.7 GHz. The Intel Core i7-1165G7 has a default TDP of 28 W. The platform not only supports Thunderbolt 4, but is also the first to introduce PCIe 4.0 to notebooks. It also supports DDR4 3200 MHz, LPDDR4X 4267 MHz and LPDDR5 5400 MHz; the latter, however, will only appear in laptops next year.
Western Digital SN730 SSD / 1 TB / PCIe 3.0 x4 NVMe interface
Microsoft Windows 10 Pro operating system, version 2004
Also new is the integrated Intel Iris Xe Graphics (Intel Gen12), which for the first time completely breaks with the older architectures and is based on the entirely new Xe architecture. Intel Iris Xe Graphics represents the lowest Xe line, Xe-LP (Low Power), with a maximum of 96 EUs (execution units) built in. This translates into a total of 768 stream processors. The integrated Iris Xe Graphics chip still uses system RAM, while the manufacturer is also preparing a dedicated version of the Xe-LP chip called Intel DG1. Unlike the iGPU, it will have its own VRAM of either 3 or 6 GB GDDR6, which in many games will definitely improve results further. The new Intel Xe-LP also has low energy consumption and uses the same 10 nm SuperFin process. The computing power of the new iGPU is 2.07 TFLOPS, which at least in theory should give it an edge. In the case of an integrated graphics chip, performance in games will also depend heavily on bandwidth, and therefore on the clock speed of the RAM.
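The quoted 2.07 TFLOPS figure can be sanity-checked with simple arithmetic. The sketch below assumes 8 FP32 lanes per EU, an FMA per lane per clock, and a 1.35 GHz peak clock; treat the clock value as an assumption chosen to reproduce the quoted number.

```python
# Sanity check of the 2.07 TFLOPS figure for Intel Iris Xe (Xe-LP).
# Assumptions: 8 FP32 lanes per EU, each lane doing one fused multiply-add
# (2 FLOPs) per clock, and a peak clock of 1.35 GHz.
execution_units = 96
fp32_lanes_per_eu = 8           # 96 EUs x 8 lanes = 768 "stream processors"
flops_per_lane_per_clock = 2    # an FMA counts as two floating-point ops
clock_hz = 1.35e9

stream_processors = execution_units * fp32_lanes_per_eu
tflops = stream_processors * flops_per_lane_per_clock * clock_hz / 1e12

print(stream_processors)        # 768
print(round(tflops, 2))         # 2.07
```

The same formula explains why memory bandwidth matters so much in practice: the theoretical peak is only reachable when the shared RAM can keep the 768 lanes fed.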
AMD’s 400 series motherboards already have the first BIOS versions with Smart Access Memory support, and the company is working with Intel and NVIDIA to expand support.
AMD released the new Smart Access Memory feature with the new Radeon RX 6000 series graphics cards. The company said SAM requires a Ryzen 5000 series processor, a 500 series motherboard, and an RX 6000 series graphics card to run, but the truth is not quite so straightforward.
Smart Access Memory is AMD’s term for Resizable BAR support under the PCI Express standard. Normally, the processor sees only 256 MB of video card memory at a time, but with Resizable BAR support that window can be resized as desired, even to cover the entire video card memory. This is an advantage in some tasks, as the processor can send more data directly into the video card’s memory.
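On Linux, the effect of Resizable BAR shows up in the size of the GPU's prefetchable memory region reported by `lspci -vv`. The sketch below parses a hypothetical excerpt of such output (the addresses and sizes are invented for illustration):

```python
import re

# Hypothetical excerpt of `lspci -vv` output for a discrete GPU. With the
# legacy window the prefetchable region is 256 MB; with Resizable BAR in
# effect it can span the card's entire VRAM.
sample = """\
Region 0: Memory at e0000000 (64-bit, prefetchable) [size=256M]
Region 2: Memory at f0000000 (64-bit, prefetchable) [size=16G]
"""

for line in sample.splitlines():
    match = re.search(r"\[size=(\d+)([MG])\]", line)
    if match:
        size_mb = int(match.group(1)) * (1024 if match.group(2) == "G" else 1)
        print(size_mb)  # 256, then 16384
```

A 256 MB region suggests the legacy window; a multi-gigabyte region suggests the resized BAR is active.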
Resizable BAR has been among the optional features of PCI Express since PCIe 2.1, but operating system support has lagged badly. Support for Linux was implemented by AMD some time ago, and for Windows by Microsoft in Windows 10. Motherboard support also appears to vary, although many motherboards already support the necessary functionality, at least at the hardware level.
AMD’s claim about the feature's requirements was true when it was made: that combination of AMD hardware was the only platform on which the company had validated its operation. NVIDIA announced shortly after AMD’s announcement that it would also support the feature in a later update, at least with Ampere-based graphics cards.
AMD has since said that support for the feature will not remain limited in this respect either. In an interview with PCWorld, Scott Herkelman confirmed that the company's Radeon team is working with Intel, and the Ryzen team should be working with NVIDIA, to extend support to as many platforms as possible. Support is also promised for AMD's older platforms: as Cracky, a member of Rawiiol, which produces live streams among other things, noticed, the Zen 3 BIOS previously released by ASRock for B450 motherboards would also activate Smart Access Memory on Ryzen 5000 series processors. MSI has also released the first beta BIOSes with SAM support for its 400 series motherboards, so it is expected that support will officially come to the 400 series with the new Zen 3-enabled BIOS versions. However, it is not yet known whether support will be extended to older processors.
Christina Munro
If you’re a fan of lightweight gaming mice then you will want to check out our comparison review of SteelSeries’ Aerox 3 and Aerox 3 Wireless gaming mice. These both feature a holey design to keep the weight as low as possible. The Aerox 3 comes in at £59.99, with a TrueMove Core sensor co-designed by PixArt, whereas the Aerox 3 Wireless comes in at £99.99, with a TrueMove Air sensor also co-designed by PixArt.
Battery Life: Up to 80 hours 2.4GHz, Up to 200 hours Bluetooth
OS: Windows, Mac, Xbox, and Linux. USB port required.
Software: SteelSeries Engine 3.18.4+, for Windows (7 or newer) and Mac OSX (10.12 or newer)
You can purchase the SteelSeries Aerox 3 Wireless for £99.99 HERE!
Pros
Great connectivity options.
Good specs.
Low LOD.
Well written and clear manuals.
No side flex.
IP54 rated
Comfortable after long use.
Works well with palm and claw grip.
Cons
Spongy side buttons
Slight flex that engages the left and right click if squeezed hard.
Costs quite a bit more than the wired option.
KitGuru says: Of the two models, we would recommend the SteelSeries Aerox 3 Wireless all day long. It is more expensive than the wired model, but paying that bit extra ensures you are getting the best specs and all of those very handy connectivity options.
With the newly released version 2020.2 of its thin client operating system, openthinclient GmbH does away with legacy baggage and says goodbye to 32-bit platforms. In addition, the software gained the ability to boot over the network via UEFI PXE, even on newer hardware. Support for 32-bit devices is gone: openthinclient 2020.2 runs exclusively on Intel-compatible 64-bit thin clients or PCs.
As a foundation, openthinclient uses Debian 10 (Buster), on which, among other things, FreeRDP 2.2 and the Citrix client 2009 with plug-ins for Zoom and Teams are available for connecting to terminal servers. For working with older Windows servers up to 2008, there is RDesktop 1.8.6. In pure cloud environments, Chromium serves as the browser; Firefox has been removed by the manufacturer.
Central management included The software is available in two variants. On the one hand, as a server installation for central management of thin clients running the Linux-based operating system; it is available either as an installer (Linux and Windows, each approx. 170 MB) or as a virtual appliance (Hyper-V and VMware, 2.7 or 2.9 GB). On the other hand, with the client-OS-on-a-stick version, openthinclient offers a live system that is installed only on USB sticks. In addition to the OS and the apps already mentioned, this version includes its own local management. Via the integrated OpenVPN, it can be used to securely access the company server with a thin client from branch offices or from the home office.
The openthinclient environment can be controlled via a clear management interface.
(Image: openthinclient)
The creation, booting and administration of a new thin client takes place via the central openthinclient management GUI within a few minutes. Depending on the configuration, the clients boot an adapted thin client operating system over the network or load it as an image from the USB stick. Typical server-based computing applications such as Citrix or RDP are then started. The thin client then accesses the virtual desktop and users can start their work. Settings such as user rights, approved applications, etc. are made via Citrix Workspace Environment Management or via the Terminal Server Manager.
Both the management server and the on-a-stick variant can be tried out free of charge, though without support. For both, you have to register and make a €0 purchase, which is a bit cumbersome. The on-a-stick variant can be tried for 30 days; the management server can be used permanently for free with up to 49 clients. Paid licenses start at 60 euros per year for openthinclient-on-a-stick.
JetBrains opens the round of 2020.3 releases of its development environments with the database IDE DataGrip. Worth mentioning above all are SQL queries against the document-oriented database MongoDB. A connection to the NoSQL database Couchbase has also recently been added, albeit via Couchbase's N1QL query language.
NoSQL in focus In conjunction with MongoDB, the IDE enables direct SQL queries. The database normally relies on native drivers for programming languages such as C++, Java, Node.js, Go or Python and offers its own query language, the MongoDB Query Language (MQL). In the current release, DataGrip comes with a driver written in JavaScript that takes care of translating SQL commands into MongoDB queries.
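To illustrate the kind of mapping such a driver performs, the toy translator below (my own simplified sketch, not DataGrip's actual code) turns a trivial `SELECT cols FROM collection WHERE col = 'value'` into the filter and projection documents a MongoDB `find()` call would receive:

```python
import re

# Toy illustration (not DataGrip's driver): translate a trivial SQL SELECT
# into the filter and projection documents MongoDB's find() expects.
def sql_to_mql(sql: str):
    m = re.match(
        r"SELECT\s+(.+?)\s+FROM\s+(\w+)(?:\s+WHERE\s+(\w+)\s*=\s*'([^']*)')?$",
        sql.strip(), re.IGNORECASE)
    cols, collection, col, val = m.groups()
    mongo_filter = {col: val} if col else {}
    projection = (None if cols.strip() == "*"
                  else {c.strip(): 1 for c in cols.split(",")})
    return collection, mongo_filter, projection

print(sql_to_mql("SELECT name, age FROM users WHERE city = 'Berlin'"))
# ('users', {'city': 'Berlin'}, {'name': 1, 'age': 1})
```

DataGrip's real driver handles far more of SQL than this, but the principle is the same: parse the SQL statement and emit the equivalent MongoDB query objects.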
DataGrip 2020.3 brings a MongoDB driver for direct SQL queries.
(Image: JetBrains)
For a start, the driver only allows SELECT queries, and the permitted clauses are JOIN, WHERE, GROUP BY, HAVING, ORDER BY, LIMIT and OFFSET. The JavaScript code generated for a query can be copied or displayed and edited via the context menu using Copy JS script to clipboard or Show JS Script.
In the NoSQL environment, the newly added connection to Couchbase is also noteworthy. For the interaction, DataGrip has its own JDBC driver that executes queries using the Couchbase-specific query language N1QL. The connection initially works exclusively via the Couchbase Query Service, while there is not yet a connection to the Couchbase Analytics Service.
Cleverly extracted Two new methods for extracting data have been added for import and export. Data extractors are essentially rules for copying data into the editor. The new menu entry One-row adds the selected elements on a single line, which can be useful, among other things, for transferring the values of a column into an IN clause.
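The effect of the One-row extractor can be mimicked in a few lines; the quoting helper below is a simplified sketch of my own (it assumes string values contain no embedded quotes):

```python
# Mimics what the One-row extractor produces: a column's values joined on
# one line, ready to paste into an IN clause. Simplified sketch; assumes
# string values contain no embedded quotes.
def one_row(values):
    return ", ".join(
        str(v) if isinstance(v, (int, float)) else f"'{v}'" for v in values
    )

user_ids = [101, 102, 105]
print(f"SELECT * FROM orders WHERE user_id IN ({one_row(user_ids)})")
# SELECT * FROM orders WHERE user_id IN (101, 102, 105)
```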
Data extractors can now take over several lines for an INSERT statement.
(Image: JetBrains)
The likewise new method SQL-Insert-Multirow allows copying several lines into a single INSERT statement. A small innovation concerns the CSV export, which can now omit quotation marks entirely: for Quote values, the CSV dialog now offers Never in addition to When needed and Always.
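The three quoting policies map naturally onto the constants of Python's csv module, which makes the behavior easy to demonstrate (the correspondence to DataGrip's dialog options is my own analogy):

```python
import csv
import io

# Demonstrates the three CSV quoting policies: "When needed" behaves like
# csv.QUOTE_MINIMAL, "Always" like csv.QUOTE_ALL, and the new "Never" like
# csv.QUOTE_NONE (which needs an escape character for embedded delimiters).
row = ["plain", "needs, quoting", "42"]

for label, policy in [("When needed", csv.QUOTE_MINIMAL),
                      ("Always", csv.QUOTE_ALL),
                      ("Never", csv.QUOTE_NONE)]:
    buffer = io.StringIO()
    writer = csv.writer(
        buffer, quoting=policy,
        escapechar="\\" if policy == csv.QUOTE_NONE else None)
    writer.writerow(row)
    print(f"{label}: {buffer.getvalue().strip()}")
# When needed: plain,"needs, quoting",42
# Always: "plain","needs, quoting","42"
# Never: plain,needs\, quoting,42
```

Note that exporting without quotes only works cleanly when embedded delimiters are escaped or absent, which is presumably why Never was not offered before.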
View into the cell In addition, the current release extends the Cell Value Editor introduced in DataGrip 2020.2 for editing individual cell contents. Among other things, single-line XML and JSON content can be displayed in a structured, formatted multi-line view. The Cell Value Editor has also recently started displaying images and can be moved to the lower edge to save space.
The Cell Values Editor is now showing images.
(Image: JetBrains)
For editing, the selection function has been extended: it can now not only be enlarged but also shrunk. This is done with the keyboard shortcut Ctrl+Shift+W on Windows and Opt+Down on macOS.
Further innovations in DataGrip 2020.3 can be found on the JetBrains blog. Details on the MongoDB connection are in a separate blog post from the end of October. JetBrains offers all its IDEs in a subscription model, with the price dropping over the first three years. DataGrip costs 89 euros per user in the first year and 53 euros per user from the third year. Unlike IntelliJ IDEA, there is no free version of DataGrip.
Apple and Qualcomm are certainly not the only adopters of the Arm architecture that develop system-on-chips for PCs. Rockchip, a major fabless chip developer from China, is working on its high-end RK3588 SoC that promises to bring together performance and low power consumption.
Rockchip expects its RK3588 to address various kinds of performance-hungry applications, including high-end tablets, laptops, AR/VR, televisions, and even NVR (network video recorder) surveillance systems, according to a report from CNX Software that cites information revealed at the Rockchip Developer Conference (RKDC).
Modern high-end SoCs for flagship smartphones and tablets already offer compute performance comparable to or exceeding that of processors that were used for mainstream PCs several years ago. In fact, Rockchip’s RK3588 packs quite a punch with eight general-purpose cores (four high-performance Cortex-A76 and four energy-efficient Cortex-A55 in DynamIQ configuration), a quad-cluster Arm Mali ‘Odin’ GPU, a 6 TOPS NPU accelerator, and an advanced multimedia engine supporting 8Kp30 video encoding as well as 8Kp60 video decoding. As for memory, the RK3588 has a 64-bit (4×16-bit) LPDDR4X/LPDDR5 controller that supports up to 32GB of DRAM.
But performance alone is not enough to address a broad range of PC-based and emerging applications. To power a competitive computer, an SoC needs to support traditional PC interfaces, but for emerging applications like AR/VR gear, it also has to support mobile I/O. This is exactly what the Rockchip RK3588 does. The SoC supports dual GbE, SATA, PCIe 2.0, PCIe 3.0 x4, DisplayPort 1.4, eDP, HDMI 2.1, MIPI DSI, eMMC 5.1, SDIO, and USB 2.0/3.1 interfaces. The chip does not seem to have SPI and UFS interfaces, which might be a cost-cutting measure.
Staying true to its mobile roots, and in a bid to enable devices like AR/VR gear as well as tablets, the RK3588 also features a 48MP (2x 24MP) ISP with HDR and 3D NR support.
Rockchip plans to use Samsung Foundry’s 8LPP (8 nm) manufacturing technology to produce the SoC. This fabrication process does not use extreme ultraviolet (EUV) lithography and was primarily designed for relatively cost-effective applications. Meanwhile, it is hard to estimate the pricing of the RK3588.
As for operating systems support, Rockchip officially claims that the RK3588 SoC will work with Google’s Android, Linux, and China’s ‘domestic’ OS. For some reason, Rockchip did not reference Microsoft’s Windows on Arm, but perhaps the company did not want to mention a proprietary operating system designed by a U.S.-based company at its RKDC 2020 event held in China.
Rockchip originally planned to launch its RK3588 SoC sometime in the second half of 2020, but for some reason, the chip was delayed to Q2/Q3 2021.
Breaking down the best gaming headsets we’ve tested (Image credit: Zivica Kerkez/Shutterstock)
Finding the best gaming headset is arguably nearly as important as choosing the best graphics card or the best gaming keyboard. After all, the sound of your virtual world and how you communicate with your friends all depends on the device you wear on your head.
But choosing the best gaming headset for you isn’t easy, partially due to the sheer amount of market saturation we’re facing right now. With the ever-rising popularity of eSports and the relative simplicity of combining off-the-shelf audio hardware with cushy earcups, a sprinkle of software wizardry and maybe some RGB, PC gamers are now offered more options than ever, whether they’re planning to plug their headset into one of the best gaming PCs or the best gaming laptop. A quick search of a few popular online retailers will yield hundreds of choices across dozens of companies, ranging from under $10 (£8) to over $600 (£460).
If you’re headed back to school or work in a virtual situation, now’s a great time to invest in a quality headset and clear mic. You may already know how much you’re willing to spend on a pair of cans, but there are still plenty of other things to consider.
Luckily, we’ve been testing piles of gaming headsets (to see every model we’ve tested, check out our gaming headset reviews page). Below are the best gaming headsets we’ve tested.
Here are some things to keep in mind when searching for the best gaming headset for you:
Wired or wireless? Wired headsets generally cost less and don’t need to be charged. Therefore, if you typically game at your desk, you may want to stick to wired options to keep things cheaper and simpler. A wired headset also won’t die on you mid-battle. On the other hand, there’s no denying the convenience of being able to run to the kitchen for a drink without having to remove your cans.
Headbands and earcups. Comfort is more subjective than measuring audio output and input, but generally speaking you should be wary of plush gaming headsets with thick bulges, cheap foam and cloth covers. When we’ve tested these types of headsets, we’ve often found disappointing acoustic performance. Ear-cushion material can make a huge difference in what your ears ultimately perceive.
Audio and mic quality. These are very important if you want the best gaming headset, but impossible to evaluate on the one or two floor models. We focus on these aspects in detail in our reviews. In short, detailed reproduction and good spatial resolution, specifically when it comes to complex noises and environments with multiple sound sources, are more important than any attempt at simulated surround sound.
A key Bluetooth spec: aptX. If you do go wireless and opt for Bluetooth (no USB dongle needed), look for headsets that support Qualcomm’s aptX tech, a compression tech (codec) that’s been leveraged for decades in TV and movie voice-work, movie theater audio and thousands of radio stations. If you’ve heard Bluetooth audio in years past and hated it (it definitely was bad for a long time), give an aptX-enabled headset a listen. As long as the underlying hardware is good, you’ll be pleasantly surprised by the sound output.
Best gaming headsets at a glance:
1. Best Gaming Headset Overall: HyperX Cloud Alpha
2. Best Wireless Gaming Headset for Most: SteelSeries Arctis 7
3. Best Budget Gaming Headset: Asus TUF Gaming H3
4. Best Virtual Surround Sound Gaming Headset: HyperX Cloud Orbit S
5. Best Gaming Headset for RGB Lovers: Patriot Viper V380
6. Best-Looking Gaming Headset: Corsair Virtuoso RGB Wireless SE
7. Best High-Res Gaming Headset: SteelSeries Arctis Pro + GameDAC
8. Good Price, Great Mic: Corsair Void RGB Elite USB
9. Best Splurge: SteelSeries Arctis Pro Wireless
The Best Gaming Headsets You Can Buy Today
The HyperX Cloud Alpha is the best gaming headset for the typical player. (Image credit: HyperX)
The HyperX Cloud Alpha is the best gaming headset for most gamers, offering nearly perfect sound quality. Noise reproduction with these cans sounds natural, with the drivers avoiding flaws like overly aggressive bass or highs. It’s not a revolutionary headset, but it’s a fantastic value, especially if you can find it for cheaper than $100.
In terms of long-term wearability, the headset earns its Cloud branding with a light, comfy fit built with quality materials. This includes thick memory foam padding on the headband and earcups and HyperX’s decision to opt for aluminum over plastic in some important areas. The overall look and feel is one of quality.
If you like the Cloud Alpha’s design but want something with some more features, there’s also the HyperX Cloud Alpha S. It’s basically the same headset but with 7.1 virtual surround sound, an inline controller and bass sliders on each ear cup. The black-and-blue or all-black color options (instead of the Cloud Alpha’s black and gold or black and red) add more options too.
Read: HyperX Cloud Alpha review
The SteelSeries Arctis 7 is the best wireless gaming headset for most, due to its price-to-value ratio. (Image credit: SteelSeries)
Wireless cans can cost you well over $200, but the latest model of the SteelSeries Arctis 7 lands at a more affordable price while offering louder audio than its predecessors. Meanwhile, high volumes are distortion-free, and the overall audio is clear and rich, despite a less than snug fit.
We’d prefer stronger performance at lower volumes, and the bass isn’t as good as what you’d hear with the HyperX Cloud Alpha above. However, besides games, the Arctis 7 is also fit for light video editing and mixing. DTS Headphone:X v2.0 virtual surround sound also boosted gaming audio details, like footsteps.
Just like prior versions of SteelSeries’ Arctis 7, the current model sits as a reliable wireless headset for PC gamers. If you need a cheaper wireless option, consider the Cooler Master MH670, which is currently $40 cheaper than the Arctis 7, or the SteelSeries Arctis 1 Wireless for $100. For something more colorful, there’s the Logitech G733 Lightspeed, and for ultimate comfort, consider the Razer BlackShark V2 Pro.
Read: SteelSeries Arctis 7 review
The Asus TUF Gaming H3 is the best gaming headset for good quality that won’t break the bank. (Image credit: Tom’s Hardware)
Advertised virtual 7.1 surround sound is Windows Sonic, usable by any 3.5mm headset
The Asus TUF Gaming H3 is the best gaming headset for preserving your budget. These can be hard to find, but you can typically spot it selling for about $50. Despite the lower price, you still get a headset that fits well and sounds good right out of the box. That means you can get right to gaming without having to fiddle around in software. When we tested the cans, performance was comparable to pricier rivals, including the Asus TUF Gaming H7. We attribute a lot of that to the H3’s comfortable fit with leatherette contact points preventing sound leakage.
The downside is these aren’t particularly pretty. And if you’re excited about virtual 7.1 surround sound, note that the H3 is a 3.5mm headset that only uses Windows’ Sonic spatial audio, which any 3.5mm headset can use.
But when it comes to gaming and hearing sound cues like weapon switches, this headset gets the job done without effort on your part or heavy damage to your bank account.
Looking for a cheap headset without virtual surround sound? Check out our Roccat Elo X Stereo review.
Read: Asus TUF Gaming H3 review
The HyperX Cloud Orbit S stands out with powerful and immersive 3D audio with effective head tracking. (Image credit: Tom’s Hardware)
4. HyperX Cloud Orbit S
Best Virtual Surround Sound Gaming Headset
Driver: 100mm neodymium | Impedance: Not disclosed | Frequency response: 10-50,000 Hz | Mic: Unidirectional condenser | Connectivity: 3.5mm, USB Type-A, USB Type-C | Weight: 0.8 pounds (362.9 g)
Immersive and loud 3D audio
Soft, squishy headband and ear cups
Good battery life
Accurate head tracking
A little heavy
Head tracking’s audio impact varies depending on game
The HyperX Cloud Orbit S is an expensive, premium pair of cans and the best gaming headset for splurging. It gives you a discernible gaming advantage, thanks to its customizable 3D mode with head tracking. When you’re gaming with head tracking, the location of enemies is apparent and the auditory environment moves with you. You can also use head tracking as game controls, freeing up your hands for more action. (For another head tracking option with premium features, check out the similarly priced JBL Quantum One).
There are lower-priced headsets with true surround sound (instead of the Orbit S’ virtual surround sound) and wireless capability. But the Orbit S, which bears the same cozy memory foam headband and earpads as other headsets in HyperX’s Cloud line, provides a gaming edge you’ll actually notice.
We also love the versatility of this headset. In addition to offering hi-res, virtual surround and 3D audio, you can use the headset with a 3.5mm jack, USB Type-A port or USB Type-C port.
Read: HyperX Cloud Orbit S review
With tasteful lighting, the Patriot Viper V380 is the best gaming headset for RGB fans. (Image credit: Tom’s Hardware)
Virtual surround sound can be helpful for games, movies
Environmental noise cancellation mic produces quality sound
Fair price
Software is basic
USB Type-A connection only
RGB isn’t as prominent in the headset world, likely because it’s hard to see lighting sitting on your ears. But if you plan on streaming, video chatting or just like the comfort of having as much RGB as possible (have you seen our best RGB mouse pads list?), the Patriot Viper V380 is the best headset for you. It has one ring of programmable RGB framing each earcup. That’s just the right amount of color, and, somehow, the headset still manages to look tasteful.
The Viper V380 has more to offer than just pretty lights. It boasts a mic that successfully limited background noise during testing, as well as virtual 7.1 surround sound that enhanced how voices sounded in FPS games. Volume is also more than sufficient with the headset’s 53mm drivers that are larger than the 50mm average. Just be sure you have an available USB Type-A port, because there are no other connectivity options.
The Viper V380 has some of the best RGB implementation we’ve seen in a headset. But if you’re looking for something that’s even flashier, the JBL Quantum One has three RGB zones programmable with some wild effects.
Read: Patriot Viper V380 review
The Corsair Virtuoso RGB Wireless SE headset will cater to your vanity. (Image credit: Tom’s Hardware)
The Corsair Virtuoso RGB Wireless SE is one of the rare headsets that looks as good as it sounds. It offers premium audio quality that enters audiophile territory and looks pretty and shiny instead of clunky and heavy. The SE version of the Virtuoso RGB boasts gunmetal-colored aluminum stamped with a touch of RGB via the Corsair logo. Overall, it looks as expensive as it is.
The Virtuoso RGB SE delivered strong audio, including high-res support, in our testing. Its 50mm drivers also sounded great with gunfights in games like Borderlands 3. The cans’ music reproduction sits between bass-heavy cans like Audio-Technica’s ATH-G1 and flatter-sounding ones like the SteelSeries Arctis Pro Wireless listed below.
Topping things off with a 20-hour wireless battery life, Corsair’s Virtuoso RGB Wireless SE is a fine pair of cans that both look and sound premium.
Read: Corsair Virtuoso RGB Wireless SE review
If you’ve got high-res content, the SteelSeries Arctis Pro + GameDAC are the cans for you. (Image credit: SteelSeries)
7. SteelSeries Arctis Pro + GameDAC
Best High-Res Gaming Headset
Driver: 40mm neodymium | Impedance: 32 Ohms | Frequency response: 10-40,000 Hz | Mic: Bidirectional noise-canceling | Connectivity: 3.5mm, USB Type-A or S/PDIF | Weight: 0.9 pounds (426.1g)
High-end materials exude solid build quality
Separate GameDAC does a lot for sound quality and available settings
Well-balanced frequency response
Comfortable on narrow heads
Headband does not fit well over large heads
Artificial limit on output
With a premium build, sound quality and a price tag to match, the SteelSeries Arctis Pro + GameDAC is fit for audiophiles. It uses an ESS ES9018 Sabre32 reference DAC, a component worth money on its own that amplifies the headset’s capabilities. We were sad to find out that the DAC has an artificial audio limit (to help prevent hearing damage). However, the DAC is easily navigable, with plenty of settings for tweaking audio for gaming and chatting without opening software.
The Arctis Pro + GameDAC showed deep bass, satisfying mid-range in games and vocals and dominating high frequencies without sounding too metallic.
If you’re looking for a high-performance PC gaming headset (it works with PS4 too), the Arctis Pro + GameDAC is exceptional for games and music — as long as your head isn’t particularly big.
Read: SteelSeries Arctis Pro + GameDAC review
Everyone will hear you loud and clear with the Corsair Void RGB Elite USB’s mic. (Image credit: Tom’s Hardware)
If you do a lot of chatting on your headset, be it with your Overwatch teammates, work colleagues or Mom, the Corsair Void RGB Elite USB will make sure you sound just like you to whoever’s listening. For this price, we were pleased with the microphone’s quality; it can handily fold up when you need to take a sip of water or sneeze. It’s also Discord-certified and showed better low-end response than rivals. It’s not quite as warm as what you can get with the best gaming microphones or any USB mic, but it’s close.
On the other hand, when we tested the headset with a smaller head, bass was lacking due to sound leakage. Your head size may change things. The Void RGB Elite USB also has virtual 7.1 surround sound, but it didn’t prove to be anything extraordinary.
For chatterboxes, this is the best gaming headset with its mid-range price, cozy padding and splash of RGB. Note there’s also a wireless version of the Void RGB Elite USB. For more mic options, consider the expensive JBL Quantum One, which comes with a unidirectional and detachable boom microphone and a separate calibration microphone.
Read: Corsair Void RGB Elite USB review
If you can afford it, it’s hard to beat the SteelSeries Arctis Pro Wireless. (Image credit: SteelSeries)
9. SteelSeries Arctis Pro Wireless
Best Gaming Headset Splurge
Driver: 40mm neodymium | Impedance: 32 Ohms | Frequency response: 20-40,000 Hz | Mic: Bidirectional condenser | Connectivity: USB Type-A wireless dongle or Bluetooth 4.1 | Weight: 0.8 pounds (357g)
Comfortable headband design
Peerless swappable battery system
Crisp hi-res audio
Feature-laden base station
Needs base station to charge
Headband durability concerns
The SteelSeries Arctis Pro kicks things up a notch or two over other SteelSeries cans, including the Arctis 7 wireless ones listed above. It’s very pricey, even for a wireless headset. But you get your choice of wireless dongle or Bluetooth connectivity, which means you could use the Arctis Pro Wireless without it occupying a USB port.
The cans offer a large frequency response range and high-res. Lossless titles, like Wolfenstein II: The New Colossus, sounded noticeably crisper with a lot of depth on the Arctis Pro. Ultimately, the game sounded more immersive, particularly in the high end, where we could hear the different layers of sound. You also get DTS Headphones:X virtual surround sound via a transmitter base station boasting other helpful features, like ChatMix and general volume control.
Despite its higher price, the Arctis Pro Wireless isn’t vastly more comfortable than the cheaper Arctis 7 wireless cans and doesn’t offer twice as detailed audio. But the Arctis Pro Wireless has the advantage in its smart design, Bluetooth capability and swappable batteries to keep the party going while traveling.
For a cheaper Bluetooth option, consider the Sennheiser GSP 670 and for the ultimate portability, the Asus ROG Strix Go 2.4.
I have been a technology cheapskate most of my life. I’ve rarely bought a monitor brand-new; I’m pleased to say I pieced together my current three-screen articulating swing-arm setup primarily from Craigslist and hand-me-downs. But this fall, I had an opportunity to temporarily replace my three aging displays with the most ridiculous, most advanced gaming monitor ever made: the super-ultrawide, super-curved, ultra-high resolution 49-inch Samsung Odyssey G9.
The Samsung Odyssey G9 is a monitor so big, so wide, so curved, it can fill a midsized desk and wrap around your entire field of view. It’s also simply a phenomenal screen: speedy (240Hz, 1ms, G-Sync, and FreeSync 2), high resolution (5120 x 1440-pixel), and bursting with brilliant color thanks to a QLED panel that tops out at an eye-searing 1,000 nits of brightness. I’m not kidding when I say I have to avert my eyes when I launch Destiny 2 in HDR, and I could swear I felt the flames the first few times my Star Wars: Squadrons’ TIE Bomber blasted an X-Wing into oblivion.
As they say on Reddit, I have ascended — and the past few weeks have been a gaming and productivity experience like few I’ve had before.
But gradually, I’ve been coming back down to Earth.
Design
The Odyssey G9 is a showstopper, and I don’t just mean that figuratively: last January, attendees of the world’s biggest technology show were dazzled by its unprecedented curvature and sci-fi inspired frame.
When I put that same monitor on my humble IKEA sit-stand desk, the effect is otherworldly. Compared to my old hodgepodge of screens and rat’s nest of cabling, this G9 looks like a terminal aboard a Star Trek spaceship… even if my physical keyboard and its long braided cable ruin the illusion a bit.
The sheer size of the Odyssey G9 and its broad-shouldered stand do limit your options. I’m lucky that my small-form-factor Ncase M1 can fit behind the screen, and there’s just enough clearance (a little over six inches) for my Audioengine A2+ speakers to fit underneath the monitor at the stand’s highest position. But if I had a bigger PC or bigger speakers, I might have also needed a bigger desk — or else had to use the included 100mm x 100mm VESA adapter to mount the nearly four-feet wide, one-foot deep, 31-pound screen to the wall. My current monitor arms can’t carry nearly that much weight, though you can buy some TV arms that do.
As it is, I’m a fan of the way this monitor brings my whole desk together. Two DisplayPorts and an HDMI 2.0 port let me switch between three video sources easily, including a side-by-side mode which lets me display two at once, effectively giving my PC and game console (or a second computer) each their own 24.5-inch, 2560 x 1440 displays.
There’s also a two-port USB-A 3.0 hub and a 3.5mm audio output, which worked perfectly with my keyboard’s USB and 3.5mm audio passthrough. As you can see from my photos, I can do a lot with only a single visible cable thanks to those ports. And while the narrow V-shaped stand might seem a little minimal for a monitor this hefty, it takes a decent shove to get it to tip forward even at its highest position.
Underneath the rear cover, cable management at work.
You can adjust the monitor’s settings using a tiny five-way control nub underneath the power LED, and it’s remarkable how much you can tweak — including the ability to crop the entire panel to 4:3, 16:9, or 21:9 aspect ratios instead of stretching out the image. You can effectively have a 27-inch HDR panel for your game console or TV whenever you need. It’s just a shame that the monitor’s biggest benefits don’t necessarily translate to its side-by-side mode, where your 240Hz HDR screen generally becomes a pair of 60Hz SDR ones.
Productivity
My first big test for Samsung’s Odyssey G9 wasn’t a console or even PC gaming — last month, I co-hosted The Verge’s industry-famous Apple event live blog, capturing every screenshot you saw. I normally run three monitors because I switch tasks like mad, and if there’s a better multitasking test than an Apple event, I haven’t met it yet.
Lots of room for productivity.
At first, I wasn’t sure this epic screen would work. Most apps and websites aren’t designed to display across the vast expanse of a single 32:9 monitor, so you have to live in windows. I couldn’t simply toss one or two apps onto each monitor like I usually do. But while Samsung doesn’t ship the G9 with any good windowing software and Windows 10’s default Snap is woefully insufficient, Microsoft’s free downloadable FancyZones windowing manager worked wonders.
I built my own set of dedicated snappable spots for the Apple live stream; The Verge’s live-blogging tool; Slack; a browser window to keep track of any Apple press releases that might pop during the show; and even a narrow strip of Windows Explorer so I could see which images I’d already captured and weed them out as necessary. The only other wrinkle was the additional Chrome extension I had to download to ensure YouTube could launch “full screen” in a browser window, instead of taking over my entire ultrawide monitor.
In general, while I did occasionally miss my two vertically oriented monitors for scrolling long webpages, Google Docs, and Tweetdeck, I found the G9’s gigantic horizontal expanse of real estate nearly as effective for most tasks. Where I could only squeeze four narrow columns of Tweetdeck onto my old portrait-orientation screens, the G9 would comfortably fit five, plus a 30-tab web browser, a nice vertical strip of Evernote for note-taking, and our Slack newsroom alongside.
The curve is even more noticeable from this angle.
I wouldn’t say it’s better than having three screens for work, but it seems like a sufficient substitute — except maybe that toast notifications now pop up in the corner of my eye where they’re pretty easy to miss. Still, it’s nice not to have to match color, contrast, and brightness across three screens at a time, or adjust how my mouse crosses from one monitor to the next. Having a single, vast, unbroken expanse of real estate that’s always the same distance from my face (as I spin in my chair) is an absolute treat. And while the Odyssey G9’s unprecedented curve does tend to catch ambient light, the matte screen does a great job of diffusing any glare.
The ultrawide aspect ratio didn’t work as well for video as I hoped, though. While you might imagine 32:9 being great for movies, I had a hard time finding anything I could play in ultra high definition that wasn’t 16:9. Most streaming platforms won’t easily let you access their 4K and HDR content on a Windows machine to begin with — YouTube’s the primary exception, though Netflix works if you’ve got a recent Intel processor and use Microsoft Edge or the native app — and you’ll want 4K to take advantage of a screen this high-res and this close to your face. The 4K YouTube videos I played were definitely clearer than 1080p — I could really peep these pixels in Dieter’s iPhone 12 video review. And while standard 16:9, 1080p content does display just fine full-screen with black borders on the sides, it feels like I’m wasting a lot of screen real estate that way. Plus, the blacks are a bit gray, not the deep inky black you’d get from an OLED screen — particularly with HDR on and Samsung’s iffy local dimming enabled.
Gaming
The first thing you should know about gaming on the Odyssey G9 is that you’ll want a serious graphics card to go with it. Technically, 5120 x 1440 resolution isn’t quite as many pixels as a 3840 x 2160 4K UHD screen… but remember we’re also talking about a monitor that goes up to 240Hz. To properly review the Odyssey G9, I borrowed an Nvidia GeForce RTX 3080 to get enough horsepower, since my GTX 1080 couldn’t even run games like Death Stranding or Destiny 2 at 60fps smoothly at that resolution.
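That pixel-count comparison is easy to verify; a quick sketch with the resolutions from the spec sheet:

```python
# Total pixels per frame: the G9's 32:9 panel vs. a standard 16:9 4K UHD panel.
g9 = 5120 * 1440      # Samsung Odyssey G9
uhd = 3840 * 2160     # 4K UHD

print(g9)                  # 7372800
print(uhd)                 # 8294400
print(round(g9 / uhd, 2))  # 0.89 -> about 11% fewer pixels than 4K
```

So the panel is a touch lighter than 4K per frame, but at 240Hz the GPU may be asked to fill it four times as often as a 60Hz 4K screen.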
The second thing you should know about gaming on the Odyssey G9 is that it may not be quite as immersive as you’re imagining.
Don’t get me wrong: having an X-Wing cockpit wrapped all around you is an epic experience, and it feels like a true advantage to be able to use my peripheral vision in competitive shooters like PUBG and CS:GO. But it wasn’t long until I noticed something weird going on.
Look carefully at these images: notice how the sides are warped? Imperial deck officers and Novigrad Temple Guards aren’t generally this pudgy.
I tried game after game after game on the Odyssey G9, digging into my Steam, Epic, and Uplay libraries and tweaking a variety of settings, and this is simply the reality: every 3D game gets warped when you’re viewing it in a 32:9 aspect ratio, and there’s not much you can do about it. Changing your field of view in a game doesn’t get rid of the effect; it simply changes how much of the game world appears in the center of your screen (where things look normal) and at the edges (where they look stretched and zoomed). I actually pulled out a tape measure and checked: video game content that measures 4.75 inches at the center of the display can get stretched to a full 12 inches at the edges.
Now, this isn’t Samsung’s fault; it’s just the way games are built. Most games have a single virtual camera that exists at a single point in space, and while Nvidia once proposed changing that (see link above), the company’s Simultaneous Multi-Projection doesn’t seem to have made it into any of the games I tested. And in games with pre-rendered cutscenes, like Final Fantasy XV, you’ll be watching them at their original aspect ratio.
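The stretching falls straight out of the rectilinear projection games use: screen position is proportional to tan(θ), so an object of fixed angular size at angle θ from the view axis occupies on-screen width proportional to sec²(θ). Here’s a small sketch of that geometry (the FOV figure is my own illustrative assumption, not something stated in this review):

```python
import math

def edge_stretch(hfov_deg: float) -> float:
    """Ratio of on-screen size at the screen edge vs. the center for a
    standard rectilinear (planar) game projection.

    Screen x = tan(theta), so a fixed angular size maps to a width
    proportional to d(tan theta)/d(theta) = sec^2(theta); the stretch at
    the edge relative to the center is therefore sec^2(hfov / 2)."""
    half = math.radians(hfov_deg / 2)
    return 1.0 / math.cos(half) ** 2

# The tape measure found ~4.75 in at center vs ~12 in at the edge,
# a stretch of roughly 2.5x — consistent with a horizontal FOV of
# a little over 100 degrees (an assumed value, for illustration):
print(round(edge_stretch(103), 2))  # 2.58
```

The wider you push the field of view, the faster sec²(θ) blows up at the edges, which is why FOV sliders only move the distortion around rather than removing it.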
But before you write off 32:9 ultrawides right now, there are three things I’d like you to consider:
You might get used to it.
It’s not that distracting in some games!
2D games aren’t affected at all.
Let me give you some examples.
CS:GO and PUBG are incredibly competitive, nail-biting games where focus is everything, where you always need to have your gun at the ready and be scanning for any sign of movement. I don’t have time to turn my head left and right to appreciate the scenery or think about whether it’s warped. The G9 simply gives me enhanced peripheral vision, and it helps — not hurts — that things which appear in the corner of my eye are zoomed in by default. I got used to treating it as my peripheral vision and nothing else. (The 240Hz also comes in pretty handy in games like CS:GO where you can actually hit that frame rate.)
Genshin Impact, Abzû, Rocket League, and BioShock Infinite are games with gorgeous, colorful worlds whose proportions aren’t “normal” to begin with, and I love having them wrapped around me.
XCOM 2’s more cinematic movements look pretty good in 32:9 as well.
In Destiny 2 and XCOM 2, I found I could forgive the warping because of the enhanced field of view and the ability to easily zoom whenever you want. It’s nice to see more of the battlefield at once in XCOM while planning out how my soldiers will move each turn, and it’s pretty cool to aim down the sights in Destiny without the typical claustrophobia that comes with zooming in, since you’re still able to see what’s going on around you.
2D / 2.5D games like Worms W.M.D and Disco Elysium do look fantastic on the G9 — when you can find ones that actually support an ultrawide screen. That’s not a given: I managed to launch Soldat at 5120 x 1440 resolution, but it didn’t stretch across my monitor. Games with fixed widths like Streets of Rage 4 and Hyper Light Drifter won’t either. Even Disco Elysium only offers 21:9 support, not 32:9, unless you apply a hack.
Borderlands 3 looked pretty good at 32:9 when it wasn’t moving.
And for every one of the 3D games that worked, I also found a Borderlands 3 or The Witness or Goat of Duty or The Witcher 3 where the warped geometry really bugged me, either because I wanted to sit back and look at the beautiful vista or because the edges of my screen were moving faster than the center.
That’s not easy to show you in still images, so here’s a video clip to show you what I mean:
In games like the hack-and-slash Mordhau or the road-tripping Final Fantasy XV, the distraction can also be when a piece of geometry that’s critical to the game constantly looks wrong. (Your Mordhau sword or axe often extends into the warped area of the screen; the road itself in FFXV looks curved instead of flat!)
Frankly, the most annoying game I played on the Odyssey G9 was figuring out which games would work in the first place. Here, I have to shout out Rock Paper Shotgun’s Katharine Castle, whose brilliant example-filled guide showcases nearly three dozen titles that do work, complete with GIFs so you can see for yourself. But if you’re willing to work at it (and understand the risks), a community at the Widescreen Gaming Forum (WSGF) and PCGamingWiki can help you hack and patch many existing titles to work at 32:9, too.
Death Stranding tweaked to run at 32:9.
For instance, I installed a trainer that let me run Death Stranding at full-resolution 32:9, with an infinitely adjustable field of view, instead of the 21:9 that designer Hideo Kojima and company shipped.
Persona 4 Golden at 32:9. I still need to un-stretch the UI.
Using a common tutorial, I hex-edited my Persona 4 Golden .exe and remarkably wound up playing what was originally a 480p PlayStation 2 game — and later a 720p, 16:9 PlayStation Vita game — at a glorious 3840 x 1080 at 32:9. (I do still need to figure out how to un-stretch the UI.) And there’s an old, unmaintained program called Widescreen Fixer that helped me revisit an old favorite:
I wouldn’t say the community is robust enough that you could necessarily find a fix for any game in your library. But the WSGF does now have a Discord you might want to check out.
The ultimate ultrawide, but the best monitor?
The Samsung Odyssey G9 costs $1,479.99 — a premium price for a premium monitor like nothing else on the market. You can find other 49-inch 32:9 panels for less, but none with this combination of resolution, brightness, curvature, and refresh rate. The closest you can come is last year’s $1,200 Samsung CRG9 which maintains the resolution and brightness but with half the refresh rate at 120Hz and a notably less pronounced 1800R curvature — which, I imagine, wouldn’t be as good at giving you convincing peripheral vision in games.
If you’re looking for the ultimate ultrawide, this is currently it. I’m just not convinced that I am, personally, even if I had that much money earmarked for a new screen. For $1,500 and the enormous amount of space the Odyssey G9 consumes, I could buy a 48-inch LG OLED TV instead. I’d get a screen just as gigantic for my multitasking, but taller, with 120Hz G-Sync and FreeSync support, incredibly deep blacks, HDMI 2.1 for variable refresh rate for the PS5 and Xbox Series X, and no need to troubleshoot aspect ratios for my videos and games. Linus Tech Tips has a video that dives deep into the pros and cons of that LG screen, and I came away fairly convinced.
It wouldn’t be the same experience that the G9 offers, of course, and I might regret it if Nvidia and AMD ever dust off Simultaneous Multi-Projection for real. The TV might also cut off access to a large portion of my desk, and I might not be able to place my PC and speakers within easy reach without blocking a bit of the screen. But I’d have a more obviously future-proof setup; an equally, if not more gorgeous image; and a lot less ambient annoyance when I want to game. At the very least, here’s hoping Samsung adds HDMI 2.1 to this epic monitor next year.
Photography and screenshots by Sean Hollister / The Verge
(Pocket-lint) – Looking for a new Chromebook, eh? Chromebooks have been getting more and more successful, and are available from many big names in laptops including HP, Acer and Asus.
Education has also been a big market for the devices, but they’re also great for a simple home laptop for Gmail, Google Docs and web browsing.
Can’t decide between a Chromebook and a standard Windows or Mac laptop? Then check out Chromebook vs laptop: Which should you buy?
Chrome OS is Google’s alternative to Windows and, essentially, is a more complex version of the Chrome browser with a desktop. It’s ridiculously simple to use, updates itself and lacks the confusion and complexity that often hobbles Windows and macOS. That said, it is less capable for professional use than the other operating systems. But for most users at home or in education it provides more than enough functionality.
Here’s our list of the best Chrome OS laptops and we’ll be adding more in as time goes on.
Our pick of the best Chromebooks to buy today
Pocket-lint
Pixelbook Go
squirrel_widget_168562
If you’re impressed by the Pixelbook line but have a tighter budget, the Pixelbook Go is a hugely impressive entry at a lower price-point (though it’s still pricey for a Chromebook). It’s a uniquely-designed machine that makes some brilliant choices, such as the ripple texture on the unit’s bottom side.
It’s got impressive power under the hood, too, to make sure that you’ll feel like you’re using a more premium machine than its price suggests, and we comfortably rate it as one of the best Chromebooks we’ve ever used.
Google Pixelbook Go review: A sublime Chromebook experience
Lenovo
Lenovo IdeaPad Flex 3
squirrel_widget_333717
Sitting at the lower end of Lenovo’s Chromebook range is this brand new IdeaPad Flex 3.
An 11-inch machine with an IPS display, this great entry-level laptop is powered by an Intel Celeron processor, while there’s up to 10 hours of battery life on board in addition to a plethora of ports – two USB-A, a USB-C and an audio jack.
Although it’s available with 32GB of storage, we’d recommend this 64GB option.
Pocket-lint
Asus Chromebook Flip C434
squirrel_widget_161310
A more recent entry in Asus’s line of Chromebooks, the Flip C434 is a great 2-in-1 that feels really premium while still undercutting the Pixelbook’s price nicely. It’s got the holy triumvirate for a laptop: a solid screen, great keyboard and impressive battery life.
The trackpad lets the team down a little bit, but it runs really nicely and has great ports. Plus, with the option to use it like a tablet, it’s really adaptable and useful in different scenarios.
Asus Chromebook Flip C434TA review: One of the best Chromebooks money can buy
Acer
Acer 714 Chromebook
squirrel_widget_333637
Acer is a super choice for your Chromebook buy and while this 714 has a bit of a utilitarian design, there’s a powerful dual-core Intel Core i3 processor in our preferred version alongside 128GB of storage and 8GB of RAM.
There’s a Full HD display, too, powered by Intel’s UHD 620 graphics.
What’s more, there’s an embedded fingerprint reader for accessing the laptop without your password.
HP
HP Chromebook 14 G5
View offer on HP Store (sponsored link)
There’s nothing better for portable productivity than the Chromebook 14 G5. This machine deploys the secure, intuitive Chrome OS to deliver seamless compatibility with Google’s apps, and the HD webcam and noise-cancelling microphone give you confidence when collaborating.
The Chromebook has a responsive, spill-resistant keyboard, 802.11ac wireless networking and 8GB of memory – and it’s powered by Intel’s latest generation of Celeron processor.
Pocket-lint
Acer Chromebook 11
squirrel_widget_143241
If you’re looking at the budget end of the market then a Chromebook is a savvy choice. The Chrome OS operating system has developed considerably over the course of time, with both online and offline use possible, including the option to run Play Store apps.
Sure, the Acer Chromebook 11 isn’t going to bring design excitement with its plastic finish, but it’s befitting of the price point. Besides, it’s smartly made and portable, has a 360-degree hinge so the screen can be positioned as you please, and it works for many hours at a time too.
If your school or college season is incoming and you don’t have heaps of cash then this is a savvy buy to crack on with your homework.
Pocket-lint
Asus Chromebook Flip C436
squirrel_widget_176742
You want how much for a Chromebook? The Asus Chromebook Flip C436 is one of the most expensive Chromebooks we have tested to date, priced at just under a grand.
It’s a premium Chromebook with design appeal – but its high-end spec isn’t as useful in Chrome OS as a comparable Windows machine and, therefore, it’s slightly harder to recommend than the C434 above.
Asus Chromebook Flip C436 review: Flippin’ powerful, but flippin’ pricey too
Microsoft has released a new world update for Flight Simulator, which presents many regions of the United States in more detail than before. It appeared at the same time as the patch to version 1.11.6.0, which by itself uses barely 13 GB of SSD storage. The World Update itself, which can be optionally downloaded from the in-game marketplace, weighs around 4 GB and is available free of charge. After the Japan update in September, it is the second free update to the game’s scenery.
The World Update USA revises the elevation maps for numerous areas and grants access to updated aerial photographs. Microsoft has marked the respective locations on a rough USA map. In addition, the US update improves the level of detail of four airports (Atlanta International, Dallas / Fort Worth International, New York Stewart International and Friday Harbor) as well as numerous attractions. Microsoft lists a total of 48 of such points of interest, such as the Washington Monument, Capitol and White House, Mount Rushmore, Fort Knox, Yosemite El Capitan and Half Dome, Alcatraz, the Las Vegas Strip at night, the Kennedy Space Center and Pearl Harbor.
Microsoft Flight Simulator: United States World Update
Aircraft improvements
Patch 1.10.6.0 is supposed to eliminate numerous causes of crashes, although Microsoft does not reveal any details. So far, numerous players have struggled with sudden crashes to the desktop for which there was no generally applicable solution – it remains to be seen to what extent this update brings improvements. The patch also adjusts the masking of distant aircraft. In addition, the behavior of numerous aircraft has been revised: for example the kerosene consumption of the Airbus A320neo and Boeing 747-8 airliners. Microsoft made minor adjustments to the Cessna 121 B, Cirrus SR22, Extra 330LT, Pipistrel Virus SW 121 and Robin Cap 10.
For the CH Throttle Quadrant, Flightstick Pro and Combatstick input devices as well as the Honeycomb Bravo Throttle Quadrant, the update brings ready-made standard profiles. Interesting for fans of night flights: the lighting of smaller streets should now be displayed more randomly.
Flight Simulator is available in three editions for 70 Euro (Standard), 90 Euro (Deluxe) and 120 Euro (Premium Deluxe), each containing more aircraft and more finely reproduced airports. In addition to the digital versions via the Windows Store and Steam, the standard and premium editions are also available as disc versions with 10 DVDs, distributed by Aerosoft.
The security solution McAfee Endpoint Security (ENS) is actually supposed to protect Windows computers. Due to three vulnerabilities, the opposite may now be the case. Patched versions provide a remedy.
DoS and malicious code attacks
Two of the gaps (CVE-2020-7331 and CVE-2020-7332) are classified with the threat level “high”. If attackers successfully exploit the vulnerabilities, they could paralyze the protection software via DoS attacks or even execute malicious code on affected systems. ENS versions 10.6.1 and 10.7.0 are affected.
The loophole with the identifier CVE-2020-7333 is classified as “medium”. Here, attackers could execute their own scripts via XSS attacks. Only ENS 10.7.0 is susceptible to this. In versions 10.6.1 November 2020 Update and 10.7.0 November 2020 Update, the developers have closed the gaps.
The warning message does not reveal what specific attacks might look like.
Autumn always brings new installments of immortal long-running series, among which the most common are Assassin’s Creed, FIFA, Need for Speed and of course a real veteran – Call of Duty. Although the current year brought a lot of serious problems and bitter disappointments, the mighty Activision miraculously spared fans of shooting at the enemies of democracy. The latest incarnation, with the lengthy title Call of Duty: Black Ops Cold War, continues the tradition by sending the player on a sensational trip to various regions of the world as a defender of the global order. What are the hardware requirements for Call of Duty: Black Ops Cold War? It’s worth checking, especially since the production supports ray tracing and DLSS.
Author: Sebastian Oktaba
The first installment of Call of Duty debuted in 2003, and since 2005 subsequent parts have been released regularly, which is why the series already includes seventeen full-fledged computer games and is an indispensable element of the autumn-winter publishing schedule. Although the realities and characters have changed many times over the decades, and not all the solutions introduced were sensible, one thing has remained untouched – the cinematic formula of the game, intended to increase the player’s immersion. The age-old graphics engine, once considered the most archaic in the world of first-person shooters, went unchanged for a long time, but last year’s Call of Duty: Modern Warfare finally made a decisive step in the right direction. Call of Duty: Black Ops Cold War uses the achievements of its predecessor and adds some new things, catching up with the competition, Battlefield among others. We will soon find out how the programmers approached optimization.
Call of Duty: Black Ops Cold War is based on the modified IW 8.0 engine that drove its predecessor, but introduces DLSS in addition to ray tracing.
Call of Duty: Black Ops Cold War’s hardware requirements are divided into four categories, depending on what settings and additional functions we plan to enable. The basic configuration calls for an Intel Core i3-4340 / AMD FX-6300 processor – really old units with the performance of today’s low-end. The same applies to the eight-year-old NVIDIA GeForce GTX 670 / AMD Radeon HD 7950 graphics cards, whose counterparts currently occupy the lowest segments; this needs to be supplemented with 8 GB of RAM and 175 GB of free disk space. The recommended machine includes an Intel Core i5-2500K / AMD Ryzen 5 1600X processor and an NVIDIA GeForce GTX 970 / AMD Radeon RX 580 graphics card, so once again equipment within reach of the average player. Things only get difficult when we want to turn on ray tracing, because then the requirements grow exorbitantly. According to the developers, you will need at least an NVIDIA GeForce RTX 3070 / AMD Radeon RX 6800 and an Intel Core i7-8700K / AMD Ryzen 7 1800X.
                  Minimum            Recommended        RTX Medium         RTX Ultra
AMD processor     FX-6300            Ryzen 5 1600X      Ryzen 7 1800X      Ryzen 7 3700X
Intel processor   Core i3-4340       Core i5-2500K      Core i7-8700K      Core i9-9900K
AMD card          HD 7950            RX 580             RX 6800            RX 6800 XT
NVIDIA card       GTX 670            GTX 970            RTX 3070           RTX 3080
RAM               8 GB               12 GB              16 GB              16 GB
Disk space        175 GB             175 GB             165 GB             250 GB
Operating system  Windows 10 64-bit  Windows 10 64-bit  Windows 10 64-bit  Windows 10 64-bit
Call of Duty: Black Ops Cold War uses the Black Ops Cold War Engine, which is essentially a modification of the IW 8.0 engine developed for Call of Duty: Modern Warfare. The game runs only under DirectX 12, uses PBR (Physically-Based Rendering) and volumetric lighting, and realistically reproduces thermal radiation. The engine also supports NVIDIA Reflex, Adaptive Shading, DLSS 2.0 and real-time ray tracing. The ray-tracing implementation is primarily focused on lighting effects and shading, including sun shadows, local shadows and ambient occlusion, so the package has been extended compared to Call of Duty: Modern Warfare. The previous installment did not receive DLSS support; this has been made up for here by introducing DLSS 2.0, which offers much better image quality and a significant performance increase compared to native resolution. It is also worth adding that buyers of the GeForce RTX 3080 and GeForce RTX 3090 (e.g. ASUS TUF) from selected partners should get Call of Duty: Black Ops Cold War for free.
Xbox head, Phil Spencer, is continuing to do the rounds at press outlets to discuss all things Xbox and promote the recent Series X/S launch. In a recent interview published this week, we learned about what’s next for xCloud in 2021, with Microsoft’s game streaming service set to become much more accessible.
During a recent interview with The Verge, Phil Spencer was asked about the possibility of an Xbox streaming app for TVs, to which Spencer said “I think you’re going to see that in the next 12 months”. This is a sign that Microsoft plans to grow its xCloud streaming platform in 2021, going beyond Android devices.
Microsoft has toyed with the idea of TV streaming for a while now, with the company’s most recent attempt resulting in a cancelled Xbox-branded TV stick, similar to a Chromecast. Given the prevalence of Smart TVs nowadays though, a simple app would suffice. Microsoft already has an xCloud partnership in place with Samsung, so we could end up seeing Samsung TVs getting access first.
Of course, Microsoft also recently revealed plans to bring xCloud to iPhone and iPad using in-browser streaming. The company is also working on bringing xCloud streaming to Windows 10 – a feature that should be available on PC soon.
While xCloud will continue to grow though, Spencer has no plans to completely phase out local gaming hardware. Instead, Xbox’s approach is a “hybrid environment” where cloud and local compute capability coexist.
KitGuru Says: Cloud gaming still has its question marks but so far, Microsoft seems to be handling the introduction of this technology better than the likes of Google or Amazon. Have any of you tried xCloud recently? Would you use it if it was available on a wider range of devices?
The Razer Blade Stealth is confused about what it wants to be. It’s priced as a premium ultraportable and looks like one too. But as a gaming notebook, it’s quite underpowered for the price. While the OLED screen is beautiful, Razer needs to work on the keyboard.
For
Great build quality
OLED screen is gorgeous
Thunderbolt 4 on both sides
Against
Cheaper gaming laptops offer better graphics
Uncomfortable keyboard
There are a few things I can say with certainty about the Razer Blade Stealth ($1,799.99 to start, $1,999.99 as tested). For one thing, it’s built like a tank. Our option had a beautiful OLED display, and Razer doesn’t heap on bloatware.
What I can’t tell you, though, is who this laptop is for exactly. Razer calls it a “gaming ultraportable,” and prices it among the best ultrabooks, which are often expensive partially due to build quality. But the mix of an Intel Core i7-1165G7 and an Nvidia GeForce GTX 1650 Ti, while capable of playing eSports and even AAA games, is low-end for a laptop of this price. Competing gaming notebooks with far superior graphics performance can be found for less money. To make it more confusing, Razer has announced the Razer Book 13, a non-gaming ultraportable that houses Intel’s Xe integrated graphics.
Among competing ultrabooks, the best gaming laptops and Razer’s own stack, the Blade Stealth feels more niche than it used to. And yet, somehow, it still gets plenty right.
Design of the Razer Blade Stealth
(Image credit: Tom’s Hardware)
We last reviewed the Razer Blade Stealth in September with a 10th Gen Intel Ice Lake processor, and the design hasn’t changed a bit in the intervening two months. But for those who aren’t familiar with it, the Stealth is a black aluminum notebook. The lid features the Razer tri-headed snake logo, but at least the company made it black on black, so you can mostly ignore it if it doesn’t fit your style.
The no-frills aluminum build continues when you unfold the laptop. It’s all-black aluminum on the deck, with speakers flanking both sides of the keyboard. That’s the one spot with some color, as the keys are lit with single-zone Chroma RGB. The display is surrounded by a moderate, but inoffensive, bezel.
The left side of the laptop has a Thunderbolt 4 port over USB Type-C, USB 3.1 Gen 1 Type-A and a headphone jack. The right side is the same, minus the headphone jack. I do appreciate that Razer has the ports evenly distributed across the laptop, and you can charge on either side via the Thunderbolt ports.
If you consider the Stealth to be a gaming notebook, it’s small at 3.1 pounds and 12 x 8.3 x 0.6 inches. The Asus ROG Zephyrus G14, a 14-inch gaming notebook, is 3.5 pounds and 12.8 x 8.7 x 0.7 inches, and even that’s petite for a gaming notebook.
But if the Blade Stealth is an ultraportable, then it’s big. The Dell XPS 13 9310 is 2.8 pounds and 11.6 x 7.8 x 0.6 inches, though it does have fewer ports.
Ports: 2x USB 3.1 Gen 1 Type-A, 2x Thunderbolt 4, 3.5 mm headphone/mic jack
Camera: 720p IR
Battery: 53.1 WHr
Power Adapter: 100 W
Operating System: Windows 10 Home
Dimensions (WxDxH): 12 x 8.3 x 0.6 inches / 304.6 x 210 x 15.3 mm
Weight: 3.1 pounds / 1.4 kg
Price (as configured): $1,999.99
Gaming and Graphics on the Razer Blade Stealth
The Razer Blade Stealth comes with an Nvidia GeForce GTX 1650 Ti for gaming. To put it flatly, that’s not going to get you strong performance outside of some eSports titles, unless you’re willing to bring down your settings.
I tested out the Stealth playing a few rounds of Rocket League, an eSports title that matches the type of game one should most expect to play on this laptop. In a round, I saw frames fluctuate between 177 and 202 frames per second (fps) on high quality mode at 1080p resolution. Since our review unit’s screen only has a 60 Hz refresh rate, it really would have made sense to limit the frame rate.
You may notice that our primary competitor in gaming, the Asus Zephyrus G14, has a much better GPU: an Nvidia GeForce RTX 2060 Max-Q. This isn’t an accident — you can get that machine for $1,449, which is cheaper than the Blade Stealth we’re reviewing. What the Stealth offers is better than the integrated graphics you get in most ultraportables, but other, cheaper gaming laptops do offer more power.
On the Shadow of the Tomb Raider benchmark, on the highest settings at 1080p, the Stealth ran the game at 26 fps, which is below our 30 fps playable threshold, while the Zephyrus G14 hit 49 fps. Red Dead Redemption 2, at medium settings and 1080p, was also unplayable at 22 fps, while the Zephyrus G14 hit 35 fps.
On Grand Theft Auto V, at very high settings at 1080p, the Stealth played at 35 fps, but the Zephyrus G14 hit 115 fps.
The Blade managed to play Far Cry New Dawn (1080p, ultra) at 47 fps, but the Zephyrus beat it again at 73 fps.
To stress-test the Blade Stealth, we ran the Metro Exodus benchmark 15 times on a loop, which simulates about half an hour of gameplay. On the high preset, the game ran at an average of 29.9 fps, suggesting that you really need to drop down to normal or lower for playable frame rates. It hit 30 fps the first two runs before dropping down to around 29.9 for the rest of the gauntlet.
During that test, the CPU measured an average clock speed of 3.5 GHz and an average temperature of 60.8 degrees Celsius (141.4 degrees Fahrenheit). The GPU ran at an average speed of 1,287.6 MHz and an average temperature of 57.2 degrees Celsius (135 degrees Fahrenheit).
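The review reports each temperature in both Celsius and Fahrenheit; the conversion is just F = C × 9/5 + 32. A minimal sketch (the helper name is my own) that reproduces the figures above:

```python
def c_to_f(celsius: float) -> float:
    """Convert degrees Celsius to Fahrenheit: F = C * 9/5 + 32."""
    return celsius * 9 / 5 + 32

# The CPU and GPU averages quoted above, rounded to one decimal place:
print(round(c_to_f(60.8), 1))  # CPU: 141.4 F
print(round(c_to_f(57.2), 1))  # GPU: 135.0 F
```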
Productivity Performance on the Razer Blade Stealth
We tested the Razer Blade Stealth with an Intel Core i7-1165G7 CPU, 16GB of LPDDR4X RAM and a 512GB PCIe NVMe SSD. The package is a formidable workhorse, though other machines in both the gaming and ultraportable space do have some advantages.
On Geekbench 5.0, the Stealth earned a multi-core score of 4,992, falling behind the XPS 13 (also with a Core i7-1165G7) and the Asus ROG Zephyrus G14, a gaming machine with an AMD Ryzen 9 4900HS.
The Blade Stealth copied 4.97GB of files at a rate of 946.6 MBps, beating the XPS 13 9310 but still slower than the Zephyrus.
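For context, the measured rate implies how long the copy actually took. A quick back-of-the-envelope check (assuming GB here means 1,024 MB, as is typical for this file-copy test):

```python
size_mb = 4.97 * 1024          # 4.97 GB expressed in MB
rate_mbps = 946.6              # measured throughput in MBps
seconds = size_mb / rate_mbps  # elapsed copy time
print(f"{seconds:.1f} s")      # roughly 5.4 s
```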
On our Handbrake test, Razer’s laptop took 16 minutes and 19 seconds to transcode a 4K resolution video to 1080p. That’s faster than the XPS 13, but the Zephyrus smashed it in less than half that time.
Display on the Razer Blade Stealth
The 13.3-inch FHD OLED touchscreen on the Stealth sure looks nice. The trailer for Black Widow (is it ever going to come out?) looked excellent. When the titular heroine is surrounded by flames in a car chase with Taskmaster, the orange reflections really stood out on a dark road. The villain’s navy suit contrasted with Red Guardian’s, well, red, knockoff Captain America outfit. In Rocket League, the Blade Stealth’s screen made the orange and blue cars pop against green turf.
(Image credit: Tom’s Hardware)
Razer’s panel covers 83.2% of the DCI-P3 color gamut, just a smidge higher than the Zephyrus’ display. The XPS 13 covers 69.4%.
The Blade Stealth measured an average brightness of 343 nits, while the XPS 13 was the brightest of the bunch at 469 nits. The Zephyrus G14 was a tad behind at 323 nits.
Keyboard and Touchpad on the Razer Blade Stealth
(Image credit: Tom’s Hardware)
Earlier this year, Razer fixed a long-maligned keyboard layout that put the shift key in an awkward place. That’s a major improvement, and the next step should be to focus on key travel. The keys have soft switches, and I had a tendency to bottom out on the aluminum frame, which tired my fingers. As I got used to the keyboard, I hit 112 words per minute with my usual error rate, which isn’t bad, but I could’ve felt a bit better doing it with more travel.
It wouldn’t be a Razer device without Chroma RGB lighting. The keyboard is single-zone backlit and can be controlled via the Synapse software.
The 4.3 x 3-inch touchpad is tall, making it more spacious than much of the Windows competition (though it’s still not as luxuriously large as what you see on Apple’s MacBooks). Windows Precision drivers ensure accurate scrolling and gestures. This is definitely one of the best touchpads on a Windows 10 laptop.
Audio on the Razer Blade Stealth
I’ll give the Blade Stealth’s audio this: It gets loud. The twin top-firing speakers immediately filled up my apartment with sound — in fact, I found it uncomfortable at maximum volume. When I listened to Blackway and Black Caviar’s “What’s Up Danger,” I got the best results with audio around 85%. The audio was clear, with vocals mixing well with sirens and synths in the background, as well as some drums. Bass, however, was lackluster. In Rocket League, car motors and bouncing balls were all clear.
Of the little software pre-installed on the system, one you might want to check out is THX Spatial Audio. Switching between stereo and spatial audio didn’t make a huge difference, but there are some presets, including games, music and voice to toggle between.
Upgradeability of the Razer Blade Stealth
(Image credit: Tom’s Hardware)
Ten Torx screws hold the bottom of the Razer Blade Stealth’s chassis to the rest of the system. I used a Torx T5 screwdriver to remove them, and the bottom came off without much of a fight.
The SSD is immediately accessible, and the Wi-Fi card and battery are also easily available for upgrades. The RAM, on the other hand, is soldered onto the motherboard.
Battery Life on the Razer Blade Stealth
The Razer Blade Stealth’s history with battery life has been mixed, but this iteration with Intel’s 11th Gen Core processors is decent, especially considering it has a discrete GPU. The laptop ran for 9 hours and 11 minutes on our battery test, which continuously browses the web, runs OpenGL tests and streams video over a Wi-Fi connection, all at 150 nits of brightness.
(Image credit: Tom’s Hardware)
But the Razer was outclassed by both the Dell XPS 13 (11:07) and the Zephyrus G14, the longest-lasting gaming notebook we’ve ever seen (11:32), so there’s still room for improvement on Razer’s part.
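The runtimes quoted here are in hours:minutes, which makes the gaps easy to misjudge; a small helper (names are my own) makes the comparison concrete:

```python
def to_minutes(runtime: str) -> int:
    """Parse an 'H:MM' runtime string into total minutes."""
    hours, minutes = runtime.split(":")
    return int(hours) * 60 + int(minutes)

blade = to_minutes("9:11")   # 551 minutes
xps13 = to_minutes("11:07")  # 667 minutes
print(xps13 - blade)         # the XPS 13 lasts 116 minutes longer
```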
Heat on the Razer Blade Stealth
Since Razer classifies the Blade Stealth as a “gaming ultraportable,” we took our heat measurements by pushing it to the limits on our Metro Exodus test.
During the benchmark, the keyboard between the G and H keys measured 42.7 degrees Celsius (108.9 degrees Fahrenheit), while the touchpad was cooler at 30.9 degrees Celsius (87.6 degrees Fahrenheit).
The hottest point on the bottom of the laptop reached 47.3 degrees Celsius (117.1 degrees Fahrenheit).
Webcam on the Razer Blade Stealth
Above the display, the Razer Blade Stealth has a 720p resolution webcam with infrared (IR) sensors. The IR sensors let you use facial recognition to log in to Windows 10 with Windows Hello, which was quick and accurate.
The webcam is passable. It caught details about as fine as a 720p webcam can, like hairs on my head, but the picture was still a little grainy and could definitely be sharper. On a laptop that has so many of the little details right, I’m ready for an upgrade on the camera.
Software and Warranty on the Razer Blade Stealth
You won’t spend much time removing bloatware from the Stealth. The only major piece of software that Razer adds to Windows 10 is Synapse, its hub for Chroma RGB lighting, adjusting performance modes, registering products and syncing accessories.
Windows 10 comes with some bloatware of its own, including Roblox, Hulu, Hidden City: Hidden Object Adventure, Spotify and Dolby Access.
Razer sells the Blade Stealth with a one-year warranty.
Razer Blade Stealth Configurations
We reviewed the $1,999.99 top-end variant of the Stealth, with an Intel Core i7-1165G7, Nvidia GeForce GTX 1650 Ti, 16GB of RAM, a 512GB SSD and a 13.3-inch OLED FHD touchscreen.
I suspect that those who are using this system primarily for gaming will prefer the $1,799.99 base model, which has all of the same parts except for the display, swapping the OLED panel for a 120 Hz FHD screen.
Bottom Line
(Image credit: Tom’s Hardware)
The Razer Blade Stealth does a lot right, with great build quality, a lovely OLED screen and symmetrical Thunderbolt 4 ports for convenient charging on either side of the system.
If you’re buying this as an ultraportable, it’s expensive at $1,999.99 (with the OLED screen, anyway). But if you’re buying it as a gaming notebook, you should look elsewhere to save money and get better performance. The Asus Zephyrus G14 gives you the best Ryzen mobile processor for gaming around and has an RTX 2060 Max-Q for $450 less at $1,449.99. It doesn’t have Razer’s build quality or an OLED option, but it’s a worthwhile tradeoff in performance and you still get a 120 Hz display.
Razer doesn’t offer a version of the Stealth without the GTX 1650 Ti. That’s saved for the new Razer Book 13, which we haven’t reviewed yet as of this writing. But with that notebook in the wings, and the excellent Razer Blade 15 Advanced on the other side of Razer’s lineup, it makes a lot of sense to either spend more for better parts for gaming or spend less for better parts for productivity.
Those who just want to mix in some casual eSports play with work will get what they need out of this laptop, but a high-priced eSports laptop is a bit of a niche. While the Stealth is still a decent laptop, it doesn’t make as much sense as it used to.