The Framework Laptop, first announced back in February, now has full specs and pricing, and the company is opening pre-orders for the machine. It will come in three configurations, starting at $999.
That base model has an Intel Core i5-1135G7 processor, 8GB of DDR4 RAM, a 256GB NVMe SSD, Wi-Fi 6 and will run Windows 10 Home. A $1,399 performance configuration bumps the processor up to an i7-1165G7 and doubles the RAM and storage to 16GB and 512GB, respectively. A professional model, starting at $1,999, has an i7-1185G7, 32GB of RAM, a 1TB SSD, support for vPro and runs Windows 10 Pro.
There will also be a DIY edition, starting at $749 barebones, that you build yourself from a kit and customize with parts and modules.
Each laptop will also have a 3:2, 2256 x 1504 display, 1080p webcam with a privacy switch, a 55Wh battery and a keyboard with 1.5 mm of key travel, all features you might find in one of the best ultrabooks. The entire motherboard is replaceable to allow upgrades to future generations of processors, which are typically soldered to the board on laptops.
Pre-orders are starting today in the United States, and Canada will come soon, with Asia and Europe coming before the end of the year.
But like most tech companies, Framework hasn’t been immune to supply chain issues, which it says will “limit the number of Framework Laptops we have available at launch.” The company will take its pre-orders in small batches to ensure it can fulfill orders. The first batch, Framework says, will ship in July, with more to come. A pre-order requires a $100 refundable deposit, and the balance will be paid when it’s ready to ship.
While the Framework Laptop sounds promising on paper, other small computer vendors have faced issues with fulfillment. Eve Devices, for instance, had issues fulfilling its Eve V, and some potential buyers have proven far more cautious around its Spectrum monitor and second-gen convertible. But Framework is acknowledging the difficulty in sourcing parts right now, so at least it’s being straightforward there.
The modular Framework Laptop is now available for preorders, with prices starting at $999 for a fully assembled computer or $749 for the DIY Edition, which adds the option of paying less upfront if you’re willing to bring your own RAM, SSD, charger, and Wi-Fi card.
For prebuilt systems, Framework is offering three starting configurations. There’s the entry level, $999 Base, which offers a Core i5-1135G7 processor, 8GB of RAM, a 256GB NVMe SSD, and Wi-Fi 6. The Performance model, starting at $1,399, ups the processor to a Core i7-1165G7 and doubles the RAM and SSD storage. And the Professional configuration, starting at $1,999, has a Core i7-1185G7, 32GB of RAM, 1TB of storage, and Windows 10 Pro.
Whether fully prebuilt or DIY, the Framework features a 13.5-inch 2256 x 1504 screen, a 1080p 60fps webcam, a 55Wh battery, and a 2.87-pound aluminum chassis. Ports are handled through a customizable expansion card system, which lets users choose up to four from a selection of USB-C, USB-A, HDMI, DisplayPort, and microSD slots. There’s also the option to add extra storage through removable USB-C SSDs, and Framework promises more port options in the future.
To preorder, Framework is only asking that users put down a $100 deposit, with the first laptops set to ship at the end of July. Preorders are only available in the US for now; Canada is promised in “the next few weeks,” and European and Asian availability is set for later this year. Additionally, Framework warns that due to the ongoing global supply shortages, it’ll have limited supply at launch but plans to do its preorders in batches that should ship throughout the year.
Framework is obviously a new company with an untested platform here. It is offering a 30-day return guarantee and a one-year limited warranty, though, which might help anyone deciding whether to plunk down the cash for a new computer.
(Pocket-lint) – The HTC Vive Pro was revealed as the successor to the HTC Vive back at CES 2018 in Las Vegas. Since then the company has launched the HTC Vive Pro Eye and HTC Vive Cosmos. In 2021, at Vivecon, the company revealed the HTC Vive Pro 2.
While Oculus is focussing on making accessible and affordable VR headsets for the masses with the Oculus Quest line-up, HTC is very much aiming for the top-tier, best-in-class VR experience.
We’re summarising the differences between the headsets so you know what’s changed.
HTC Vive Pro 2: Dual front-facing cameras, adjustable comfort dial
The original HTC Vive was a striking VR headset with a funky black finish and an unmistakable pock-marked design. It was a wired virtual reality headset that required a fairly high-end gaming PC in order to work. This headset was the start of serious VR headsets from HTC and the company has continued to improve upon an award-winning formula since then with various iterations of the Vive Pro line-up.
The HTC Vive Pro is immediately recognisable thanks to some striking design changes. Where the original headset came in black, the HTC Vive Pro came in a bold blue with two front-facing cameras.
The classic pock-marked design remained, with the sensors still a key part of the VR tracking experience, but the Vive Pro included some comfort upgrades missing from the original Vive, as well as other improvements.
Where the HTC Vive featured three velcro straps that needed adjusting to get the right fit, the Vive Pro had an updated design with a solid strap, integrated headphones and a clever comfort system, including a dial at the back that allows for easy fit and comfort adjustment.
The design of the HTC Vive Pro also features enhanced ergonomics to give a more balanced fit by decreasing weight on the front of your face while you play. This includes a redesigned face cushion and nose pad combination which blocks out more light than the design of the original HTC Vive.
The HTC Vive Pro has two front-facing cameras that look like eyes on the front of the headset. These are primarily designed for developers to take advantage of, but allow for better tracking of your environment as you game too.
The HTC Vive Pro 2 has mostly maintained the same outward design aesthetics as the Vive Pro, the main difference being that the front faceplate is now black instead of blue. A lot has changed under the hood, but HTC has taken an “if it isn’t broken, don’t fix it” attitude to the general setup of the headset itself.
The HTC Vive Pro uses a DisplayPort 1.2 connection. This is something to bear in mind if you’re considering the upgrade or purchase of the HTC Vive Pro – as not all graphics cards have a DisplayPort output and you might need an adapter.
Despite significantly upgraded visuals, it’s said that any machine capable of running the Vive Pro will be able to handle the Vive Pro 2 as well. That’s thanks to something called “Display Stream Compression”, which downscales visuals if necessary on lesser hardware.
The Vive Pro Eye was an interesting addition to the Vive line-up: a powerful VR headset aimed more at “professional” users than gamers.
It features similar design aesthetics to the HTC Vive Pro but stands out as having rings around the two front-facing cameras. The highlight of this device is the internal tech though as the HTC Vive Pro Eye features eye-tracking technology. This design, therefore, includes LED sensors around the lenses that both track and analyse eye movements as you observe the virtual world.
The HTC Vive Pro, the HTC Vive Pro Eye and the Vive Pro 2 all feature adjustable headphones, a head strap and an eye relief system to ensure you get a comfortable gaming experience. All these headsets are compatible with a wide range of games available from Steam and Viveport.
Display resolution and specifications
HTC Vive Pro: 1440 x 1600 per eye (2880 x 1600 overall resolution), 110-degree field of view, 90Hz refresh rate
HTC Vive Pro Eye: 1440 x 1600 per eye (2880 x 1600 overall resolution), 615 PPI, 110-degree field of view, 90Hz refresh rate
HTC Vive Pro 2: 2448 x 2448 pixels per eye (4896 x 2448 overall resolution), 120-degree field of view, 120Hz refresh rate
The original HTC Vive was the pinnacle of VR when we first reviewed it. Things have come a long way since then and screen technology has changed a lot.
The HTC Vive Pro offered an increased resolution to deliver an even better optical experience. Dual OLED displays on the headset offered a total resolution of 2880 x 1600. That’s 1440 x 1600 per eye, compared to 1080 x 1200 per eye on the original HTC Vive.
The HTC Vive Pro Eye offered the same visual specifications as the Vive Pro, with the only difference being in the way this headset tracks your eyes.
The HTC Vive Pro 2 has leapt forward even further, offering not only 4896 x 2448 pixels but also a faster 120Hz refresh rate and a wider field of view.
HTC claims the Vive Pro 2 has the “best-in-class” display with the highest resolution to date, even compared to top-of-the-line competitors like the HP Reverb G2 and Valve Index.
This resolution change improves clarity during gaming as well as enhancing immersion for gamers. The HTC Vive Pro 2 offers clearer text rendering and a crisper picture whether playing games or watching videos while using the headset. In-game textures are smoother and more realistic as well as stunning to look at.
HTC has also improved the Vive Pro 2 with the addition of a dual-stack lens design with two lenses redirecting the image for a wider field of view. This is said to have a bigger sweet spot and a more realistic view of the world around you. The fast-switch LCD IPS panel also sports RGB subpixel technology and that, combined with the high pixel count should result in virtually no screen-door effect.
Despite these changes, the Vive Pro 2 can still run on similarly specced gaming PCs:
The recommended specifications are:
Processor: Intel Core i5-4590 or AMD FX 8350, equivalent or better.
Graphics: Nvidia GeForce GTX 1060 or AMD Radeon RX 480, equivalent or better.
Memory: 4 GB RAM or more
Video out: DisplayPort 1.2 or newer
USB ports: 1x USB 3.0 or newer port
Operating system: Microsoft Windows 8.1 or Windows 10
Audio quality and features
HTC Vive Pro: High-performance Hi-Res certified headphones with a built-in amplifier and 3D spatial sound, dual microphones with active noise cancellation
HTC Vive Pro Eye: Hi-Res certified headphones, built-in digital amplifier, 3D spatial sound, dual microphones with active noise cancellation
The HTC Vive Pro includes earcups built right into the design. These headphones offer a similar design to the Deluxe Audio Strap upgrade for the HTC Vive, but with improvements to enhance them further.
The HTC Vive Pro includes high-performance headphones with a built-in amplifier that delivers a superior audio experience including a “heightened sense of presence” and better spatial sound.
The HTC Vive Pro only requires a single cable to connect to the link box which then attaches to your PC, so there are far fewer cables to get in the way as you game.
The headphones click down into place when you need them and click up out of the way when you don’t.
The design of the HTC Vive Pro also includes dual built-in microphones with active noise cancellation for a superior communication experience when playing multiplayer or co-op games. These headphones also include volume controls and a mic mute button built right into the design for easy access while you play.
The HTC Vive Pro Eye and the HTC Vive Pro 2 offer the same audio experience as the HTC Vive Pro. There are no upgrades here as far as we can see from the specs or from testing. It is worth noting though that the headphones on the HTC Vive Pro 2 are detachable so you can pop them off and use your own if you so wish.
Tracking compatibility and upgrades
HTC Vive Pro: Backwards compatibility with original base stations (sold separately)
The original HTC Vive required users to plug two base stations into the mains power supply in the room that would make up the playspace. These sensors would then help track and relay movement data of both the headset and controllers back to the PC. With a base station in either corner of the room, users can achieve a Room-Scale play space of around 4×3 metres.
The HTC Vive uses sensors that make it capable of tracking six degrees of movement – meaning it can track all movement up and down, back and forth and around the play space as long as the base stations can see you.
The HTC Vive Pro is compatible with the original HTC Vive base stations, meaning that, theoretically, if you own the original VR device you can just buy the new headset and it will work fine with your original setup. New and improved base stations also offer an increased level of Room-Scale tracking, with up to a 10 x 10 metre playspace.
The Vive Pro 2 follows the same logic, and with the headset available to buy on its own it makes a logical upgrade path for anyone who owns the original headsets.
As we mentioned earlier, the HTC Vive Pro Eye boasts an upgrade in terms of its tracking capabilities that includes LED sensors that monitor eye movements. This is said to not only allow your eyes to act as a controller but also allows the headset to gather data on your eye movements while you play or look around the virtual environment.
In practice, this will result in faster reactions in gaming and useful data for businesses who are trying to track audience gaze. For example, monitoring what products or virtual objects get the most attention from a lingering look. It also presents the possibility of controlling games with just your eyes – whether indicating where you want to go or by controlling different menus within the game.
The Vive Pro 2 is interesting as it’s still compatible with the HTC Vive wireless adapter. It will also work with the Facial Tracker and with the Vive Tracker 3.0 setup, which means you can theoretically track anything in the real world.
The Vive Pro 2 will also work with both Vive wand controllers and Valve’s Index (“Knuckles”) controllers, giving you more options to control the headset with ease.
Which is the right HTC headset for you?
The HTC Vive Pro 2 is now the most logical choice for those considering an HTC VR headset. It isn’t cheap, but if you’re upgrading from previous HTC headsets then you can save some money by just purchasing the headset and nothing else.
For those who are new to VR, the HTC Vive Pro 2’s price tag might seem high compared to the likes of the Oculus Quest 2, but with some serious specs under the hood, it should be the pinnacle of VR. Though you’ll need a high-end PC to make the most of the headset and the full kit in order to successfully track it.
We thought the HTC Vive Pro was one of the best VR headsets money could buy and the Vive Pro 2 should continue that trend too.
Gigabyte is announcing seven new laptops featuring the latest hardware from Intel and Nvidia, two of which are packing Nvidia’s brand-new RTX 3050 Ampere mobile GPU.
The new Gigabyte G5 and G7 are the company’s latest budget-friendly offerings for mainstream buyers. Both models are packing Intel’s 11th Gen Core processors, the eight-core i7-11800H or the hexa-core Core i5-11400H based on the Tiger Lake architecture. The G5 and G7 also use Nvidia’s newly released RTX 3050 and RTX 3050 Ti mobile GPUs.
Both notebooks feature dual DDR4-3200 slots supporting a max of 64GB (32GB per DIMM), and dual M.2 slots supporting PCIe with one allowing up to Gen 3 speeds and the other up to Gen 4. Plus, you get one 2.5-inch HDD/SSD slot that supports 7mm (or thinner) SATA drives.
The main difference between the G5 and the G7 is display size. The G5 is a 15-inch notebook while the G7 comes in a larger 17-inch form factor. Despite the changes in size, both laptops will come with the same panel specs, with a 1080p display at 144 Hz.
For connectivity, the G5 and G7 come with four USB ports of different variations: a single USB 2.0 Type-A port, dual USB 3.2 Gen 2 ports (one of them Type-C), and a USB 3.2 Gen 1 Type-A port.
For wireless connectivity, the G5 and G7 come with Intel’s AX200 or AX201 wireless cards, both of which support Wi-Fi 6 and Bluetooth 5.2.
The line starts at $1,149 for the lowest-end G5.
Refreshed AERO 15/17
Gigabyte is also updating its Aero line of laptops, which are targeted towards creators and gamers alike. Gigabyte is adding two upgraded models to the Aero lineup, the Aero 15 OLED and the Aero 17 HDR with new CPUs.
The main differences between the 15 and 17 are their size and display type (as the name implies). The Aero 15 will come with a Samsung AMOLED display, so you get crisp, stunning visuals. Unfortunately, you will not be able to get an AMOLED display on the Aero 17, so Gigabyte has opted for a 4K HDR display instead.
The upgrade you’re getting on the refreshed Aero 15 and 17 is the CPU: both the OLED and HDR variants move up to Intel’s 11th Gen Tiger Lake CPUs, specifically the i9-11980HK or the i7-11800H, giving these laptops a big performance and efficiency boost over the previous Comet Lake mobile CPUs.
Like the previous Comet Lake-based Aero 15 and 17, you get options for either an RTX 3070 or RTX 3080 GPU with a 105W TDP.
AORUS 17X
Gigabyte is also refreshing the Aorus 17X, the company’s flagship gaming laptop, with a 17.3-inch display and a thick chassis with vapor chamber cooling to cool Intel’s and Nvidia’s top-tier CPUs and GPUs.
The 17X will come with Intel’s highest-end mobile processor you can get, the i9-11980HK with 8 cores and a max turbo frequency of 5GHz. The chip has a configurable TDP up to 65W. What we don’t know is how Gigabyte configured the TDP for the Aorus 17X.
For graphics, the Aorus 17X will come with an RTX 3080, with a whopping 165W of target graphics power.
This flagship device includes some other top-end specifications, including a 300 Hz display and a mechanical keyboard with Omron gaming switches and RGB backlighting.
This laptop is set to launch in June starting at $2,099.
Over half a dozen manufacturers have announced new models
Intel is adding new processors to its 11th Gen Core H-series lineup today, and over half a dozen laptop manufacturers are announcing new machines that make use of them. In total, there are 10 new Tiger Lake-H processors being announced today, including five consumer processors and five commercial processors, with between six and eight cores. Here’s our full writeup on the chips themselves.
According to Intel, its new H-series processors will be used in over 30 upcoming ultraportables (aka: laptops that are 20mm thick or less) and upward of 80 workstations. Companies including Razer, HP, Asus, Lenovo, MSI, Acer, Gigabyte, and Dell are announcing their first laptops with the new chips today, and we’ve rounded up their models below.
Razer
Razer has announced a range of new Blade 15 Advanced laptops featuring Intel’s 11th Gen H-series processors. At the top of the lineup is a model with a Core i9-11900H paired with an RTX 3080 GPU with 16GB of video memory and a 4K 60Hz OLED touchscreen. But if you’re looking for something a little less powerful, you can get a machine that’s just 15.8mm thick, and Razer claims it’s the smallest 15-inch gaming laptop with RTX graphics. This thinner model is a step down specs-wise: it has a Core i7-11800H, an RTX 3060 GPU, 16GB of RAM, and a QHD 240Hz IPS display.
Razer’s laptops will be available to preorder from May 17th and will ship in June. Prices start at $2,299. Read more about Razer’s new laptops here.
HP
HP has three new laptops it’s announcing today: the ZBook Fury G8, the ZBook Power G8, and the ZBook Studio G8. The Studio G8 can be configured with up to an Intel Core i9-11950H vPro processor, alongside an Nvidia RTX 3080 GPU with up to 16GB of video memory (there’s also the option of equipping it with a more creative-focused Nvidia RTX A5000 GPU). Available display options for the ZBook Studio G8 include 1080p IPS, 4K 120Hz IPS, or 4K OLED.
HP’s ZBook Studio G8 will be available from July at a price that’s yet to be announced. Meanwhile, the Power G8 and Fury G8 will launch at some point this summer. Read more about HP’s new laptops here.
Asus
Asus has new Zephyrus laptops to bring to the table today. First is the Zephyrus M16, which will sit above its more mainstream G-series laptops like the Zephyrus G14 and Zephyrus G15. Asus says the M16 will be configurable with up to an Nvidia RTX 3070 GPU, alongside Intel’s H-series chips. In terms of its display, the Zephyrus M16 has a tall 16:10 aspect ratio, QHD resolution, and 165Hz refresh rate. The company is also announcing the Zephyrus S17, a premium gaming laptop, which is available with up to an Intel Core i9-11900H, 48GB of RAM, and an Nvidia RTX 3080 with 16GB of VRAM.
Pricing and release information for the Zephyrus M16 is yet to be announced. The Zephyrus S17 will be available at some point in Q2 in North America. Read more about Asus’ new laptops here.
Lenovo
While we’re on the topic of 16:10 displays, Lenovo’s new Legion 7i and 5i Pro gaming laptops also use the aspect ratio for their 16-inch screens, paired with a 165Hz refresh rate. Specs for the 7i range up to the flagship Intel Core i9-11980HK, which can be paired with up to an Nvidia RTX 3080 GPU with 16GB of video memory. Step down to the Lenovo 5i Pro and your most powerful options drop to the Core i7-11800H, with an Nvidia RTX 3070. On the lower end, Lenovo also has models featuring Nvidia’s new RTX 3050 and 3050 Ti GPUs.
The Legion 7i and 5i Pro will both release in June starting at $1,769.99 and $1,329.99, respectively. Meanwhile, the 5i will release later in July with a starting price of $969.99. Read more about Lenovo’s new laptops here.
MSI
MSI is announcing a number of new gaming and creator-focused laptops today, ranging from two Creator Z16 models (which are aimed at the kinds of customers that would otherwise have bought a MacBook Pro), down to its more gaming-focused “Katana” and “Sword” machines.
The Creator Z16 has a 120Hz 16:10 QHD+ touch display and is available with an Nvidia GeForce RTX 3060, and either a Core i7-11800H or a Core i9-11900H. Stepping down to the Creator M16 still gets you a QHD+ display, but its internal specs top out at Nvidia’s RTX 3050 Ti and Intel’s Core i7. There’s also a new Creator 17 using the new chips, which is available with up to a Core i9 and RTX 3080, and comes complete with a Mini LED display.
On the gaming side, MSI has also bumped over a half dozen laptops up to the new processors, including the GE76, GE66 Raider, GS76 Stealth, GS66 Stealth, GP76 Leopard, GP66 Leopard, GL76 Pulse, and GL66 Pulse. Finally, there’s the new “Katana” and “Sword” laptops. These are available with up to Core i7-11800H CPUs and include versions with Nvidia RTX 3060, RTX 3050 Ti, and RTX 3050 GPUs.
MSI’s Creator Z16 starts at $2,599, its Katana models start at $999, Sword will start at $1,099, and pricing for the Creator M16 is yet to be announced. The laptops are due to release later this month on May 16th. Read more about MSI’s new laptops here.
Dell / Alienware
Not to be left out of the action, Dell has a collection of new laptops it’s announcing based on Intel’s latest-generation H-series processors, with some targeting consumers and gamers, and others aimed at business users. There are Dell-branded models, as well as laptops from its Alienware subsidiary.
First up is the Alienware M15 R6. It’s available with up to a Core i9-11900H, 32GB of RAM, and an Nvidia RTX 3080 with 8GB of video memory. It’s got a 15.6-inch display, with options for a 1080p 165Hz, 1080p 360Hz, or QHD 240Hz panel. Dell is also teasing the Alienware X17 in a series of images, as well as the teaser trailer embedded above. Details on this laptop are currently slim, but the company says it’ll eventually be available with 11th Gen Intel Core processors and 30-series GPUs from Nvidia.
Dell is also announcing a new G15 laptop today. The laptop will be available with up to an Intel 11th Gen six-core Core i7 CPU, Nvidia 30-series GPUs, and a choice of 120Hz or 165Hz refresh rates for its 15.6-inch 1080p display.
Away from its gaming machines, Dell is also announcing revamped XPS 15 and XPS 17 laptops today. They’ll be available with Intel’s latest processors, Nvidia RTX graphics, and there’s also a new OLED screen version of the XPS 15. Finally, Dell is also releasing updated models across its business-focused Precision and Latitude lineups.
The Alienware M15 R6 will start at $1,299.99, the Dell G15 at $949.99, the XPS 15 at $1,199.99, and the XPS 17 at $1,399.99. All are available from today. Expect more information on the X17 in the months ahead.
Gigabyte
Gigabyte is also announcing new laptops across its Aero, Aorus, and G series lineups.
First up from Gigabyte are new Aero series laptops aimed at creators. There’s the Aero 15 OLED, which is available with up to an Intel Core i9-11980HK, RTX 3080, and 4K HDR OLED display. Meanwhile, the Aero 17 HDR is available with up to the same specs, but it’s got a larger 17.3-inch display (up from 15.6 inches on the Aero 15) which is IPS rather than OLED.
Meanwhile over on the gaming side, there’s the Aorus 15P, Aorus 17G, and Aorus 17X. The 15P and 17G are available with Intel Core i7-11800H processors and up to an Nvidia RTX 3080 with 16GB of video memory. The Aorus 15P has a 15.6-inch 1080p IPS display that’s available with either 240Hz or 360Hz refresh rates, while the Aorus 17G has a 17.3-inch IPS display with a refresh rate of 300Hz. The Aorus 17X also has a 17.3-inch 300Hz IPS display and is available with up to an RTX 3080, but it features a more powerful Intel Core i9-11980HK processor.
Finally, there are Gigabyte’s 15.6-inch G5 MD and G5 GD, and its 17.3-inch G7 MD and G7 GD laptops. Resolution and refresh rate are 1080p and 144Hz across the board. The G5 MD and G5 GD have Intel Core i5-11400H processors, the G7 MD has an i7-11800H, and the G7 GD has an i5-11400H. The laptops are equipped with Nvidia’s new RTX 3050 and 3050 Ti GPUs.
The Aero 15 OLED starts at $1,799, and the Aero 17 HDR starts at $2,499, and both are officially on sale today. The Aorus 15P starts at $1,599, and the 17G starts at $2,099 (pricing for the 17X was not available at time of publication), and they’re also available starting today. Preorders for the new G5 and G7 models also open today, with the G5 starting at $1,149.
Acer
Acer has three new laptops it’s announcing today: the Predator Triton 300, Predator Helios 300, and the Nitro 5. All three are spec bumps of existing models.
The company says its Triton 300 will be available with up to a 4.6GHz Intel 11th Gen H-series processor, an Nvidia RTX 3080 GPU, and 32GB of RAM. Available displays include a 165Hz QHD screen, or a 360Hz 1080p panel.
Next up is the Helios 300. It’s also available with Intel’s latest processors paired with 32GB of RAM, but it maxes out at an Nvidia RTX 3070 GPU. Like the Triton 300, it’s also available with a 360Hz 1080p or a 165Hz QHD display. Similarly, the Nitro 5 is also available with Intel’s latest-generation chips, an RTX 3070 GPU, and 32GB of RAM. Acer says the Nitro 5 is available with 15.6 or 17.3-inch QHD IPS displays with 165Hz refresh rates.
The Predator Triton 300 will be available in North America from July starting at $1,699, while the Nitro 5 will be available from June starting at $999. Pricing and availability for the Predator Helios 300 was not available at time of publication.
Intel has added five consumer processors and five commercial processors to its 11th Gen Core H-series generation (codenamed “Tiger Lake-H”). Both groups include three eight-core chips and two six-core chips. All of the parts are 35W, save the flagship Core i9-11980HK, which is rated at 65W. You’ll see them in over 30 upcoming ultraportables (laptops 20mm or thinner) and over 80 workstations.
The company (unsurprisingly) says the new chips will provide significant performance improvements over their predecessors from the 10th Gen “Comet Lake” series. It claims they’ll provide a 19 percent “gen-on-gen multithreaded performance improvement.”
On the gaming front, Intel says the Core i9-11980HK will deliver significantly better frame rates than its Comet Lake predecessor on titles including Hitman 3, Far Cry New Dawn, and Tom Clancy’s Rainbow Six Siege. The company also took aim at its competitors. It claims the 11980HK also beats the rival AMD Ryzen 9 5900HX on these titles and that its Core i5-11400H (meant for thin and light laptops) will outperform the Ryzen 9 5900HS on some and come close to matching its performance on others.
Intel did not make battery life claims in its presentation. That’s a bit concerning because recent AMD-powered laptops have been excellent in that department for the past two years.
In terms of more nitty-gritty specs, the chips will support up to 44 platform PCIe lanes, Thunderbolt 4 with up to 40Gbps bandwidth, discrete Intel Killer Wi-Fi 6E (Gig+), Optane H20, overclocking with Intel’s Speed Optimizer (on some SKUs), 20 PCIe Gen 4 lanes with RST-bootable RAID0, and turbo boost up to 5.0GHz with Intel’s Turbo Boost Max Technology 3.0.
The commercial chips will support Intel’s vPro platform, which includes a number of business-specific security features and management tools, including Intel’s Hardware Shield (which includes a new threat-detection technology that Intel says is “the industry’s first and only silicon-enabled AI threat detection”), Total Memory Encryption, and Active Management Technology. Intel says its Core i9-11950H will be up to 29 percent faster than its predecessor in product development, 12 percent faster in financial services work, and 29 percent faster in media and entertainment.
Many eyes are on these new chips, as AMD’s Ryzen 5000 mobile series took the laptop market by storm when it was announced earlier this year. Its eight-core chips have shown significant performance gains over previous generations, particularly in multi-core workloads and efficiency. Meanwhile, Apple’s Arm-based M1 chip has put up startlingly good performance numbers while maintaining incredible battery life.
Intel is playing catch-up here, and the Tiger Lake-H chips we’ve gotten to try so far haven’t been astonishing. The lightweight Vaio Z, powered by the quad-core Core i7-11375H, yielded great results on single-core benchmarks but couldn’t hold a candle to Apple’s M1 MacBook Pro in multi-core tasks. On the gaming front, we’ve also tested MSI’s Stealth 15M and Acer’s Predator Triton 300 SE (both powered by the 11375H as well). The Stealth didn’t quite achieve the frame rates we’d expect from a laptop of its price (and couldn’t take full advantage of its QHD screen), and the Predator had disappointing battery life.
I’ll have more to say about these new CPUs when I’ve gotten to test them for myself — hopefully sooner rather than later.
Intel has just announced its new 11th Gen processors for more powerful laptops, and Dell is ready with refreshed versions of its XPS 15 and XPS 17 laptops that add the new chips, along with Nvidia’s latest RTX 30-series laptop GPUs.
The new models are virtually the same on the outside as the more substantial 2020 refresh, which saw the reintroduction of the largest 17-inch size and a redesign for the 15-inch model to better match Dell’s popular XPS 13 design.
But both laptops now offer improved specs, featuring Intel’s 11th Gen Tiger Lake H-series chips, bringing the company’s 10nm process to Dell’s more powerful laptops. Both the XPS 15 and XPS 17 can now be configured with the six-core i5-11400H or the eight-core i7-11800H and i9-11900H options. The XPS 17 also adds an i9-11980HK option, offering eight cores and a maximum 5.0GHz clock speed for what Dell says is the “most powerful XPS laptop ever.”
There are also new, more powerful GPU options. The XPS 15 can now be configured with either Nvidia’s RTX 3050 or RTX 3050 Ti (with 45W of power), while the XPS 17 offers a beefier 60W RTX 3050 or a 70W RTX 3060 GPU.
Both computers can still be configured with up to 64GB of RAM, with options for either 4K (3840 x 2400) or FHD (1920 x 1200) panels, although the XPS 15 also has a 3456 x 2160 OLED option. Ports have also been upgraded: the XPS 17 now has four Thunderbolt 4 ports, while the XPS 15 offers two Thunderbolt 4 ports and a regular USB 3.2 Gen 2 Type-C port.
The XPS 15 will start at $1,199.99, while the XPS 17 will start at $1,399.99. Dell has yet to announce when the new laptops will be available.
Intel introduced its long-awaited eight-core Tiger Lake-H chips for laptops today, vying for a spot on our best gaming laptop list and marking Intel’s first shipping eight-core 10nm chips for the consumer market. These new 11th-generation chips, which Intel touts as the ‘World’s best gaming laptop processors,’ come as the company faces unprecedented challenges in the laptop market — not only is it contending with AMD’s increasingly popular 7nm Ryzen “Renoir” chips, but perhaps more importantly, Intel is also now playing defense against Apple’s innovative new Arm-based M1 that powers its new MacBooks.
The halo eight-core 16-thread Core i9-11980HK peaks at 5.0 GHz on two cores, fully supports overclocking, and despite its official 65W TDP, can consume up to 110W under heavy load. Additionally, Intel has also added limited overclocking support in the form of a speed optimizer and unlocked memory settings for three of the ‘standard’ eight-core models.
As with Intel’s lower-power Tiger Lake chips, the eight-core models come fabbed on the company’s 10nm SuperFin process and feature Willow Cove execution cores paired with the UHD Graphics 750 engine with the Xe Architecture. These chips will most often be paired with a discrete graphics solution, from Nvidia or AMD. We have coverage of a broad selection of new systems, including from Alienware, Lenovo, MSI, Dell, Acer, HP, and Razer.
All told, Intel claims that the combination of the new CPU microarchitecture and process node offers up to 19% higher IPC, which naturally results in higher performance potential in both gaming and applications. That comes with a bit of a caveat, though — while Intel’s previous-gen eight-core 14nm laptop chips topped out at 5.3 GHz, Tiger Lake-H maxes out at 5.0 GHz. Intel says the higher IPC tips the balance towards higher overall performance despite 10nm’s lower clock speed.
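As a rough back-of-the-envelope check (our own arithmetic rather than an Intel figure), you can combine the claimed IPC uplift with the clock regression, assuming performance scales roughly with IPC times clock speed:

```python
# Rough sketch: estimated net speedup if performance scales with IPC x clock.
# The 19% IPC figure and the 5.3 GHz / 5.0 GHz peak clocks come from the
# claims quoted above; the simple scaling model itself is an assumption.
ipc_gain = 1.19       # claimed gen-on-gen IPC improvement
old_peak_ghz = 5.3    # Core i9-10980HK (14nm) peak boost
new_peak_ghz = 5.0    # Core i9-11980HK (10nm SuperFin) peak boost

net_speedup = ipc_gain * (new_peak_ghz / old_peak_ghz)
print(f"Estimated net speedup at peak boost: {net_speedup:.2f}x")  # ~1.12x
```

Under that simple model, the lower peak clock still nets out to roughly a 12% advantage at the top of the boost curve, which is essentially the argument Intel is making.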
The new Tiger Lake-H models arrive in the wake of Intel’s quad-core H35 models that operate at 35W for a new ‘Ultraportable’ laptop segment that caters to gamers on the go. However, Intel isn’t using H45 branding for its eight-core Tiger Lake chips, largely because it isn’t marking down 45W on the spec sheet. We’ll cover what that confusing bit of information means below. The key takeaway is that these chips can operate anywhere from 35W to 65W. As usual, Intel’s partners aren’t required to (and don’t) specify the actual power consumption on the laptop or packaging.
Aside from the addition of more cores, a new system agent (more on that shortly), and more confusing branding, the eight-core Tiger Lake-H chips come with a well-known feature set that includes the same amenities, like PCIe 4.0, Thunderbolt 4, and support for Resizable Bar, as their quad-core Tiger Lake predecessors. These chips also mark the debut of the first eight-core laptop lineup that supports PCIe 4.0, as AMD’s competing platforms remain on the PCIe 3.0 connection. Intel also announced five new vPro H-series models with the same specifications as the consumer models but with features designed for the professional market.
Intel says the new Tiger Lake-H chips will come to market in 80 new designs (15 of these are for the vPro equivalents), with the leading devices available for preorder on May 11 and shipping on May 17. Surprisingly, Intel says that it has shipped over 1 million eight-core Tiger Lake chips to its partners before the first devices have even shipped to customers, showing that the company fully intends to leverage its production heft while its competitors, like AMD, continue to grapple with shortages. Intel also plans to keep its current fleet of 10th-Gen Comet Lake processors on the market for the foreseeable future to address the lower rungs of the market, so its 14nm chips will still ship in volume.
Intel Tiger Lake-H Specifications
| Processor Number | Base / Boost (GHz) | Cores / Threads | L3 Cache | Memory |
| --- | --- | --- | --- | --- |
| Core i9-11980HK | 2.6 / 5.0 | 8 / 16 | 24 MB | DDR4-2933 (Gear 1) / DDR4-3200 (Gear 2) |
| AMD Ryzen 9 5900HX | 3.3 / 4.6 | 8 / 16 | 16 MB | DDR4-3200 / LPDDR4x-4266 |
| Core i9-10980HK | 2.4 / 5.3 | 8 / 16 | 16 MB | DDR4-2933 |
| Core i7-11375H Special Edition (H35) | 3.3 / 5.0 | 4 / 8 | 12 MB | DDR4-3200, LPDDR4x-4266 |
| Core i9-11900H | 2.5 / 4.9 | 8 / 16 | 24 MB | DDR4-2933 (Gear 1) / DDR4-3200 (Gear 2) |
| Core i7-10875H | 2.3 / 5.1 | 8 / 16 | 16 MB | DDR4-2933 |
| Core i7-11800H | 2.3 / 4.6 | 8 / 16 | 24 MB | DDR4-2933 (Gear 1) / DDR4-3200 (Gear 2) |
| Core i5-11400H | 2.7 / 4.5 | 6 / 12 | 12 MB | DDR4-2933 (Gear 1) / DDR4-3200 (Gear 2) |
| Ryzen 9 5900HS | 3.0 / 4.6 | 8 / 16 | 16 MB | DDR4-3200 / LPDDR4x-4266 |
| Core i5-10400H | 2.6 / 4.6 | 4 / 8 | 8 MB | DDR4-2933 |
Intel’s eight-core Tiger Lake-H takes plenty of steps forward — it’s the only eight-core laptop platform with PCIe 4.0 connectivity and hardware support for AVX-512, but it also takes steps back in a few areas.
Although Intel just released 40-core 10nm Ice Lake server chips, we’ve never seen the 10nm process ship with more than four cores for the consumer market, largely due to poor yields and 10nm’s inability to match the high clock rates of Intel’s mature 14nm chips. We expected the 10nm SuperFin process to change that paradigm, but as we see in the chart above, the flagship Core i9-11980HK tops out at 5.0 GHz on two cores, just like the quad-core Tiger Lake i7-11375H Special Edition. Intel uses its Turbo Boost 3.0, which targets threads at the fastest cores, to hit the 5.0 GHz threshold.
However, both chips pale in comparison to the previous-gen 14nm Core i9-10980HK that delivers a beastly 5.3 GHz on two cores courtesy of the Thermal Velocity Boost (TVB) tech that allows the chip to boost higher if it is under a certain temperature threshold. Curiously, Intel doesn’t offer TVB on the new Tiger Lake processors.
Intel says that it tuned 10nm Tiger Lake’s frequency for the best spot on the voltage/frequency curve to maximize both performance and battery life, but it’s obvious that process maturity also weighs in here. Intel offsets Tiger Lake’s incrementally lower clock speeds with the higher IPC borne of the Willow Cove microarchitecture that delivers up to 12% higher IPC in single-threaded and 19% higher IPC in multi-threaded applications. After those advances, Intel says the Tiger Lake chips end up faster than their prior-gen counterparts. Not to mention AMD’s competing Renoir processors.
Intel’s Core i9-11980HK peaks at 110W (PL2) and is a fully overclockable chip — you can adjust the core, graphics, and memory frequency at will. We’ll cover the power consumption, base clock, and TDP confusion in the following section.
Intel has also now added support for limited overclocking on the Core i7-11800H, i9-11900H, and i9-11950H. The memory settings on these three chips are fully unlocked, albeit with a few caveats we’ll list below, so you can overclock the memory at will. Intel also added support for its auto-tuning Speed Optimizer software. When enabled, this software boosts performance in multi-threaded work, but single-core frequencies are unaffected.
Intel also made some compromises on the memory front. First, the memory controllers no longer support LPDDR4X; instead, they top out at DDR4-3200, and even that isn’t the case for most of the 11th-Gen lineup, at least if you want the chip to run in its fastest configuration.
The eight-core Tiger Lake die comes with the System Agent Geyserville, just like the Rocket Lake desktop chips. That means the company has brought Gear 1 and Gear 2 memory modes to laptops. The optimal setting is called ‘Gear 1’ and it signifies that the memory controller and memory operate at the same frequency (1:1), thus providing the lowest latency and best performance in lightly-threaded work, like gaming. All of the Tiger Lake chips reach up to DDR4-2933 in this mode.
Tiger Lake-H does officially support DDR4-3200, but only with the ‘Gear 2’ setting that allows the memory to operate at twice the frequency of the memory controller (2:1), resulting in higher data transfer rates. This can benefit some threaded workloads but also results in higher latency that can lead to reduced performance in some applications — particularly gaming. We have yet to see a situation where Gear 2 makes much sense for enthusiasts/gamers.
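To put rough numbers on that trade-off, here's a small illustrative calculation (our own sketch, not an Intel specification) of how the memory controller clock relates to the DRAM transfer rate in each gear:

```python
# Sketch: memory controller clock in Gear 1 (1:1) vs Gear 2 (2:1).
# DDR4 transfers data twice per memory clock, so the memory clock is half
# the MT/s rating; the gear ratio then sets the controller clock.
def controller_clock_mhz(data_rate_mts: int, gear: int) -> float:
    memory_clock = data_rate_mts / 2   # double data rate
    return memory_clock / gear         # Gear 1 = 1:1, Gear 2 = 2:1

print(controller_clock_mhz(2933, gear=1))  # ~1466 MHz controller, lowest latency
print(controller_clock_mhz(3200, gear=2))  # 800 MHz controller, more bandwidth but more latency
```

The slower controller clock in Gear 2 is where the extra latency comes from, which is why Gear 1 remains the preferred mode for gaming.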
Intel also dialed back the UHD Graphics engine with Xe Architecture for the eight-core H-Series models to 32 execution units (EUs), which makes sense given that this class of chip will often be paired with discrete graphics from either AMD or Nvidia, and possibly Intel’s own fledgling DG1, though we have yet to see any such configurations. For comparison, the quad-core H35 Core i9 and i7 models come equipped with 96 EUs, while the Core i5 variant comes with 80 EUs.
This is Not The Tiger Lake H45 You’re Looking for – More TDP Confusion
As per usual with Intel’s recent laptop chip launches, there’s a bit of branding confusion. The company’s highest-end eight-core laptop chips previously came with an “H45” moniker to denote that these chips have a recommended 45W TDP. But you won’t find that designation on Intel’s new H-Series chips, even though the quad-core 35W laptop chips that Intel introduced at CES this year come with the H35 designation. In fact, Intel also won’t list a specific TDP on the spec sheet for the eight-core Tiger Lake-H chips. Instead, it will label the H-series models as ’35W to 65W’ for the official TDP.
That’s problematic because Intel measures its TDP at the base frequency, so a lack of a clear TDP rating means there’s no concrete base frequency specification. We know that the PL2, or power consumed during boost, tops out at 110W, but due to the TDP wonkiness, there’s no official PL1 rating (base clock).
That’s because Intel, like AMD, gives OEMs the flexibility to configure the TDP (cTDP) to higher or lower ranges to accommodate the specific power delivery, thermal dissipation, and battery life accommodations of their respective designs. For instance, Intel’s previous-gen 45W parts have a cTDP range that spans from 35W to 65W.
This practice provides OEMs with wide latitude for customization, which is a positive. After all, we all want thinner and faster devices. However, Intel doesn’t compel manufacturers to clearly label their products with the actual TDP they use for the processor, or even list it in the product specifications. That can be very misleading — there’s a 30W delta between the lowest- and highest-performance configurations of the same chip with no clear method of telling what you’re purchasing at the checkout lane. There really is no way to know which Intel is inside.
Intel measures its TDP rating at the chip’s base clock (PL1), so the Tiger Lake-H chips will have varying base clocks that reflect their individual TDP… that isn’t defined. Intel’s spec table shows base clocks at both 45W and 35W, but be aware that this can be a sliding scale. For instance, you might purchase a 40W laptop that lands in the middle range.
As per usual, Intel’s branding practice leaves a lot to be desired. Eliminating the H45 branding and going with merely the ‘H-Series’ for the 35W to 65W eight cores simply adds more confusion because the quad-core H35 chips are also H-Series chips, and there’s no clear way to delineate the two families other than specifying the core count.
Intel is arguably taking the correct path here: It is better to specify that the chips can come in any range of TDPs rather than publish blatantly misleading numbers. However, the only true fix for the misleading mess created by configurable TDPs is to require OEMs to list the power rating directly on the device, or at least on the spec sheet.
Intel Tiger Lake-H Die
The eight-core H-series chip package comes with a 10nm die paired with a 14nm PCH. The first slide in the above album shows the Tiger Lake die (more deep-dive info here) that Intel says measures 190mm2, which is much larger than the estimated 146.1mm2 die found on the quad-core models (second image). We also included a die shot of the eight-core Comet Lake-H chip (third image).
We’ll have to wait for a proper die annotation of the Tiger Lake-H chip, but we do know that it features a vastly cut-down UHD Graphics 750 engine compared to the quad-core Tiger Lake models (32 vs 96 EUs) and a much larger L3 cache (24 vs 16MB).
The Tiger Lake die supports 20 lanes of PCIe 4.0 connectivity, with 16 lanes exposed for graphics, though those can also be carved into 2×8, 1×8, or 2×4 connections to accommodate more PCIe 4.0 additives, like additional M.2 SSDs. Speaking of which, the chip also supports a direct x4 PCIe 4.0 connection for a single M.2 SSD.
Intel touts that you can RAID several M.2 SSDs together through its Intel Rapid Storage Technology (IRST) and use them to boot the machine. This feature has been present on prior-gen laptop platforms, but Tiger Lake-H marks the debut for this feature with a PCIe 4.0 connection on a laptop.
The PCH provides all of the basic connectivity features (last slide). The Tiger Lake die and PCH communicate over a DMI x8 bus, and the chipset supports an additional 24 PCIe 3.0 lanes that can be carved up for additional features. For more fine-grained details of the Tiger Lake architecture, head to our Intel’s Tiger Lake Roars to Life: Willow Cove Cores, Xe Graphics, Support for LPDDR5 and Intel’s Path Forward: 10nm SuperFin Technology, Advanced Packaging Roadmap articles.
Intel Tiger Lake-H Gaming Benchmarks
Intel provided the benchmarks above to show the gen-on-gen performance improvements in gaming, and the performance improvement relative to competing AMD processors. As always, approach vendor-provided benchmarks with caution, as they typically paint the vendors’ devices in the best light possible. We’ve included detailed test notes at the end of the article, and Intel says it will provide comparative data against Apple M1 systems soon.
As expected, Intel shows that the Core i9-11980HK provides solid generational leads over the prior-gen Core i9-10980HK, with the deltas spanning from 15% to 21% in favor of the newer chip.
Then there are the comparisons to the AMD Ryzen 9 5900HX, with Intel claiming leads in titles like War Thunder, Total War: Three Kingdoms, and Hitman 3, along with every other hand-picked title in the chart.
Intel tested the 11980HK in an undisclosed OEM pre-production system with an RTX 3080 set at a 155W threshold, while the AMD Ryzen 9 5900HX resided in a Lenovo Legion R9000K with an RTX 3080 dialed in at 165W. Given that we don’t know anything about the OEM system used for Intel’s benchmarks, like cooling capabilities, and that the company didn’t list the TDP for either chip, take these benchmarks with a shovelful of salt.
Intel also provided benchmarks with the Core i5-11400H against the Ryzen 9 5900HS, again claiming that its eight-core chips for thin-and-lights offer the best performance. However, here we can see that the Intel chip loses in three of the four benchmarks, but Intel touts that its “Intel Sample System” is a mere 16.5mm thick, while the 5900HS rides in an ASUS ROG Zephyrus G14 that measures 18mm thick at the front and 20mm thick at the rear.
Intel’s message here is that it can provide comparable gaming performance in thinner systems, but there’s not enough information, like battery life or other considerations, to make any real type of decision off this data.
Intel Tiger Lake-H Application Benchmarks
Here we can see Intel’s benchmarks for applications, too, but the same rules apply — we’ll need to see these benchmarks in our own test suite before we’re ready to claim any victors. Also, be sure to read the test configs in the slides below for more details.
Intel’s 11th-Gen Tiger Lake brings support for AVX-512 and the DL Boost deep learning suite, so Intel hand-picks benchmarks that leverage those features. As such, the previous-gen Comet Lake-H comparable is hopelessly hamstrung in the Video Creation Workflow and Photo Processing benchmarks.
We can say much the same about the comparison benchmarks with the Ryzen 9 5900HX. As a result of Intel’s insistence on using AI-enhanced benchmarks, these benchmarks are largely useless for real-world comparisons: The overwhelming majority of software doesn’t leverage either AI or AVX-512, and it will be several years before we see broad uptake.
As noted, Intel says the new Tiger Lake-H chips will come to market in 80 new designs (15 of these are for the vPro equivalents), with the leading devices available for preorder on May 11 and shipping on May 17. As you can imagine, we’ll also have reviews coming soon. Stay tuned.
MSI is launching a new laptop lineup, including some new designs, in sync with Intel’s Tiger Lake-H processor launch. While the company refreshed its laptops recently at CES 2021, this launch adds several all-new models, some of which will also utilize Nvidia’s new RTX 3050 and RTX 3050 Ti graphics cards. Pre-orders begin today, and laptops will begin to ship on May 16.
MSI GE76 and GE66 Raider
The GE76 and GE66 Raider have taken the flagship spot. (The latter has long been on our list of the best gaming laptops.) They’re the same design, but with 17-inch and 15-inch screens, respectively. Both will go up to an overclockable Intel Core i9-11980HK and Nvidia GeForce RTX 3080. At launch, the GE66 will go up to 4K, while the GE76 will only have faster but lower-resolution 1080p screens. Higher-resolution screens for the 17-incher will come in May and June.
While the design is the same, including a blue aluminum chassis, MSI said it intends to use more powerful cooling. The Raiders also have FHD webcams and have bumped up to Wi-Fi 6E and Thunderbolt 4 for connectivity.
| | MSI GE76 Raider | MSI GE66 Raider | MSI GS76 Stealth |
| --- | --- | --- | --- |
| CPU | Up to Intel Core i9-11980HK | Up to Intel Core i9-11980HK | Up to Intel Core i9-11900H |
| GPU | Up to Nvidia GeForce RTX 3080 (16GB GDDR6) | Up to Nvidia GeForce RTX 3080 (16GB GDDR6) | Up to Nvidia GeForce RTX 3080 (16GB GDDR6) |
| RAM | Up to 32GB at 3,200 MHz | Up to 32GB at 3,200 MHz | Up to 64GB at 3,200 MHz |
| Storage | Up to 1TB | Up to 2TB | Up to 2TB |
| Display | 17.3-inch, 1920 x 1080, up to 360Hz (QHD coming late May) | 15.6-inch, up to 4K, QHD up to 240 Hz | 17.3-inch, up to 4K, FHD up to 300 Hz |
| Networking | Killer WiFi 6E AX1675 (2×2), Bluetooth 5.2 | Killer WiFi 6E AX1675 (2×2), Bluetooth 5.2 | Killer WiFi 6E AX1675 (2×2), Bluetooth 5.2 |
| Battery | 99.9 WHr | 99.9 WHr | 99.9 WHr |
| Starting Price | $1,499 | $2,299 | $1,999 |
MSI GS76 Stealth
We’re also seeing a larger version of the existing MSI Stealth. The new GS76 is a 17.3-inch version of the laptop. (We only saw the GS66, the 15.6-incher, at CES, though that is getting upgraded to new parts, too.) It won’t get the overclockable processor, but you get RAM going up to 64GB at 3,200 MHz, up to 2TB of SSD storage and the same 99.9 WHr battery as the Raider line. Like the Raider, there will be QHD options coming later in the month.
The new design has top-firing speakers, and MSI says this laptop will have a far more tactile keyboard than the previous 17-inch Stealth, the GS75.
MSI GL66 Pulse and Crosshair
The MSI GL Pulse is a new entry that joins the Crosshair, both of which are intermediate-level gaming laptops. They’re largely the same, including metal lids, but the Pulse has some engraved designs where the Crosshair is plainer.
Both the Pulse and the Crosshair will start at $959 with a Core i5-11400H and RTX 3050 and go up from there, topping out at $1,799. Both are getting new keyboards with single-zone RGB, and while the more expensive Raider and Stealth will have Gen 4 SSDs, the GL lineup will stay on Gen 3.
MSI GF Katana and Sword
MSI’s most entry-level gaming notebooks are the new Katana and Sword. They’re replacing the previous GF Thin line. The two notebooks differ only in color: Katana is black with a red keyboard, while Sword is white with a blue keyboard. These differ from the GL lineup in that they are plastic and have fewer panel options.
Katana starts at $999 with a Core i7-11800H and an RTX 3050 Ti and goes up to $1,449 with a Core i7 and an RTX 3060. The white laptop, Sword, has a single $1,099 configuration with a Core i7 and RTX 3050 Ti. Sword has a 15.6-inch, 1920 x 1080 display at 144 Hz, while Katana will be available at both 15 and 17 inches.
MSI Creator Z16
MSI is taking another shot at the MacBook Pro crowd with its new Creator Z16. It starts at an eye-watering $2,599 with an Intel Core i7-11800H, an RTX 3060, 32GB of RAM and 1TB of storage. The $2,999 configuration bumps up to a Core i9-11900H and 2TB SSD.
The new top-of-the-line creator notebooks are minimalist with a CNC aluminum build. MSI has opted for a 16:10 touch display with 2560 x 1600 resolution and a 120 Hz refresh rate. It also includes two Thunderbolt 4 ports and a microSD slot. Unlike the Raider, this one sports a 720p webcam and a 90 WHr battery.
The new Chia cryptocurrency has already started making waves in the storage industry, as we reported back in April. With Chia trading now live, it looks set to become even more interesting in the coming months. The total netspace for Chia has already eclipsed 2 exabytes, and it’s well on its way to double- and probably even triple-digit EiB levels if current trends continue. If you’re looking to join the latest crypto bandwagon, here’s how to get started farming Chia coin.
First, if you’ve dabbled in other cryptocurrencies before, Chia is a very different beast. Some of the fundamental blockchain concepts aren’t radically different from what’s gone before, but Chia coin ditches the Proof of Work algorithm for securing the blockchain and instead implements Proof of Space — technically Proof of Time and Space, but the latter appears to be the more pertinent factor. Rather than mining coins by dedicating large amounts of processing power to the task, Chia simply requires storage plots — but these plots need to be filled with the correct data.
The analogies with real-world farming are intentional. First you need to clear a field (i.e., delete any files on your storage devices that are taking up space), then you plough and seed the field (compute a plot for Chia), and then… well, you wait for the crops to grow, which can take quite a long time when those crops are Chia blocks.
Your chances of solving a Chia coin block are basically equal to your portion of the total network space (netspace). Right now, Chia’s netspace sits at roughly 2.7 EiB (exbibytes — the binary unit, so 1 EiB equals 2^60 bytes, or 1,152,921,504,606,846,976 bytes decimal). That means if you dedicate a complete 10TB (10 trillion bytes) of storage to Chia plots, your odds of winning are 0.00035%, or 0.0000035 if we drop the percentage part. Those might sound like terrible odds — they’re not great — but the catch is that there are approximately 4,608 Chia blocks created every day (a rate of 32 blocks per 10 minutes, or 18.75 seconds per block) and any one of them could match your plot.
Simple math can then give you the average time to win, though Chia calculators make estimating this far easier than doing the math yourself. A completely full 10TB HDD can store 91 standard Chia plots (101.4 GiB each). Yeah, don’t get lazy and forget to convert between tebibytes and terabytes, as the difference between binary and decimal units definitely matters. Anyway, 91 plots on a single 10TB HDD should win a block every two months or so — once every 68 days.
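If you'd rather run those numbers yourself, here's a short Python sketch of the same estimate; the plot size, block rate, and netspace figures come from above, and the simple proportional-odds model is an approximation rather than anything from Chia's own software:

```python
# Estimate how often a farm of a given size should win a Chia block,
# assuming your chance per block is simply (your space) / (netspace).
K32_PLOT_GIB = 101.4      # standard k=32 plot size
BLOCKS_PER_DAY = 4608     # ~32 blocks every 10 minutes

def days_per_win(num_plots: int, netspace_eib: float) -> float:
    farm_bytes = num_plots * K32_PLOT_GIB * 2**30
    netspace_bytes = netspace_eib * 2**60
    wins_per_day = (farm_bytes / netspace_bytes) * BLOCKS_PER_DAY
    return 1 / wins_per_day

# 91 plots fill a 10TB drive; netspace is roughly 2.7 EiB at the time of writing.
print(f"{days_per_win(91, 2.7):.0f} days per win, on average")  # ~68 days
```

Bear in mind the netspace figure changes daily, so plug in the current value when you run it.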
Each Chia plot ends up being sort of like a massive, complex Bingo card. There’s lots of math behind it, but that analogy should suffice. Each time a block challenge comes up, the Chia network determines a winner based on various rules. If your plot matches and ‘wins’ the block, you get the block reward (currently 2 XCH, Chia’s coin abbreviation). That block reward is set to decrease every three years, for the first 12 years, after which the block reward will be static ad infinitum. The official FAQ lists the reward rate as 64 XCH per 10 minutes, and it will get cut in half every three years until it’s at 4 XCH per 10 minutes with a block reward of 0.125 XCH.
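For reference, the emission schedule described in the FAQ works out like this (a quick sketch based on the figures quoted above):

```python
# Chia's farming reward halves every three years for the first 12 years,
# then stays flat; numbers follow the FAQ figures quoted above.
launch_reward_xch = 2.0   # XCH per block at launch
blocks_per_10_min = 32

for year in (0, 3, 6, 9, 12):
    halvings = min(year // 3, 4)          # no further halvings after year 12
    reward = launch_reward_xch / 2**halvings
    print(f"Year {year}+: {reward:g} XCH per block, "
          f"{reward * blocks_per_10_min:g} XCH per 10 minutes")
```

That works out to 64 XCH per 10 minutes at launch, dropping to 4 XCH per 10 minutes (0.125 XCH per block) from year 12 onward.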
Of course, luck comes into play. It’s theoretically possible (though highly unlikely) to have just a few plots and win a block solution immediately. It’s also possible to have hundreds of plots and go for a couple of months without a single solution. The law of averages should equalize over time, though. Which means to better your chances, you’ll need more storage storing more Chia plots. Also, just because a plot wins once doesn’t mean it can’t win again, so don’t delete your plots after they win.
This is the standard cryptocurrency arms race that we’ve seen repeated over the past decade with hundreds of popular coins. The big miners — farmers in this case — want more of the total Chia pie, and rush out to buy more hardware and increase their odds of winning. Except, this time it’s not just a matter of buying more SSDs or HDDs. This time farmers need to fill each of those with plots, and based on our testing, that is neither a simple task nor something that can be done quickly.
Hardware Requirements for Chia Coin Farming
With Ethereum, once you have the requisite GPUs in hand, perhaps some of the best mining GPUs, all you have to do is get them running in a PC. Chia requires that whole ploughing and plotting business, and that takes time. How much time? Tentatively, about six or seven hours seems typical per plot, with a very fast Optane 905P SSD, though it’s possible to do multiple plots at once with the right hardware. You could plot directly to hard drive storage, but then it might take twice as long, and the number of concurrent plots you can do drops to basically one.
The best solution is to have a fast SSD — probably an enterprise grade U.2 drive with plenty of capacity — and then use that for the plotting and transfer the finished plots to a large HDD. Chia’s app will let you do that, but it can be a bit finicky, and if something goes wrong like exceeding the temp storage space, the plotting will crash and you’ll lose all that work. Don’t over schedule your plotting, in other words.
Each 101.4 GiB plot officially requires up to 350 GiB of temporary storage, though we’ve managed to do a single plot multiple times on a 260 GiB SSD. Average write speed during the plotting process varies: sometimes it reaches over 100 MB/s, other times it drops closer to zero, which usually means more of the work is hitting the CPU and memory rather than storage. Plotting also requires 4 GiB of RAM per plot, so again, high-capacity memory sticks are par for the course.
Ultimately, for fast SSDs, the main limiting factor will likely be storage capacity. If we use the official 350 GiB temp space requirement, a 2TB SSD (1,863 GiB) can handle at most five concurrent plots. Our own testing suggests it can probably do six just fine, maybe even seven, but we’d stick with six to be safe. If you want to do more than that (and you probably will if you’re serious about farming Chia), you’ll need either a higher-capacity SSD or multiple SSDs. Each plot your PC is creating also needs 4GB of memory and two CPU threads, and there appear to be scaling limits.
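To make that capacity math concrete, here’s a rough Python sketch that applies the per-plot rules of thumb above (350 GiB of temp space, 4GB of RAM, and two threads per plot). Treat the output as a starting point rather than a hard limit, since our own testing squeezed out an extra plot or two.

```python
# Rough check on concurrent plot capacity, using the per-plot rules of thumb
# above: 350 GiB of temporary space, 4 GiB of RAM, and two CPU threads each.

TEMP_GIB_PER_PLOT = 350
RAM_GIB_PER_PLOT = 4
THREADS_PER_PLOT = 2

def max_concurrent_plots(ssd_tb: float, ram_gib: int, threads: int) -> int:
    ssd_gib = ssd_tb * 1e12 / 2**30          # decimal TB to GiB
    return min(int(ssd_gib // TEMP_GIB_PER_PLOT),
               ram_gib // RAM_GIB_PER_PLOT,
               threads // THREADS_PER_PLOT)

# 2TB SSD, 32GB RAM, 6-core/12-thread CPU: temp space is the bottleneck (5)
print(max_concurrent_plots(2, 32, 12))
```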
Based on the requirements, here are two recommended builds — one for faster plotting (more concurrent plots) and one for slower plotting.
Our baseline Chia plotting PC uses a 6-core/12-thread CPU, and we’ve elected to go with Intel’s latest Core i5-11400 simply because it’s affordable, comes with a cooler, and should prove sufficiently fast. AMD’s Ryzen 5 5600X would be a good alternative, were it readily available — right now it tends to cost about twice as much as the i5-11400, plus it also needs a dedicated graphics card, and we all know how difficult it can be to find those right now.
For storage, we’ve selected a Sabrent Rocket 4 Plus 2TB that’s rated for 1400 TBW. That’s enough to create around 800–900 plots, at which point your Chia farm should be doing quite nicely and you’ll be able to afford a replacement SSD. Mass storage comes via a 10TB HDD, because that’s the most economical option — 12TB, 14TB, 16TB, and 18TB drives exist, but they all cost quite a bit more per GB of storage. Plus, you’ll probably want to move your stored plots to a separate machine when a drive is filled, but more on that below.
The other components are basically whatever seems like a reasonably priced option, with an eye toward decent quality. You could probably use a smaller case and motherboard, or a different PSU as well. You’ll also need to add more HDDs — probably a lot more — as you go. This PC should support up to six internal SATA HDDs, though finding space in the case for all the drives might be difficult.
At a rate of 18 plots per day, it would take about 30 days of solid plotting time to fill six 10TB HDDs. Meanwhile, the potential profit from 60TB of Chia plots (546 plots of 101.4 GiB each) is currently… wow. Okay, we don’t really want to get your hopes up, because things are definitely going to change. There will be more netspace, the price could drop, etc. But right now, at this snapshot in time, you’d potentially solve a Chia block every 11 days and earn around $5,900 per month.
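If you want to sanity-check those figures, or rerun them once the netspace has grown, the fill-time and win-rate math looks like the Python sketch below. The 18-plots-per-day rate and 2.7 EiB netspace are the assumptions from above; we deliberately leave the dollar figure out, since that depends on a price that changes daily.

```python
# Fill-time and win-rate math for the baseline build, using the assumptions
# above: 18 plots per day, 91 plots per 10TB drive, 2.7 EiB of netspace.

PLOTS_PER_DRIVE = 91
PLOTS_PER_DAY = 18
PLOT_SIZE_GIB = 101.4
NETSPACE_EIB = 2.7
BLOCKS_PER_DAY = 4608

def days_to_fill(drives: int) -> float:
    return drives * PLOTS_PER_DRIVE / PLOTS_PER_DAY

def days_per_win(plots: int) -> float:
    share = plots * PLOT_SIZE_GIB * 2**30 / (NETSPACE_EIB * 2**60)
    return 1 / (share * BLOCKS_PER_DAY)

print(round(days_to_fill(6)))           # ~30 days of plotting for six drives
print(round(days_per_win(6 * 91)))      # ~11 days per 2 XCH block win
```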
What’s better than a PC that can do six plots at a time? Naturally, it’s a PC that can do even more concurrent plots! This particular setup has a 10-core CPU, again from Intel because of pricing considerations. We’ve doubled the memory and opted for an enterprise-class 3.84TB SSD this time. That’s sufficient for the desired ten concurrent plots, which will require nearly all of the drive’s 3.49 TiB of capacity. We’ve also added a second 10TB HDD, with the idea being that you do two sets of five plots at the same time, with the resulting plots going out to different HDDs (so that HDD write speed doesn’t cause a massive delay when plotting is finished for each batch).
Most of the remaining components are the same as before, though we swapped to a larger case for those who want to do all the farming and plotting on one PC. You should be able to put at least 10 HDDs into this case (using the external 5.25-inch bays). At a rate of 30 plots per day, it should take around 30 days again to fill ten 10TB drives (which aren’t included in the price, though we did put in two). As before, no promises on the profitability since it’s virtually guaranteed to be a lot lower than this, but theoretically such a setup should solve a Chia block every seven days and earn up to $9,800 per month.
“Chia farming rig from https://t.co/IPJadpARFa 96 terabytes running off a RockPi4 Model C” (pic.twitter.com/F6iKOMIdIy, January 15, 2021)
Long-term Efficient Chia Farming
So far we’ve focused on the hardware needed to get plotting, which is the more difficult part of Chia farming. Once you’re finished building your farm, though, you’ll probably want to look at ways to keep the farm online efficiently. While it’s possible to build out PCs with dozens of HDDs using PCIe SATA cards and extra power supplies, it’s likely far easier and more efficient to skip all that and go with a Raspberry Pi. That’s actually the recommended long-term farming solution from the Chia creators.
It’s not possible to directly connect dozens of SATA drives to a Raspberry Pi, but using USB-to-SATA adapters and USB hubs overcomes that limitation. There’s the added benefit of not overloading the 5V rail on a PSU, since the enclosures should have their own power — or the USB hubs will. And once you’re finished building out a farm, the power costs to keep dozens of hard drives connected and running are relatively trivial — you could probably run 50 HDDs for the same amount of power as a single RTX 3080 mining Ethereum.
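That 50-HDD claim is easy to ballpark yourself. The wattages in this Python sketch are our own rough assumptions rather than measured figures (a few watts for a lightly loaded 3.5-inch drive, roughly 300W for a 3080 mining), but they show why the comparison holds.

```python
# Ballpark check on the 50-HDD claim. Both wattages here are assumptions,
# not measurements: ~6W for a lightly loaded 3.5-inch HDD, ~300W for an
# RTX 3080 mining Ethereum.

HDD_WATTS = 6
GPU_WATTS = 300

print(round(GPU_WATTS / HDD_WATTS), "farming HDDs per mining RTX 3080")  # ~50
```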
How to Create Chia Plots
We’ve mostly glossed over the plot creation process so far. It’s not terribly complicated, but there are some potential pitfalls. One is that the plotting process can’t be stopped and restarted. You don’t want to do this on a laptop that may power off, though theoretically it should be possible to put a system to sleep and wake it back up, and then let it pick up where it left off. But if you overfill the temp storage, Chia will crash and you’ll lose all progress on any plots, and since it can take six or seven hours, that’s a painful loss.
The first step naturally is to install Chia. We’re using Windows, though it’s available on MacOS and can be compiled from source code for various Linux platforms. Once installed, you’ll need to let the blockchain sync up before you can get to work on farming. However, you can still create plots before the blockchain gets fully synced — that takes perhaps 10 hours, in our experience, but it will inevitably start to take longer as more blocks get added.
You’ll need to create a new private key to get started — don’t use the above key, as anyone else on the ‘net can just steal any coins you farm. Screenshot and write down your 24 word mnemonic, as that’s the only way you can regain access to your wallet should your PC die. Store this in a safe and secure place!
Next, you’ll see the main page. As noted above, it can take quite a while to sync up, and any information displayed on this screen prior to having the full blockchain won’t be current. For example, the above screenshot was taken when the total netspace was only 1.51 EiB (sometime earlier this week). The Wallets and Farm tabs on the left won’t have anything useful right now, so head over to Plots and get started on the plotting process.
If you’ve previously generated plots, you can import that folder here, but your key has to match the key used to generate the plots. If you were to somehow gain access to someone else’s plot files, they’d do you no good without the key. Again, don’t lose your key — or share it online! When you’re ready, hit the Add a Plot button.
Here’s where the ‘magic’ happens. We’ve specified six concurrent plots, with a ten minute delay between each plot starting. That should result in roughly a ten minute delay between plots finishing, which should be enough time for the program to move a finished plot to the final directory.
The Temporary Directory will be your big and fast SSD drive. You could try for a smaller delay between plots starting, but six concurrent plots will certainly put a decent load on most SSDs. Note also that Chia says it needs 239 GiB of temporary storage per plot — it’s not clear (to us) if that’s in addition to the 101.4 GiB for the final plot, but the amount of used space definitely fluctuates during the course of plot creation.
Once everything is set, click the Create Plot button at the bottom, and walk away for the next 6–8 hours. If you come back in eight hours, hopefully everything will have finished without incident and you’ll now see active plots on your Chia farm. Queue up another set of six plots (or however many plots your PC can handle concurrently), and done properly you should be able to get around three cycles in per day.
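For reference, here’s the quick arithmetic behind that “three cycles per day” estimate as a Python sketch, assuming roughly seven hours per plot and the six-plot batch with a ten-minute stagger described above; your own plot times will shift the result.

```python
# Arithmetic behind the 'three cycles per day' estimate. The seven-hour plot
# time, six-plot batch, and ten-minute stagger are assumptions from the text
# above; adjust them for your own hardware.

PLOT_HOURS = 7
CONCURRENT_PLOTS = 6
STAGGER_MINUTES = 10

batch_hours = PLOT_HOURS + (CONCURRENT_PLOTS - 1) * STAGGER_MINUTES / 60
cycles_per_day = 24 / batch_hours
plots_per_day = cycles_per_day * CONCURRENT_PLOTS

print(f"{batch_hours:.1f} hours per batch")       # ~7.8 hours
print(f"{cycles_per_day:.1f} cycles per day")     # ~3.1 cycles
print(f"{plots_per_day:.0f} plots per day")       # ~18 plots
```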
Then you just leave everything online (or migrate full drives to a separate system that uses the same key), and eventually you should manage to solve a block, earn some XCH coin, and then you can hoard that and hope the price goes up, or exchange it for some other cryptocurrency. Happy farming!
Chia Farming: The Bottom Line
Just looking at that income potential should tell you one thing: more people are going to do this than we’re currently seeing, or the price is going to implode. For the cost of an RTX 3080 off eBay right now, you could break even in just a couple of weeks. Our short take: anyone looking to buy new hard drives or large SSDs could be in for a world of hurt as Chia causes a storage shortage.
During its first week of trading, Chia started with a price of around $1,600, climbed up to a peak of around $1,900, and then dropped to a minimum value of around $560. But then it started going up again and reached a relatively stable (which isn’t really stable at all) $1,000 or so on Friday. A couple more exchanges have joined the initial trio, with OKEx accounting for around 67% of trades right now.
More important than the price alone is the volume of trading. The first day saw only $11 million in trades, but Thursday and Friday chalked up over 10X as much action. It might be market manipulation, as cryptocurrencies are full of such shenanigans, but anyone who claimed Chia was going to fade away after the first 12 hours of trading clearly missed the boat.
Unlike other cryptocurrencies, Chia will take a lot more effort to bring more plots online, but we’re still seeing an incredibly fast ramp in allocated netspace. It’s currently at 2.7 EiB, which is a 55% increase just in the past four days. We’ll probably see that fast rate of acceleration for at least a few weeks, before things start to calm down and become more linear in nature.
There are still concerns with e-waste and other aspects of any cryptocurrency, but Chia at least does drastically cut back on the power requirements. Maybe that’s only temporary as well, though. 50 HDDs use as much power as a single high-end GPU, but if we end up with 50X as many HDDs farming Chia, we’ll be right back to square one. For the sake of the environment, let’s hope that doesn’t happen.
Reviews for Capcom’s Resident Evil Village have gone live, and we’re taking the opportunity to look at how the game runs on the best graphics cards. We’re running the PC version on Steam, and while patches and future driver updates could change things a bit, both AMD and Nvidia have provided Game Ready drivers for REV.
This installment in the Resident Evil series adds DirectX Raytracing (DXR) support for AMD’s RDNA2-based RX 6000 cards and Nvidia’s RTX cards — both the Turing and Ampere architectures. AMD is promoting Resident Evil Village, and it’s on the latest-gen consoles as well, so there’s no support for Nvidia’s DLSS technology. We’ll look at image quality in a moment, but first let’s hit the official system requirements.
Capcom notes that with both the minimum and recommended hardware, the game targets 1080p at 60 fps, using the “Prioritize Performance” and presumably “Recommended” presets. Capcom does state that the framerate “might drop in graphics-intensive scenes,” but most mid-range and higher GPUs should be okay. We didn’t check lower settings, but we can confirm that 60 fps at 1080p will certainly be within reach of a lot of graphics cards.
The main pain point for anyone running a lesser graphics card will be VRAM, particularly at higher resolutions. With AMD pushing 12GB and 16GB on its latest RX 6000-series cards, it’s not too surprising that the Max preset uses up to 12GB of VRAM. It’s possible to run 1080p Max on a 6GB card, and 1440p Max on an 8GB card, but 4K Max definitely wants more than 8GB of VRAM — we experienced inconsistent frametimes in our testing. We’ve omitted results from the charts for cards where performance wasn’t reliable.
Anyway, let’s hit the benchmarks. Due to time constraints, we’re not going to run every GPU under the sun in these benchmarks, but will instead focus on the latest gen GPUs, plus the top and bottom RTX 20-series GPUs and a few others as we see fit. We used the ‘Max’ preset, with and without ray tracing, and most of the cards we tested broke 60 fps. Turning on ray tracing disables Ambient Occlusion, because that’s handled by the ray-traced GI and Reflection options, but every other setting is on the highest quality option (which means variable-rate shading is off for our testing).
Our test system consists of a Core i9-9900K CPU, 32GB of RAM and a 2TB SSD — the same PC we’ve been using for our graphics card and gaming benchmarks for about two years now, because it continues to work well. With the current graphics card shortages, acquiring a new high-end GPU will be difficult — our GPU pricing index covers the details. Hopefully, you already have a capable GPU from pre-2021, back in the halcyon days when graphics cards were available at and often below MSRP. [Wistful sigh]
Granted, these are mostly high-end cards, but even the RTX 2060 still posted an impressive 114 fps in our test sequence — and it also nearly managed 60 fps with ray tracing enabled (see below). Everything else runs more than fast enough as well, with the old GTX 1070 bringing up the rear with a still more than acceptable 85 fps. Based on what we’ve seen with these GPUs and other games, it’s a safe bet that cards like the GTX 1660, RX 5600 XT, and anything faster than those will do just fine in Resident Evil Village.
AMD’s RDNA2 cards all run smack into an apparent CPU limit at around 195 fps for our test sequence, while Nvidia’s fastest GPUs (2080 Ti and above) end up with a lower 177 fps limit. At 1080p, VRAM doesn’t appear to matter too much, provided your GPU has at least 6GB.
Turning on ray tracing drops performance, but the drop isn’t too painful on many of the cards. Actually, that’s not quite true — the penalty for DXR depends greatly on your GPU. The RTX 3090 only lost about 13% of its performance, and the RTX 3080 performance dropped by 20%. AMD’s RX 6900 XT and RX 6800 XT both lost about 30-35% of their non-RT performance, while the RTX 2080 Ti, RX 6800, RTX 3070, RTX 3060 Ti, and RTX 3060 plummeted by 40–45%. Meanwhile, the RX 6700 XT ended up running at less than half its non-DXR rate, and the RTX 2060 also saw performance chopped in half.
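For clarity on how we express those penalties, here’s the simple calculation as a Python sketch; the fps pair in the example is hypothetical rather than one of our measured results.

```python
# The DXR penalty as we express it: the share of non-RT performance lost
# when ray tracing is enabled. The fps pair below is hypothetical.

def dxr_penalty(fps_raster: float, fps_dxr: float) -> float:
    return (1 - fps_dxr / fps_raster) * 100

# hypothetical card that falls from 150 fps to 90 fps with DXR enabled
print(f"{dxr_penalty(150, 90):.0f}% slower with DXR")   # 40% slower
```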
Memory and memory bandwidth seem to be major factors with ray tracing enabled, and the 8GB and lower cards were hit particularly hard. Turning down a few settings should help a lot, but for these initial results we wanted to focus on maxed-out graphics quality. Let us know in the comments what other tests you’d like to see us run.
The performance trends we saw at 1080p become more pronounced at higher resolutions. At 1440p Max, more VRAM and memory bandwidth definitely helped. The RX 6900 XT, RX 6800 XT, RTX 3090, and RTX 3080 only lost a few fps in performance compared to 1080p when running without DXR enabled, and the RX 6800 dipped by 10%. All of the other GPUs drop by around 20–30%, but the 6GB RTX 2060 plummeted by 55%. Only the RTX 2060 and GTX 1070 failed to average 60 fps or more.
1440p and ray tracing with max settings really needs more than 8GB VRAM — which probably explains why the Ray Tracing preset (which we didn’t use) opts for modest settings everywhere else. Anyway, the RTX 2060, 3060 Ti, and 3070 all started having problems at 1440p with DXR, which you can see in the numbers. Some runs were much better than we show here, others much worse, and after repeating each test a bunch of times, we still aren’t confident those three cards will consistently deliver a good experience without further tweaking the graphics settings.
On the other hand, with ray tracing enabled, cards with 10GB or more of VRAM don’t show nearly the 1080p-to-1440p drop that we saw without ray tracing. The RTX 3060 only lost 18% of its 1080p performance and chugs along happily at just shy of 60 fps. The higher-end AMD and Nvidia cards were all around the 15% drop mark as well.
But enough dawdling. Let’s just kill everything with some 4K testing…
Well, ‘kill’ is probably too strong of a word. Without ray tracing, most of the GPUs we tested still broke 60 fps, and the ones that came up short fell well short. The RTX 3060 is still generally playable, but Resident Evil Village appears to expect 30 fps or more, as dropping below that tends to cause the game to slow down. The RX 5700 XT should suffice in a pinch, even though it lost 67% of its 1440p performance, but the GTX 1070 and RTX 2060 would need lower settings to even take a crack at 4K.
Even with DXR, the RTX 2080 Ti and RX 6800 and above continue to deliver 60 fps or more. The RTX 3060 also still manages a playable 41 fps — this isn’t a twitch action game, so sub-60 frame rates aren’t the end of the world. Of course, we’re not showing the cards that dropped into the teens or worse — which is basically all the RTX cards with 8GB or less VRAM.
The point isn’t how badly some of the cards did at 4K Max (with or without DXR), but rather how fast a lot of the cards still remained. The DXR switch often imposed a massive performance hit at 1080p, but at 4K the Nvidia cards with at least 10GB VRAM only lost about 15% of their non-DXR performance. AMD’s GPUs took a larger 25% hit, but it was very consistent across all four GPUs.
Resident Evil Village Graphics Settings
[Gallery: eight screenshots of the game’s advanced graphics settings menus.]
You can see the various advanced settings available in the above gallery. Besides the usual resolution, refresh rate, vsync, and scaling options, there are 18 individual graphics settings, plus two more settings for ray tracing. Screen space reflections, volumetric lighting and shadow quality are likely to cause the biggest impact on performance, though the sum of the others can add up as well. For anyone with a reasonably high-end GPU, though, you should be able to play at close to max quality (minus ray tracing if you don’t have an appropriate GPU, naturally).
But how does the game look? Capturing screenshots with the various settings on and off is a pain, since there are only scattered save points (typewriters), and some settings appear to require a restart to take effect. Instead of worrying about all of the settings, let’s just look at how ray tracing improves things.
Resident Evil Village Image Quality: Ray Tracing On / Off
[Gallery: 18 screenshots, paired with ray tracing off and on for each scene.]
Or doesn’t, I guess. Seriously, the effect is subtle at the best of times, and in many scenes, I couldn’t even tell you whether RT was on or off. If there’s a strong light source, it can make a difference. Sometimes a window or glass surface will change with RT enabled, but even then (e.g., in the images of the truck and van) it’s not always clearly better.
The above gallery should be ordered with RT off and RT on for each pair of images. You can click (on a PC) to get the full images, which I’ve compressed to JPGs (and they look visually almost the same as the original PNG files). Indoor areas tend to show the subtle lighting effects more than outside, but unless a patch dramatically changes the way RT looks, Resident Evil Village will be another entry in the growing list of ray tracing games where you could skip it and not really miss anything.
Resident Evil Village will release to the public on May 7. So far, reviews are quite favorable, and if you enjoyed Resident Evil 7, it’s an easy recommendation. Just don’t go in expecting ray tracing to make a big difference in the way the game looks or feels.
AMD’s Ryzen 5000 (Cezanne) desktop APUs will make their debut in OEM and pre-built systems before hitting the retail market by the end of this year. However, the hexa-core Zen 3 APU (via Tum_Apisak) is already showing up in multiple benchmarks around the Internet.
The Ryzen 5 5600G comes equipped with six Zen 3 cores with simultaneous multithreading (SMT) and 16MB of L3 cache. The 7nm APU operates with a 3.9 GHz base clock and a 4.4 GHz boost clock within a 65W TDP limit. The chip also leverages seven Vega Compute Units (CUs) that are clocked at 1,900 MHz.
The Core i5-11400, on the other hand, is part of Intel’s latest 11th Generation Rocket Lake lineup. Intel’s 14nm chip features six Cypress Cove cores with Hyper-Threading and 12MB of L3 cache. The hexa-core processor, which also conforms to a 65W TDP, sports a 2.6 GHz base clock and 4.4 GHz boost clock. On the graphics side, the Core i5-11400 rolls with the Intel UHD Graphics 730 engine with 24 Execution Units (EUs) with clock speeds between 350 MHz and 1.3 GHz.
The results were mixed, which didn’t come as a surprise. They probably originated from different systems with different hardware, so one result might have an edge over the other for reasons we don’t know about. Furthermore, the available benchmarks aren’t on our preferred list, so we should take the results with a pinch of salt.
AMD Ryzen 5 5600G Benchmarks
Processor     | Geekbench 5 Single-Core | Geekbench 5 Multi-Core | UserBenchmark 1-Core | UserBenchmark 8-Core | CPU-Z Single-Thread | CPU-Z Multi-Thread
Ryzen 5 5600G | 1,508                   | 7,455                  | 149                  | 889                  | 596                 | 4,537
Core i5-11400 | 1,593*                  | 7,704*                 | 161                  | 941                  | 544                 | 4,012

*Our own results.
Starting with Geekbench 5, the Core i5-11400 outperformed the Ryzen 5 5600G by up to 5.6% in the single-core test and 3.3% in the multi-core test. The Core i5-11400 also prevailed over the Ryzen 5 5600G in UserBenchmark. The Rocket Lake part delivered up to 8.1% and 5.8% higher single- and multi-core performance, respectively.
The Ryzen 5 5600G didn’t go home empty-handed either. The Zen 3 APU offered up to 9.7% and 13.1% higher single- and multi-threaded performance, respectively, compared to the Core i5-11400 in CPU-Z.
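If you want to double-check those percentages against the table, the math is just a ratio of the two scores, as in this Python sketch (rounding may differ from our quoted figures by a tenth of a point).

```python
# Percentage gaps derived from the table above: divide the winner's score by
# the loser's for each benchmark. Rounding may differ slightly from the text.

scores = {                          # (Ryzen 5 5600G, Core i5-11400)
    "Geekbench 5 single-core":  (1508, 1593),
    "Geekbench 5 multi-core":   (7455, 7704),
    "UserBenchmark 1-core":     (149, 161),
    "UserBenchmark 8-core":     (889, 941),
    "CPU-Z single-thread":      (596, 544),
    "CPU-Z multi-thread":       (4537, 4012),
}

for name, (amd, intel) in scores.items():
    lead = (max(amd, intel) / min(amd, intel) - 1) * 100
    winner = "Ryzen 5 5600G" if amd > intel else "Core i5-11400"
    print(f"{name:>24}: {winner} ahead by {lead:.1f}%")
```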
It goes to show that while Zen 3 is a solid microarchitecture, Intel’s Cypress Cove isn’t a pushover, either. The Ryzen 5 5600G has a 1.3 GHz higher base clock than the Core i5-11400, but the latter still managed to overcome the Zen 3 APU in most of these tests.
So far, these benchmarks only cover the processors’ computing performance. It’s unlikely that the Core i5-11400 will beat the Ryzen 5 5600G in iGPU gaming performance, which is where the 7nm APU excels. After all, consumers pick up APUs for their brawny integrated graphics. The Ryzen 5 5600G will make its way to the DIY market later this year, so we’ll get our chance to put the Zen 3 chip through its paces in a proper review. The Core i5-11400, which retails for $188.99, is the interim winner until then.
GPD’s latest iteration of its handheld gaming PC, the Win 3, is finally going on sale later this month. You’ll be able to grab it from multiple e-tailers, including Amazon, starting May 15th or later depending on the store.
The Win 3 is GPD’s latest handheld gaming device designed to run Windows 10 and play PC games. The biggest upgrade for the Win 3 over previous designs is its inclusion of a QWERTY keyboard along with gamepad controls (like joysticks, a d-pad, and triggers), all in a similar form factor as a Nintendo Switch.
The Win 3 is GPD’s first Tiger Lake-based gaming handheld, featuring either an Intel Core i7-1165G7 or a Core i5-1135G7 with Intel’s Xe integrated graphics, which comes in an 80 EU configuration (for the Core i5) or a 96 EU configuration (for the Core i7). To help feed the Xe graphics, the Win 3 comes with 16GB of LPDDR4x memory clocked at 4266MHz.
For the screen, GPD went with a 5.5-inch display with a resolution of 1280 x 720. A higher resolution isn’t really needed on such a tiny display. Plus, a 720p resolution will really help the integrated graphics push higher frame rates, since this is still an integrated GPU rather than a discrete Nvidia or AMD chip.
If GPD’s performance metrics are to be believed, the Win 3 is quite a capable gaming machine. In the most demanding games GPD tested, like Red Dead Redemption 2, Control, and Battlefield V, the Win 3 averaged 50 fps with the Core i5 version.
Those were the worst-case scenarios, too, with titles like Sekiro: Shadows Die Twice and World War Z maintaining 60 fps or higher, again on the Core i5 version. For the Core i7 model, frame rates were reportedly at least 10-15% better (thanks to its higher EU count Xe graphics).
We don’t know what graphics settings were used in these tests, though, so take those results with a grain of salt, and assume lots of things were turned down or off.
You will be able to grab the Win 3 from Amazon starting May 28th for $1130, Banggood for $1100 on May 15th, and IndieGoGo InDemand for $997 sometime in July. Specifically, those prices are for the i5 models; the i7 models are roughly $200 pricier.
It has been a full year since Predator: Hunting Grounds first released on PS4 and PC. The Sony-published title has been exclusive to the Epic Games Store on PC since launch, but that changed today, with the game finally landing on Steam.
Predator: Hunting Grounds is an asymmetrical multiplayer shooter, with four players playing as a marine fireteam and the fifth player taking on the role of the Predator. Each round, the marine squad works their way through the map completing objectives and facing smaller PvE enemies all while trying to avoid the Predator and get to the chopper. The Predator is constantly on the hunt, trying to get to the other players before they escape.
As of today, the game is now available on Steam for £34.99 with full crossplay support with other versions. Unfortunately, while cross-platform multiplayer is in place, there is no cross-save functionality, so your progress is tied to one platform.
The game is easy enough to run, with the minimum system requirements calling for an Intel Core i5-6400 or AMD FX-8320 CPU, 8GB of RAM and a GTX 960 or AMD R9 280x graphics card.
KitGuru Says: This game has been on PC for a while already, but we know a lot of people avoid the Epic Games Store and prefer to buy on Steam. Will you be picking this one up now that it has launched on Steam?
Samsung announced two entry-level Galaxy Book laptops at its Unpacked event: the Galaxy Book and the Galaxy Book Flex2 Alpha. The former will start at around $800 (according to Samsung, that price is subject to change) and will launch in the second half of 2021, whereas the Flex2 Alpha starts at $849 and is available for preorder now, shipping in mid-May. These round out a fleet of laptops announced today that also includes the premium Galaxy Book Pro and Pro 360 with OLED screens that start at $999, and the $1,399 Galaxy Book Odyssey gaming laptop that’s the first to sport Nvidia’s new RTX 3050 Ti graphics card.
Starting with the Galaxy Book, it has a 15.6-inch 1080p TFT LCD display, and it supports up to two fast NVMe SSDs. It has two USB-C ports (one of which can recharge the laptop with the included 65W charger), two USB-A 3.2 ports, an HDMI port, a headphone jack, a microSD card slot, and an optional nano SIM tray for LTE.
Samsung’s mobile press site shows that Intel Pentium Gold or Celeron processors may show up in some models globally, but the company hasn’t confirmed what will be in the starting configuration in the US. A Samsung spokesperson told The Verge that the final price and specs will be announced closer to its launch in the second half of 2021.
According to the site linked above, if you need more power, you’ll be able to bump it up to 11th Gen Intel Core i5 or i7 processors, with the option of taking advantage of their Iris Xe integrated graphics, or you can opt for Nvidia’s GeForce MX450 discrete graphics. The Galaxy Book can be upgraded to 16GB of RAM.
Rounding out the specs, the Galaxy Book comes in silver or blue, and every configuration will have a 54Wh battery. Similar to the Galaxy Book Pro lineup, this one supports Dolby Atmos audio. Its webcam is a 720p HD sensor with a dual array mic.
If you want a 2-in-1 laptop with a better QLED screen that’s still not as expensive as the Galaxy Book Pro, Samsung also announced two sizes (13.3 and 15.6 inches) of the new Galaxy Book Flex2 Alpha. Each model has a 1080p QLED display and a choice of Intel’s 11th Gen Core i5 or i7 processors. The price for the 13-inch model starts at $849.
The starting configuration includes 8GB of RAM, but supports up to 16GB, and the storage tops out at 512GB. It has a standard selection of ports, including two USB-A 3.0 ports, a USB-C port, a headphone jack, a power plug, HDMI, and a microSD slot. Like the Galaxy Book, this model also has a 54Wh battery.