Microsoft is planning to automatically add HDR support to more than 1,000 PC games. The software maker is now testing a new Auto HDR feature on Windows 10, which works just like it does on the latest Xbox Series S and X consoles. Enabling Auto HDR will add high dynamic range (HDR) to a large number of DirectX 11 and DirectX 12 games, as long as you have a compatible HDR monitor.
“While some game studios develop for HDR gaming PCs by mastering their game natively for HDR, Auto HDR for PC will take DirectX 11 or DirectX 12 SDR-only games and intelligently expand the color / brightness range up to HDR,” says Hannah Fisher, a DirectX program manager at Microsoft. “It’s a seamless platform feature that will give you an amazing new gaming experience that takes full advantage of your HDR monitor’s capabilities.”
Auto HDR can be enabled in the latest Windows 10 test build (21337) released to Windows Insiders today. It should be automatically enabled, or you can toggle it in the display part of settings. Auto HDR is just in preview for now, and not all top DirectX 11 / 12 games will support it just yet. Microsoft is also working to optimize performance and fix some issues, and the company does admit “Auto HDR does take some GPU compute power to implement.”
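Microsoft hasn’t detailed how Auto HDR maps SDR output to HDR, but the general idea of inverse tone mapping can be sketched in a few lines. Everything below (the peak brightness figures, the power curve, the function name) is an illustrative assumption, not Microsoft’s implementation:

```python
# Rough, hypothetical sketch of "inverse tone mapping," the general
# technique of expanding SDR content into an HDR brightness range.
# Microsoft has not published Auto HDR's actual algorithm; the peak
# values, power curve, and function name here are all assumptions.

SDR_PEAK_NITS = 300.0    # typical SDR mastering brightness (assumed)
HDR_PEAK_NITS = 1000.0   # target HDR display peak (assumed)

def sdr_to_hdr_nits(sdr_value: float, exponent: float = 1.5) -> float:
    """Map a normalized SDR luminance (0.0-1.0) to HDR brightness.

    Shadows and mid-tones stay close to their SDR brightness, while
    highlights are pushed toward the display's peak via a power curve,
    which is what keeps the expansion from looking washed out.
    """
    sdr_value = min(max(sdr_value, 0.0), 1.0)
    headroom = HDR_PEAK_NITS / SDR_PEAK_NITS
    boost = 1.0 + (headroom - 1.0) * sdr_value ** exponent
    return sdr_value * SDR_PEAK_NITS * boost

# Dark pixels barely change; the brightest pixels gain the most.
for v in (0.1, 0.5, 1.0):
    print(f"SDR {v:.1f} -> {sdr_to_hdr_nits(v):.0f} nits")
```

Running a per-pixel curve like this over every frame is also a plausible reason the feature “does take some GPU compute power to implement.”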
Alongside the Auto HDR feature, the latest test version of Windows 10 also includes improvements to Virtual Desktops, a File Explorer layout update, and even some changes to built-in apps like Notepad.
Windows 10 will soon include custom backgrounds for each Virtual Desktop, with the ability to easily reorder desktops. Microsoft is also adding additional padding between elements in File Explorer; a compact mode retains the classic, denser File Explorer layout, while the new default view is a little more touch-friendly.
Notepad also has a new icon now and will be updated via the Microsoft Store. Microsoft is also updating the apps it bundles with Windows 10, to include Windows Terminal and Power Automate Desktop.
LastPass, the popular password manager, announced changes last month to the free version of its software designed to make its Premium paid version much more attractive — and the free one much less so.
“LastPass offers access across two device types – computers (including all browsers running on desktops and laptops) or mobile devices (including mobile phones, smart watches, and tablets),” the company wrote in a blog post. “Starting March 16th, 2021” — that’s today — “LastPass Free will only include access on unlimited devices of one type.”
What that means: if you’re a LastPass free user, you’ll have to choose whether you want to access your passwords on your computer — in browser or via desktop app — or on your mobile device. You won’t be able to use both, though your passwords will sync across devices regardless. That said, you’ll have the opportunity to switch your main device three times, starting today. (LastPass is also offering a discount on Premium subscriptions for a limited time, presumably to dampen the sting.)
If, however, you find all of this too onerous and you’d like to just switch password managers entirely, I have some good news for you: moving your passwords out of LastPass is pretty easy. I actually did it myself, using (naturally) The Verge’s guide. I chose Bitwarden because it syncs across mobile and desktop and it’s open source.
Also, if you’ve made it this far and you don’t have a password manager yet — what are you doing here? Get on that.
Nvidia has now accepted full responsibility for accidentally removing its own RTX 3060 anti-mining lock, according to a statement the company made to The Verge earlier today. This follows a recent RTX 3060 beta driver release that seemed to inadvertently unlock the card’s full crypto mining potential, despite claims that multiple levels of security would make its limiter unhackable. The driver has since been removed, but with the cat out of the bag, the RTX 3060 is set to join its Ampere siblings as one of the best graphics cards for mining.
“A developer driver inadvertently included code used for internal development which removes the hash rate limiter on RTX 3060 in some configurations,” an Nvidia spokesperson confirmed to The Verge today. “The driver has been removed.”
But the internet doesn’t work that way, of course. Mirrors of the driver aren’t hard to find, so what’s done is done. At least we know for sure now where the blame lies, although it’s possible hackers would have figured out how to remove or circumvent the limiter eventually.
The RTX 3060’s anti-mining limiter always felt like a bit of an odd choice. Gamers might appreciate the company finally trying to dissuade miners from buying up all of its GPU stock and driving up prices, but given that the RTX 3060 Ti, RTX 3070, RTX 3080, and RTX 3090 had no such limiters in place and couldn’t retroactively implement them without major uproar, it ran the risk of being too little, too late. Not to mention the potentially dangerous precedent set by a hardware manufacturer purposefully limiting your component’s power.
To be fair to Nvidia, the driver that unlocks the limiter did seem to require some miners to flash a hacked vBIOS onto their cards, which meant using the card for mining wasn’t always as simple as downloading the update and grabbing a digital pickaxe. But that wasn’t the case in our own testing, and it’s still a far cry from the claims Nvidia made about the limiter’s unhackability just last month.
“It’s not just a driver thing,” Nvidia head of communications Bryan Del Rizzo said on Twitter. “There is a secure handshake between the driver, the RTX 3060 silicon, and the BIOS (firmware) that prevents removal of the hash rate limiter.”
Except when the driver skips the handshake. Whoops.
We’re curious to see how Nvidia will respond. The company clearly wasn’t expecting to hack itself, but the incident raises serious concerns about the viability of software-side limiters going forward.
In the meantime, be prepared for the RTX 3060 to be even harder to buy than it already is.
The Raspberry Pi community is built upon a fellowship of makers who are always itching to find more ways to communicate and interact using their favorite SBC. One maker, known on Reddit as Splash07s, decided it was time to listen for a change and built a custom software-defined radio (SDR) rig called Raspberry Ham.
SDRs are popular among amateur radio operators, better known as “hams.” Users can listen to a wide range of frequencies, from air traffic control to local scanners. If the transmission is unencrypted and open to the public, you can listen in. This project uses a Raspberry Pi as the base for the radio.
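The basic tuning math behind an SDR rig like this is simple: the dongle digitizes a slice of spectrum as wide as its sample rate, centered on the tuned frequency, and software then picks signals out of that slice. The sketch below illustrates the idea; the 100MHz center frequency, 2.048 MS/s sample rate, and 1,024-bin FFT are our own illustrative values, not details taken from the Raspberry Ham build:

```python
# Minimal sketch of the tuning math behind an RTL-SDR receiver: the
# dongle captures a slice of spectrum as wide as its sample rate,
# centered on the tuned frequency. All numbers are illustrative.

def spectrum_slice(center_hz: float, sample_rate_hz: float):
    """Return the (low, high) edge frequencies the SDR sees at once."""
    return (center_hz - sample_rate_hz / 2, center_hz + sample_rate_hz / 2)

def fft_bin_for(freq_hz: float, center_hz: float,
                sample_rate_hz: float, fft_size: int) -> int:
    """Which bin of an fft_size-point waterfall a signal lands in."""
    low, high = spectrum_slice(center_hz, sample_rate_hz)
    if not low <= freq_hz < high:
        raise ValueError("frequency is outside the captured slice")
    return int((freq_hz - low) / sample_rate_hz * fft_size)

# Tuned to 100.0 MHz at 2.048 MS/s, the radio sees 98.976-101.024 MHz:
low, high = spectrum_slice(100.0e6, 2.048e6)
print(f"visible: {low / 1e6:.3f} to {high / 1e6:.3f} MHz")

# An FM station at 100.7 MHz lands high up in a 1,024-bin waterfall.
print(fft_bin_for(100.7e6, 100.0e6, 2.048e6, 1024))  # → 862
```

In a real build, an SDR image like the one used here handles this bookkeeping for you and draws the waterfall on screen.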
Splash07s recently uploaded a complete, detailed parts list which you can view on Reddit. The major components used were a Raspberry Pi 3 B+, a Pelican 1200 case, and an RTL-SDR dongle.
The Pelican case is entirely waterproof and features external ports with waterproof casings. Externally, you can take advantage of HDMI, an Ethernet port, multiple USB inputs, and even a 3.5mm jack. The image used for the SDR functionality comes from a developer named Luigi Cruz. You can find his SDR image for the Raspberry Pi on GitHub.
The best Raspberry Pi projects are ones with a clean finish, and this one has a really cool, custom-made logo featuring our favorite raspberry with a piggy twist. Check out the Raspberry Ham project thread for a complete breakdown of this awesome build.
Intel’s next-generation desktop chips are finally here: after a brief preview at CES, the company is fully unveiling its 11th Gen Core desktop chips (better known by their codename, Rocket Lake-S).
Leading the pack is Intel’s new flagship chip, the Core i9-11900K, with eight cores, 16 threads, boosted clock speeds up to 5.3GHz, support for DDR4 RAM at 3,200MHz, a total of 20 PCIe 4.0 lanes, and backwards compatibility with Intel’s 400 Series chipsets.
Eagle-eyed Intel fans might notice that the new chip is, on paper, actually a downgrade from last year’s top model, the Core i9-10900K, which offered 10 cores and 20 threads (and a similar boosted clock speed of 5.3GHz).
That’s because, with its 11th Gen Rocket Lake-S chips, Intel is debuting its first new desktop core architecture in over half a decade: Cypress Cove. It finally replaces the Skylake microarchitecture, which the company has been using since its 6th Gen chips in 2015.
But the Cypress Cove design isn’t a whole new microarchitecture. Rather, it takes the Willow Cove chip designs and technologies that Intel has been using on its 11th Gen 10nm Tiger Lake chips and backports them to its 14nm production process.
Since those designs were meant for 10nm chips, though, Intel is limited in the number of cores it can fit when scaling them up to 14nm; hence the reduction in core count year over year. But Intel still says the new chips will offer better performance (at least in some cases) than the 10th Gen, with the new core architecture enabling up to 19 percent better IPC (instructions per cycle) than the previous generation.
Intel’s argument here is effectively that sheer core count isn’t enough on its own: frequency and per-core performance matter, too, and thanks to the maturity of the 14nm production process, Intel is very good at wringing every last drop of performance from these chips.
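That trade-off can be illustrated with a crude back-of-envelope model in which throughput scales with cores × clock × IPC. The normalization to Skylake-era IPC and the use of peak boost clocks are our own simplifying assumptions; real performance depends heavily on the workload:

```python
# Crude model of Intel's pitch: multi-core throughput scales roughly
# with cores x clock x IPC. Skylake-era IPC is normalized to 1.0 and
# Cypress Cove gets the claimed 19 percent uplift; clocks are the two
# flagships' peak boost. Treat these numbers as illustrative only.

def relative_throughput(cores: int, clock_ghz: float, ipc: float) -> float:
    return cores * clock_ghz * ipc

i9_10900k = relative_throughput(cores=10, clock_ghz=5.3, ipc=1.00)
i9_11900k = relative_throughput(cores=8, clock_ghz=5.3, ipc=1.19)

# With two fewer cores, the 11900K lands just short in all-core work...
print(f"multi-core ratio (11900K / 10900K): {i9_11900k / i9_10900k:.3f}")

# ...but single-threaded work ignores core count, so the IPC gain wins.
print(f"single-core ratio: {(5.3 * 1.19) / (5.3 * 1.00):.2f}")
```

By this toy model the new chip trails its predecessor slightly when all cores are saturated but pulls ahead in lightly threaded work, which matches Intel’s “better in some cases” framing.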
Intel 11th Gen Desktop Chips

| Model | Cores / Threads | Base clock (GHz) | Boost clock (GHz) | Turbo Boost Max 3.0 (GHz) | Thermal Velocity Boost, single / all core (GHz) | Smart Cache | TDP (W) | Graphics | Recommended price |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| i9-11900K | 8/16 | 3.5 | Up to 5.1 | Up to 5.2 | Up to 5.3 / 4.8 | 16M | 125 | Intel UHD Graphics 750 | $539 |
| i9-11900 | 8/16 | 2.5 | Up to 5.0 | Up to 5.1 | Up to 5.2 / 4.7 | 16M | 65 | Intel UHD Graphics 750 | $439 |
| i7-11700K | 8/16 | 3.6 | Up to 4.9 | Up to 5.0 | NA | 16M | 125 | Intel UHD Graphics 750 | $399 |
| i7-11700 | 8/16 | 2.5 | Up to 4.8 | Up to 4.9 | NA | 16M | 65 | Intel UHD Graphics 750 | $323 |
| i5-11600K | 6/12 | 3.9 | Up to 4.9 | NA | NA | 12M | 125 | Intel UHD Graphics 750 | $262 |
| i5-11600 | 6/12 | 2.8 | Up to 4.8 | NA | NA | 12M | 65 | Intel UHD Graphics 750 | $213 |
| i5-11500 | 6/12 | 2.7 | Up to 4.6 | NA | NA | 12M | 65 | Intel UHD Graphics 750 | $192 |
| i5-11400 | 6/12 | 2.6 | Up to 4.4 | NA | NA | 12M | 65 | Intel UHD Graphics 730 | $182 |
And Intel’s benchmarks (obviously) support that argument: head to head with last year’s Core i9-10900K, the i9-11900K offered 8 to 14 percent better performance in games like Gears 5, Grid 2019, Microsoft Flight Simulator, and Total War: Three Kingdoms. Intel also says its top chip outperforms AMD’s flagship Ryzen 9 5900X processor in those titles, although by slightly smaller margins (3 to 11 percent, according to Intel’s benchmarks).
That said, Intel’s tests were all running at 1080p, so we’ll have to stay tuned for more comprehensive benchmarking down the line on a wider range of titles — and particularly, at 4K resolution.
The new architecture also brings other improvements, including up to 50 percent better integrated graphics performance, thanks to the company’s new Xe graphics, which pack one-third more EUs than the outgoing Gen9 design.
Given that these are desktop chips that will almost certainly be paired with a high-end discrete graphics card, that’s not the most groundbreaking improvement, however. And while Intel will be offering several F-series models of the new chips without GPUs, the overall design is still the same on those models. That means that Intel isn’t going to be offering any niche models that ditch integrated GPUs to try to fit in more cores, at least for now.
The new chips also feature other improvements. The 11th Gen chips add Resizable BAR, for a frame rate boost on compatible Nvidia and AMD graphics cards. There’s built-in support for both USB 3.2 Gen 2×2 at 20Gbps as well as Intel’s own Thunderbolt 4, along with DDR4-3200 RAM. And Intel has added four additional Gen 4 PCIe lanes, for a total of 20.
As is traditional for a major new chip launch, Intel is also introducing its 500 series motherboards alongside the new processors, but the Rocket Lake-S CPUs will also be backwards compatible with 400 series motherboards.
Additionally, there are some new overclocking options with the new chips for users looking to squeeze out even more power. Specifically, Intel’s Extreme Tuning Utility software is getting refreshed with a new UI and some updated features alongside the 11th Gen chips.
The new 11th Gen Intel desktop processors are available starting today.
Starting July 1st, Google is reducing the long-standing 30 percent cut it takes from each Play Store digital purchase: for all Android developers around the world, the first $1 million they make on the digital storefront each year will be charged a reduced fee. According to Google, that change means the 99 percent of Android developers who make less than $1 million each year will see a 50 percent reduction in fees.
Google’s news follows Apple’s announcement of a reduced 15 percent fee last year as part of a new small business program, with one critical difference: Apple’s fee reduction only applies to developers that make under $1 million per year. But if an app maker goes over the $1 million threshold at any point in the year, they’ll be booted from Apple’s program and subject to the standard 30 percent rate.
Google’s program is a flat cut to the first $1 million developers make each year. That means whether you’re a student making your first app or a multibillion-dollar company, the first $1 million you make on the Play Store each year will only get charged a 15 percent service fee by Google. Any money you make after that will then be subject to the usual 30 percent cut. A Google spokesperson says the company felt that applying the reduced fees equally to all companies was a fair approach in line with Google’s goals of helping developers of all sizes.
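The difference between the two schedules is easy to express as code. This sketch follows the fee rules exactly as described above (the function names are our own):

```python
# Sketch of the two fee schedules as described in this article. Google
# applies a 15 percent rate to the first $1 million of each developer's
# annual Play Store revenue and 30 percent above that; Apple's small
# business program instead reverts the whole year to 30 percent once a
# developer crosses the threshold. Function names are our own.

THRESHOLD = 1_000_000

def google_fee(annual_revenue: float) -> float:
    tier1 = min(annual_revenue, THRESHOLD)       # charged 15 percent
    tier2 = max(annual_revenue - THRESHOLD, 0)   # charged 30 percent
    return 0.15 * tier1 + 0.30 * tier2

def apple_small_business_fee(annual_revenue: float) -> float:
    rate = 0.15 if annual_revenue < THRESHOLD else 0.30
    return rate * annual_revenue

# A developer making $2M pays $450k under Google's tiered schedule but
# $600k under Apple's all-or-nothing rule; under $1M they pay the same.
for revenue in (500_000, 2_000_000):
    print(revenue, google_fee(revenue), apple_small_business_fee(revenue))
```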
Google has charged a 30 percent cut for any purchases through the Google Play Store since it first launched as the “Android Market” — although originally, the company claimed that “Google does not take a percentage,” with the 30 percent cut going toward “carriers and billing settlement fees.” In its more modern incarnation as the Play Store, Google now puts that 30 percent cut toward its “distribution partner and operating fees.”
The 30 percent fee has been constant for the lifespan of Google’s storefront. The only exception is subscriptions: in 2018, Google (in another similar move to Apple) announced that it would reduce its cut down to 15 percent for subscription products after users had been subscribed for a full year.
The number of developers that make more than $1 million each year — and will end up still being charged the full 30 percent — is proportionally tiny. Google notes that only about 3 percent of Android developers actually charge for either downloading their apps or for digital in-app purchases to begin with, and only 1 percent of those developers make more than the $1 million threshold that would see the 30 percent cut kick in.
The new policy also comes at a critical moment, when Google’s and Apple’s app store policies are under intense public scrutiny, kicked off by the removal of Epic Games’ Fortnite from both the App Store and Play Store and the game developer’s subsequent antitrust lawsuits against Apple and Google.
The issue is also coming to a head in legislation, with states like Arizona and North Dakota debating new laws that would force Apple and Google to offer more alternative software distribution methods and payment options on their platforms.
GoPro is releasing a new version of its main smartphone app that will now be called “Quik.” The new app will remain the main interface for connecting to and controlling GoPro cameras, but it is also getting new features, including one called “mural” that’s sort of like a private Instagram feed meant to help people organize their favorite images and videos — regardless of whether they were taken by a GoPro camera — and save them from the “abyss of your camera roll,” GoPro CEO Nick Woodman says in an interview.
Close followers of GoPro’s efforts in the software space know that the company already once launched an app called Quik way back in 2016 that was all about auto-editing footage to a beat. But that app has not been supported for a while and will no longer be available to download after today with the launch of the new Quik app.
The auto-editing feature will live on in the new app, which launches on iOS and Android today. It also has a few other features, like a video editing suite (including a speed adjustment tool), themes and filters, and unlimited original-quality cloud backup of everything posted to the mural feed. GoPro is charging $1.99 per month or $9.99 per year for those features, though the basic camera connection and control side of the app will remain free to use for people who don’t want to pay for the new stuff. Customers who already pay for GoPro’s Plus subscription service (which includes unlimited cloud storage, live-streaming functionality, and camera replacement) will get Quik’s features for free.
Woodman sees the new Quik app as something of a culmination of a yearslong effort at GoPro to diversify away from hardware that started around 2013 and 2014. And by gearing the app at a wider audience, not just GoPro users, he thinks there’s great opportunity to be had.
In fact, that was the strategy with the original Quik app, which let users mash together photos and videos from their camera roll without requiring the use of a GoPro. And it worked: Woodman says that app still had “roughly eight million monthly active users” despite having been essentially abandoned by the company.
While he doesn’t expect all of those users to pony up for the paid version of the new app, he thinks many will appreciate the mural feature because he still doesn’t see any good solutions to that camera roll clutter problem — especially not albums. “Albums suck. Albums are just miniature camera rolls,” he says. “You don’t go into albums [thinking] ‘This is going to be a super awesome experience. Hey honey, let’s AirPlay our album to the TV and kick back and reminisce.’ You don’t do that.”
Users can build out the mural feed in the Quik app a few different ways. One is fairly straightforward: after you open the Quik app and give it access to your camera roll, you can scroll through and add photos to the mural feed or to “events” (not albums, of course) on the feed. The more attractive option, in Woodman’s eyes, is to add photos and videos you take on the fly using the share sheet every time you capture a “keeper.” (Users can also text or email photos to the mural feed.)
That said, Woodman thinks people may use the feed in all sorts of ways, like saving images that inspire them or for planning a project, a la Pinterest. Others will just use it for their GoPro footage and photos and nothing else.
“It can be all of those things,” he says. “I think that what we’re solving for people is like a very relatable and widespread problem: I don’t have a convenient, private place to put content that matters most to me, and you know what, sharing it to your Instagram feed ain’t working because there’s that tension of, ‘Well, this matters to me, but I know it’s not going to really matter to anybody that I would socialize it with.’”
GoPro has carved out a decent supplemental business so far with its Plus subscription service, with nearly 800,000 paying subscribers as of the end of 2020 (the equivalent of just shy of roughly $40 million of annual revenue). But with Quik, Woodman sees not just a great business opportunity or a chance to reach new customers. He sees it serving a higher purpose.
“Not to bash on social feeds, like there’s a lot of good from them, we get a lot of inspiration from what other people are doing. But damn it, man, you can get a lot of inspiration from just looking at what you’ve been doing with your life. It’s pretty awesome,” he says. “This is the cosmic moment where I point to the deeper meaning behind what it is that we’re doing for people with Quik, because I think we’re really going to help people develop a stronger sense of self-esteem, self-worth, and ultimately happiness. You don’t have to find happiness in what other people are doing. There’s a ton of happiness to be found in what you’re doing with your life, and Quik helps you bring that to the forefront.”
Philosophical value aside, bringing more customers under the GoPro tent has long been a goal for Woodman; it’s a big part of what inspired the company to make a more concerted push into software. But whether or not GoPro turns the new Quik app into a moneymaker, that it’s attempting another shift in its software strategy is on its own a sign that the company is back on solid ground. It spent the last few years pruning its camera lineup back to the essentials, quickly scuttling a dalliance with the drone market, and focusing more on selling directly to consumers. That has the company back in the black and willing to take chances again.
“We’re known for enabling amazing content. It’s just until now, it’s always required a GoPro,” Woodman says. “[But it’s] too limiting to just serve people through our hardware alone. Let’s also serve people through software. Meet them where they are. And we can build a phenomenal business.”
Corsair is the latest company to introduce a 60-percent wired mechanical keyboard of its own, lopping off the arrow keys and other functions for a more compact design. The K65 RGB Mini costs $110, with a design that is about as subtle as Corsair has ever produced. It connects to your Windows PC, Mac, or Xbox One via an included, detachable braided USB-C-to-USB-A cable. This keyboard joins the ranks of Razer’s $120 Huntsman Mini, HyperX’s $100 Alloy Origins 60, and Ducky’s One 2 Mini, among others.
Like other 60-percent models, this one embeds many functions as secondary layers you can reach by holding the “FN” key. As a result, it lacks several keys you might be accustomed to seeing on a keyboard. If you primarily use your PC for gaming, or can quickly learn a new keyboard layout, the transition to a 60-percent keyboard shouldn’t be too difficult.
The K65 RGB Mini that I briefly tested is equipped with Cherry MX Speed linear switches, which have the signature mechanical “thock” sound. Unlike some other switch types, these are very easy to press and have short, smooth travel. You can also choose between Cherry MX Silent or Red switches, depending on your region.
This keyboard also has per-key RGB backlighting that you can tweak in Corsair’s iCue software (available on Windows 10 and macOS Catalina and later). The keys are removable, and there’s a key removal tool included in the box, along with a different space key. The bottom row is the standard layout, so you can equip it with custom key caps if you prefer.
The K65 RGB Mini supports up to an 8,000Hz polling rate through its iCue software. In other words, it can report new key presses up to 8,000 times per second, or once every 0.125 milliseconds (on macOS and Xbox One, it tops out at 1,000Hz). No one can type that fast, and it may not have any noticeable impact on your gaming, but it ensures this model is far more responsive to fast key presses than most other keyboards. Other notable features include full N-key rollover and support for up to 50 custom mapping profiles saved to its onboard storage.
As I mentioned earlier, the design of this keyboard is subtle, clean, and subdued. Aside from its RGB backlighting, it’s light on logos and other details, which makes sense. Corsair knows it needs to appeal to gamers who prefer a minimalist design, since that’s the whole appeal of opting for a 60-percent keyboard anyway.
The Corsair K65 RGB Mini is a well-performing highly customizable keyboard that should help raise awareness for the 60% form factor despite a few (mostly cosmetic) flaws.
For
+ Bounty of customization options
+ Polling rates up to 8,000 Hz
+ Doubleshot PBT keycaps
+ Custom spacebar, Esc key
Against
– Pinging on common keys
– Cosmetic problems with many keycaps
– 8,000 Hz polling rate may not be useful
Corsair today announced that it’s entering the 60% keyboard market with the Corsair K65 RGB Mini ($109.99). This diminutive board ditches the number pad, arrow cluster and other keys so it can occupy as little desk space as possible without compromising on the stuff that matters most to gamers. Mechanical switches? Present. RGB lighting? Accounted for. True love? Never say never.
This keyboard also shows that the 60% form factor is becoming mainstream among the best gaming keyboards. Other manufacturers have offered 60% keyboards for years, of course, and enthusiasts have designed even smaller boards for personal use. But the K65 RGB Mini’s arrival means Corsair has joined Razer, HyperX, and other prominent gaming manufacturers in embracing the form factor. And it looks to stand out with a unique custom spacebar and whopping 8,000 Hz polling rate that you probably won’t notice.
Corsair K65 RGB Mini Specs

Switches: Cherry MX RGB Red (tested), Cherry MX Silent Red, or Cherry MX Speed Silver
The most important aspect of the K65 RGB Mini is its size. It measures 11.6 inches long, 4.1 inches wide, and 1.7 inches tall at its peak, making it similar to the HyperX Alloy Origins 60 (11.5 x 4 x 1.5 inches). Corsair’s 60% keyboard is also slightly lighter than HyperX’s (1.3 pounds versus 1.6 pounds). The bad news for those who like some extra height (perhaps due to an extra-thick wrist rest) is that the K65 RGB Mini’s height isn’t adjustable, as it doesn’t have any flip-out feet.
Corsair achieved those measurements by paring the keyboard down to the most essential keys, most of which pull double duty when pressed at the same time as the Fn key. Many of those dual functions make sense: the Backspace key doubles as Delete, for example, and the number row serves as a de facto function row. But there are many other combinations besides: several keys have been assigned media functions, the Z through B keys’ secondary functions control the keyboard’s lighting, and the keys above them perform mouse functions. These functions are all reprogrammable if you download the keyboard’s software. We’ll talk more about that later.
For now, let’s get back to the basics. The K65 RGB Mini boasts a braided, detachable USB-C to USB-A cable that should make travel easier. Corsair makes the keyboard stand out a little more by including a fancy spacebar and an extra Esc keycap with the Corsair logo on it. Both are made of ABS, a cheaper plastic than that used for the rest of the keycaps.
The custom spacebar looks cool, even if I prefer the topographic design HyperX used for the Alloy Origins 60. Its light texturing adds a bit of flair without becoming a distraction every time the key is pressed. And it probably would’ve been enough to help the K65 RGB Mini stand out. Corsair didn’t stop there, however. The company also used a custom finish on the standard keycaps that makes it look like someone with severe dandruff scratched their scalp over the keyboard.
The keycaps also suffer from a lack of clarity on their legends that can make it hard to see the RGB backlighting and make the keys seem a bit messy even when the lighting is off. This problem is most noticeable in the number row, but it affects other keycaps as well. That doesn’t really matter while the keyboard’s actually in use, of course, but it does undermine Corsair’s other efforts to make the K65 RGB Mini aesthetically pleasing.
It’s a shame, too, because the standard keycaps are doubleshot PBT plastic that should be able to withstand all sorts of abuse. (Not that any of us have ever been anything but totally gentle with a keyboard, of course.) Doubleshot PBT is typically more durable than ABS, and these caps avoid the shiny, worn look that ABS develops over time. Beneath those 1.5mm-thick keycaps lies your choice of one of three Cherry MX switches rated for between 50 million and 100 million keystrokes, so the K65 RGB Mini should prove fairly durable, despite its plastic exterior.
Typing Experience
In an attempt to appeal to gamers who want switches that are easy to depress, the K65 RGB Mini comes with a range of linear switch options: Cherry’s MX Silent Red, MX Speed Silver or MX RGB Red. We tested the keyboard with the latter, essentially standard MX Red with a transparent casing meant to help the LEDs underneath them shine through. Cherry’s official website puts the MX RGB Reds at requiring 30 cN initial force and 45 cN actuation force with 2mm pretravel and 4mm total travel. It’s a solid linear switch that offers very little resistance throughout a smooth keypress.
After about a week with the K65 RGB Mini, I averaged 125.6 words per minute (wpm) with 97.7% accuracy on the 10fastfingers.com typing test. That’s faster than I was with the Alloy Origins 60 (117 wpm) but equally accurate. Some of that speed boost may have more to do with me getting more familiar with the test and 60% keyboards though.
While appropriate for gaming, I find Red switches a bit light to depress for heavy typing. Your experience may vary, but I find that any hesitation when pressing a key can result in an accidental keypress. Tactile mechanical switches would come in handy in that regard, but, again, the K65 RGB Mini is only available with linear ones. The 60% layout also takes some getting used to. People who need a number pad balk at tenkeyless keyboards; I bet they gasp in horror upon sight of a 60% board.
That’s all just a matter of acclimation, though, even if Corsair decided to put the arrow keys all the way on the “UHJK” cluster instead of somewhere closer to where they’d be on a larger keyboard. Buying a 60% keyboard is making a commitment to learning how to perform everyday functions on that particular board, and I don’t recommend switching between various models.
But the biggest problems with typing on the K65 RGB Mini are its noise levels and lack of ergonomic control. In a side-by-side comparison, the keyboard was louder than the Alloy Origins 60, the full-sized Asus ROG Strix Scope RX with optical mechanical switches and other boards I’ve reviewed lately, with notable pinging on certain keys. Every time I hit the “Shift” key or the spacebar it sounds like I’m operating an old-timey cash register.
The lack of feet on the bottom of the keyboard also means the K65 RGB Mini is limited to just one height. That might not bother some people, but it’s nice to have more control over a keyboard’s positioning.
Gaming Experience
When gaming, the K65 RGB Mini feels a lot like other keyboards with linear mechanical switches but with the added bonus of leaving more desk space available to the mousepad. Its keys feel responsive, which is exactly what people expect from linear switches. Sometimes that led to mis-presses for me but not as often as when I’m just typing.
None of these traits are exclusive to the K65 RGB Mini. By now we’ve come to expect that a gaming keyboard will offer reliable inputs, responsive switches, and features like n-key rollover; their absence would be more notable than their presence.
The K65 RGB Mini’s standout features are similarly hard to notice. It features the Corsair Axon Hyper-Processing Technology that was introduced in October 2020. Corsair said the feature is enabled by a 32-bit Arm Cortex SoC running a “purpose-engineered real-time operating system.” It’s supposed to offer up to an 8,000 Hz polling rate and key scanning at a rate of 4,000 Hz. Most gaming keyboards offer 1,000 Hz polling rates, so the K65 RGB Mini is eight times as fast, in theory.
Here’s how the math breaks down: A 1,000 Hz polling rate leads to a 1ms delay between a key being pressed and a PC registering a keypress. The K65 RGB Mini’s maximum 8,000 Hz polling rate reduces that to a 0.125ms delay. Corsair has strayed from the 1,000 Hz standard before with the Corsair K100 RGB, but that much pricier keyboard’s maximum polling rate is 4,000 Hz.
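That arithmetic can be sketched directly, since the worst-case added input delay is simply one polling interval:

```python
# The polling-rate arithmetic from the paragraph above: the worst-case
# delay a keypress waits before the next USB poll is one interval,
# i.e. 1 / rate seconds (here converted to milliseconds).

def polling_delay_ms(polling_rate_hz: int) -> float:
    """Worst-case wait between a keypress and the next poll, in ms."""
    return 1000.0 / polling_rate_hz

for rate_hz in (1000, 4000, 8000):
    print(f"{rate_hz} Hz -> {polling_delay_ms(rate_hz)} ms")

# Moving from the usual 1,000 Hz to the full 8,000 Hz saves at most:
print(polling_delay_ms(1000) - polling_delay_ms(8000))  # → 0.875
```

That 0.875ms ceiling is the entire benefit on offer, which is why the feature is so hard to feel in practice.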
That would all be something to celebrate, if only human eyes could perceive the 0.875ms of time Corsair Axon is saving. Estimates vary—Tobii claims we react to visual stimuli in about 80ms, while MIT has said we can recognize images that appear for just 13ms—but the consensus is that we can’t detect the kind of sub-millisecond difference Corsair is enabling with the greater-than-1,000 Hz polling rates.
In-game I didn’t notice any improvements either. I was still lumbering around the generations-old landscapes of Halo: Reach and accidentally using my utility before the round even started in Valorant at exactly the same speeds that I was with other keyboards. That doesn’t make Corsair Axon a detriment to the K65 RGB Mini, though. It just means that it’s another spec that sounds impressive on paper but isn’t noticeable in-game.
To use the K65 RGB Mini’s 8,000 Hz polling rate, you must have the iCue software installed, as well as a USB 3.0 Type-A port and Windows 10 or macOS 10.15 or later. iCue warns that higher rates are limited based on system performance but doesn’t offer minimum specs, and there is an option to enable the Corsair Axon-afforded polling rates despite those warnings. (We’ve reached out to Corsair about recommended system specs for 8,000 Hz and will update this review if we hear back.)
Software and Features
The K65 RGB Mini offers a bevy of customization features via Corsair’s iCue software. Most settings can be saved directly to the keyboard, and Corsair claimed the 8MB of onboard storage has capacity for between 50 and 200 profiles. (We’ve reached out to the company for clarification; our reviewer’s guide claims 50 but the keyboard’s packaging claims 200.)
Settings that have been saved to the K65 RGB Mini’s onboard storage are easy to cycle through using various modifier keys. The default layout has profile settings, brightness levels, and lighting effects assigned to the “Z” to “B” keys. Keyboard shortcuts can also be used to record, assign, and delete macros without having to venture into iCue beforehand.
iCue splits its settings into six categories: Key Assignments, Hardware Key Assignments, Lighting Effects, Hardware Lighting, Performance and Device Settings. Changes made via Key Assignments and Lighting Effects are only effective when the software is running; changes made to the other categories persist, as long as they’re saved to onboard storage first.
The Key Assignment categories at their most basic enable remapping, switching between languages with a keypress, and simulating mouse input. You can also set them to perform more sophisticated actions, such as inserting predetermined text, launching specific programs, controlling media playback and running macros.
Each key offers two levels of customization. The first is activated when the key is pressed by itself, so I wouldn’t recommend it for keys that see a lot of use. If you don’t use the right Shift key a lot, however, it might make sense to have it perform another function instead. The second level of customization performs the specified action when the key is pressed alongside the Fn and Menu modifier buttons, and Corsair said additional modifier keys will be available soon.
RGB lighting categories perform as expected. The K65 RGB Mini offers per-key RGB backlighting that you can modify with iCue’s built-in lighting effects. Each of those effects offers at least some level of customization as well, including the ability to control how bright they are, what colors they include, what areas of the keyboard they affect and how they behave.
Performance mostly offers control over the Windows Lock feature activated by pressing Fn + Win. It also controls the indicator colors shown when a key is locked, a profile is activated or a macro is being recorded.
Device Settings is used to update the K65 RGB Mini’s firmware, manage profiles saved to its onboard storage, control the brightness of its lighting, and change the active keyboard layout. It also offers a choice of polling rate: 125, 250, 500, 1,000, 2,000, 4,000, or 8,000 Hz.
Bottom Line
The Corsair K65 RGB Mini is a niche product. It’s a 60% form factor keyboard from Corsair that’s only available with linear switches. It’s also Corsair’s first attempt at a lot of things: The company said this is its first keyboard with a 60% form factor, detachable USB C-to-A cable, function layers on each key and the ability to hit polling rates up to 8,000 Hz.
The K65 RGB Mini isn’t cheap either, but that cost is justified by the doubleshot PBT keycaps, braided cable and per-key RGB backlighting, as well as all the extensive gaming features enabled by iCue. Opting for Cherry MX switches instead of their more affordable counterparts also helps to explain Corsair’s pricing.
That doesn’t mean the K65 RGB Mini is perfect. The pinging on some keys is frustrating, the all-plastic build could raise questions about the keyboard’s durability and the cosmetic flaws in the keycaps take some getting used to.
It will be interesting to see how Corsair improves upon the 60% form factor in the future. But the K65 RGB Mini is still an exciting first attempt at a 60% keyboard.
Cricut machines offer a customizable, automated way to speed up the precise cutting required in crafting projects, but the capabilities of these devices are about to be limited by an upcoming update. Cricut is updating the machines’ accompanying software, Design Space, by putting caps on uploads that could restrict the number of projects Cricut owners are actually able to make.
Getting a Cricut ready to cut requires using premade patterns or uploading original designs to an application called Design Space. That’s been free and unlimited for all users in the past, but now Cricut is limiting users to 20 free uploads per month. To upload more, owners will soon have to subscribe to Cricut’s Access Standard plan for $9.99 per month / $95.88 per year or Access Premium plan for $118.88 per year.
Anything that’s previously been uploaded will be able to stay in Design Space without any changes or limits, but it’s important to understand that to use a modern Cricut, you need to use Design Space in some capacity. And Cricut machines themselves can cost anywhere from around $179 to $399, before the potential subscription.
At its most basic, Design Space is required to upload designs created in other apps so they can be formatted to work with Cricut machines. For example, a pattern for a paper flower made in Adobe Illustrator or a logo sketched in Procreate can be uploaded so the Cricut machine knows where to cut on whatever material is being used, whether it’s paper, fabric, vinyl, or even wood. Design Space works on its own as a creation software, but if you subscribe, it becomes more fully featured, with access to exclusive fonts, images, and patterns. Even with the optional subscription for more features, many users still choose to create their work elsewhere and only use Design Space for preparation before cutting.
Now, with this new upload limit, a subscription is needed to restore the functionality crafters originally bought a Cricut for: creating as many projects as needed, with the only limit being materials, rather than an arbitrary number set by a software update.
Beyond the ever-present, scary reality that companies can limit the capabilities of a product you “own” after the fact, multiple Cricut owners have contacted The Verge about an even more glaring problem: projects can often require multiple uploads to complete, meaning that 20-upload limit could be reached even faster. This could be because of user error or a complex project requiring multiple pieces, but for any person who uses a Cricut in their business, that means they could very well be forced to subscribe if they want to keep up their normal level of productivity.
In a statement provided to The Verge, Cricut said it remains committed to its plan and “creating the best possible experience for [its] members”:
Cricut announced changes to our Design Space software, including new Offset and Project Collection features, as well as an update to personal uploads, limiting image and pattern uploads to 20 per month for members without a Cricut Access subscription. All users will still be able to design and cut regardless of uploads. Cricut remains dedicated to creating the best possible experience for our members, and we will continue to support our community of makers as our top priority.
The response to Cricut’s planned change is also boiling over on the company’s unofficial subreddit. Cricut users have shared the contact information of Cricut employees and launched a Change.org petition in protest of the upcoming update. Cricut says it will start prompting users to subscribe in Design Space in the next few weeks until the upload limit goes into effect at an unspecified date.
BMW is pulling the curtain back on its next iteration of iDrive, the software and infotainment platform that has served as the centerpiece of the automaker’s in-car experience for the last 20 years.
The eighth version of iDrive will mostly live on a new “curved” display that starts behind the steering wheel and extends halfway across the dashboard. This involves merging the 12.3-inch instrument cluster and the central 14.9-inch infotainment screen into a single unit angled toward the driver. The size of the screen will vary, depending on the vehicle, but the screen will have the appearance of “floating,” the automaker said. The new iDrive will make its debut later this year in BMW’s new iX electric SUV, as well as the BMW i4 electric sedan.
The brain of this car will also be a significant improvement over past models, BMW says. The onboard computer will be able to process 20 to 30 times the data volume of previous models, or around double the amount of data that was previously possible. This will enable a greater fusion of the vehicle’s sensors, which will help enable higher levels of autonomous driving.
According to BMW chief technology officer Frank Weber, iDrive is a “major step” toward fully autonomous vehicles. He explained that iDrive is designed to support both Level 2 and Level 3 autonomous driving systems.
“It is not an evolutionary step from what we had in the [previous] generation,” Weber said. “It’s an all new, all new system when it comes to sensors, computing, [and] the way it was developed.”
Advanced driver assistance systems, defined as Level 2 by the Society of Automotive Engineers, include lane keeping, blind-spot detection, automatic emergency braking, and adaptive cruise control. Most major automakers include some version of advanced driver assistance in their vehicles today. Level 3 refers to highly automated driving, also called conditional automation, where the driver still needs to be able to take over the vehicle upon request.
Other automakers have been tripped up by the promise of Level 3 driving. Audi, for example, said its A8 sedan would come with a feature called Traffic Jam Pilot that, when active, would relieve human drivers of the need to pay attention during stop-and-go traffic. But the feature was contingent on approval from local authorities, and Traffic Jam Pilot remains dormant in most markets around the world. Audi has no plans to activate the feature, and Level 3 automation remains a morass of legal, regulatory, and business-related challenges.
Weber wouldn’t say when BMW would introduce Level 3 automation and hinted it was conditional on racking up more test miles in vehicles equipped with the new version of iDrive.
“Nobody currently can offer at start of production Level 3 capabilities, because you need so many test miles,” he said. “And so you need a production vehicle, and then you run all your validation tests for Level 3.”
iDrive can be controlled with touch, voice activation, or gesture control. There are three main layouts: Drive, in which drivers can use a “dynamically changing area in the center of the information display to show individually selectable information”; Focus, “designed for extremely dynamic driving situations”; and Gallery, which minimizes the driving content “to clear as much space as possible for widget content.”
There is a theme of personalization that runs through the automaker’s new software update. BMW’s Personal Intelligent Assistant, built on top of Microsoft’s Azure cloud system, will “adjust to the driver’s individual needs and routines,” the company says, making it “a central operating channel of human-machine interaction.”
The virtual assistant, which has been available in BMW’s cars for a number of years, will play the role of a “digital character which can engage in natural dialogue with the driver and front passenger – similarly to a relationship between humans.” Expect some similarities to Mercedes-Benz’s MBUX system or Volvo’s Android-powered Google Assistant.
Using ultra-wideband (UWB) technology, iDrive will be able to load a driver’s personal settings as soon as they start approaching the vehicle by sensing their key fob or smartphone. BMW describes this as a “great entrance moment,” which includes geometric projections, lighted door handles, and other lighting effects.
There will be three driving modes: sport, personal, and efficient. These control driving functions like engine throttle, steering characteristics, regenerative braking, and chassis settings, as well as internal and external sounds. New modes may be added via over-the-air software updates in the future.
Information about navigation, parking, and EV charging will be fully integrated into iDrive. BMW extends its theme of personalization to its mapping capabilities with a new feature called “learning navigation,” in which the vehicle will learn and anticipate the destination the driver is likely to head for next, based on the driver’s personal ID. This is meant to be a time saver, as well as a way to identify possible road hazards that may delay the journey.
iDrive will support both Apple CarPlay and Android Auto wirelessly, the company said. For several years, BMW had the dubious distinction of being one of the few automakers to charge its customers an annual fee to mirror their smartphone’s display on their car’s infotainment screen. BMW reversed that decision in 2019, and since then offers both CarPlay and Android Auto to its customers for free.
(Pocket-lint) – For 2021 it seems like Asus is going after competitive gamers with the new versions of the ROG Strix G15 and G17. The G15 is now not only more compact than the previous model, but it also boasts Nvidia’s RTX 3000 series graphics cards, which are notoriously hard to get hold of in the desktop world.
On paper, the ROG Strix G15 (G513) is a gaming beast with some great options, whether you favour fast screens or stunning visuals. That’s because not only is there a lot of power under the hood, it’s also specced with a choice of Full HD 300Hz or a WQHD 165Hz panel.
The aim is to create either gaming powerhouses or the ultimate portable e-sports machines, depending on your preference. Or maybe both? We’ve been living with, working with, and gaming with the Strix G15 for a couple of weeks to see how it all stacks up.
Power and prowess
Up to an Nvidia GeForce RTX 3070 Max Q 8GB GDDR6 GPU
Up to AMD Ryzen 9 5900HX CPU
Up to 32GB DDR4 3200MHz SDRAM
Up to 1TB M.2 NVMe storage
Don’t be fooled by the compact-yet-snazzy frame of Asus ROG Strix G15 – this is still a powerhouse of a gaming machine. It needs to be as well, in order to reach the lofty goals Asus is aiming for and, of course, to make the most of the 300Hz refresh rate screen (as specified for this review).
On the outside, the Strix G15 retains the usual Asus ROG aesthetic. It boasts an aluminium chassis, a textured finish with ROG logo etching, and a backlit keyboard with RGB underlighting. It’s compact and stylish – but beauty is more than skin deep as the goodness continues when you open the lid and turn the device on.
As you’d expect, the G15 uses NVMe storage, so it boots to Windows in the blink of an eye when you press that power button. Your experience is going to vary depending on whether you go for the WQHD (1440p) model or the Full HD (1080p) one, but even at 1080p we were struck by just how easy-on-the-eye the panel is.
It’s rich and vivid. But more importantly, it’s fast and accurate. With Armoury Crate, you can also adjust the visuals and switch between several pre-programmed settings including Vivid, Cinema, RTS, FPS, and Eye Care. That last one is our favourite for working during the day as it reduces the blue light and makes the screen easier on the eye while you’re beavering away.
Gaming goodness
Display options: Full HD 1080p 300Hz IPS // WQHD 1440p 165Hz, 3ms response
62.5% sRGB, 47.34% Adobe RGB, Adaptive Sync
Benchmarks: PC Mark, TimeSpy, TimeSpy Extreme, Port Royale, FireStrike Ultra, FireStrike Extreme
The screen really shines when you get into a good gaming session of course. If you opt for the 300Hz panel, you can push competitive games to their limit and theoretically make the most of the display’s nifty fast refresh rate.
We played Rainbow Six Siege at around 200fps on Max settings. We managed 66fps average on Dirt 5, 80fps on Far Cry 5, and 60-70fps on Assassin’s Creed Odyssey. Even outputting to a 32-inch Samsung Odyssey G7, the G15 still did the business with comparable frame rates.
Other games, including CS:GO and Apex Legends, are bound to make the most of this panel too.
Connection options and downfalls
3x USB 3.2 Gen 1 Type-A, 1x USB 3.2 Gen 2 Type-C (supports DisplayPort and 100W PD charging)
1x LAN RJ-45 Ethernet jack, 1x HDMI 2.0b, 1x audio combo jack
Wi-Fi 6 802.11ax (2×2), Bluetooth w/ support for Range Boost
One of our niggles with the Strix G15 is the connection options. While there are some highlights – there’s an Ethernet port, for example – it lacks a DisplayPort or Mini DisplayPort connection unless you use a USB-C to DisplayPort adapter.
It also doesn’t have a webcam as standard. We begrudge that in a world where everyone’s on Zoom calls or Microsoft Teams meetings. Yes, it could be countered by simply buying an external webcam but that’s more expense. It also means you’d be using up one of the precious few USB ports as well. Plug in a mouse (no right-minded gamer would use a trackpad), headset and external keyboard and you’ve barely got any ports left.
That said, a nice amount of thought has been put into the overall design. The USB ports are located on the left and rear, meaning if you do plug in a gaming mouse you won’t find cables getting in the way while you play.
Great sound and positional audio too
Up to Twin 4W Smart Amp speakers
Built-in array microphone
Dolby Atmos compatibility
AI microphone noise cancellation
Compact and thin gaming laptops usually run hot and loud in our experience. Sometimes painfully so. The Strix G15 is intelligently designed with excellent cooling vents that seem to keep it running cool under pressure.
It has various fan modes too, all of which can be selected from within the Armoury Crate software. Choose from Windows, Silent, Performance and Turbo modes. Under general use, the Strix G15 is pleasantly quiet and barely ramps up when watching video, surfing the web or working away.
Under gaming load it remains fairly quiet too. This is in part thanks to Nvidia’s Whisper Mode technology, which uses AI-powered analysis to adjust cooling to keep things quiet as well as cool. Of course, if you need power for the best performance then you can ramp things up with Turbo and Performance modes. These bring more frames per second potential – but also more fan noise.
We’re happy to report that even under pressure the Strix G15 is not only quieter than other gaming laptops we’ve tried, but the speakers also do a great job of drowning out what fan noise there is, so it won’t spoil your gaming fun.
Those speakers are also smashing in other ways. They’re great sounding and deliver satisfying audio whatever you’re doing. A two-way AI-noise cancellation mic also removes fan noise and background noise from your chat as well – whether you’re on a work call or gaming with friends.
Battery longevity
90WHr 4-cell li-ion battery
240W AC adapter, 100W PD Charger via USB-C
Despite its compact frame, the G15 packs in some neat charging tech, including the ability to fast-charge to as much as 50 per cent in just 30 minutes at the plug.
But the highlight for us is that under standard, everyday load – browsing and working – we managed to get between five and six hours out of it before it needed charging. That’s great longevity, and pleasing if you’re away from a plug or just want to work wire-free around the home.
Unlike other models we’ve tried, the Strix G15 is also capable of playing games when not plugged in with semi-decent performance. We managed to play Rainbow Six Siege at between 30-60fps while unplugged and other casual or less taxing games like Valheim will run nicely too.
Verdict
The Asus ROG Strix G15 is a great bit of kit that’s solidly built and powerful enough to make light work of modern games.
It doesn’t annoy with excessive fan noise, instead delighting with visual pleasures and audible goodness. The battery life means you can happily work all day too. Indeed, about the only irks are the limited connections and lack of webcam.
All told, the ROG Strix G15 is a great gaming device with far more delights than downfalls. It’s one of the best gaming laptops we’ve seen to date.
Also consider
Asus ROG Zephyrus Duo 15
If you want something even slimmer and with extra screens then look no further than the Zephyrus Duo 15. It runs a bit hotter, but it’s also a really pleasing laptop whether you’re working or gaming.
Asus ROG Zephyrus Duo 15 (GX550) review: Too hot to handle?
Gigabyte Aorus 17X
If you don’t mind your laptop being a bit fatter, then the Gigabyte Aorus 17X is an interesting choice as well. It’s designed as a desktop replacement with some serious power under the hood. It also has great highlights including a mechanical keyboard and AI designed to help tune performance.
AMD unveiled its EPYC 7003 ‘Milan’ processors today, claiming that the chips, which bring the company’s powerful Zen 3 architecture to the server market for the first time, take the lead as the world’s fastest server processor with its flagship 64-core 128-thread EPYC 7763. Like the rest of the Milan lineup, this chip comes fabbed on the 7nm process and is drop-in compatible with existing servers. AMD claims it brings up to twice the performance of Intel’s competing Xeon Cascade Lake Refresh chips in HPC, Cloud, and enterprise workloads, all while offering a vastly better price-to-performance ratio.
Milan’s agility lies in the Zen 3 architecture and its chiplet-based design. This microarchitecture brings many of the same benefits that we’ve seen with AMD’s Ryzen 5000 series chips that dominate the desktop PC market, like a 19% increase in IPC and a larger unified L3 cache. Those attributes, among others, help improve AMD’s standing against Intel’s venerable Xeon lineup in key areas, like single-threaded work, and offer a more refined performance profile across a broader range of applications.
The other attractive features of the EPYC lineup are still present, too, like enhanced security, leading memory bandwidth, and the PCIe 4.0 interface. AMD also continues its general approach of offering all features with all of its chips, as opposed to Intel’s strict de-featuring that it uses to segment its product stack. As before, AMD also offers single-socket P-series models, while its standard lineup is designed for dual-socket (2P) servers.
The Milan launch promises to reignite the heated data center competition. Today marks the EPYC Milan processors’ official launch, but AMD actually began shipping the chips to cloud service providers and hyperscale customers last year. Overall, the EPYC Milan processors look to be exceedingly competitive against Intel’s competing Xeon Cascade Lake Refresh chips.
Like AMD, Intel has also been shipping to its largest customers; the company recently told us that it has already shipped 115,000 Ice Lake chips since the end of last year. Intel also divulged a few details about its Ice Lake Xeons at Hot Chips last year; we know the company has a 32-core model in the works, and it’s rumored that the series tops out at 40 cores. As such, Ice Lake will obviously change the competitive landscape when it arrives.
AMD has chewed away desktop PC and notebook market share at an amazingly fast pace, but the data center is a much tougher market to crack. While this segment represents the golden land of high-volume and high-margin sales, the company’s slow and steady gains there lag its radical advance in the desktop PC and notebook markets.
Much of that boils down to the staunchly risk-averse customers in the enterprise and data center; these customers prize a mix of factors beyond the standard measuring stick of performance and price-to-performance ratios, instead focusing on areas like compatibility, security, supply predictability, reliability, serviceability, engineering support, and deeply-integrated OEM-validated platforms. To cater to the broader set of enterprise customers, AMD’s Milan launch also carries a heavy focus on broadening AMD’s hardware and software ecosystems, including full-fledged enterprise-class solutions that capitalize on the performance and TCO benefits of the Milan processors.
AMD’s existing EPYC Rome processors already hold the lead in performance-per-socket and pricing, easily outstripping Intel’s Xeon at several key price points. Given AMD’s optimizations, Milan will obviously extend that lead, at least until the Ice Lake debut. Let’s see how the hardware stacks up.
AMD EPYC 7003 Series Milan Specifications and Pricing

| Model | Cores / Threads | Base / Boost (GHz) | L3 Cache (MB) | TDP (W) | 1K Unit Price |
|---|---|---|---|---|---|
| EPYC Milan 7763 | 64 / 128 | 2.45 / 3.5 | 256 | 280 | $7,890 |
| EPYC Milan 7713 | 64 / 128 | 2.0 / 3.675 | 256 | 225 | $7,060 |
| EPYC Rome 7H12 | 64 / 128 | 2.6 / 3.3 | 256 | 280 | ? |
| EPYC Rome 7742 | 64 / 128 | 2.25 / 3.4 | 256 | 225 | $6,950 |
| EPYC Milan 7663 | 56 / 112 | 2.0 / 3.5 | 256 | 240 | $6,366 |
| EPYC Milan 7643 | 48 / 96 | 2.3 / 3.6 | 256 | 225 | $4,995 |
| EPYC Milan 75F3 | 32 / 64 | 2.95 / 4.0 | 256 | 280 | $4,860 |
| EPYC Milan 7453 | 28 / 56 | 2.75 / 3.45 | 64 | 225 | $1,570 |
| Xeon Gold 6258R | 28 / 56 | 2.7 / 4.0 | 38.5 | 205 | $3,651 |
| EPYC Milan 74F3 | 24 / 48 | 3.2 / 4.0 | 256 | 240 | $2,900 |
| EPYC Rome 7F72 | 24 / 48 | 3.2 / ~3.7 | 192 | 240 | $2,450 |
| Xeon Gold 6248R | 24 / 48 | 3.0 / 4.0 | 35.75 | 205 | $2,700 |
| EPYC Milan 7443 | 24 / 48 | 2.85 / 4.0 | 128 | 200 | $2,010 |
| EPYC Rome 7402 | 24 / 48 | 2.8 / 3.35 | 128 | 180 | $1,783 |
| EPYC Milan 73F3 | 16 / 32 | 3.5 / 4.0 | 256 | 240 | $3,521 |
| EPYC Rome 7F52 | 16 / 32 | 3.5 / ~3.9 | 256 | 240 | $3,100 |
| Xeon Gold 6246R | 16 / 32 | 3.4 / 4.1 | 35.75 | 205 | $3,286 |
| EPYC Milan 7343 | 16 / 32 | 3.2 / 3.9 | 128 | 190 | $1,565 |
| EPYC Rome 7302 | 16 / 32 | 3.0 / 3.3 | 128 | 155 | $978 |
| EPYC Milan 72F3 | 8 / 16 | 3.7 / 4.1 | 256 | 180 | $2,468 |
| EPYC Rome 7F32 | 8 / 16 | 3.7 / ~3.9 | 128 | 180 | $2,100 |
| Xeon Gold 6250 | 8 / 16 | 3.9 / 4.5 | 35.75 | 185 | $3,400 |
AMD released a total of 19 EPYC Milan SKUs today, but we’ve winnowed that down to key price bands in the table above. We have the full list of the new Milan SKUs later in the article.
As with the EPYC Rome generation, Milan spans from eight to 64 cores, while Intel’s Cascade Lake Refresh tops out at 28 cores. All Milan models support simultaneous multithreading, up to eight channels of DDR4-3200 memory, 4TB of memory capacity, and 128 lanes of PCIe 4.0 connectivity. AMD supports both standard single- and dual-socket platforms, with the P-series chips slotting in for single-socket servers (we have those models in the expanded list below). The chips are drop-in compatible with the existing Rome socket.
AMD added frequency-optimized 16-, 24-, and 32-core F-series models to the Rome lineup last year, helping the company boost its performance in frequency-bound workloads, like databases, that Intel has typically dominated. Those models return with a heavy focus on higher clock speeds, cache capacities, and TDPs compared to the standard models. AMD also added a highly-clocked 64-core 7H12 model for HPC workloads to the Rome lineup, but simply worked that higher-end class of chip into its standard Milan stack.
As such, the 64-core 128-thread EPYC 7763 comes with a 2.45 / 3.5 GHz base/boost frequency paired with a 280W TDP. This flagship part also comes armed with 256MB of L3 cache and supports a configurable TDP that can be adjusted to accommodate any TDP from 225W to 280W.
The 7763 marks the peak TDP rating for the Milan series, but the company has a 225W 64-core 7713 model that supports a TDP range of 225W to 240W for more mainstream applications.
All Milan models come with a default TDP rating (listed above), but they can operate between a lower minimum (cTDP Min) and a higher maximum (cTDP Max) threshold, allowing quite a bit of configurability within the product stack. We have the full cTDP ranges for each model listed in the expanded spec list below.
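As a rough sketch of how such a cTDP window behaves, a requested value is simply clamped to the chip’s supported range. The function below is our own illustration (not AMD’s actual configuration interface), using the 7763 and 7713 figures quoted above:

```python
# Illustrative model of a configurable TDP (cTDP) window; the clamping
# behavior is inferred from the article, not AMD's actual firmware API.

def configure_tdp(requested_w: int, ctdp_min_w: int, ctdp_max_w: int) -> int:
    """Clamp a requested TDP (watts) to the supported cTDP window."""
    return max(ctdp_min_w, min(requested_w, ctdp_max_w))

# EPYC 7763: default 280W, configurable between 225W and 280W
print(configure_tdp(300, 225, 280))  # 280 - capped at cTDP Max
print(configure_tdp(225, 225, 280))  # 225 - lowest supported setting
# EPYC 7713: default 225W, configurable between 225W and 240W
print(configure_tdp(200, 225, 240))  # 225 - raised to cTDP Min
```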
Milan’s adjustable TDPs now allow customers to tailor for different thermal ranges, and Forrest Norrod, AMD’s SVP and GM of the data center and embedded solutions group, says that the shift in strategy comes from the lessons learned from the first F- and H-series processors. These 280W processors were designed for systems with robust liquid cooling, which tends to add quite a bit of cost to the platform, but OEMs were surprisingly adept at engineering air-cooled servers that could fully handle the heat output of those faster models. As such, AMD decided to add a 280W 64-core model to the standard lineup and expanded the ability to manipulate TDP ranges across its entire stack.
AMD also added new 28- and 56-core options with the EPYC 7453 and 7663, respectively. Norrod explained that AMD had noticed that many of its customers had optimized their applications for Intel’s top-of-the-stack servers that come with multiples of 28 cores. Hence, AMD added new models that would mesh well with those optimizations to make it easier for customers to port over applications optimized for Xeon platforms. Naturally, AMD’s 28-core’s $1,570 price tag looks plenty attractive next to Intel’s $3,651 asking price for its own 28-core part.
AMD made a few other adjustments to the product stack based on customer buying trends, like reducing three eight-core models to one F-series variant, and removing a 12-core option entirely. AMD also added support for six-way memory interleaving on all models to lower costs for workloads that aren’t sensitive to memory throughput.
Overall, Milan has similar TDP ranges, memory support, and PCIe support at any given core count as its predecessors but comes with higher clock speeds, performance, and pricing.
Milan also comes with the performance uplift granted by the Zen 3 microarchitecture. Higher IPC and frequencies, not to mention more refined boost algorithms that extract the utmost performance within the thermal confines of the socket, help improve Milan’s performance in the lightly-threaded workloads where Xeon has long held an advantage. The higher per-core performance also translates to faster performance in threaded workloads, too.
Meanwhile, the larger unified L3 cache results in a simplified topology that ensures broader compatibility with standard applications, thus removing the lion’s share of the rare eccentricities that we’ve seen with prior-gen EPYC models.
The Zen 3 microarchitecture brings the same fundamental advantages that we’ve seen with the desktop PC and notebook models (you can read much more about the architecture here), like reduced memory latency, doubled INT8 and floating point performance, and higher integer throughput.
AMD also added support for memory protection keys, AVX2 support for the VAES/VPCLMULQDQ instructions, bolstered security for hypervisors and VM memory/registers, added protection against return-oriented programming attacks, and made a just-in-time update to the Zen 3 microarchitecture to provide in-silicon mitigation for the Spectre vulnerability (among other enhancements listed in the slides above). As before, Milan remains unimpacted by other major security vulnerabilities, like Meltdown, Foreshadow, and Spoiler.
The EPYC Milan SoC adheres to the same (up to) nine-chiplet design as the Rome models and is drop-in compatible with existing second-gen EPYC servers. Just like the consumer-oriented chips, Core Complex Dies (CCDs) based on the Zen 3 architecture feature eight cores tied to a single contiguous 32MB slice of L3 cache, which stands in contrast to Zen 2’s two four-core CCXes, each with its own 16MB cluster. The new arrangement gives all eight cores direct access to the full 32MB of L3 cache, reducing latency.
This design also increases the amount of cache available to a single core, thus boosting performance in multi-threaded applications and enabling lower-core count Milan models to have access to significantly more L3 cache than Rome models. The improved core-to-cache ratio boosts performance in HPC and relational database workloads, among others.
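As a quick back-of-the-envelope illustration, using only the cache figures cited above, here is how the unified CCD changes the L3 visible to any single core (the 8-CCD layout for a low-core-count part is an assumed example, not a confirmed die configuration):

```python
# Illustrative arithmetic only: L3 reachable by one core on Zen 2 vs Zen 3 CCDs,
# using the figures cited in the article.
zen2_ccx_cores, zen2_ccx_l3_mb = 4, 16   # Rome: two 4-core CCXes per CCD, 16MB each
zen3_ccd_cores, zen3_ccd_l3_mb = 8, 32   # Milan: one 8-core CCX per CCD, unified 32MB

# A single Zen 2 core can only reach its own CCX's 16MB slice;
# a Zen 3 core sees the full 32MB of its CCD.
print(f"Zen 2 L3 reachable per core: {zen2_ccx_l3_mb} MB")
print(f"Zen 3 L3 reachable per core: {zen3_ccd_l3_mb} MB")

# Hypothetical low-core-count layout: 8 CCDs with 2 active cores each still
# exposes the full 256MB of L3 to just 16 cores.
active_ccds, cores_per_ccd = 8, 2
print(f"Total L3: {active_ccds * zen3_ccd_l3_mb} MB for {active_ccds * cores_per_ccd} cores")
```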
Second-gen EPYC models supported either 8- or 4-channel memory configurations, but Milan adds support for 6-channel interleaving, allowing customers that aren’t memory bound to use less system RAM to reduce costs. The 6-channel configuration supports the same DDR4-3200 specification for single DIMM per channel (1DPC) implementations. This feature is enabled across the full breadth of the Milan stack, but AMD sees it as most beneficial for models with lower core counts.
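The idea behind interleaving can be sketched as a simple round-robin mapping of cache lines to channels. Note that AMD's actual address-hashing scheme isn't detailed here, so this mapping is purely an illustrative assumption:

```python
# A hedged sketch (not AMD's actual hash) of spreading cache-line addresses
# round-robin across 4, 6, or 8 memory channels.
CACHE_LINE = 64  # bytes

def channel_for(addr: int, channels: int) -> int:
    """Map a physical address to a channel by interleaving at cache-line granularity."""
    return (addr // CACHE_LINE) % channels

# With 6-way interleaving, consecutive cache lines cycle through all 6 channels,
# so even sequential streams draw on every channel's bandwidth.
lines = [channel_for(n * CACHE_LINE, 6) for n in range(8)]
print(lines)  # -> [0, 1, 2, 3, 4, 5, 0, 1]
```

The point of supporting a 6-channel mode natively is that populating only six channels still interleaves evenly, rather than leaving bandwidth stranded on an 8-channel mapping.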
Milan also features the same 32-bit AMD Secure Processor in the I/O Die (IOD) that manages cryptographic functionality, like key generation and management for AMD’s hardware-based Secure Memory Encryption (SME) and Secure Encrypted Virtualization (SEV) features. These are key advantages over Intel’s Cascade Lake processors, but Ice Lake will bring its own memory encryption features to bear. AMD’s Secure Processor also manages its hardware-validated boot feature.
AMD EPYC Milan Performance
AMD provided its own performance projections based on its internal testing. However, as with all vendor-provided benchmarks, we should view these with the appropriate level of caution. We’ve included the testing footnotes at the end of the article.
AMD claims the Milan chips are the fastest server processors for HPC, cloud, and enterprise workloads. The first slide outlines AMD’s progression compared to Intel in SPECrate2017_int_base over the last few years, highlighting its continued trajectory of significant generational performance improvements. The second slide outlines how SPECrate2017_int_base scales across the Milan product stack, with Intel’s best published scores for two key Intel models, the 28-core 6258R and 16-core 4216, added for comparison.
Moving on to a broader set of applications, AMD says existing two-socket 7H12 systems already hold an easy lead over Xeon in the SPEC2017 floating point tests, but the Milan 7763 widens the gap to a 106% advantage over the Xeon 6258R. AMD uses this comparison for the two top-of-the-stack chips, but be aware that this is a bit lopsided: The 6258R carries a tray price of $3,651 compared to the 7763’s $7,890 asking price. AMD also shared benchmarks comparing the two in SPEC2017 integer tests, claiming a similar 106% speedup. In SPECJBB 2015 tests, which AMD uses as a general litmus for enterprise workloads, AMD claims 117% more performance than the 6258R.
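Taking AMD's claimed 106% SPEC2017 FP advantage and the two tray prices at face value, a quick perf-per-dollar check shows why the comparison is lopsided:

```python
# Back-of-the-envelope perf-per-dollar from the figures cited above:
# AMD claims the 7763 is ~106% faster than the Xeon 6258R in SPEC2017 FP.
xeon_price, epyc_price = 3651, 7890
xeon_perf = 1.0           # normalize the 6258R to 1.0
epyc_perf = 1.0 + 1.06    # 106% faster, per AMD's claim

ratio = (epyc_perf / epyc_price) / (xeon_perf / xeon_price)
print(f"7763 perf-per-dollar vs 6258R: {ratio:.2f}x")  # -> 0.95x
```

By this rough measure the 7763 roughly breaks even on perf-per-dollar despite doubling raw performance, which is why density and TCO arguments, rather than chip price alone, carry the comparison.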
The company also shared a few test results showing performance in the middle of its product stack compared to Intel’s 6258R, claiming that its 32-core part also outperforms that chip. All of this translates to improved TCO for customers: lower pricing and higher compute density mean fewer servers, lower space requirements, and lower overall power consumption.
Finally, AMD has a broad range of ecosystem partners with fully-validated platforms available from top-tier OEMs like Dell, HP, and Lenovo, among many others. These platforms are fed by a broad constellation of solutions providers as well. AMD also has an expansive list of instances available from leading cloud service providers like AWS, Azure, Google Cloud, and Oracle, to name a few.
| Model # | Cores | Threads | Base Freq (GHz) | Max Boost Freq (GHz) | Default TDP (W) | cTDP Min (W) | cTDP Max (W) | L3 Cache (MB) | DDR Channels | Max DDR Freq (1DPC) | PCIe 4 | 1Ku Pricing |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 7763 | 64 | 128 | 2.45 | 3.50 | 280 | 225 | 280 | 256 | 8 | 3200 | x128 | $7,890 |
| 7713 | 64 | 128 | 2.00 | 3.68 | 225 | 225 | 240 | 256 | 8 | 3200 | x128 | $7,060 |
| 7713P | 64 | 128 | 2.00 | 3.68 | 225 | 225 | 240 | 256 | 8 | 3200 | x128 | $5,010 |
| 7663 | 56 | 112 | 2.00 | 3.50 | 240 | 225 | 240 | 256 | 8 | 3200 | x128 | $6,366 |
| 7643 | 48 | 96 | 2.30 | 3.60 | 225 | 225 | 240 | 256 | 8 | 3200 | x128 | $4,995 |
| 75F3 | 32 | 64 | 2.95 | 4.00 | 280 | 225 | 280 | 256 | 8 | 3200 | x128 | $4,860 |
| 7543 | 32 | 64 | 2.80 | 3.70 | 225 | 225 | 240 | 256 | 8 | 3200 | x128 | $3,761 |
| 7543P | 32 | 64 | 2.80 | 3.70 | 225 | 225 | 240 | 256 | 8 | 3200 | x128 | $2,730 |
| 7513 | 32 | 64 | 2.60 | 3.65 | 200 | 165 | 200 | 128 | 8 | 3200 | x128 | $2,840 |
| 7453 | 28 | 56 | 2.75 | 3.45 | 225 | 225 | 240 | 64 | 8 | 3200 | x128 | $1,570 |
| 74F3 | 24 | 48 | 3.20 | 4.00 | 240 | 225 | 240 | 256 | 8 | 3200 | x128 | $2,900 |
| 7443 | 24 | 48 | 2.85 | 4.00 | 200 | 165 | 200 | 128 | 8 | 3200 | x128 | $2,010 |
| 7443P | 24 | 48 | 2.85 | 4.00 | 200 | 165 | 200 | 128 | 8 | 3200 | x128 | $1,337 |
| 7413 | 24 | 48 | 2.65 | 3.60 | 180 | 165 | 200 | 128 | 8 | 3200 | x128 | $1,825 |
| 73F3 | 16 | 32 | 3.50 | 4.00 | 240 | 225 | 240 | 256 | 8 | 3200 | x128 | $3,521 |
| 7343 | 16 | 32 | 3.20 | 3.90 | 190 | 165 | 200 | 128 | 8 | 3200 | x128 | $1,565 |
| 7313 | 16 | 32 | 3.00 | 3.70 | 155 | 155 | 180 | 128 | 8 | 3200 | x128 | $1,083 |
| 7313P | 16 | 32 | 3.00 | 3.70 | 155 | 155 | 180 | 128 | 8 | 3200 | x128 | $913 |
| 72F3 | 8 | 16 | 3.70 | 4.10 | 180 | 165 | 200 | 256 | 8 | 3200 | x128 | $2,468 |
Thoughts
AMD’s general launch today gives us a good picture of the company’s data center chips moving forward, but we won’t know the full story until Intel releases the formal details of its 10nm Ice Lake processors.
The volume ramp for both AMD’s EPYC Milan and Intel’s Ice Lake has been well underway for some time, and both lineups have been shipping to hyperscalers and CSPs for several months. The HPC and supercomputing space also tends to receive early silicon, so it serves as a solid general litmus for the future of the market. AMD’s EPYC Milan has already enjoyed brisk uptake in those segments, and given that Intel’s Ice Lake hasn’t been at the forefront of as many HPC wins, it’s easy to assume, by an admittedly subjective measure, that Milan could hold some advantages over Ice Lake.
Intel has already slashed its pricing on server chips to remain competitive with AMD’s EPYC onslaught. It’s easy to imagine that the company will lean on its incumbency and all the advantages that entails, like its robust Server Select platform offerings, wide software optimization capabilities, platform adjacencies like networking, FPGA, and Optane memory, along with aggressive pricing to hold the line.
AMD has obviously prioritized its supply of server processors during the pandemic-fueled supply chain disruptions and explosive demand that we’ve seen over the last several months. It’s natural to assume that the company has been busy building Milan inventory for the general launch. We spoke with AMD’s Forrest Norrod, and he tells us that the company is taking steps to ensure that it has an adequate supply for its customers with mission-critical applications.
One thing is clear, though. Both x86 server vendors benefit from a rapidly expanding market, but ARM-based servers have become more prevalent than we’ve seen in the recent past. For now, the bulk of the ARM uptake seems limited to cloud service providers, like AWS with its Graviton 2 chips. In contrast, uptake is slow in the general data center and enterprise due to the complexity of shifting applications to the ARM architecture. Continuing and broadening uptake of ARM-based platforms could begin to change that paradigm in the coming years, though, as x86 faces its most potent threat in recent history. Both x86 vendors will need a steady cadence of big performance improvements in the future to hold the ARM competition at bay.
Unfortunately, we’ll have to wait for Ice Lake to get a true view of the competitive x86 landscape over the next year. That means the jury is still out on just what the data center will look like as AMD works on its next-gen Genoa chips and Intel readies Sapphire Rapids.
Tightly curved monitors like the MSI MPG Artymis 343CQR can really enhance gameplay, especially in first-person environments. With class-leading contrast, accurate out-of-box color and superb HDR, the 343CQR should be on everyone’s curved screen short list.
For
High contrast
Accurate out-of-box color
Solid gaming performance
1000R curve
Against
Slightly light gamma
Blur reduction feature makes the screen too bright
Higher input lag than some 144 Hz screens
Features and Specifications
In the world of curved monitors, there are more things to consider than just screen size. Not only do they come in three different aspect ratios (16:9, 21:9 and 32:9), they also come in a wide variety of curve radii. This number is expressed in millimeters, like 1500R or 1800R. Larger numbers indicate less curvature. When you see 1000R, you know the curve is as extreme as it gets.
MSI has jumped on the 1000R train with its MPG Artymis 343CQR. In addition to that tight curve, it sports a high-contrast VA panel running at 3440×1440 resolution with USB-C, HDR support, Adaptive-Sync and an impressive 165 Hz refresh rate worthy of competing with the best gaming monitors. Selling for a premium price ($900 as of writing), the 343CQR is a sharply focused display that is at its best when gaming — going even as far as to include an aim magnifier for shooters.
MSI MPG Artymis 343CQR Specs
Panel Type / Backlight: VA / W-LED, edge array
Screen Size, Aspect Ratio & Curve: 34 inches / 21:9 / 1000mm curve radius
Max Resolution & Refresh: 3440×1440 @ 165 Hz; FreeSync: 48-165 Hz
Native Color Depth & Gamut: 10-bit (8-bit+FRC) / DCI-P3; DisplayHDR 400, HDR10
Response Time (MPRT): 1ms
Brightness (mfr): SDR 350 nits; HDR 550 nits
Contrast (mfr): 3,000:1
Speakers: None
Video Inputs: 1x DisplayPort 1.4; 2x HDMI 2.0; 1x USB-C
Audio: 3.5mm headphone output
USB 3.2: 1x up, 2x down
Power Consumption: 32.6W @ 200 nits brightness
Panel Dimensions (WxHxD w/base): 31.3 x 16.5-20.5 x 12.4 inches (795 x 419-521 x 315mm)
Panel Thickness: 6.5 inches (165mm)
Bezel Width: Top/sides 0.4 inch (9mm); bottom 0.9 inch (22mm)
Weight: 20.2 pounds (9.2kg)
Warranty: 3 years
The 343CQR is all about gaming with support for AMD FreeSync from 48-165 Hz. It’s not G-Sync Compatible-certified, but we still got Nvidia G-Sync to work (see our How to Run G-Sync on a FreeSync Monitor article for instructions).
MSI’s specs sheet includes nearly 85% coverage of the DCI-P3 color gamut. You’ll be using that gamut for all content, SDR and HDR alike, because there is no sRGB mode available.
MSI designed the 343CQR with consoles in mind too. It will accept 4K resolution signals and down-convert them to 3440 x 1440 resolution. The 343CQR is also the first monitor we’ve seen with HDMI CEC (Consumer Electronics Control). Originally developed to support universal remotes, the CEC implementation in this monitor is designed to sense whether the incoming signal is coming from a PC or a console and adjust its picture mode based on designated profiles. The feature supports both PlayStation and Nintendo Switch.
Assembly and Accessories of MSI MPG Artymis 343CQR
To assemble the MSI MPG Artymis 343CQR, the panel and upright are mated with four fasteners, so you’ll need to have a Phillips-head screwdriver handy. Next, you attach the base with a captive bolt. The resulting package is rock-solid and shows impressive build quality. It certainly meets the standard one expects for the price.
Bundled cables include IEC for the internal power supply, DisplayPort, HDMI and USB. A small snap-on cover hides the panel’s mounting hardware. And if you’d rather use a monitor arm, the bolt holes are in a 100mm VESA pattern with large-head bolts included. In a nice touch, a small hook snaps onto the bottom of the panel to help manage your best gaming mouse’s cable.
MSI MPG Artymis Product 360
From the front, the MSI MPG Artymis 343CQR is all business with a thin flush bezel around the top and sides and a molded strip across the bottom adorned only with a small MSI logo. A tiny LED appears red in standby mode and white when the power’s on. Around the back right is a joystick and two buttons. One activates the Gaming OSD (on-screen display) app, and the other toggles power.
The upright is very solid with a stiff-moving 4-inch height adjustment. You also get 30 degrees swivel to both sides and 5/20 degrees tilt. There isn’t even a hint of slop or wobble. A small hole helps tidy up cables. The base is solid metal with thin legs that go more than 1 foot deep. That, combined with the panel’s 6.5-inch thickness, means you’ll need a bit of extra desktop space to accommodate the 343CQR.
From the top, you can see the 1000R curvature clearly. That radius means that if you made a circle from 343CQRs, it would be just two meters in diameter. If you have the room for three of them, they’ll wrap around almost 180 degrees. They would make a great flight simulator or, perhaps, a solid solution for a Zwift (cycling virtual training app) setup.
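Those numbers follow directly from the 1000R geometry. A quick sketch, treating the 795mm panel width from the spec table as a chord of the 1-meter-radius circle:

```python
import math

# Geometry sketch of the 1000R spec: radius 1000mm, so a full circle of these
# panels would be two meters across.
radius_mm = 1000
print(f"Full-circle diameter: {2 * radius_mm / 1000:.1f} m")

# Treat the 795mm panel width (from the spec table) as a chord of that circle.
chord_mm = 795
arc_rad = 2 * math.asin(chord_mm / (2 * radius_mm))  # angle subtended by the chord
print(f"One panel subtends ~{math.degrees(arc_rad):.0f} degrees")
print(f"Panels per full circle: ~{2 * math.pi / arc_rad:.1f}")
```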
The back of the MSI MPG Artymis 343CQR is styled nicely with a variety of different textures and an RGB effect that shows as a strip and MSI shield graphic with a dragon. The color breathes gently through different shades. You can turn it on and off in the OSD and control it even further with the Gaming OSD app. You can also sync up the lighting effect with that of other MSI products that support the vendor’s Mystic Light-branded RGB. That way, you can create a custom light show with everything working in concert.
The input panel includes two HDMI 2.0 ports that support refresh rates up to 100 Hz with Adaptive-Sync and HDR. Meanwhile, the DisplayPort 1.4 and USB-C inputs accept 165 Hz signals, also with HDR and Adaptive-Sync. There are no built-in speakers, but you get a 3.5mm audio port for headphones.
OSD Features of MSI MPG Artymis
Pressing the joystick brings up the MSI MPG Artymis 343CQR’s OSD, which is divided into seven sub-menus. There are plenty of gaming features as well as most of what you’ll need for calibration.
The Gaming menu offers five picture modes. Four are game genre-specific, and there’s also the default mode, User. User’s the mode to pick because it delivers solid accuracy with no need for calibration. There are a few minor flaws, but the 343CQR definitely makes our Calibration Not Required list.
The Night Vision option is a shadow detail enhancer. We didn’t need it because the monitor’s black levels are both deep and richly detailed. Response Time is a three-level overdrive. Fast, the middle setting, is best. Next, MPRT is a backlight strobe that reduces motion blur and cancels out Adaptive-Sync.
MPRT also pegs the brightness at over 860 nits, which is unusual. You can reduce this with the contrast control, but that removes much of the picture’s depth and quality. We recommend sticking with Adaptive-Sync and leaving MPRT off. Finally, Zero Latency should always be turned on for the lowest possible input lag.
Additional features include a frame rate indicator, alarm clock, aiming points and an Optix Scope feature. The latter is geared toward fans of first-person shooters and lets you magnify the area underneath your crosshair in multiple levels using hotkeys. As this will take some finessing to execute smoothly and without slowing down your gameplay, it won’t be for everyone.
The OSD will always show you the MSI MPG Artymis 343CQR’s signal status at the top with resolution, refresh rate, HDR status, FreeSync status and the active video input.
The Image menu offers three color temperature presets, plus a User mode. Normal is the default and best choice. We were unable to make a visual improvement to the color temp with calibration. The test numbers show a tiny gain but not one that can be seen with the naked eye. The only thing we wished for was a gamma control. The default luminance curve is a tad light, though that’s somewhat mitigated by the 343CQR’s extremely high contrast.
Calibration Settings of MSI MPG Artymis 343CQR
You can simply unpack the MSI MPG Artymis 343CQR, plug it in and enjoy. The image is very accurate by default — even the brightness is already set close to 200 nits in the User picture mode. We attempted a calibration and made no visible improvement.
Our settings are below if you want to try them. Note that in the User color temp, the RGB sliders start at 50%, which reduces brightness by roughly that amount. We turned them all up to 100%, then adjusted from there to achieve maximum dynamic range.
Picture Mode: User
Brightness (200 nits): 49
Brightness (120 nits): 6 (min. 109 nits)
Contrast: 70
Color Temp User: Red 100, Green 93, Blue 93
HDR signals lock out all picture controls. You can still access the modes, but changing them does not affect the image. HDR grayscale runs a tad red, but the EOTF is spot-on, as is the color tracking. The 343CQR also uses dynamic contrast to achieve tremendous contrast for HDR content.
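For reference, the EOTF being tracked here is the HDR10 PQ curve (SMPTE ST 2084). A minimal sketch of the decode direction, mapping an encoded signal value in [0,1] to nits:

```python
# PQ (SMPTE ST 2084) EOTF: encoded signal E in [0,1] -> absolute luminance in nits.
def pq_eotf(e: float) -> float:
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    ep = e ** (1 / m2)
    y = (max(ep - c1, 0.0) / (c2 - c3 * ep)) ** (1 / m1)
    return 10000 * y  # PQ is defined relative to a 10,000-nit peak

print(f"{pq_eotf(1.0):.0f} nits")    # full-scale signal -> 10000 nits
print(f"{pq_eotf(0.508):.0f} nits")  # ~100-nit SDR reference white lands near code 0.508
```

When a reviewer says the EOTF is "spot-on," it means the display's measured output tracks this curve until it rolls off near the panel's real peak brightness.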
Gaming and Hands-on with MSI MPG Artymis 343CQR
At 1000R, the MSI MPG Artymis 343CQR is as curvy as a gaming monitor gets today. At first, we noticed a little image distortion when working in productivity apps, like word processors and spreadsheets. However, we got used to the look after a short time.
When browsing the web, that distortion became unnoticeable. The monitor’s image is sharp and contrast-y enough to overshadow any horizontal line curvature. It’s best to set the panel exactly vertical with no back or forward tilt. Adjusting the height so our eyes were centered made all parts of the screen equidistant from our body. The 343CQR is perfectly usable for workday tasks.
Color was nicely balanced with slightly more than sRGB saturation, but not so much that it looked unnatural. MSI has tuned the gamut so it renders SDR content reasonably accurately without the need to switch color spaces, which is just as well, since the MSI MPG Artymis 343CQR lacks an sRGB mode. When HDR was on, color looked far more vibrant, as it should. This is one of the few monitors that you could leave in HDR mode all the time for Windows apps. Brightness is reasonable, with the highest levels reserved only for small highlights.
The monitor also supports 10-bit color, though the panel uses Frame Rate Control (FRC) to achieve this. Despite the internal upconversion, we didn’t see any banding artifacts.
Gaming tests started with our usual trip through Tomb Raider, which clipped along at a sprightly 165 fps on a Radeon RX 5700 XT and GeForce RTX 3090. Both FreeSync and G-Sync worked without a hitch. The MSI MPG Artymis 343CQR’s middle overdrive setting, Fast, struck the best balance between ghosting and blur reduction. The MPRT backlight strobe feature also worked well at reducing blur without artifacts but at the cost of a very bright and overly harsh image. Playing games at over 800 nits peak grew tiring after a short time.
Engaging HDR for a few hours of Call of Duty: WWII proved to be a singular experience. The MSI MPG Artymis 343CQR comes close to equaling a FALD display when it comes to HDR contrast and color. Every hue, down to the murkiest greens and browns, leapt from the screen. Black levels seemed almost OLED-like in their depth and detail, offset by perfectly balanced highlight areas. Color accuracy was also top-notch. Though we noted a slight red tint during the grayscale tests, it did not affect the games or movies we played. This is one of the best HDR monitors we’ve seen in a while.
If you download MSI’s Dragon Center software, you can also use the 343CQR’s Sound Tune feature, which uses “AI calculations” to block out background noise coming through a plugged-in headset. Since it requires extra software, and many of the best gaming headsets include similar tech on their own, its usefulness will vary depending on the gamer.
Another unique feature comes in what MSI calls Mobile Projector. It lets you display your phone’s screen in a 5:9 column on the side of the monitor. Although having your phone on your computer screen could generally be distracting, if you have a specific task that requires using both your smartphone and PC, this could come in handy.
Intel may offer more information about its upcoming products soon. The company’s hosting a session at GDC Showcase that promises to offer a “first look at the new Tiger Lake H-series notebook and Rocket Lake desktop processors.”
It’s not clear what exactly Intel plans to share at GDC Showcase, which is essentially the pre-show for Game Developers Conference 2021, especially since we got our “first looks” at Tiger Lake and Rocket Lake in September and October 2020.
We already know Tiger Lake is supposed to introduce a new ultraportable gaming segment; that models with four, six, and eight cores will be available; and that Intel claims these processors will outperform AMD’s Ryzen 4000-series “Renoir” chips.
Intel’s also claimed that manufacturers have already built more than 150 products around Tiger Lake-H processors, and even though the line is supposed to be limited to notebooks, ASRock’s already planning to use the chips in several NUC models.
We also know Rocket Lake is supposed to help Intel claim more spots on our list of the best CPUs with a claimed peak boost speed of 5.3GHz, the introduction of the Cypress Cove architecture, and the inclusion of 12th-gen Xe LP Graphics.
GDC Showcase might have been a good time for Intel to announce Rocket Lake retail availability, but the company’s already said enthusiasts should be able to get their hands on the new CPUs on March 30. (Assuming they haven’t already bought some.)
But that doesn’t mean Intel will show up to GDC Showcase empty-handed. We’re still awaiting official specs for eight-core Tiger Lake models, for example, and so far the only information we have about Rocket Lake pricing has come from retailer leaks.
So far as what Intel’s said about its plans: The session will purportedly help viewers “learn how Intel empowers software developers with the latest tools and technology helping to make the best gaming and content creation experiences possible.”