Qualcomm has announced a new 700-series chipset for mobile devices: the Snapdragon 778G 5G. It will start appearing in premium midrange phones from manufacturers including Motorola, Xiaomi, Realme, Honor, Oppo, and iQOO in the next few months, bringing with it some video capture and AI capabilities borrowed from the Snapdragon 888, the current chipset of choice for flagship Android phones. The company has also made a couple of other announcements today designed to get 5G connectivity into more tech.
The Snapdragon 778G offers three image-signal processors, or ISPs — a feature Qualcomm touted in its flagship 888 chipset and one that also appears in the higher-end 780G. This makes it possible to capture photos and video from three different cameras at once. You can easily switch between different cameras’ video feeds during recording, à la Samsung’s Director’s View on the Snapdragon 888-powered S21 series.
The processor also supports cameras with staggered HDR sensors, like the 50-megapixel chip in the Xiaomi Mi 11 Ultra, for better HDR video recording. The 778G also includes GPU efficiency improvements for mobile gaming, along with better noise suppression and camera processing on video calls. Both mmWave and sub-6GHz 5G are supported, as is Wi-Fi 6.
In “more 5G in more places” news, Qualcomm is also making M.2 reference designs available to current OEM customers of its X65 and X62 5G modems, which makes it easier for laptop, desktop, gaming, and IoT manufacturers to incorporate 5G connectivity into their products. The company is also debuting an updated version of the X65 5G modem, which Qualcomm says is more energy efficient and offers wider mmWave frequency support. It will start appearing in commercial mobile devices later this year, the company says.
Summer Game Fest is back this year, and it will start on June 10th with an event called “Kick Off Live!” that’s billed as a “spectacular world premiere showcase” with “more than a dozen” world premieres and announcements. The show, which will begin at 2PM ET, will be hosted by Geoff Keighley, who you might also know as the host of The Game Awards.
Kick Off Live! is just the first of many events that will be part of Summer Game Fest. Some of the publishers confirmed to be participating in Summer Game Fest include 2K, Activision, Blizzard, Capcom, Epic Games, Sony PlayStation, Riot Games, Square Enix, Ubisoft, and Microsoft Xbox.
This first event will also feature a performance by Weezer, “who will debut a brand new, stream safe game soundtrack song that can be freely streamed on Twitch, YouTube and anywhere else without being blocked or losing monetization,” according to a press release. (So it seems like we won’t see a repeat of what happened with Metallica’s performance at BlizzCon.)
Summer Game Fest launched last year to let publishers showcase their upcoming games after some of the industry’s biggest events were restricted or canceled due to the COVID-19 pandemic. Many gaming events have been affected by the pandemic this year as well, forcing some, like GDC and E3, to shift to digital formats.
Exclusive: After the Galaxy Z Fold and Galaxy Z Flip, Samsung is preparing to introduce its first rollable smartphone, the Samsung Galaxy Z Roll.
Ahead of the Display Week 2021 conference, Samsung Display showed off its new generation of OLED screens yesterday, including an S Foldable, a Slidable and a Rollable device. So far, these have only been demonstrated by Samsung’s display division, which leaves two questions: when will Samsung Electronics integrate these new types of screens into a product for the first time, and what will those devices be called?
Today we can already reveal the name of Samsung’s upcoming rollable smartphone, as Samsung Electronics has filed a notable trademark application with the European Union Intellectual Property Office (EUIPO).
Samsung Z Roll rollable smartphone
On May 18, 2021, Samsung Electronics filed a trademark for “Z Roll”. The application is categorized as Class 9 and comes with the following description.
Samsung Z Roll trademark description: “Smartphones; mobile telephones; tablet computers; telecommunication apparatus; electronic pens for smartphones and tablet computers”.
Based on the name “Roll”, it is very likely that this device will have a rollable display. The “Z” seems to refer to the series; all Samsung smartphones with a folding screen are housed in the Galaxy Z series, and this application shows that Samsung intends to place its rollable smartphones within the same series.
The name does not come as a complete surprise. In November last year, we already suggested that Z Roll would be a very appropriate name for Samsung’s rollable smartphone. After the Galaxy Z Fold and the Galaxy Z Flip, this time it seems to be the turn of the Samsung Galaxy Z Roll.
For the time being, it remains unclear when Samsung will announce its first smartphone with a retractable display. Perhaps we will hear more about this futuristic device in August 2021, during the Galaxy Unpacked event where the Galaxy Z Fold 3 and Galaxy Z Flip 3 foldable smartphones are also expected. However, it will likely take until 2022 before the Z Roll gets released.
From the images released by Samsung Display, it can be concluded that the slidable smartphone is a regular-sized smartphone in its most compact form. In portrait mode, the screen can be pulled out to the right, enlarging the screen area by approximately 30%. The extra screen area can be used, among other things, to display system icons or a messaging app.
Over the years, LetsGoDigital has reported several times on a Samsung phone with a retractable screen. In mid-2019, for example, we tracked down a patent for a Samsung Galaxy smartphone that could be extended in width – as shown in the image below.
At the beginning of this year, we also reported on a slide smartphone from Samsung where the screen could be enlarged by about 30%. Rumors have been going on for some time about a rollable Samsung smartphone. At the beginning of 2020, a retractable Samsung Galaxy phone was already shown to a limited group of people during CES.
Samsung is not the only manufacturer that sees a future in rollable phones. For a long time it was thought that the LG Rollable would become the first slidable smartphone, but development of that device has been discontinued now that LG Electronics has officially stopped producing smartphones.
Oppo also showed a retractable phone last year. However, the Oppo X 2021 was just a concept, and the company has indicated that it has no plans yet to actually release the device. More recently, TCL also showed a special Fold ‘n Roll concept smartphone. It is certainly not inconceivable that Samsung will become the very first manufacturer to actually release a rollable smartphone.
Here you can take a look at the trademark of Samsung Electronics for Z Roll.
For those wondering why the name “Samsung Electronics” is not mentioned in the PDF: it appears that Samsung simply has not yet paid the application fee, which is a first. In our system, however, the name “Samsung Electronics” immediately comes up as the rightful owner of this trademark. In addition, the application was submitted by the Spanish intermediary Abril Abogados, a firm that has been responsible for filing European trademarks for the South Korean manufacturer before, including the name Samsung Z Fold.
The Alienware m15 Ryzen Edition R5 is so good that it makes us wonder why Dell didn’t team up with AMD on a laptop sooner.
For
+ Strong gaming performance
+ Excellent productivity performance
+ Unique chassis
+ Not too costly for its power
Against
– Internals run hot
– Middling audio
– Bad webcam
It’s been 14 years since Alienware used an AMD CPU in one of its laptops, but AMD’s recent Ryzen processors have proven to be powerhouses with a strong fanbase among gamers. It also doesn’t hurt that AMD-based laptops have frequently undercut Intel in price. Point being, times have changed, and now Team Red can easily compete with the best gaming laptops that Intel has to offer.
So it makes sense that AMD has finally been granted permission to board Dell’s UFO. And with the Alienware m15 Ryzen Edition R5, it’s getting first-class treatment.
Alienware m15 Ryzen Edition R5 Specifications
CPU: AMD Ryzen 7 5800H
Graphics: Nvidia GeForce RTX 3060 6GB GDDR6, 1,702 MHz Boost Clock, 125 W Total Graphics Power
Memory: 16GB DDR4-3200
Storage: 512GB M.2 PCIe NVMe SSD
Display: 15.6-inch, 1920 x 1080, 165Hz, IPS
Networking: 802.11ax Killer Wi-Fi 6, Bluetooth 5.2
Ports: USB-A 3.2 Gen 1 x 3, HDMI 2.1, USB-C 3.2 Gen 2 x 1 (DisplayPort), RJ-45 Ethernet, 3.5mm combination headphone/microphone port
Camera: 720p
Battery: 86 WHr
Power Adapter: 240W
Operating System: Windows 10 Home
Dimensions (W x D x H): 14.02 x 10.73 x 0.9 inches (356.2 x 275.2 x 22.85 mm)
Weight: 5.34 pounds (2.42 kg)
Price (as configured): $1,649
Design of the Alienware m15 Ryzen Edition R5
(Image credit: Tom’s Hardware)
Unlike other recent Alienware laptops, the m15 R5 Ryzen Edition only comes in black; the “lunar light” white isn’t an option here. Still, it’s a bold design that puts the emphasis on the laptop’s build quality rather than on decoration, and it pays off. The m15 R5 feels sturdy in the hand, and its smooth edges give it a premium feel. It’s not too plain either, since lighting options for the Alienware logo on the lid plus a circular LED strip along the back rim add a touch of flair. On that note, the stylized “15” on the lid is stylish, though it can look a bit too much like a “13” from the wrong angle.
Hexagonal vents that sit above the keyboard and along the back also give the m15 R5 a bit of functional decoration and help make up for the small and well hidden side vents. The keyboard on this model has four-zone RGB, but it can be a little dim in well-lit areas.
This laptop veers toward the large and heavy end for systems with an RTX 3060. At 14.02 x 10.73 x 0.9 inches and 5.34 pounds, it’s generally bulkier than the Asus TUF Dash F15 we reviewed, which has a mobile RTX 3070, measures 14.17 x 9.92 x 0.78 inches and weighs 4.41 pounds. The Acer Predator Triton 300 SE, which manages to fit a mobile RTX 3060 into a 14-inch device, is also especially impressive next to the m15 R5. Granted, both of those use lower-power processors designed for thinner machines. Specifically, the Acer measures 12.7 x 8.97 x 0.70 inches and weighs 3.75 pounds.
The Alienware m15 R4, which has a 10th gen 45W Intel Core i7 processor and an RTX 3070, is 14.19 x 10.86 x 0.78 inches large and weighs 5.25 pounds. That leaves it not as bulky as the m15 Ryzen Edition R5, but about as heavy.
Port selection is varied, although distribution differs from my usual preferences. The left side of the laptop only has the Ethernet port and the 3.5mm headphone/microphone jack, which is a shame as that’s where I typically like to connect my mouse. The back of the laptop has a few more connections, including the DC-in, an HDMI 2.1 port, a USB 3.2 Gen 1 Type-A port and a USB 3.2 Gen 2 Type-C port that also supports DisplayPort. The right side of the laptop has two additional USB 3.2 Gen 2 Type-A ports.
Gaming Performance on the Alienware m15 Ryzen Edition R5
(Image credit: Tom’s Hardware)
Our review configuration of the Alienware m15 Ryzen Edition R5 came equipped with an 8-core, 16-thread Ryzen 7 5800H CPU and an RTX 3060 laptop GPU. It’s the first time we’ve tested a 45W CPU with an RTX 3060, so we’ve decided to compare it to one 35W laptop with an RTX 3070 GPU, the Asus TUF Dash F15 with an Intel Core i7-11370H, and one 35W laptop with an RTX 3060 GPU, the Acer Predator Triton 300 SE with an Intel Core i7-11375H. We’ve also thrown the Alienware m15 R4 into the mix, which has a 45W 10th gen Intel CPU and an admittedly more powerful RTX 3070, plus a significantly higher price tag than any other competitor even on its cheapest configuration (the thing starts at $2,149).
I played Control on the Alienware laptop for a half hour to get a personal feel for gaming on the system. I tended to fall between 60 and 70 fps at high settings throughout, and turning ray tracing on with its high preset dropped that to 30 – 40 fps. The fans are certainly noticeable but aren’t ear-splitting, and the laptop neither got hot to the touch nor sprayed hot air on my hands.
In Shadow of the Tomb Raider’s benchmark running at highest settings, the m15 Ryzen Edition R5’s CPU seemed to do it a favor, as its 73 fps average only barely fell behind the m15 R4’s 77 fps average. The Acer laptop was next in line with 61 fps, while the Asus laptop was significantly behind all other options at 54 fps.
Scores were a bit more even in Far Cry: New Dawn’s benchmark running at ultra settings. While the m15 R4 hit 91 fps, everything else was in the 70s. The m15 Ryzen Edition R5 had an average of 79 fps, while the Asus scored 74 fps and the Acer reached 73 fps.
The m15 Ryzen Edition R5 fell to third place in the Grand Theft Auto V benchmark running at very high settings, where it hit an 82 fps average and the Asus laptop achieved an 87 fps average. The Acer laptop was significantly behind at 72 fps, while the m15 R4 was significantly ahead at 108 fps.
Red Dead Redemption 2’s benchmark running at medium settings saw the m15 Ryzen Edition R5 once again stay in third place, though by a more significant margin this time. The R5 achieved a 53 fps average, while the Asus took second place with a 61 fps score. The Acer was once again behind at 48 fps, while the m15 R4 stayed ahead at 69 fps.
We also ran the Alienware M15 R5 Ryzen Edition through the Metro Exodus RTX benchmark 15 times in a row to test how well it holds up to a sustained heavy load. During this benchmark, it hit an average 56 fps. The CPU ran at an average 3.63-GHz clock speed while the GPU ran at an average clock speed of 1.82 GHz. The CPU’s average temperature was 90.36 degrees Celsius (194.65 degrees Fahrenheit) and the GPU’s average temperature was 82.02 degrees Celsius (179.64 degrees Fahrenheit).
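For readers curious how sustained-load figures like these are derived, the short sketch below shows one way to average clock and temperature samples from a sensor log captured during a looped benchmark. The CSV file name and column names are hypothetical, purely for illustration; this is not HWiNFO’s export format or our internal tooling.

```python
# Hypothetical example: average CPU/GPU clocks and temperatures from a sensor log
# captured during a looped benchmark run (e.g. 15 passes of Metro Exodus).
# The file name and column names below are illustrative only.
import csv
from statistics import mean

def summarize(log_path):
    with open(log_path, newline="") as f:
        rows = list(csv.DictReader(f))
    return {
        "avg_cpu_clock_ghz": mean(float(r["cpu_clock_mhz"]) for r in rows) / 1000,
        "avg_gpu_clock_ghz": mean(float(r["gpu_clock_mhz"]) for r in rows) / 1000,
        "avg_cpu_temp_c": mean(float(r["cpu_temp_c"]) for r in rows),
        "avg_gpu_temp_c": mean(float(r["gpu_temp_c"]) for r in rows),
    }

if __name__ == "__main__":
    print(summarize("metro_exodus_15_runs.csv"))
```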
Productivity Performance for the Alienware m15 Ryzen Edition R5
(Image credit: Tom’s Hardware)
While Alienware is a gaming brand, the use of a 45W AMD chip does open the Alienware m15 Ryzen Edition R5 up to high productivity potential.
On Geekbench 5, which is a synthetic test for tracking general PC performance, the m15 Ryzen Edition R5 hit 1,427 points on single-core tests and 7,288 points on multi-core tests. While its single core score was on the lower end when compared to the Asus TUF Dash F15’s 1,576 points and the Acer Predator Triton 300 SE’s 1,483 points, the Alienware blew those laptops away on multi-core scores. The Asus’ multi-core score was 5,185, while the Acer’s multi-core score was 5,234.
The Alienware m15 R4 was a bit more even with its AMD cousin, scoring 1,209 on single-core Geekbench 5 tests and 7,636 on the program’s multi-core benchmarks.
Unfortunately, the m15 Ryzen Edition R5 couldn’t maintain that momentum in our 25GB file transfer benchmark. Here, it transferred files at 874.14 MBps, while the Asus hit 1,052.03 MBps and the Acer reached 993.13 MBps. The m15 R4 hit speeds of 1,137.34 MBps.
The m15 Ryzen Edition R5 was the fastest contender in our Handbrake video encoding test, though, where we track how long it takes a computer to transcode a video from 4K down to FHD. The m15 Ryzen Edition R5 completed this task in 7:05, while the Asus took 10:41 and the Acer was even slower at 11:36. The m15 R4 almost caught up to its AMD cousin with a time of 7:07.
Display for the Alienware m15 Ryzen Edition R5
(Image credit: Tom’s Hardware)
Our configuration of the Alienware m15 Ryzen Edition R5 came with a 15.6-inch, 1920 x 1080 IPS display with a 165Hz refresh rate. While the panel kept up impressively in games and posted strong benchmark results, it still proved problematic for viewing content.
I watched the trailers for Nomadland and Black Widow on the m15 Ryzen Edition R5, where I found the blacks to be shallow and the viewing angles to be restrictive. In my office during the daytime, I couldn’t easily see the screen’s picture unless I was sitting directly in front of it. Turning my lights off and closing my curtain only extended viewing angles to about 30 degrees. Glare also proved to be an issue in the light, although turning lights off did fix this problem.
Colors were bright enough to pop occasionally but not consistently, with bolder tones like reds and whites holding up better than more subdued ones. Here, Black Widow came across a bit more vividly than the naturalistic style of Nomadland, so this screen might be better suited for more colorful, heavily produced films.
Our testing put the m15 Ryzen Edition R5’s color range above its closest competitors, the Asus TUF Dash F15 and Acer Predator Triton 300 SE, though not by much. With an 87.3% DCI-P3 color gamut, it’s only slightly ahead of the Asus’ 80.6% DCI-P3 score. The Acer showed a starker difference, with a 78.5% DCI-P3 color gamut.
Our brightness testing saw the Alienware pull a more solid lead. With an average of 328 nits, it easily surpassed the Acer’s 292 nits and the Asus’ 265 nits.
The Alienware m15 R4 blew all of these systems out of the water, although the OLED screen our configuration had makes the comparison more than a bit unfair. Its DCI-P3 gamut registered at 150% while its average brightness was 460.2 nits.
To test the m15 Ryzen Edition R5’s 165Hz screen, I also played Overwatch on it. Here, I had a much more pleasant experience than I did when watching movie trailers. The game’s bright colors appeared quite vivid, and the fast refresh rate was perfectly able to keep up with the 165 fps I was hitting on Ultra settings.
Keyboard and Touchpad on the Alienware m15 Ryzen Edition R5
The Alienware m15 Ryzen Edition R5 configuration we received has a 4-zone RGB membrane keyboard, though other configurations do offer mechanical switches made in collaboration with Cherry. You can currently get that upgrade for an additional $98.
The membrane nature of this keyboard doesn’t stop it from impressing, though. Keys have a noticeable resistance when pressed, and 1.7mm of key travel gives you plenty of tactile feedback. I consistently scored around 83 words per minute on the 10fastfingers.com typing test, which is impressive as my average is usually around 75 wpm.
In an unusual choice, the Alienware’s audio control keys sit in a dedicated column on the keyboard’s far right rather than being mapped to the Fn row as secondary functions. Instead, the Page Up and Page Down keys that would normally be found there are secondary functions on the arrow keys.
The 4.1 x 2.4-inch touchpad doesn’t fare as well. While it has precision drivers and is perfectly smooth when scrolling with one finger, I felt too much friction when using multi-touch gestures to pull them off comfortably or consistently. For instance, when trying to switch apps with a three-fingered swipe, I would frequently accidentally pinch zoom instead.
Audio on the Alienware m15 Ryzen Edition R5
The Alienware m15 Ryzen Edition R5 has two bottom firing speakers that are loud with surprisingly decent bass, but tend to get tinny on higher notes.
I tested the m15 Ryzen Edition R5’s audio by listening to Save Your Tears by The Weeknd, which easily filled up my whole two bedroom apartment with sound. I was also surprised to be able to hear the strum of the song’s bass guitar, as it’s not uncommon for other laptops to either cut it out, make it quiet, or give it a more synth-like quality. Unfortunately, higher notes suffered from tinniness and echo.
Upgradeability of the Alienware m15 Ryzen Edition R5
The Alienware m15 Ryzen Edition R5 is easy to open and has plenty of user customizability. Just unscrew the four screws closest to the back of the laptop, then loosen the four screws on the front (we used a PH0 Phillips Head bit).
Gently pry the case off, and you’ll see the networking card, two swappable DIMMs of RAM, the M.2 SSD and a second, open M.2 SSD slot (if you don’t buy the laptop with dual SSDs).
The only tradeoff here is that the SSDs are in a smaller, less common M.2 2230 form factor (most are 2280), so you’ll probably need to buy a specialized drive for this laptop.
Battery Life on the Alienware m15 Ryzen Edition R5
(Image credit: Tom’s Hardware)
The Alienware m15 Ryzen Edition R5 is a power hog, with half the non-gaming battery life of the RTX 3060 and RTX 3070 35W laptops we tested it against. This shouldn’t come as too much of a surprise, since it also has a 45W CPU, but don’t expect to be able to spend too much time away from an outlet.
In our non-gaming battery test, which continually streams video, browses the web and runs OpenGL tests over Wi-Fi at 150 nits of brightness, the M15 Ryzen Edition R5 held on for 3:29. That’s about 3 hours less time than we got out of both the Asus TUF Dash F15, which had a 6:32 battery life, and the Acer Predator Triton 300 SE, which lasted for 6:40.
The Alienware m15 R4, with its 45W Intel chip, also had a shorter battery life than our 35W laptops, though it was slightly longer than the m15 Ryzen Edition R5’s. It lasted 4:01 on our non-gaming test.
Heat on the Alienware m15 Ryzen Edition R5
The Alienware m15 Ryzen Edition R5’s surface temperature was impressively cool during non-gaming use but could get toasty in select areas during our gaming benchmarks. For our tests, we measured its temperature both after 15 minutes of streaming video and during the sixth consecutive run of the Metro: Exodus extreme benchmark.
The laptop’s touchpad proved coolest during the video test, registering 81.1 degrees Fahrenheit. This was only slightly cooler than the center of the keyboard, which hit 85.5 degrees Fahrenheit between the G and H keys. The bottom of the laptop was warmer, hitting 90.9 degrees, although the center-left of the display hinge was hottest, registering 101.1 degrees Fahrenheit.
Our gaming test saw a mild jump in temperatures in all areas except the bottom and the hinge, where numbers spiked much higher. The touchpad was 83.3 degrees Fahrenheit and the center of the keyboard was 90.9 degrees Fahrenheit. By contrast, the bottom of the laptop was now 121.5 degrees Fahrenheit and the hot zone on the hinge was now 136.1 degrees Fahrenheit.
Despite these higher numbers, though, the laptop never became too hot to touch while gaming. It did feel pleasantly warm, however.
Webcam on the Alienware m15 Ryzen Edition R5
The Alienware m15 Ryzen Edition R5’s 720p webcam is, like those on many premium gaming laptops, a bit of an afterthought. Regardless of lighting conditions, its shots always have a blocky and fuzzy appearance. Adding light also adds a distracting halo effect to silhouettes, while dimming your surroundings will just bring down detail even further.
Software and Warranty on the Alienware m15 Ryzen Edition R5
The Alienware m15 Ryzen Edition R5 comes packed with software, although most of it serves a genuinely useful purpose.
Most of these are apps like Alienware Command Center, which lets you customize lighting and thermals as well as set up macros. Some are less useful than others — Alienware Customer Connect simply exists to get you to fill out surveys — but apps like Alienware Mobile Connect, which lets you easily mirror your phone’s screen, transfer its files or take phone calls from your laptop, are definite standouts. It might be easier to navigate these functions if they were all centralized into one hub app rather than being their own standalone programs, though. My Alienware tries to be this hub app, although it’s mostly just a redirect to Alienware Command Center with a bunch of ads on the side.
This laptop also comes with typical Windows pack-ins like Microsoft Solitaire Collection and Spotify. Its default warranty is limited to one year, although you can extend it at checkout.
Configurations for the Alienware m15 Ryzen Edition R5
Our configuration of the Alienware m15 Ryzen Edition R5 came with an AMD Ryzen 7 5800H CPU, an RTX 3060 laptop GPU, 16GB of RAM, a 512GB SSD and a 1920 x 1080, 165Hz display for $1,649. That actually puts it towards the lower end of what’s available.
You can upgrade this laptop’s CPU to the Ryzen 9 5900HX, which has the same thread count but boosts up to 4.6 GHz, and its GPU to an RTX 3070 laptop card. Memory options range from 8GB to 32GB, while storage options range from 256GB to 2TB. You can also add on an additional SSD with the same range of options, making for up to 4TB of total combined storage.
There’s also a 360Hz version of the FHD display available, as well as a QHD version with a 240Hz refresh rate and G-Sync support.
Perhaps the most interesting option that wasn’t included on our configuration is the mechanical keyboard, which features physical ultra low-profile switches made in collaboration with Cherry MX.
These upgrades can raise your price up to $2,479, with the display and keyboard upgrades being the most costly components in Dell’s customization tool. The Cherry MX keyboard will add $98 to your price at checkout, while the QHD display costs $78. The FHD @ 360Hz display is only available on the highest preset option, which locks you into a Ryzen 9 5900HX chip and starts at $2,332.
By contrast, the low end of this laptop starts at $1,567.
Bottom Line
(Image credit: Tom’s Hardware)
The Alienware m15 Ryzen Edition R5 proves that Team Red and Alienware make a strong pairing. While it’s not quite the beast that the minimum-$2,149 Alienware m15 R4 is, it still delivers performance that matches and sometimes beats peers in its price range in most titles, all while rocking Alienware’s unique premium looks. At $1,649 for our configuration, it’s an easy premium choice over the $1,450 Asus TUF Dash F15. And if you prefer power over size, it’s also a better option for you than the $1,400 Acer Predator Triton 300 SE.
While it’s certainly not the most portable contender and could do with more even port distribution and stronger audio, its 45W CPU lends it just enough of an edge on power to make it a solid first step into Dell’s flagship gaming brand.
After almost a decade of total market dominance, Intel has spent the past few years on the defensive. AMD’s Ryzen processors continue to show improvement year over year, with the most recent Ryzen 5000 series taking the crown of best gaming processor: Intel’s last bastion of superiority.
Now, with a booming hardware market, Intel is preparing to make up some of that lost ground with the new 11th Gen Intel Core Processors. Intel is claiming these new 11th Gen CPUs offer double-digit IPC improvements despite remaining on a 14 nm process. The top-end 8-core Intel Core i9-11900K may not be able to compete against its Ryzen 9 5900X AMD rival in heavily multi-threaded scenarios, but the higher clock speeds and alleged IPC improvements could be enough to take back the gaming crown. Along with the new CPUs, there is a new chipset to match, the Intel Z590. Last year’s Z490 chipset motherboards are also compatible with the new 11th Gen Intel Core Processors, but Z590 introduces some key advantages.
First, Z590 offers native PCIe 4.0 support from the CPU, which means the PCIe and M.2 slots powered off the CPU will offer PCIe 4.0 connectivity when an 11th Gen CPU is installed. The PCIe and M.2 slots controlled by the Z590 chipset are still PCIe 3.0. While many high-end Z490 motherboards advertised this capability, it was not a standard feature for the platform. In addition to PCIe 4.0 support, Z590 offers USB 3.2 Gen 2x2 from the chipset; the standard offers speeds of up to 20 Gb/s. Finally, Z590 boasts native support for 3200 MHz DDR4 memory. With these upgrades, Intel’s Z-series platform has feature parity with AMD’s B550. On paper, Intel is catching up to AMD, but only testing will tell if these new Z590 motherboards are up to the challenge.
The BIOSTAR Z590 Valkyrie features a massive VRM built with top-of-the-line 90 A power stages. BIOS flashback has also been included, as has a dual BIOS. Along with the heavy-duty VRM design, the BIOSTAR Z590 Valkyrie features a unique black-and-gold aesthetic, 2.5 Gb/s LAN from Realtek, and more. Let’s take a closer look at what the BIOSTAR Z590 Valkyrie has to offer!
Rear I/O:
2x Wi-Fi antenna ports
1x PS/2 keyboard/mouse port
1x HDMI port (HDMI 2.0)
1x DisplayPort (DP 1.4)
1x USB 3.2 Gen 2x2 Type-C port
5x USB 3.2 Gen 2 ports
2x USB 3.2 Gen 1 ports
1x 2.5 GbE LAN port
5x audio jacks
1x S/PDIF out
For the next two weeks, the virtual realm of Roblox is getting a dose of high fashion. Currently, fashion house Gucci is hosting an artsy garden space, which will be available for Roblox users to explore starting today through to May 31st. The space was conceived as a virtual counterpart to a real-world installation called the Gucci Garden Archetypes, which is taking place in Florence, Italy.
Both the virtual and IRL spaces are described as multimedia experiences, which are divided into a series of themed rooms with names like “urban romanticism” and “Tokyo tribe.” It all sounds pretty surreal. Here’s the official description of what you’ll be doing in the digital version:
As they enter the Gucci Garden experience, visitors will shed their avatars becoming a neutral mannequin. Without gender or age, the mannequin symbolizes that we all begin our journeys through life as a blank canvas. Wandering through the different rooms, visitors’ mannequins absorb elements of the exhibition. With every person experiencing the rooms in a different order and retaining different fragments of the spaces, they will emerge at the end of their journey as one-of-a-kind creations, reflecting the idea of individuals as one among many, yet wholly unique.
It may sound like a strange combination, but over the last year, fashion and gaming have become much more closely connected. Burberry launched in-game outfits for a Chinese strategy game, Balenciaga made a post-apocalyptic world in place of a traditional runway show, and Louis Vuitton dressed up a virtual hip-hop group made up of League of Legends characters. In fact, this isn’t even Gucci’s first foray into the space; the company previously made a pair of virtual sneakers.
Just remember: Roblox is definitely not a video game.
Intel is starting to get its legs again. The company, which initially had issues with its 10nm chips, has released its first eight-core, 10nm Tiger Lake-H processors that are ready for gaming and high-end productivity notebooks.
For its 10th gen chips, Intel used a 10nm process (“Ice Lake”) for ultrabooks but used a 14nm chip (“Comet Lake”) for these enthusiast machines. Now it’s time to see what Intel’s 10nm SuperFin chips can do on the high end. Like the U-series Tiger Lake chips, these use Willow Cove execution cores paired with a UHD Graphics 750 engine that’s powered by Intel’s Xe architecture.
It comes at a crucial time. AMD’s Ryzen 5000 series (“Cezanne,” on a 7nm process) has proven powerful and, among gamers, popular. During current hardware shortages, some of the best gaming laptops have been nearly impossible to find. Intel claims that it has already shipped more than 1 million of these chips to its partners and that they will appear in more than 80 different laptop designs.
The 11th Gen H-series processors include Thunderbolt 4 (and USB 4) and Resizable BAR support, and are notably Intel’s first eight-core laptop chips that work with PCIe 4.0 SSDs. AMD’s competing Zen 3 mobile chips are still on PCIe 3.0.
A lot is riding on Tiger Lake H’s success. Intel has already called its 11th generation the “world’s best gaming laptop processors,” and now, with the help of a sample unit, we’ve had a chance to see if those claims ring true.
How We Tested Tiger Lake i9
(Image credit: Tom’s Hardware)
Our Tiger Lake-H testing was performed on an Intel-branded sample “white box” system, similar to our early testing of Tiger Lake-U and Ice Lake. This isn’t a review of the Intel Core i9-11980HK processor inside so much as a performance preview of what you can expect from upcoming systems that will be available to buy. Our full reviews will come when we see the i9-11980HK and other 11th Gen CPUs in computers that are on sale.
Intel loaned reviewers these systems with the knowledge that they are pre-production systems that aren’t necessarily representative of final systems, which may have more finished drivers.
Unlike previous Intel sample systems, this one couldn’t toggle between TDPs. Many of Intel’s 11th gen processors will be configurable by the manufacturer, ranging from 35 to 65W; the Core i9-11980HK is a 65W, overclockable processor that peaks at 110W (PL2). In HWiNFO, our unit showed a PL1 of 65W and a PL2 of 109W.
We did our testing with the suite we use for gaming laptops to get an idea of where something specced similarly to this sample system might fall. We had a limited amount of time with the system, so we could only run some tests. Some of them, like battery life, matter more on actual retail systems than on this early sample.
Intel Reference Design for Tiger Lake i9 and Competitors
The Tiger Lake-H i9 reference design came with the following specifications:
Storage: 2x 512GB Phison SM280512GKBB4S-E162 PCIe Gen 4 SSD
Display: 16-inch, 2560 x 1600 (16:10)
Networking: Killer Wi-Fi 6E AX1675x
Ports: 2x Thunderbolt 4, 2x USB Type-A, microSD card reader, 3.5mm headphone jack
Battery: 90 WHr
Operating System: Windows 10 Pro
Yes, Intel’s sample system paired its top-end CPU with a mid-range Nvidia GPU. It’s an odd pairing on paper, but one that allows for slim systems. Intel claims that this will enable “thin enthusiast” laptops, which fall in between ultraportable notebooks with its H35 processors and the big, thick machines that include the most intensive graphics cards.
From our reviews database, we chose to compare a number of different laptops depending on the task. For gaming, we broke out the Acer Predator Triton 300 SE with a 35W i7-11375H and the Alienware m15 Ryzen Edition R5 with a Ryzen 7 5800H. Both of these also use RTX 3060 GPUs, like the reference system. For our productivity benchmarks, we also included some other, bigger systems that may have more powerful GPUs to compare against a range of processors, including the Ryzen 9 5900HX in the Asus ROG Strix Scar 17 G733, the Intel Core i9-10980HK in the Alienware m17 R4, and the 35W Ryzen 9 5980HS in the Asus ROG Flow X13.
Acer Predator Triton 300 SE – CPU: Intel Core i7-11375H; GPU: Nvidia GeForce RTX 3060 Max-Q, 75W TGP; RAM: 16GB DDR4-3200; Storage: 512GB M.2 PCIe NVMe SSD; Display: 14-inch, 1920 x 1080, 144 Hz IPS
Alienware m15 Ryzen Edition R5 – CPU: AMD Ryzen 7 5800H; GPU: Nvidia GeForce RTX 3060, 125W TGP; RAM: 16GB DDR4-3200; Storage: 512GB M.2 PCIe NVMe SSD; Display: 15.6-inch, 1920 x 1080, 165Hz IPS
Asus ROG Strix Scar 17 G733 – CPU: AMD Ryzen 9 5900HX; GPU: Nvidia GeForce RTX 3080, 130W; RAM: 32GB DDR4-3200; Storage: 2x 1TB M.2 NVMe SSD; Display: 17.3-inch, 1920 x 1080, 360 Hz IPS
Alienware m17 R4 – CPU: Intel Core i9-10980HK; GPU: Nvidia GeForce RTX 3080; RAM: 32GB DDR4-2933; Storage: 512GB boot SSD, 2TB (2x 1TB RAID 0) SSD; Display: 17.3-inch, 1920 x 1080, 360 Hz
Asus ROG Flow X13 – CPU: AMD Ryzen 9 5980HS; GPU: AMD Radeon Graphics (integrated); RAM: 32GB LPDDR4x-4266; Storage: 1TB M.2 2230 NVMe SSD; Display: 13.4-inch, 3840 x 2400, 16:10, 60 Hz, touch
And here’s how the CPUs all stack up on paper:
Intel Core i9-11980HK – Cores / Threads: 8 / 16; Process Node: 10nm SuperFin; Base Frequency: 2.6 GHz; Max Turbo Frequency: 5.0 GHz; TDP: 45 – 65 W
Intel Core i9-10980HK – Cores / Threads: 8 / 16; Process Node: 14nm; Base Frequency: 2.4 GHz; Max Turbo Frequency: 5.3 GHz; TDP: 45 – 65 W
Intel Core i7-11375H – Cores / Threads: 4 / 8; Process Node: 10nm SuperFin; Base Frequency: 3.3 GHz; Max Turbo Frequency: 5.0 GHz; TDP: 28 – 35 W
AMD Ryzen 7 5800H – Cores / Threads: 8 / 16; Process Node: 7nm FinFET; Base Frequency: 3.2 GHz; Max Turbo Frequency: 4.4 GHz; TDP: 35 – 54 W
AMD Ryzen 9 5900HX – Cores / Threads: 8 / 16; Process Node: 7nm FinFET; Base Frequency: 3.3 GHz; Max Turbo Frequency: 4.6 GHz; TDP: 35 – 54 W
AMD Ryzen 9 5980HS – Cores / Threads: 8 / 16; Process Node: 7nm FinFET; Base Frequency: 3.0 GHz; Max Turbo Frequency: 4.8 GHz; TDP: 35 W
Productivity Performance of Tiger Lake i9
We started out with our productivity suite to compare the Core i9-11980HK against its 10th Gen counterpart, the highest-end Intel H35 processor and a series of AMD Ryzen competitors.
(Image credit: Tom’s Hardware)
On Geekbench 5, the Tiger Lake-H system started strong, pushing the highest single-core (1,649) of the bunch and beating the next highest multi-core score by more than 1,000 points (9,254). The next closest was the AMD Ryzen 9 5900HX in the Asus ROG Strix Scar 17 G733, which also had 32GB of RAM.
The Core i9-10980HK, the 10th Gen chip from Intel, was in a close third on multi-core, though in single-core other Ryzen laptops surpassed it.
(Image credit: Tom’s Hardware)
The Intel sample system was also the fastest system to complete our Handbrake test, which transcodes a 4K video to 1080p (with one caveat: we removed laptops with far more powerful GPUs, which could have some effect. If you leave in the Ryzen 9 5900HX in the Asus ROG Strix Scar 17 G733, it was faster at 6:11).
It was (unsurprisingly) significantly faster than the 35W Core i7, and also ahead of the Ryzen 7 5800H and 35W Ryzen 9 5980HS.
(Image credit: Tom’s Hardware)
The Intel sample system contained a pair of 512GB Phison PCIe Gen 4 SSDs, which the Core i9-11980HK can take advantage of. It was one of the speedier laptops in our test pool, but the Asus ROG Flow X13 was actually a little bit faster in our 25GB file transfer test.
(Image credit: Tom’s Hardware)
To check stability over a longer duration, we ran Cinebench R23 for 20 runs. The cooling, which was exceptionally loud during all of the tests (and sometimes while the sample system was doing absolutely nothing), kept it stable.
It started at a high of 11,846.31 while largely settling in the 11,600 range. During the Cinebench stress test, the CPU ran at an average of 3.5 GHz and an average temperature of 85.77 degrees Celsius (186.39 degrees Fahrenheit). While the chart looks largely stable, the monitoring tool HWinfo reported that the CPU was being thermally throttled for the majority of the test. This is the downside of putting a high-wattage processor in a slim system, and also explains the constant fan noise.
Gaming and Graphics Performance of Tiger Lake i9
In this system, Intel paired its top-of-the-line mobile processor with an RTX 3060 Max-Q. It’s a questionable decision for this kind of performance preview, as our first impression didn’t give us the chance to see what happens when this chip is used with a more powerful graphics card that would take full advantage of its capabilities. So our test pool here includes other laptops with an RTX 3060, either full or Max-Q.
On most of the benchmarks we ran, this thin and light notebook performed almost identically to what you would expect from Intel’s 35-watt Tiger Lake H processors that were launched earlier this year. That is, at 1080p. We also ran the tests at the laptop’s native 2560 x 1600 resolution.
(Image credit: Tom’s Hardware)
On Shadow of the Tomb Raider (highest settings), the Intel sample system ran the benchmark at 62 frames per second, within one frame of the H35-based Acer Predator Triton 300 SE. The Alienware m15 Ryzen Edition R5, with its full-power RTX 3060, won out at 73 fps.
(Image credit: Tom’s Hardware)
We saw a very similar pattern on Grand Theft Auto V (very high settings). Intel’s system matched the Acer but fell behind the Alienware. On both Shadow of the Tomb Raider and GTAV, the Intel system was still playable above 30 fps at 2560 x 1600 on the same settings.
(Image credit: Tom’s Hardware)
Tiger Lake-H finally had its moment on Far Cry New Dawn (ultra settings), running at 91 fps, beating out both the Predator (73 fps) and AMD-based Alienware (79 fps) at 1080p. At native resolution, the sample system was still over 60 fps.
(Image credit: Tom’s Hardware)
But on Red Dead Redemption 2 and Borderlands 3, we were back to the same old tale, with the Intel system coming extremely close to the H35 laptop. On RDR 2‘s medium settings, it ran at 48 fps at 1080p and 33 fps at 2560 x 1600.
(Image credit: Tom’s Hardware)
On Borderlands 3‘s “badass” quality settings, the game ran at 56 fps at 1080p, falling about 10 frames behind the Alienware. Intel’s sample system ran the game at 37 fps at 2560 x 1600.
Lastly, we ran the Metro Exodus gauntlet that we run in our laptop review. We have laptops play through the benchmark 15 times on the RTX preset (1920 x 1080) to simulate a half-hour of gaming. Intel’s CPU ran at an average of 3.38 GHz with an average temperature of 64.71 degrees Celsius (148.48 degrees Fahrenheit). There was some throttling, but not as often as during the Cinebench R23 stress test. The GPU ran at an average of 1,188.23 MHz and 64.21 degrees Celsius (147.58 degrees Fahrenheit).
Cooling Tiger Lake i9
(Image credit: Tom’s Hardware)
Unlike with some previous early Intel samples, we were allowed to crack this one open to show it to you.
The laptop has three fans, while even most gaming laptops stick to two larger ones. That may explain the decibels. But what’s also fascinating is that the motherboard in the reference platform has been placed effectively upside down. This means that we can’t see the full cooler, including the heat pipes. That would require far more disassembly.
There are still serviceable parts, but they are connected to the edge of the board. Notably, there’s only one 512GB SSD that’s easily accessible. The other one must be on the other side of the motherboard.
Impressions of Tiger Lake i9
As always, it’s extremely difficult to get a complete picture of how high-end Tiger Lake-H chips will run in laptops that OEMs will start selling today. Our testing was done in an extremely limited window of time and used only one new 11th Gen H-series chip.
To complicate things, this reference design is meant to represent a new “thin enthusiast” sector for Intel, which meant we couldn’t see how the Core i9-11980HK will perform at its best, in a thicker laptop with more elaborate cooling. Of course, every laptop is unique, so the processors may perform slightly differently based on size, cooling and other factors. We hope to be able to see a bigger, flagship gaming system with this processor for a fuller idea soon.
In productivity testing, our early benchmarks show a leap for Intel and its 10nm SuperFin process, especially in multi-core workloads. But AMD’s best, the Ryzen 9 5900HX, still puts up a fight in some areas.
In gaming, we’ll really have to wait. What we now expect from finalized thin systems is that they won’t run games much differently from H35 variants unless those titles really hit the CPU hard.
As usual, the best way to truly tell is when we start testing laptops with a Tiger Lake-H that you can actually buy. As those hit our labs, we’ll see a wider variety of laptop designs and the full range of 11th Gen H-series processors.
(Pocket-lint) – The Pulsefire Haste is HyperX’s answer to the trend of superlight honeycomb gaming mice out there. It ticks a number of boxes too – unless you love RGB lighting by the bucket load.
We’ve been happily working, surfing, and gaming away with the Pulsefire Haste to bring you our thoughts on how this lightweight mouse stacks up against the competition.
Lightweight design and an agile frame
Honeycomb design with 59g lightweight shell
Hyperflex USB cable
Dustproof TTC Golden Micro switches
Virgin-grade PTFE feet
When we started using the HyperX Pulsefire Haste we noticed a few striking things about it. First up, it’s one of the lightest gaming mice we’ve used, weighing just 59 grams.
Pocket-lint
However, it is wired – which detracts from that sense of freedom. That said, the floppy Hyperflex cable ensures that the USB connection doesn’t weigh you down or get in the way while gaming.
Although the Pulsefire Haste might not be the best-looking gaming mouse we’ve seen, it’s still a satisfying mouse to use with plenty of features that make it appealing for the price.
Its body verges on being a good fit for medium to small hands – we suspect that large-handed gamers might struggle with it – and we found it comfortable to use during our play sessions.
Pocket-lint
If you’re worried about the honeycomb shell presenting a problem for dust and dirt ingress, you don’t need to fret as the Pulsefire Haste sports dust-proof switches. We’re also fairly confident a spray of compressed air would soon sort the insides out. But we wouldn’t recommend spilling a drink on it.
The outer shell also has a nice PBT-esque finish to it, making it easy to grip, and there’s not much danger of it slipping around in the hand. A nice addition to this is the included textured grip tape which can be applied to the main mouse buttons and either side of the mouse. This tape is easy to apply, without the annoyance of bubbles, and maintains the overall honeycomb finish too.
In the box, you also get some replacement PTFE skates, so if and when those wear out you can replace them and rejuvenate the mouse without much hassle.
One area where the Pulsefire Haste falls short is RGB lighting. We’ve seen much nicer-looking lightweight mice with snazzier appeal. The SteelSeries Aerox 3, for example, sports some nice RGB lighting that glows from within the honeycomb shell, but that’s sadly lacking on the Pulsefire Haste.
The Pulsefire Haste only has RGB on the mouse wheel, which is customisable within the software. Where this does shine, though, is in the way it changes when you switch DPI level. The mouse wheel briefly lights up in a different colour as you change levels, so you know at a glance which setting you’re on.
Capable gaming setup
1,000Hz polling rate, up to 16,000 DPI
Speed: 450ips / Acceleration: 40G
6 programmable buttons
When it comes to gaming, we found the Pulsefire Haste to be a relative joy to use. Like other lightweight gaming mice, it’s easy to flick about the desk which makes it perfect for fast-paced first-person shooters and battle royale games.
Pocket-lint
With 450ips and 40G acceleration, it’s more than capable too. We wish the PTFE feet were a bit bigger, as we’ve certainly seen larger ones on other mice, but generally the HyperX mouse is easy to grip and even easier to move.
The DPI switching button on top also makes it easy to set and switch between various different DPI levels within the HyperX Ngenuity software. From there you can also programme macros, reprogramme keys, and adjust other settings.
Pocket-lint
The software is a bit barebones compared to the likes of Razer’s Synapse or Corsair’s iCue, but it’s easy to use and allows you to set up and tweak most things with ease. We found macros easy to record and set, and adjusting delays and making tweaks was a doddle too. Macros don’t seem as fast to activate as with other mice, though.
Verdict
All told, the HyperX Pulsefire Haste is an appealing lightweight gaming mouse. It’s not the fanciest we’ve seen, but it has a number of highlights both in design aesthetics and specification terms.
It’s comfortable, capable and agile – obviously appealing qualities in any mouse design – and the affordable price tag and overall finish make it well worth considering if you’re after something light and pleasing for your gaming sessions.
Also consider
Corsair Sabre RGB Pro
Pocket-lint
If you’re not bothered about having a super lightweight mouse, don’t like holes in your peripherals and want something built for pros then consider this instead.
The Corsair Sabre RGB Pro is another affordable mouse with some serious, pro-level specs, including an 8,000Hz polling rate and Nvidia Reflex Latency Analyser compatibility.
SteelSeries Aerox 3 Wireless
Pocket-lint
It might be more expensive, but the SteelSeries Aerox 3 Wireless is snazzier too, with some nice RGB lighting and other highlights that include IP54 dust resistance, up to 200 hours of battery life and both 2.4GHz and Bluetooth connectivity options.
A Google patent describes in detail how the search engine giant wants to implement an under-screen camera in future Google Pixel smartphones.
Smartphone manufacturers have invested heavily in reducing bezels in recent years in order to create beautiful edge-to-edge displays. However, this poses some challenges, in particular because there is too little space left to integrate the front camera and sensors in the bezel. Currently, a punch-hole camera system is widely used: a small hole is made in the screen to accommodate the selfie camera.
The next step is the under-screen camera, as ZTE has already implemented with the Axon 20. Several phone manufacturers are expected to release their first smartphone with an under-screen camera this year, including Samsung, Xiaomi and Oppo. Nevertheless, this new technology is not easy to apply without compromising image quality. Google has come up with an ingenious solution, according to a new patent.
Google smartphone with under-display selfie camera
In mid-September 2020, Google LLC filed for a patent with the World Intellectual Property Organization (WIPO). The 23-page documentation entitled “Full-screen display with sub-display camera” was released on May 6, 2021 and describes a Google Pixel smartphone with a front camera embedded under the screen.
The documentation describes a Google phone with an OLED display panel and a front camera placed under the screen. To make this possible, an extra display panel is integrated; an OLED panel with the same pixel density will be used for this.
This is not the first time we have reported on the integration of a sub-display to support under-screen camera technology. At the beginning of this year, Samsung filed a similar patent. However, Google wants to integrate this auxiliary display in a different way.
The camera sensor and lens are placed under the screen. The second screen is positioned directly opposite the camera, beneath the main screen. A prism or mirror is placed between the camera and the auxiliary display – as illustrated in the patent image above.
When the camera is not in use, the content displayed on the second screen is reflected via the prism onto the main display. As soon as the camera is activated, the prism rotates so that incoming light can reach the camera, enabling the user to take high-quality photos and videos. The optical module is shielded by a window that is about 2 to 3 mm in size, according to the documentation.
The small auxiliary display also contains three sensors. The exact sensor types are left open; they could include an ambient light sensor, a proximity sensor and/or an IR sensor. The latter can be used to enable 3D face detection.
With the Google Pixel 5, Face Unlock was omitted to make way for a fingerprint sensor. Nevertheless, it was indicated at the time that Face Unlock may return in future models.
Google Pixel 7 may be first model to feature a camera under the screen
For the time being, it remains unclear when the first Google smartphone with an under-display camera will be announced. It was briefly assumed that the Google Pixel 6 (Pro), which is expected in the second half of 2021, will have such a new type of selfie camera.
Recently, however, the first product renders of this device appeared online, showing a punch-hole camera system. This may mean that we have to be patient until 2022 when the Google Pixel 7 series will be introduced.
However, there is another option. In December last year, display analyst Ross Young reported via Twitter that Google will present its first foldable smartphone in the second half of 2021. Details about this device are still very scarce, but Google may see an opportunity to equip its folding phone with this new camera technology, just as the Samsung Galaxy Z Fold 3, expected around August 2021, is also rumored to have an under-display camera. In any case, the question doesn’t seem to be whether Google will launch an under-screen camera smartphone, but when…
Here you can take a look at the documentation of the Google smartphone with under-screen camera.
If there’s one thing we’ve noticed over the years, it’s that the Raspberry Pi is deeply intertwined with retro gaming. This project, created by developer Pip Austin, proves the point by bringing the classic game Pong to our favorite microcontroller, the Raspberry Pi Pico.
According to Austin, who shared the development on LinkedIn, this is her first Raspberry Pi-based game. She opted to use a Pico Unicorn from Pimoroni for visual output. This board features an LED matrix that you can easily program with Python.
(Image credit: Pip Austin)
The best Raspberry Pi projects are built from the ground up and, according to Austin, the code for this game was created entirely from scratch. It operates just like Pong mechanics-wise with small adjustments made to suit the hardware at hand.
Players control the side bars using the four buttons on the Pimoroni Unicorn. Scores are tallied in real-time for both players. When a player wins, a victory message scrolls across the screen for the winner.
If you want to play Pico Pong, Austin was awesome enough to share the code on GitHub. All you need to get started is a Raspberry Pi Pico, Pimoroni Unicorn and someone to play against.
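For readers who want a feel for what driving the Pico Unicorn looks like before digging into Austin’s repository, here’s a minimal, hypothetical Pong-style loop written in MicroPython. It assumes Pimoroni’s custom MicroPython firmware and its picounicorn module (init, set_pixel, is_pressed and the BUTTON_A/B/X/Y constants); it’s a simplified sketch, not Austin’s actual code, and it omits scoring and the scrolling victory message.

```python
# A minimal Pong-style loop for the Pico Unicorn, assuming Pimoroni's custom
# MicroPython firmware and its picounicorn module. This is a simplified sketch,
# not Pip Austin's implementation.
import time
import picounicorn

picounicorn.init()
W, H = picounicorn.get_width(), picounicorn.get_height()  # 16 x 7 LED matrix

def clear():
    for x in range(W):
        for y in range(H):
            picounicorn.set_pixel(x, y, 0, 0, 0)

# Paddle positions (top pixel of each 2-pixel paddle) and ball state
left_y, right_y = H // 2, H // 2
ball_x, ball_y, dx, dy = W // 2, H // 2, 1, 1

while True:
    # A/B move the left paddle, X/Y move the right paddle
    if picounicorn.is_pressed(picounicorn.BUTTON_A) and left_y > 0:
        left_y -= 1
    if picounicorn.is_pressed(picounicorn.BUTTON_B) and left_y < H - 2:
        left_y += 1
    if picounicorn.is_pressed(picounicorn.BUTTON_X) and right_y > 0:
        right_y -= 1
    if picounicorn.is_pressed(picounicorn.BUTTON_Y) and right_y < H - 2:
        right_y += 1

    # Move the ball and bounce it off the top/bottom edges
    ball_x += dx
    ball_y += dy
    if ball_y <= 0 or ball_y >= H - 1:
        dy = -dy

    # Bounce off paddles, or reset to the centre when a point is missed
    if ball_x == 1 and left_y <= ball_y <= left_y + 1:
        dx = 1
    elif ball_x == W - 2 and right_y <= ball_y <= right_y + 1:
        dx = -1
    elif ball_x <= 0 or ball_x >= W - 1:
        ball_x, ball_y = W // 2, H // 2

    # Draw paddles (white) and the ball (green)
    clear()
    for i in range(2):
        picounicorn.set_pixel(0, left_y + i, 255, 255, 255)
        picounicorn.set_pixel(W - 1, right_y + i, 255, 255, 255)
    picounicorn.set_pixel(ball_x, ball_y, 0, 255, 0)
    time.sleep(0.15)
```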
DLSS 2.0 off vs DLSS 2.0 on (Image credit: Nvidia)
DLSS stands for deep learning super sampling. It’s a type of video rendering technique that looks to boost framerates by rendering frames at a lower resolution than displayed and using deep learning, a type of AI, to upscale the frames so that they look as sharp as expected at the native resolution. For example, with DLSS, a game’s frames could be rendered at 1080p resolution, making higher framerates more attainable, then upscaled and output at 4K resolution, bringing sharper image quality over 1080p.
This is an alternative to other rendering techniques, like temporal anti-aliasing (TAA), a post-processing algorithm, but DLSS requires an RTX graphics card and game support (see the DLSS Games section below). Games that run at lower frame rates or higher resolutions benefit the most from DLSS.
Nvidia’s chart shows RTX 3080 performance at 4K with max graphics settings and DLSS 2.0 Performance Mode and ray tracing on. (Image credit: Nvidia)
According to Nvidia, DLSS 2.0, the most common version, can boost framerates by 200-300% (see the DLSS 2.0 section below for more). The original DLSS is in far fewer games and we’ve found it to be less effective, but Nvidia says it can boost framerates “by over 70%.” DLSS can really come in handy, even with the best graphics cards, when gaming at a high resolution or with ray tracing, both of which can cause framerates to drop substantially compared to 1080p.
In our experience, it’s difficult to spot the difference between a game rendered at native 4K and one rendered in 1080p and upscaled to 4K via DLSS 2.0 (that’s the ‘performance’ mode with 4x upscaling). In motion, it’s almost impossible to tell native 4K apart from DLSS 2.0 in quality mode (i.e., 1440p upscaled to 4K), though the performance gains aren’t as great.
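To put rough numbers on those modes, here’s a small illustrative Python snippet (back-of-the-envelope math, not an Nvidia API) that computes how many pixels the GPU actually renders for a 4K output in the quality and performance modes described above.

```python
# Illustrative arithmetic for DLSS upscaling factors at a 4K (3840 x 2160) output.
# The input resolutions follow the article's description (quality = 1440p -> 4K,
# performance = 1080p -> 4K); this is not an Nvidia API, just simple math.
OUTPUT = (3840, 2160)
MODES = {
    "quality": (2560, 1440),
    "performance": (1920, 1080),
}

out_pixels = OUTPUT[0] * OUTPUT[1]
for mode, (w, h) in MODES.items():
    rendered = w * h
    print(f"{mode}: renders {rendered / out_pixels:.0%} of the output pixels "
          f"({out_pixels / rendered:.2f}x pixel upscale)")

# quality: renders 44% of the output pixels (2.25x pixel upscale)
# performance: renders 25% of the output pixels (4.00x pixel upscale)
```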
For a comparison on how DLSS impacts game performance with ray tracing, see: AMD vs Nvidia: Which GPUs Are Best for Ray Tracing?. In that testing we only used DLSS 2.0 in quality mode (2x upscaling), and the gains are still quite large in the more demanding games.
When DLSS was first released, Nvidia claimed it showed more temporal stability and image clarity than TAA. While that might be technically true, it varies depending on the game, and we much prefer DLSS 2.0 over DLSS 1.0. An Nvidia rep confirmed to us that because DLSS requires a fixed amount of GPU time per frame to run the deep learning neural network, games running at high framerates or low resolutions may not have seen a performance boost with DLSS 1.0.
Below is a video from Nvidia (so take it with a grain of salt), comparing Cyberpunk 2077 gameplay at both 1440p and 4K with DLSS 2.0 on versus DLSS 2.0 off.
DLSS is only available with RTX graphics cards, but AMD is working on its own alternative for Team Red graphics cards. AMD FidelityFX Super Resolution (FSR) is supposed to debut in 2021. It will require separate support from games, and we haven’t seen it in action yet. But like other FidelityFX technologies, it’s supposed to be GPU agnostic, meaning it will work on Nvidia and even Intel GPUs that have the necessary hardware features. We’re also expecting the next Nintendo Switch to get DLSS via an integrated SoC designed by Nvidia.
DLSS Games
In order to use DLSS, you need an RTX graphics card and a game that supports the feature. You can find Nvidia’s full list of games supporting DLSS as of April below. Unreal Engine and Unity both support DLSS 2.0 as well, meaning games built on those engines should be able to implement DLSS easily.
Anthem
Battlefield V
Bright Memory
Call of Duty: Black Ops Cold War
Call of Duty: Modern Warfare
Call of Duty: Warzone
Control
CRSED: F.O.A.D. (Formerly Cuisine Royale)
Crysis Remastered
Cyberpunk 2077
Death Stranding
Deliver Us the Moon
Edge of Eternity
Enlisted
F1 2020
Final Fantasy XV
Fortnite
Ghostrunner
Gu Jian Qi Tan Online
Iron Conflict
Justice
Marvel’s Avengers
MechWarrior 5: Mercenaries
Metro Exodus
Metro Exodus PC Enhanced Edition
Minecraft With RTX For Windows 10
Monster Hunter: World
Moonlight Blade
Mortal Shell
Mount & Blade II: Bannerlord
Nioh 2 – The Complete Edition
Outriders
Pumpkin Jack
Shadow of the Tomb Raider
System Shock
The Fabled Woods
The Medium
War Thunder
Watch Dogs: Legion
Wolfenstein: Youngblood
Xuan-Yuan Sword VII
DLSS 2.0 and DLSS 2.1
(Image credit: Shutterstock)
In March 2020, Nvidia announced DLSS 2.0, an updated version of DLSS that uses a new deep learning neural network that’s supposed to be up to 2 times faster than DLSS 1.0 because it leverages RTX cards’ AI processors, called Tensor Cores, more efficiently. This faster network also allows the company to remove any restrictions on supported GPUs, settings and resolutions.
DLSS 2.0 is also supposed to offer better image quality while promising up to two to three times the framerate (in 4K Performance Mode), versus the roughly 70% fps boost of the original. Using DLSS 2.0’s 4K Performance Mode, Nvidia claims an RTX 2060 graphics card can run games at max settings at a playable framerate. Again, a game has to support DLSS 2.0, and you need an RTX graphics card to reap the benefits.
The original DLSS was apparently limited to about 2x upscaling (Nvidia hasn’t confirmed this directly), and many games limited how it could be used. For example, in Battlefield V, if you have an RTX 2080 Ti or faster GPU, you can only enable DLSS at 4K — not at 1080p or 1440p. That’s because the overhead of DLSS 1.0 often outweighed any potential benefit at lower resolutions and high framerates.
In September 2020, Nvidia released DLSS 2.1, which added an Ultra Performance Mode for super high-res gaming (9x upscaling), support for VR games, and dynamic resolution. The latter, an Nvidia rep told Tom’s Hardware, means that, “The input buffer can change dimensions from frame to frame while the output size remains fixed. If the rendering engine supports dynamic resolution, DLSS can be used to perform the required upscale to the display resolution.” Note that you’ll often hear people referring to both the original DLSS 2.0 and the 2.1 update as “DLSS 2.0.”
DLSS 2.0 Selectable Modes
One of the most notable changes between the original DLSS and the fancy DLSS 2.0 version is the introduction of selectable image quality modes: Quality, Balanced, or Performance — and Ultra Performance with 2.1. This affects the game’s rendering resolution, with improved performance but lower image quality as you go through that list.
With 2.0, Performance mode offered the biggest jump, upscaling games from 1080p to 4K. That’s 4x upscaling (2x width and 2x height). Balanced mode uses 3x upscaling, and Quality mode uses 2x upscaling. The Ultra Performance mode introduced with DLSS 2.1 uses 9x upscaling and is mostly intended for gaming at 8K resolution (7680 x 4320) with the RTX 3090. While it can technically be used at lower target resolutions, the upscaling artifacts are very noticeable, even at 4K (720p upscaled). Basically, DLSS looks better the more pixels it has to work with: upscaling from 720p can look acceptable, but rendering internally at 1080p or higher achieves a better end result.
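To put those scale factors in concrete terms, here is a quick back-of-the-envelope calculation (our own sketch, not an Nvidia tool) of how many pixels each mode actually renders for a 4K output, using the render/output pairings mentioned above. Note that the 1440p-to-4K Quality pairing works out to roughly 2.25x by pixel count, slightly more than the nominal 2x, and exact internal resolutions can vary by game.

```python
# Pixel-count math behind the DLSS upscaling factors, for a 4K output.
pairs = {
    "Performance (1080p -> 4K)":      ((1920, 1080), (3840, 2160)),
    "Quality (1440p -> 4K)":          ((2560, 1440), (3840, 2160)),
    "Ultra Performance (720p -> 4K)": ((1280, 720),  (3840, 2160)),
}

for name, ((rw, rh), (ow, oh)) in pairs.items():
    factor = (ow * oh) / (rw * rh)          # total upscaling factor by pixel count
    rendered = 100 * (rw * rh) / (ow * oh)  # share of output pixels actually rendered
    print(f"{name}: {factor:.2f}x upscale, {rendered:.0f}% of pixels rendered")
```

Run as-is, this prints 4.00x/25% for Performance, 2.25x/44% for Quality, and 9.00x/11% for Ultra Performance, which lines up with the 25-50% and 11% figures Nvidia cites for DLSS 2.x.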
How does all of that affect performance and quality compared to the original DLSS? For an idea, we can turn to Control, which launched with DLSS 1.0 and later received DLSS 2.0 support. (Remember, the following image comes from Nvidia, so it’d be wise to take it with a grain of salt too.)
Control at 1080p with DLSS off (top), DLSS 1.0 on (middle) and DLSS 2.0 Quality Mode on (bottom) (Image credit: Nvidia)
One of the improvements DLSS 2.0 is supposed to bring is strong image quality in areas with moving objects. The updated rendering in the above fan image looks far better than the image using DLSS 1.0, which actually looked noticeably worse than having DLSS off.
DLSS 2.0 is also supposed to provide an improvement over standard DLSS in areas of the image where details are more subtle.
Control at 1440p using the original DLSS (top) and DLSS 2.0 Quality Mode (bottom) (Image credit: Nvidia)
Nvidia promised that DLSS 2.0 would result in greater game adoption. That’s because the original DLSS required training the AI network for every new game that needed DLSS support. DLSS 2.0 uses a generalized network, meaning it works across all games and is trained using “non-game-specific content,” as per Nvidia.
For a game to support the original DLSS, the developer had to implement it, and then the AI network had to be trained specifically for that game. With DLSS 2.0, that latter step is eliminated. The game developer still has to implement DLSS 2.0, but it should take a lot less work, since it’s a general AI network. It also means updates to the DLSS engine (in the drivers) can improve quality for existing games. Unreal Engine 4 and Unity have both also added DLSS 2.0 support, which means it’s trivial for games based on those engines to enable the feature.
How Does DLSS Work?
Both the original DLSS and DLSS 2.0 work with Nvidia’s NGX supercomputer for training of their respective AI networks, as well as RTX cards’ Tensor Cores, which are used for AI-based rendering.
For a game to get DLSS 1.0 support, Nvidia first had to train the DLSS neural network, a type of AI network called a convolutional autoencoder, using NGX. It started by showing the network thousands of screen captures from the game, each with 64x supersample anti-aliasing. Nvidia also showed the neural network images that didn’t use anti-aliasing. The network then compared the shots to learn how to “approximate the quality” of the 64x supersample anti-aliased image using lower quality source frames. The goal was higher image quality without hurting the framerate too much.
The AI network would then repeat this process, tweaking its algorithms along the way so that it could eventually come close to matching the 64x quality with the base quality images via inference. The end result was “anti-aliasing approaching the quality of [64x Super Sampled], whilst avoiding the issues associated with TAA, such as screen-wide blurring, motion-based blur, ghosting and artifacting on transparencies,” Nvidia explained in 2018.
DLSS also uses what Nvidia calls “temporal feedback techniques” to ensure sharp detail in the game’s images and “improved stability from frame to frame.” Temporal feedback is the process of applying motion vectors, which describe the directions objects in the image are moving in across frames, to the native/higher resolution output, so the appearance of the next frame can be estimated in advance.
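Conceptually, that temporal feedback step amounts to re-projecting the previous output frame along the motion vectors and blending it with the newly rendered frame. The toy NumPy sketch below is our own illustration of that idea, not Nvidia’s pipeline; in real DLSS, the neural network decides per pixel how to weight history against the current frame rather than using a fixed blend factor.

```python
import numpy as np

def reproject(prev_frame, motion_vectors):
    """Warp the previous frame along per-pixel motion vectors
    (a toy nearest-neighbor version of temporal re-projection)."""
    h, w, _ = prev_frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # motion_vectors[..., 0] = x displacement, [..., 1] = y displacement
    src_x = np.clip(xs - motion_vectors[..., 0], 0, w - 1).astype(int)
    src_y = np.clip(ys - motion_vectors[..., 1], 0, h - 1).astype(int)
    return prev_frame[src_y, src_x]

def temporal_blend(current_frame, prev_frame, motion_vectors, alpha=0.9):
    """Blend re-projected history with the current frame. DLSS effectively
    learns this weighting per pixel; a constant alpha is a crude stand-in."""
    history = reproject(prev_frame, motion_vectors)
    return alpha * history + (1 - alpha) * current_frame
```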
DLSS 2.0 (Image credit: Nvidia)
DLSS 2.0 gets its speed boost through its updated AI network that uses Tensor Cores more efficiently, allowing for better framerates and the elimination of limitations on GPUs, settings and resolutions. Team Green also says DLSS 2.0 renders just 25-50% of the pixels (and only 11% of the pixels for DLSS 2.1 Ultra Performance mode), and uses new temporal feedback techniques for even sharper details and better stability over the original DLSS.
Nvidia’s NGX supercomputer still has to train the DLSS 2.0 network, which is also a convolutional autoencoder. Two things go into it, as per Nvidia: “low resolution, aliased images rendered by the game engine” and “low resolution, motion vectors from the same images — also generated by the game engine.”
DLSS 2.0 uses those motion vectors for temporal feedback, which the convolutional autoencoder (or DLSS 2.0 network) performs by taking “the low resolution current frame and the high resolution previous frame to determine on a pixel-by-pixel basis how to generate a higher quality current frame,” as Nvidia puts it.
The training process for the DLSS 2.0 network also includes comparing the image output to an “ultra-high-quality” reference image rendered offline in 16K resolution (15360 x 8640). Differences between the images are sent to the AI network for learning and improvements. Nvidia’s supercomputer repeatedly runs this process, on potentially tens of thousands or even millions of reference images over time, yielding a trained AI network that can reliably produce images with satisfactory quality and resolution.
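To make that training description more concrete, here is a heavily simplified, illustrative PyTorch sketch of a DLSS-style training loop. The tiny network, random tensors, and L1 loss are stand-ins of our own choosing; Nvidia has not published its actual architecture, data, or training code.

```python
# Illustrative-only training loop for a DLSS-style upscaling autoencoder.
import torch
import torch.nn as nn

class TinyUpscaler(nn.Module):
    """Stand-in convolutional autoencoder: low-res frame plus motion vectors in
    (3 color + 2 motion-vector channels = 5), higher-res frame out."""
    def __init__(self, scale=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(5, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3 * scale * scale, 3, padding=1),
            nn.PixelShuffle(scale),  # rearranges channels into a scale-x larger image
        )

    def forward(self, low_res, motion_vectors):
        return self.net(torch.cat([low_res, motion_vectors], dim=1))

model = TinyUpscaler()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.L1Loss()

for step in range(100):                     # stand-in for a huge training corpus
    low_res = torch.rand(4, 3, 64, 64)      # "aliased images rendered by the game engine"
    motion = torch.rand(4, 2, 64, 64)       # "motion vectors from the same images"
    reference = torch.rand(4, 3, 128, 128)  # stand-in for the ultra-high-quality reference
    pred = model(low_res, motion)
    loss = loss_fn(pred, reference)         # differences fed back for learning
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```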
With both DLSS and DLSS 2.0, once the AI network’s training is complete, the NGX supercomputer delivers the AI model to Nvidia RTX graphics cards through GeForce Game Ready drivers. From there, your GPU can use its Tensor Cores’ AI power to run the DLSS network in real time alongside the supported game.
Because DLSS 2.0 is a general approach rather than one trained on a single game, the quality of the DLSS 2.0 algorithm can also improve over time without a game needing to include updates from Nvidia. The updates reside in the drivers and can benefit every game that utilizes DLSS 2.0.
This article is part of the Tom’s Hardware Glossary.
VideoCardz today shared a snippet of a document claiming to show Intel’s strong desire for motherboard vendors to adopt the ATX12VO power connector on future Intel 12th Gen Alder Lake LGA1700 motherboards.
The ATX12VO is a 10-pin power connector that Intel has been pushing for about a year to replace the conventional 24-pin power connector on modern motherboards. The connector ditches the 3.3V and 5V rails and maintains only the 12V rail. A more compact power connector reduces power supply production costs, as well as cable clutter for the end user.
The flip side is that motherboard manufacturers would have to implement DC-to-DC converters on their motherboards to step the 12V rail down to usable 3.3V and 5V, since many components still rely on those voltages.
Intel’s own numbers show that the ATX12VO specification is more power-efficient at idle or low power loads. With a 20W load, an ATX12VO 500W 80 PLUS Gold power supply offers power efficiency of up to 83%, compared to 64% for an ATX 500W 80 PLUS Gold unit. Alder Lake features a hybrid combination of high-performance Golden Cove cores and low-power Gracemont cores, so we can see the connection there. The chipmaker has gone as far as saying that Alder Lake offers the best performance per watt of any desktop processor.
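To put those efficiency figures in perspective, here is a quick back-of-the-envelope check of what they mean for power drawn at the wall under that 20W load (a simple calculation of our own, using only the numbers above):

```python
# Wall power needed to deliver a 20 W load at the efficiencies Intel cites.
load_w = 20
for name, efficiency in (("ATX12VO 500W 80 PLUS Gold", 0.83), ("ATX 500W 80 PLUS Gold", 0.64)):
    wall_w = load_w / efficiency
    print(f"{name}: {wall_w:.1f} W at the wall ({wall_w - load_w:.1f} W lost as heat)")
# Roughly 24.1 W at the wall for ATX12VO vs. 31.3 W for the classic ATX unit.
```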
According to VideoCardz’s unnamed sources, Intel is very committed to the ATX12VO power connector. However, power supply and motherboard vendors aren’t very fond of the idea. That’s understandable, since both parties would ultimately have to redesign their best power supplies and best motherboards to embrace the ATX12VO standard, which costs both money and time.
(Image credit: VideoCardz)
ATX12VO adoption so far has been fairly modest. With the previous generation, ASRock released its Z490 Phantom Gaming 4SR that uses the ATX12VO power connector. We’ve heard that MSI is preparing the Z590 Pro 12VO, but MSI hasn’t made an official announcement.
There’s a very small window remaining to get ATX12VO-compatible power supplies and motherboards ready for the Alder Lake CPU launch, which is rumored to take place in late 2021 or early 2022. Power supply manufacturers need about four months to prepare for mass production, and motherboard makers require four to five months to validate ATX12VO motherboards.
What all this means is that OEMs, ODMs and LOEMs would need to be working hand-in-hand with both power supply and motherboard vendors by the end of this month in order to have any chance of getting their products out in time for Alder Lake’s debut.
However, VideoCardz’s sources claimed that entry-level motherboards and pre-built systems are likely the only candidates to adopt the ATX12VO power connector. High-end and workstation-grade motherboards should continue to use the 24-pin power connector that we all know. You don’t necessarily need an ATX12VO power supply anyway, since most power supply vendors offer an ATX12VO adapter cable that plugs into a standard ATX unit. However, you’ll lose out on the power savings, which are the whole point of the ATX12VO specification.
Let’s hope that Intel’s ATX12VO power connector sees more success than Nvidia’s 12-pin PCIe power connector. Only time will tell whether the ATX12VO will ever become a mainstream power connector, especially in the DIY market. In the meantime, it’ll have to co-exist with the 24-pin power connector.
ASUS is a Taiwan-based computer hardware and electronics company founded in 1989. Back in 2014, the ROG Gladius was the first gaming mouse released by ASUS under the ROG brand. With the ROG Gladius III, ASUS both iterates on and refines this classic design. The Gladius III uses PixArt’s flagship sensor for wired and wireless applications, the PAW3370, customized here to allow for up to 26,000 CPI. Furthermore, the Gladius III comes with second-generation hot-swappable main button switch sockets, which accept both 3-pin mechanical and 5-pin Omron optical switches. By default, ROG micro switches rated for 70 million clicks are installed, but they can be swapped for Omron optical switches, a set of which is included in the box. At just 76 g, the Gladius III is noticeably lighter than its predecessors without having to resort to externally visible holes. The cable is the same as on the ROG Keris, featuring a soft braid and exceptional flexibility. Pure PTFE feet are installed by default, and a set of replacement feet is included. Lastly, the RGB lighting is fully configurable through Armoury Crate, which also provides the usual customization features along with a new function called Rapid Fire. In addition to the wired Gladius III, the Gladius III Wireless will also be released and reviewed at a later point here on TechPowerUp.
Massively popular game creation tool Roblox is now a massively popular experience creation tool Roblox, possibly in response to the ongoing Epic v. Apple trial.
Roblox allows a variety of user-created projects on its platform, and until earlier this week, these were all grouped under a tab called “Games” on Roblox’s website. Roblox creators could create and manage “games” through an editor, and individual games had a user limit called “max players.”
That’s all changed now. The “Games” tab now reads “Discover” on the web, although it still points to an address of “roblox.com/games.” Developers can create and manage “experiences,” and experiences have “max people” allowed. The word “game” has been replaced by “experience” across nearly the entire Roblox website, and the iOS and Android apps now have a Discover tab instead of a Games tab — although both apps are currently classed as games in their respective stores. Roblox acknowledged a message from The Verge, but it didn’t offer an explanation for the latest change by press time.
Roblox has used the term “experience” in place of “game” before, and CEO David Baszucki called Roblox a “metaverse” rather than a gaming platform last year. But this change happened days after a legal fight over whether Roblox experiences are games — and by extension, whether Roblox itself should be allowed on the iOS App Store.
The Epic v. Apple antitrust trial has produced a weeks-long, frequently hilarious debate over the definition of a video game. Epic wants to prove that its shooter Fortnite is a “metaverse” rather than a game, pushing the trial’s scope to cover Apple’s entire App Store instead of just games. Apple wants to prove that Epic is an almost purely game-related company and that the App Store maintains consistent, user-friendly policies distinguishing “apps” from “games.” It also wants to defend a ban on “stores within a store” on iOS.
Roblox blurs the line between a large social game and a game engine or sales platform. Users don’t enter a single virtual world like Second Life; they launch individual experiences created by users. Developers can sell items within those experiences, and there are full-fledged game studios that build with Roblox instead of, say, the Unity or Unreal engines. But all of this activity happens within a single Roblox app, instead of as a series of separately packaged games.
Apple has apparently worried about this fuzziness. In a 2017 email, Apple marketing head Trystan Kosmynka said he was “surprised” that Roblox (which he referred to as “Roboblox”) had been approved for the App Store. The email chain indicates that App Store reviewers raised concerns in 2014, but Roblox was approved without ever resolving the issues. Epic brought the decision up again in court, hoping to cast doubt on Apple’s App Store review process.
Instead, Kosmynka justified the choice by saying that neither Roblox nor its user-built projects should be defined as games. “If you think of a game or app, games are incredibly dynamic, games have a beginning, an end, there’s challenges in place,” he testified. “I look at the experiences that are in Roblox similar to the experiences that are in Minecraft. These are maps. These are worlds. And they have boundaries in terms of what they’re capable of.” Kosmynka said Apple considered Roblox itself an app (rather than a game) because the company used that label in the App Store, although this doesn’t appear to be accurate.
Besides the crucial factors of “beginning,” “end,” and “challenges,” Kosmynka seemingly argued that these experiences weren’t games because Roblox contained their code in a safe, Apple-vetted Roblox sandbox — making them less objectionable than standalone installable games. But Apple doesn’t use that same logic for cloud gaming services, which stream video of games from remote servers. In fact, it requires these services to list each game as a separate app. That would probably be a nightmare for Roblox, where experiences range from full-fledged professional projects to tiny personal spaces.
Some Roblox users have left irritated messages on Twitter, Reddit, and other platforms. But Roblox has promoted itself as a general-purpose metaverse in the past. It’s got virtually nothing to gain by deliberately stepping into Apple’s minefield of iOS gaming rules, particularly after such an extended courtroom debate about its status. On iOS, it turns out, the only winning move is to not play — or at least not tell anyone you’re playing.