The BPI-M2 Pro single-board computer Banana Pi announced in March is now available worldwide for $61. However, that doesn’t necessarily mean the board is fully ripe, with CNX Software noting that Banana Pi still hasn’t released its firmware.
Banana Pi said the BPI-M2 Pro features an Amlogic S905X3 equipped with a quad-core Cortex-A55 processor clocked at 1.5 GHz and a Mali-G31 MP2 GPU. It also has 2GB of LPDDR4 memory, 16GB of onboard eMMC storage, and a microSD slot.
The BPI-M2 Pro also offers an HDMI 2.1 port that can theoretically support up to an 8K video output. Still, according to the Banana Pi wiki, it will be limited to 4K output with a 60Hz refresh rate. (Which is to be expected from a single-board computer.)
The board also features 40 GPIO pins, numerous USB ports, and built-in networking that offers Gigabit Ethernet, Wi-Fi, and Bluetooth 5 support out of the box. More information about the BPI-M2 Pro’s specs is available via the Banana Pi wiki.
The BPI-M2 Pro is supposed to support Android and various Linux distros, but there aren’t any official images available at the time of writing. It’s not clear if that’s because the board supports images made for the Banana Pi BPI-M5, because it wasn’t supposed to debut yet, or because they’ll be ready when the board reaches buyers.
If the lack of official firmware doesn’t deter you, the Banana Pi BPI-M2 Pro is available now from AliExpress. Just be prepared to wait a while for it to arrive: The retailer said it doesn’t expect orders placed on June 16 to be delivered until July 20.
LG is offering an extended 5-year warranty on its 4K G1 OLED in the US, covering any problems that develop with its panel after the standard one-year warranty. In the UK the company says the extended warranty also applies to its Z1 8K OLED. It’s unclear in how many other countries LG is offering the extended warranty, and the company did not immediately respond to The Verge’s questions.
The extended warranty appears to be an attempt to upsell customers from LG’s popular lower-end sets like the C1 to its more premium G1 model. But it should also allay any fears that the new brighter Evo OLED panel LG is using in the G1 won’t last as long as its previous OLED panels. Or, as LG said in a statement given to TechRadar, “The warranty is designed to offer peace of mind to customers purchasing our flagship, more expensive OLED models.”
Of course, the big question is whether the extended warranty covers permanent image retention, aka “burn-in,” where parts of an image can remain visible after being displayed for an extended period. In response to Engadget’s question about this, LG gave this evasive response:
“LG’s five-year limited warranty program is in line with the company’s consistent communication regarding the low risk of image retention on LG OLED TVs, when used in normal viewing conditions… As with any self-emitting display, OLED TVs may experience temporary image retention under certain conditions, but permanent image retention, or burn-in, is rare under normal viewing conditions. Image retention is not a product defect.”
The implication seems to be that temporary image retention is normal and is nothing to worry about, but all LG will say about permanent image retention is that it’s “rare.” Meanwhile, in a statement given to HDTVTest the company said the warranty covers “any image issues,” so long as the display is being used normally, and not in a commercial setting where static images might be left onscreen for far longer (the famous example of this is a South Korean airport which had to swap out the OLED TVs it was using to show departure times). We’ve contacted LG to try and get a simple yes or no answer.
Like other warranties, LG’s also comes with various exceptions, such as damage caused by power surges, acts of nature, or improper installation. But LG’s warranty should still provide some assurance that you’ll get a decent few years of use if you splash out on one of its more expensive TVs.
In a partnership with Nvidia, Colorful has opened the world’s first graphics card museum. The museum, located in Shenzhen, China, is expected to open its doors to visitors soon.
Colorful recently relocated its headquarters to Shenzhen’s New Generation Industrial Park. The brand might not be as well-known on this side of the globe, but Colorful is one of the oldest players in the Chinese market.
There are many rare graphics cards on display at Colorful’s museum, hailing from as far back as the ’80s, that you won’t find on anyone’s Best Graphics Cards list today. The company has each one categorized chronologically. The list of graphics cards includes old-school Voodoo graphics cards and Nvidia’s GeForce 256, dubbed the world’s first GPU. Some of the chipmaker’s first GeForce gaming graphics cards are also part of the exhibition.
Not everything is about Nvidia, though. Colorful also has a priceless collection of ATI graphics cards, such as the legendary Rage Fury MAXX. It was the company’s first dual-chip graphics card, combining two Rage 128 Pro chips on the same PCB, with ATI’s alternate frame rendering (AFR) scheme serving as the main highway for communication between them. The museum also houses other rare artifacts from IBM, 3Dlabs, Intel, S3, Trident, 3dfx and many others.
Since it’s Colorful’s museum, after all, the company also has diverse sections to pay homage to the evolution of eSports in China, as well as the brand’s own iGame and Kudan bloodline.
Colorful also endowed its museum with some of the latest toys on the market. The company has set up a racing simulator with three 8K displays and a VR station.
Colorful stated that the museum will be open for visitor registration “soon,” but didn’t commit to a specific date.
This week, Sharp unveiled the 8M-B32C1, the industry’s first professional 8K monitor with 1,000 nits of peak luminance.
The Sharp 8M-B32C1 is an LCD that relies on a high-end 31.5-inch (presumably IPS/IGZO) panel featuring a 7680 × 4320 resolution, 800 nits of typical brightness (1,000 nits peak in HDR mode), a 1300:1 contrast ratio, a 60 Hz maximum refresh rate, a 9 ms GtG response time, and 176°/176° horizontal/vertical viewing angles. The unit is equipped with direct LED backlighting.
While there are a number of 8K LCD monitors on the market today, only Sharp’s 8M-B32C1 supports a 1,000-nits peak luminance and numerous HDR transports, including HLG and PQ. There is no word about HDR10 or Dolby Vision. Furthermore, Sharp opted not to obtain VESA’s DisplayHDR badge, which some of the best computer monitors for HDR carry.
Being a professional monitor, the Sharp 8M-B32C1 uses a panel that can display 1.07 billion colors and reproduce virtually all color gamuts in use today, including Adobe RGB, BT.2100 (HLG), BT.2100 (PQ), BT.2020, DCI-P3, and sRGB/BT.709. The monitor can reproduce 85% of the BT.2020 color gamut. In general, the monitor can be used equally well for photo and video editing. The LCD should come factory calibrated, but it can also be calibrated manually.
To meet the requirements of photo and video editors, the Sharp 8M-B32C1 supports numerous professional features, such as luminance clipping, out-of-color warning, peaking, and false color.
As far as inputs are concerned, the new professional LCD from Sharp has one HDMI port that accepts an 8Kp60 signal over an HDMI 2.1 cable from an appropriate source (though the manufacturer does not call the input HDMI 2.1, perhaps because a 4Kp120 mode is not supported), four HDMI inputs that can together accept an 8Kp60 image delivered across all four ports, a DisplayPort 1.2 input, and an HDMI 1.4 input. The monitor also has a 3.5-mm audio output and a USB Type-A port for firmware updates.
Sharp plans to start sales of the 8M-B32C1 sometime in late June. The company has not disclosed pricing for the monitor, but with a monthly output of around 150 units, it is pretty obvious that the product will be expensive.
LG’s premium OLED TVs now come with a five-year warranty, so any defects within that time period will be fixed free of charge.
It applies to the LG OLED G1 (above), which comes in 55-, 65- and 75-inch sizes, and the OLED Z1 8K set, which comes in 77- and 88-inch sizes.
The warranty starts from the date you buy the TV. It covers parts and labour for the first year after purchase, and a free panel service for the five-year period.
The 65-inch version of the G1 bowled us over, earning five stars in our review. It boasts a beautiful, punchy, sharp picture with tons of detail and a better remote control. About the only downside we could find was the lack of feet or stand in the box, and the slightly underwhelming audio performance. But that can always be righted by adding a soundbar or surround sound system.
It’s interesting that the C1 OLED, which sits just below the G1 in LG’s 2021 OLED range, doesn’t get the same warranty. Could this be another way in which LG is attempting to encourage purchasers to step up to the next model?
We haven’t tested the Z1. But considering it’s an 8K monster, it’s sure to deliver tons of fun.
The G1 range starts at £2000, while the Z1 will set you back a cool £20,000 for the 77-incher, and £30,000 for the 88-inch model. For that money, we’d certainly want any problems fixed by our own personal batman.
Microsoft develops under-screen camera with 4 color filters for Surface devices. When the camera is turned off, an adjustable logo or avatar will be displayed.
Many technology companies are currently working on under-display camera technology for smartphones and other portable electronic devices, including Microsoft. However, the US software manufacturer seems to be taking a slightly different approach. The company filed a patent application for an under-display camera with 4 color filters and 4 image sensors. The technology is meant for smartphones and Microsoft Surface devices.
The Surface line-up offers different types of laptops, mainly 2-in-1 tablet PCs with touchscreen functionality, such as the Surface Go and the Surface Pro. One of the newest additions is the Surface Duo, a dual-screen Android smartphone. A new patent discovered by LetsGoDigital suggests Microsoft could integrate a ‘logo camera’ in one of its future Microsoft Surface devices.
Microsoft under-screen logo camera
In October 2020, Microsoft Technology Licensing LLC filed a patent with the World Intellectual Property Organization (WIPO) for a ‘Logo camera’. The 40-page documentation was published on May 14, 2021.
It is a unique invention, which we have not encountered before. The documentation mentions four under-display cameras, each with its own color filter. Microsoft collectively calls the four cameras a ‘logo camera’. Thanks to the color filters, a true-to-color logo or icon can be displayed when the camera is turned off.
By default, the Microsoft logo will be displayed. As soon as the camera is activated, the logo disappears and the shutter opens. In addition, an icon menu will be available, enabling users to set a logo or avatar as desired, such as a company logo or a club logo.
In order to visualize the patented technology, in-house designer Giuseppe Spinelli has made a number of product renders of this unique camera system. These images are for illustrative purposes only and based on the patent of Microsoft Technology.
The quad front-camera is placed in a 2×2 array. By choosing multiple camera sensors, thinner camera modules can be used, allowing the device to retain its slim form factor. To achieve a high camera resolution, the pixel density of the screen will be increased. Each sensor and lens is configured to be optimized for particular colors.
The documentation makes mention of 4 sensors: one optimized for the color blue, and the others for green, red, and yellow. By adding a fourth color, yellow (RGBY), to the standard RGB (red, green, blue) colors, a much larger color range can be displayed.
The color filters can also improve camera performance in low light. In addition, they can be used to give the user feedback; for example, a color signal can be emitted when the camera is activated.
However, there are also disadvantages, as color conversion can be sensitive to noise, especially when the colors are less saturated. Microsoft wants to counter this phenomenon through software. In addition, AI technology is used to learn the difference between obscured and non-obscured light in images, in order to correct light loss and diffraction.
To implement a logo camera on the front, several small holes (1mm) have to be made in the screen, which provide a light path for the camera. The patented technology could also be used for a rear camera, according to the documentation.
Microsoft Surface devices
Whether Microsoft actually sees a chance to integrate an under-display logo camera in one of its upcoming Surface products remains unknown. In any case, it is a totally new and different solution than we have seen so far.
It is clear that Microsoft is also involved in the development of under-screen cameras. Last month, the company published several detailed publications on its website about ‘Camera In Display technology’ and the use of machine learning to achieve better image quality.
In addition, Microsoft also posted a vacancy online for a ‘Principal Android Camera System Architect / Engineer for Surface Development’ in October last year. The software giant seems to intend to take the camera performance of its next-gen Microsoft Surface devices to a whole new level.
Here you can take a look at the documentation of the Microsoft logo camera.
Sony is now fully revealing its Airpeak S1 drone, which it teased at CES 2021 in January. The announcement contains a lot more detail on the drone’s capabilities and features, and reveals a $9,000 price tag for the drone sans gimbal or camera, all of which cements the idea that this drone will be aimed squarely at the professional video market.
The Airpeak S1 is built to work with Sony’s mirrorless cameras, including the A7S Mark III, FX3, or even the 8K-capable Alpha 1. They’ll be attached to a special version of the Gremsy T3 gimbal that’s been designed specifically for the Airpeak and that you’ll have to buy separately. With a camera, the drone will have around 12 minutes of flight time (though it can achieve 22 minutes without any load). It’s also worth noting that the camera needs its own batteries — it isn’t being provided power by the drone.
Sony’s already released a preview of the types of shots you can pull off with the drone, which you can see below. You can also get a shot of the retracting landing gear in motion.
One of the drone’s biggest selling points is its stability and wind resistance. According to Sony, it can stay stable in winds of up to 44.7 miles per hour (that’s 20m/s, double what DJI quotes for the Inspire 2), and it has five sets of stereo cameras and an infrared rangefinder that should help the drone stop before it hits obstacles and stay steady even without satellite reception. Sony even enlisted JAXA, the Japanese space agency, to help it do some of the tests for the drone:
The Airpeak is also quick — it can do 0-50 (which is close to its top speed of 55.9 miles per hour) in 3.5 seconds. It is worth noting, though, that’s without any sort of attachments — Sony hasn’t said what kind of speed or acceleration can be achieved when the drone is flying a camera. That said, Sony showed me and other journalists a video of the drone doing figure-eights in the air, which it pulled off with impressive speed and agility.
For comparison, DJI’s Matrice 600 Pro, which costs around $7,000 without a gimbal or camera, has a top speed of 40 miles per hour and a quoted battery life of 32 minutes alone or 16 minutes with a 13-pound payload, using its stock batteries.
The Airpeak S1 can be operated with just the included controller, but Sony has an app called Airpeak Flight to help make things easier. The app is iOS/iPadOS-only for now, but it will allow for control of the camera and gimbal. The Airpeak can be operated by a single person, but also allows for dual-operator mode, where one person controls flight and the other controls the camera. Sony says the controller’s range is still being tested.
While you won’t get a camera or gimbal for the Airpeak S1’s $9,000 price, it does come with two pairs of propellers, the controller, two batteries, and a charger. Sony expects to ship it in the fall, and will be offering a service plan to cover damage that could occur from crashes.
In light of all the legislation and controversy around drones from China, Sony is making it clear that the Airpeak S1 is designed and made in Japan; it came up repeatedly in a press briefing and again in the press release.
AMD has officially launched its latest round of professional graphics cards, the Radeon Pro W6800, Radeon Pro W6600, and for laptop users, the Radeon Pro W6600M. Like many of the best graphics cards, these new GPUs leverage AMD’s latest RDNA2 architecture, aka Big Navi, which means they bring ray tracing hardware to the lineup, along with the large Infinity Cache. The W6800 also packs 32GB of high-speed GDDR6 memory, which can greatly benefit certain professional workloads.
Starting with the top model, the Radeon Pro W6800 has specs that are very similar to the Radeon RX 6800, just with double the VRAM, a different card design, and different drivers. It uses the Navi 21 GPU, with 60 Compute Units and 3840 GPU cores enabled. It delivers up to 17.83 TFLOPS of FP32 performance and 35.66 TFLOPS of FP16 performance, equating to a boost clock of around 2320 MHz. Like the RX 6800, it also has a 250W TDP and 16Gbps GDDR6 memory, and it supports PCIe Gen4.
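If you want to sanity-check those numbers yourself, the relationship between shader count, clock speed, and FP32 throughput is simple: each shader core performs two FP32 operations per clock (a fused multiply-add). A quick back-of-the-envelope check in Python, using only the figures quoted above:

```python
# Rough check: FP32 FLOPS = shader cores x 2 ops per clock (FMA) x clock speed
shader_cores = 3840
fp32_tflops = 17.83

implied_clock_mhz = fp32_tflops * 1e12 / (shader_cores * 2) / 1e6
print(f"Implied boost clock: {implied_clock_mhz:.0f} MHz")
# ~2322 MHz, i.e. the "around 2320 MHz" boost clock mentioned above
```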
On the other hand, the actual card design is completely different from the consumer RX 6000-series parts. Like most previous Radeon Pro designs, the Radeon Pro W6800 features a blower cooler, which tends to be preferable to open-air coolers for workstation use as it better supports the use of multiple cards. AMD also equips the W6800 with six mini-DisplayPort outputs, all supporting DisplayPort 1.4 with DSC. All six ports can handle resolutions up to 5120×2880, and two of them can do up to 7680×4320 (presumably that’s with DSC, Display Stream Compression).
AMD provided testimonials and some benchmarks showing how the Radeon Pro W6800 compares to other professional cards. The biggest point in AMD’s favor will likely be the price, with the W6800 officially selling for $2,250. AMD compares it with the Nvidia RTX A6000, which costs over twice as much ($4,649), except AMD’s benchmarks then go back to the RTX 5000, a previous generation and obviously slower competitor — #GrainsOfSalt. Depending on the various benchmarks, AMD also shows significant gains over its previous generation Radeon Pro WX 9100 and W5700.
The Radeon Pro W6600 takes a big step down in performance, and arguably its most interesting aspect is that it uses AMD’s Navi 23 GPU. It has a suggested price of $649. We still haven’t seen Navi 23 in any desktop GPUs, though it’s expected to eventually show up in the RX 6600 XT and RX 6600. This is the first time we’ve seen official specs for anything using Navi 23, which packs 11.06 billion transistors. Also, as previously rumored, Navi 23 comes with 32MB of Infinity Cache. That’s a big step down from Navi 22’s 96MB and Navi 21’s 128MB, but of course, it was necessary to hit the desired chip size and cost parameters.
AMD doesn’t give the size or maximum core counts, but the W6600 does include 28 CUs and 1792 GPU cores. It has peak performance of 10.4 TFLOPS FP32 and 20.8 TFLOPS FP16, which works out to a boost clock of around 2900 MHz. It looks like Navi 23 will clock even higher than Navi 22, if that’s correct. Perhaps even more impressive is that the W6600 only has a 100W TDP. Other specs include 8GB of GDDR6 memory, clocked at 14Gbps. It’s a single-slot blower design, with four full-size DisplayPort outputs — again, all 1.4 with DSC, though only one can handle 8K resolutions with the other three maxing out at 5K.
The Radeon Pro W6600M has nearly identical specs to the W6600, with the only difference being the TDP ranges from 65–95W. All the other features are the same, and actual boost clocks and display connections will be up to the laptop manufacturers. AMD didn’t provide any specific benchmarks of the W6600M, likely because actual laptops using the GPU aren’t yet available for testing.
Overall, AMD claims performance is “up to 79 percent faster” than its previous generation hardware. It can be even more than that in some cases (e.g., anything that uses the ray tracing hardware), but 79% will serve as a reasonable estimate. AMD also shared some performance data with and without the Infinity Cache enabled on the W6800, showing gains of up to 10%, though again, that will vary greatly across workloads.
The Radeon Pro W6800 is available now, with the W6600 slated for availability in Q3 2021. The W6600M should show up first in the HP ZBook G8 mobile workstation in July. Like other professional cards, these come with drivers that include a variety of ISV optimizations.
Apple’s annual developer extravaganza, the Worldwide Developers Conference (WWDC), is coming up fast, kicking off with the keynote presentation on June 7th at 1PM ET. Like last year, WWDC will be an entirely digital and online-only event due to the COVID-19 pandemic, and for the keynote, that means we can likely expect another tightly produced video highlighting everything Apple has in store.
While we aren’t expecting any announcements on the level of Apple’s shift to custom silicon in its computers, which was WWDC 2020’s big news, Apple presumably has some notable changes in the works for iOS, iPadOS, macOS, and its other operating systems. And if the current rumors pan out, we could also see brand-new MacBook Pros with the return of some long-missed features, such as MagSafe charging.
Read on to learn everything we expect from the big show. And don’t be surprised if Apple has a few surprises in store, too.
iOS 15 may bring improvements to notifications and iMessage
We haven’t heard much about what may be coming to Apple’s next version of its mobile operating system, which will presumably be called iOS 15, but we could see big changes to notifications and possibly iMessage, according to Bloomberg.
For notifications, you may be able to have different notification settings for situations like driving, working, sleeping, or even a custom category, and you’ll be able to flip those on as you need to. You might also be able to set automatic replies based on which notification setting you’re currently using, like what you can do now with Do Not Disturb while driving mode. Personally, I’m hoping iOS 15 will let me allow notifications from a select few people while silencing just about everything else.
As for iMessages, Apple is apparently working on features to make it act like “more of a social network” to compete with Facebook’s WhatsApp, Bloomberg said, but those features are still “early in development” and could be announced at a later date.
Apple also plans to add a feature that shows you apps that are silently collecting data about you, continuing the company’s trend of adding privacy-focused updates to its operating systems.
For iPadOS 15, you can apparently expect a major update to the homescreen, including the ability to put widgets anywhere you want. And with Apple just introducing the new M1-powered iPad Pros, here’s hoping we see some new upgrades to take advantage of the new chip.
In May, Apple also announced a lot of new accessibility features coming to Apple’s operating systems, such as improvements in iOS to VoiceOver, support for bidirectional hearing aids, a built-in background sounds player, and new Memoji customizations like cochlear implants. Apple said these features would arrive “later this year,” which suggests they’ll be included in iOS 15.
We don’t know much about macOS, watchOS 8, and tvOS 15 — but we could see a new “homeOS”
We haven’t heard all that much about upcoming software updates for the Mac, Apple Watch, and Apple TV, so we’ll just have to wait and see what Apple is cooking up. One tidbit: macOS could be a “more minor” update, Bloomberg says. That wouldn’t be too much of a surprise, given that the macOS operating system got a big overhaul with Big Sur last year.
However, we could see the introduction of a brand-new operating system called “homeOS,” which was recently mentioned in and later removed from an Apple job listing. While it’s unclear exactly which devices this OS is for, perhaps it will work on Apple’s home-focused products like the Apple TV and HomePod Mini.
New, redesigned MacBook Pros and a new Apple CPU could be announced
Apple doesn’t always introduce new hardware at WWDC, but this year, new MacBook Pros seem like a possibility. In a May 18th report, Bloomberg said that new MacBook Pros might arrive “as soon as early this summer,” which could indicate an announcement at WWDC.
These new laptops would have new Apple-designed processors that would “greatly outpace the performance and capabilities of the current M1 chips,” according to Bloomberg. The M1 is already pretty dang good, so it sounds like these new chips could be even more impressive.
Apple is apparently planning on releasing two chips for the new Pros. Both should have eight high-performance cores and two energy-efficient cores, while leaving you with the option of either 16 or 32 graphics cores. (By comparison, the M1’s CPU has four high-performance and four energy-efficient cores, while its GPU is offered with either seven or eight cores.) You’ll probably also be able to spec the laptops with as much as 64GB of memory, up from a max of 16GB on M1-equipped computers.
The new laptops should be offered with either 14-inch or 16-inch screens and those screens could have “brighter, higher contrast” displays, according to a Bloomberg report from January. The laptops may also have a new design with flat edges as in the iPhone 12, analyst Ming-Chi Kuo said in January. I’m curious to see what that design might look like in practice — I worry that the hard edges could be uncomfortable if you have the laptop on your lap.
The best rumor is that the new design may also mark the return of some of the ports and features that were taken away with the now-infamous 2016 MacBook Pro redesign, including a MagSafe charger, an HDMI port, and an SD card slot, Bloomberg said in its May report. And, according to Kuo, the OLED Touch Bar currently found on Intel-based MacBook Pros will apparently be removed in favor of physical function keys.
We could see at least one other new Mac
While it seems like MacBook Pros are the only new hardware we’ll be seeing at WWDC this year, that hasn’t stopped some other Mac rumors from swirling lately, and there’s always the chance Apple could announce more at its big event. According to Bloomberg, Apple also has “a revamped MacBook Air, a new low-end MacBook Pro, and an all-new Mac Pro workstation” in the works as well as a “higher-end Mac Mini desktop and larger iMac,” all of which would be powered by Apple’s custom silicon.
The new Mac Mini may have the same chip as the new MacBook Pros. The new Mac Pro could be a beast, with processors that are “either twice or four times as powerful as the new high-end MacBook Pro chip.”
And the redesigned “higher-end” MacBook Air could arrive as early as the end of this year. Frankly, I hope that refreshed Air arrives even later. I just bought the M1-equipped Air and it’s one of the best computers I’ve ever used, but I have a bad feeling I’ll be first in line to buy a redesigned and more capable Air anyway. (Especially if it gets the MagSafe charger that’s rumored for the new Pros.)
Apple might have dropped a hint about its AR / VR headset
Apple has long been rumored to have a mixed reality headset in the works, and recently, we’ve learned a few more potential details about it. The headset might be very expensive — approximately $3,000, according to one report — though it could be packed with 8K displays, more than a dozen cameras to track hand movements and capture footage, and might weigh less than an iPhone, too.
While the headset could be a ways out, as it’s not expected to ship until 2022 at the earliest, a few suspicious details in Apple’s WWDC promotional images may be hinting toward some kind of reveal of Apple’s upcoming headset or the software on which it runs.
Check out this image below (that I also used at the top of this post), which Apple released alongside the announcement of WWDC in March. Notice the way the app icons are reflected in the glasses — I could imagine some sort of mixed reality headset showing icons in front of your eyes in a similar way.
Apple continued that reflections motif with new images released in May — you can see things from the laptop screens reflected in all of the eyes of the Memojis.
Now, these reflections may just be Apple’s artists flexing their design chops. And if I had to guess, given how far out a rumored mixed reality headset is, I don’t think we’re going to see anything about it at WWDC this year.
But Apple has surprised us in the past, and maybe these images are an indication of one more thing Apple has in store for WWDC.
Acer is joining the likes of Razer and Corsair in pushing gaming peripherals that change the rate at which your mouse sends data to your PC. Most gaming mice offer polling rates of 1,000 Hz max, meaning the mouse sends a report telling your PC its position 1,000 times per second. But the Acer Predator Cestus 335 announced today goes up to 2,000 Hz. While most gamers are still happy with 1,000 Hz, the Predator Cestus 335 may be introducing a happy middle ground for extreme gamers.
Let’s do some (painless) math. When a mouse has a 1,000 Hz polling rate, it can take as little as 0.001 second — or 1ms — to send a report. 1 second divided by 1,000 reports equals 0.001 second per report. With 2,000 Hz, the Cestus 335’s expected delay decreases to 0.5ms (1 second divided by 2,000 reports equals 0.0005 second).
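If you want to play with the numbers yourself, the report interval is simply the reciprocal of the polling rate. A tiny Python sketch of the math above:

```python
# Interval between mouse reports at a given polling rate
def report_interval_ms(polling_rate_hz):
    return 1000 / polling_rate_hz

for rate_hz in (1000, 2000, 8000):
    print(f"{rate_hz} Hz -> {report_interval_ms(rate_hz)} ms per report")
# 1000 Hz -> 1.0 ms, 2000 Hz -> 0.5 ms, 8000 Hz -> 0.125 ms
```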
The Predator Cestus 335 is extreme in its attempt to deliver twice the number of reports per second than most of the best gaming mouse options today. But there are already more extreme options. This year, the Razer Viper 8K Hz and Corsair Sabre RGB Pro mice came out, as well as the Corsair K70 RGB TKL and Corsair K65 RGB Mini keyboards. Each has an 8,000 Hz polling rate (yes, some of the best gaming keyboards are getting pulled into the trend too) and, therefore, have input delays as low as 0.125ms compared to the Cestus 335’s 0.5ms.
Of course, the question becomes do you really need those extra Hz? In action, I did find the Viper 8K Hz to bring an improvement in tracking. For example, when I moved my mouse around in an oval, I saw more instances of the pointer with the 8,000 Hz mice compared to when using a couple 1,000 Hz mice. I also fared slightly better in anecdotal tests such as the Human Benchmark reaction time test, although mouse comfort and buttons are also a factor. When gaming, however, there was so much going on that I didn’t notice any obvious visual gains or gaming advantages.
According to Razer, the increased polling rate might have been more effective if I had a more powerful PC and a display with a faster refresh rate. Acer hasn’t confirmed yet, but it’s possible that the Predator Cestus 335 will also come with recommended PC and/or monitor specs.
That will be a big factor in whether the polling rate race takes off among gamers. When we tested the Corsair Sabre RGB Pro, we noticed a roughly 6-10% spike in CPU usage when using an AMD Ryzen 7 3700X. But we’d expect the 2,000 Hz Predator Cestus 335 to be less taxing than an 8,000 Hz mouse.
Besides a high polling rate, the Predator Cestus 335 is packing a PixArt 3370 sensor with the ability to hit up to 19,000 CPI sensitivity.
The mouse should also make it easy to toggle through 5 preset CPI levels and profiles, all customizable via Acer’s QuarterMaster software. That’s also where gamers will be able to program the mouse’s 10 programmable buttons. Photos Acer shared with the press show three side buttons, dedicated macro, CPI and profile buttons and a scroll wheel that can also move left and right. Of course, there’s also a fair amount of RGB here.
Ultimately, the Predator Cestus 335 seems to have a decent feature set, and eSports-level players may jump at the opportunity to cut input delay — even if by a hair. 2,000 Hz should, hopefully, call for less extreme PC specs, but we’ll have to wait until we test the mouse to know for sure.
Acer doesn’t have a price or release date for the Predator Cestus 335 yet.
Earlier this year, Samsung announced its ISOCELL HM3 – a 1/1.33” sensor with 108MP resolution which we saw in action on the Galaxy S21 Ultra. Now, we get a new official video which hints the sensor might soon make its way to other flagship offerings from Samsung, or to flagship phones from other makers.
The HM3 brings 9-to-1 pixel binning with an effective pixel size of 2.4µm. The sensor captures 12-bit images at 12MP resolution by default and features Smart-ISO Pro, which improves HDR capture by taking simultaneous low- and high-ISO shots to balance out ghosting on moving objects. Video capture tops out at 8K resolution while slow-motion videos can go up to 240fps at FHD resolution.
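For the curious, the binning arithmetic works out neatly: 9-to-1 (3 x 3) binning divides the pixel count by nine and triples the effective pixel pitch. A quick Python check, assuming a native pixel pitch of 0.8µm (the value implied by the 2.4µm binned figure above):

```python
# 9-to-1 ("nona") binning on a 108MP sensor
native_pixels_mp = 108
native_pitch_um = 0.8        # assumed native pixel pitch
bin_factor = 9               # 3 x 3 native pixels per binned pixel

binned_pixels_mp = native_pixels_mp / bin_factor       # -> 12.0 MP output
binned_pitch_um = native_pitch_um * bin_factor ** 0.5  # -> 2.4 um effective pixels
print(binned_pixels_mp, round(binned_pitch_um, 1))
```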
Given the timing of the new video, Samsung could be hinting at a possible new Galaxy flagship featuring the ISOCELL HM3 or that it will be offering the sensor to other OEMs.
This is an iMac unlike any other iMac we’ve seen before, and it all comes down to the M1 chip.
Sure, there are some other differences between this 24-inch iMac and the 21.5-inch model from 2019 that it’s replacing. There are better microphones and better speakers. There are fewer ports, and some of them have moved around. The screen is bigger and better. The keyboard now has Touch ID. But the M1 is the star of the show.
It’s not just the performance increase. It’s not just the fact that you can run iOS and iPadOS apps natively on the system. It’s not just the new advanced image signal processor, which helps create better low-light images than I’ve ever seen from an integrated webcam. It’s also the groundbreaking efficiency with which this processor runs, which has enabled Apple to create a slim, sleek, and quite unique iMac chassis.
Whether you actually get every upgrade here depends on the configuration you choose. The entry-level iMac is $1,299 for 256GB of SSD storage, two Thunderbolt / USB 4 ports, 8GB of unified memory, and a seven-core GPU — but that’s only available in four colors and doesn’t come with Touch ID. The model I tested bumps the storage up to 512GB and the memory up to 16GB. It has two USB-3 ports in addition to the two Thunderbolt ports, an eight-core GPU, Touch ID, and a gigabit Ethernet port (which is in the power brick). I also received both the Magic Mouse and the Magic Trackpad with my model. You’d need to pay a total of $2,028 to get everything Apple sent me (and which I’ll be sending back, for the record).
In short, this device costs money. And it’s true that you’d get similar performance and save a few hundred bucks, if you just plugged a Mac Mini into an external display. But this iMac has almost everything that most people need in one package: processing power, sure, but also a camera, speakers, microphones, a keyboard, a mouse, a trackpad, and a display. And they’re all good. This is a computer you can plonk on your desk and never think about again. And for some of the iMac’s target audience, that’s probably worth the extra money. You’re paying for simplicity.
The M1 processor uses what’s called a “hybrid” configuration. The easiest way to conceive of this is that most competing Intel and AMD chips have a number of equally “okay” cores, where Apple’s M1 has four very fast cores and four lower-powered high-efficiency cores. This allows M1 devices to deliver arguably the best performance-per-watt in the world. It also means that they’re nearly unbeatable in single-core workloads.
That advantage bore out in our benchmark testing. This iMac model achieved a higher score on the Geekbench 5 single-core benchmark than any Mac we’ve ever seen before — even the iMac Pro. That means if you’re looking for a device for simpler everyday tasks that don’t scale to every available CPU core (and that largely seems to be the demographic that Apple is trying to sell this machine to), there has literally never been a better iMac for you to buy.
You can see the rest of our benchmarks below:
Apple iMac 24 (2021) benchmarks
PugetBench for Premiere Pro: 372
Cinebench R23 Multi: 7,782
Cinebench R23 Single: 1,505
Geekbench 5 Multi: 7,668
Geekbench 5 Single: 1,739
Geekbench 5 OpenCL: 19,114
These results help illuminate where this iMac fits into Apple’s all-in-one lineup, and where its limitations are. The 24-incher is a significant improvement over the 21.5-inch iMac in both single-core and multi-core workloads. And it’s very comparable in graphics tasks — which is quite impressive, given that the 21.5-inch iMac has a discrete GPU and this one relies on what’s integrated with the M1.
On the other end, these results (with the exception of single-core performance) are not close to what we’d expect from the 27-inch Intel iMac with discrete graphics. In this comparison, multi-core results are more important. They indicate that the 27-inch iMac is going to do much better on the types of tasks that owners (or prospective buyers) are likely to be doing: intense multitasking, computations, design, video work, and other more complex loads that may leverage the GPU.
There are other limitations that may put some workloads out of reach. As is the case with the MacBook Pro and Mac Mini, you can’t configure the iMac with more than 16GB of memory and 2TB of storage; we wouldn’t recommend those specs to anyone who regularly edits 4K or 8K video, for example. The memory and storage are soldered, so you can’t upgrade them after purchase. Only one external display is supported (up to 6K resolution at 60Hz). Ports are also bizarrely limited; the base model has just two Thunderbolt / USB-4 ports and a headphone jack, while more expensive models have an additional two USB-3 ports and Gigabit Ethernet. These all may be reasons Apple is pushing this iMac as a “home and family” PC, even though its processor is clearly capable of all kinds of professional work.
Another way to interpret these numbers is that I was getting effectively the same performance out of this machine as we got from the M1 MacBook Pro and the Mac Mini. That’s completely unsurprising, since these devices all use the same processor. But it’s a good proxy for gauging whether the iMac can handle your work: if you expect you could get a task done with the M1 MacBook Pro, you should be able to do it on this.
More anecdotally, I was able to use my test unit for all kinds of daily tasks, from emailing to YouTube to amateur photo and video work. I was able to hop between over 25 Chrome tabs with Cinebench looping in the background, with no stutter or slowdown whatsoever. If you’re buying the iMac for this kind of thing, I can’t imagine you’ll see too many spinning wheels.
During this testing process, I also got a sense of just how well cooled this chassis is. On thinner laptops that I test often (including the fanless MacBook Air), you’ll see performance decrease if you run heavy tasks over and over again. None of that on this iMac: I looped Cinebench R23 as well as a Premiere Pro 4K video export several times over and never saw scores go down. It took a lot to get the fans going — they sat idle during my daily office multitasking. When they did spin up, mostly while I was working in Premiere, I could barely hear them. They were quieter than the background hum of my refrigerator. That’s quite a quality-of-life improvement over prior Intel iMacs.
The M1’s advantage, after all, has never been raw power; it’s the combination of power and efficiency. We saw much better battery life in the MacBook Air and MacBook Pro than we did in their Intel predecessors. Battery life obviously isn’t a concern with the iMac, but efficiency certainly is. Chips are limited by two things: the power available and how well their systems can keep them cool. They vent almost all the energy they use as heat, and because the M1 has such incredibly high performance per watt, Apple doesn’t need a heavy-duty cooling system to keep it from frying itself. Because it doesn’t need that heavy-duty cooling system, Apple has finally been able to redesign the iMac from the ground up.
This iMac is sleek. Even though it has a 24-inch screen, it’s close in size to its 21.5-inch predecessor. Apple reduced the screen’s borders by close to 50 percent in order to squeeze the bigger screen into the compact chassis. This device is also 11.5 millimeters thick, or just under half an inch — which is quite thin as all-in-ones go. Next to the 27-inch iMac, it looks like a tablet on a stand.
Size isn’t everything; this iMac also comes in seven colors. There’s blue, green, pink, orange, purple, yellow, and the boring silver we know and love. I’m not quite convinced that the jazzier models will fit in outside of especially stylish homes and offices. But I will say: I’ve never seen so many of my friends, or so many people on TikTok, as excited about a tech product as they seem to be about the colored iMacs. The hues are a nice change, aren’t obnoxious, and are clearly a hit with certain crowds.
Some traditional iMac touches remain, of course. The bezels are still substantial compared to those of some modern monitors. You can’t raise or lower the display height — the built-in stand only allows tilt adjustments. (You can also buy it with a built-in VESA mount adapter.) And there’s still that pesky chin, though it’s no longer emblazoned with the Apple logo.
Pretty much every other notable part of the iMac has been upgraded in some way. There’s a 4.5K (4480 x 2520) Retina display, a step up from the predecessor’s 4096 x 2304 Retina display (though both have effectively the same pixel density). It has Apple’s True Tone technology, which automatically adjusts colors and intensity based on your surroundings.
But the screen is also another reminder that this iMac doesn’t have “Pro” in its name. Twenty-four inches is on the small side as screens go; most of the best external monitors are 27 inches or larger these days. Professionals on The Verge’s video team also noticed some vignetting on the sides of the screen, which caused issues with off-angle viewing — we had a similar issue with Apple’s Pro Display XDR. Of course, neither of these limitations were a problem for my untrained eye; I thought the display looked great, with sharp details and plenty of room for my Chrome tabs and apps.
Elsewhere, Apple has upgraded the camera, microphones, and speakers. The company claims that they’re the best camera, mic system, and speaker system that have ever appeared in a Mac. I’d believe it. The six-speaker sound system is easily on par with a good external speaker. I played some music in my kitchen, and it was audible all over the house. Percussion and bass were strong, and I felt very immersed in the songs. It also supports spatial audio when playing video with Dolby Atmos.
I don’t have too much to say about the three-mic array except that nobody on my Zoom calls had any trouble hearing me. But the webcam was a very pleasant surprise. The iMac has a 1080p FaceTime HD camera, which has a higher resolution than the 720p shooter that lives in the 21.5-inch iMac (as well as the MacBook Pro, MacBook Air, and many other AIOs). The M1 also lends a hand here: its built-in image signal processor and neural engines help optimize your picture in low-light settings.
I wouldn’t say I looked amazing on my Zoom calls — parts of my background were sometimes washed out, and the image looked processed in some dimmer areas. But I was visible and clear, which is better than you get from most webcams these days. And the difference between this webcam and the grainy mess the MacBook Pro has is night and day.
When I review a computer, my task is usually to figure out for whom that computer is made.
But all kinds of people use iMacs, from college students to accountants to podcast producers to retired grandparents. And this model has arguably the most widespread consumer appeal of any iMac that Apple has made in recent years. So it’s much easier to figure out for whom this iMac isn’t made.
It’s not for people who can’t handle dongles and docks; I kept a USB-C to USB-A dongle next to me on my desk while I was testing the iMac, and I used it very frequently. It’s not for people who already own a 27-inch iMac, because it would be a downgrade in display size and quality, port selection, upgradability, and raw power. And it’s not for people with serious performance needs.
It’s not for people who are looking for the very best value for their money. Most folks won’t need the specs and accessories that I tested here, but even $1,299, the base price, is certainly more than plenty of people want to spend on a computer. The base Mac Mini is $600 cheaper than the base iMac; plug that into a monitor and some speakers (you can find plenty of good ones for well under $600), and you’ll get the same M1 performance at a massive discount.
And that, right there, is the biggest reason that this iMac, despite its power, is primarily targeting the family market. Because it’s asking you to pay more in order to do less. You’re paying $600 not to have to research and budget out monitors, speakers, webcams, docks, keyboards, and mice. You’re paying not to have to arrange thousands of things on your desk. You’re paying for a device where everything, out of the box, works well. You’re paying to eliminate fuss.
Tech enthusiasts (especially those who want to pop their machines open and make their own upgrades) may see that as a waste of money. And for them, it probably is. But they’re not the target audience for this Mac — even if its specs might suit their needs.
Could Apple have done more with this iMac? Of course. I was hoping to see a 30-inch, 6K iMac with a powerhouse 12-core workstation chip this month as much as the next person. But I have faith that we’ll get one in the future — and in the meantime, I’m glad Apple released this. It’s not earth-shattering in its design; it doesn’t redefine its category. But it’s fun. It improves upon the 21.5-inch iMac to offer a simple, attractive, and very functional device for users across all kinds of categories. It’s not the iMac to beat — but it is the iMac for most people to buy.
DLSS 2.0 off vs DLSS 2.0 on (Image credit: Nvidia)
DLSS stands for deep learning super sampling. It’s a type of video rendering technique that looks to boost framerates by rendering frames at a lower resolution than displayed and using deep learning, a type of AI, to upscale the frames so that they look as sharp as expected at the native resolution. For example, with DLSS, a game’s frames could be rendered at 1080p resolution, making higher framerates more attainable, then upscaled and output at 4K resolution, bringing sharper image quality over 1080p.
DLSS is an alternative to other rendering techniques — like temporal anti-aliasing (TAA), a post-processing algorithm — but it requires an RTX graphics card and game support (see the DLSS Games section below). Games that run at lower frame rates or higher resolutions benefit the most from DLSS.
According to Nvidia, DLSS 2.0, the most common version, can boost framerates by 200-300% (see the DLSS 2.0 section below for more). The original DLSS is in far fewer games and we’ve found it to be less effective, but Nvidia says it can boost framerates “by over 70%.” DLSS can really come in handy, even with the best graphics cards, when gaming at a high resolution or with ray tracing, both of which can cause framerates to drop substantially compared to 1080p.
In our experience, it’s difficult to spot the difference between a game rendered at native 4K and one rendered at 1080p and upscaled to 4K via DLSS 2.0 (that’s the ‘performance’ mode, with 4x upscaling). In motion, it’s almost impossible to tell native 4K apart from DLSS 2.0 in quality mode (i.e., 1440p upscaled to 4K), though the performance gains aren’t as great.
For a comparison on how DLSS impacts game performance with ray tracing, see: AMD vs Nvidia: Which GPUs Are Best for Ray Tracing?. In that testing we only used DLSS 2.0 in quality mode (2x upscaling), and the gains are still quite large in the more demanding games.
When DLSS was first released, Nvidia claimed it showed more temporal stability and image clarity than TAA. While that might be technically true, it varies depending on the game, and we much prefer DLSS 2.0 over DLSS 1.0. An Nvidia rep confirmed to us that because DLSS requires a fixed amount of GPU time per frame to run the deep learning neural network, games running at high framerates or low resolutions may not have seen a performance boost with DLSS 1.0.
Below is a video from Nvidia (so take it with a grain of salt), comparing Cyberpunk 2077 gameplay at both 1440p and 4K resolution with DLSS 2.0 on versus DLSS 2.0 off.
DLSS is only available with RTX graphics cards, but AMD is working on its own alternative for Team Red graphics cards. AMD Fidelity FX Super Resolution (FSR) is supposed to debut in 2021. It will require separate support from games, and we haven’t seen it in action yet. But like other FidelityFX technologies, it’s supposed to be GPU agnostic, meaning it will work on Nvidia and even Intel GPUs that have the necessary hardware features. We’re also expecting the next Nintendo Switch to have DLSS via an integrated SoC designed by Nvidia.
DLSS Games
In order to use DLSS, you need an RTX graphics card and need to be playing a game that supports the feature. You can find a full list of games supporting DLSS as of April via Nvidia below. Unreal Engine and Unity Engine also both have support for DLSS 2.0, meaning games using those engines should be able to easily implement DLSS.
Anthem
Battlefield V
Bright Memory
Call of Duty: Black Ops Cold War
Call of Duty: Modern Warfare
Call of Duty: Warzone
Control
CRSED: F.O.A.D. (Formerly Cuisine Royale)
Crysis Remastered
Cyberpunk 2077
Death Stranding
Deliver Us the Moon
Edge of Eternity
Enlisted
F1 2020
Final Fantasy XV
Fortnite
Ghostrunner
Gu Jian Qi Tan Online
Iron Conflict
Justice
Marvel’s Avengers
MechWarrior 5: Mercenaries
Metro Exodus
Metro Exodus PC Enhanced Edition
Minecraft With RTX For Windows 10
Monster Hunter: World
Moonlight Blade
Mortal Shell
Mount & Blade II: Bannerlord
Nioh 2 – The Complete Edition
Outriders
Pumpkin Jack
Shadow of the Tomb Raider
System Shock
The Fabled Woods
The Medium
War Thunder
Watch Dogs: Legion
Wolfenstein: Youngblood
Xuan-Yuan Sword VII
DLSS 2.0 and DLSS 2.1
In March 2020, Nvidia announced DLSS 2.0, an updated version of DLSS that uses a new deep learning neural network that’s supposed to be up to 2 times faster than DLSS 1.0 because it leverages RTX cards’ AI processors, called Tensor Cores, more efficiently. This faster network also allows the company to remove any restrictions on supported GPUs, settings and resolutions.
DLSS 2.0 is also supposed to offer better image quality while promising up to 2-3 times the framerate (in 4K Performance Mode) compared to the predecessor’s up to around 70% fps boost. Using DLSS 2.0’s 4K Performance Mode, Nvidia claims an RTX 2060 graphics card can run games at max settings at a playable framerate. Again, a game has to support DLSS 2.0, and you need an RTX graphics card to reap the benefits.
The original DLSS was apparently limited to about 2x upscaling (Nvidia hasn’t confirmed this directly), and many games limited how it could be used. For example, in Battlefield V, if you have an RTX 2080 Ti or faster GPU, you can only enable DLSS at 4K — not at 1080p or 1440p. That’s because the overhead of DLSS 1.0 often outweighed any potential benefit at lower resolutions and high framerates.
In September 2020, Nvidia released DLSS 2.1, which added an Ultra Performance Mode for super high-res gaming (9x upscaling), support for VR games, and dynamic resolution. The latter, an Nvidia rep told Tom’s Hardware, means that, “The input buffer can change dimensions from frame to frame while the output size remains fixed. If the rendering engine supports dynamic resolution, DLSS can be used to perform the required upscale to the display resolution.” Note that you’ll often hear people referring to both the original DLSS 2.0 and the 2.1 update as “DLSS 2.0.”
DLSS 2.0 Selectable Modes
One of the most notable changes between the original DLSS and the fancy DLSS 2.0 version is the introduction of selectable image quality modes: Quality, Balanced, or Performance — and Ultra Performance with 2.1. This affects the game’s rendering resolution, with improved performance but lower image quality as you go through that list.
With 2.0, Performance mode offered the biggest jump, upscaling games from 1080p to 4K. That’s 4x upscaling (2x width and 2x height). Balanced mode uses 3x upscaling, and Quality mode uses 2x upscaling. The Ultra Performance mode introduced with DLSS 2.1 uses 9x upscaling and is mostly intended for gaming at 8K resolution (7680 x 4320) with the RTX 3090. While it can technically be used at lower target resolutions, the upscaling artifacts are very noticeable, even at 4K (720p upscaled). Basically, DLSS looks better as it gets more pixels to work with, so while 720p to 1080p looks good, rendering at 1080p or higher resolutions will achieve a better end result.
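To make the mode arithmetic concrete, here is a small Python sketch that converts an output resolution into the render resolution implied by the per-mode pixel-count factors quoted above (note that Nvidia’s actual Quality mode renders at 1440p for a 4K output, which is closer to 2.25x than 2x in practice):

```python
# Total-pixel upscale factors per DLSS mode, as described above
DLSS_MODE_FACTORS = {
    "Quality": 2,
    "Balanced": 3,
    "Performance": 4,        # e.g. 1080p rendered, 4K output
    "Ultra Performance": 9,  # e.g. 1440p rendered, 8K output
}

def render_resolution(out_w, out_h, mode):
    per_axis = DLSS_MODE_FACTORS[mode] ** 0.5  # upscale factor per axis
    return round(out_w / per_axis), round(out_h / per_axis)

for mode in DLSS_MODE_FACTORS:
    print(mode, render_resolution(3840, 2160, mode))
# Performance -> (1920, 1080), Ultra Performance -> (1280, 720)
```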
How does all of that affect performance and quality compared to the original DLSS? For an idea, we can turn to Control, which originally shipped with DLSS 1.0 and later received DLSS 2.0 support when the updated version was released. (Remember, the following image comes from Nvidia, so it’d be wise to take it with a grain of salt too.)
One of the improvements DLSS 2.0 is supposed to bring is strong image quality in areas with moving objects. The updated rendering in the above fan image looks far better than the image using DLSS 1.0, which actually looked noticeably worse than having DLSS off.
DLSS 2.0 is also supposed to provide an improvement over standard DLSS in areas of the image where details are more subtle.
Nvidia promised that DLSS 2.0 would result in greater game adoption. That’s because the original DLSS required training the AI network for every new game that needed DLSS support. DLSS 2.0 uses a generalized network, meaning it works across all games and is trained using “non-game-specific content,” as per Nvidia.
For a game to support the original DLSS, the developer had to implement it, and then the AI network had to be trained specifically for that game. With DLSS 2.0, that latter step is eliminated. The game developer still has to implement DLSS 2.0, but it should take a lot less work, since it’s a general AI network. It also means updates to the DLSS engine (in the drivers) can improve quality for existing games. Unreal Engine 4 and Unity have both also added DLSS 2.0 support, which means it’s trivial for games based on those engines to enable the feature.
How Does DLSS Work?
Both the original DLSS and DLSS 2.0 work with Nvidia’s NGX supercomputer for training of their respective AI networks, as well as RTX cards’ Tensor Cores, which are used for AI-based rendering.
For a game to get DLSS 1.0 support, first Nvidia had to train the DLSS AI neural network, a type of AI network called a convolutional autoencoder, with NGX. It started by showing the network thousands of screen captures from the game, each with 64x supersample anti-aliasing. Nvidia also showed the neural network images that didn’t use anti-aliasing. The network then compared the shots to learn how to “approximate the quality” of the 64x supersample anti-aliased image using lower quality source frames. The goal was higher image quality without hurting the framerate too much.
The AI network would then repeat this process, tweaking its algorithms along the way so that it could eventually come close to matching the 64x quality with the base quality images via inference. The end result was “anti-aliasing approaching the quality of [64x Super Sampled], whilst avoiding the issues associated with TAA, such as screen-wide blurring, motion-based blur, ghosting and artifacting on transparencies,” Nvidia explained in 2018.
DLSS also uses what Nvidia calls “temporal feedback techniques” to ensure sharp detail in the game’s images and “improved stability from frame to frame.” Temporal feedback is the process of applying motion vectors, which describe the directions objects in the image are moving in across frames, to the native/higher resolution output, so the appearance of the next frame can be estimated in advance.
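To make “applying motion vectors” a bit more concrete, here is a simplified NumPy sketch of the general idea: warp the previous frame along per-pixel motion vectors to predict the current frame, then blend that history with the newly rendered (and upscaled) input. This is our own illustration of temporal reprojection, not Nvidia’s code; the array shapes, the nearest-neighbour lookup and the fixed blend weight are all assumptions.

import numpy as np

def reproject(prev_frame, motion):
    # prev_frame: (H, W, 3) previous frame; motion: (H, W, 2) per-pixel (dx, dy)
    # movement, in pixels, from the previous frame to the current one.
    h, w, _ = prev_frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # For each pixel in the current frame, look up where it came from previously.
    src_x = np.clip(np.rint(xs - motion[..., 0]).astype(int), 0, w - 1)
    src_y = np.clip(np.rint(ys - motion[..., 1]).astype(int), 0, h - 1)
    return prev_frame[src_y, src_x]

# Toy data: a 1080p "previous" frame and a uniform 2-pixel rightward motion.
prev = np.random.rand(1080, 1920, 3).astype(np.float32)
motion = np.zeros((1080, 1920, 2), dtype=np.float32)
motion[..., 0] = 2.0

history = reproject(prev, motion)                             # previous frame warped to "now"
current = np.random.rand(1080, 1920, 3).astype(np.float32)    # stand-in for the upscaled new frame
accumulated = 0.9 * history + 0.1 * current                   # naive temporal accumulation

The real network decides how to weight history against the new frame on a per-pixel basis, rather than using a fixed blend like this, but the reprojection step is the core of what “temporal feedback” refers to.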
DLSS 2.0 gets its speed boost through its updated AI network that uses Tensor Cores more efficiently, allowing for better framerates and the elimination of limitations on GPUs, settings and resolutions. Team Green also says DLSS 2.0 renders just 25-50% of the pixels (and only 11% of the pixels for DLSS 2.1 Ultra Performance mode), and uses new temporal feedback techniques for even sharper details and better stability over the original DLSS.
Nvidia’s NGX supercomputer still has to train the DLSS 2.0 network, which is also a convolutional autoencoder. Two things go into it, as per Nvidia: “low resolution, aliased images rendered by the game engine” and “low resolution, motion vectors from the same images — also generated by the game engine.”
DLSS 2.0 uses those motion vectors for temporal feedback, which the convolutional autoencoder (or DLSS 2.0 network) performs by taking “the low resolution current frame and the high resolution previous frame to determine on a pixel-by-pixel basis how to generate a higher quality current frame,” as Nvidia puts it.
The training process for the DLSS 2.0 network also includes comparing the image output to an “ultra-high-quality” reference image rendered offline in 16K resolution (15360 x 8640). Differences between the images are sent to the AI network for learning and improvements. Nvidia’s supercomputer repeatedly runs this process, on potentially tens of thousands or even millions of reference images over time, yielding a trained AI network that can reliably produce images with satisfactory quality and resolution.
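As a rough mental model of that training loop, here is a heavily simplified PyTorch sketch: a toy convolutional autoencoder upscales a low-resolution frame and is penalised on its differences from a higher-resolution reference. The network, the random placeholder tensors and the plain MSE loss are all stand-ins of ours; Nvidia’s actual architecture, training data and loss terms aren’t public in this form.

import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyUpscaler(nn.Module):
    # A tiny stand-in for a convolutional autoencoder that doubles resolution.
    def __init__(self):
        super().__init__()
        self.encode = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
        )
        self.decode = nn.Conv2d(32, 3, 3, padding=1)

    def forward(self, low_res):
        x = F.interpolate(low_res, scale_factor=2, mode="bilinear", align_corners=False)
        return self.decode(self.encode(x))

model = ToyUpscaler()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(100):
    # Placeholder tensors: in reality these would be aliased in-game frames and
    # offline-rendered, supersampled reference images of the same scenes.
    low_res = torch.rand(4, 3, 90, 160)
    reference = torch.rand(4, 3, 180, 320)

    upscaled = model(low_res)
    loss = F.mse_loss(upscaled, reference)  # "differences between the images"

    optimizer.zero_grad()
    loss.backward()                         # feed those differences back for learning
    optimizer.step()

Scale that idea up to a far larger network, real game captures, supersampled references and a supercomputer’s worth of iterations, and you have the gist of what NGX is doing.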
With both DLSS and DLSS 2.0, once the AI network’s training is complete, the NGX supercomputer sends the AI models to your Nvidia RTX graphics card through GeForce Game Ready drivers. From there, your GPU can use its Tensor Cores’ AI power to run DLSS in real time alongside the supported game.
Because DLSS 2.0 is a general approach rather than being trained per game, the quality of the DLSS 2.0 algorithm can also improve over time without a game needing to include updates from Nvidia. The updates reside in the drivers and can impact all games that utilize DLSS 2.0.
This article is part of the Tom’s Hardware Glossary.
(Pocket-lint) – When it comes to compact phones with plenty of power, there aren’t a huge number of choices in the Android space. Sony has long operated in this area, offering a compact version, with the Xperia 5 III being the latest model in this range.
Asus has joined the fray with the Zenfone 8, taking its phones in a different direction and wanting to offer a compact powerhouse of its own. Here’s how the two phones compare.
Design
Zenfone 8: 148 x 68.5 x 8.9mm, 169g
Xperia 5 III: 157 x 68 x 8.2mm, 168g
Sony’s Xperia 5 III will look familiar, because it follows similar design lines to previous models, most notably defined by the 21:9 display, meaning it’s a tall handset. Well, tall for something that’s compact.
It’s almost 1cm taller than the Zenfone 8, while these phones are otherwise a similar width, so they are equally easy to grip. Asus has the advantage in that you’re more likely to be able to reach the top of the phone, but Sony Mobile’s counter-argument would be that it’s offering you more screen space without increasing the width, an argument that has merit.
Sony has a flatter design, with Asus using curves to the rear of the phone; we think Sony’s device looks more interesting, but that comes down to personal preference. Both have IP65/68 water protection which is a real benefit, but Asus uses Gorilla Glass Victus while Sony has Gorilla Glass 6 – so Asus’ device might have greater scratch resistance.
Both come in at the same weight, but Sony’s phone is a little slimmer.
Display
Zenfone 8: 5.9-inch, AMOLED, 2400 x 1080, HDR, 120Hz
Xperia 5 III: 6.1-inch, OLED, 2520 x 1080, HDR, 120Hz
Both these phones feature an AMOLED display and both have Full HD+ resolutions, but the Sony phone is taller, so it offers 6.1 inches of screen space compared to 5.9 inches on the Zenfone.
The aspect ratio is the big difference, with the 21:9 aspect on the Sony device making it a little more distinct. Sony also packs in slightly more pixels, with a pixel density of 449ppi compared to the Zenfone’s 446ppi, which is essentially the same.
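For what it’s worth, those pixel density figures fall straight out of the resolutions and screen sizes listed above: divide the diagonal pixel count by the diagonal size in inches. A quick sketch of our own, purely for illustration:

import math

def ppi(width_px, height_px, diagonal_inches):
    # Pixels per inch: diagonal resolution divided by the diagonal size.
    return math.hypot(width_px, height_px) / diagonal_inches

print(round(ppi(2400, 1080, 5.9)))  # Zenfone 8    -> 446
print(round(ppi(2520, 1080, 6.1)))  # Xperia 5 III -> 449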
Both phones support HDR, both also claim really accurate displays and both offer 120Hz refresh rates. There’s not going to be much to pick technically between these displays – again, it’s whether you want that taller Sony screen.
Hardware
Both these phones offer the same core hardware, with the Qualcomm Snapdragon 888 5G delivering the latest power. Both start at 8GB RAM, but Asus offers up to 16GB (depending on the region). That means both will offer 5G – and the performance of these phones should be similar. In our testing, we’ve noted that the Zenfone 8 gets pretty hot when gaming – we’ve not been able to test the Xperia 5 III yet, but this wasn’t a problem we encountered on the Xperia 5 II.
Both also come with 128 or 256GB options, but the Xperia 5 III offers microSD expansion up to 1TB, so might be the better device for those who crave physical storage.
Sony has the advantage when it comes to battery capacity, with a 4500mAh battery and 30W charging. The Asus comes in with a 4000mAh battery and 30W charging, so it’s likely that Sony will offer slightly longer endurance – but Sony also offers wireless charging which Asus doesn’t.
Both phones have a 3.5mm headphone socket and stereo speakers.
Camera
Zenfone 8: Double rear camera
Main: 64MP, 1/1.7in, f/1.8, 0.8μm
Ultra-wide: 12MP, 1/2.55in, f/2.2, 1.4μm
Selfie: 12MP, 1/2.93in, f/2.2, 1.22μm
Xperia 5 III: Triple rear camera
Main: 12MP, 1/2.6in, f/2.2
Ultra wide: 12MP, 1/1.7in, f/1.7
Telephoto: 12MP, 1/2.9in, f/2.3-f/2.8
Selfie: 8MP, 1/4in, f/2.0
Wading through the mass of camera specs, the big difference is that Asus offers 8K video recording on the Zenfone 8, thanks to that 64-megapixel main sensor, while Sony manages to offer a whole additional camera – and it’s a periscope-type telephoto, offering lossless zoom at 3x and 4.4x thanks to the variable focal length in the lens.
That gives Sony an immediate advantage here: it’s offering a wider range of cameras and lenses – and although we’ve not seen the performance from that camera, just offering an optical telephoto will deliver images that Asus won’t be able to match on quality.
Asus’ play comes from video, promising 8K video which Sony can’t match. Both offer 4K at up to 120fps for slow motion, while Sony also offers HDR video capture at 4K.
From the spec sheet it’s impossible to judge the performance of the cameras, with Asus putting in a good showing from what we’ve seen from it so far. But Sony is likely to emerge as the favoured model because of the additional zoom.
Price and availability
Zenfone 8: from £599/€599
Xperia 5 III: TBC
The prices aren’t known for all models, but the Zenfone 8 will start from £599/€599, which is likely to be cheaper than Sony – who hasn’t confirmed the pricing of the Xperia 5 III. The 16GB/256GB version will cost £699.
The Sony phone will be available in summer 2021, and the Zenfone 8 will be available in May 2021.
Conclusion
Both these phones sit in the compact phone space and share a lot in common: both have similar core hardware and the same power, and both are likely to offer a similar experience from Android 11, so in normal day-to-day use there’s probably little between them.
Both come well packaged, with the Sony the more interesting phone to look at (although you may disagree), but the Zenfone 8 is shorter, so some might prefer it from a usability point of view. Technically the displays are closely matched, the only real difference being the aspect – with Sony’s 21:9 being more distinctive, but leading to a taller phone.
Sony is expected to have the longer battery life thanks to the physically larger cell, while it also packs in a variable focal length periscope zoom on the rear, so will offer a range of photography choices that the Zenfone 8 can’t match – and that’s likely to come at a cost, with Sony expected to have a higher price.
As a daily driver, the Zenfone 8 looks like a great choice for those wanting something compact and not too expensive – but Sony’s display might be preferred by those who want to watch more movies or play more games.
(Pocket-lint) – Gaming phones have become something of a fixture in the Android space; while many flagship devices push their gaming prowess, for a select few, gaming is their raison d’être, their everything.
The ROG Phone is one such device, pushing Asus’ Republic of Gamers brand and weaving into that the experience Asus has gained from its regular phones. And in the fourth generation of this phone, Asus is more ambitious than ever.
Here’s why the Asus ROG Phone 5 is not only a great gaming phone, it’s a great phone outside of that too.
Design & Build
Dimensions: 173 x 77 x 9.9mm / Weight: 239g
Under-display optical fingerprint scanner
3.5mm headphone jack
ROG Vision rear display
Gaming phones often show their colours when it comes to the design. Aside from being large – which the ROG Phone 5 definitely is – you’ll often find more overt graphics and emotive finishes rather than just being a safe black or grey.
The ROG Phone 5 doesn’t go to an extreme though: from the front it just looks like a normal phone. Flip it over and you’re treated to subtle design touches etched into the rear glass, which also give some indication of where the touch points are for the AirTriggers (which Asus describes as “ultrasonic sensor zones that can be customised to perform different functions, such as reproducing actions in specific games and launching specific apps”; we touch on these in more detail in the last section of this review).
The thing that gives the game away is the ROG Vision display on the rear of the phone. There are two different versions of the display, with a dot display on the regular ROG Phone models and a slightly smaller but more sophisticated display panel on the Pro and Ultimate models – the Pro is shown in this review.
ROG Phone 5 comes in regular, Pro and Ultimate editions
That blows the subtlety out of the water, allowing you to have RGB illumination on the back of the phone – with the Pro and Ultimate models offering a wider range of graphics and animations – all of which can be controlled through the Armoury Crate app on the phone, just like Asus PC components.
That control includes turning the Vision display off if you don’t want it – but you’ll soon forget it’s there until people mention it. It’s on the back of the phone and it’s rare to be looking at the back of the phone when you’re doing something, so let’s not dwell on it.
There are a couple of other quirks around the body: The USB-C on the base of the phone is offset to one side rather than central (and we don’t know exactly why), while there’s a secondary USB-C on the side of the phone. This secondary USB sits alongside the contact point to power the AeroActive Cooler 5 – the clip-on fan – and both have a rubber seal that presses into the side to keep out dust.
This cover is probably the worst piece of design implementation on the ROG Phone 5. The fact that there are a couple of spares in the box tells you everything you need to know: you’re going to lose this cover, because it’s a separate piece of rubber.
We’ve found it flapping off when pulling the phone from a pocket, and just when handling the device. We’re constantly pushing it back into place and a couple of times we’ve found it missing and then located it in the bottom of a pocket.
An out-of-box experience all phones can learn from
One of the great things about gaming phones is what you get for your money. There are a whole range of phones on offer and none are really expensive compared to flagships from brands like Samsung and Apple. The ROG Phone 5 starts at £799 in the UK – and that’s for a 12GB RAM model with 256GB storage, not the bottom of the range loadout.
But it’s not just about the core device, it’s about the rest of the experience. The ROG Phone 5 is lavishly packaged, and opening it is an event. From the cool comic book graphics inside the box, which flow through into the startup process for the phone, there’s a sense of theatre. It’s a reward for your custom and it’s so much better than just sliding a phone out of a box.
You also get more in the box: the 65W charger that will deliver a fast charge; the case that brings some grip to what is, admittedly, a slippery phone given its massive size; and the clip-on AeroActive Cooler 5 fan, which integrates a kickstand, two physical buttons, and another RGB logo.
Some might baulk at this as more landfill, but some companies will make you pay for the charger – and here you’re getting a powerful charger you can use with your other devices too.
Display
6.78-inch AMOLED panel
Up to 144Hz refresh rate
2448 x 1080 resolution
There’s a 6.78-inch display in the ROG Phone 5. It’s big by any standard, with Asus hanging onto the bezels top and bottom. The top bezel integrates the front-facing camera, so there’s no need for a notch or punch-hole.
It’s also a flat display, all practical design decisions made to give you the best gaming experience, ensuring that you get as much visual space as possible. Given how problematic we found the Xiaomi Mi 11 Ultra’s display, we’re just fine with the ROG Phone 5 going flat.
The ROG Phone 5 models all stick to a Full HD resolution and while devices like the Samsung Galaxy S21 Ultra can technically produce finer detail, generally speaking that makes little difference. We can’t fault the ROG Phone’s display for detail.
It also offers refresh rates up to 144Hz (if you have any games that support that, there’s a full list on the ROG website), with options to select 60 or 120Hz – or Auto, which will pick the refresh rate based on the content.
HDR 10+ is supported to bring pop to the visuals for high dynamic range content, while that AMOLED panel provides rich colour visuals, with the option to tune that to your preferences.
It’s a great display and about the only thing that separates it from the best displays on the market is the peak brightness. It offers 800 nits, which is still bright enough for most, but Samsung’s top-end offerings will outshine this model – most notably when outside in sunny conditions.
Flanking the display top and bottom are dual stereo speakers, while there’s also a 3.5mm headphone socket for those wanting to go wired. The speaker performance is stellar, amongst the best you’ll find on a smartphone. It’s rich and immersive, with substantial bass and volume that means you don’t need headphones to get the most from your content.
Hardware & Performance
Qualcomm Snapdragon 888 platform
8GB-18GB RAM, 128GB-512GB storage
6000mAh battery, dual USB-C 65W wired charging
The fact the ROG Phone 5 houses Qualcomm’s Snapdragon 888 platform makes it especially good value for money – as you’re getting the latest flagship hardware that will embarrass some other phones.
Of course it comes in at different price points, with RAM and storage dictating the price, although not all models will be available in all regions. We actually tested the 16GB/512GB model (the ROG Phone 5 Pro – a model that isn’t planned for the UK; although there’s a 16GB/512GB version of the standard ROG Phone 5, the only difference being the type of display you get on the back of the phone).
The performance is also exemplary. There are a number of elements to this. It’s got that great hardware and, as a result, we’ve found the gaming performance to be outstanding.
This is a phone that eats hours of Call of Duty Mobile or PUBG Mobile, giving solid gameplay, combined with those design elements and some software enhancements that feel like they give you an edge, or at least give you the opportunity to establish new preferences thanks to the bespoke gaming options offered.
We also didn’t find the ROG Phone 5 to get excessively hot under load, despite the option of the clip-on fan.
But the important point about performance is that the ROG Phone 5 also runs fast and smooth outside of gaming. We’ve seen gaming phones that drop the ball when it comes to simple tasks, because of poor software. The ROG Phone 5 is stable, which makes for a great experience.
There’s a huge 6000mAh battery, which is fitting for a phone of this size, again with Asus splitting the battery into two cells to enable 65W wired charging. That makes for really fast charging, with the option to bypass charging – and just have the power used for the system rather than recharging the battery.
Again, this is an option for gamers, so you’re not charging (which produces heat) while also loading the system (which produces heat), a combination that could potentially lead to a drop in performance.
A big battery means big battery life. In regular use the ROG Phone 5 will easily see you through the day and into the next. It’s not a charge every night type of phone. Even with a couple of hours of gaming thrown in – at top brightness and max settings – battery life isn’t a worry. That’s a great position not just for a gaming phone, but any smartphone.
There are power modes available, with X Mode firing up full power to let things rip, and a Dynamic Mode to keep things balanced. You can customise the power modes to suit your preferences with things like network, display, performance, and other controls all selectable.
There’s an under-display fingerprint scanner that’s fast to unlock, while calls come through loud and clear too – with no detected problems with Wi-Fi or 5G connectivity.
Cameras
The camera on any gaming phone is often something of an afterthought: the priority is the gaming experience, so the camera gets less attention. Despite that, Asus is pushing the ROG Phone 5 as having a triple camera system.
The main camera is a 64-megapixel sensor, using pixel combining to produce a 16-megapixel image as standard. You can shoot in full resolution, but you have to dig into the menu to find that option, which no one is ever going to do.
There’s an ultra-wide lens, giving the equivalent of 0.6x, although the quality isn’t great, with visible blurring around the edges if there’s any detail there – but fine for open shots of expansive landscapes.
The final camera is a macro camera, which we’re generally unimpressed by. As on other devices, macro cameras seem to be thrown in to make up the numbers – and that’s what it feels like here too.
So back to the main camera, and the performance is reasonable, producing naturally balanced pictures, although perhaps not getting the most out of scenes and not showing as much pop as other cameras we’ve seen can offer.
Low-light shooting offers that slow exposure so you can watch the image get lighter, which we like – and it will take those shots automatically in low light, which means people will actually use it.
There’s a portrait mode for blurring the background that works well enough, although it seems to soften the background with over-exposure which makes results look a little clumsy.
Portrait works on the front and back cameras and we generally prefer the results without portrait mode – and you can’t adjust the levels of blur after the fact, so it’s worth taking a few photos and figuring out what gives you pleasing results so you can change the settings before you take the picture. The selfie camera is generally good, although images quickly get softer in lower light conditions and aren’t good when it gets dark.
There’s no optical zoom on offer here, although you can pinch-to-zoom from the main camera out to 8x. It’s not an especially elegant system and the results are typical of digital zoom, with quality dropping as you increase the “magnification”.
One of the reasons for the high-resolution sensor – apart from for the benefit of the spec sheet – is to allow 8K video capture, on top of the 4K 60fps option.
The important thing about the camera is that it gets the job done: while other phones will sell themselves on camera features above all else, that’s not really the ethos behind the ROG Phone 5. This phone is all about the power and the gaming experience. So, yes, there are more engaging cameras elsewhere, but at the same time, this Asus will give you perfectly good results in most situations.
Software and custom gaming options
Android 11
Armoury Crate
Custom gaming controls
As we’ve said previously, the software on the ROG Phone 5 runs smooth and fast. We’ve experienced no problems with the tweaks and changes that Asus has made over Google’s Android operating system, and it’s easy to swing in with Google versions of apps rather than supplied alternatives.
It’s running Android 11 too, so the latest version of Google’s OS – although Asus doesn’t quite have the update record that a company like Samsung now offers, so there’s no telling how long it would be before it moves to Android 12 once that’s released later down the line.
What’s more relevant here is the gaming software and the options that controls. We’ve mentioned Armoury Crate, which will let you control things like the ROG Vision display on the back of the phone, and act as a launch pad for your games.
Within each game you can see how long you’ve spent playing that game, but more usefully you have a record of profiles for that game. You can, for example, restrict background CPU usage when playing a particular game, change the touch performance, turn off background network syncing – all designed to ensure you have the optimal gaming experience.
That you can customise this to each game is great. For something like a shooter where connection and touch matters more, you might want to restrict everything else. For something casual like Pokemon Go, you might be happy to have everything else on your phone happening. It’s freedom to choose, rather than one gaming mode fits all.
Within games you have access to the Game Genie dashboard too, allowing you to perform essential things, like tweak the brightness, turn off alerts or calls, speed up your phone – and block navigation gestures so you don’t accidentally exit the game.
There’s the option to have stats always showing – CPU and GPU usage, battery, temperature, fps – and you can drag these to anywhere on the screen so they are out of the way.
But it’s the AirTriggers that are the biggest differentiator from other phones, giving you a range of touch zones around the body of the phone that you can customise. That includes two physical buttons on the AeroActive Cooler accessory too – which might convince some people to use it, as those buttons feel more positive than the touch areas of the phone’s casing.
The Cooler buttons are great for things like dropshotting in shooters, because you can hit the deck while still firing, and get back to your feet, all without having to touch anything on the screen – which is a real advantage during games.
There are two ultrasonic buttons on the top of the phone, like shoulder buttons, with haptic feedback. These can offer a full range of programmable options – taps, swipes, slides – and they can be divided into two buttons each side, or you can programme and assign a macro to that button for a sequence you might use in a game.
Then there’s motion support, which you can assign to controls in the game – like forward tilt to reload, or whatever you like.
There are also (on the Pro and Ultimate models only) rear touch zones you can use for slide input with your fingers on the back of the phone.
The challenge is how you incorporate all these tools to make things easier for you during games – although setting them up is easy enough and each setup is unique to each game.
Even if you just find one thing that’s useful, then you’re a step ahead. That might be using an additional AirTrigger for an on-screen control you find hard to hit – or that you can then remove from the display so you have less UI in the way of the game.
Verdict
The thing that really hits home about the Asus ROG Phone 5 is that it’s not just a great gaming phone: it’s a great phone full stop.
Yes, you can’t avoid the fact that the majority of phones are now based around the camera experience – and that’s one area that the ROG Phone 5 doesn’t really go to town on. But with a huge battery and display, this is a great media phone in addition to a gaming delight.
For keen gamers, there’s a market of phone choices out there – and the ROG Phone 5 should definitely be high up your shortlist. For everyone else, if you can accept that this Asus is designed for gamers first, it’s still an awful lot of phone for the money.
Also consider
Nubia Red Magic 6
This gaming phone attempts to steal the show with a 165Hz display. Despite being a powerful device that’s good value for money, it does oversell the cameras and also brings with it some software quirks you’ll need to work around.
Read our full review
Writing by Chris Hall. Editing by Mike Lowe.