AMD is partnering with Samsung to provide RDNA 2 graphics technology for an Exynos mobile system-on-chip, potentially giving a boost to GPU performance in flagship Samsung phones. The announcement was made today at Computex Taipei.
There aren’t many details on the chip or which products it’ll be used in, but AMD describes the chip as a “next-generation Exynos SoC,” and says Samsung will provide further information later in 2021. The GPU will use AMD’s RDNA 2 architecture, enabling features like ray tracing and variable rate shading. AMD says it’ll make its way to “flagship mobile devices.”
“The next place you’ll find RDNA 2 will be the high-performance mobile phone market,” AMD CEO Lisa Su said on stage. “AMD has partnered with industry leader Samsung for several years to accelerate graphics innovation in the mobile market, and we’re happy to announce that we’ll bring custom graphics IP to Samsung’s next flagship mobile SoC with ray tracing and variable rate shading capabilities. We’re really looking forward to Samsung providing more details later this year.”
Exynos is the brand name that Samsung uses for its own in-house processors. In the US and certain other markets, Samsung’s flagship Galaxy phones ship with Snapdragon SoCs from Qualcomm, while the rest of the world gets Exynos chips. The Exynos models are generally regarded as slightly less performant than their Qualcomm equivalents, but it was seen as a surprise when Samsung decided to switch to the Snapdragon variant of the Galaxy S20 in its home market of South Korea.
Whether AMD’s mobile solution will provide tangible benefits over Qualcomm’s Adreno GPUs is unknown. But by throwing out buzzwords like ray tracing and lending its latest RDNA 2 architecture, AMD is certainly setting expectations high for future Samsung devices.
Nvidia is unveiling its latest flagship gaming GPU today, the GeForce RTX 3080 Ti. Based on Nvidia’s latest Ampere architecture, the RTX 3080 Ti will succeed the RTX 3080 and promises to deliver 1.5x more performance over the previous generation RTX 2080 Ti. Nvidia is making the RTX 3080 Ti available worldwide on June 3rd, priced from $1,199.
The RTX 3080 Ti looks very much like the RTX 3080, with an identical design and ports. The main difference is a jump in power and VRAM. The RTX 3080 Ti ships with more VRAM than the RTX 3080, with 12GB of GDDR6X in total. This new GPU is essentially as close as you can get to an RTX 3090 on paper, with half the VRAM. The $1,199 price matches what Nvidia charged for the RTX 2080 Ti Founders Edition cards, and it’s $300 less than the giant RTX 3090.
RTX 3080 Ti specs

| | RTX 3090 | RTX 3080 Ti | RTX 3080 |
| --- | --- | --- | --- |
| GPU clusters (SMs) | 82 | 80 | 68 |
| CUDA cores | 10,496 | 10,240 | 8,704 |
| RT cores | 82 | 80 | 68 |
| Tensor cores | 328 | 320 | 272 |
| ROPs | 112 | 112 | 96 |
| Boost clock | 1,695MHz | 1,665MHz | 1,710MHz |
| Memory | 24GB GDDR6X | 12GB GDDR6X | 10GB GDDR6X |
| Memory bus | 384-bit | 384-bit | 320-bit |
| Memory bandwidth | 936 GB/s | 912 GB/s | 760 GB/s |
| TDP | 350W | 350W | 320W |
| Price | $1,499 | $1,199 | $699 |
Opting for the RTX 3080 Ti over the RTX 3090 obviously means giving up 12GB of VRAM, along with what will likely be a small performance advantage, in exchange for that $300 saving. But the RTX 3090 is giant because it has a far bigger cooler, and the RTX 3080 Ti has the same hardware design as the RTX 3080. That may prompt concerns around how hot the RTX 3080 Ti will run, but we’ll have to wait on reviews to find out if it’s really an issue.
Nvidia is also including its Ethereum hash rate limiter on the RTX 3080 Ti, much like newer RTX 3080 and RTX 3070 cards. Nvidia offers a separate Cryptocurrency Mining Processor (CMP) lineup for Ethereum miners instead. Those cards are tuned for mining performance and efficiency, but they’re not designed to handle games.
Elsewhere, the RTX 3080 Ti has the same power requirements as the RTX 3090. You’ll need a 750-watt power supply, and the card can draw up to 350 watts of power. That’s the same as the RTX 3090, but the RTX 3080 draws less at up to 320 watts. Just like the RTX 3080 before it, the 3080 Ti also uses Nvidia’s new 12-pin connector. Nvidia will include an adapter that’s compatible with eight-pin cables.
Nvidia is also launching a second GPU next week, the RTX 3070 Ti. The $599 RTX 3070 Ti will be available on June 10th, and is designed to offer 1.5x more performance than the previous-generation RTX 2070 Super. It will include 8GB of GDDR6X memory.
Both new RTX cards will support all of Nvidia’s ray-tracing, DLSS, and Reflex technologies. More than 50 games now support Deep Learning Super Sampling (DLSS), offering AI-powered performance boosts to games.
While both of Nvidia’s new GPUs will be available this month, actual availability and pricing are obviously going to differ. Everyone has had a hard time getting hold of new RTX 30-series GPUs since their launch last year, and a flagship RTX 3080 Ti and a more affordable RTX 3070 Ti aren’t going to help improve that.
A global chip shortage has pushed GPU prices up, and demand is still incredibly high during ongoing supply constraints. Nvidia has already warned these supply issues will continue throughout 2021, so don’t expect to easily be able to get hold of an RTX 3080 Ti or RTX 3070 Ti any time soon.
Alienware is keen on giving Razer a run for its money when it comes to making a super-thin gaming laptop. Two of the configurations of Alienware’s new X15 flagship model are actually 15.9mm thick, almost the same as Razer’s just-refreshed 15.8mm-thick Blade 15 Advanced. That’s impressively thin, especially considering that Alienware doesn’t usually try to compete in this realm.
What’s also noteworthy is that, despite its thin build, the X15 looks like it will be a capable machine. Alienware is also announcing a bigger and thicker 17-inch X17 laptop that’s even more powerful. We’ll go into detail on both below.
Let’s start with the X15, which will cost $1,999 for the base model, available starting today. Packed into that entry model is Intel’s 11th Gen Core i7-11800H processor (eight cores and a boost clock speed of up to 4.6GHz), 16GB of RAM clocked at 3,200MHz (but not user-upgradeable due to size constraints), 256GB of fast NVMe storage (which is user-upgradeable, with two slots that support either M.2 2230 or 2280-sized SSDs), and Nvidia’s RTX 3060 graphics chip (90W maximum graphics power, and a base clock speed of 1,050MHz and boost clock of 1,402MHz). A 15.6-inch FHD display with a 165Hz refresh rate, 3ms response time, and up to 300 nits of brightness with 100-percent sRGB color gamut support comes standard.
Alienware hasn’t shared pricing for spec increases, but you can load the X15 with up to an Intel Core i9-11900H processor, a 2TB NVMe M.2 SSD (with a maximum 4TB of dual storage supported via RAID 0), and 32GB of RAM. To top it off, you can put in an RTX 3080 graphics card (the 8GB version, with 110W maximum graphics power, a base clock speed of 930MHz and a boost clock speed of 1,365MHz). The display can be upgraded to a 400-nit QHD G-Sync panel with a 240Hz refresh rate, 2ms response time, and 99-percent coverage of the DCI-P3 color gamut. The X15 has an 87Wh battery and includes a 240W “small form factor” adapter. At its lowest weight, the X15 comes in at five pounds, but it goes up to 5.2 pounds depending on the specs.
All of the X15’s ports, aside from a headphone jack and power input, are located on its back. There’s a USB-A 3.2 Gen 1 port, one USB-C 3.2 Gen 2 port, one Thunderbolt 4 port, a microSD card slot, and an HDMI 2.1 port that will allow the X15 to output a 4K signal at up to 120Hz.
If you’re all about getting a 17.3-inch screen, the X17 starts at $2,099 and has similar starting specs. It has a thicker chassis than the X15 at 20.9mm, and it’s heavier, starting at 6.65 pounds. But that extra heft apparently allows for more graphical and processing power, if you’re willing to pay for it. For example, its RTX 3060 card has a higher maximum graphics power of 130W. This pattern holds for the pricier GPU upgrades, too, especially the RTX 3080 (16GB), which can run at up to 165W of max graphics power with a boost clock speed of 1,710MHz. In the processor department, you can go up to an Intel Core i9-11900HK. Additionally, you can spec this one with up to 64GB of XMP RAM clocked at 3,466MHz.
As for the screen, there’s an upgrade option to get a 300-nit FHD G-Sync panel with a 360Hz refresh rate and 1ms response time, but you can go all the way up to a 500-nit 4K display with a 120Hz refresh rate and 4ms response time. Like the X15, the X17 has an 87Wh battery, but whether you get a 240W or 330W power supply will depend on the configuration that you buy.
The X17 has all of the same ports as the X15, along with one extra USB-A port, a Mini DisplayPort jack, and a 2.5G ethernet port (the X15 includes a USB-C to ethernet adapter).
Generally speaking, thinner laptops struggle with heat management. But Alienware claims its quad-fan design moves a lot of air, and in X15 and X17 models with the RTX 3070 or 3080 chips, it touts a new “Element 31” thermal interface material that apparently improves heat transfer from the internals compared to previous Alienware laptops. We’ll have to see how this fares when we try out a review unit. I’m curious how loud they might get in order to stay cool.
If you’re an Alienware enthusiast, be aware that the company’s mainstay graphics amplifier port is missing. We asked Alienware about this, and it provided this statement to The Verge:
Today’s latest flagship desktop graphics cards achieve graphical power beyond what the Alienware Graphics Amplifiers (as well as other external graphics amplifiers) can successfully port back through PCI (and Thunderbolt) connections. For Alienware customers who are already purchasing high-end graphics configurations, the performance improvements from our Alienware Graphics Amplifier would be limited. While improvements would be noticeable, in many cases it wouldn’t be enough to justify purchasing an external amplifier and flagship graphics card. So instead, we are using that additional space to offer extra ports and thermal headroom which provides a better experience for all gamers purchasing this product.
Wrapping up this boatload of specs, the X15 and X17 each have a 720p Windows Hello webcam, and configurations with the RTX 3080 have an illuminated trackpad that can be customized within Alienware’s pre-installed software. These laptops come standard with Alienware’s X-Series keyboard that has per-key lighting, n-key rollover, anti-ghosting, and 1.5mm of key travel. In the X17, you have the option to upgrade to Alienware’s Cherry MX ultra low-profile mechanical switches, which have a longer 1.8mm key travel.
Lastly, both laptops are available in the “Lunar Light” colorway, which is white on the outside shell and black on the inside.
AMD introduced its new Radeon RX 6000M-series laptop graphics at Computex, during a keynote by AMD’s CEO, Dr. Lisa Su. The new mobile graphics lineup is made up of the top-end AMD Radeon RX 6800M, a mid-range RX 6700M and the entry level RX 6600M. For now at least, the GPUs are being paired in systems from laptop vendors with AMD’s Ryzen processors for what the company calls “AMD Advantage.”
These are the first laptop GPUs from AMD that use its RDNA 2 architecture, with Infinity Cache for higher memory bandwidth, low power consumption (AMD claims near 0 watts at idle) and high frequencies even when the system is running at low power. The company is claiming up to 1.5 times performance over last-gen RDNA graphics and up to 43% lower power consumption.
| | AMD Radeon RX 6800M | AMD Radeon RX 6700M | AMD Radeon RX 6600M |
| --- | --- | --- | --- |
| Compute units | 40 | 36 | 28 |
| Game clock | 2,300 MHz | 2,300 MHz | 2,177 MHz |
| Memory | 12GB GDDR6 | 10GB GDDR6 | 8GB GDDR6 |
| Infinity Cache | 96MB | 80MB | 32MB |
| AMD Smart Access Memory | Yes | Yes | Yes |
| AMD SmartShift | Yes | Yes | Yes |
| Power target | 145W and above | Up to 135W | Up to 100W |
| Resolution target | 1440p | 1440p/1080p | 1080p |
The most powerful of the new bunch is the AMD Radeon RX 6800M, which will be available starting June 1 in the Asus ROG Strix G15 Advantage Edition. It has 40 compute units and ray accelerators, along with a 2,300 MHz game clock, 12GB of GDDR6 memory and a 96MB cache. It will also be compatible with AMD SmartShift and Smart Access Memory.
AMD compared the ROG Strix G15 with the RX 6800M and a Ryzen 9 5900HX to a 2019 MSI Raider GE63 with a 9th Gen Intel Core i7 processor and an RTX 2070, claiming up to 1.4 times more frames per second at 1440p max settings in Assassin’s Creed Valhalla and Cyberpunk 2077, 1.5 times the performance in Dirt 5 and 1.7x more frames while playing Resident Evil: Village.
In closer comparisons, to an RTX 3070 (8GB) and RTX 3080 (8GB), AMD claimed its flagship GPU was typically the top performer – within a frame or so – in several of those games, as well as Borderlands 3 and Call of Duty: Black Ops Cold War, though it’s unclear which settings and resolutions were used for these tests.
Unlike Nvidia, AMD isn’t aiming for 4K gaming. The most powerful of the cards, the RX 6800M, aims for a power target of 145W and above and is designed for 1440p.
The middle-tier AMD Radeon RX 6700M is designed for 1440p or 1080p gaming, depending on the title. It has 36 compute units with a 2,300 MHz game clock, 10GB of GDDR6 RAM and an 80MB Infinity Cache, as well as the same support for SmartShift and SAM. AMD says these will ship in laptops “soon.” It also said that the GPU will allow for 100 fps gaming at 1440p and high settings in “popular games,” though it didn’t specify which games it was referring to.
The RX 6600M sits at the bottom of the stack for gaming at 1080p. AMD compared it to an RTX 3060 (6GB) on 1080p max settings, and found that it led in Assassin’s Creed Valhalla, Borderlands 3 and Dirt 5. It was five frames behind in Call of Duty: Black Ops Cold War in AMD’s tests, and there was a one-frame difference playing Cyberpunk 2077. Like the RX 6800M, the 6600M will start shipping on June 1.
AMD Advantage Laptops
AMD is now referring to laptops with both AMD processors and graphics as offering the “AMD Advantage.” The company says these designs should offer great performance because of power sharing between the CPU and GPU.
AMD says its technologies can achieve up to 11% better performance in Borderlands 3, 10% in Wolfenstein: Youngblood, 7% in Cyberpunk 2077 and 6% in Godfall.
Additionally, the company says AMD Advantage laptops will only have “premium” displays — either IPS or OLED, but no VA or TN panels. They should hit or surpass 300 nits of brightness, hit 144 Hz or higher and use AMD FreeSync.
Each laptop should come with a PCIe NVMe Gen 3 SSD, keep the WASD keys below 40 degrees Celsius while gaming and allow for ten hours of video on battery. (AMD tested this with local video, not streaming.)
The first of these laptops is the Asus ROG Strix G15, with up to a Ryzen 9 5900HX and Radeon RX 6800M, a 15-inch display (either FHD at 300 Hz or WQHD at 165 Hz) with FreeSync Premium, liquid metal for cooling both the CPU and GPU along with a vapor chamber. It will launch in mid-June.
The HP Omen 16 will also come with a 165 Hz display, with up to a Ryzen 9 5900HX and an AMD Radeon RX 6600M for 1080p gaming. It will launch in June on JD.com, then become available worldwide.
In June, we should see more releases from HP, Asus, MSI and Lenovo.
AMD has announced its long-awaited Radeon RX 6000M series of mobile GPUs, featuring its RDNA 2 architecture.
Today’s release consists of three chips: the RX 6800M (configurable at 145W and above), the RX 6700M (up to 135W), and the RX 6600M (up to 100W). AMD says the flagship 6800M delivers the fastest AMD graphics for laptops yet; it claims the 6800M will run modern AAA games at frame rates that are comparable to or better than those of Nvidia’s mobile RTX 3080. It’s also purported to outperform Nvidia’s chip while gaming on battery.
AMD says the RX 6700M will deliver up to 100fps “in popular games” at 1440p resolution. The 6600M is better for “epic 1080p gaming.” Keep an eye out for independent reviews of these chips in the coming weeks for a better idea of the performance you can expect from each one.
The 6000M series will be available starting on June 1st.
Radeon RX 6000M series

| GPU | Power target | Compute units / ray accelerators | Game clock (MHz) | Memory (GDDR6) | Infinity cache |
| --- | --- | --- | --- | --- | --- |
| Radeon RX 6800M | 145W and above | 40 | 2,300 | 12GB | 96MB |
| Radeon RX 6700M | Up to 135W | 36 | 2,300 | 10GB | 80MB |
| Radeon RX 6600M | Up to 100W | 28 | 2,177 | 8GB | 32MB |
AMD also announced AMD Advantage, a new “design framework initiative” meant to encourage OEMs to include certain features on their AMD-powered systems, and to indicate to consumers which Ryzen- and Radeon-powered laptops AMD thinks are the best. It appears to be a similar idea to Intel’s Evo program, but it’s just for gaming laptops, and the standards look much more stringent. AMD Advantage laptops are expected to include the following (summed up as a simple spec check after the list):
AMD Ryzen 5000 mobile processors, Radeon 6000 graphics and Radeon software
Support for AMD’s Smart Access Memory and SmartShift technology
A display that reaches at least 300 nits of brightness, covers either 100 percent of the sRGB gamut or 72 percent of the NTSC gamut, has at least a 144Hz refresh rate and low latency, and supports AMD Freesync
At least one NVMe PCIe Gen 3 SSD
The ability to maintain a surface temperature under 40 degrees Celsius on the WASD keys
Over 10 hours of video playback on battery
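As promised above, here is that checklist expressed as a simple spec check. This is an illustrative sketch only: the field names are hypothetical and it is not an official AMD tool or test.

```python
# Illustrative sketch only, not an official AMD tool: the AMD Advantage
# criteria listed above expressed as a simple spec check. Field names are
# hypothetical.
def meets_amd_advantage(spec: dict) -> bool:
    gamut_ok = spec.get("srgb_pct", 0) >= 100 or spec.get("ntsc_pct", 0) >= 72
    return all([
        spec.get("ryzen_5000", False),             # Ryzen 5000 mobile CPU
        spec.get("radeon_6000", False),            # Radeon RX 6000 GPU + Radeon software
        spec.get("smart_access_memory", False),
        spec.get("smartshift", False),
        spec.get("brightness_nits", 0) >= 300,
        gamut_ok,
        spec.get("refresh_hz", 0) >= 144,
        spec.get("freesync", False),
        spec.get("nvme_pcie_gen", 0) >= 3,
        spec.get("wasd_temp_c", 999) <= 40,        # surface temp while gaming
        spec.get("video_playback_hours", 0) > 10,  # local video, per AMD's test
    ])

# Hypothetical example spec sheet:
print(meets_amd_advantage({
    "ryzen_5000": True, "radeon_6000": True, "smart_access_memory": True,
    "smartshift": True, "brightness_nits": 300, "srgb_pct": 100,
    "refresh_hz": 165, "freesync": True, "nvme_pcie_gen": 3,
    "wasd_temp_c": 38, "video_playback_hours": 10.5,
}))  # True
```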
It’s unclear how many laptops will actually meet all of these standards. Forty degrees Celsius is roughly as hot as the center of a keyboard commonly gets, so that target seems reachable. But there aren’t too many gaming rigs that reliably break 10 hours of video playback on battery, and plenty of the best gaming laptops out there max out below 300 nits of brightness. That said, all kinds of Intel Evo-certified laptops also don’t meet all the Evo requirements in my testing — units and methodologies can vary.
The first AMD Advantage laptop to be announced is Asus’ new ROG Strix G15. This can be configured with up to a Ryzen 9 5900HX, a Radeon RX 6800M, and a 15-inch WQHD 165Hz display with 3ms response time. The G15 will be available at Best Buy in June.
Intel kicked off Computex 2021 by adding two new flagship 11th-Gen Tiger Lake U-series chips to its stable, including a new Core i7 model that’s the first laptop chip for the thin-and-light segment that boasts a 5.0 GHz boost speed. As you would expect, Intel also provided plenty of benchmarks to show off its latest silicon.
Intel also teased its upcoming Beast Canyon NUCs that are the first to accept full-size graphics cards, making them more akin to a small form factor PC than a NUC. These new machines will come with Tiger Lake processors. Additionally, the company shared a few details around its 5G Solution 5000, its new 5G silicon for Always Connected PCs that it developed in partnership with MediaTek and Fibocom. Let’s jump right in.
Intel 11th-Gen Tiger Lake U-Series Core i7-1195G7 and i5-1155G7
Intel’s two new U-series Tiger Lake chips, the Core i7-1195G7 and Core i5-1155G7, slot in as the new flagships for the Core i7 and Core i5 families. These two processors are UP3 models, meaning they operate in the 12-28W TDP range. These two new chips come with all the standard features of the Tiger Lake family, like the 10nm SuperFin process, Willow Cove cores, the Iris Xe graphics engine, and support for LPDDR4x-4266, PCIe 4.0, Thunderbolt 4 and Wi-Fi 6/6E.
Intel expects the full breadth of its Tiger Lake portfolio to span 250 designs by the holidays from the usual suspects, like Lenovo, MSI, Acer and ASUS, with 60 of those designs using the new 1195G7 and 1155G7 chips.
Intel Tiger Lake UP3 Processors

| Processor | Cores / Threads | Graphics (EUs) | Operating Range (W) | Base Clock (GHz) | Single-Core Turbo (GHz) | Max All-Core Turbo (GHz) | Cache (MB) | Graphics Max Freq (GHz) | Memory |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Core i7-1195G7 | 4C / 8T | 96 | 12 – 28W | 2.9 | 5.0 | 4.6 | 12 | 1.40 | DDR4-3200, LPDDR4x-4266 |
| Core i7-1185G7 | 4C / 8T | 96 | 12 – 28W | 3.0 | 4.8 | 4.3 | 12 | 1.35 | DDR4-3200, LPDDR4x-4266 |
| Core i7-1165G7 | 4C / 8T | 96 | 12 – 28W | 2.8 | 4.7 | 4.1 | 12 | 1.30 | DDR4-3200, LPDDR4x-4266 |
| Core i5-1155G7 | 4C / 8T | 80 | 12 – 28W | 2.5 | 4.5 | 4.3 | 8 | 1.35 | DDR4-3200, LPDDR4x-4266 |
| Core i5-1145G7 | 4C / 8T | 80 | 12 – 28W | 2.6 | 4.4 | 4.0 | 8 | 1.30 | DDR4-3200, LPDDR4x-4266 |
| Core i5-1135G7 | 4C / 8T | 80 | 12 – 28W | 2.4 | 4.2 | 3.8 | 8 | 1.30 | DDR4-3200, LPDDR4x-4266 |
| Core i3-1125G4* | 4C / 8T | 48 | 12 – 28W | 2.0 | 3.7 | 3.3 | 8 | 1.25 | DDR4-3200, LPDDR4x-3733 |
The four-core eight-thread Core i7-1195G7 brings the Tiger Lake UP3 chips up to a 5.0 GHz single-core boost, which Intel says is a first for the thin-and-light segment. Intel has also increased the maximum all-core boost rate up to 4.6 GHz, a 300 MHz improvement.
Intel points to additional tuning for the 10nm SuperFin process and tweaked platform design as driving the higher boost clock rates. Notably, the 1195G7’s base frequency declines by 100 MHz to 2.9 GHz, likely to keep the chip within the 12 to 28W threshold. As with the other G7 models, the chip comes with the Iris Xe graphics engine with 96 EUs, but those units operate at 1.4 GHz, a slight boost over the 1165G7’s 1.35 GHz.
The 1195G7’s 5.0 GHz boost clock rate also comes courtesy of Intel’s Turbo Boost Max Technology 3.0. This boosting tech works in tandem with the operating system scheduler to target the fastest core on the chip (‘favored core’) with single-threaded workloads, thus allowing most single-threaded work to operate 200 MHz faster than we see with the 1185G7. Notably, the new 1195G7 is the only Tiger Lake UP3 model to support this technology.
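For the curious, on Linux the favored-core arrangement typically shows up in sysfs, where the preferred core advertises a slightly higher maximum frequency. Here is a minimal sketch of how you might spot it; whether per-core differences are exposed depends on the platform and kernel, so treat it as illustrative rather than definitive.

```python
# Minimal sketch: look for the "favored core" on Linux by comparing each core's
# advertised maximum frequency in sysfs. Per-core differences only appear on
# platforms/kernels that expose Turbo Boost Max 3.0 (ITMT) information.
from glob import glob

def max_freqs_khz():
    freqs = {}
    for path in glob("/sys/devices/system/cpu/cpu[0-9]*/cpufreq/cpuinfo_max_freq"):
        cpu = path.split("/")[5]  # e.g. "cpu3"
        with open(path) as f:
            freqs[cpu] = int(f.read())
    return freqs

freqs = max_freqs_khz()
if freqs:
    favored = max(freqs, key=freqs.get)
    print(f"Highest-rated core: {favored} at {freqs[favored] / 1e6:.2f} GHz")
```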
Surprisingly, Intel says the 1195G7 will ship in higher volumes than the lower-spec’d Core i7-1185G7. That runs counter to our normal expectations that faster processors fall higher on the binning distribution curve — faster chips are typically harder to produce and thus ship in lower volumes. The 1195G7’s obviously more forgiving binning could be the result of a combination of the lower base frequency, which loosens binning requirements, and the addition of Turbo Boost Max 3.0, which only requires a single physical core to hit the rated boost speed. Typically all cores are required to hit the boost clock speed, which makes binning more challenging.
The four-core eight-thread Core i5-1155G7 sees more modest improvements over its predecessor, with boost clocks jumping an additional 100 MHz to 4.5 GHz, and all-core clock rates improving by 300 MHz to 4.3 GHz. We also see the same 100 MHz decline in base clocks that we see with the 1195G7. This chip comes with the Iris Xe graphics engine with 80 EUs that operate at 1.35 GHz.
Intel’s Tiger Lake Core i7-1195G7 Gaming Benchmarks
Intel shared its own gaming benchmarks for the Core i7-1195G7, but as with all vendor-provided benchmarks, you should view them with skepticism. Intel didn’t share benchmarks for the new Core i5 model.
Intel put its Core i7-1195G7 up against the AMD Ryzen 7 5800U, but the chart lists an important caveat here — Intel’s system operates between 28 and 35W during these benchmarks, while AMD’s system runs at 15 to 25W. Intel conducted these tests on the integrated graphics for both chips, so we’re looking at Iris Xe with 96 EUs versus AMD’s Vega architecture with eight CUs.
Naturally, Intel’s higher power consumption leads to higher performance, thus giving the company the lead across a broad spate of triple-A 1080p games. However, this extra performance comes at the cost of higher power consumption and thus more heat generation. Intel also tested using its Reference Validation Platform with unknown cooling capabilities (we assume they are virtually unlimited) while testing the Ryzen 7 5800U in the HP Probook 455.
Intel also provided benchmarks with DirectX 12 Ultimate’s new Sampler Feedback feature. This new DX12 feature reduces memory usage while boosting performance, but it requires GPU hardware-based support in tandem with specific game engine optimizations. That means this new feature will not be widely available in leading triple-A titles for quite some time.
Intel was keen to point out that its Xe graphics architecture supports the feature, whereas AMD’s Vega graphics engine does not. UL has a new 3DMark Sampler Feedback benchmark under development, and Intel used the test release candidate to show that Iris Xe graphics offers up to 2.34X the performance of AMD’s Vega graphics with the feature enabled.
Intel’s Tiger Lake Core i7-1195G7 Application Benchmarks
Here we can see Intel’s benchmarks for applications, too, but the same rules apply — we’ll need to see these benchmarks in our own test suite before we’re ready to claim any victors. Again, you’ll notice that Intel’s system operates at a much higher 28 to 35W power range on a validation platform while AMD’s system sips 15 to 25W in the HP Probook 455 G8.
As we’ve noticed lately, Intel now restricts its application benchmarks to features that it alone supports at the hardware level. That includes AVX-512 based benchmarks that leverage the company’s DL Boost suite, which has extremely limited software support.
Intel’s benchmarks paint convincing wins across the board. However, be aware that the AI-accelerated workloads on the right side of the chart aren’t indicative of what you’ll see with the majority of productivity software. At least not yet. For now, unless you use these specific pieces of software very frequently in these specific tasks, these benchmarks aren’t very representative of the overall performance deltas you can expect in most software.
In contrast, the Intel QSV benchmarks do have some value. Intel’s Quick Sync Video is broadly supported, and the Iris Xe graphics engine supports hardware-accelerated 10-bit video encoding. That’s a feature that Intel rightly points out isn’t supported on Nvidia’s MX-series GPUs, either.
Intel’s support for hardware-accelerated 10-bit encoding does yield impressive results, at least in its benchmarks, showing a drastic ~8X reduction in a Handbrake 4K 10-bit HEVC to 1080P HEVC transcode. Again, bear in mind that this is with the Intel chip running at a much higher power level. Intel also shared a chart highlighting its broad support for various encoding/decoding options that AMD doesn’t support.
Intel Beast Canyon NUC
Intel briefly showed off its upcoming Beast Canyon NUC that will sport 65W H-Series Tiger Lake processors and be the first NUC to support full-length graphics cards (up to 12 inches long).
The eight-litre Beast Canyon certainly looks more like a small form factor system than what we would expect from the traditional definition of a NUC, and as you would expect, it comes bearing the Intel skull logo. Intel’s Chief Performance Strategist Ryan Shrout divulged that the system will come with an internal power supply. Given the size of the unit, that means there will likely be power restrictions for the GPU. We also know the system uses standard air cooling.
Intel is certainly finding plenty of new uses for its Tiger Lake silicon. The company recently listed new 10nm Tiger Lake chips for desktop PCs, including a 65W Core i9-11900KB and Core i7-11700KB, and told us that these chips would debut in small form factor enthusiast systems. Given that Intel specifically lists the H-series processors for Beast Canyon, it doesn’t appear these chips will come in the latest NUC. We’ll learn more about Beast Canyon as it works its way to release later this year.
Intel sold its modem business to Apple back in 2019, leaving a gap in its Always Connected PC (ACPC) initiative. In the interim, Intel has worked with MediaTek to design and certify new 5G modems with carriers around the world. The M.2 modules are ultimately produced by Fibocom. The resulting Intel 5G Solution 5000 is a 5G M.2 device that delivers up to five times the speed of the company’s Gigabit LTE solutions. The solution is compatible with both Tiger and Alder Lake platforms.
Intel claims that it leads the ACPC space with three out of four ACPCs shipping with LTE (more than five million units thus far). Intel’s 5G Solution 5000 is designed to extend that to the 5G arena with six designs from three OEMs (Acer, ASUS and HP) coming to market in 2021. The company says it will ramp to more than 30 designs next year.
Intel says that while it will not be the first to come to market with a 5G PC solution, it will be the first to deliver them in volume, but we’ll have to see how that plays out in the face of continued supply disruptions due to the pandemic.
Xiaomi has shown off its latest fast charging tech demo, and consequently is claiming the new world records for both wired and wireless charging speeds. Using a modified Mi 11 Pro with a 4,000mAh battery, Xiaomi says it’s able to fully charge the phone in 8 minutes over a 200W wired “HyperCharge” system, or in 15 minutes with 120W wireless charging.
Charging speeds are a frequent battleground for Chinese smartphone companies, who often release demonstrations of breakthroughs that may or may not show up in final products. Two years ago, for example, Xiaomi announced a 100W system that could charge a 4,000mAh battery in 17 minutes, while last year’s Mi 10 Ultra filled up in 23 minutes at 120W — though it did have a bigger 4,500mAh battery.
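For a rough sense of what those figures imply, here is a back-of-the-envelope sketch of the average power flowing into the cell and the effective charge rate. The 3.85V nominal cell voltage is an assumption for the example, and charger-to-cell losses are ignored, so real wall power is higher than the numbers printed.

```python
# Back-of-the-envelope charge rates implied by the demos above.
# The nominal cell voltage is an illustrative assumption; charger-to-cell
# losses are ignored.
NOMINAL_V = 3.85  # assumed nominal Li-ion cell voltage

def implied_rates(capacity_mah, minutes):
    capacity_wh = capacity_mah / 1000 * NOMINAL_V
    avg_watts_into_cell = capacity_wh / (minutes / 60)
    c_rate = 60 / minutes  # full charges per hour
    return capacity_wh, avg_watts_into_cell, c_rate

for label, mah, mins in [("Xiaomi 200W demo", 4000, 8),
                         ("Xiaomi 100W demo (2019)", 4000, 17),
                         ("Mi 10 Ultra 120W", 4500, 23)]:
    wh, w, c = implied_rates(mah, mins)
    print(f"{label}: ~{wh:.1f} Wh cell, ~{w:.0f} W average into the cell, ~{c:.1f}C")
```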
Oppo is another leader in this field, with its VOOC technology forming the basis of OnePlus’ Dash and Warp fast charging systems. Last year it demonstrated a 4,000mAh-in-20-minutes 125W system, though that was more than a year after Xiaomi’s supposedly faster 100W announcement, and the excellent current flagship Find X3 Pro charges at “only” 65W.
Progress is always welcome, and the thought of being able to fully charge a phone in eight minutes is certainly appealing. It’s worth noting, though, that these fast-charge systems always require proprietary chargers and cables, so they’ll mostly be used at home rather than in the emergency situations where they might be most useful. Ultimately, any given phone’s ability to make it through the day on its own juice will continue to be the more important factor for a while.
MSI GeForce RTX 3080 Ti Suprim X 12G (Image credit: VideoCardz)
LambdaTek (via momomo_us), a retailer in the United Kingdom, has given us a first look at the pricing for custom GeForce RTX 3080 Ti models. The GeForce RTX 3080 Ti is probably one of Nvidia’s worst-kept secrets in recent years, and the continuing string of information leaves absolutely no doubt that it will contend for a spot on our list of Best Graphics Cards in the coming month or so.
LambdaTek has listed prices starting at $2,000 for custom RTX 3080 Ti models, but there are caveats. The current consensus is that Nvidia could launch the GeForce RTX 3080 Ti with a $999 to $1,099 MSRP to fill in the small gap left by the GeForce RTX 3080 and GeForce RTX 3090. Even if that price is accurate, it’s probably for the Founders Edition, which means custom models will ultimately arrive with a steeper price tag. Furthermore, the graphics card market isn’t exactly in a good place right now, so we doubt retailers will respect Nvidia’s MSRP for the GeForce RTX 3080 Ti.
LambdaTek is a UK retailer, and computer hardware is generally more expensive outside the U.S. Since we don’t know if these are placeholder prices, take them with a grain of salt. For easier comparison, we’ve converted the pricing from pounds to dollars and factored out the UK’s 20% VAT (value-added tax).
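For anyone who wants to reproduce the conversion, here is a quick sketch. The exchange rate is an assumption for the example rather than the exact rate we used, and the listing price shown is hypothetical.

```python
# Rough sketch of the conversion used for the table below: strip the UK's
# 20% VAT from the listed price, then convert pounds to dollars.
# The exchange rate and example listing price are illustrative assumptions.
GBP_TO_USD = 1.42  # assumed rate for the example

def listed_gbp_to_usd(price_gbp_inc_vat, vat_rate=0.20, fx=GBP_TO_USD):
    ex_vat = price_gbp_inc_vat / (1 + vat_rate)  # remove VAT
    return ex_vat * fx

print(round(listed_gbp_to_usd(2162)))  # hypothetical £2,162 listing -> ~2558 USD
```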
Nvidia GeForce RTX 3080 Ti Pricing

| Vendor | Graphics Card | Part Number | Price |
| --- | --- | --- | --- |
| MSI | GeForce RTX 3080 Ti Suprim X 12G | V389-057R | $2,559 |
| Gigabyte | Aorus GeForce RTX 3080 Ti Master | GV-N308TAORUS M-12GD | $2,559 |
| Gigabyte | GeForce RTX 3080 Ti Gaming OC | GV-N308TGAMING OC-12GD | $2,432 |
| Gigabyte | GeForce RTX 3080 Ti Vision OC | GV-N308TVISION OC-12GD | $2,303 |
| Gigabyte | GeForce RTX 3080 Ti Eagle OC | GV-N308TEAGLE OC-12GD | $2,177 |
| Gigabyte | GeForce RTX 3080 Ti Eagle | GV-N308TEAGLE-12GD | $2,177 |
The trend is that all the listed GeForce RTX 3080 Ti graphics cards cost over $2,000 — almost twice the rumored $1,099 MSRP. However, we think LambdaTek’s pricing might be accurate. If anything, these prices look too conservative.
For instance, a custom GeForce RTX 3080 graphics card starts at $2,600 in the U.S. market. Therefore, the Ti version, which is faster, will likely arrive with a significant retailer markup. Sadly, that’s the reality of the graphics card market right now. Stock is limited, and whatever is in stock sells at ridiculous prices.
LambdaTek has the GeForce RTX 3080 Ti Suprim X 12G up for $2,559, which is the most expensive model out of the lot. That’s believable because the Suprim X is MSI’s flagship part and comes with all the bells and whistles.
Gigabyte’s custom GeForce RTX 3080 Ti offerings span from $2,177 to $2,559. The Aorus GeForce RTX 3080 Ti Master is right up there with the GeForce RTX 3080 Ti Suprim X 12G in terms of pricing. Specification-wise, we don’t know which one is faster. The GeForce RTX 3080 Ti Eagle is evidently Gigabyte’s cheapest SKU, and even that costs a whopping $2,177.
Nvidia has already teased a GeForce event that will take place on May 31. The date falls in line with the rumored announcement for the GeForce RTX 3080 Ti and GeForce RTX 3070 Ti.
GeForce RTX 3080 Ti Founders Edition (Image credit: VideoCardz)
VideoCardz has shared alleged images of Nvidia’s upcoming GeForce RTX 3080 Ti Founders Edition graphics card. The new Ampere offering looks like a carbon copy of the already released GeForce RTX 3080.
Nvidia didn’t change much with the GeForce RTX 3080 Ti Founders Edition. It appears that the chipmaker literally only added the “Ti” moniker to the GeForce RTX 3080’s shroud. The graphics card still retains the same dual-slot design, which raises the question of how the GeForce RTX 3080 Ti will dissipate the extra heat.
The GeForce RTX 3080 is rated for 320W, and the GeForce RTX 3080 Ti is rumored to stay within a 350W limit, the same as the flagship GeForce RTX 3090. However, the GeForce RTX 3090 features a triple-slot cooler to keep the heat under control. It’ll be interesting to see how the GeForce RTX 3080 Ti’s cooling system copes with 350W.
The GeForce RTX 3080 Ti will still draw its power from Nvidia’s 12-pin PCIe power connector. It’s more than enough to provide the power that the GeForce RTX 3080 Ti needs. If you’re not a fan of the connector, a simple adapter is enough to solve your problems. As always, custom GeForce RTX 3080 Ti models will stick with the conventional 6-pin and 8-pin PCIe power connectors. In terms of power requirements, the GeForce RTX 3080 Ti should be perfectly fine with a 750W unit on mainstream systems.
By now, it shouldn’t come as a surprise that the GeForce RTX 3080 Ti will potentially offer the same number and type of display outputs as the GeForce RTX 3080 or GeForce RTX 3090. If you’ve forgotten, that mix of outputs includes three DisplayPort 1.4a outputs and one standard HDMI 2.1 port.
Without sounding like a broken record, the GeForce RTX 3080 Ti should perform very close to the GeForce RTX 3090. The graphics card has already appeared in a GPU-Z submission where its specifications were completely exposed. The only missing piece to the puzzle is the pricing, which we may find out on May 31 if Nvidia’s recent GeForce teaser is legit.
There are plenty of buying guides for figuring out the best phone to buy at a given time, across a wide variety of prices. We have two of them, in fact. But if you’re a dyed-in-the-wool phone enthusiast, you don’t need a guide that’s going to give you the best rational recommendation. You need something that’s going to help you scratch your new gadget itch in the most satisfying way.
I’m here for you because, well, I’m one of you. This is the internet’s premier buying guide for phone enthusiasts. We’re not going to focus on practical, rational choices, nor are we going to concern ourselves with budgets. These are the dream phones, the ones you buy not as a utilitarian tool, but for the fun of playing with a new piece of tech that you’ll probably sell or trade in for a loss in six months. I’m not even going to single out a specific model that you should buy, because you’re likely buying a new phone every year (or maybe more!) anyway. This is all about throwing caution to the wind, diving deep into a hobby, and buying something you don’t need but absolutely want. Budgets be damned.
Grab your cargo shorts, we’re going shopping.
1. Samsung Galaxy Z Fold 2 5G
The best folding phone because you know you need one in your collection
Folding phones are the future, right? You can’t call yourself a Real Phone Enthusiast without one in your life. Hands down, the best folding phone you can buy right now is Samsung’s Galaxy Z Fold 2 5G. It’s got everything you might want from a modern smartphone, but it also opens up into a tablet-sized screen that lets you put more than one app side-by-side or look at a giant version of Google Maps. It’s easy to justify because you can tell yourself that you will get SO MUCH WORK done on it, right after you finish watching that YouTube video. Promise.
The Fold 2 costs more than most laptops, you have to baby it, and there’s a really good chance the screen will crack or break on you even if you are careful, but that’s just the price you have to pay to be on the bleeding edge.
2. Apple iPhone 12 Mini
The best iPhone for when you’re double-fisting an Android phone
Look, I know that you know that everyone in the world has an iPhone and it’s the farthest thing from an “interesting” phone. But at the same time, it’s hard to ignore what Apple’s doing, and really, iMessage and the Apple Watch are pretty great. Lots of people carry an iPhone alongside an Android phone, and you could be one of them.
The best iPhone for doing this is the iPhone 12 Mini. It does everything its bigger siblings can do, but it can easily fit in a secondary pocket and isn’t a burden to carry around. The battery life is kinda lousy, but who cares, that’s why you have a second phone on you anyway, right?
3. Asus ROG Phone 5 Ultimate
The best phone for seeing what this “gaming phone” trend is all about
Gaming phones are so hot right now (unless they’ve got a great cooling system) — it seems like a new model is released every three months. You can’t call yourself a true phone enthusiast without being up on this trend, and the best one to dip your toes into it with is the Asus ROG Phone 5 Ultimate.
The ROG Phone 5 Ultimate has a ton of features that can make any mobile gamer swoon. A massive battery. A ridiculously high refresh rate screen. An OLED screen on the back you can customize to show whatever you want. A bunch of accessories to make gaming better. 18GB of RAM! How could you not buy this phone?
Asus ROG Phone 5 — $1,000 at Asus (prices taken at time of publishing). The Asus ROG Phone 5 is the quintessential example of a modern gaming phone. It has over-the-top specs, lots of gaming accessories, and a head-turning design.
4. Samsung Galaxy Note 20 Ultra
The best phone for pretending you’re going to use a stylus
Writing on a phone screen with a pen is so cool! It feels futuristic and is just so natural. At least, it is for the first week until you forget about it and it never leaves the little garage built into the side of the phone again.
If you’ve been telling yourself that little “I’m gonna be a stylus person” lie, you need a phone that supports one, and it’s hard to see buying anything other than Samsung’s Galaxy Note 20 Ultra. It’s got a low-latency S Pen, a bunch of software features that can utilize the stylus, and all of the other bells and whistles of a modern smartphone, which means it works quite well long after you’ve forgotten about the stylus.
5. Google Pixel 4A
The best phone for messing around with the Android 12 beta
Okay, I said I wasn’t going to recommend anything rational in this guide, but this is perhaps the recommendation that makes the most sense: if you want to mess around with the Android 12 beta but don’t want to install it on your main phone, you should just buy a Google Pixel 4A. It’s only $350 (that’s just $20 more than it costs to fix a broken screen on an iPhone 12 Pro Max) and can work with Google’s latest and greatest software even before it’s released to the public.
You know the Android betas are going to be messy — battery life is going to be bad, there will be lots of bugs, certain apps might not work correctly — so you don’t want to put it on a device you actually need to rely on. Once the beta period is over, the Pixel 4A is a great device for experimenting with the aftermarket ROM world. Get a Pixel 4A and flash those ROMs to your heart’s content.
6. Microsoft Surface Duo
The best phone for making people go “Whoa, is that a phone?”
As a phone enthusiast, you already know this harsh truth: the Microsoft Surface Duo is not a good phone. It has an old processor. The camera is worse than any iPhone of the past five years. The battery life is decidedly Not Great. It gets hot doing simple tasks. There are SO MANY software bugs. It’s got a generation-old version of Android. It doesn’t even support wireless charging or NFC payments! Oh yeah, and there’s that questionable build quality to worry about.
But there’s something undeniably cool about the Surface Duo, like it’s a device from the future coming here to bless us in the early 21st century. It’s so thin, it has two screens, the hinge is incredibly neat. Open it up in public and you’re sure to get someone to ask “wow, is that a phone?” which we all know is the ultimate goal here. You can then show them all of its cool features, right after it’s done rebooting itself for the fifth time that day.
7. Apple iPhone 12 Pro Max
The best phone for telling yourself that you don’t need an actual camera
For years now we’ve been told that phone cameras are so good that you don’t need an actual camera. The iPhone 12 Pro Max might be the best example of that yet. It’s got a bigger sensor! It’s got three focal lengths! It can shoot video in Dolby Vision HDR!
At the end of the day, it’s still a phone camera and can’t really hold a candle to the image quality or creative control you get with a larger mirrorless camera. But hey, it’s fun to live in that lie and you can totally see the difference between the 12 Pro Max images and other phones. When you blow them up on a big screen. And zoom in.
8. Oppo Find X3 Pro
The best phone for saying “you can’t get this in the US”
Perhaps the ultimate phone flex is pulling a phone out of your pocket that nobody else is going to have. If you’re in the US, the Oppo Find X3 Pro is that phone. It’s got features you can’t get on any American phone and a design you won’t see everywhere, like a microscope camera and softly rounded camera bump. Sure, it won’t really work great on the cellular networks here, importing it is an expensive hassle, and you won’t have any warranty whatsoever. But just think of the envy on your friends’ faces when you tell them they can’t have this phone.
Oppo Find X3 Pro — $1,178 at Amazon (prices taken at time of publishing). Oppo’s Find X3 Pro is the company’s latest flagship and it’s not sold in the US. It has a unique camera system and head-turning design that you won’t see on American phones.
9. OnePlus 9 Pro
The best phone for when you realize that Pixel phones aren’t great, but you don’t want a Samsung either
Google’s Pixel phones have such great software and then… mediocre everything else. Samsung phones have incredible hardware but are laden with heavy software and actual ads inside of the stock apps. The OnePlus 9 Pro splits that difference — it has software that’s similar to Google’s on hardware that’s virtually a Samsung with a different logo on it.
The 9 Pro is just what the phone enthusiast ordered: a high-end, bells-and-whistles device with All Of The Specs but none of the cruft.
10. Samsung Z Flip 5G
The best weekend phone for when you’re “disconnecting”
Here comes the weekend, with all of its promises of relaxation and enjoyment. You don’t need a phone that’s going to make you more productive, you need something that’s going to slip into your pocket and won’t distract you with a colorful display unless you absolutely need it to.
The Z Flip 5G is this phone. You can flip it closed to ignore it and then pop it open and have a full smartphone inside, complete with every feature you get on non-flippy phones. You’re making a compromise without really making a compromise, because we all know that you had no intention of actually disconnecting for the weekend.
After months of build-up, we finally see a GPU-Z validation (courtesy of Matthew Smith) for Nvidia’s looming GeForce RTX 3080 Ti graphics card. Barring any last-second surprises, the information from the validation entry should be the final specifications for the GeForce RTX 3080 Ti.
If you haven’t been following the rumor mill, the GeForce RTX 3080 Ti utilizes the GA102 silicon. The submission fails to specify the exact die revision, but we expect it to come with the new silicon that has the improved Ethereum anti-mining limiter. With 80 enabled Streaming Multiprocessors (SMs), the Ampere graphics card debuts with 10,240 CUDA cores, 320 Tensor cores and 80 RT cores. The reference clock speeds appear to be 1,365 MHz base and 1,665 MHz boost. With these specifications, the GeForce RTX 3080 Ti pushes out a single-precision performance of up to 34.1 TFLOPs.
For comparison, the GeForce RTX 3080 and GeForce RTX 3090 offer around 29.77 TFLOPs and 35.58 TFLOPs, respectively. If we look at single-precision performance figures alone, the GeForce RTX 3080 Ti is up to 14.5% faster than a GeForce RTX 3080. At the same time, the flagship GeForce RTX 3090 is only 4.3% faster than a GeForce RTX 3080 Ti, which explains the close proximity between the two graphics cards in Geekbench 5.
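To see where those single-precision figures come from, here is a quick worked example using the standard FP32 estimate (CUDA cores x 2 FLOPs per clock for fused multiply-add x boost clock), with the core counts and clocks from the table below.

```python
# FP32 TFLOPS estimate: CUDA cores x 2 FLOPs per clock (FMA) x boost clock (GHz) / 1000.
cards = {
    "RTX 3090":    (10496, 1.695),
    "RTX 3080 Ti": (10240, 1.665),
    "RTX 3080":    (8704,  1.710),
}

def fp32_tflops(cuda_cores, boost_ghz):
    return cuda_cores * 2 * boost_ghz / 1000

for name, specs in cards.items():
    print(f"{name}: {fp32_tflops(*specs):.2f} TFLOPS")  # 35.58, 34.10, 29.77

ti = fp32_tflops(*cards["RTX 3080 Ti"])
print(f"3080 Ti over 3080: +{(ti / fp32_tflops(*cards['RTX 3080']) - 1) * 100:.1f}%")   # ~14.5%
print(f"3090 over 3080 Ti: +{(fp32_tflops(*cards['RTX 3090']) / ti - 1) * 100:.1f}%")   # ~4.3%
```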
Nvidia GeForce RTX 3080 Ti Specifications

| | GeForce RTX 3090 | GeForce RTX 3080 Ti* | GeForce RTX 3080 | GeForce RTX 3070 |
| --- | --- | --- | --- | --- |
| Architecture (GPU) | Ampere (GA102) | Ampere (GA102) | Ampere (GA102) | Ampere (GA104) |
| CUDA Cores / SP | 10,496 | 10,240 | 8,704 | 5,888 |
| RT Cores | 82 | 80 | 68 | 46 |
| Tensor Cores | 328 | 320 | 272 | 184 |
| Texture Units | 328 | 320 | 272 | 184 |
| Base Clock Rate | 1,395 MHz | 1,365 MHz | 1,440 MHz | 1,500 MHz |
| Boost Clock Rate | 1,695 MHz | 1,665 MHz | 1,710 MHz | 1,730 MHz |
| Memory Capacity | 24GB GDDR6X | 12GB GDDR6X | 10GB GDDR6X | 8GB GDDR6 |
| Memory Speed | 19.5 Gbps | 19 Gbps | 19 Gbps | 14 Gbps |
| Memory Bus | 384-bit | 384-bit | 320-bit | 256-bit |
| Memory Bandwidth | 936 GBps | 912.4 GBps | 760 GBps | 448 GBps |
| ROPs | 112 | 112 | 96 | 96 |
| L2 Cache | 6MB | 6MB | 5MB | 4MB |
| TDP | 350W | 350W | 320W | 220W |
| Transistor Count | 28.3 billion | 28.3 billion | 28.3 billion | 17.4 billion |
| Die Size | 628 mm² | 628 mm² | 628 mm² | 392 mm² |
| MSRP | $1,499 | $999 – $1,099 | $699 | $499 |

*Specifications are unconfirmed.
Looking at the memory system, the GeForce RTX 3080 Ti comes with 12GB of GDDR6X memory running at 19 Gbps. The memory operates across a 384-bit memory interface, which works out to a maximum theoretical memory bandwidth of up to 912.4 GBps. That’s 20% more bandwidth than a GeForce RTX 3080 and only 2.5% less than the GeForce RTX 3090.
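The bandwidth math is straightforward: multiply the per-pin data rate by the bus width in bytes. A quick worked example with the figures above:

```python
# Theoretical memory bandwidth = (bus width in bits / 8) * per-pin data rate (Gbps).
def bandwidth_gbps(bus_bits, data_rate_gbps):
    return bus_bits / 8 * data_rate_gbps

print(bandwidth_gbps(384, 19))    # RTX 3080 Ti: ~912 GBps
print(bandwidth_gbps(384, 19.5))  # RTX 3090:     936 GBps
print(bandwidth_gbps(320, 19))    # RTX 3080:     760 GBps
```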
The GPU-Z validation submission doesn’t specify the GeForce RTX 3080 Ti’s TDP (thermal design power) rating. However, there is heavy speculation that the Ampere graphics card could max out at 350W, which is the same thermal limit for the GeForce RTX 3090.
Nvidia most likely produced the GeForce RTX 3080 Ti to cross swords with AMD’s Radeon RX 6900 XT at the $999 price bracket. The GeForce RTX 3090 was already a formidable opponent for the Radeon RX 6900 XT. However, the flagship Ampere part’s $1,499 price tag dissuaded consumers from taking the graphics card into consideration. At a rumored price range between $999 and $1,099, the GeForce RTX 3080 Ti should be a very attractive option. The Ampere offering has yet to prove its worth beside the Radeon RX 6900 XT, though.
Hopefully, we won’t have to wait long to find out if the rumored dates are accurate. The GeForce RTX 3080 Ti may see an official announcement on May 31. Lastly, the GeForce RTX 3080 Ti should be available on the streets on June 4, although the exact pricing remains a mystery.
One of the most beloved Sonic the Hedgehog games is making a return. Sega announced today that a remaster of Sonic Colors — called Sonic Colors: Ultimate — will be coming to the PS4, Xbox One, and Nintendo Switch on September 7th.
The platformer originally launched in 2010, garnering a cult following, and Sega says the new version will feature “stunning visuals, additional features, a new mode, and improved gameplay enhancements.” The game will also be getting a two-part animated tie-in called Rise of the Wisps. The first episode will be out later in the summer.
The announcement was the headlining piece of a Sonic-focused event, which included the reveal of several other games. Sega also announced that the next flagship Sonic title is in the works from Sonic Team — though no real details are available yet, aside from a 2022 release window — as well as a new collection of classic games called Sonic Origins, which will compile the first three games along with Sonic CD. Oh, and the animated Netflix series has a name: Sonic Prime.
Elsewhere, existing games are jumping to new platforms: Sonic Mania and Team Sonic Racing just launched on Amazon’s Luna cloud service; both of those games and Sonic Forces will be on PlayStation Now on June 1st; and Sonic Mania will be available through the Epic Games Store on June 24th.
(Pocket-lint) – The ‘Style Edition’ of Acer’s Predator Triton series returns in a 16-inch format, bringing gaming/creator levels of performance into an altogether more discreet, less flashy clamshell than the ‘gaming norm’.
The Predator Triton 500 SE arrives hot on the heels of the smaller-scale Triton 300 SE becoming available to buy. So if the smaller model doesn’t quite pack enough of a punch then is the larger device worth waiting for – and worth saving up for?
Design & Display
16-inch Mini LED panel
2560 x 1600 resolution (WQXGA)
1600 nits brightness maximum
240Hz refresh rate
16:10 aspect ratio
Built-in fingerprint sensor
Thickness: 19.9mm
DTS:X Ultra audio
The 500 SE is, as its 16-inch diagonal panel would dictate, a larger machine than the original 14-inch 300 SE. Not only that, the 500 SE is a rather more developed device, its screen embodying the latest Mini LED technology for a much brighter experience.
Mini LED – a technology used by some high-end TVs – houses multiple LEDs behind the surface for a more intense brightness, because there are literally more of the illuminators than earlier technologies could cram into place.
In the case of the Predator Triton 500 SE that means a maximum of 1600 nits – which is as bright as you’ll see even flagship mobile phones achieve. It’s better than most high-end OLED tellies, too, so this panel has got the guns to really deliver a strong image to the eyes.
Not only that, it’s a WQXGA resolution, bringing greater sharpness potential to your games, movies and content. All across a 16:10 aspect ratio, which is a little taller than the 16:9 panels of many older laptops and versatile for all kinds of content.
The screen, then, is the Triton 500 SE’s main event, no doubt. But the sell of this laptop is in its design – the idea being that its silvery colour is subtle enough to not scream ‘gaming laptop!‘. The lid has a simple raised Predator symbol logo to the top corner, but no in-your-face text or other logo prints anywhere else to be seen.
However, just as we said of the smaller-scale Style Edition original, the Triton 500 SE’s panel just feels a bit, well, flimsy. There’s too much flex to it; the lid looks and feels too plasticky – when it really shouldn’t at this end of the market.
It’s all pretty discreet, although switch on the RGB lighting under the keyboard and there’s no hiding it. And you only need to look at the large vents to the rear to know that it’s ready to pass a lot of air through for the sake of cooling. Still, at 19.9mm thick, it’s really not that massive for such a device.
11th Gen Intel Core i7 / Core i9 processor options
Nvidia RTX 3070 / 3080 GPU options
Up to 4TB PCIe storage / 64GB RAM
5th Gen AeroBlade fan cooling tech
Turbo button for overclocking
Killer Wi-Fi 6 (AX1650i)
Predator Sense
In terms of power available the Triton 500 SE delivers a lot more than the 300 SE can muster. The 16-inch model packs in 11th Gen Intel Core i7 and Nvidia RTX 3070 for its circa two-grand asking price (£1,999 in the UK). That’s nearer three-grand (£2,999 in the UK) if you opt for the Core i9 and RTX 3080. No small chunk of change, more just a big chunk of awesome power.
All of that obviously requires more cooling than your average laptop, hence those big vents to the back and side of the device. But we’ve found the fans do kick in with little fuss, meaning there’s quite a bit of potential noise. There are additional fan controls within the Predator Sense software – which has its own dedicated activation button – to take extra command, including maxing them out during gaming sessions.
There’s also a dedicated Turbo button, to the top left above the keyboard, that can push overclocking – and that’ll send those fans into a frenzy. The cooling setup is called AeroBlade 3D – now in its fifth generation – a system that uses the fans to pull air in over the hottest-running components (CPU, GPU, RAM) and hold air in chambers to aid with this cooling process.
We’ve not had the time to test this laptop under full pressure, merely see it at a pre-launch Acer event to gauge some of how well it will handle serious tasks. Being a gaming laptop with Intel architecture we wouldn’t assume the battery will last especially long – and you’re going to need it plugged into the wall to get maximum potential anyway – but Acer does claim it can manage up to 12 hours in altogether more work-like conditions.
Interestingly, there are some pretty serious ports built into the design, from the dedicated Ethernet port for best connectivity, to the full-size SD card reader – which is a really rare sight these days on laptops. As for speeds, the USB-C ports are Thunderbolt 4, so there’s certainly no slack there – a bit like the Predator Triton 500 SE’s overall ethos really.
First Impressions
If Acer’s original 14-inch ‘Style Edition’ Predator Triton didn’t quite deliver on scale or power, then the Predator Triton 500 SE is here to up the ante. It’s got a bigger, brighter and meaner screen, plus power options that are far more considerable – but then so is the price tag, so you’ll need to get saving.
The design – pretty much pitched as ‘gaming laptop for the business person’ – is more discreet than your gaming laptop average, but there’s still all the RGB lighting, cooling vents, ports and Turbo overclocking that you could want.
It’s good to see something a bit different to diversify the gaming laptop and creators market. Although, as we said of the original SE model, the 500 SE ought to up its game when it comes to screen sturdiness – especially at this price point.
(Pocket-lint) – When ZTE told us the Axon 30 Ultra 5G was en route for review, we got that fuzzy feeling inside. That’s because the older Axon 20 5G was the first device we’d ever seen with an under-display selfie camera – so surely the Axon 30 Ultra would take this technology to the next level?
Um, nope. Instead the Axon 30 Ultra has a more traditional punch-hole selfie camera front and centre, so that fuzzy feeling quickly dissipated. Without such a ‘magic camera’ on board, what then is the appeal of this flagship?
The Axon 30 Ultra is all about power and affordability. It crams a top-tier Qualcomm Snapdragon 888 processor into a slender body with a 6.67-inch AMOLED display that can push its refresh rate to a class-leading 144Hz. All for just £649 in the UK and $749 in the USA. So is that as exceptional value as it sounds or are there hidden compromises?
Having just moved on from the gigantic Xiaomi Mi 11 Ultra, we found the ZTE’s more slender frame and trim 20:9 aspect ratio a revelation by comparison. It’s not that the Axon 30 Ultra is small, per se, but it’s well balanced for its scale.
The model we have in review is apparently black – that’s what the box says anyway – but the phone’s rear has a much softer metallic appearance about it, with some degree of blue to its colour balance. Really we’d call it a metallic grey. It looks pleasant, while fingerprint smears aren’t a massive problem thanks to the soft-touch material.
The camera unit on the rear is a fairly chunky protrusion, but that’s because there’s a 5x zoom periscope housed within that frame. It’s a relatively elegant block of cameras, though, and even with the phone flat against a desk it doesn’t rock about unduly.
The screen is the big selling point though. It’s a 6.67-inch AMOLED panel, the kind we’ve seen in the Redmi Note 10 Pro, for example, except the ZTE goes all-out when it comes to refresh rate by offering up to 144Hz. You can pick from 60Hz/90Hz/120Hz too, with the option to display the refresh rate in the upper left corner.
Having a faster refresh rate means smoother visuals, especially with moving content. You’re more likely to notice it when scrolling through emails than much else, though, so in balancing refresh rate against battery life we’ve found ourselves settling on 90Hz. A more dynamic software approach would be better, or the option to designate specific apps – especially games – to run at specific refresh rates.
Are you really going to tell the difference between 144Hz and 120Hz? No – it’s a gap of around 1.4ms per frame (roughly 6.9ms versus 8.3ms). But the simple fact that the Axon 30 Ultra can do it at all shows its worth; it’s got more power credentials than many less adept phones at this price point.
Otherwise the screen hits all the right notes. It’s got ample resolution. Colours pop. Blacks are rich thanks to the AMOLED technology. It’s slightly curved to the edges too, but only subtly, to help hide the edge bezel from direct view – and we haven’t found this to adversely affect use through accidental touches and suchlike.
There’s also an under-display fingerprint scanner tucked beneath the screen’s surface, which we’ve found to be suitably responsive for sign-ins. Or you can sign up for face unlock instead to make things even easier.
Having that scanner in such a position, rather than on the power button, leaves the Axon 30 Ultra’s edges rather neat. Other than the on/off button and volume rocker on one side, and the USB-C port, single speaker and SIM tray on the bottom edge, there’s nothing to disrupt the phone’s form, which keeps it looking tidy. It also means no 3.5mm headphone jack, but that’s hardly a surprise.
Performance & Battery
Processor: Qualcomm Snapdragon 888, 8GB/12GB RAM
Storage: 128GB/256GB/1TB, no microSD card slot
Battery: 4600mAh, 66W fast-charging
Software: ZTE MyOS 11 (Android 11)
Elegant looks complement an elegant operation, too, largely down to the power that’s available on tap. With Qualcomm’s Snapdragon 888 processor on board, coupled with 8GB of RAM, there’s little more powerful that you can buy. Indeed, the Axon 30 Ultra is knocking on the door of gaming phone territory given that 144Hz refresh rate screen.
Navigating around the interface is super smooth and speedy, apps open quickly, and there’s no downturn in performance if you happen to open a whole bunch. Games are a breeze, too, as you’d expect from this kind of hardware – although we’d like a game centre to prevent over-screen notifications and suchlike.
But it’s not entirely smooth sailing, on account of ZTE’s own software – here MyOS 11, sat over the top of Google’s Android 11 operating system. It’s a common problem among Chinese makers, so we probably sound like a broken record, but there are definite issues with notifications. WhatsApp might take a couple of hours to notify you of a message, for example, yet there’s never a fixed delay – other times it’s immediate. The Outlook mail app rarely, if ever, notified us of new mail in the inbox either.
A lot of this is down to software management. Because there’s rather a lot of it in MyOS. Under battery settings is an ‘Apps AI-control’, which is said to intelligently manage apps to save power. Except, as we’ve highlighted above, this can stifle some apps inappropriately. It can be turned off for manual control, where individual apps can have their auto-start and background running characteristics specified.
All of this is an attempt to aid overall battery life. Because, as you can imagine, cranking out gaming sessions at 144Hz on the top-end engine of Qualcomm’s SD888 eats away at the supply pretty rapidly. The 4,600mAh cell on board isn’t as capacious as some competitors’ and, as a result, a heavy-use day will only just about scrape through 15 hours. It’ll manage, but only just.
Another oddity we’ve experienced with the Axon 30 Ultra is that Wi-Fi connectivity seems to be a little up and down. With a weaker signal our Zwift Companion app was very choppy in updating its data – something that hasn’t been an issue with other phones we’ve compared in the same environment. We suspect that’s because the ‘a/b/g/n/ac/6e’ designation favours higher frequencies (‘ac’ is 5GHz only, for example, whereas ‘ax’ caters for both 2.4GHz and 5GHz, while the newly adopted ‘6e’, i.e. 6GHz, isn’t widely supported yet).
On the rear the Axon 30 Ultra houses what appear to be four lenses: a 64-megapixel main; a 0.5x ultra-wide (also 64MP); a 5x periscope zoom (just 8MP); and what we would call a ‘portrait lens’ with 2x zoom (also 64MP).
It’s a bit of a mish-mash when it comes to results though. The main camera, at its best, is really great. It snaps into focus quickly and reveals heaps of detail – as you can see from the main flower shot below – but isn’t the most subtle when you look up close, as images are over-sharpened.
Zooming in the camera app is actioned on a slider to the side, but you never really know which lens you’re using – until there’s a clear ‘jump’ from one view to the next, because the 5x periscope zoom, for example, is far poorer in its delivery. It’s only 8 megapixels, for starters, so there’s nowhere near the same clarity in its images. Plus the colour balance looks far out of sync with the main lens. Really this periscope is overoptimistic.
We also can’t quite work out the 2x portrait zoom lens. Sometimes zoom shots are great, sometimes they’re quite the opposite – all mushy and, again, over-sharpened. It seems to depend on which sensor/lens the camera is using at that particular moment: the image of a horse in a field that we captured (within the gallery above) looks fine, whereas the sheep in a field (shown in our wide-to-main-to-zoom-to-periscope gallery, below) is miles off the mark.
There’s potential here overall. The specifications read rather well, but somehow the Axon 30 Ultra gets away from itself a little. It needs to rein in the offering, simplify things, and deliver a more detailed app that explains specifically which kit you’re shooting with. That said, the main lens will please plenty, while close-up macro work – with the artificial intelligence ‘AI’ mode activated – snaps into focus really well.
Verdict
To answer our opening question: what compromises do you have to accept if looking to buy the ZTE Axon 30 Ultra 5G? Relatively few at this price point. There are some irks, though, such as the software causing notification problems (by which we mean absences), the battery being a little stretched, and the cameras falling somewhat short of their potential – despite the main lens being perfectly decent.
Otherwise ZTE has crammed one heck of a lot into the Axon 30 Ultra. Its screen is commendable and having that headline-grabbing 144Hz refresh rate is sure to bring attention. The subtlety of the design is elegant, too, delivering a well-balanced scale that’s comfortable to hold and fairly fingerprint-resistant on the rear. And there’s bundles of power from the top-end Qualcomm Snapdragon 888 platform, ensuring apps and games run a treat.
There might be less ‘wow factor’ than if there were an under-display front-facing camera to captivate prospective customers (like there was in the Axon 20), but given that the Axon 30 Ultra 5G’s price point undercuts the big-dog Samsung, that’ll be enough of a lure for many.
Also consider
Samsung Galaxy S20 FE
The ‘Fan Edition’ Galaxy might be a year older than the ZTE, but it’s a similar price, has more stable software in our experience – and that makes all the difference to everyday use.
It’s been a couple of years now since Acer overhauled its big-screen Helios 500 desktop replacement rig. But the laptop is back in a 17-inch shell that’s a bit of a departure from the 2018 model. As expected, it brings current top-end 11th Gen Intel/Nvidia components, plus perhaps more RGB than I’ve ever seen on a portable PC. Aside from the per-key RGB keyboard, there are light bars that run along all four edges of the laptop (yes, even the back). And the light show can be set to dynamically react to sound playing from the system’s speakers, or even what’s on the screen.
Acer sent us a pre-production sample in the days before its announcement to get some hands-on time with the new gaming flagship. Sadly, one of the most intriguing options of the new laptop wasn’t included in the sample they shipped. There will be an optional Mini LED 4K panel with a 120 Hz refresh rate and full-array local dimming, which Acer says is “comparable to VESA Display HDR 1000.” If you’re a top-end competitive gamer, the crazy-fast 360 Hz 1080p screen that came with our unit is undoubtedly the better option. But my slow reflexes and HDR-happy eyeballs would love to see what a bright, pixel-dense display with 512 backlight zones looks like on a laptop.
Design of the Acer Predator Helios 500 (2021)
First off, Acer would probably like me to reiterate that what it sent us was a pre-production unit, and some things will change. For instance, the Predator logo on the lid will be RGB-lit, rather than the basic blue seen here. And while the light bar at the back will remain, there are lights in the rear exhaust that will go away (which is good because they’re very bright). Also, as this was a pre-production unit, we were not allowed to test performance or battery life. The laptop is expected to arrive in August, with a starting price of $2,499.
Also note that, as this is a configuration with both a Core i9-11980HK and an Nvidia RTX 3080, the laptop does ship with two large power bricks. You’ll want to keep both plugged in for long gaming sessions, but with the system asleep or while doing basic productivity, one brick was more than enough to keep the laptop charged.
If you’re expecting something approaching a thin-and-light gaming experience, you should look elsewhere (and expect less performance). At 8.59 pounds and approximately 12.6 x 15.75 x 1.75 inches, the Helios 500 is unapologetically a high-end desktop replacement.
And the black metal shell with silver and blue accents backs up the ‘gaming’ look, complemented nicely by the blue-metal heatsinks that can be seen from the rear sides and back of the laptop.
As noted earlier, the stand-out visual element here is the sheer abundance of RGB, in the form of diffused light bars that run along most of the front, about two-thirds of both sides, and nearly the entire back edge. The Helios 500 is a light show in a box, if ever there was one.
And if you’re into light shows, the laptop makes good use of all the RGB here. Aside from being able to choose from the usual number of presets or individually select the color of each key, the company’s PredatorSense software’s Pulsar Lighting tab has an Interactive section, which lets you set the keyboard and bars to react to audio being piped through the laptop. You can choose between four presets for this, and there’s also a Screen Sync feature that tends to mimic what’s on the lower portion of the screen.
The Screen Sync option is a bit crude. For instance, when I went to YouTube and an ad for YouTube Premium popped up in the bottom-left corner, the keys below it turned red and a purplish-white, mirroring the ad. But when watching trailers (and ads) for action movies, the flashing keys echoing gunshots did add something to the effect–I’m just not sure it’s something I like.
Having the lights and keyboard echoing what’s on the screen is distracting when you’re trying to watch a TV show or movie, but it could add to the level of immersion when gaming in the dark. And for music, the light show that the Helios 500 kicks out is almost like a party in and of itself. That said, I was not impressed with the sound output of the laptop, at least for music.
Acer bills the Helios 500 as having “True 5.1 channel surround sound” with a subwoofer, and licensed DTS X software is included for audio tweaking. But out of the box, highs and mids sounded harsh, while lows were minimized and there was little in the way of bass, especially for a laptop that literally says Subwoofer on the bottom. To be fair, this audio may be pre-tweaked for gaming, bringing key strategic elements up in the mix, but it certainly doesn’t make music sound good.
It’s possible that audio will improve with final units, but a few minutes playing with the various DTS presets, first choosing the music preset and then fiddling with the manual EQ, didn’t yield the kind of pleasing sound output I’d like from a laptop this big and expensive.
Unlike the version of the Helios 500 we looked at back in 2018, there are no ports on the back, save for the power connections. The left edge has a pair of tightly packed Thunderbolt/USB-C ports, as well as a USB 3 Type-A port, a full-size HDMI port, and an anachronistic Mini DisplayPort.
The right edge houses separate headphone and mic jacks, two more USB 3 Type-A ports, and a Killer Ethernet E3100G jack. Wi-Fi 6 is also included. Note that Acer has yet to provide us with full specs, so we are unsure of the exact port speeds/specs.
Display on the Acer Predator Helios 500 (2021)
We weren’t able to test the 1080p 360 Hz display (or any other aspect) of our pre-production sample. But elite gamers will no doubt appreciate the speed. And in casual use and gaming, we didn’t find it to be particularly dull or dim. Acer says it will also offer a 2560 x 1440 display option with a 165 Hz refresh rate.
But the most interesting screen option will be the 3840 x 2160 (4K) IPS screen that supports 120 Hz refresh and has a Mini LED matrix, giving it 512 backlight zones, plus what should be high-brightness HDR support. In short, that should be one dynamic display, with bright lights and dark blacks, which should be great for both AAA gaming and movies. But we’ll have to await a final review unit to say for sure. And we’re very curious to hear how much that panel adds to the laptop’s cost.
Keyboard and Touchpad of the Acer Predator Helios 500 (2021)
The input devices on the Acer Predator Helios 500 are nothing if not colorful and roomy. With nearly 16 inches of width to play with, the keys are reasonably large, with lots of separation between them. And aside from the per-key backlighting, there’s also an RGB ring that runs around the touchpad, which is about 4.8 inches on the diagonal.
There’s also plenty of travel, in both the keys and touchpad buttons, likely thanks in part to the fact that Acer clearly wasn’t aiming for slimness here. And the WASD keys, apart from being visually offset in translucent blue, also have a stiffer feel than the surrounding keys, helping your fingers find them in the event that you’re grabbing a snack and suddenly find yourself in the heat of battle.
All that said, I’ve personally never been a huge fan of this kind of flat keycap on a gaming keyboard, and the white edges of the keys, while they help the RGB lighting shine through, look a bit garish in the daylight against the darkness of the rest of the laptop’s design. I really hope more laptop makers shift back to offering mechanical switch options, like Alienware has recently.
Gaming on the Acer Predator Helios 500 (2021)
With the limited time I had with the new Acer Predator Helios 500, I didn’t have the luxury of long nights lost in raids and quests. But I did take the laptop through a few rounds of Doom Eternal and the decidedly less-demanding strategy title, Becastled. At the high frame rates that come with an RTX 3080 and an Intel Core i9 on a 1080p display, the 360 Hz screen was buttery smooth, and I found the roomy keyboard a bit more enjoyable for executing the game’s sometimes-complicated jumps than I did for typing.
And while the cartoonish graphics of Becastled weren’t quite as colorful on the Helios’ display as they are on the Sony HDR TV that I use as my primary monitor, things didn’t exactly look dull either. Again, I’d love to see what these games (and frankly everything else) would look like on the 4K, 120 Hz HDR display option with localized dimming. And I’m curious to hear how much that screen will add to the price of the laptop.
As far as cooling goes, the Acer Predator Helios 500 was far from silent under load, but we wouldn’t expect it to be, given its pairing of top-end components. It also didn’t get overly loud, and the fans didn’t often fluctuate up and down, which can be more noticeable than a higher-decibel but more constant whir. We’d need more time with the laptop (and the ability to test temps) to pass final judgement on the cooling system, but nothing stood out as problematic or bad on that front, not that the laptop was quieter than we’d expect, either.
Final Thoughts
Given that every gaming laptop maker is using the same current core high-end components (though there is more variety these days, with AMD’s CPUs more in the mix), and those parts have the same thermal requirements, there are only so many ways for a high-end gaming laptop to stand out.
Acer makes an attempt here by tossing in extra RGB so that the Predator Helios 500 is as much a light show as it is a gaming powerhouse. It also doesn’t look or feel cheap or flimsy, as we’ve seen from some gaming portables in the past, when the aim was to keep costs as low as possible, or the chassis as slim as can be.
But really this Predator’s most striking trick might just be its Mini LED 4K display with full-array local dimming and 120 Hz refresh. Sadly, we’ll have to wait for a final version to get our eyes on that impressive-sounding display. The 360 Hz 1080p panel in our sample is certainly impressive in its own right, but its wow factor isn’t apparent unless you spend your time with highly competitive esports titles.