The Nvidia RTX 3080 Ti is here, at least for those lucky enough to find one. Announced earlier this week alongside the forthcoming RTX 3070 Ti, the 3080 Ti serves as the pricier successor to the GeForce RTX 3080, an excellent graphics card that made 4K gaming that much more affordable and exemplified just how transformative DLSS technology could be.
Like the RTX 3090, Nvidia’s newest flagship touts impressive 1440p and 4K performance, albeit in a smaller, quieter package with half the VRAM. It certainly isn’t cheap at $1,200, but given the ongoing semiconductor shortage and the outlandish street prices of most GPUs right now, buying the RTX 3080 Ti at MSRP is likely a better deal than succumbing to the resale market.
While we expect availability to be limited at launch and throughout the remainder of the year, several retailers currently have the RTX 3080 Ti in stock. Best Buy announced yesterday that it would give customers a chance to purchase a Founders Edition of the card at select stores today. However, unlike previous years, Best Buy will only offer the GPU in-store. Customers hoping to pick it up at launch will need to line up early and secure a ticket at 7:30AM local time before the card officially goes on sale at 9:00AM local time. If you’re reading this now, chances are good that you’re already too late.
Heading to your local Micro Center might be your only other viable option if you’re hoping to get one in-store. That retailer lists five RTX 3080 Ti models on its site from Gigabyte, Asus, EVGA, and MSI. If you’re looking to purchase the RTX 3080 Ti online, your options are limited. The cards will go on sale at B&H Photo, Newegg, Micro Center, and Amazon; however, we expect stock to sell out fast. You might also have luck getting one online through MSI, PNY, Zotac, or EVGA’s site.
Here are some of the specific models available at the online retailers above (we’ll add more individual listings as they go live at other sites):
If these options fail you, try your luck in a Newegg Shuffle lottery on June 4th at 9AM ET:
Looks like Google TV could soon support different user profiles on the home screen. 9to5Google has dug into the source code of the latest version of the operating system and found mention of personalised home screens, which would offer a much more tailored experience for anyone watching.
Google TV already lets you sign in with multiple Google Accounts, and offers Kids Profiles, which only surface age-appropriate content. With adult profiles, though, the homepage is personalised to the main account no matter who’s watching. That means anyone watching will only see recommendations for the main account holder.
But it looks like that could soon change. Source code for the latest update to the Google TV Home app – version 1.0.370 – contains mentions for individual profiles on the home screen.
The mentions include: “Add another account to this device to have their own personalized Google TV experience”, which seems pretty clear cut.
However, just because this text appears in the code doesn’t mean the feature will definitely make an appearance. Google might just be considering adding it for now, though given how it would enhance the user experience – and bring it in line with lots of other streaming services – we reckon it’s close to a dead cert.
The code also reveals a new tutorial video that would show parents how to hide certain content from kids’ profiles.
Google TV features on the stellar Google Chromecast with Google TV – a dongle that earned five stars in our review. The operating system recently added support for Amazon Music, bolstering its offering even further.
MORE:
Read all about Google TV: apps, features, compatible TVs and more
Check out our guide to the best video streamers
Or go in-depth with one of the best, with our Amazon Fire TV 4K review
HBO Max’s ad-supported streaming option has officially launched on the platform. The new tier will be priced at $10 per month, which shaves $5 off HBO Max’s regular cost of $15 per month. How exactly HBO Max planned to successfully introduce ads on its premium content has been a bit of a head-scratcher, as premium and ad-free content was sort of the entire point of HBO to begin with. But to hear the company justify it, ads on the platform’s content will be both “elegant” and “respectful” of the subject matter.
“Advertising is a time-tested way to reduce the cost of great entertainment and reach a wider audience,” Andy Forssell, EVP and general manager at HBO Max, said in a statement. “We’ve worked hard to create an elegant, tasteful ad experience that is respectful of great storytelling for those users who choose it, and which we’re confident will deliver for our advertising partners as well.”
While significantly cheaper, users who subscribe to this budget tier will be missing out on some of HBO Max’s more premium features. The company introduced 4K streaming to its service last year with the premiere of Wonder Woman 1984, but video quality on the budget tier will be capped at 1080p — much like Netflix does with its cheaper tiers.
Additionally, titles will not be available for offline viewing on the more affordable plan, and users will lose the ability to access same-day movie premieres on the service. Subscribers will eventually be able to view these titles, just in the months after their theatrical release windows end. Everyone on the cheaper with-ads plan will have full access to the service’s originals, however.
WarnerMedia said in a press release that ad time will be capped at a max of four minutes per hour. In addition, ads will not play on HBO originals — a smart move on WarnerMedia’s part if it hopes to maintain its relationships with high-profile talent and creators. The company said that as users continue using the service, “subscribers can expect to see greater personalization in the ads they do see with more innovation in formats to come.” The ad experience kicks off this month with more than 35 brands at launch.
Given that its originals are sort of the entire point of subscribing to the service, and taking into account the fact that these titles will remain free of ads, it’ll be interesting to see how HBO Max’s subscriber numbers change in the months ahead. I, for one, am looking forward to cutting $5 per month off my subscriptions tab now that I know I’ll be able to stream Mare of Easttown in peace.
Sony is giving away a £50/€50 PlayStation Store voucher with select Bravia XR TVs.
The promotion is already up and running in seven European countries: UK, France, Germany, Denmark, Norway, Sweden and Finland. All you need to do is pick out a Bravia XR TV (LED or OLED) at a participating retailer between June 1st and July 31st 2021.
According to the official announcement on Sony’s website, you can redeem the £50/€50 gift card for, “anything on PlayStation Store: games, add-ons, subscriptions and more”.
Not familiar with Bravia XR? The range boasts some of the best TVs in the Sony 2021 TV line-up and features the Japanese giant’s “cognitive intelligence” tech, which aims to optimise every pixel, frame and scene to produce the most lifelike picture possible.
As you’d expect, the Bravia XR range is a decent match for a next-gen console such as the PS5. The presence of HDMI 2.1 with support for 4K@120Hz and Variable Refresh Rate (VRR) should help you max out the PS5’s capabilities.
The XR line-up covers Sony’s top-tier models. The 55-inch A80J starts at £1999/€2299 (around $2800, AU$3600), while the A90J Master Series, the firm’s top 4K OLED for 2021, costs from £2699 ($2800, around AU$3700). Not cheap, but recently we called the 55-inch XR-55A90J “simply one of the best TVs we’ve tested”.
The Bravia XR models also come with free access to Bravia CORE, Sony’s high-bitrate video streaming service, which promises lossless Blu-ray-quality “streaming up to 80Mbps.”
MORE:
Your guide to the Sony 2021 TV line-up
Samsung 2021 TV lineup: everything you need to know
Nintendo is holding an E3 event on June 15th, and the company promises it will be “focused exclusively on Nintendo Switch games mainly releasing in 2021.” The Nintendo Direct presentation will begin at 9AM PT / 12PM ET on June 15th, and Nintendo will hold three hours of gameplay deep dives once the event has concluded.
Nintendo’s wording suggests we won’t be seeing any hardware announcements at the company’s E3 show. A “Switch Pro” has been rumored for months, with recent reports suggesting it may be announced ahead of E3.
Bloomberg has previously reported that this new Switch model will use more powerful silicon from Nvidia that supports DLSS (Deep Learning Super Sampling). The updated Switch is also said to support 4K output when connected to a TV and reportedly includes a seven-inch OLED display.
If a new Switch is imminent, Nintendo’s E3 show would be the perfect opportunity to show how games run at 4K and to demonstrate the power of Nvidia’s latest chip and DLSS support. With E3 kicking off in virtual form on June 12th, there are only 10 days left to find out whether Nintendo is ready to announce an updated Switch.
Last year’s Nvidia RTX 3080 made 4K gaming feasible for far more people. It was a card that delivered impressive performance at 4K, especially at its retail price of $699 — far less than the 2080 Ti cost a generation earlier. That was before the reality of a global chip shortage drove the prices of modern GPUs well above $1,000. Now that the street prices of RTX 3080s have stayed above $2,000 for months, Nvidia is launching its RTX 3080 Ti flagship priced at $1,199.
It’s a card that aims to deliver near identical levels of performance to the $1,499 RTX 3090, but in a smaller package and with just 12GB of VRAM — half what’s found on the RTX 3090. Nvidia is effectively competing with itself here, and now offering three cards at the top end. That’s if you can even manage to buy any of them in the first place.
I’ve spent the past week testing the RTX 3080 Ti at both 4K and 1440p resolutions. 4K gaming might have arrived originally with the RTX 2080 Ti, but the RTX 3080 Ti refines it and offers more headroom in the latest games. Unfortunately, it does so with a $1,199 price tag that I think will be beyond most people’s budgets even before you factor in the inevitable street price markup it will see during the current GPU shortage.
Hardware
If you put the RTX 3080 Ti and the RTX 3080 side by side, it would be difficult to tell the difference between them. They look identical, with the same ports and fan setup. I’m actually surprised this card isn’t a three-slot like the RTX 3090, or just bigger generally. The RTX 3080 Ti has one fan on either side of the card, with a push-pull system in place. The bottom fan pulls cool air into the card, which then exhausts on the opposite side that’s closest to your CPU cooler and rear case fan. A traditional blower cooler also exhausts the hot air out of the PCIe slot at the back.
This helped create a quieter card on the original RTX 3080, and I’m happy to report it’s the same with the RTX 3080 Ti. The RTX 3080 Ti runs at or close to its max fan RPM under heavy loads, but the hum of the fans isn’t too distracting. I personally own an RTX 3090, and while the fans rarely kick in at full speed, they’re certainly a lot more noticeable than the RTX 3080 Ti’s.
Nvidia has used the same RTX 3080 design for the 3080 Ti model.
That quiet performance might have a downside, though. During my week of testing with the RTX 3080 Ti, I noticed that the card seems to run rather hot. I recorded temperatures regularly around 80 degrees Celsius, compared to the 70 degrees Celsius temperatures on the larger RTX 3090. The fans also maxed out a lot during demanding 4K games on the RTX 3080 Ti in order to keep the card cool. I don’t have the necessary equipment to fully measure the heat output here, but when I went to swap the RTX 3080 Ti for another card after hours of testing, it was too hot to touch, and stayed hotter for longer than I’d noticed with either the RTX 3080 or RTX 3090. I’m not sure if this will result in problems in the long term, as we saw with the initial batch of 2080 Ti units having memory overheating issues, but most people will put this in a case and never touch it again. Still, I’m surprised at how long it stayed hot enough for me to not want to touch it.
As this is a Founders Edition card, Nvidia is using its latest 12-pin single power connector. There’s an ugly and awkward adapter in the box that lets you connect two eight-pin PCIe power connectors to it, but I’d highly recommend getting a single new cable from your PSU supplier to connect directly to this card. It’s less cabling, and a more elegant solution if you have a case window or you’re addicted to tidy cable management (hello, that’s me).
I love the look of the RTX 3080 Ti and the pennant-shaped board that Nvidia uses here. Just like the RTX 3080, there are no visible screws, and the regulatory notices are all on the output part of the card so there are no ugly stickers or FCC logos. It’s a really clean card, and I’m sorry to bring this up, but Nvidia has even fixed the way the number 8 is displayed. It was a minor mistake on the RTX 3080, but I’m glad the 8 has the correct proportions on the RTX 3080 Ti.
At the back of the card there’s a single HDMI 2.1 port and three DisplayPort 1.4a ports. Just like the RTX 3080, there are also LEDs that glow around the top part of the fan, and the GeForce RTX branding lights up, too. You can even customize the colors of the glowing part around the fan if you’re really into RGB lighting.
Just like the RTX 3080, this new RTX 3080 Ti needs a 750W power supply. The RTX 3080 Ti draws more power, too, at up to 350 watts under load compared to 320 watts on the RTX 3080. That’s the same power draw as the larger RTX 3090, which is understandable given the performance improvements, but it’s worth being aware of how this might affect your energy bills (and the cost of running the PC you build around it).
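To put that 350-watt figure in household terms, here’s a rough sketch of the monthly running cost. The wattage comes from the spec above; the hours of play per day and the $0.13/kWh electricity rate are illustrative assumptions, not figures from this review:

```python
# Rough estimate of what a GPU's load power draw adds to an energy bill.
# The 350 W figure is the RTX 3080 Ti's rated draw under load; the usage
# hours and the $0.13/kWh rate are illustrative assumptions only.

def monthly_gpu_cost(watts, hours_per_day, price_per_kwh, days=30):
    """Approximate monthly cost in dollars of running a GPU at full load."""
    kwh = watts / 1000 * hours_per_day * days
    return kwh * price_per_kwh

cost = monthly_gpu_cost(watts=350, hours_per_day=4, price_per_kwh=0.13)
print(f"~${cost:.2f}/month")  # 42 kWh of gaming per month
```

Swap in your own local rate and habits; the point is simply that a 350 W card gaming a few hours a day adds a measurable but modest line to the bill.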
1440p testing
I’ve been testing the RTX 3080 Ti with Intel’s latest Core i9 processor. For 1440p tests, I’ve also paired the GPU with a 32-inch Samsung Odyssey G7 monitor. This monitor supports refresh rates up to 240Hz, as well as Nvidia’s G-Sync technology.
I compared the RTX 3080 Ti against both the RTX 3080 and RTX 3090 to really understand where it fits into Nvidia’s new lineup. I tested a variety of AAA titles, including Fortnite, Control, Death Stranding, Metro Exodus, Call of Duty: Warzone, Microsoft Flight Simulator, and many more. You can also find the same games tested at 4K resolution below.
All games were tested at max or ultra settings on the RTX 3080 Ti, and most exceeded an average of 100fps at 1440p. On paper, the RTX 3080 Ti is very close to an RTX 3090, and my testing showed that plays out in most games at 1440p. Games like Microsoft Flight Simulator, Assassin’s Creed: Valhalla, and Watch Dogs: Legion all have near-identical performance across the RTX 3080 Ti and RTX 3090 at 1440p.
Even Call of Duty: Warzone is the same without Nvidia’s Deep Learning Super Sampling (DLSS) technology enabled, and it’s only really games like Control and Death Stranding where there’s a noteworthy, but small, gap in performance.
However, the jump in performance from the RTX 3080 to the RTX 3080 Ti is noticeable across nearly every game, with the exception of Death Stranding and Fortnite, which both perform really well on the base RTX 3080.
RTX 3080 Ti (1440p)

| Benchmark | RTX 3080 Founders Edition | RTX 3080 Ti Founders Edition | RTX 3090 Founders Edition |
| --- | --- | --- | --- |
| Microsoft Flight Simulator | 46fps | 45fps | 45fps |
| Shadow of the Tomb Raider | 147fps | 156fps | 160fps |
| Shadow of the Tomb Raider (DLSS) | 154fps | 162fps | 167fps |
| CoD: Warzone | 124fps | 140fps | 140fps |
| CoD: Warzone (DLSS+RT) | 133fps | 144fps | 155fps |
| Fortnite | 160fps | 167fps | 188fps |
| Fortnite (DLSS) | 181fps | 173fps | 205fps |
| Gears 5 | 87fps | 98fps | 103fps |
| Death Stranding | 163fps | 164fps | 172fps |
| Death Stranding (DLSS quality) | 197fps | 165fps | 179fps |
| Control | 124fps | 134fps | 142fps |
| Control (DLSS quality + RT) | 126fps | 134fps | 144fps |
| Metro Exodus | 56fps | 64fps | 65fps |
| Metro Exodus (DLSS+RT) | 67fps | 75fps | 77fps |
| Assassin’s Creed: Valhalla | 73fps | 84fps | 85fps |
| Watch Dogs: Legion | 79fps | 86fps | 89fps |
| Watch Dogs: Legion (DLSS+RT) | 67fps | 72fps | 74fps |
| Watch Dogs: Legion (RT) | 49fps | 55fps | 56fps |
Assassin’s Creed: Valhalla performs 15 percent better on the RTX 3080 Ti than on the regular RTX 3080, and Metro Exodus shows a 14 percent improvement. Gains range from around 4 percent all the way up to 15 percent, so the performance gap is very game dependent.
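For reference, the percentage gains quoted here are straightforward relative fps improvements computed from the 1440p numbers; this quick sketch (with a hypothetical `pct_gain` helper) shows the math:

```python
# Relative fps improvement of one card over another, as used in the text.
# The fps values below come from the 1440p results in this review.

def pct_gain(baseline_fps, new_fps):
    """Relative improvement of new_fps over baseline_fps, in percent."""
    return (new_fps - baseline_fps) / baseline_fps * 100

print(round(pct_gain(73, 84)))  # Assassin's Creed: Valhalla, 3080 -> 3080 Ti
print(round(pct_gain(56, 64)))  # Metro Exodus, 3080 -> 3080 Ti
```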
Even when using games with ray tracing, the RTX 3080 Ti still managed high frame rates when paired with DLSS. DLSS uses neural networks and AI supercomputers to analyze games and sharpen or clean up images at lower resolutions. In simple terms, it allows a game to render at a lower resolution and then uses Nvidia’s image reconstruction technique to upscale the image so it looks nearly as good as native resolution.
Whenever I see the DLSS option in games, I immediately turn it on now to get as much performance as possible. It’s still very much required for ray tracing games, particularly as titles like Watch Dogs: Legion only manage to hit 55fps with ultra ray tracing enabled. If you enable DLSS, this jumps to 72fps and it’s difficult to notice a hit in image quality.
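As a rough illustration of why DLSS buys so much performance: the GPU shades far fewer pixels internally. The ~2/3-per-axis scale used below is the commonly cited figure for DLSS’s Quality mode, not something stated in this review, so treat it as an assumption:

```python
# Illustrative sketch: how many pixels DLSS actually renders internally.
# The 2/3 per-axis scale is the commonly cited Quality-mode figure and is
# an assumption here; exact factors vary by title and DLSS version.

def dlss_internal_res(width, height, scale=2/3):
    """Approximate internal render resolution before DLSS reconstruction."""
    return round(width * scale), round(height * scale)

w, h = dlss_internal_res(3840, 2160)  # 4K output target
native_px = 3840 * 2160
internal_px = w * h
print(w, h)  # internal resolution the GPU shades
print(f"{internal_px / native_px:.0%} of native pixels shaded")
```

Shading well under half the pixels and reconstructing the rest is why the frame-rate jumps with DLSS enabled are so large, especially with ray tracing in the mix.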
4K testing
For my 4K testing, I paired the RTX 3080 Ti with Acer’s 27-inch Nitro XV273K, a 4K monitor that offers up to 144Hz refresh rates and supports G-Sync. I wasn’t able to get any of the games I tested on both the RTX 3080 Ti and RTX 3090 to hit the frame rates necessary to really take advantage of this 144Hz panel, but some came close thanks to DLSS.
Metro Exodus manages a 14 percent improvement over the RTX 3080, and Microsoft Flight Simulator also sees a 13 percent jump. Elsewhere, other games see between a 4 and 9 percent improvement. These are solid gains for the RTX 3080 Ti, providing more headroom for 4K gaming over the original RTX 3080.
The RTX 3080 Ti comes close to matching the RTX 3090 performance at 4K in games like Watch Dogs: Legion, Assassin’s Creed: Valhalla, Gears 5, and Death Stranding. Neither the RTX 3080 Ti nor RTX 3090 is strong enough to handle Watch Dogs: Legion with ray tracing, though. Both cards manage around 30fps on average, and even DLSS only bumps this up to below 50fps averages.
RTX 3080 Ti (4K)

| Benchmark | RTX 3080 Founders Edition | RTX 3080 Ti Founders Edition | RTX 3090 Founders Edition |
| --- | --- | --- | --- |
| Microsoft Flight Simulator | 30fps | 34fps | 37fps |
| Shadow of the Tomb Raider | 84fps | 88fps | 92fps |
| Shadow of the Tomb Raider (DLSS) | 102fps | 107fps | 111fps |
| CoD: Warzone | 89fps | 95fps | 102fps |
| CoD: Warzone (DLSS+RT) | 119fps | 119fps | 129fps |
| Fortnite | 84fps | 92fps | 94fps |
| Fortnite (DLSS) | 124fps | 134fps | 141fps |
| Gears 5 | 64fps | 72fps | 73fps |
| Death Stranding | 98fps | 106fps | 109fps |
| Death Stranding (DLSS quality) | 131fps | 132fps | 138fps |
| Control | 65fps | 70fps | 72fps |
| Control (DLSS quality + RT) | 72fps | 78fps | 80fps |
| Metro Exodus | 34fps | 39fps | 39fps |
| Metro Exodus (DLSS+RT) | 50fps | 53fps | 55fps |
| Assassin’s Creed: Valhalla | 64fps | 70fps | 70fps |
| Watch Dogs: Legion | 52fps | 55fps | 57fps |
| Watch Dogs: Legion (DLSS+RT) | 40fps | 47fps | 49fps |
| Watch Dogs: Legion (RT) | 21fps | 29fps | 32fps |
Most games manage to comfortably rise above 60fps in 4K at ultra settings, with Microsoft Flight Simulator and Metro Exodus as the only exceptions. Not even the RTX 3090 could reliably push beyond 144fps at 4K without assistance from DLSS or a drop in visual settings. I think we’re going to be waiting on whatever Nvidia does next to really push 4K at these types of frame rates.
When you start to add ray tracing and ultra 4K settings, it’s clear that both the RTX 3080 Ti and RTX 3090 need to have DLSS enabled to play at reasonable frame rates across the most demanding ray-traced titles. Without DLSS, Watch Dogs: Legion manages an average of 29fps (at max settings), with dips below that making the game unplayable.
DLSS really is the key here across both 1440p and 4K. It was merely a promise when the 2080 Ti debuted nearly three years ago, but Nvidia has now managed to get DLSS into more than 50 popular games. Red Dead Redemption 2 and Rainbow Six Siege are getting DLSS support soon, too.
DLSS also sets Nvidia apart from AMD’s cards. While AMD’s RX 6800 XT is fairly competitive at basic rasterization at 1440p, it falls behind the RTX 3080 in the most demanding games at 4K — particularly when ray tracing is enabled. Even the $1,000 Radeon RX 6900 XT doesn’t fare much better at 4K. AMD’s answer to DLSS is coming later this month, but until it arrives we still don’t know exactly how it will compensate for ray tracing performance on AMD’s GPUs. AMD has also struggled to supply retailers with stock of its cards.
That’s left Nvidia in a position to launch the RTX 3080 Ti at a price point that really means it’s competing with itself, positioned between the RTX 3080 and RTX 3090. If the RTX 3090 wasn’t a thing, the RTX 3080 Ti would make a lot more sense.
Nvidia is also competing with the reality of the market right now, as demand has been outpacing supply for more than six months. Nvidia has introduced a hash rate limiter for Ethereum cryptocurrency mining on new versions of the RTX 3080, RTX 3070, and now this RTX 3080 Ti. It could help deter some scalpers, but we’ll need months of data on street prices to really understand if it’s driven pricing down to normal levels.
Demand for 30-series cards has skyrocketed as many rush to replace their aging GTX 1080 and GTX 1080 Ti cards. Coupled with Nvidia’s NVENC and professional tooling support, it’s also made the RTX 30-series a great option for creators looking to stream games, edit videos, or build games.
In a normal market, I would only recommend the RTX 3080 Ti if you’re really willing to spend an extra $500 to get some extra gains in 1440p and 4K performance. But it’s a big price premium when the RTX 3090 exists at this niche end of the market and offers more performance and double the VRAM if you’re really willing to pay this much for a graphics card.
At $999 or even $1,099, the RTX 3080 Ti would tempt me a lot more, but $1,199 feels a little too pricey. For most people, an RTX 3080 would make a lot more sense if it were actually available at its standard retail price. Nvidia also has a $599 RTX 3070 Ti on the way next week, which could offer some performance gains to rival the RTX 3080.
Either way, the best GPU is the one you can buy right now, and let’s hope that Nvidia and AMD manage to make that a reality soon.
NVIDIA today refreshed the top end of the GeForce RTX 30-series “Ampere” family of graphics cards with the new GeForce RTX 3080 Ti, which we’re testing for you today. The RTX 3080 Ti takes over as NVIDIA’s flagship gaming product, picking up the mantle from the RTX 3080. While the RTX 3090 is positioned higher in the stack, NVIDIA has been treating it as a TITAN-like halo product for not just gaming, but also quasi-professional use cases. The RTX 3080 Ti has the same mandate as the RTX 3080—to offer leadership gaming performance with real-time raytracing at 4K UHD resolution.
NVIDIA’s announcement of the GeForce RTX 3080 Ti and RTX 3070 Ti was likely triggered by AMD’s unexpected return to the high-end market after many years, with its Radeon RX 6800 series and RX 6900 XT “Big Navi” GPUs, which compete with the RTX 3080 and RTX 3070, and even pose a credible alternative to the RTX 3090. NVIDIA possibly found itself staring at a large gap between the RTX 3080 and RTX 3090 that needed to be filled. Hence, the RTX 3080 Ti.
The GeForce RTX 3080 Ti is based on the same 8 nm GA102 silicon as the RTX 3080, but with more CUDA cores, while maxing out the 384-bit wide GDDR6X memory bus. It has only slightly fewer CUDA cores than the RTX 3090, its memory size is 12 GB as opposed to 24 GB, and its memory clock is slightly lower. NVIDIA has given the RTX 3080 Ti a grand 10,240 CUDA cores spread across 80 streaming multiprocessors, 320 3rd Gen Tensor cores that accelerate AI and DLSS, and 80 2nd Gen RT cores. It also has all 112 ROPs enabled, alongside 320 TMUs. The 12 GB of memory maxes out the 384-bit memory bus, but the memory clock runs at 19 Gbps (compared to 19.5 Gbps on the RTX 3090). Memory bandwidth hence is 912.4 GB/s.
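That bandwidth figure falls straight out of the bus width and data rate: at the nominal 19 Gbps the math gives an even 912 GB/s (the quoted 912.4 GB/s reflects the precise, slightly higher effective memory clock). A quick sanity check:

```python
# Peak theoretical memory bandwidth: bus width in bits / 8 gives bytes
# moved per transfer, multiplied by the effective data rate in Gbps.

def memory_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak theoretical bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

print(memory_bandwidth_gbs(384, 19))    # RTX 3080 Ti at nominal 19 Gbps
print(memory_bandwidth_gbs(384, 19.5))  # RTX 3090 at 19.5 Gbps
```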
The NVIDIA GeForce RTX 3080 Ti Founders Edition looks similar in design to the RTX 3080 Founders Edition. NVIDIA is pricing the card at $1,200, about $200 higher than the Radeon RX 6900 XT. The AMD flagship is really the main target of this NVIDIA launch, as it has spelled trouble for the RTX 3080. As rumors of the RTX 3080 Ti picked up pace, AMD worked with its board partners to release an enthusiast-class RX 6900 XT refresh based on the new “XTXH” silicon that can sustain 10% higher clock speeds. In this review, we compare the RTX 3080 Ti with all the SKUs in its vicinity to show you whether it’s worth stretching your budget to $1,200, or whether you could save some money by choosing this card over the RTX 3090.
The EVGA GeForce RTX 3080 Ti FTW3 Ultra is the company’s premium offering based on NVIDIA’s swanky new RTX 3080 Ti graphics card, which the company hopes will restore its leadership in the high-end gaming graphics segment, now disputed by the Radeon RX 6900 XT. Along with its sibling, the RTX 3070 Ti, the new graphics card is a response to AMD’s return to competitiveness in the high-end graphics segment. It has the same mission as the RTX 3080—to offer maxed-out gaming at 4K Ultra HD resolution, with raytracing, making it NVIDIA’s new flagship gaming product. The RTX 3090 is still positioned higher but, with its 24 GB of memory, is branded as a TITAN-like halo product, capable of certain professional-visualization applications when paired with NVIDIA’s Studio drivers.
The GeForce RTX 3080 Ti features a lot more CUDA cores than the RTX 3080—10,240 vs. 8,704—and maxes out the 384-bit wide memory interface of the GA102 silicon, much like the RTX 3090. The memory amount, however, is 12 GB, running at a 19 Gbps data rate. The RTX 3080 Ti is based on the Ampere graphics architecture, which debuts the 2nd generation of NVIDIA’s path-breaking RTX real-time raytracing technology. It combines new 3rd generation Tensor cores that leverage sparsity to accelerate AI inference performance by an order of magnitude over the previous generation; new 2nd generation RT cores that support even more hardware-accelerated raytracing effects; and the new, faster Ampere CUDA core.
The EVGA RTX 3080 Ti FTW3 Ultra features the same top-tier iCX3 cooling solution as the top RTX 3090 FTW3, with smart cooling that relies on several onboard thermal sensors beyond what the GPU and memory come with; a meaty heatsink ventilated by a trio of fans; and plenty of RGB LED lighting to add life to your high-end gaming PC build. The PCB has several air guides that let airflow from the fans pass through, improving ventilation. EVGA is pricing the RTX 3080 Ti FTW3 Ultra at $1,340, a hefty premium over the $1,200 baseline price of the RTX 3080 Ti.
The Zotac GeForce RTX 3080 Ti AMP HoloBlack is the company’s top graphics card based on the swanky new RTX 3080 Ti “Ampere” GPU by NVIDIA. Hot on the heels of its Computex 2021 announcement, we have with us NVIDIA’s new flagship gaming graphics card, a distinction it takes over from the RTX 3080. The RTX 3090 is still around in NVIDIA’s product stack, but it’s positioned as a TITAN-like halo product, with its 24 GB of video memory benefiting certain quasi-professional applications when paired with NVIDIA’s GeForce Studio drivers. The RTX 3080 Ti has the same mandate from NVIDIA as the RTX 3080—to offer leadership 4K UHD gaming performance with maxed-out settings and raytracing.
Based on the same 8 nm “GA102” silicon as the RTX 3080, the new RTX 3080 Ti has 12 GB of memory, maxing out the chip’s 384-bit GDDR6X memory interface, while also packing more CUDA cores—10,240 vs. 8,704—along with 320 TMUs, 320 Tensor cores, 80 RT cores, and 112 ROPs. The announcement of the RTX 3080 Ti and its sibling, the RTX 3070 Ti—which we’ll review soon—may have been triggered by AMD’s unexpected return to the high-end gaming graphics segment, with its “Big Navi” Radeon RX 6000 series graphics cards, particularly the RX 6900 XT and the RX 6800.
The GeForce Ampere graphics architecture debuts the 2nd generation of NVIDIA RTX, bringing real-time raytracing to gamers. It combines 3rd generation Tensor cores that accelerate the AI deep-learning neural nets DLSS leverages; 2nd generation RT cores that introduce more hardware-accelerated raytracing effects; and the new Ampere CUDA core, which significantly increases performance over the previous-generation “Turing.”
The Zotac RTX 3080 Ti AMP HoloBlack features the company’s highest factory-overclocked speeds for the RTX 3080 Ti, with up to a 1710 MHz boost compared to the 1665 MHz reference; a bold new cooling solution design that relies on a large triple-fan heatsink; and aesthetic ARGB lighting elements that bring your gaming rig to life. Zotac hasn’t provided us with any pricing info yet; we’re assuming the card will end up around $100 pricier than baseline cards such as the Founders Edition.
The Palit GeForce RTX 3080 Ti GamingPro is the company’s premium custom-design RTX 3080 Ti offering, letting gamers who know what to expect from this GPU simply install it and get gaming. Within Palit’s product stack, the GamingPro is positioned a notch below its coveted GameRock brand for enthusiasts. By itself, the RTX 3080 Ti is NVIDIA’s new flagship gaming graphics product, taking over that distinction from the RTX 3080. The RTX 3090 is marketed as a halo product, with its large video memory even targeting certain professional use cases. The RTX 3080 Ti has the same mandate as the RTX 3080—to offer leadership gaming performance at 4K UHD, with maxed-out settings and raytracing.
The GeForce RTX 3080 Ti story likely begins with AMD’s unexpected return to the high-end graphics segment with its Radeon RX 6800 series and RX 6900 XT “Big Navi” graphics cards. The RX 6900 XT in particular has managed to outclass the RTX 3080 in several scenarios, and with its “XTXH” bin it even trades blows with the RTX 3090. NVIDIA developed the RTX 3080 Ti to fill exactly this performance gap between its two top Amperes, the RTX 3080 and RTX 3090.
The RTX 3080 Ti is based on the same 8 nm GA102 GPU as the other two top cards in NVIDIA’s lineup, but features many more CUDA cores than the RTX 3080, at 10,240 vs. 8,704, and, more importantly, maxes out the silicon’s 384-bit wide memory bus. NVIDIA endowed this card with 12 GB of memory. Other key specs include 320 Tensor cores, 80 RT cores, 320 TMUs, and 112 ROPs. The memory ticks at the same 19 Gbps data rate as the RTX 3080, but the wider memory bus pushes bandwidth up to 912 GB/s.
Palit adds value to the RTX 3080 Ti by pairing it with its TurboFan 3.0 triple-slot, triple-fan cooling solution, which has plenty of RGB bling to satiate gamers. The cooler is longer than the PCB itself, so airflow from the third fan passes through the card and out of holes punched into the metal backplate. The card runs at reference clock speeds of 1665 MHz and is officially priced at NVIDIA’s $1,200 baseline price for the RTX 3080 Ti, making it more affordable than the other custom designs we’re testing today. In this review, we tell you whether this card is all you need if you have your eyes on an RTX 3080 Ti.
Samsung could enable HDR10+ for gaming, according to a German blog post spotted by HDTVtest. The article claims Samsung executives are working with ‘various unnamed studios’ to set up a steady supply of HDR10+ titles.
The HDR10+ format was created by Samsung and is a competitor to Dolby Vision. Like Dolby Vision, HDR10+ is all about adding dynamic metadata to the HDR signal to deliver more detail. Unlike Dolby Vision, companies don’t need to pay a fee to license HDR10+.
The report doesn’t reveal whether Samsung is planning to bring the technology to games consoles or reserve it for mobile devices such as the HDR10+-supporting Samsung Galaxy S21.
However, it’s interesting to note that Dolby Vision is supposed to be exclusive for the Xbox Series X and S for the next two years. Could Samsung be working with Sony to bring HDR10+ gaming to the PS5? It’s certainly a possibility.
The Xbox Series X and Xbox Series S systems have supported Dolby Atmos since launch, with Dolby Vision support expected later this year. Microsoft recently announced a Dolby Vision HDR test program for Alpha Ring members ahead of ‘general availability’.
Only a handful of titles make use of Dolby Vision HDR (Gears 5, Halo: The Master Chief Collection and Borderlands 3 are the biggies) but last month Microsoft revealed plans for a major push into Dolby Vision gaming.
If the rumours are true, HDR10+ for gaming could bring better contrast and more vibrant colours to your favourite titles, although you’ll still need a compatible 4K TV.
Apple is expected to adopt OLED displays in “some” iPads starting next year, according to Korea’s ETNews.
“Apple decided to apply OLED instead of Liquid Crystal Display (LCD) from some iPad models in 2022,” says the publication. “It is reported that Apple and display companies have agreed on production and delivery.”
Samsung and LG already supply the OLED displays used in the current generation of Apple iPhones. If the latest rumours are to be believed, the Korean tech titans are primed to manufacture the OLED displays for the next wave of iPads, too.
The report – spotted by 9to5Mac – ties in with previous rumours that have tipped Apple to transition to OLED displays in 2022. It doesn’t specify which models will make the leap, but in March, noted Apple analyst Ming-Chi Kuo tipped the mid-range iPad Air for an OLED display by 2022.
Last month, Apple launched the M1-powered 12.9-inch iPad Pro complete with cutting-edge Liquid Retina XDR (Mini LED) display. Mini LED technology delivers deeper blacks and richer colours, but it doesn’t have the pixel-level contrast control of OLED.
Many analysts believe Mini LED is a one-year ‘stop-gap’ solution due to its high price in comparison to OLED. According to ETNews, all iPads released in 2023 could have OLED screens.
The iPad is the world’s best-selling tablet with sales of around 50 million per year, so keeping up with demand could be quite the challenge. Especially with Samsung reported to be flat-out making 120Hz OLED displays for the upcoming iPhone 13 and iPhone 13 Pro (via PhoneArena).
Denis Villeneuve’s Dune isn’t the only film adaptation of Frank Herbert’s novel getting a 4K release this year. That’s because the original adaptation, made in 1984 by a young David Lynch, is getting a limited edition 4K Blu-ray release on August 30th. The film was both a critical and box office bomb that Lynch later disowned, but it’s also a fascinating historical artifact and sci-fi cult classic.
Arrow Films, the distributor handling the release, says the 4K restoration is sourced from the film’s original camera negative, scanned at 4K 2160p and mastered in Dolby Vision HDR. It also includes uncompressed stereo audio and a DTS-HD Master Audio 5.1 surround sound mix.
There are two versions of the 4K release available to pre-order: a standard edition containing the film along with a bonus disc of extra content, and a steelbook edition which adds a third non-4K Blu-ray disc containing the HD version of the film. Extra features include brand new audio commentaries (from film historian Paul M. Sammon and podcaster Mike White), a new feature-length documentary, and mix of new and old featurettes. It will be available to buy in the UK, US, and Canada when it releases next month.
It’d probably be an understatement to say the original Dune got a mixed reception upon its release. Critic Roger Ebert called it “an incomprehensible, ugly, unstructured, pointless excursion into the murkier realms of one of the most confusing screenplays of all time.” But the story of how it came to be is fascinating, with Ridley Scott being attached to direct at one point before he dropped out and directed Blade Runner instead. Vulture has a good timeline of the struggles various filmmakers have been through over the years trying to adapt the novel.
Meanwhile, the 2021 Dune adaptation is currently due to release on October 1st, when it will be available simultaneously to watch in cinemas as well as HBO Max in 4K HDR.
Alienware is keen on giving Razer a run for its money when it comes to making a super-thin gaming laptop. Two of the configurations of Alienware’s new X15 flagship model are actually 15.9mm thick, almost the same as Razer’s just-refreshed 15.8mm-thick Blade 15 Advanced. That’s impressively thin, especially considering that Alienware doesn’t usually try to compete in this realm.
What’s also noteworthy is that, despite its thin build, the X15 looks like it will be a capable machine. Alienware is also announcing a bigger and thicker 17-inch X17 laptop that’s even more powerful. We’ll go into detail on both below.
Let’s start with the X15, which will cost $1,999 for the base model, available starting today. Packed into that entry model is Intel’s 11th Gen Core i7-11800H processor (eight cores and a boost clock speed of up to 4.6GHz), 16GB of RAM clocked at 3,200MHz (but not user-upgradeable due to size constraints), 256GB of fast NVMe storage (which is user-upgradeable, with two slots that support either M.2 2230 or 2280-sized SSDs), and Nvidia’s RTX 3060 graphics chip (90W maximum graphics power, and a base clock speed of 1,050MHz and boost clock of 1,402MHz). A 15.6-inch FHD display with a 165Hz refresh rate, 3ms response time, and up to 300 nits of brightness with 100-percent sRGB color gamut support comes standard.
Alienware hasn’t shared pricing for spec increases, but you can load the X15 with up to an Intel Core i9-11900H processor, a 2TB NVMe M.2 SSD (with a maximum 4TB of dual storage supported via RAID 0), and 32GB of RAM. To top it off, you can put in an RTX 3080 graphics card (the 8GB version, with 110W maximum graphics power, a base clock speed of 930MHz and a boost clock speed of 1,365MHz). The display can be upgraded to a 400-nit QHD G-Sync panel with a 240Hz refresh rate, 2ms response time, and 99-percent coverage of the DCI-P3 color gamut. The X15 has an 87Wh battery and includes a 240W “small form factor” adapter. At its lowest weight, the X15 comes in at five pounds, but it goes up to 5.2 pounds depending on the specs.
All of the X15’s ports, aside from a headphone jack and power input, are located on its back. There’s a USB-A 3.2 Gen 1 port, one USB-C 3.2 Gen 2 port, one Thunderbolt 4 port, a microSD card slot, and an HDMI 2.1 port that will allow the X15 to output a 4K signal at up to 120Hz.
If you’re all about getting a 17.3-inch screen, the X17 starts at $2,099 and has similar starting specs. It has a thicker chassis than the X15 at 20.9mm, and it’s heavier, starting at 6.65 pounds. But that extra heft apparently allows for more graphical and processing power, if you’re willing to pay for it. For example, its RTX 3060 card has a higher maximum graphics power of 130W. This pattern holds for the pricier GPU upgrades, too, especially the RTX 3080 (16GB), which can run at 165W of max graphics power with a boost clock speed of 1,710MHz. In the processor department, you can go up to an Intel Core i9-11900HK. Additionally, you can spec this one with up to 64GB of XMP RAM clocked at 3,466MHz.
As for the screen, there’s an upgrade option to get a 300-nit FHD G-Sync panel with a 360Hz refresh rate and 1ms response time, but you can go all the way up to a 500-nit 4K display with a 120Hz refresh rate and 4ms response time. Like the X15, the X17 has an 87Wh battery, but whether you get a 240W or 330W power supply will depend on the configuration that you buy.
The X17 has all of the same ports as the X15, along with one extra USB-A port, a Mini DisplayPort jack, and a 2.5G ethernet port (the X15 includes a USB-C to ethernet adapter).
Generally speaking, thinner laptops struggle with heat management. But Alienware’s Quad Fan claims to move a lot of air, and in X15 and X17 models that have the RTX 3070 or 3080 chips, it touts a new “Element 31 thermal interface material” that apparently provides a boost in the thermal resistance of its internals compared to previous Alienware laptops. We’ll have to see how this fares when we try out a review unit. I’m curious how loud they might get in order to stay cool.
If you’re an Alienware enthusiast, be aware that the company’s mainstay graphics amplifier port is missing. We asked Alienware about this, and it provided this statement to The Verge:
Today’s latest flagship desktop graphics cards achieve graphical power beyond what the Alienware Graphics Amplifiers (as well as other external graphics amplifiers) can successfully port back through PCI (and Thunderbolt) connections. For Alienware customers who are already purchasing high-end graphics configurations, the performance improvements from our Alienware Graphics Amplifier would be limited. While improvements would be noticeable, in many cases it wouldn’t be enough to justify purchasing an external amplifier and flagship graphics card. So instead, we are using that additional space to offer extra ports and thermal headroom which provides a better experience for all gamers purchasing this product.
Wrapping up this boatload of specs, the X15 and X17 each have a 720p Windows Hello webcam, and configurations with the RTX 3080 have an illuminated trackpad that can be customized within Alienware’s pre-installed software. These laptops come standard with Alienware’s X-Series keyboard that has per-key lighting, n-key rollover, anti-ghosting, and 1.5mm of key travel. In the X17, you have the option to upgrade to Alienware’s Cherry MX ultra low-profile mechanical switches, which have a longer 1.8mm key travel.
Lastly, both laptops are available in the “Lunar Light” colorway, which is white on the outside shell and black on the inside.
AMD introduced its new Radeon RX 6000M-series laptop graphics at Computex, during a keynote by AMD’s CEO, Dr. Lisa Su. The new mobile graphics lineup is made up of the top-end AMD Radeon RX 6800M, the mid-range RX 6700M and the entry-level RX 6600M. For now at least, the GPUs are being paired in systems from laptop vendors with AMD’s Ryzen processors for what the company calls the “AMD Advantage.”
These are the first laptop GPUs from AMD that use its RDNA 2 architecture, with Infinity Cache for higher effective memory bandwidth, low power consumption (AMD claims near 0 watts at idle) and high frequencies even when the system is running at low power. The company is claiming up to 1.5 times the performance of last-gen RDNA graphics and up to 43% lower power consumption.
|                         | AMD Radeon RX 6800M | AMD Radeon RX 6700M | AMD Radeon RX 6600M |
|-------------------------|---------------------|---------------------|---------------------|
| Compute Units           | 40                  | 36                  | 28                  |
| Game Clock              | 2,300 MHz           | 2,300 MHz           | 2,177 MHz           |
| Memory                  | 12GB GDDR6          | 10GB GDDR6          | 8GB GDDR6           |
| Infinity Cache          | 96MB                | 80MB                | 32MB                |
| AMD Smart Access Memory | Yes                 | Yes                 | Yes                 |
| AMD SmartShift          | Yes                 | Yes                 | Yes                 |
| Power Targets           | 145W and above      | Up to 135W          | Up to 100W          |
| Resolution Targets      | 1440p               | 1440p/1080p         | 1080p               |
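As a back-of-envelope check, peak FP32 throughput can be estimated from the compute-unit counts and game clocks above, assuming RDNA 2’s 64 stream processors per compute unit and two FP32 operations per clock per processor (fused multiply-add). This is a rough sketch using game clocks, so real-world boost figures will differ:

```python
# Estimate peak FP32 throughput for an RDNA 2 mobile GPU.
# Assumes 64 stream processors per compute unit and 2 ops/clock (FMA).
def fp32_tflops(compute_units: int, game_clock_mhz: float) -> float:
    shaders = compute_units * 64      # stream processors
    ops_per_clock = 2                 # fused multiply-add counts as 2 ops
    return shaders * ops_per_clock * game_clock_mhz * 1e6 / 1e12

for name, cus, clock in [("RX 6800M", 40, 2300),
                         ("RX 6700M", 36, 2300),
                         ("RX 6600M", 28, 2177)]:
    print(f"{name}: {fp32_tflops(cus, clock):.2f} TFLOPS")
```

By this estimate, the RX 6800M lands at roughly 11.8 TFLOPS at its game clock, with the RX 6700M around 10.6 and the RX 6600M near 7.8.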
The most powerful of the new bunch is the AMD Radeon RX 6800M, which will be available starting June 1 in the Asus ROG Strix G15 Advantage Edition. It has 40 compute units and ray accelerators, along with a 2,300 MHz game clock, 12GB of GDDR6 memory and a 96MB cache. It will also be compatible with AMD SmartShift and Smart Access Memory.
AMD compared the ROG Strix G15 with the RX 6800M and a Ryzen 9 5900HX to a 2019 MSI Raider GE63 with a 9th Gen Intel Core i7 processor and an RTX 2070, claiming up to 1.4 times more frames per second at 1440p max settings in Assassin’s Creed Valhalla and Cyberpunk 2077, 1.5 times the performance in Dirt 5 and 1.7x more frames while playing Resident Evil: Village.
In closer comparisons, to an RTX 3070 (8GB) and RTX 3080 (8GB), AMD claimed its flagship GPU was typically the top performer – within a frame or so – in several of those games, as well as Borderlands 3 and Call of Duty: Black Ops Cold War, though it’s unclear which settings and resolutions were used for these tests.
Unlike Nvidia, AMD isn’t aiming for 4K gaming. The most powerful of the cards, the RX 6800M, aims for a power target of 145W and above and is designed for 1440p.
The middle-tier AMD Radeon RX 6700M is designed for 1440p or 1080p gaming, depending on the title. It has 36 compute units with a 2,300 MHz game clock, 10GB of GDDR6 RAM and an 80MB Infinity Cache, as well as the same support for SmartShift and SAM. AMD says these will ship in laptops “soon.” It also said that the GPU will allow for 100 fps gaming at 1440p and high settings in “popular games,” though it didn’t specify which games it was referring to.
The RX 6600M sits at the bottom of the stack for gaming at 1080p. AMD compared it to an RTX 3060 (6GB) on 1080p max settings, and found that it led in Assassin’s Creed Valhalla, Borderlands 3 and Dirt 5. It was five frames behind in Call of Duty: Black Ops Cold War in AMD’s tests, and there was a one-frame difference playing Cyberpunk 2077. Like the RX 6800M, the 6600M will start shipping on June 1.
AMD Advantage Laptops
AMD is now referring to laptops with both AMD processors and graphics as offering the “AMD Advantage.” The company says these designs should offer great performance because of power sharing between the CPU and GPU.
AMD says its technologies can achieve up to 11% better performance in Borderlands 3, 10% in Wolfenstein Young Blood, 7% in Cyberpunk 2077 and 6% in Godfall.
Additionally, the company says AMD Advantage laptops will only have “premium” displays — either IPS or OLED, but no VA or TN panels. They should hit or surpass 300 nits of brightness, offer refresh rates of 144 Hz or higher, and use AMD FreeSync.
Each laptop should come with a PCIe NVMe Gen 3 SSD, keep the WASD keys below 40 degrees Celsius while gaming and allow for ten hours of video on battery. (AMD tested this with local video, not streaming.)
The first of these laptops is the Asus ROG Strix G15, with up to a Ryzen 9 5900HX and Radeon RX 6800M, a 15-inch display (either FHD at 300 Hz or WQHD at 165 Hz) with FreeSync Premium, liquid metal for cooling both the CPU and GPU along with a vapor chamber. It will launch in mid-June.
The HP Omen 16 will also come with a 165 Hz display, with up to a Ryzen 9 5900HX and an AMD Radeon RX 6600M for 1080p gaming. It will launch in June on JD.com, then become available worldwide.
In June, we should see more releases from HP, Asus, MSI and Lenovo.