G.Skill is adding two new ultra-fast memory configurations to its current lineup of Royal Elite DDR4 memory kits. According to TechPowerUp, the new Royal Elite kits will come configured at 3600MHz or 4000MHz with an ultra-low latency of CL14, thanks to high-quality Samsung B-Die ICs.
G.Skill’s current lineup of Trident Z Royal Elite memory already consists of very fast memory, including 4266MHz, 4800MHz, and even 5333MHz kits. But what makes these new lower-frequency kits so special is their super-tight CL14 timings, which have proved optimal for gaming performance as well as for day-to-day tasks where latency is more important than memory bandwidth.
As the name implies, these Trident Z Royal Elite kits are decked out with super-aggressive styling: angled edges everywhere and a diamond-like finish for the RGB lighting at the top. The Royal Elite kits come in silver or gold colors, naturally.
These kits will come in two memory frequency configurations: 4000MHz and 3600MHz. The tightest timings come with the 3600MHz kit at 14-14-14-34, while the 4000MHz kit runs at 14-15-15-35. However, in order to hit these tight latencies, the memory kits run at very high voltages: 1.45V for the 3600MHz kits and 1.55V for the 4000MHz kits.
While we don’t know prices yet, with these timings you can expect to pay top dollar, as should be obvious from the name alone. These memory kits will be available for purchase sometime in June of 2021.
AMD’s liquid-cooled Radeon RX 6900 XT appears to be making its way to the retail market. VideoCardz has spotted a Sapphire listing over at Kabum, a Brazilian retailer.
Speculation around the graphics card world is that the Radeon RX 6900 XT LC could very well be the incarnation of the rumored Radeon RX 6900 XTX. The latter was expected to utilize the Navi 21 XTXH silicon, which allegedly delivers the highest clock rates of AMD’s RDNA 2 army. Although we’ve already seen the graphics card in the wild, AMD hasn’t formally confirmed the existence of the Radeon RX 6900 XT LC.
The Sapphire Radeon RX 6900 XT LC, which should employ the Navi 21 XTXH die, comes with 80 Compute Units (CUs) for a total of 5,120 Stream Processors (SPs). Along with those 5,120 SPs, you’ll also find 80 ray accelerators for ray tracing workloads. The Radeon RX 6900 XT LC arrives with a 2,250 MHz game clock and a boost clock up to 2,435 MHz. The vanilla Radeon RX 6900 XT has a 2,015 MHz game clock and 2,250 MHz boost clock. Therefore, Sapphire’s rendition is offering up to 11.7% and 8.2% higher game and boost clocks, respectively.
Besides the uplift in clock speeds, Kabum’s specification table also shows an increase in memory speed for the Radeon RX 6900 XT LC. Apparently, the liquid-cooled version sports 18 Gbps GDDR6 memory chips as opposed to the 16 Gbps ones on the Radeon RX 6900 XT. It may be a human error, but it’s certainly feasible, considering that Samsung produces 18 Gbps GDDR6 memory chips. If accurate, the extra frequency on the memory chips bumps the Radeon RX 6900 XT LC’s maximum theoretical memory bandwidth up to 576 GBps, a 12.5% improvement over the regular version.
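For reference, the quoted uplifts can be reproduced from the listed numbers. Below is a minimal Python sketch (not from the source) using the usual peak-bandwidth formula for GDDR6; the 18 Gbps memory speed is, as noted above, unconfirmed.

```python
# Rough sanity check of the figures above (peak bandwidth = per-pin speed * bus width / 8).
def gddr6_bandwidth_gbps(pin_speed_gbps: float, bus_width_bits: int) -> float:
    """Peak theoretical memory bandwidth in GB/s for a GDDR6 configuration."""
    return pin_speed_gbps * bus_width_bits / 8

stock = gddr6_bandwidth_gbps(16, 256)   # Radeon RX 6900 XT -> 512 GB/s
lc = gddr6_bandwidth_gbps(18, 256)      # RX 6900 XT LC, if the 18 Gbps listing is right -> 576 GB/s
print(f"Stock: {stock:.0f} GB/s, LC: {lc:.0f} GB/s, uplift: {(lc / stock - 1) * 100:.1f}%")

# Clock uplifts quoted in the article
game_uplift = (2250 / 2015 - 1) * 100    # ~11.7%
boost_uplift = (2435 / 2250 - 1) * 100   # ~8.2%
print(f"Game clock: +{game_uplift:.1f}%, boost clock: +{boost_uplift:.1f}%")
```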
AMD Radeon RX 6900 XT LC Specifications

| | Radeon RX 6900 XT LC* | Radeon RX 6900 XT |
| --- | --- | --- |
| Architecture (GPU) | RDNA 2 (Navi 21) | RDNA 2 (Navi 21) |
| Stream Processors (SPs) | 5,120 | 5,120 |
| Ray Accelerators | 80 | 80 |
| Texture Units | 320 | 320 |
| Base Clock Rate | ? | 1,825 MHz |
| Game Clock Rate | 2,250 MHz | 2,015 MHz |
| Boost Clock Rate | 2,435 MHz | 2,250 MHz |
| Memory Capacity | 16GB GDDR6 | 16GB GDDR6 |
| Memory Speed | 18 Gbps | 16 Gbps |
| Memory Bus | 256-bit | 256-bit |
| Memory Bandwidth | 576 GBps | 512 GBps |
| ROPs | 128 | 128 |
| L2 Cache | 4MB | 4MB |
| L3 Cache | 128MB | 128MB |
| TDP | ? | 300W |
| Transistor Count | 26.8 billion | 26.8 billion |
| Die Size | 536 mm² | 536 mm² |
| MSRP | ? | $999 |

*Specifications are unconfirmed.
The Sapphire-branded Radeon RX 6900 XT LC (21308-02-10G) shares the same design as the Radeon RX 6900 XT LC that recently popped up inside a gaming PC over in China. Although listed as a Sapphire SKU, there are no signs of the Sapphire logo or any type of third-party marketing on the graphics card itself.
It stands to reason that the Radeon RX 6900 XT LC likely conforms to an AMD reference design where the chipmaker’s partners are free to slap their name beside the Big Navi graphics card. It flaunts a dual-slot design with aluminium plates on both sides of the graphics card. For comparison, the Radeon RX 6900 XT reference edition conforms to a 2.5-slot design. Evidently, there are no cooling fans so the Radeon RX 6900 XT LC’s only method of staying cool is the included 120mm AIO liquid cooler.
Despite coming with a significant factory overclock, the Sapphire Radeon RX 6900 XT LC still uses a pair of 8-pin PCIe power connectors. It’s uncertain if the liquid-cooled variant still abides by the same 300W TDP (thermal design power) limit as the normal Radeon RX 6900 XT, though. One would expect a more generous TDP, given the higher clock speeds on the Radeon RX 6900 XT LC.
The display outputs on the Sapphire Radeon RX 6900 XT LC, on the other hand, remain unchanged. Like AMD’s reference design, the liquid-cooled variant retains support for four monitors. It offers the standard HDMI 2.1 port, two DisplayPort 1.4 outputs and one USB Type-C port.
Kabum has the Sapphire Radeon RX 6900 XT LC up for preorder at $4,662.40, or $3,368.41 as a single cash payment. Don’t read too much into the pricing, since it probably includes VAT (value-added tax) and a huge retailer markup. Kabum claims that it’ll start shipping Sapphire Radeon RX 6900 XT LC orders on June 30, so we could see a proper announcement from AMD very soon.
Fossil executives Greg McKelvey and Steve Prokup said in an interview with CNET that the company’s existing Wear OS watches won’t get upgraded to the new combined Wear software platform from Samsung and Google. Instead, Fossil is planning a premium watch that Prokup said would have “pretty major hardware upgrades,” and existing watches will probably be discounted as budget options.
According to CNET, Fossil’s Gen 6 watch should have features similar to upcoming watches from Google and Samsung, with faster performance, longer battery life, new chips, and LTE cellular options. “All of the software benefits that Google’s talking about and launching with the unified platform is something we’ll be building into that as well,” McKelvey said.
Google and Samsung announced last month at Google I/O that they would combine Wear OS and Samsung’s Tizen into a new platform, creating one central smartwatch OS for Android.
The Fossil execs didn’t give much insight into what buttons or crowns the company may use in its next Wear OS watch. ”I think you’re still going to see a variety of offerings across even our products, as well as manufacturers,” Prokup said in the interview, “not so much that you’re going to have a watch that ends up having four, five, six dedicated buttons or no buttons.”
King Yuan Electronics, a top 10 outsourced semiconductor assembly and test (OSAT) contractor in the world, announced Friday that it is halting production for two days after several employees tested positive for COVID-19. A two-day shutdown shouldn’t devastate the market, but amid an ongoing chip shortage — largely stemming from chip packaging — the news is still daunting.
King Yuan halted all production at its two factories in Taiwan as of 5:20 p.m. Taipei time for 48 hours for a full cleaning and disinfection. The action will decrease the company’s output and revenue by 4-6% in June, the company said in a statement to the Taiwan Stock Exchange.
This should not significantly impact the company’s output for the year. However, delays of at least two days may be impactful for some of King Yuan’s clients. And this isn’t encouraging news for the already supply-constrained microelectronics industry.
King Yuan is the world’s seventh largest OSAT contractor with $267 million in sales in Q1 2021. The company has multiple factories in Taiwan and its customer base includes Intel and Samsung.
According to Reuters, 67 employees working at a King Yuan factory in the northern city of Miaoli contracted COVID-19. Taiwan’s authorities plan to test 7,000 more people in the region.
Taiwan closed down in early 2020 to avoid a local outbreak. But soon after Taiwan opened up earlier this year, the number of COVID cases began to grow and reached its highest point of the pandemic this spring.
This outbreak has threatened operations of Taiwan’s high-tech chip industry, including TSMC, the world’s largest foundry, and companies like ASE Technology, Micron, UMC and Winbond.
Nvidia does not have plans to bring its ray tracing-enabled GPU architectures to smartphones or other ultra-mobile devices right now, CEO Jensen Huang told journalists at a Computex meeting this week. The statements come just days after AMD confirmed that upcoming Samsung smartphones using AMD RDNA2 GPU architecture will support ray tracing.
According to Huang, the time for ray tracing in mobile gadgets hasn’t arrived yet.
“Ray tracing games are quite large, to be honest,” Huang said, according to ZDNet. “The data set is quite large, and there will be a time for it. When the time is right we might consider it.”
AMD, meanwhile, has licensed its RDNA2 architecture, which supports ray tracing, to Samsung for use in the upcoming Exynos 2200 SoC expected to power its laptops and other flagship mobile devices. AMD CEO Dr. Lisa Su said this week that the SoC will indeed support ray tracing.
“The next place you’ll find RDNA2 will be the high-performance mobile phone market,” Su said, as reported by AnandTech. “AMD has partnered with industry leader Samsung to accelerate graphics innovation in the mobile market, and we’re happy to announce we will bring custom graphics IP to Samsung’s next flagship SoC, with ray tracing and variable rate shading capabilities. We’re really looking forward to Samsung providing more details later this year.”
Currently, Samsung’s Exynos-powered smartphones use Arm Mali graphics, whereas Qualcomm Snapdragon-based handsets use Adreno GPUs.
Nvidia is in the process of taking over Arm, which develops general-purpose Cortex CPU cores as well as Mali graphics processing units for various system-on-chips (SoCs). Nvidia has long tried to license its GeForce technologies to designers of mobile SoCs and devices without any tangible success. If Nvidia’s acquisition of Arm is approved by various regulators, Nvidia will be able to offer its latest GeForce architectures to Arm licensees. Yet, it appears Nvidia has no immediate plans to bring GeForce RTX to smartphones.
Nvidia’s Ampere and Turing architectures seem to be too bulky for smartphone SoCs (and even for entry-level PC graphics) anyway. For now, the company will have to use its GeForce Now game streaming service to address demanding gamers on smartphones and tablets.
“That’s how we would like to reach Android devices, Chrome devices, iOS devices, MacOS devices, Linux devices — all kinds of devices, whether it’s on TV, or mobile device or PC,” said Huang. “I think that for us, right now, that is the best strategy.”
Yet, ray tracing is nothing new on mobiles. Imagination Technologies has supported ray tracing in its architectures since the PowerVR GR6500, introduced in 2014, so it’s up to hardware designers to decide on implementing the capability and game designers to leverage it. Imagination’s PowerVR ray tracing implementation is currently supported by Unreal Engine 4 and Unity 5, but it’s unclear whether it’s primarily used for eye candy, performance increases and/or power reduction.
(Pocket-lint) – Google has announced a new version of the Pixel Buds, its true wireless headphones that originally launched in 2017 – the first-gen weren’t all that, though, while the second-gen Buds 2 stepped things up a little in 2019.
The third model belongs to the A-Series, picking up on the A-series naming we’ve seen in Google’s phones, and presenting an affordable choice of true wireless headset.
What’s different to the previous Pixel Buds?
To look at, there isn’t a huge difference between the A-Series and the Buds 2: both have the same overall styling and come in a case that’s smooth, much like a pebble.
Both have the same earbud design with a little promontory at the top to help keep them secure, and a round touch-control area on the outside.
The Pixel Buds 2 have wireless charging, however, and the inside of their case and the inner part of the ‘buds have a matte finish to the plastics, while the A-Series is glossy. That means the older version looks slightly higher quality.
The A-Series also lacks the option to change the volume via gestures – instead you have to use voice for that – and there are a few minor feature differences. Otherwise, the experience is much the same – but the A-Series is much cheaper.
Design & Build
Earbud: 20.7 x 29.3 x 17.5mm; 5.06g
Colours: Dark Olive / Clearly White
Case: 63 x 47 x 25mm; 52.9g
IPX4 water-resistant
Three ear tip sizes
The Buds A-Series’ case, for all intents and purposes, is the same as that of the Buds 2: it’s the same size, has the same feel, and offers that same satisfying action when you open and close the lid. Both have a USB-C charging port and a manual connection button on the rear, but the A-Series is slightly lighter.
There’s a satisfying magnetic action when you drop the ‘buds into the case to charge. And don’t worry about mixing these up if you happen to have the older version too – the A-Series has two charging contacts inside, while the Pixel Buds 2 has three.
There are two colours for the A-Series – Clearly White or Dark Olive – and opening the lid reveals the colour you’re looking at, as it’s the smooth, round, touch-sensitive end of these Buds, carrying the ‘G’ logo, that makes them really distinctive.
The A-Series ‘buds have the same design as the previous model, with the body of the earbud designed to sit in the concha of the ear, while sealing into the canal with a choice of three different ear tips. These are round – Google seemingly hasn’t been tempted to move to oval as seen on some rivals.
There’s an additional rubber arm that sticks out the top of the buds, designed to slot into one of the folds at the top of your ear to help keep things secure. We weren’t a fan of it on the previous version and we have the same reservations here: you can’t remove it from the ‘buds and we’re not convinced it’s necessary. For us, the Buds A-Series sit securely in the ear anyway – even when exercising.
Indeed, if we rotate the earbuds to get that blobby rubber arm to engage with our ears, the sound from the headphones gets worse because they then don’t sit in the best position for our ears. That’s one thing to consider: all ears are different, so this might work for some people and not for others.
The great thing about these earbuds’ design is that they don’t hang out of your ear, so you don’t need to worry about pulling a hat over the top or anything else – we think they look a lot better than the ear-dribble style of Apple’s AirPods and all those who copy them. We find the Google design more comfortable for wearing over long periods, too.
Connection, setup and control
Native Pixel support
Pixel Buds app
Touch controls
Google Fast Pair means you just have to lift the lid of the case and your nearby Android phone will detect the Pixel Buds A-Series and allow you to connect with one tap. It’s essentially the same as Apple’s system with the AirPods and iPhone, linking the Buds to the Google account you register them with so they are then available on other devices too.
If you’re using a Pixel phone then you’ll have native support for the Buds; if you’re using another brand of Android device you’ll be prompted to download the Pixel Buds app, which will provide access to firmware updates and details on how to use all the features, as well as some options.
As far as setup is concerned, that’s all there is to it: you’ll be asked to walk through things like Google Assistant, and you’ll be prompted to allow notifications access, so you can unlock the potential of the Pixel Buds.
The touch-controls are fairly easy to master, too, with both left and right sides offering the same function: single-tap to play/pause; double-tap to skip forward; triple-tap to skip backwards; press-and-hold to get a notifications update.
The last of those is interesting, because you’ll get a report of the time and then you’ll be told about your notifications – with the option to reply, needing a press-and-hold to speak your reply, before it’s confirmed and then sent.
Missing from this selection of touch-controls is volume: unlike the Pixel Buds 2, you can’t swipe to change the volume, you have to ask Google Assistant to do it or you have to thumb the volume controller on your device instead.
This, we feel, is the biggest flaw of these headphones: volume control is pretty important when you’re listening to something, so having to ask Google using voice just isn’t appropriate in all situations.
Google Assistant and smart features
Google Assistant integration
Adaptive Sound
With a lack of volume control, Google pushes its Adaptive Sound option as a solution. This is designed to adapt the volume to the ambient sound levels: as the external noise goes up, so does the volume of the headphones. That’s fine in principle and works when you move from one area of consistent background noise to another – from a quiet library to a server room with whirring fans, for example – but it’s hopeless when you have varying noise levels.
Just walk along a busy street with Adaptive Sound on and you’ll find the volume of the headphones yo-yoing, because it’s not constant noise, it depends on what’s driving past at that moment. This could be corrected by a software update with Google reducing the frequency of volume changes. If you manually adjust the volume then it suspends the system for a bit and leaves the control to you, but in reality, it’s just too irritating to use in many situations and you might as well turn your phone volume up instead.
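To illustrate the kind of fix a software update could bring, here’s a purely hypothetical Python sketch that smooths ambient readings with an exponential moving average and only moves the volume after a sustained change. Google’s actual Adaptive Sound algorithm isn’t public, so every name, threshold and baseline below is invented.

```python
class SmoothedAdaptiveVolume:
    """Hypothetical damping of ambient-driven volume changes; not Google's implementation."""

    def __init__(self, alpha: float = 0.05, step_db: float = 3.0, baseline_db: float = 50.0):
        self.alpha = alpha                # small alpha = slow-moving average of ambient noise
        self.step_db = step_db            # ignore drifts smaller than this
        self.baseline_db = baseline_db    # assumed quiet-room level; no boost below it
        self.ambient_avg = None
        self.volume_offset_db = 0.0

    def update(self, ambient_db: float) -> float:
        """Feed one ambient-noise sample (dB); return the volume offset to apply (dB)."""
        if self.ambient_avg is None:
            self.ambient_avg = ambient_db
        # Exponential moving average filters out brief spikes (a passing bus, a door slam).
        self.ambient_avg += self.alpha * (ambient_db - self.ambient_avg)
        target = max(0.0, self.ambient_avg - self.baseline_db)
        # Hysteresis: only move the volume once the smoothed target drifts a full step away.
        if abs(target - self.volume_offset_db) >= self.step_db:
            self.volume_offset_db = target
        return self.volume_offset_db
```

The point of the slow average plus the dead band is that a single car driving past barely nudges the smoothed level, while a genuinely noisier environment eventually does raise it – which is roughly the behaviour the current implementation seems to lack.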
As we’ve said, Google Assistant is fully integrated into the headphones, so you can ask Google anything that you might on your phone or Nest Hub at home. For fans of the system, that’s a great addition, because you don’t need to fish your phone out of your pocket first. Sure, there are lots of headphones out there that offer Google Assistant, but naturally, Google puts Google first and the experience is nice and smooth.
It’s also a two-way experience, with Google Assistant notifying you of incoming messages and it’s able to read them out to you too – with the option to speak a reply. You can disable messages from any apps you don’t want in the Pixel Buds app, to maintain privacy (or, indeed, a barrage of non-stop voiced messaging). You can also trigger message sending through voice – and you’ll get to confirm the message that’s being sent.
Thanks to Voice Match, it will only respond to your voice – and that also means you can access things like your calendar and so on. It’s plain sailing all round.
Sound quality and performance
Buds: 5 hours battery life
Case: 19 hours extra
Spatial Vents
Bass Boost
When it comes to performance, Google is taking a bit of a gamble. Rather than pursuing isolation from the outside world, it wants to provide an experience that lets some of the ambient sound in, so you don’t feel cut off.
Google uses what it calls Spatial Vents, claiming that the headphones provide a gentle seal rather than trying to block everything out. We’re not huge fans of this approach; with the rise in headphones offering active noise cancellation (ANC), it seems that isolation is, generally speaking, what people are buying.
Needless to say, there’s no ANC here and you’ll be able to hear what’s happening around you a lot of the time. At home that’s perhaps useful – you can hear the doorbell or the dog bark – but out on public transport, you’ll hear every announcement, door crash and clatter of the wheels on the tracks, and that’s not something we want. This is exactly the same experience as the previous Pixel Buds, and whether that suits you will depend very much on where you wear your headphones. If that’s a busy place, the A-Series might not be the best for you.
Aside from that, in quiet conditions, the sound quality is actually very good. The Pixel Buds A-Series benefits from the Bass Boost option that Google added as a software update to the previous Buds in late 2020, so they offer better performance for tracks which want a driving bassline. In quiet conditions at home we have no complaints: the Pixel Buds A-Series is a great pair of headphones, especially at the asking price and given the smart options they offer.
When it comes to calls there are two beam-forming mics on each ‘bud, but they still let noise through to the caller. It’s reduced, but they’ll hear every car that drives past as a hiss. If you’re after a better calling experience, the Samsung Galaxy Buds Pro provide a far more effective veil of silence when making calls.
The Pixel Buds A-Series provide battery life of 5 hours, which we’ve found to be accurate – although we found the left ‘bud to drain slightly faster than the right one. The case takes total listening time to 24 hours, recharging the buds when they are back in it, and is itself charged via USB-C. This isn’t the longest battery life on the market, but it matches the Apple AirPods.
Verdict
The Pixel Buds A-Series have a lot to offer considering the price: Google Assistant integration, comfortable design, a lovely case, plus great audio performance when in quieter conditions.
The biggest downsides are the lack of on-bud volume controls and the design decision to not strive for isolation from external noise. The Adaptive Sound – which auto-adjusts volume – is a good idea in principle to compensate for this, but it sees the headphones’ volume yo-yo unnaturally.
Compared to the older Pixel Buds 2, we’d pick the Pixel Buds A-Series every time: they do the important things just as well but the price is much more approachable, meaning you can forgive the omissions given the context of price.
Also consider
Samsung Galaxy Buds Pro
Samsung’s Galaxy Buds Pro offer great noise-cancelling – which is especially effective when making calls – while also offering a great set of features.
Read the full review
Jabra Elite Active 75t
These headphones are a little more bulky, but they offer noise-cancellation that will almost entirely eliminate external noise. If you want silence, Jabra delivers it.
The latest gadget from Satechi is both a stand and a USB-C hub made for the 11-inch and 12.9-inch iPad Pro models that have a USB-C charging port, as well as the 2020 iPad Air. It’s $100, and with your tablet wedged into the stand, it should look more like a traditional monitor, so you can keep on pretending it’s an actual desktop replacement.
Like any hub, it gives you a handful of extra ports for expanding the functionality of your iPad. This one includes an HDMI port capable of outputting 4K at up to 60Hz refresh rate, a USB-A data port, a USB-C PD port with up to 60W of charging power, a headphone jack, and separate slots for an SD and microSD card.
You technically don’t need an iPad to use this hub, as this foldable stand is compatible with other computers, tablets, and phones that have a USB-C port. Satechi does mention, however, that your device needs a USB-C PD port for “full compatibility” with the hub. It lists the last five years of MacBook Pros, the 2018 and 2020 MacBook Air, Microsoft’s Surface Pro 7 and Surface Go, as well as the Samsung Galaxy S20 and Google Pixelbook as viable companions to its hub. Extra compatibility is nice, but if you don’t have an iPad (and thus, no need for the stand component), you can probably find a different USB-C hub that’s no less capable for cheaper.
If this product seems appealing to you, Satechi is knocking $20 off the cost through June 6th at midnight PT when you enter the offer code IPADPRO at checkout.
Samsung has introduced its first Zoned Namespaces (ZNS) solid-state drives, which combine high performance, long endurance, a relatively low price enabled by the company’s QLC V-NAND, and improved quality of service (QoS) for datacenters. To use Samsung’s new PM1731a ZNS SSDs, datacenters will have to deploy new storage systems and software.
ZNS SSDs work differently from conventional block-based drives and have a number of advantages. ZNS SSDs write data sequentially into large zones and have better control over write amplification, which reduces over-provisioning requirements by an order of magnitude. For example, some enterprise drives rated for 3 DWPD (drive writes per day) reserve about a third of their raw capacity for over-provisioning, but for ZNS SSDs about 10% of that is enough. In addition, since ZNS uses large zones instead of many 4KB blocks, garbage collection is not needed as often as on traditional SSDs, which also improves real-world read and write performance.
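As a rough back-of-the-envelope check of what that over-provisioning claim means for usable capacity, here’s a small Python sketch. The 4TB of raw NAND is a hypothetical figure, and the one-third and 10% numbers are the article’s rough estimates rather than Samsung’s spec sheet.

```python
raw_tb = 4.0                                  # hypothetical raw NAND capacity

conventional_reserve = raw_tb / 3             # ~a third reserved on a 3 DWPD block-based drive
zns_reserve = conventional_reserve * 0.10     # ZNS gets by with roughly a tenth of that

print(f"Conventional usable: {raw_tb - conventional_reserve:.2f} TB "
      f"({conventional_reserve / raw_tb:.0%} of raw capacity reserved)")
print(f"ZNS usable:          {raw_tb - zns_reserve:.2f} TB "
      f"({zns_reserve / raw_tb:.0%} of raw capacity reserved)")
```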
Samsung’s 2.5-inch PM1731a ZNS SSDs are based on the company’s proprietary dual-port controller as well as 6th Generation QLC V-NAND memory. The drives will be available in 2TB and 4TB capacities.
Samsung says that its new PM1731a ZNS drives will last up to four times longer than conventional NVMe SSDs, which will reduce their total cost of ownership (TCO) and will simplify server infrastructure.
But enhanced endurance and performance come at a cost: the ZNS ecosystem requires new software infrastructure that is not yet widely available. In a bid to make ZNS more widespread, Samsung is participating in a number of open-source ZNS projects. The manufacturer also plans to make its ZNS technology available to xNVMe and participate in the Storage Performance Development Kit (SPDK) community to enable NVMe and SPDK users to implement ZNS more easily.
Samsung will start mass production of its PM1731a ZNS SSDs in the second half of the year. The company is the second major maker of SSDs, the first being Western Digital, to unveil a ZNS SSD.
“Samsung’s ZNS SSD reflects our commitment to introducing differentiated storage solutions that can substantially enhance the reliability and lifetime of server SSDs,” said Sangyeun Cho, senior vice president of the Memory Software Development Team at Samsung Electronics. “We plan to leverage quad-level cell (QLC) NAND technology in our next-generation ZNS drives to enable higher thresholds for storage performance and capacity in the enterprise systems of tomorrow.”
Samsung has announced two new Windows laptops running Arm-based processors. The Galaxy Book Go and Galaxy Book Go 5G both use Snapdragon chips from Qualcomm rather than Samsung’s own Exynos designs.
The Galaxy Book Go is an entry-level model that starts at $349. It has the updated Snapdragon 7c Gen 2 processor that Qualcomm announced last month, as well as 4GB or 8GB of RAM and 64GB or 128GB of eUFS storage. The display is a 14-inch 1080p LCD and the laptop is 14.9mm thick, weighing in at 1.38kg.
The Galaxy Book Go 5G, meanwhile, uses Qualcomm’s more powerful Snapdragon 8cx Gen 2 processor — though other laptops with that chip aren’t exactly powerhouses — and, as the name suggests, it includes 5G connectivity. Despite running on a Snapdragon chip with an integrated LTE modem, the $349 Galaxy Book Go is actually Wi-Fi-only.
Specs otherwise appear to be shared between the two laptops. The Galaxy Book Go has two USB-C ports, one USB-A port, a headphone jack, a 720p webcam, and a microSD card slot. Samsung hasn’t given pricing or release information for the Galaxy Book Go 5G just yet, but the $349 Galaxy Book Go is going on sale on June 10th.
Last year’s Nvidia RTX 3080 was the first GPU to make 4K gaming finally feasible. It was a card that delivered impressive performance at 4K, especially for its retail price of $699 — far less than the 2080 Ti cost a generation earlier. That was before the reality of a global chip shortage drove the prices of modern GPUs well above $1,000. Now that the street prices of RTX 3080s have stayed above $2,000 for months, Nvidia is launching its RTX 3080 Ti flagship priced at $1,199.
It’s a card that aims to deliver near identical levels of performance to the $1,499 RTX 3090, but in a smaller package and with just 12GB of VRAM — half what’s found on the RTX 3090. Nvidia is effectively competing with itself here, and now offering three cards at the top end. That’s if you can even manage to buy any of them in the first place.
I’ve spent the past week testing the RTX 3080 Ti at both 4K and 1440p resolutions. 4K gaming might have arrived originally with the RTX 2080 Ti, but the RTX 3080 Ti refines it and offers more headroom in the latest games. Unfortunately, it does so with a $1,199 price tag that I think will be beyond most people’s budgets even before you factor in the inevitable street price markup it will see during the current GPU shortage.
Hardware
If you put the RTX 3080 Ti and the RTX 3080 side by side, it would be difficult to tell the difference between them. They look identical, with the same ports and fan setup. I’m actually surprised this card isn’t a three-slot like the RTX 3090, or just bigger generally. The RTX 3080 Ti has one fan on either side of the card, with a push-pull system in place. The bottom fan pulls cool air into the card, which then exhausts on the opposite side that’s closest to your CPU cooler and rear case fan. A traditional blower cooler also exhausts the hot air out of the PCIe slot at the back.
This helped create a quieter card on the original RTX 3080, and I’m happy to report it’s the same with the RTX 3080 Ti. The RTX 3080 Ti runs at or close to its max fan RPM under heavy loads, but the hum of the fans isn’t too distracting. I personally own an RTX 3090, and while the fans rarely kick in at full speed, they’re certainly a lot more noticeable than the RTX 3080 Ti’s.
Nvidia has used the same RTX 3080 design for the 3080 Ti model.
That quiet performance might have a downside, though. During my week of testing with the RTX 3080 Ti, I noticed that the card seems to run rather hot. I recorded temperatures regularly around 80 degrees Celsius, compared to the 70 degrees Celsius temperatures on the larger RTX 3090. The fans also maxed out a lot during demanding 4K games on the RTX 3080 Ti in order to keep the card cool. I don’t have the necessary equipment to fully measure the heat output here, but when I went to swap the RTX 3080 Ti for another card after hours of testing, it was too hot to touch, and stayed hotter for longer than I’d noticed with either the RTX 3080 or RTX 3090. I’m not sure if this will result in problems in the long term, as we saw with the initial batch of 2080 Ti units having memory overheating issues, but most people will put this in a case and never touch it again. Still, I’m surprised at how long it stayed hot enough for me to not want to touch it.
As this is a Founders Edition card, Nvidia is using its latest 12-pin single power connector. There’s an ugly and awkward adapter in the box that lets you connect two eight-pin PCIe power connectors to it, but I’d highly recommend getting a single new cable from your PSU supplier to connect directly to this card. It’s less cabling, and a more elegant solution if you have a case window or you’re addicted to tidy cable management (hello, that’s me).
I love the look of the RTX 3080 Ti and the pennant-shaped board that Nvidia uses here. Just like the RTX 3080, there are no visible screws, and the regulatory notices are all on the output part of the card so there are no ugly stickers or FCC logos. It’s a really clean card, and I’m sorry to bring this up, but Nvidia has even fixed the way the number 8 is displayed. It was a minor mistake on the RTX 3080, but I’m glad the 8 has the correct proportions on the RTX 3080 Ti.
At the back of the card there’s a single HDMI 2.1 port and three DisplayPort 1.4a ports. Just like the RTX 3080, there are also LEDs that glow around the top part of the fan, and the GeForce RTX branding lights up, too. You can even customize the colors of the glowing part around the fan if you’re really into RGB lighting.
Just like the RTX 3080, this new RTX 3080 Ti needs a 750W power supply. The RTX 3080 Ti draws more power, too, at up to 350 watts under load compared to 320 watts on the RTX 3080. That’s the same amount of power draw as the larger RTX 3090, which is understandable given the performance improvements, but it’s worth being aware of how this might impact your energy bills (and the cost of the PC build needed to run it).
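As a rough illustration of that point, the sketch below estimates a yearly electricity cost at full GPU load. The 20 hours of gaming per week and $0.15 per kWh are assumed placeholder figures; only the 320W and 350W board power numbers come from the article.

```python
HOURS_PER_WEEK = 20      # assumed gaming time
PRICE_PER_KWH = 0.15     # assumed electricity price in USD

def yearly_cost(load_watts: float) -> float:
    """Approximate yearly cost of running the GPU at the given sustained load."""
    kwh_per_year = load_watts / 1000 * HOURS_PER_WEEK * 52
    return kwh_per_year * PRICE_PER_KWH

for name, watts in [("RTX 3080 (320W)", 320), ("RTX 3080 Ti / RTX 3090 (350W)", 350)]:
    print(f"{name}: ~${yearly_cost(watts):.0f} per year")
```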
1440p testing
I’ve been testing the RTX 3080 Ti with Intel’s latest Core i9 processor. For 1440p tests, I’ve also paired the GPU with a 32-inch Samsung Odyssey G7 monitor. This monitor supports refresh rates up to 240Hz, as well as Nvidia’s G-Sync technology.
I compared the RTX 3080 Ti against both the RTX 3080 and RTX 3090 to really understand where it fits into Nvidia’s new lineup. I tested a variety of AAA titles, including Fortnite, Control, Death Stranding, Metro Exodus, Call of Duty: Warzone, Microsoft Flight Simulator, and many more. You can also find the same games tested at 4K resolution below.
All games were tested at max or ultra settings on the RTX 3080 Ti, and most exceeded an average of 100fps at 1440p. On paper, the RTX 3080 Ti is very close to an RTX 3090, and my testing showed that plays out in most games at 1440p. Games like Microsoft Flight Simulator, Assassin’s Creed: Valhalla, and Watch Dogs: Legion all have near-identical performance across the RTX 3080 Ti and RTX 3090 at 1440p.
Even Call of Duty: Warzone is the same without Nvidia’s Deep Learning Super Sampling (DLSS) technology enabled, and it’s only really games like Control and Death Stranding where there’s a noteworthy, but small, gap in performance.
However, the jump in performance from the RTX 3080 to the RTX 3080 Ti is noticeable across nearly every game, with the exception of Death Stranding and Fortnite, which both perform really well on the base RTX 3080.
RTX 3080 Ti (1440p)

| Benchmark | RTX 3080 Founders Edition | RTX 3080 Ti Founders Edition | RTX 3090 Founders Edition |
| --- | --- | --- | --- |
| Microsoft Flight Simulator | 46fps | 45fps | 45fps |
| Shadow of the Tomb Raider | 147fps | 156fps | 160fps |
| Shadow of the Tomb Raider (DLSS) | 154fps | 162fps | 167fps |
| CoD: Warzone | 124fps | 140fps | 140fps |
| CoD: Warzone (DLSS+RT) | 133fps | 144fps | 155fps |
| Fortnite | 160fps | 167fps | 188fps |
| Fortnite (DLSS) | 181fps | 173fps | 205fps |
| Gears 5 | 87fps | 98fps | 103fps |
| Death Stranding | 163fps | 164fps | 172fps |
| Death Stranding (DLSS quality) | 197fps | 165fps | 179fps |
| Control | 124fps | 134fps | 142fps |
| Control (DLSS quality + RT) | 126fps | 134fps | 144fps |
| Metro Exodus | 56fps | 64fps | 65fps |
| Metro Exodus (DLSS+RT) | 67fps | 75fps | 77fps |
| Assassin’s Creed: Valhalla | 73fps | 84fps | 85fps |
| Watch Dogs: Legion | 79fps | 86fps | 89fps |
| Watch Dogs: Legion (DLSS+RT) | 67fps | 72fps | 74fps |
| Watch Dogs: Legion (RT) | 49fps | 55fps | 56fps |
Assassin’s Creed: Valhalla performs 15 percent better on the RTX 3080 Ti over the regular RTX 3080, and Metro Exodus also shows a 14 percent improvement. Performance increases range from around 4 percent all the way up to 15 percent, so the gap is very game dependent.
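Those percentages fall straight out of the 1440p table above; here’s a quick Python snippet showing the arithmetic:

```python
def pct_gain(base_fps: float, new_fps: float) -> float:
    """Percentage improvement of the RTX 3080 Ti over the RTX 3080."""
    return (new_fps / base_fps - 1) * 100

# (RTX 3080 fps, RTX 3080 Ti fps) at 1440p, taken from the table above
results_1440p = {
    "Assassin's Creed: Valhalla": (73, 84),
    "Metro Exodus": (56, 64),
    "Fortnite": (160, 167),
}
for game, (rtx_3080, rtx_3080_ti) in results_1440p.items():
    print(f"{game}: +{pct_gain(rtx_3080, rtx_3080_ti):.0f}%")   # ~15%, ~14%, ~4%
```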
Even when using games with ray tracing, the RTX 3080 Ti still managed high frame rates when paired with DLSS. DLSS uses neural networks and AI supercomputers to analyze games and sharpen or clean up images at lower resolutions. In simple terms, it allows a game to render at a lower resolution and use Nvidia’s image reconstruction technique to upscale the image and make it look as good as native 4K.
Whenever I see the DLSS option in games, I immediately turn it on now to get as much performance as possible. It’s still very much required for ray tracing games, particularly as titles like Watch Dogs: Legion only manage to hit 55fps with ultra ray tracing enabled. If you enable DLSS, this jumps to 72fps and it’s difficult to notice a hit in image quality.
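For a sense of how much work DLSS takes off the GPU, the sketch below computes the internal render resolution from a per-axis scale factor. The roughly 67% (Quality) and 50% (Performance) factors are commonly cited figures for DLSS 2, not numbers taken from this review.

```python
def internal_resolution(output_w: int, output_h: int, scale: float) -> tuple[int, int]:
    """Resolution the game actually renders at before DLSS reconstructs the full output."""
    return round(output_w * scale), round(output_h * scale)

for mode, scale in [("Quality", 2 / 3), ("Performance", 0.5)]:
    for out_w, out_h in [(2560, 1440), (3840, 2160)]:
        w, h = internal_resolution(out_w, out_h, scale)
        print(f"{out_w}x{out_h} output, DLSS {mode}: renders at {w}x{h}")
```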
4K testing
For my 4K testing, I paired the RTX 3080 Ti with Acer’s 27-inch Nitro XV273K, a 4K monitor that offers up to 144Hz refresh rates and supports G-Sync. I wasn’t able to get any of the games I tested on both the RTX 3080 Ti and RTX 3090 to hit the frame rates necessary to really take advantage of this 144Hz panel, but some came close thanks to DLSS.
Metro Exodus manages a 14 percent improvement over the RTX 3080, and Microsoft Flight Simulator also sees a 13 percent jump. Elsewhere, other games see between a 4 and 9 percent improvement. These are solid gains for the RTX 3080 Ti, providing more headroom for 4K gaming over the original RTX 3080.
The RTX 3080 Ti comes close to matching the RTX 3090 performance at 4K in games like Watch Dogs: Legion, Assassin’s Creed: Valhalla, Gears 5, and Death Stranding. Neither the RTX 3080 Ti nor RTX 3090 is strong enough to handle Watch Dogs: Legion with ray tracing, though. Both cards manage around 30fps on average, and even DLSS only bumps this up to below 50fps averages.
RTX 3080 Ti (4K)

| Benchmark | RTX 3080 Founders Edition | RTX 3080 Ti Founders Edition | RTX 3090 Founders Edition |
| --- | --- | --- | --- |
| Microsoft Flight Simulator | 30fps | 34fps | 37fps |
| Shadow of the Tomb Raider | 84fps | 88fps | 92fps |
| Shadow of the Tomb Raider (DLSS) | 102fps | 107fps | 111fps |
| CoD: Warzone | 89fps | 95fps | 102fps |
| CoD: Warzone (DLSS+RT) | 119fps | 119fps | 129fps |
| Fortnite | 84fps | 92fps | 94fps |
| Fortnite (DLSS) | 124fps | 134fps | 141fps |
| Gears 5 | 64fps | 72fps | 73fps |
| Death Stranding | 98fps | 106fps | 109fps |
| Death Stranding (DLSS quality) | 131fps | 132fps | 138fps |
| Control | 65fps | 70fps | 72fps |
| Control (DLSS quality + RT) | 72fps | 78fps | 80fps |
| Metro Exodus | 34fps | 39fps | 39fps |
| Metro Exodus (DLSS+RT) | 50fps | 53fps | 55fps |
| Assassin’s Creed: Valhalla | 64fps | 70fps | 70fps |
| Watch Dogs: Legion | 52fps | 55fps | 57fps |
| Watch Dogs: Legion (DLSS+RT) | 40fps | 47fps | 49fps |
| Watch Dogs: Legion (RT) | 21fps | 29fps | 32fps |
Most games manage to comfortably rise above 60fps in 4K at ultra settings, with Microsoft Flight Simulator and Metro Exodus as the only exceptions. Not even the RTX 3090 could reliably push beyond 144fps at 4K without assistance from DLSS or a drop in visual settings. I think we’re going to be waiting on whatever Nvidia does next to really push 4K at these types of frame rates.
When you start to add ray tracing and ultra 4K settings, it’s clear that both the RTX 3080 Ti and RTX 3090 need to have DLSS enabled to play at reasonable frame rates across the most demanding ray-traced titles. Without DLSS, Watch Dogs: Legion manages an average of 29fps (at max settings), with dips below that making the game unplayable.
DLSS really is the key here across both 1440p and 4K. It was merely a promise when the 2080 Ti debuted nearly three years ago, but Nvidia has now managed to get DLSS into more than 50 popular games. Red Dead Redemption 2 and Rainbow Six Siege are getting DLSS support soon, too.
DLSS also sets Nvidia apart from AMD’s cards. While AMD’s RX 6800 XT is fairly competitive at basic rasterization at 1440p, it falls behind the RTX 3080 in the most demanding games at 4K — particularly when ray tracing is enabled. Even the $1,000 Radeon RX 6900 XT doesn’t fare much better at 4K. AMD’s answer to DLSS is coming later this month, but until it arrives we still don’t know exactly how it will compensate for ray tracing performance on AMD’s GPUs. AMD has also struggled to supply retailers with stock of its cards.
That’s left Nvidia in a position to launch the RTX 3080 Ti at a price point that really means it’s competing with itself, positioned between the RTX 3080 and RTX 3090. If the RTX 3090 wasn’t a thing, the RTX 3080 Ti would make a lot more sense.
Nvidia is also competing with the reality of the market right now, as demand has been outpacing supply for more than six months. Nvidia has introduced a hash rate limiter for Ethereum cryptocurrency mining on new versions of the RTX 3080, RTX 3070, and now this RTX 3080 Ti. It could help deter some scalpers, but we’ll need months of data on street prices to really understand if it’s driven pricing down to normal levels.
Demand for 30-series cards has skyrocketed as many rush to replace their aging GTX 1080 and GTX 1080 Ti cards. Coupled with Nvidia’s NVENC and professional tooling support, it’s also made the RTX 30-series a great option for creators looking to stream games, edit videos, or build games.
In a normal market, I would only recommend the RTX 3080 Ti if you’re really willing to spend an extra $500 to get some extra gains in 1440p and 4K performance. But it’s a big price premium when the RTX 3090 exists at this niche end of the market and offers more performance and double the VRAM if you’re really willing to pay this much for a graphics card.
At $999 or even $1,099, the RTX 3080 Ti would tempt me a lot more, but $1,199 feels a little too pricey. For most people, an RTX 3080 would make a lot more sense – if it were actually available at its standard retail price. Nvidia also has a $599 RTX 3070 Ti on the way next week, which could offer some performance gains to rival the RTX 3080.
Either way, the best GPU is the one you can buy right now, and let’s hope that Nvidia and AMD manage to make that a reality soon.
The Samsung and LG-sourced screens for the iPhone 13 are already in production, and there will be support for 120Hz refresh rates. That’s according to Korean news site TheElec.
Production has started a month earlier than last year, suggesting that Apple’s iPhone 13 range will return to its normal launch schedule of September. The iPhone 12 launched a month later last year due to component sourcing issues caused by the Coronavirus pandemic.
The site says Samsung Display started production in the middle of May, with LG Display following “recently”. Samsung is planning on making 80 million OLED screens for the iPhone 13, while LG will make 30 million.
Samsung’s TFT OLEDs (which have a maximum refresh rate of 120Hz) are destined for the top two iPhone 13 models (likely the iPhone 13 Pro and Pro Max), according to the report. LG’s will be used in the lower-end models (iPhone 13 and 13 Mini).
The report doesn’t explicitly say that the iPhone 13 and 13 Mini won’t have 120Hz refresh rates, but it’s implied.
A 120Hz refresh rate would double that of the handsets in the iPhone 12 range. A higher refresh rate should mean less blur – especially noticeable in fast-moving content like sports and games.
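The frame-time arithmetic behind that is simple; a few lines of Python for illustration:

```python
for hz in (60, 120):
    # Each frame is held on screen for 1000/hz milliseconds, so 120Hz halves the hold time
    # and samples fast motion twice as often as 60Hz.
    print(f"{hz}Hz -> {1000 / hz:.2f} ms per frame")
```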
So it appears to be full steam ahead for September, although we expect to see plenty more leaks before then – we’ll bring you the most credible as they arrive.
MORE:
These are the best iPhones you can currently buy
What to expect from the Apple AirPods 3 and AirPods Pro 2
Apple Music lossless: which devices will (and won’t) play lossless and Spatial Audio
Apple’s streaming TV app is coming to another platform today: Nvidia’s Shield. Shield owners will now be able to access Apple TV Plus, rent movies through Apple’s store, and access subscriptions to premium channels like Showtime and Starz that were set up through Apple.
The biggest hook is finally getting access to Apple TV Plus. Apple needs the streaming service to be accessible in as many places as possible in order to expand viewership. And viewers need to be able to access the service on whatever device is hooked up to their TV, if Apple wants to make sure people use it and stay signed up.
Apple TV Plus is already available through many of the most popular streaming devices. It’s offered through Roku and Fire TV streaming devices, available on recent PlayStations and Xboxes, and supported on many Vizio, Sony, Samsung, and LG TVs. The app came to Google’s latest Chromecast in February, and it was supposed to expand to other Android TV devices — like the Shield — sometime after that. The service will support Dolby Vision and Dolby Atmos on Shield devices.
The timing is good for Apple. Free trials for Apple TV are about to lapse for the service’s earliest users. And the second season of the service’s biggest (and pretty much only) hit, Ted Lasso, debuts July 23rd. The more places Apple TV Plus can be accessed, the better odds Apple has of getting some actual paying subscribers.
Samsung could enable HDR10+ for gaming, according to a German blog post spotted by HDTVtest. The article claims Samsung executives are working with ‘various unnamed studios’ to set up a steady supply of HDR10+ titles.
The HDR10+ format was created by Samsung and is a competitor to Dolby Vision. Like Dolby Vision, HDR10+ is all about adding dynamic metadata to the HDR signal to deliver more detail. Unlike Dolby Vision, companies don’t need to pay a fee to license HDR10+.
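To illustrate what dynamic metadata buys you, here’s a deliberately simplified Python sketch of per-scene tone mapping. It’s a conceptual illustration only: the function and the numbers below are invented and don’t reflect the actual HDR10+ metadata format or tone-mapping curves.

```python
def tone_map(pixel_nits: float, scene_max_nits: float, display_max_nits: float) -> float:
    """Naively compress a scene's brightness range into what the display can show."""
    if scene_max_nits <= display_max_nits:
        return pixel_nits                                   # scene already fits: leave it alone
    return pixel_nits * display_max_nits / scene_max_nits   # otherwise scale it down

# Static metadata forces one worst-case maximum (say 4,000 nits) onto every scene,
# dimming even scenes that never get that bright. Per-scene (dynamic) metadata lets
# a dim 300-nit scene keep its detail on a 1,000-nit display.
print(tone_map(250, scene_max_nits=4000, display_max_nits=1000))   # 62.5  (static, over-compressed)
print(tone_map(250, scene_max_nits=300, display_max_nits=1000))    # 250.0 (dynamic, untouched)
```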
The report doesn’t reveal whether Samsung is planning to bring the technology to games consoles or reserve it for mobile devices such as the HDR10+-supporting Samsung Galaxy S21.
However, it’s interesting to note that Dolby Vision is supposed to be exclusive for the Xbox Series X and S for the next two years. Could Samsung be working with Sony to bring HDR10+ gaming to the PS5? It’s certainly a possibility.
The Xbox Series X and Xbox Series S systems have supported Dolby Atmos since launch, with Dolby Vision support expected later this year. Microsoft recently announced a Dolby Vision HDR test program for Alpha Ring members ahead of ‘general availability’.
Only a handful of titles make use of Dolby Vision HDR (Gears 5, Halo: The Master Chief Collection and Borderlands 3 are the biggies) but last month Microsoft revealed plans for a major push into Dolby Vision gaming.
If the rumours are true, HDR10+ for gaming could bring better contrast and more vibrant colours to your favourite titles, although you’ll still need a compatible 4K TV.
Apple is expected to adopt OLED displays in “some” iPads starting next year, according to Korea’s ETNews.
“Apple decided to apply OLED instead of Liquid Crystal Display (LCD) from some iPad models in 2022,” says the publication. “It is reported that Apple and display companies have agreed on production and delivery.”
Samsung and LG already supply the OLED displays used in the current generation of Apple iPhones. If the latest rumours are to be believed, the Korean tech titans are primed to manufacture the OLED displays for the next wave of iPads, too.
The report – spotted by 9to5Mac – ties in with previous rumours that have tipped Apple to transition to OLED displays in 2022. It doesn’t specify which models will make the leap, but in March, noted Apple analyst Ming-Chi Kuo tipped the mid-range iPad Air for an OLED display by 2022.
Last month, Apple launched the M1-powered 12.9-inch iPad Pro complete with cutting-edge Liquid Retina XDR (Mini LED) display. Mini LED technology delivers deeper blacks and richer colours, but it doesn’t have the pixel-level contrast control of OLED.
Many analysts believe Mini LED is a one-year ‘stop-gap’ solution due to its high price in comparison to OLED. According to ETNews, all iPads released in 2023 could have OLED screens.
The iPad is the world’s best-selling tablet with sales of around 50 million per year, so keeping up with demand could be quite the challenge. Especially with Samsung reported to be flat-out making 120Hz OLED displays for the upcoming iPhone 13 and iPhone 13 Pro (via PhoneArena).
MORE:
These are the best iPads currently available
And the best tablets
New Apple TV 4K uses iPhone sensors to boost picture quality
AMD is partnering with Samsung to provide RDNA 2 graphics technology for an Exynos mobile system-on-chip, potentially giving a boost to GPU performance in flagship Samsung phones. The announcement was made today at Computex Taipei.
There aren’t many details on the chip or which products it’ll be used in, but AMD describes the chip as a “next-generation Exynos SoC,” and says Samsung will provide further information later in 2021. The GPU will use AMD’s RDNA 2 architecture, enabling features like ray tracing and variable rate shading. AMD says it’ll make its way to “flagship mobile devices.”
“The next place you’ll find RDNA 2 will be the high-performance mobile phone market,” AMD CEO Lisa Su said on stage. “AMD has partnered with industry leader Samsung for several years to accelerate graphics innovation in the mobile market, and we’re happy to announce that we’ll bring custom graphics IP to Samsung’s next flagship mobile SoC with ray tracing and variable rate shading capabilities. We’re really looking forward to Samsung providing more details later this year.”
Exynos is the brand name that Samsung uses for its own in-house processors. In the US and certain other markets, Samsung’s flagship Galaxy phones ship with Snapdragon SoCs from Qualcomm, while the rest of the world gets Exynos chips. The Exynos models are generally regarded as slightly less performant than their Qualcomm equivalents, but it was seen as a surprise when Samsung decided to switch to the Snapdragon variant of the Galaxy S20 in its home market of South Korea.
Whether AMD’s mobile solution will provide tangible benefits over Qualcomm’s Adreno GPUs is unknown. But by throwing out buzzwords like ray tracing and lending its latest RDNA 2 architecture, AMD is certainly setting expectations high for future Samsung devices.