With the launch of the Nvidia GeForce RTX 3070 Ti, we’re collecting information about all of the partner cards that have launched or will launch soon. We have listings from seven companies, ranging from top-end liquid cooling models to budget-friendly cards.
The RTX 3070 Ti is Nvidia’s latest mid-range to high-end SKU in the RTX 3000 series lineup. The GPU is based on a fully enabled GA104 die with 6144 CUDA cores, operating at up to a 1770MHz boost frequency at reference spec. It comes with 8GB of GDDR6X memory running at 19Gbps and carries a 290W TDP.
EVGA
To keep things simple during this GPU shortage crisis, EVGA has only released two SKUs for the RTX 3070 Ti, the XC3 Gaming and the FTW3 Ultra Gaming. You can grab both of these cards right now on EVGA’s store if you have the company’s Elite membership. If not, you’ll need to wait until tomorrow to grab the cards, if they happen to be available.
Nothing has really changed with the RTX 3070 Ti’s FTW3 and XC3 designs: both cards feature a triple-fan cooler and a fully blacked-out shroud. The XC3 is the much more stealthy dual-slot option, with barely any RGB in sight.
The FTW3 model is much larger at 2.75 slots in thickness, and features much more RGB than its cheaper counterpart.
The FTW3 model runs at a boost frequency of 1860MHz while the XC3 runs at a lower 1770MHz.
Gigabyte
Gigabyte has done the exact opposite of EVGA and released five different SKUs for the RTX 3070 Ti, ranging from the flagship Aorus Master model down to the RTX 3070 Ti Eagle, a more budget-friendly card.
Because Gigabyte does not have its own store, expect to buy (or wait to buy) these cards from popular retailers such as MicroCenter, Newegg, Amazon, Best Buy and others.
Aesthetically, each RTX 3070 Ti SKU differs only subtly from its RTX 3080 and RTX 3070 counterparts. For each SKU, Gigabyte has kept the same color schemes as the 3080 and 3070 cards but tweaked some of the design elements.
The only exception to this is the RTX 3070 Ti Vision, which shares the exact same design as the RTX 3080 and 3070 models.
All RTX 3070 Ti models are triple-fan cooler designs, presumably due to the 3070 Ti’s really high TDP of 290W. The Aorus Master is the top trim with a beefy triple slot heatsink, and lots of RGB. The Gaming variant is Gigabyte’s mid-range SKU, and the Eagle represents Gigabyte’s lowest-end offering. The Vision model is aimed more towards the prosumer market, with less “gamery” aesthetics.
MSI
MSI will be offering three custom-designed versions of the RTX 3070 Ti: the Suprim, Gaming Trio and Ventus. Each model also comes in an OC variant, doubling the number of options to six.
The Suprim is the flagship card, with a silver and grey finish and a shroud that measures beyond two slots in thickness. RGB lighting can be seen around the fans and on the side.
The Gaming Trio is the mid-range offering, featuring a blacked out shroud along with red and silver accents. The card is similar in height to the Suprim and is over two slots thick.
The Ventus is MSI’s budget, entry-level card, featuring a fully blacked-out shroud with grey accents and, again, a thickness of more than two slots. If you want a stealthy appearance, this is the card to go for.
Compared to the equivalent RTX 3080 and RTX 3070 models, there’s very little difference in the RTX 3070 Ti SKUs. They’re all incredibly similar in size and aesthetically largely identical, besides a few backplate design changes and a couple of accent changes on the main shroud.
Zotac
Zotac is coming out with just two models for the RTX 3070 Ti, the Trinity and AMP Holo.
Both the Trinity and Holo feature triple-fan cooler designs with largely identical design elements, and both feature grey and black color combinations.
The main difference between the cards is a slightly different boost speed of 1800MHz on the Trinity vs 1830MHz on the Holo, and the Holo features a much larger RGB light bar on the side, making the Trinity the more “stealthy” of the two.
Inno3D
Inno3D is releasing four different SKUs for the 3070 Ti: the iChill X4, iChill X3, X3 and X3 OC.
The iChill X4 and X3 are almost identical in every respect; the only major addition on the X4 is a quad-fan setup, with an extra fan giving the card some active airflow from the side. We are not sure how much this will affect temps, but it’s a cool-looking feature.
Both the iChill X3 and X4 feature very aggressive styling for a graphics card: a black and metal finish with several exposed screws drilled into the metal, similar to a race car. On the side is a very bright, large strip of RGB that looks like something out of Cyberpunk 2077. The lighting has a neon glow to it, with the ‘iChill’ logo set in the middle.
The iChill X3 and X4 feature 1830MHz boost frequencies and thicknesses beyond two slots.
Inno3D’s RTX 3070 Ti X3 and X3 OC, on the other hand, are the complete opposite of the iChill cards. The shroud is very basic and black, with no RGB or lighting anywhere on the card. This is Inno3D’s budget-friendly option, which explains the simplistic design.
The X3 comes with a 1770MHz boost clock, while the OC model features a 1785MHz boost frequency. Both have a flat two-slot thickness, allowing the cards to fit in slimmer chassis.
Galax
Galax is coming in with four different versions of the RTX 3070 Ti, including dual fan options.
The flagship model for Galax is the 3070 Ti EXG, available in black or white. These cards feature large triple-fan coolers and thicknesses beyond two slots. The shrouds are very basic, just pure black or pure white depending on the color you purchase, with RGB-illuminated fans providing the lighting.
The RTX 3070 Ti SG is probably the most interesting of all of the 3070 Ti cards as a whole, with a unique add-on cooling solution. The card comes with the same shroud and fan design as the EXG, but features a significantly cut-down PCB, to make way for a large cut-out at the end to allow the installation of an additional fan to the rear of the card. If space allows, this additional fan gives the rear of the card a push-pull design, for maximum airflow.
Next, we have the 3070 Ti EX, a dual-fan option available in black or white flavors. This is the first SKU we’ve seen with a dual-fan solution for the 3070 Ti, so this will be a great option for users looking for a compact solution for smaller chassis. However, like the other Galax cards, the thickness is higher than two slots, so keep that in mind for smaller builds.
Besides the dual fan cooler, everything else is very similar to the EXG models with a pure black or white finish (depending on the flavor you choose) and RGB fans.
Lastly, there’s the Galax RTX 3070 Ti, a card with no fancy name, representing the budget end of Galax’s lineup.
The card is super basic with a carbon fiber-looking black shroud, and black fans. Unlike the EX model, this card is boxier with fewer angles to the design.
Palit
Palit is introducing three versions of the RTX 3070 Ti: the GameRock, GameRock OC, and Gaming Pro.
The GameRock appears to be the company’s flagship model for the 3070 Ti. The card comes in a wild-looking grey shroud paired with a layer of see-through diamond-like material all along the fan area. This part is all RGB illuminated.
The GameRock cards use triple-fan coolers and are more than two slots thick.
The GamingPro, on the other hand, is a more normal card, with a black and grey shroud and some fancy silver accents which act as fan protectors on the middle and rear fans. This card is similar in size to the GameRock cards.
The GameRock OC comes with a 1845MHz boost clock, the vanilla model features a 1770MHz boost clock, and the same clock goes for the GamingPro.
Asus Z590 WiFi Gundam Edition (Image credit: Asus)
In a collaboration with Sunrise and Sotsu, Asus last year announced a special lineup of PC components inspired by the Gundam anime series. While the products were originally exclusive to the Asian region, they have now made their way over to the U.S. market.
Asus introduced two opposing Gundam series. The Gundam series is based on the RX-78-2 Gundam, while the Zaku series borrows inspiration from the MS-06S Char’s Zaku II. The list of components includes motherboards, graphics cards, power supplies, monitors and other peripherals. Specification-wise, the Gundam- and Zaku-based versions are identical to their vanilla counterparts.
For now, there’s not a lot to choose from. Newegg only currently sells four Gundam-themed products from Asus. On the motherboard end, we have the Z590 WiFi Gundam Edition and the TUF Gaming B550M-Zaku (Wi-Fi). The U.S. retailer also listed the RT-AX86U Zaku II Edition gaming router and TUF Gaming GT301 Zaku II Edition case.
The Z590 WiFi Gundam Edition, which retails for $319.99, is an LGA1200 motherboard that supports Intel’s latest 11th Generation Rocket Lake-S processors. It handles up to 128GB of memory and memory frequencies up to DDR4-5133 without breaking a sweat. The Z590 WiFi Gundam Edition also offers PCIe 4.0 support on both its M.2 ports and PCIe expansion slots, as well as Wi-Fi 6 connectivity with added Bluetooth 5.0 functionality.
The TUF Gaming B550M-Zaku (Wi-Fi), on the other hand, leverages the B550 to accommodate multiple generations of Ryzen processors up to the latest Zen 3 chips. The microATX motherboard also supports the latest technologies, such as PCIe 4.0, Wi-Fi 6 and USB Type-C. Newegg has the TUF Gaming B550M-Zaku (Wi-Fi) up for purchase for $219.99.
The RT-AX86U Zaku II Edition is one of Asus’ most recent dual-band Wi-Fi 6 gaming routers. The router, which sells for $299.99, offers speeds up to 5,700 Mbps and 160 MHz channels. A quad-core 1.8 GHz processor and 1GB of DDR3 power the RT-AX86U Zaku II Edition.
Lastly, the TUF Gaming GT301 Zaku II Edition is a $119.99 mid-tower case for ATX motherboards. It offers generous support for radiators up to 360mm and a tempered glass side panel to show off your hardware. There’s also a convenient headphone hanger to keep your headphones safe and at hand.
Discrete GPU suppliers for desktop PCs nearly quadrupled the cash they raked in during Q1 2021 compared to this time last year. That’s due to the skyrocketing increases in average selling prices (ASPs), according to Jon Peddie Research.
Nvidia retained its leadership and outsold its rival AMD four-to-one in the discrete GPU market, but it’s clear that both vendors are selling every chip they can manufacture.
GPU Pricing Skyrockets, Revenue Jumps 370%
According to the report, GPU makers earned around $12.5 billion in Q1, a whopping 370% increase over last year. A big part of that gain is due to skyrocketing pricing.
According to JPR’s data, everything changed in the second half of 2020, when GPU pricing, from entry-level to midrange and from high-end to workstation, skyrocketed nearly overnight in the fourth quarter as a result of demand from gamers and miners.
Today the average price of an entry-level graphics card is $496, a mid-range board costs $809, and a high-end GPU carries a $1,358 price tag, according to JPR.
Analysts from Jon Peddie Research believe that price increases are driven by a number of factors, including component shortages, insufficient manufacturing capacity, strong demand from gamers, and solid (albeit limited) demand from Ethereum cryptocurrency mining.
Interestingly, the average price of a GPU has increased slowly over the years as lower-end systems migrated to integrated GPUs, whereas owners of higher-end machines demanded better standalone graphics boards. For instance, the average pricing for a high-end graphics card increased from around $500 in the first half of 2018 to around $780 in the first half of 2020. Meanwhile, average prices of entry-level and midrange third-party cards tend to fluctuate depending on actual offerings and refresh cycles. Those incremental price increases turned into an avalanche in recent times, though.
11.77 Million Graphics Cards for Desktops Sold in Q1
Unit shipments of discrete graphics cards for desktop PCs reached 11.77 million units in Q1 2021, a 7.77% quarter-over-quarter increase and a noticeable 24.4% year-over-year jump, JPR reports. Typically, sales of graphics boards drop in Q1 compared to the prior quarter, but that was not the case this year.
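As a quick sanity check, those growth rates imply roughly where shipments stood in the comparison quarters; the minimal sketch below simply treats JPR’s 11.77 million figure and the quoted percentages as given.

```python
# Back-of-the-envelope check on JPR's quarter-over-quarter and year-over-year figures.
q1_2021_units_m = 11.77   # million desktop add-in boards shipped in Q1 2021
qoq_growth = 0.0777       # +7.77% vs. Q4 2020
yoy_growth = 0.244        # +24.4% vs. Q1 2020

q4_2020_units_m = q1_2021_units_m / (1 + qoq_growth)
q1_2020_units_m = q1_2021_units_m / (1 + yoy_growth)
print(f"Implied Q4 2020 shipments: ~{q4_2020_units_m:.1f} million")  # ~10.9 million
print(f"Implied Q1 2020 shipments: ~{q1_2020_units_m:.1f} million")  # ~9.5 million
```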
The increased sales can be explained by the growing demand for gaming graphics cards as well as add-in-boards used for cryptocurrency mining.
AMD managed to increase its market share by three percentage points in the first quarter and commanded 20% of discrete GPU unit shipments. Nvidia, in contrast, gave up three points but still held a dominant 80% share.
“We believe the stay-at-home orders created demand in 2020 and in the first quarter of 2021,” the analysts said. “Home PCs and workstations became the center of professional life and often the main source of entertainment during the lockdowns. Gaming added to the pressure on the supply chain as it continued to grow in popularity. But, as we said, this is also an anomalous period in graphics history. Prices are high as a result of shortages and so is demand in response to cryptocurrency prices.”
The RTX 3070 Ti is out, and at this point, it almost feels like official retailers (or maybe Nvidia’s AIB partners) are scalping us, too. It’s not uncommon to see new graphics cards going for over twice their MSRP on eBay and other aftermarket sites on launch day, but now we’re seeing the same phenomenon hit Best Buy and Newegg.
Best Buy’s currently got two RTX 3070 Ti cards listed on its site, the Founders Edition and the Asus TUF Gaming GeForce RTX 3070 Ti. Both say they’re “coming soon,” but while the former is selling for Nvidia’s suggested $599, Asus’s card is $949. Usually, we expect AIB cards to be more expensive than Nvidia’s, but that’s a pretty steep increase. Sure, the Asus card has three fans and one RGB strip, but it’s not like we’re talking liquid cooling here.
The same phenomenon is happening at Newegg, too, where the Asus ROG Strix version of the RTX 3070 Ti is $999. Hey, but at least there’s more RGB than on the TUF Gaming model!
If you thought the price increases were limited to Asus, though, you’re out of luck. A chart published yesterday by Jon Peddie Research shows that AIB graphics card pricing has roughly quadrupled since the start of Q1 2020.
Even today’s Newegg Shuffle, which is sadly probably your best chance to get an RTX 3070 Ti right now, is dominated by high-priced cards, ranging from $799 to $999 and including makers like Gigabyte and Zotac. There are a few cards at the $599 MSRP as well, but Ampere stock is sure to be low across the board, so it’s not guaranteed you’ll get one of those.
I just hope that if you’re one of the folks who have been waiting in around-the-block lines for this card, you don’t have to leave the store feeling like you could have just gone to eBay instead.
Intel introduced the Iris Xe discrete graphics processor months ago, but so far, only a handful of OEMs and a couple of graphics card makers have adopted it for their products. This week, VideoCardz discovered another vendor, Gunnir, that offers a desktop system and a standalone Intel DG1 graphics card with a rare D-Sub (VGA) output, making it an interesting board design.
It’s particularly noteworthy that the graphics card has an HDMI 2.0 port and a D-Sub output that can be used to connect outdated LCD or even CRT monitors. In 2021, this output (sometimes called the VGA connector, though the 15-pin D-Sub is not exclusive to monitors) is not particularly widespread, as it does not properly support resolutions beyond 2048×1536. Image quality at resolutions higher than 1600×1200 depends heavily on the quality of the output circuitry and the cable (and that quality is typically low). Adding a D-Sub output to a low-end PC still makes some sense, because some old LCD screens remain in use and retro gaming on CRT monitors has become a fad.
As far as formal specifications are concerned, the Gunnir Lanji DG1 card is powered by Intel’s cut-down Iris Xe Max GPU with 80 EUs clocked at 1.20 GHz to 1.50 GHz, paired with 4GB of LPDDR4-4266 memory connected to the chip using a 128-bit interface. The card has a PCIe 3.0 x4 interface to connect to the host. The card can be used for casual games and for multimedia playback (a workload where Intel’s Xe beats the competition). Meanwhile, DG1 is only compatible with systems based on Intel’s 9th- and 10th-Gen Core processors and motherboards with the B460, H410, B365, and H310C chipsets.
It is unclear where these products are available (presumably from select Chinese retailers or to select Chinese PC makers), and at what price.
Intel lists Gunnir on its website, but the card it shows is not actually a custom Gunnir design; rather, it is a typical reference-design, entry-level add-in board from Colorful, a company that officially denies producing Intel DG1 products, as it exclusively makes Nvidia-powered graphics cards.
Ahead of E3, Microsoft and Xbox are putting a heavy emphasis on cloud gaming and the Game Pass subscription program alongside the existing console ecosystem. This includes new, dedicated streaming hardware for any TV or monitor. The company is also updating its cloud datacenters to use the Xbox Series X, so that gamers who stream are getting its most powerful hardware.
Xbox’s announcement comes ahead of Xbox’s joint E3 games showcase this Sunday with its recent acquisition, Bethesda, and also comes with a slew of new attempts to push Xbox onto just about any device you might already have. The Xbox division is moving to get its software embedded into internet-connected TVs, which would require no additional hardware other than a controller to play cloud games.
Additionally, the company is looking into new subscription offerings for Game Pass (though it didn’t get into specifics) and into new purchase options for Xbox All Access, which lets people buy the console and Game Pass for a monthly fee rather than paying up front. (This is similar to how many pay for smartphones in the U.S.)
Building its own streaming devices, however, is a bigger push to make Xbox an ecosystem outside of consoles and even moves Xbox into competition, to a degree, with Chromecast, Roku and Apple TV for the living room. (Chromecast is scheduled to get Google Stadia support later this month.)
Still, the company sees its consoles, the Xbox Series X and Series S, as its top-notch offering, even while it expands in mobile, on PC and in streaming. In fact, that’s the other major piece of hardware Xbox is working on: the next console.
“Cloud is key to our hardware and Game Pass roadmaps, but no one should think we’re slowing down on our core console engineering. In fact, we’re accelerating it,” said Liz Hamren, corporate vice president of gaming experiences and platforms.
“We’re already hard at work on new hardware and platforms, some of which won’t come to light for years. But even as we build for the future, we’re focused on extending the Xbox experience to more devices today so we can reach more people.”
This isn’t exactly surprising. Consoles start getting designed years in advance, and these days a mid-life refresh is common. Microsoft has also positioned the latest consoles as a “series” of devices, so it’s possible there will be new entries in the line that remain compatible with the current options.
Cloud gaming in Xbox Game Pass Ultimate is set to launch in Brazil, Japan and Australia later this year. Meanwhile, cloud gaming in a web browser, including support for Chrome, Edge and Safari, will go live to Game Pass Ultimate subscribers “in the next few weeks.” The Xbox app on PC will also get cloud gaming integrated this year.
Hamren said that Game Pass has more than 18 million subscribers, though that figure wasn’t broken down between the console, PC and Ultimate plans (the last of which includes game streaming).
The Series X and S haven’t seen a ton of new titles from Microsoft Studios yet, but it sounds like that will change.
“In terms of the overall lineup, we want to get to a point of releasing a new game every quarter…” said Matt Booty, the head of Xbox Game Studios. “We know that a thriving entertainment service needs a consistent and exciting flow of new content. So our portfolio will continue to grow as our service grows.”
Xbox has more than 23 studios and also recently acquired ZeniMax Media, the parent company of Bethesda Game Studios, as well as id Software, ZeniMax Online Studios, Arkane, MachineGames, Tango Gameworks, Alpha Dog and Roundhouse Studios.
Game Pass games are released simultaneously on PC and Xbox, which Xbox Head Phil Spencer used to poke at its competitors, namely Sony and its PlayStation 5.
“So right now, we’re the only platform shipping games on console, PC and cloud simultaneously,” Spencer said. “Others bring console games to PC years later, not only making people buy their hardware up front, but then charging them a second time to play on PC. And, of course, all of our games are in our subscription service day one, full cross-platform included.” (PlayStation brought Horizon Zero Dawn and Days Gone to PC but long after their PlayStation 4 releases.)
Tim Stuart, the chief financial officer for Xbox, said “we’ll do a lot more in PC for sure.” There have been rumors of big changes to the Microsoft Store on Windows, including making it easier for developers to sell games. That’s another avenue we may see explored soon, as Microsoft explores what’s next for Windows later this month, after E3.
The Xbox and Bethesda Games Showcase will take place on Sunday, June 13 at 10 a.m. PT / 1 p.m. ET and will stream on YouTube, Twitch, Facebook and Twitter.
Microsoft is making some significant upgrades to its Xbox Cloud Gaming (xCloud) service in the next few weeks. The Xbox cloud streaming service will be moving to Xbox Series X hardware on the server side, bringing dramatic improvements to load times and frame rates. Microsoft is also moving xCloud on the web out of beta, which is good news for owners of Apple devices.
“We’re now in the final stages of internal testing, and we’ll be upgrading the experience for Ultimate members in the next few weeks,” says Kareem Choudhry, head of cloud gaming at Microsoft. “The world’s most powerful console is coming to Azure.”
The upgrade will include major improvements to xCloud, with players able to benefit from the same faster load times and improved frame rates that are available on Xbox Series X consoles. Microsoft’s xCloud service launched in September, powered by Xbox One S server blades. The load times have been one of the troubling aspects of using Xbox game streaming, and this upgrade will dramatically reduce the wait time of launching games. Players will also be able to access Xbox Series X / S optimized games.
Alongside the server upgrades, Microsoft is launching Xbox Cloud Gaming through the browser for all Xbox Game Pass Ultimate members in the next few weeks. The service is currently in an invite-only beta mode, but the expansion will make it available for all Xbox Game Pass Ultimate members to access xCloud streaming on iPhones, iPads, and on any device with a compatible browser (Chrome, Edge, and Safari).
Microsoft is also expanding cloud gaming to Australia, Brazil, Mexico, and Japan later this year, and hinting at plans for new Xbox Game Pass subscriptions. “We need to innovate to bring our games and services to more people around the world, and we’re investigating how to introduce new subscription offerings for Xbox Game Pass,” says Liz Hamren, head of gaming experiences and platforms at Microsoft.
These new Xbox Game Pass subscriptions will likely include some form of access to xCloud game streaming. Microsoft currently only offers Xbox game streaming to those who subscribe to the Xbox Game Pass Ultimate tier, which is priced at $14.99 per month. It’s easy to imagine a future where Microsoft offers a separate Game Pass tier that only provides access to Xbox Cloud Gaming (xCloud).
Microsoft is also announcing plans for an Xbox TV app and its own streaming stick today, alongside the ability to access and use xCloud on Xbox consoles later this year.
Microsoft is planning to let Xbox console owners try games before downloading them, starting later this year. The new Xbox dashboard feature will allow console players to stream games instantly through Microsoft’s Xbox Cloud Gaming (xCloud) service. It’s part of a push to integrate xCloud more deeply into Xbox consoles and into the Xbox app on Windows PCs.
“Later this year, we’ll add cloud gaming directly to the Xbox app on PCs, and integrated into our console experience, to light up all kinds of scenarios, like ‘try before you download,’” says Kareem Choudhry, head of cloud gaming at Microsoft.
Microsoft isn’t detailing all of the ways that xCloud will appear on Xbox consoles, but trying games before you download them certainly opens up possibilities for Xbox owners who want to know what a game is like before buying it.
Either way, utilizing xCloud to let Xbox players quickly jump into games before they’re downloaded will be particularly useful on day one game launches. With games regularly exceeding 100GB, it often takes hours to download titles if you didn’t plan ahead and preload a game before its launch.
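For a rough sense of the wait involved, here’s a quick back-of-the-envelope download-time calculation; the connection speeds below are arbitrary illustrative assumptions, not figures from Microsoft.

```python
# Rough download-time estimate for a large game (illustrative connection speeds only).
def download_hours(game_size_gb, connection_mbps):
    """Hours to download game_size_gb gigabytes at connection_mbps megabits per second."""
    size_megabits = game_size_gb * 1000 * 8   # GB -> megabits (decimal units)
    return size_megabits / connection_mbps / 3600

for speed in (50, 100, 300):
    print(f"100GB game at {speed} Mbps: ~{download_hours(100, speed):.1f} hours")
# ~4.4 hours at 50 Mbps, ~2.2 hours at 100 Mbps, ~0.7 hours at 300 Mbps.
```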
In a briefing with members of the press ahead of Microsoft’s Xbox E3 event on Sunday, the company’s head of Xbox, Phil Spencer, was keen to stress Microsoft’s commitment to Xbox Game Pass and cloud gaming.
“So right now we’re the only platform shipping games on console, PC, and cloud simultaneously,” says Spencer. “Others bring console games to PC years later, not only making people buy their hardware up front, but then charging them a second time to play on PC.”
Spencer is of course referring to Sony and its ongoing efforts to bring more PlayStation games to PC years after their launch. Microsoft obviously prefers its own approach to launching simultaneously across multiple platforms and being available on Xbox Game Pass on day one.
Speaking of Xbox Game Pass, Microsoft is also committing to some form of a timeline for exclusive first-party content for the service. “In terms of the overall lineup, we want to get to a point of releasing a new game every quarter … we know that a thriving entertainment service needs a consistent and exciting flow of new content,” explains Matt Booty, head of Xbox Game Studios. “So our portfolio will continue to grow as our service grows.”
Microsoft isn’t providing an update on its Xbox Game Pass subscription growth yet. The service jumped to 18 million subscribers earlier this year, after growing steadily throughout 2020. Today’s announcements are part of some broader Xbox and xCloud news, including server upgrades to xCloud and Microsoft’s plans for an Xbox TV app and streaming sticks.
All in all, Asus’s Chromebook Detachable CM3 is a nice package. It’s a 10.5-inch tablet with a magnetically attached fabric cover and kickstand. It’s $389.99 as tested, which puts it well below all kinds of convertible Chromebooks. I’m not the first to make this comparison, but it’s a slightly more expensive and slightly fancier version of the $269 Lenovo Chromebook Duet that impressed me so much last year.
I think the CM3 is a slightly worse purchase than the Duet for most people who are looking for a secondary device, or a small Chromebook for a student. The CM3 does offer a few noticeable benefits over the Duet, but I’m not sure they’re worth $100. While features like a dual-folding kickstand, a garaged stylus, and a headphone jack are nice to have, none of them are as central to a device’s user experience as its processor. And while $269 is an acceptable price to pay for a tablet with a MediaTek chip, $389.99 is pushing it.
With all that said, I don’t have many problems with this Chromebook. It’s just in a bit of an odd spot.
My test unit includes 128GB of storage, 4GB of RAM, a 10.5-inch 1920 x 1200 display, and a MediaTek 8183 processor. There’s a 64GB version listed at $369.99 as well. 64GB isn’t a lot of storage (and there’s no microSD card slot for expansion on the CM3), so my config is the one I’d recommend most people go for.
The most important thing to understand about the CM3 before you buy it is the size. It’s small, with just a 10.5-inch screen. This brings benefits and drawbacks. On the one hand, it’s quite slim and portable, at just 0.31 inches thick and 1.1 pounds (2.02 pounds with the keyboard and stand attached). It’s the kind of thing I could easily carry in my purse.
On the other hand, a 10.5-inch screen is cramped for a desktop OS like Chrome OS (though it is bright enough to use outdoors, and I appreciate that it has a 16:10 aspect ratio; 16:9 would be unbearable for me at this size). It was too small for me to comfortably use as a work driver, and I had to zoom out far to be able to see everything I needed to in my Chrome windows.
It also means there’s only so much space for the keyboard deck, which is also cramped. The touchpad, in particular, is small. The keyboard itself is roomier than the Duet’s, though; it has a surprising amount of travel and a satisfying click. While the small keys take a bit of adjustment, none are so small as to be unusable.
Small doesn’t mean cheap, and the CM3’s build is fairly sturdy overall. The palm rests and detachable keyboard deck feel quite plasticky, but the tablet itself is aluminum (with “diamond-cut edges”, per Asus). The magnetic cover is made of a woven fabric, and looks quite similar to the cover of the Chromebook Duet. The cover is included with the price of the CM3, which isn’t the case with some detachables (such as Microsoft’s Surface Go line).
A USI stylus lives in the top right corner of the chassis; it’s firmly in there, so you’ll need a nail to tug it out. It’s small and not the best stylus I’ve ever used, but it is there and it does work. The Duet supports USI styluses but doesn’t come with one, so that’s one advantage the CM3 brings.
The main way the CM3 stands out from other detachables is that its kickstand folds multiple ways. That is, you can fold it the long way when you’re using the tablet like a laptop, or you can flip the tablet vertically and fold the kickstand horizontally. This is a cool feature I haven’t seen before, and it does work; I was never worried about the CM3 falling over in either direction.
On the other hand, the only real use case I can think of for the horizontal position is video calls where you don’t need to have the keyboard attached and are okay with the camera being on the side of the screen. You can take your own view, but I’d rather use an iPad or dedicated tablet for these purposes and have the camera in the right place.
My unit did have a bit of fraying on the edges of the keyboard deck, which was disappointing to see on a brand-new device, even at this price. The kickstand cover also slipped off the tablet a few times while I was adjusting the height, which isn’t something that ever happened with the Duet.
Speaking of video calls, the CM3 has a two-megapixel front-facing camera as well as an eight-megapixel rear-facing camera. Both cameras deliver a surprisingly reasonable picture. I wasn’t too washed out when I did a video call outside, nor was I too grainy in dim light. That said, the dual-camera setup is another cool-sounding feature that probably isn’t the most pragmatic: the rear camera isn’t good enough for actual photography of any kind, and the best use case is probably snapping pictures of a whiteboard in class. It also takes a few seconds for the CM3 to swap between cameras (it’s not nearly as quick a swap as on an iPhone, for example), so it wouldn’t have saved me much time over just whipping out a phone.
The CM3’s MediaTek MTK 8183 is a hybrid chip that’s mainly used in Android tablets. (It’s a different MediaTek chip from the one that was in the Duet last year, but very similar to the one in, uh, Amazon’s new Echo Show 8 smart display.) It’s far from the most powerful processor you can find in a Chromebook, but that’s by design; battery life is going to be a higher priority for many folks who are considering a device as portable as the CM3.
The battery life is, in fact, excellent. I averaged 12 hours and 49 minutes of continuous use running the CM3 through my regular workload of Chrome tabs and Android apps including Slack, Messenger, Twitter, Gmail, Spotify, and an occasional Zoom call with the screen at medium brightness — over an hour longer than I saw from the Duet with the same workload. This is already a heavier load than many people may want to put the CM3 through, so you may get even more time between charges. The 45W USB-C adapter juiced the CM3 up to 40 percent in an hour, making it much faster than the Duet’s wimpy 10W charger.
That battery life doesn’t come free, though, and the CM3’s performance was a mixed bag. It works fine in Chrome, for example, albeit with a teensy bit of sluggishness when swapping tabs and resizing windows, as well as in other Google services like Gmail, Docs, Drive, Calendar, and Meet (and it comes with a free 12-month, 100GB membership to Google One for the rest of this year). Gaming is also fine: Flipping Legends and Monsters were both smooth and stutter-free, regardless of whether the CM3 was plugged in or running on battery.
I also think Chrome OS’s tablet mode, which the CM3 supports, has gotten pretty good. It uses Android-esque gesture controls that can help flatten the learning curve for new Chromebook users. Swiping up brings you to the home screen, for example, and swiping right swaps between web pages. You can access a version of Chrome specifically for tablets, which allows you to easily open, close, and reorder tabs with drags, swipes, and large buttons. It’s not quite like using an iPad, but I do think it’s a smoother experience than Windows’s tablet mode (especially in Chrome).
All you have to do to switch in and out of tablet mode is snap the keyboard on and off — it takes a second, and my windows didn’t always quite go back to the way I’d arranged them when I put the keyboard back on, but it’s a reasonably smooth affair overall.
But the CM3 didn’t perform well on every task I threw at it. Sometimes when I was trying to use Slack or Messenger over a pile of Chrome tabs, something would freeze. Zoom calls were possible, which is more than can be said for some budget Chromebooks, but I did run into lag between audio and video. Slack froze and crashed quite often, and Spotify crashed a few times as well.
Photo editing was where I really ran into trouble. Lightroom was basically unusable on the CM3 with just a few things running in the background — I tried to edit a batch of around 100 photos, and could consistently only get through a few before the program crashed. I tried to move over to Google Photos, which also eventually crashed, and ended up having to do everything in Gallery. Of course, not everyone will be editing photos on their Chromebook, or pushing it as hard as I was pushing this one, so it’s a matter of knowing your own needs.
Speaking of Zoom meetings, the dual speakers are okay for Zoom calls but not much more. The songs I played had stronger percussion than I sometimes hear from laptop speakers, but the sound was thin and tinny overall. The microphone did seem to work well and didn’t have trouble picking up my voice on calls.
This was a difficult product to score. I do think the CM3 is a great device. And it does offer a few benefits over the Chromebook Duet that justify it costing a bit more. I’d probably purchase it over the Duet myself for the keyboard alone if I were looking for this type of device — the versatile kickstand, built-in stylus, and decent build quality are nice perks as well.
But “if I were looking for this type of device” is doing some heavy lifting in that sentence. I’m not looking for a MediaTek device, and there’s a reason I’m not. The battery life is impressive, sure, but the chip just doesn’t have enough horsepower for the workload I need. And if you are someone whose needs are suited to this low-powered processor (and there are plenty of such people in the world), I really think $389 is at the very high end of what you should be spending.
Sure, the CM3 has a (just okay) stylus, a kickstand with a funky fold, slightly better battery life, and one extra port. But it’s also on par with or slower than the Duet in most tasks I tried, the audio is worse, and it’s thicker and heavier. Given all that, I’m not convinced the CM3’s advantages are worth $100 to most people who are shopping in this category.
Microsoft is working with TV manufacturers to make an Xbox app available on devices soon. The software giant is planning to bring its Xbox Game Pass service to TVs through its xCloud streaming technology, opening up more ways to get access to Xbox games. This will be available as both an app on TVs, and with Microsoft’s own dedicated streaming stick.
“We’re working with global TV manufacturers to embed the Game Pass experience directly into internet-connected TVs so all you’ll need to play is a controller,” says Liz Hamren, head of gaming experiences and platforms at Microsoft.
Microsoft isn’t announcing exactly when this Xbox app will be available on TVs, nor which manufacturers will bundle it on their devices. Xbox chief Phil Spencer previously hinted at an Xbox app for TVs late last year, noting he expects to “see that in the next 12 months.”
Spencer also hinted at Microsoft’s own Xbox streaming stick last year, something Microsoft now says will appear soon. “We’re also developing standalone streaming devices that you can plug into a TV or monitor, so if you have a strong internet connection, you can stream your Xbox experience,” reveals Hamren.
Much like the TV app plans, Microsoft isn’t providing any details on release date or pricing for its own Xbox streaming devices. We don’t even know what they will look like. Microsoft revealed these details in a special press briefing ahead of its E3 event later this week. Microsoft will be focusing on games at its E3 showcase on Sunday June 13th, so it’s unlikely we’ll get any further details until the devices are ready to ship.
This Xbox Game Pass expansion to TVs is part of a broader effort by Microsoft to make its subscription service available beyond just phones and Xbox consoles. Microsoft is also announcing upgrades to its xCloud server blades today, and the ability to access and use xCloud on Xbox consoles later this year.
Just when you thought things were as bad in the GPU market as they could be, something new pops up. Demand for gaming graphics cards, game consoles, and GPUs used for mining cryptocurrencies has driven prices of graphics cards and graphics memory to new heights in recent months, but according to TrendForce, things are about to get even worse. Contract prices of GDDR memory are expected to grow another 8% to 13% later this year due to numerous factors. The only question is how badly the price of graphics memory will affect the prices of actual graphics cards, too.
Graphics DRAM represents a relatively small fraction of the overall memory market, which is largely dominated by LPDDR memory for smartphones, mainstream DRAM for PCs, and server DRAM for datacenters. To that end, GDDR always has to fight for limited DRAM production capacities with other types of memory. Due to relatively low graphics memory volumes and superior performance characteristics, these chips are usually fairly expensive. But that’s a liability in a severe under-supply situation — the price of GDDR6/GDDR6X DRAM has increased significantly.
Memory makers traditionally serve large contract customers first. In the case of GDDR6 and GDDR6X DRAMs, the largest consumers are Nvidia (which bundles memory with some of its GPUs), contract manufacturers that produce game consoles for Microsoft and Sony (Flextronics, Foxconn), and several GPU makers (Asus, Colorful, Palit, etc.). As a result of this prioritization, smaller clients are struggling to get graphics memory.
TrendForce says that GDDR fulfillment rates for some medium- and small-size customers have been around 30% for some time, which is why spot prices of graphics memory sometimes exceeded contract prices by up to 200%. To some degree, GDDR6 spot prices were affected by increasing crypto pricing (particularly Ethereum), so a drop in the coin’s value also reduced GDDR6 spot pricing.
In contrast, GDDR5 pricing hasn’t fluctuated significantly. That’s mostly because it’s really only used for GeForce GTX 1650/1660 and some OEM-oriented graphics boards.
There are several factors that will affect graphics memory pricing in the coming months:
Demand for gaming PCs remains at high levels, so GPU makers will require more GDDR6 and GDDR6X SGRAM chips.
The latest game consoles from Microsoft and Sony use 16Gb GDDR6 memory chips, whereas graphics cards use 8Gb GDDR6 devices, so makers of graphics memory are not exactly flexible.
Since Nvidia, contract manufacturers, and select makers of graphics cards will remain the top consumers of GDDR6 and GDDR6X memory, other players will still be severely supply-constrained.
As demand for servers and mainstream PCs is increasing, graphics memory has to fight for production capacity, affecting its pricing.
Since GDDR6 and GDDR6X pricing is set to increase, the bill-of-materials (BOM) costs for GPUs will also increase. Since there are supply constraints of other components and logistics problems in place, it is unlikely anything could offset the BOM increase in the third quarter. And with that, it will get more expensive for manufacturers to build GPUs.
Meanwhile, GPU pricing is inflated because of demand from gamers and miners. Therefore, if AMD and Nvidia increase their GPU supply in Q3 and demand from miners recedes because of lower Ethereum value, then GPU pricing could actually decrease. Unfortunately, we don’t know exactly what will happen, so the future is hard to predict.
The Nvidia GeForce RTX 3070 Ti continues the Ampere architecture rollout, which powers the GPUs behind many of the best graphics cards. Last week Nvidia launched the GeForce RTX 3080 Ti, a card that we felt raised the price too much relative to the next step down. The RTX 3070 Ti should do better, both by virtue of only costing $599 (in theory) and because there’s up to a 33% performance gap between the existing GeForce RTX 3070 and GeForce RTX 3080 for it to slot into. That’s a $100 increase in price relative to the existing 3070, but both the 3070 and 3080 will continue to be sold, in “limited hash rate” versions, for the time being. We’ll be adding the RTX 3070 Ti to our GPU benchmarks hierarchy shortly, if you want to see how all the GPUs rank in terms of performance.
The basic idea behind the RTX 3070 Ti is simple enough. Nvidia takes the GA104 GPU that powers the RTX 3070 and RTX 3060 Ti, only this time it’s the full 48 SM variant of the chip, and pairs it with GDDR6X. While Nvidia could have tried doing this last year, both the RTX 3080 and RTX 3090 were already struggling to get enough GDDR6X memory, and delaying by nine months allowed Nvidia to build up enough inventory of both the GPU and memory for this launch. Nvidia has also implemented its Ethereum hashrate limiter, basically cutting mining performance in half on crypto coins that use the Ethash / Dagger-Hashimoto algorithm.
Will it be enough to avoid having the cards immediately sell out at launch? Let me think about that, no. Not a chance. In fact, miners are probably still trying to buy the limited RTX 3080 Ti, 3080, 3070, 3060 Ti, and 3060 cards. Maybe they hope the limiter will be cracked or accidentally unlocked again. Maybe they made too much money off of the jump in crypto prices during the past six months. Or maybe they’re just optimistic about where crypto is going in the future. The good news, depending on your perspective, is that mining profitability has dropped significantly during the past month, which means cards like the RTX 3090 are now making under $7 per day after power costs, and the RTX 3080 has dropped down to just over $5 per day.
GeForce RTX 3070 Ti: Not Great for Mining but Still Profitable
Even if the RTX 3070 Ti didn’t have a limited hashrate, it would only net about $4.25 a day. With the limiter in place, Ravencoin (KAWPOW) and Conflux (Octopus) are the most profitable crypto coins right now, and both of those hashing algorithms still appear to run at full speed. Profitability should be a bit higher with tuning, but right now, we’d estimate making only $3.50 or so per day. That’s still enough for the cards to ‘break even’ in about six months, but again, profitability has dropped and may continue to drop.
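For a rough sense of what those daily figures mean in practice, here’s a minimal break-even sketch; the $599 card price and $3.50-per-day profit are illustrative assumptions pulled from the estimates above, not measured values.

```python
# Rough mining break-even estimate (illustrative assumptions only).
def break_even_days(card_price_usd, daily_profit_usd):
    """Days of mining needed to recoup the purchase price at a fixed daily profit."""
    return card_price_usd / daily_profit_usd

# Assumed inputs: $599 MSRP and roughly $3.50/day after power costs, per the estimate above.
days = break_even_days(599.0, 3.50)
print(f"Break-even in roughly {days:.0f} days (~{days / 30:.1f} months)")
# Roughly 171 days, or about 5.7 months, in line with the "about six months" figure.
```

Of course, if profitability keeps sliding, that break-even window only stretches further.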
The gamers among us will certainly hope so, but even without crypto coin mining, demand for GPUs continues to greatly exceed supply. By launching the RTX 3070 Ti, with its binned GA104 chips and GDDR6X memory, Nvidia continues to steadily increase the number of GPUs it’s selling. Nvidia is also producing more Turing GPUs right now, mostly for the CMP line of miner cards, and at some point, supply should catch up. Will that happen before the next-gen GPUs arrive? Probably, but only because the next-gen GPUs are likely to be pushed back thanks to the same shortages facing current-gen chips.
Okay, enough of the background information. Let’s take a look at the specifications for the RTX 3070 Ti, along with related Nvidia GPUs like the 3080, 3070, and the previous-gen RTX 2070 Super:
GPU Specifications

| Graphics Card | RTX 3080 | RTX 3070 Ti | RTX 3070 | RTX 2070 Super |
|---|---|---|---|---|
| Architecture | GA102 | GA104 | GA104 | TU104 |
| Process Technology | Samsung 8N | Samsung 8N | Samsung 8N | TSMC 12FFN |
| Transistors (Billion) | 28.3 | 17.4 | 17.4 | 13.6 |
| Die size (mm^2) | 628.4 | 392.5 | 392.5 | 545 |
| SMs / CUs | 68 | 48 | 46 | 40 |
| GPU Cores | 8704 | 6144 | 5888 | 2560 |
| Tensor Cores | 272 | 192 | 184 | 320 |
| RT Cores | 68 | 48 | 46 | 40 |
| Base Clock (MHz) | 1440 | 1575 | 1500 | 1605 |
| Boost Clock (MHz) | 1710 | 1765 | 1725 | 1770 |
| VRAM Speed (Gbps) | 19 | 19 | 14 | 14 |
| VRAM (GB) | 10 | 8 | 8 | 8 |
| VRAM Bus Width | 320 | 256 | 256 | 256 |
| ROPs | 96 | 96 | 96 | 64 |
| TMUs | 272 | 192 | 184 | 160 |
| TFLOPS FP32 (Boost) | 29.8 | 21.7 | 20.3 | 9.1 |
| TFLOPS FP16 (Tensor) | 119 (238) | 87 (174) | 81 (163) | 72 |
| RT TFLOPS | 58.1 | 42.4 | 39.7 | 27.3 |
| Bandwidth (GBps) | 760 | 608 | 448 | 448 |
| TDP (watts) | 320 | 290 | 220 | 215 |
| Launch Date | Sep 2020 | Jun 2021 | Oct 2020 | Jul 2019 |
| Launch Price | $699 | $599 | $499 | $499 |
The GeForce RTX 3070 Ti provides just a bit more theoretical computational performance than the 3070, thanks to the addition of two more SMs. It also has slightly higher clocks, giving it 7% more TFLOPS — and it still has 27% fewer TFLOPS than the 3080. More important by far is that the 3070 Ti goes from 14Gbps of GDDR6 and 448 GB/s of bandwidth to 19Gbps GDDR6X and 608 GB/s of bandwidth, a 36% improvement. In general, we expect performance to land between the 3080 and 3070, but closer to the 3070.
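If you want to sanity-check the theoretical numbers in the table, they follow directly from the specs: FP32 TFLOPS is two operations per CUDA core per clock times the core count and boost clock, and bandwidth is the per-pin data rate times the bus width divided by eight. Here’s a minimal sketch of that arithmetic (the helper names are just for illustration).

```python
# Reproduce the headline theoretical numbers from the spec table above.
def fp32_tflops(cuda_cores, boost_mhz):
    # Two FP32 operations per CUDA core per clock (an FMA counts as two FLOPs).
    return 2 * cuda_cores * boost_mhz * 1e6 / 1e12

def bandwidth_gb_s(data_rate_gbps, bus_width_bits):
    # Per-pin data rate times bus width, converted from bits to bytes.
    return data_rate_gbps * bus_width_bits / 8

print(f"RTX 3070 Ti: {fp32_tflops(6144, 1765):.1f} TFLOPS, {bandwidth_gb_s(19, 256):.0f} GB/s")
print(f"RTX 3070:    {fp32_tflops(5888, 1725):.1f} TFLOPS, {bandwidth_gb_s(14, 256):.0f} GB/s")
# 21.7 vs. 20.3 TFLOPS (about 7% more) and 608 vs. 448 GB/s (about 36% more bandwidth).
```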
Besides performance specs, it’s also important to look at power. It’s a bit shocking to see that the 3070 Ti has a 70W higher TDP than the 3070, and we’d assume nearly all of that goes into the GDDR6X memory. Some of it also allows for slightly higher clocks, but generally, that’s a significant increase in TDP just for a change in VRAM.
There’s still the question of whether 8GB of memory is enough. These days, we’d say it’s sufficient for any game you want to play, but there are definitely instances where you’ll run into memory capacity issues. Not surprisingly, many of those come in games promoted by AMD; it’s almost as if AMD has convinced developers to target 12GB or 16GB of VRAM at maximum quality settings. But a few judicious tweaks to settings (like dropping texture quality a notch) will generally suffice.
The difficulty is that there’s no good way to add more memory short of doubling it: the 256-bit interface means Nvidia can do 8GB or 16GB, nothing in between. And with the 3080 and 3080 Ti offering 10GB and 12GB, respectively, there was basically no chance Nvidia would equip a lesser GPU with more GDDR6X memory. (Yeah, I know, but the RTX 3060 12GB remains a bit of an anomaly in that department.)
GeForce RTX 3070 Ti Design: A Blend of the 3070 and 3080
Unlike the RTX 3080 Ti, Nvidia actually made some changes to the RTX 3070 Ti’s design. Basically, the 3070 Ti has a flow-through cooling fan at the ‘back’ of the card, similar to the 3080 and 3090 Founders Edition cards. In comparison, the 3070 just used two fans on the same side of the card. This also required some tweaks to the PCB layout, so the 3070 Ti doesn’t use the exact same boards as the 3070 and 3060 Ti. It’s not clear exactly how much the design tweak helps with cooling, but considering the 290W vs. 220W TDP, presumably Nvidia did plenty of testing before settling on the final product.
Overall, whether the change significantly improves the cooling or not, we think it does improve the look of the card. The RTX 3070 and 3060 Ti Founders Editions looked a bit bland, as they lacked even a large logo indicating the product name. The 3080 and above (FE models) include RGB lighting, though, which the 3070 Ti and below lack. Third party cards can, of course, do whatever they want with the GPU, and we assume many of them will provide beefier cooling and RGB lighting, along with factory overclocks.
One question we had going into this review was how well the card would cool the GDDR6X memory. The various Founders Edition cards with GDDR6X memory can all hit 110 degrees Celsius on the memory with various crypto mining algorithms, at which point the fans kick into high gear and the GPU throttles. Gaming tends to be less demanding, but we still saw 102C-104C on the 3080 Ti. The 3070 Ti doesn’t have that problem. Even with mining algorithms, the memory peaked at 100C, and temperatures in games were generally 8C–12C cooler. That’s the benefit of only having to cool 8GB of GDDR6X instead of 10GB, 12GB, or 24GB.
GeForce RTX 3070 Ti: Standard Gaming Performance
TOM’S HARDWARE GPU TEST PC
Our test setup remains unchanged from previous reviews, and like the 3080 Ti, we’ll be doing additional testing with ray tracing and DLSS — using the same tests as our AMD vs. Nvidia: Ray Tracing Showdown. We’re using the test equipment shown above, which consists of a Core i9-9900K, 32GB DDR4-3600 memory, 2TB M.2 SSD, and the various GPUs being tested — all of which are reference models here, except for the RTX 3060 (an EVGA model running reference clocks).
That gives us two sets of results. First is the traditional rendering performance, using thirteen games at 1080p, 1440p, and 4K with ultra/maximum quality settings. Then we have ten more games with RT (and sometimes DLSS, where applicable). We’ll start with 1440p, the sweet spot for this class of card, then look at 1080p, where CPU bottlenecks become more prevalent, and finish with 4K, where those bottlenecks are almost completely eliminated. If you want to check 1080p/1440p/4K medium performance, we’ll have those results in our best graphics cards and GPU benchmarks articles, though only for nine of the games.
The RTX 3070 Ti does best as a 1440p gaming solution, which remains the sweet spot in terms of image quality and performance requirements. Overall performance ended up 9% faster than the RTX 3070 and 13% slower than the RTX 3080, so the added memory bandwidth only goes so far toward removing bottlenecks. However, a few games benefit more, like Assassin’s Creed Valhalla, Dirt 5, Horizon Zero Dawn, Shadow of the Tomb Raider, and Strange Brigade — all of which show double-digit percentage improvements relative to the 3070.
Some of the games are also clearly hitting other bottlenecks, like the GPU cores. Borderlands 3, The Division 2, Far Cry 5, FFXIV, Metro Exodus, and Red Dead Redemption 2 all show performance gains closer to the theoretical 7% difference in compute that we get from core counts and clock speeds. Meanwhile, Watch Dogs Legion shows the smallest change in performance, improving just 3% compared to the RTX 3070.
The RTX 3070 Ti makes for a decent showing here, but we’re still looking at an MSRP increase of 20% for a slightly less than 10% increase in performance. Compared to AMD’s RX 6000 cards, the 3070 Ti easily beats the RX 6700 XT, but it comes in 6% behind the RX 6800 — which, of course, means it trails the RX 6800 XT as well.
On the one hand, AMD’s GPUs tend to sell at higher prices, even when you see them in places like the Newegg Shuffle. At the same time, RTX 30-series hardware on eBay remains extremely expensive, with the 3070 selling for around $1,300, compared to around $1,400 for the RX 6800. Considering the RTX 3070 Ti is faster than the RTX 3070, it remains to be seen where street pricing lands. Of course, the reduced hashrates for Ethereum mining on the 3070 Ti may also play a role.
Next up is 1080p testing. Lowering the resolution tends to make games more CPU limited, and that’s exactly what we see. The 3070 Ti was 7% faster than the 3070 this time and 11% slower than the 3080. It was also 7% faster than the 6700 XT and 6% slower than the 6800. While you can still easily play games at 1080p on the RTX 3070 Ti, the same is true of most of the other GPUs on our charts.
We won’t belabor the point, other than to note that our current test suite is slightly more tilted in favor of AMD GPUs (six AMD-promoted games compared to four Nvidia-promoted games, with three ‘agnostic’ games). We’ll make up for that when we hit the ray tracing benchmarks in a moment.
Not surprisingly, while 4K ultra gaming gave the RTX 3070 Ti its biggest lead over the RTX 3070 (11%), it also got its biggest loss (17%) against the 3080. 4K also narrowed the gap between the 3070 Ti and the RX 6800, as AMD’s Infinity Cache starts to hit its limits at 4K.
Technically, the RTX 3070 Ti can still play all of the test games at 4K, just not always at more than 60 fps. Nearly half of the games we tested came in below that mark, with Valhalla and Watch Dogs Legion being the two lowest scores — and they’re still in the mid-40s. The RTX 3070 was already basically tied with the previous generation RTX 2080 Ti, which means the RTX 3070 Ti is now clearly faster than the previous-gen halo card, at half the price.
GeForce RTX 3070 Ti: Ray Tracing and DLSS Gaming Performance
So far, we’ve focused on gaming performance using traditional rasterization graphics. We’ve also excluded using Nvidia’s DLSS technology in order to provide an apples-to-apples comparison. Now we’ll focus on ray tracing performance, with DLSS 2.0 enabled where applicable. We’re only using DLSS in Quality mode (2x upscaling) in the six games where it is supported. We’ll have to wait for AMD’s FSR to see if it can provide a reasonable alternative to DLSS 2.0 in the coming months, though Nvidia clearly has a lengthy head start. Note that these are the same tests we used in our recent AMD vs. Nvidia Ray Tracing Battle.
Nvidia’s RTX 3070 Ti does far better — at least against the AMD competition — in ray tracing games. It’s not a complete sweep, as the RX 6800 still leads in Godfall, but the 3070 Ti ties or wins in every other game. In fact, the 3070 Ti basically ties the RX 6800 XT in our ray tracing test suite, and that’s before we enable DLSS 2.0.
Even 1080p DXR generally ends up being GPU limited, so the rankings don’t change much from above. DLSS doesn’t help quite as much at 1080p, but otherwise, the 3070 Ti ends up right around 25% faster than the RX 6800 — the same as at 1440p. We’ve mentioned before that Fortnite is probably the best ‘neutral’ look at advanced ray tracing techniques, and the 3070 Ti is about 5–7% faster there. Turn on DLSS Quality and it’s basically double the framerate of the RX 6800.
GeForce RTX 3070 Ti: Power, Clocks, and Temperatures
We’ve got our Powenetics equipment working again, so we’ve added the 3080 Ti to these charts. Unfortunately, there was another slight snafu: We couldn’t get proper fan speeds this round. It’s always one thing or another, I guess. Anyway, we use Metro Exodus running at 1440p ultra (without RT or DLSS) and FurMark running at 1600×900 in stress test mode for our power testing. Each test runs for about 10 minutes, and we log the result to generate the charts. For the bar charts, we only average data where the GPU load is above 90% (to avoid skewing things in Metro when the benchmark restarts).
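For reference, that filtering step is simple enough to show in a few lines. This is only an illustrative sketch with hypothetical column names, not our actual logging pipeline:

```python
import csv

def average_board_power(log_path, load_threshold=90.0):
    """Average logged board power, ignoring samples where the GPU is mostly
    idle (for example, between Metro Exodus benchmark iterations)."""
    readings = []
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            # Hypothetical column names; adjust to match the logger's output.
            load = float(row["gpu_load_percent"])
            power = float(row["board_power_w"])
            if load > load_threshold:
                readings.append(power)
    return sum(readings) / len(readings) if readings else 0.0

# Example: average_board_power("metro_1440p_log.csv") would return ~282 for our run.
```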
Nvidia gives the RTX 3070 Ti a 290W TDP, and it mostly makes use of that power. It averaged about 282W for our Metro testing, but that’s partly due to the lull in GPU activity between benchmark iterations. FurMark showed 291W of power use, right in line with expectations.
Core clocks were interesting, as the GeForce RTX 3070 Ti actually ended up with slightly lower clocks than the RTX 3070 in FurMark and Metro. On the other hand, both cards easily exceeded the official boost clocks by about 100 MHz. Custom third-party cards will likely hit higher clocks and performance, though also higher power consumption.
While we don’t have fan data (or noise data — sorry, I’m still trying to get unpacked from the move), the RTX 3070 Ti did end up hitting the highest temperatures of any of the GPUs in both Metro and FurMark. As we’ve noted before, however, none of the cards are running “too hot,” and we’re more concerned with memory temperatures. The 3070 Ti thankfully didn’t exceed 100C on GDDR6X junction temperature during testing, and even that peak occurred while testing cryptocurrency mining.
GeForce RTX 3070 Ti: Good but With Diminishing Returns
We have to wonder what things would have been like for the RTX 3070 Ti without the double whammy of the Covid pandemic and the cryptocurrency boom. If you look at the RTX 20-series, Nvidia started at higher prices ($599 for the RTX 2070 FE) and then dropped things $100 with the ‘Super’ updates a year later. Ampere has gone the opposite route: Initial prices were excellent, at least on paper, and every one of the cards sold out immediately. That’s still happening today, and the result is a price increase — along with improved performance — for the 3070 Ti and 3080 Ti.
Thankfully, the jump in pricing on the 3070 Ti relative to the 3070 isn’t too arduous. $100 more for the switch to GDDR6X is almost palatable. The catch is that the 3070 already offers about 90% of the 3070 Ti’s performance for 80% of the price, making it an arguably better buy, and the real problem is the RTX 3080: it’s about 12–20% faster across our 13-game test suite and only costs $100 more (a 17% price increase).
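To put those percentages side by side, here’s a quick, purely illustrative sketch of performance per dollar at MSRP, using launch prices and the approximate relative performance figures quoted above (street prices are far higher in practice):

```python
# Relative performance normalized to the RTX 3070 Ti, paired with launch MSRPs.
# The 1.15 figure for the RTX 3080 is a rough midpoint of the 12-20% range above.
cards = {
    "RTX 3070":    {"msrp": 499, "rel_perf": 0.90},
    "RTX 3070 Ti": {"msrp": 599, "rel_perf": 1.00},
    "RTX 3080":    {"msrp": 699, "rel_perf": 1.15},
}

for name, c in cards.items():
    value = c["rel_perf"] / c["msrp"] * 1000  # relative performance per $1,000 of MSRP
    print(f"{name}: {value:.2f} relative perf per $1,000")
```

At MSRP, the plain 3070 still comes out on top by this metric, with the 3080 and 3070 Ti roughly tied.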
All of that is in theory, anyway. Nobody is really selling the RTX 3080 for $700, and nobody has since it launched. The 3080 often costs over $1,000 even in the lottery-style Newegg Shuffle, and the typical price is still above $2,000 on eBay; based on how big that markup is, it’s one of the worst cards to buy there. In comparison, the RTX 3070 Ti might only end up costing twice its MSRP on eBay, but that’s still $1,200, and it could very well end up costing more than that.
We’ll have to see what happens in the coming months. Hopefully, the arrival of two more desktop graphics cards in the form of the RTX 3080 Ti and RTX 3070 Ti will alleviate the shortages a bit. The hashrate limiter can’t hurt either, at least if you’re only interested in gaming performance, and the drop in mining profitability might help. But we’re far from being out of the shortage woods.
If you can actually find the RTX 3070 Ti for close to its $600 MSRP, and you’re in the market for a new graphics card, it’s a good option. Finding it will be the difficult part. This is bound to be a repeat of every AMD and Nvidia GPU launch of the past year. If you haven’t managed to procure a new card yet, you can try again (and again, and again…). But for those who already have a reasonable graphics card, there’s nothing really new to see here: slightly better performance and higher power consumption at a higher price. Let’s hope supply and prices improve by the time fall blows in.
The Cooler Master MM720 is a unique gaming mouse that improves on its predecessor, the Spawn, with a case, sensor and cable that compete with other high-end mice.
For
+ Unique design with ring finger support
+ Pure PTFE feet
+ Very lightweight, flexible cord
Against
– Side buttons can be hard to reach
– Cable already suffers from light kinking
It took nearly a decade, but Cooler Master finally announced a followup to its Spawn gaming mouse at CES 2020. The vendor has followed up its cult classic with the Cooler Master MM720. Available for $40–$50 as of writing, the MM720 is ready for the new millennium with a honeycomb-style chassis, upgraded sensor and a cable with both pros and cons. Ultimately, it’s a winning package that not only competes favorably against modern rivals but also its predecessor, which some consider the best gaming mouse of yesteryear.
Cooler Master MM720 Specs
Sensor Model: PixArt PMW-3389
Sensitivity: Up to 16,000 CPI native or 32,000 via software
Polling Rates: 125, 250, 500, or 1,000 Hz
Programmable Buttons: 6
LED Zones and Colors: 2x RGB
Cable: 6 foot (1.8m) USB Type-A
Connectivity: USB Type-A
Measurements (LxWxH): 4.15 x 3.01 x 1.47 inches (105.42 x 76.5 x 37.4mm)
Weight (without cable): 1.72 ounces (49g)
Extra: Replacement PTFE feet
Design and Comfort
Modern gaming mice often seem like they were made from the same mold. That isn’t necessarily a bad thing because manufacturers have mostly settled on shapes that can appeal to a broad audience, and breaking that mold can result in a truly awful mouse. But that didn’t stop Cooler Master from eschewing the staid designs of modern mice in favor of the unique, seemingly hand-molded case that inspired the original Spawn gaming mouse.
The Cooler Master MM720 is short, wide and defined by its curves. It almost seems like the company handed someone a ball of Silly Putty, told them to pretend it was a mouse and then used the resulting shape as inspiration. There is nary a flat surface on the mouse; every point of contact has been contoured in some way to better accommodate the natural shape of most people’s hands. This looks weird, yes, but it feels great during use.
But all of those things were true of the Spawn when it debuted a decade ago. The Cooler Master MM720 complements that ergonomic design with an ABS plastic honeycomb shell that weighs roughly half as much as its predecessor, a PixArt PMW-3389 optical sensor that’s been moved to a more sensible location under the mouse and a braided cable that should offer a better experience than the rubber cable Cooler Master had to use in the Spawn.
Cooler Master has also welcomed modern design trends with the MM720 in the form of two color options, white and black, with either a glossy or a matte finish. There’s a subdued Cooler Master logo on the palm rest that—along with the scroll wheel—provides the new mouse’s obligatory RGB lighting. And, of course, the honeycomb shell makes the MM720 look much different from the Spawn’s solid plastic construction.
The result is a mouse that is familiar in many ways, thanks to its similarity to mice like the similarly priced Cooler Master MM710 and Glorious Model D, yet still novel because of its shape. The Cooler Master MM720 measures 4.15 inches long, 3.01 inches wide and 1.5 inches tall and weighs 1.72 ounces. For comparison, the MM710 is 4.59 x 2.46 x 1.51 inches and about 1.87 ounces, and the Model D- is 4.72 x 2.40-2.64 x 1.30-1.57 inches and 2.4 ounces.
Unfortunately, the matte black option of the Cooler Master MM720 we tested is also a fingerprint magnet, which gives the already odd-looking mouse an even less appealing aesthetic. This problem might not be as noticeable on other versions of the mouse though, especially the white ones. And it’s merely a cosmetic issue. Cooler Master says the MM720’s case offers IP58 dust and water resistance, thanks to its special coating. The company also claims “you can dunk this bad boy in water to clean it off,” but I wasn’t brave enough to test that claim.
I also noticed some light kinking on the cable after just a little over a week of use. At this point it’s more of a visual distraction than anything else, but it does raise concerns about the cable’s long-term durability.
Gaming Performance
The Cooler Master MM720 is surprisingly comfortable to use for extended periods, and that’s mostly because it offers a place to rest your ring finger while you’re playing. Most gaming mice tend to ignore the existence of our ring fingers entirely—companies typically account for our thumbs, index fingers and middle fingers before calling it quits. But the Cooler Master MM720’s design accounts for one of those neglected appendages (sorry, pinky), and this seemingly inconsequential change makes a noticeable difference over the course of a long play session.
It’s also surprisingly easy to fling the Cooler Master MM720 around a mousepad. Many of the changes Cooler Master made to this mouse contribute to that ease of movement: the 100% pure PTFE feet are smoother than Rob Thomas, and the braided cable offers minimal drag, although it was still somewhat distracting coming off the wireless mice I’ve reviewed lately. I’m firmly in the wireless camp at this point, (see our Best Wireless Mouse page for recommendations), but if you insist on having a cable you could do worse than the Cooler Master MM720 when it comes to actual gameplay. Of course, your final views will depend on how founded or unfounded those concerns about durability prove to be.
The Cooler Master MM720’s light weight, smooth feet and braided cable are complemented by the PMW-3389 optical sensor, specced for up to 16,000 counts per inch (CPI) sensitivity, a max velocity of 400 inches per second (IPS) and max acceleration of 50g. Many other mice, including the excellent Razer Naga Pro, use the same sensor to great effect.
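Those spec-sheet numbers also imply an upper bound on how much motion the sensor can report per polling interval. A rough back-of-the-envelope sketch, purely for illustration:

```python
CPI = 16_000      # counts per inch (native maximum)
MAX_IPS = 400     # rated tracking speed, inches per second
POLL_HZ = 1_000   # polling rate that sets the report interval

counts_per_second = CPI * MAX_IPS                 # 6,400,000 counts/s at the rated limit
counts_per_report = counts_per_second / POLL_HZ   # 6,400 counts per 1 ms report

print(f"Up to {counts_per_report:,.0f} counts per report at {POLL_HZ} Hz")
```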
The sensor’s also in a sensible position on the MM720: smack-dab in the middle of the mouse, as opposed to the offset sensor found in the original Spawn. I didn’t have any trouble popping heads in Valorant with the Cooler Master MM720, and the PMW-3389’s reliability is a big contributor to that.
Another contributor: The LK optical micro switches used in the primary mouse buttons. They are certainly responsive, and I only found myself shouting “but I clicked!” because of network problems, not because of a missed input. Cooler Master markets the switches as offering “nearly instant actuation” and reducing debounce time to “practically zero.”
In fact, the only problems I had in-game with the Cooler Master MM720 involved the side buttons. They appear to be well-made, as I didn’t notice any pre or post-travel during everyday use, but their placement just doesn’t work for me. Practically every aspect of the mouse lends itself to a relaxed grip, so I want to rest my thumb in the dedicated groove along the side of the case, but the side buttons are located above that groove. This placement wouldn’t be a problem with my normal fingertip grip, but because of the Cooler Master MM720’s design, I would end up using something closer to a palm grip that forced me to stretch my thumb every time I wanted to press a side button. Cooler Master says the MM720 is fit for palm and claw grippers, but I can’t comfortably use a claw grip and take advantage of the ring finger rest, so it ended up being a matter of which trade-off I was most willing to live with.
Whether or not that’s a problem for you will depend on the grip you use, the size of your hand and how much importance you put on the side buttons. But it did seem a bit strange that this one aspect of the Cooler Master MM720’s design was at odds with the rest of the mouse. Maybe there’s a technical limitation preventing a lower placement for the side buttons or perhaps the grip I settled on wasn’t actually what Cooler Master had in mind. Hopefully others fare better in that regard.
Features and Software
The Cooler Master MM720 is configured using the comically named Cooler Master MasterPlus+ software. The utility offers information about your system, like the temperature, usage percentage, and voltage of your CPU and GPU by default. You can also use it to manage your other Cooler Master hardware. It checks for any new firmware on first launch, offers to install it and then gets out of the way so you can configure the Cooler Master MM720 using its many distinct settings.
The six programmable buttons can all be configured under the appropriately titled Buttons page. Because of the software’s Mouse Combo feature, there are actually just five programmable buttons—the side buttons, right mouse button and the scroll wheel directional inputs—by default. That setting allows each of the mouse’s buttons to perform a secondary function when the scroll wheel button is held down. Luckily that setting, which is enabled by default, can be disabled right on this page.
The MasterPlus+ software offers a variety of actions. Each button can be disabled, set to behave like another mouse button, keyboard key, or DPI switch, used to control multimedia playback or tasked with executing a macro, switching between profiles or performing a Rapid Fire action that repeats a given input up to 99 times as quickly as possible. There’s also an option to disable the sensor, which could prove useful if you want to stop someone from clicking around your system or if you want to watch a video without the controls popping up because you happened to jostle your desk, and the DPI switch on the bottom of the mouse can be assigned any of these functions as well.
The Cooler Master MM720 also offers a surprising amount of control over its performance. The usual settings are all here: You can enable angle snapping, toggle lift-off distance, or set the polling rate to 125, 250, 500, or 1,000 Hz. There are also sensitivity controls between 200 and 32,000 CPI; just be warned that setting the CPI any higher than 16,000 uses software and also causes problems because of the PMW-3389’s limitations. And by “causing problems” I mean the cursor is nigh impossible to control, skips around the screen, and is essentially unusable. Cooler Master provides seven CPI stages for toggling with the CPI switch that all offer separate values for their horizontal and vertical sensitivity; although, the two are linked by default.
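As a purely hypothetical illustration of why those beyond-native settings misbehave: the PMW-3389 tops out at 16,000 CPI in hardware, so anything higher has to be synthesized in software, for example by scaling every reported motion delta, which amplifies each count and every bit of jitter along with it:

```python
def scale_motion(dx, dy, requested_cpi, native_max_cpi=16_000):
    """Hypothetical software CPI extension: the sensor runs at its native
    maximum and the driver multiplies each reported delta."""
    if requested_cpi <= native_max_cpi:
        return dx, dy  # handled entirely by the sensor
    factor = requested_cpi / native_max_cpi  # e.g., 32,000 -> 2.0
    # A single-count hand tremor now moves the cursor 'factor' times as far,
    # which is one plausible reason the cursor feels skittish at these settings.
    return dx * factor, dy * factor
```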
MasterPlus+ also offers sliders for angle tunability, button response time and the operating system’s settings for double-click speed and pointer sensitivity. But the premier feature is Surface Tuning, which is supposed to optimize the sensor for your particular mousepad. I didn’t notice any improvement, but I’m also used to adapting to a variety of sensors in numerous mice, so maybe someone who spends months on end with the same mouse and/or sensor would better appreciate the setting.
The software’s RGB settings are similar to those found in most other utilities. Cooler Master offers seven preset colors, as well as slots for seven custom colors that you can set by using a color wheel or providing RGB values and adjusting the brightness slider. There are four built-in effects—Static, Breathing, Color Cycle and Indicator—that mostly perform as expected. I say mostly because Indicator is a bit of an odd duck. It’s not clear what exactly it’s indicating, and it’s the only built-in effect that uses different colors for the two RGB zones (blue in the palm rest and pink under the scroll wheel), but those colors don’t appear to be customizable, and they remain static even if I move the mouse or click around the app.
There’s also the option to create a custom lighting effect, but this seems to be limited to solid colors because the LED speed and LED direction settings are grayed out. Aside from using the Indicator setting, this appears to be the only way to set different colors for the two RGB zones, but the process isn’t particularly intuitive. You have to select a color and then, entirely without prompting, click on the zone you want to assign that color in the preview window.
Macros, meanwhile, are surprisingly limited. All you’re able to do is tell MasterPlus+ to start recording your keyboard or mouse inputs, tell it when to stop recording and then set the input delay for the individual actions you performed. The only other option is to run a macro once, have it loop for as long as the designated execution key is held down or have it loop until that key is pressed again. That isn’t to say the macros can’t prove useful, but they are more limited than they are in other utilities.
Finally, there are profiles. Cooler Master offers five by default, and they can each be reset, renamed, overridden by an imported profile, exported, or viewed as a .exe file in your file system. Otherwise, they simply store the settings managed by the other sections in the app to the mouse’s 512KB of onboard storage. You can change the mouse’s current profile without having to open (or download) the app again by using the profile switch button.
Bottom Line
I said in my review of the MSI Clutch GM41 Lightweight Wireless that it featured the “prototypical gaming mouse look.” Nobody could say that about the Cooler Master MM720. It’s a unique mouse that breaks the mold with purpose—providing a more comfortable gaming experience—instead of a misguided attempt to simply look different from the other mice on the market. Sure, the groundwork for this design was laid over a decade ago, but it’ll still be novel to most of its potential customers.
The Cooler Master MM720 is also a surprisingly good value, with a honeycomb shell, modern-day sensor, braided cable, large 100% pure PTFE feet and two RGB lighting zones, starting at $40 as of writing. Many companies would either charge more for mice with those components or choose different parts. The HK-Gaming Mira-M (currently $40), for example, relies on a PMW-3360 sensor and smaller feet.
The primary drawbacks to the Cooler Master MM720 are the placement of its side buttons and the questionable durability of its cable. But of far greater concern is the mouse’s shape and if it fits your style. I preferred palm gripping with the MM720, and people who’ve been waiting for a followup to the Spawn or a more ergonomic gaming mouse should be excited by the MM720. If you prefer an ambidextrous mouse or a claw grip, the Glorious Model D- and Mira-M may be better options.
There isn’t necessarily a clear winner between the Mira-M, Model D- and MM720, which all earned our Editor’s Choice Award. But that might actually be a good thing: Having options with quite different shapes but similar pricing, specs and performance is a sign that this ultralight segment is maturing. Now you can opt for the mouse that best suits your hand size, grip and play style.
For gamers seeking a unique, ergonomic-minded option, the Cooler Master MM720 is a solid product. Let’s just hope it doesn’t take Cooler Master another decade to release a followup, eh?
The NVIDIA GeForce RTX 3070 Ti is the company’s attempt at bolstering its sub-$700 lineup targeting a segment of the gaming market that predominantly games at 1440p, but needs an upgrade path toward 4K UHD. Cards from this segment are very much capable of 4K gaming, but require a tiny bit of tweaking. There are also handy features like DLSS to fall back on. NVIDIA already has such a product in the RTX 3070, so why did it need the new RTX 3070 Ti? The answer lies in AMD’s unexpected return to the high-end graphics market with its Radeon RX 6800 series “Big Navi” graphics cards. The RX 6800 was found to outclass the RTX 3070 in most games that don’t use raytracing, and the more recently released RX 6700 XT only adds to the pressure as it trades blows with the RTX 3070 at a slightly lower price.
The GeForce RTX 3070 Ti is one half of a two-part refresh of the higher end of NVIDIA’s GeForce RTX 30-series “Ampere” product stack, the other half being the RTX 3080 Ti we reviewed last week. NVIDIA attempted to set the RTX 3070 Ti apart from the RTX 3070 without significantly increasing manufacturing costs (i.e., without having to tap into the larger GA102 silicon). It did this with two changes. First, the RTX 3070 Ti maxes out the GA104 chip, enabling all 6,144 CUDA cores physically present as opposed to the 5,888 on the RTX 3070, a 4% increase. Next, NVIDIA gave the memory sub-system a major boost by equipping this card with 19 Gbps GDDR6X memory instead of the 14 Gbps GDDR6 on the RTX 3070, a 35% increase in memory bandwidth even though the memory size remains the same at 8 GB. Slightly higher GPU clock speeds wrap things up. The idea is to outclass the RX 6700 XT and make up ground lost to the RX 6800.
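The bandwidth math behind that 35% figure follows directly from the data-rate change on the same 256-bit bus. A quick sketch, assuming the usual peak-bandwidth formula of per-pin data rate times bus width:

```python
def memory_bandwidth_gbs(data_rate_gbps, bus_width_bits):
    """Peak memory bandwidth in GB/s: per-pin data rate times bus width, divided by 8 bits/byte."""
    return data_rate_gbps * bus_width_bits / 8

rtx_3070    = memory_bandwidth_gbs(14, 256)  # 448 GB/s (GDDR6)
rtx_3070_ti = memory_bandwidth_gbs(19, 256)  # 608 GB/s (GDDR6X)
uplift = rtx_3070_ti / rtx_3070 - 1          # ~0.357

print(f"{rtx_3070:.0f} GB/s -> {rtx_3070_ti:.0f} GB/s (+{uplift:.1%})")
```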
The “Ampere” graphics architecture debuts the second generation of NVIDIA’s ambitious RTX real-time raytracing technology, which combines raytraced elements with conventional raster 3D to significantly improve realism. It pairs second-generation RT cores (fixed-function hardware that accelerates raytracing and now enables even more raytraced effects) with third-generation Tensor cores, which accelerate AI deep learning and leverage sparsity to significantly increase inference performance, and the new Ampere CUDA core, which doubles compute performance over the previous generation by leveraging concurrent INT32+FP32 math.
The new GeForce RTX 3070 Ti Founders Edition graphics card comes with an all-new design that looks like a cross between the RTX 3080 FE and the RTX 3070 FE. It implements the same dual-axial flow-through concept as the RTX 3080 FE, but with styling cues closer to the RTX 3070 FE. The design uses two fans, one on either side of the card, and a PCB that is shorter than the card itself, so fresh air drawn in by one fan is exhausted out the other side for better heat dissipation. NVIDIA is pricing the GeForce RTX 3070 Ti Founders Edition at $599, a $100 premium over the RTX 3070. We expect current market conditions to push the card to around $1,300 in practice, matching the RTX 3070 and sitting slightly below the $1,400 RX 6800 non-XT.
The MSI GeForce RTX 3070 Ti Suprim X is the company’s top custom-design graphics card based on the swanky new RTX 3070 Ti high-end graphics card by NVIDIA. The Suprim series represents MSI’s best efforts in the areas of product design, factory-overclocked speeds, cooling performance, and more. NVIDIA debuted the RTX 3070 Ti and RTX 3080 Ti to augment its RTX 30-series “Ampere” graphics card family, particularly as it faced unexpected competition from rival AMD in the high end with the Radeon RX 6000 series “Big Navi” graphics cards. The RTX 3070 Ti is designed to fill a performance gap between the RTX 3070 and RTX 3080, letting NVIDIA better compete with the RX 6700 XT and RX 6800, which posed stiff competition to the RTX 3070. Cards from this segment are expected to offer maxed-out gaming at 1440p with raytracing enabled, and also retain the ability to play at 4K UHD with reasonably good settings.
The GeForce RTX 3070 Ti is based on the same GA104 silicon as the RTX 3070, but NVIDIA made two major design changes—first, it has maxed out the GA104, enabling all 6,144 CUDA cores as opposed to 5,888 on the RTX 3070; and second, it is using faster 19 Gbps GDDR6X memory in place of 14 Gbps GDDR6 memory. The memory sub-system alone sees a significant 35% uplift in bandwidth. The memory size is still 8 GB.
The GeForce “Ampere” graphics architecture debuts the second generation of NVIDIA’s path-breaking RTX real-time raytracing technology, which combines raytraced effects, such as reflections, shadows, lighting, and global illumination, with conventional raster 3D graphics to increase realism. “Ampere” combines second-generation RT cores with third-generation Tensor cores that accelerate AI, and faster “Ampere” CUDA cores.
The MSI RTX 3070 Ti Suprim X is an attempt by MSI to match NVIDIA’s Founders Edition cards in terms of aesthetics. A premium-looking, brushed metal cooler shroud greets you, with its trio of TorX 4.0 fans and a dense aluminium fin-stack heatsink. MSI has given the RTX 3070 Ti its top factory overclock at 1860 MHz, compared to the 1770 MHz reference. In this review, we take the card out for a spin to see whether MSI has managed to build a better-looking and better-performing card than the NVIDIA Founders Edition.
GeForce RTX 3070 Ti Market Segment Analysis
| Card | Price | Cores | ROPs | Core Clock | Boost Clock | Memory Clock | GPU | Transistors | Memory |
| RX 5700 XT | $370 | 2560 | 64 | 1605 MHz | 1755 MHz | 1750 MHz | Navi 10 | 10300M | 8 GB, GDDR6, 256-bit |
| RTX 2070 | $340 | 2304 | 64 | 1410 MHz | 1620 MHz | 1750 MHz | TU106 | 10800M | 8 GB, GDDR6, 256-bit |
| RTX 3060 | $900 | 3584 | 48 | 1320 MHz | 1777 MHz | 1875 MHz | GA106 | 13250M | 12 GB, GDDR6, 192-bit |
| RTX 2070 Super | $450 | 2560 | 64 | 1605 MHz | 1770 MHz | 1750 MHz | TU104 | 13600M | 8 GB, GDDR6, 256-bit |
| Radeon VII | $680 | 3840 | 64 | 1400 MHz | 1800 MHz | 1000 MHz | Vega 20 | 13230M | 16 GB, HBM2, 4096-bit |
| RTX 2080 | $600 | 2944 | 64 | 1515 MHz | 1710 MHz | 1750 MHz | TU104 | 13600M | 8 GB, GDDR6, 256-bit |
| RTX 2080 Super | $690 | 3072 | 64 | 1650 MHz | 1815 MHz | 1940 MHz | TU104 | 13600M | 8 GB, GDDR6, 256-bit |
| RTX 3060 Ti | $1300 | 4864 | 80 | 1410 MHz | 1665 MHz | 1750 MHz | GA104 | 17400M | 8 GB, GDDR6, 256-bit |
| RX 6700 XT | $1000 | 2560 | 64 | 2424 MHz | 2581 MHz | 2000 MHz | Navi 22 | 17200M | 12 GB, GDDR6, 192-bit |
| RTX 2080 Ti | $1400 | 4352 | 88 | 1350 MHz | 1545 MHz | 1750 MHz | TU102 | 18600M | 11 GB, GDDR6, 352-bit |
| RTX 3070 | $1300 | 5888 | 96 | 1500 MHz | 1725 MHz | 1750 MHz | GA104 | 17400M | 8 GB, GDDR6, 256-bit |
| RTX 3070 Ti | $1300 (MSRP: $600) | 6144 | 96 | 1575 MHz | 1770 MHz | 1188 MHz | GA104 | 17400M | 8 GB, GDDR6X, 256-bit |
| MSI RTX 3070 Ti Suprim X | $1350 | 6144 | 96 | 1575 MHz | 1860 MHz | 1188 MHz | GA104 | 17400M | 8 GB, GDDR6X, 256-bit |
| RX 6800 | $1400 | 3840 | 96 | 1815 MHz | 2105 MHz | 2000 MHz | Navi 21 | 26800M | 16 GB, GDDR6, 256-bit |
| RX 6800 XT | $1700 | 4608 | 128 | 2015 MHz | 2250 MHz | 2000 MHz | Navi 21 | 26800M | 16 GB, GDDR6, 256-bit |
| RTX 3080 | $1500 | 8704 | 96 | 1440 MHz | 1710 MHz | 1188 MHz | GA102 | 28000M | 10 GB, GDDR6X, 320-bit |
| RTX 3080 Ti | $2200 | 10240 | 112 | 1365 MHz | 1665 MHz | 1188 MHz | GA102 | 28000M | 12 GB, GDDR6X, 384-bit |