Strong comeback of the German economy in the summer
The German economy recovered from the crash of the Corona crisis more strongly than initially assumed. Gross domestic product rose by 8.5 percent from July to September compared with the second quarter, the Federal Statistical Office announced. Based on preliminary data, the authority had assumed an increase of 8.2 percent. At the end of the year, however, the recovery is likely to pause due to the partial lockdown in Germany and restrictions in many other European countries in view of rising infection numbers.
Artists up in arms over license-free content snippets
A group of 576 artists is appealing to politicians not to play copyright off against them. Above all, the "minor use" exception proposed in the Federal Ministry of Justice's draft copyright reform, which would make snippets of content such as memes on social media license-free and free of charge, must be dropped, the authors demand. "The Corona crisis hit us hard," the artists write. It is all the more painful that they still do not receive adequate remuneration for their works on online platforms, the only market that remains unrestricted.
Our weekday news podcast delivers the most important news of the day compressed into 2 minutes. Anyone who uses voice assistants such as Amazon Alexa or Google Assistant can also hear or see the news there. Simply activate the skill on Alexa or say to the Google Assistant: “Play heise top”.
WireGuard for Windows continues to take shape
In quick succession, the WireGuard project released Windows versions 0.2 and 0.3 of its VPN software. An important innovation is a restricted view for regular users: with the Network Configuration Operators function, the project is aimed particularly at corporate users whose system administrators use WireGuard but do not want to give their users administrator rights. In addition, WireGuard for Windows can now be used on ARM and ARM64 systems. With further changes and bug fixes, WireGuard should also run faster and more stably.
Snapchat launches its TikTok for Snaps
Snapchat is rolling out its new Spotlight function for Android and iOS with immediate effect. For Snapchat users, a new menu item in the form of a play button appears at the bottom right, with which the videos can be played. To create a Spotlight snap, select "Spotlight" on the send screen and add a hashtag if you wish. Spotlight is strongly reminiscent of TikTok or Instagram's Reels. There will be no public comment function for Spotlight snaps – but you can share them with friends and like them.
Microsoft is in the early phases of rolling out its xCloud streaming service on mobile devices, but TVs are the next logical step. In an interview with The Verge, Xbox chief Phil Spencer has revealed we’ll likely see an Xbox app appear on smart TVs over the next year. “I think you’re going to see that in the next 12 months,” said Spencer, when asked about turning the Xbox into a TV app. “I don’t think anything is going to stop us from doing that.”
Spencer previously hinted at TV streaming sticks for Microsoft’s xCloud service last month, and this latest hint suggests we might see similar hardware or an Xbox app for TVs during 2021. Microsoft is currently working on bringing xCloud to the web to enable it on iOS devices, and this work would naturally allow xCloud to expand to TVs, browsers, and elsewhere.
Microsoft was previously working on a lightweight Xbox streaming device back in 2016, but it canceled the hardware. Microsoft has been testing the idea of streaming and TV sticks ever since the company originally demonstrated Halo 4 streaming from the cloud to Windows and Windows Phones all the way back in 2013.
Microsoft’s xCloud service. (Photo by Nick Statt / The Verge)
While Microsoft might be pushing ahead with xCloud, it certainly has no plans to abandon consoles or hardware. “I don’t think these will be the last big pieces of hardware that we ship,” says Spencer. Instead, Microsoft sees a future where there’s a hybrid of local hardware and cloud hardware. “When we think about xCloud, which is our version of Stadia or Luna, I think what it needs to evolve to are games that actually run between a hybrid environment of the cloud and the local compute capability,” explains Spencer. “It’s really a hybrid between both of those.”
How this hybrid plays out could mean we see Xbox Series S and X consoles getting access to xCloud soon. This could allow players to try a game quickly before they fully download it from Game Pass, or possibly even stream a demo of a game before purchasing it.
Microsoft also has plans to integrate xCloud into Facebook Gaming in the future, so we’re clearly going to see a lot of changes to xCloud soon. We still don’t have full details on Microsoft’s plans for Xbox TV apps, but the company did partner with Samsung earlier this year for xCloud. Microsoft is also planning to upgrade its server blades to the more capable Xbox Series X hardware at some point in 2021.
You can read or listen to Nilay Patel’s full interview with Phil Spencer right here.
Online trade in Germany is growing steadily, and in recent years yellow, brown and other colored delivery vans have become more conspicuous in the street scene, often double-parked. However, it is not their delivery drivers who clog the streets in the cities; according to the e-commerce industry, other culprits are responsible, for example car sharing and commuters. Online trade with private end customers, combined with the significantly stronger growth in parcel deliveries to commercial customers, has not led to a serious increase in delivery trips by shipping service providers compared with all other traffic movements in cities, explains the Federal Association of E-Commerce and Mail Order Germany (BEVH). The shipment volume increased significantly less in percentage terms than order turnover. In addition, more and more orders are being delivered to customers in the surrounding areas or in smaller towns.
No "problem case e-commerce"
"There can be no question of e-commerce being a 'problem case' for the traffic situation in large cities on account of increased parcel deliveries," says BEVH managing director Christoph Wenk-Fischer according to a statement. He refers to findings by market researchers at MRU, who specialize in parcel and delivery services. In the example city of Hamburg, delivery traffic for all commercial and private deliveries rose from 2.1 departures per square kilometer in 2017 to 2.2 last year. By contrast, two years ago there were 28 daily departures per square kilometer for deliveries to stationary retail or the catering trade.
In addition, there were around 800 car-sharing vehicles in the Hamburg city area in 2017 and about 1,500 in 2019. The number of cars registered in Hamburg rose between 2017 and 2019 by around 25,000 to slightly more than 795,000, and the number of residents of the surrounding area commuting into Hamburg grew by almost 30,000.
In the ranking of parcel delivery traffic per day and square kilometer, Munich came first last year with 4.9. It is followed by Frankfurt and Düsseldorf with 3.8 and 3.3. Berlin is on par with Hamburg at 2.9, according to MRU's calculations.
Better logistics
In its statement, the BEVH indirectly addresses the problem of delivery vans as parked traffic obstacles and has MRU managing director Horst Manner-Romberg comment: "There is still a lack of comprehensive solutions and of consideration for logistical requirements in urban planning." There is a lack of delivery and loading zones, areas for micro-hubs, and delivery time windows.
The Lower Saxony Association of Cities and Municipalities has already suggested more finely tuned pricing concepts. Not every online customer wants to be supplied immediately and in person – after all, providers are increasingly using parcel or delivery boxes in parallel. There are already a few ideas for refined logistics in cities: Apcoa is offering up its parking garages, parcel services are experimenting with cargo bikes and, to the displeasure of its competitors, the Post has suggested a "consolidated delivery" of competitors' shipments.
As of today, AWS customers can activate WebAuthn in AWS Single Sign-On as an additional multi-factor authentication method. So far, the service only offered one-time passwords (OTP) and RADIUS for this.
WebAuthn is a web standard for passwordless authentication with public-key procedures, developed by the FIDO Alliance and the W3C and published as version 1.0 in 2019. It is also a core component of FIDO2, which is used in many security keys and in current operating systems such as Android, Linux and Windows. To reach a local authenticator – such as a fingerprint scanner, integrated facial recognition or a hardware token – the Client-to-Authenticator Protocol (CTAP) is used.
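To illustrate what this flow looks like from the browser's perspective, here is a minimal sketch of the WebAuthn registration step using the standard navigator.credentials.create() API. This is not AWS-specific code – AWS Single Sign-On performs an equivalent exchange behind its console UI – and the relying-party name, user data and locally generated challenge are placeholders; a real service generates the challenge server-side and verifies the returned attestation there.

```typescript
// Minimal WebAuthn registration sketch (browser side only).
// The challenge and user handle are placeholders; a real relying party
// generates the challenge on the server and validates the result there.
async function registerAuthenticator(): Promise<void> {
  const publicKey: PublicKeyCredentialCreationOptions = {
    // Random challenge; normally fetched from the server.
    challenge: crypto.getRandomValues(new Uint8Array(32)),
    rp: { name: "Example Service" },
    user: {
      id: new TextEncoder().encode("user-1234"), // opaque user handle
      name: "alice@example.com",
      displayName: "Alice Example",
    },
    // ES256 (-7) and RS256 (-257) cover most FIDO2 authenticators.
    pubKeyCredParams: [
      { type: "public-key", alg: -7 },
      { type: "public-key", alg: -257 },
    ],
    authenticatorSelection: { userVerification: "preferred" },
    timeout: 60_000,
  };

  // The browser talks to the authenticator (fingerprint reader, security
  // key, platform sensor) via CTAP and returns a new credential.
  const credential = await navigator.credentials.create({ publicKey });
  console.log("New credential created:", credential);
}
```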
MFA can be set up with just a few settings via the WebAuthn authenticator.
(Image: Amazon AWS)
Users can register various authentication devices via the WebAuthn authenticator and in the future use one of them to log into the AWS Management Console or the AWS Command Line Interface (CLI). The two-factor authentication from WebAuthn not only works for all identities stored in the internal identity store of AWS Single Sign-On, but also for users from an Active Directory – regardless of whether they are managed by AWS or not. All information and instructions for configuration can be found in the associated blog entry.
AMD has announced the revised overclocking algorithm Precision Boost Overdrive 2 for the recently introduced Ryzen 5000 processors. The Curve Optimizer it contains is particularly interesting: it is supposed to "intelligently" reduce the core voltage of the CPUs by using the internal sensors, among other things, to measure load, temperatures and currents.
Instead of a general offset that always subtracts a fixed value from the standard voltage, the result is a curve. Users can set the Curve Optimizer in 30 levels with a saving of 3 to 5 millivolts each, i.e. a reduction of at most 0.09 to 0.15 volts. Ryzen 5000 processors draw the most voltage at low and medium clock frequencies and less at the peak in order to keep performance as high as possible.
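To show how those levels translate into a voltage offset, here is a small sketch that converts a Curve Optimizer setting into the resulting undervolt range. The 3 to 5 millivolt-per-step figure is AMD's; the helper function and its names are ours.

```typescript
// Rough sketch: convert a Curve Optimizer step count into a voltage offset.
// AMD specifies roughly 3 to 5 millivolts per step; the exact value depends
// on the individual chip, so we report the possible range.
const MV_PER_STEP_MIN = 3;
const MV_PER_STEP_MAX = 5;

function undervoltRange(steps: number): { minVolts: number; maxVolts: number } {
  return {
    minVolts: (steps * MV_PER_STEP_MIN) / 1000,
    maxVolts: (steps * MV_PER_STEP_MAX) / 1000,
  };
}

// Maximum negative setting of 30 steps:
const r = undervoltRange(30);
console.log(`Offset between -${r.minVolts} V and -${r.maxVolts} V`);
// -> Offset between -0.09 V and -0.15 V
```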
So-called undervolting can even improve performance, because no two silicon chips are identical and chip manufacturers build in voltage reserves. Many processors (and GPUs) also reach their target clock frequencies with reduced voltage. Undervolting reduces power consumption, which in turn lowers temperatures and can thus have a positive effect on clock behavior.
AMD Precision Boost Overdrive 2 with Curve Optimizer (5 images)
The Curve Optimizer draws more voltage at lower clock frequencies than at high ones.
(Image: AMD)
Ryzen 9 benefits the most
According to AMD, the 16- and 12-core Ryzen 9 5950X and Ryzen 9 5900X benefit the most from the Curve Optimizer, as the two models often exhaust their thermal design power (TDP) of 105 watts (in practice up to 142 watts via Package Power Tracking, PPT). A slight voltage reduction could give a Ryzen 9 5900X up to 10 percent more multithreading performance, as AMD quotes from Anandtech – plus a small gain in single-threading performance.
Models with fewer cores, such as the Ryzen 7 5800X and Ryzen 5 5600X, are most likely to benefit in single-threading performance. To increase the effect, the Curve Optimizer can be set individually for each CPU core, including in the positive direction. That means: overvolting individual cores while all others run with reduced voltage. AMD recommends this for the two best cores, which are marked with a star and a white dot and can reach a potentially higher boost with more voltage.
As of AGESA 1.1.8.0 in the UEFI BIOS
AMD unlocks the Curve Optimizer with the upcoming AMD Generic Encapsulated Software Architecture (AGESA) 1.1.8.0 on AM4 mainboards of the 500 and 400 series – corresponding BIOS updates should appear from December 2020. An update for the Ryzen Master tool, which enables the setting in Windows, is due next year. As always: with voltage adjustments, CPUs run outside their specifications, so the warranty is void.
The Curve Optimizer does not work with Ryzen 3000 processors or older models – according to AMD, these CPUs lack the hardware requirements.
Thanks to ARM: The fast M1 chip in the new Macs makes it very tempting for Windows users to venture from the PC into the Apple universe. Anyone who wanted to get on with their work immediately, however, had a problem until now: Apple's Migration Assistant, intended to make it easier to move from computer to computer, was not yet compatible with macOS 11 alias Big Sur – and that is the operating system running on the new Apple silicon machines.
Small tool, many functions
Since the end of last week, the wait has been over. On Friday, Apple made the new version 2.3.0.0 of the Windows Migration Assistant available for download. The 5.1 MB download "helps you migrate your data from a Windows PC running Windows," says Apple. The app starts as soon as it has been installed. The minimum requirement on the PC side is Windows 7 or later; on the Mac it must be macOS 11 aka Big Sur. For earlier macOS versions you need an older version of the Migration Assistant – or you skip the update.
A complete clone of the current work environment is of course not available when switching from Windows to macOS. This only works when switching from Mac to Mac. “Contacts, calendar, e-mail accounts and much more” are transferred. This includes iTunes content, pictures, files from the root directory, system settings such as desktop pictures or the start page of the browser and its bookmarks.
64-bit versions of Outlook
The Migration Assistant does not support 64-bit versions of Outlook. "You can migrate mail, contacts or calendars from Outlook 2013 or Outlook 2016 manually. Sign in and enter the content manually on your Mac," Apple says succinctly. The Migration Assistant also transfers "only the mail or contact data that belong to the logged-in Windows user".
Apple recommends running the Check Disk utility (chkdsk) on the PC before the migration starts – to make sure there are no problems on the Windows hard drive. (bsc)
The WireGuard project released Windows versions 0.2 and 0.3 of its VPN software in quick succession. An important innovation is a restricted view for regular users: Anyone who belongs to the Network Configuration Operator group can start and stop tunnels in it and view their status without administrator rights. However, any access to the keys remains blocked for them. With this function, the project aims in particular at company users whose system administrators use WireGuard, but do not want to give their users administrator rights.
Furthermore, the configuration can now be found under %ProgramFiles%\WireGuard\Data\Configurations. Previously, the software saved it in the LocalSystem user profile. Microsoft advises against the latter, however, and no longer migrates these settings files between Windows 10 versions. When updating, WireGuard automatically moves the configuration to its new location, where it is stored encrypted without user intervention.
WireGuard for the Surface Pro X and the Pi
Furthermore, WireGuard for Windows can now be used on ARM and ARM64 systems. With the latter, the project is targeting Microsoft's Surface series and the Raspberry Pi. Previously, the software could only be used on x86 and amd64 computers. Instead of letting users pick the correct MSI themselves, WireGuard now offers an installer that automatically downloads the correct installation file in the latest version, validates its signature and then sets up the program. The individual MSIs are retained, however, so that administrators can keep and deploy them themselves.
With further changes and bug fixes, WireGuard should also run faster and more stably. The project has also added further translations. According to the announcement of the two new versions, split tunneling is not yet ready for productive use. WireGuard for Windows is released as open source software under the MIT license.
The storage location of the configuration profile is not just any user profile, but the LocalSystem user profile. The unclear wording in the text has been corrected.
Under certain conditions, attackers could attack Linux and Windows systems running VMware Cloud Foundation, Identity Manager, Identity Manager Connector, vRealize Suite Lifecycle Manager, Workspace One Access and Workspace One Access Connector and execute malicious code. Security updates are not yet available; admins must secure their systems with workarounds.
To successfully exploit the vulnerability classified as "critical" (CVE-2020-4006), attackers need network access to the admin configuration panel (port 8443) and a valid password. With this access, they can execute their own commands with unrestricted rights.
Secure systems now!
In a warning message, VMware lists further information about the security gap. As an accompanying article explains, the workarounds only work with Identity Manager, Identity Manager Connector and Workspace One Access.
If the workaround is active, admins cannot use the configuration panel. If access is absolutely necessary, they can temporarily deactivate the workaround and re-enable it after making the settings.
It is not yet known when the security updates will appear. These versions are affected by the vulnerability:
Eight email programs for the desktop in a comparison test
Never change a running client? Desktop mail programs in the age of smartphones and web mailers? Why, surely! The well-known feature monsters are still very popular because they support the bread-and-butter business with precisely configurable filters, with text modules or simply with keyboard shortcuts.
Mobile and web clients keep the business lively as constant competition, so new concepts are also developing in the desktop market: the established programs not only keep pace with them, they also learn new features – and even software that has long been considered finished, such as Thunderbird, is receiving updates with new functions again.
The test field includes the email clients Airmail, Apple Mail, eM Client, Evolution, Geary, Outlook, Thunderbird and Windows Mail.
Intrepid developers have done the expected and necessary: Doom now runs on Nintendo’s recently released Game & Watch portable. The Game & Watch: Super Mario Bros. was designed to run three games, but successfully porting Doom shows it can handle others if you have a lot of patience (via Hackaday).
This version of Doom is not perfect, of course. In order to get the game to run even at the slower speed shown in the video below, sound had to be disabled and textures had to be simplified to fit the game on the small portable. And getting that far wasn’t easy: programmers stacksmashing and Konrad Beckmann had to trick the Game & Watch into offloading its firmware by injecting code into the external storage accessed by the Game & Watch’s tiny microcontroller.
Finding a version of Doom small enough to fit on the Game & Watch’s meager amount of storage was a separate issue. The hackers settled on a package called “Minimal Doom IWAD” that replaces the game’s original textures with simplified versions, but they still had to make further adjustments like disabling sound to work within the Game & Watch’s 1.1 MB of usable storage.
The Game & Watch: Super Mario Bros. is designed to evoke Nintendo’s original Game & Watch handhelds, while running a version of Nintendo’s NES classic, Super Mario Bros., as well as Super Mario Bros. 2 (known in the West as The Lost Levels). (It also runs a Mario-themed version of the Game & Watch game Ball.) Doom’s demon-slaying action wasn’t ported to a Nintendo console until the more powerful SNES was released, so showing up on a tiny device designed to run NES games is as unusual as it is impressive.
Doom is an old standby for programmers experimenting with homebrew games. The game has shown up in all sorts of places: from a Samsung refrigerator running xCloud, to a Windows PC built in Minecraft, and even the Touch Bar of a MacBook Pro. Thanks to this work, Doom seems right at home on a Game & Watch, and soon, other homebrew games might be, too.
A Weibo user has shared some interesting photographs of a domestic gaming PC that’s being cooked up in China. The grand novelty is that the machine isn’t using an Intel or AMD processor, but an Arm-powered chip, more specifically the Phytium FT-2000/4.
The China Electronics Corporation (CEC) is behind the PKS-labeled gaming PC. You might not have heard of the company before, but the Chinese fabless chipmaker Phytium should ring a bell. That’s because CEC is the parent company of Phytium. Now that the connection is made, it shouldn’t come as a surprise that the PKS would leverage one of Phytium’s FeiTeng 64-bit ARMv8-based processors.
The FT-2000/4 is an FCBGA processor with 1,144 pins, measuring 35 x 35mm. TSMC produces the FT-2000/4 for Phytium on the foundry’s 16nm process node. The FT-2000/4 wields four FTC663 processing cores, which run at three different clock speeds: 2.2 GHz, 2.6 GHz, and 3 GHz. The quad-core part also comes equipped with 4MB of L2 cache (split into 2MB per two cores) and 4MB of L3 cache. According to Phytium, the FT-2000/4 has a typical power consumption of a mere 10W.
PKS Gaming PC (Image credit: 蚁工厂/Weibo)
The FT-2000/4 processor supports dual-channel DDR4-3200 memory and delivers up to 34 PCIe 3.0 lanes. The latter allows for a configuration of up to two PCIe 3.0 x16 lanes and two PCIe 3.0 x1 lanes. As a result, the Chinese firm outfits the PKS gaming PC with 32GB of DDR4 memory and a discrete graphics card with 8GB of memory. The Chinese firm didn’t specify the model of the graphics card, but it’ll be interesting to see whether the FT-2000/4 at 10W will be the bottleneck.
On the storage side, the FT-2000/4 can accommodate up to four SATA III ports, which is why the manufacturer advertises support for conventional SSDs and high-capacity hard drives. In terms of connectivity, the PKS gaming PC offers one Gigabit Ethernet port, six unspecified USB ports, one HDMI port, and one DisplayPort output. There’s also dual-display support.
The PKS gaming PC is a domestic product and will likely feature a homegrown operating system, such as UOS (Unity Operating System) or NeoKylin. The caveat with Chinese operating systems is that they are based on some kind of Linux distribution, and there aren’t many native Linux games out there. Software like Wine or CrossOver exists to allow Linux users to run Windows games, but it’s one of those hit-or-miss experiences. However, the PKS gaming PC could probably succeed as an entry-level gaming PC or find its way into gaming cyber cafes where MMOs and MMORPGs prevail over more graphics-intensive titles.
We’re pointing to ways to get Windows 10 for free — or at least cheaper. (Image credit: Anton Watman/Shutterstock)
You can spend thousands of dollars on components when building a PC, but it won’t boot without an operating system (OS). Linux is a viable option, but most people prefer Windows because it runs all of their favorite software, including the latest games. And for those who were still holding on, Windows 7 has officially reached its end of life, meaning it won’t get any more support or security updates. Fortunately, you can get Windows 10 for free or cheap, if you know where to look.
Getting hold of the Windows installer is as easy as visiting support.microsoft.com. Whether you’ve paid for Windows 10 already or not, Microsoft lets anyone download a Windows 10 ISO file and burn it to a DVD, or create installation media on a USB drive for free. Once that’s done, you can boot from your installation media and load Windows 10 onto your PC. During the installation process, Microsoft asks for an activation key. You can skip it, but eventually, Windows will start alerting you that your install isn’t activated.
There are many ways to get a Windows 10 activation / product key, and they range in price from completely free to $309 (£339, AU$340), depending on which flavor of Windows 10 you want. You can of course buy a key from Microsoft online, but there are other websites selling Windows 10 keys for less. There’s also the option of downloading Windows 10 without a key and never activating the OS. But what, if anything, are you missing out on if you don’t activate Windows 10? And does your carefully crafted rig face any risks?
Below we outline the top ways you can get Windows 10 — from the cheapest to most expensive — and the downsides of each option.
Upgrade from Windows 7 or 8 — Pros: Access to all personalization options; Microsoft support access; Free. Cons: There’s a small chance Microsoft will reject activation, and you’ll have to contact support.
Don’t activate Windows — Pros: Free. Cons: Desktop watermark; personalization options restricted; can’t use Microsoft support.
Microsoft student discount — Pros: Access to all personalization options; Microsoft support access; equivalent to Windows 10 Enterprise; Free. Cons: You have to be enrolled in an eligible school.
Cheap key from a third-party seller — Pros: Access to all personalization options; Microsoft support access. Cons: There’s a small chance your key won’t work, and you’ll have to contact support to get it fixed; some third parties don’t offer refunds.
Buy a key from Microsoft — Pros: Access to all personalization options; Microsoft support access; refunds. Cons: Expensive.
Upgrade From Windows 7 or 8 to Windows 10: Free
Nothing’s cheaper than free. If you’re looking for Windows 10 Home, or even Windows 10 Pro, it’s possible to get Windows 10 for free onto your PC. If you already have a Windows 7, 8 or 8.1 software/product key, you can upgrade to Windows 10 for free. You activate it by using the key from one of those older OSes. But note that a key can only be used on one PC at a time, so if you use that key for a new PC build, any other PC running that key is out of luck.
To do this with a Windows 10 compatible PC (after backing up your important data, of course), download Windows 10. When asked, select “Upgrade this PC now.” Note that if you’ve recently changed your PC’s hardware, such as the motherboard, Windows may not find the license for your device. That means you’ll have to reactivate the OS. Here are Microsoft’s instructions for reactivating Windows 10 after changing PC hardware.
Downsides of Upgrading From Windows 7 or 8
When using an older Windows key to activate Windows 10, you may run into complications if Microsoft isn’t sure whether you’re eligible to update or not. In that case, you’d have to call a number and go through a process of entering your key and getting a code. But that seems to be happening less in recent months and years.
Don’t Activate Windows: Free
If you don’t have a valid key, you can still use Windows 10 for free on your PC even if you don’t activate the OS. I have colleagues who have used non-activated versions of Windows for years without Microsoft ever shutting it down. In this way, you can have Windows 10 Home or Pro running on your PC nearly flawlessly. Nearly…
Downsides of Not Activating Windows
“If the user [installs Windows 10] before activating Windows, they will see an ‘Activate Windows’ watermark on their desktop, as well an experience a limit on Windows 10 personalization options,” Microsoft told Tom’s Hardware in a statement.
Microsoft brands PCs running an unactivated version of Windows 10 with a watermark in the bottom-right corner of the screen. A Microsoft spokesperson told me that activating Windows 10 ensures you have a legitimate copy of Windows 10, and the watermark is an attempt to alert consumers that their version could be false. However, if you downloaded your ISO directly from Microsoft, there’s no way your copy can be a fake.
If you get Windows 10 for free and don’t activate it, you’ll see this watermark on your desktop.
If you don’t activate Windows 10, you won’t be able to change Personalization options in the Settings menu. That means you can’t choose personal desktop wallpapers, slideshow backgrounds, Start, taskbar, Action Center or title bar colors, light or dark color schemes, font choices or lock screen options.
The lack of custom aesthetics can be a downer, especially if you like to liven things up by changing colors and images. However, we checked, and you can still change your wallpaper if you right-click an image from the web or a personal photo and set it as your wallpaper. And if you have a wallpaper tied to your Microsoft account, it will appear if you sign into Windows with that account.
Personalization options are blocked out if you get Windows 10 for free and don’t activate it.
Unsurprisingly, Microsoft won’t offer you any Windows 10 technical support if you don’t activate the OS. If you call or chat with their techs, they’ll start off by asking you for your key, and you’ll have no response.
Use the Microsoft Student Discount: Free
Microsoft offers students attending certain universities and high schools the ability to get Windows 10 for free by allowing them to activate Windows 10 Education for free. Meanwhile, teachers can get Windows 10 Education for $14.99. You can see if your school is eligible and download your free Windows 10 key here. The key is yours even after you graduate.
Certain schools offer students and teachers Windows 10 for free or $15, respectively.
But is Windows 10 Education any different from Windows 10 Home? It’s actually better. Windows 10 Education is the same as Windows 10 Enterprise, which Microsoft calls the most robust version of Windows 10. The OS has features targeting security, device control and management and deployment that Windows 10 Home lacks. Unlike Windows 10 Home, Windows 10 Education has client and host remote desktop and remote app (instead of client only), Hyper-V (Microsoft’s hypervisor) and extra apps, like AppLocker and BitLocker. Although, it’s likely you won’t ever use any of those bonus features.
If you’re not currently a student but happen to have a .edu email, we don’t recommend scamming the system. In addition to ethical concerns, if you get caught, Microsoft can make you pay up anyway. “False representations of eligibility voids this offer, and Microsoft reserves the right to collect the full price of product(s) ordered,” Microsoft’s policy states.
Downsides of Using the Microsoft Student Discount
If your school is eligible for the discount, there isn’t really a downside to this method of procuring Windows 10 free. Not all colleges / high schools have it, and you may need to make a special user account to download it. But if you can score Windows 10 Education for free, we don’t see any reason not to.
Buy a Cheap Windows 10 Key From a Third-Party Seller: Around $30
Retailers like Kinguin and Newegg don’t sell Windows 10 for free, but they often have it for cheap. (Image credit: Kinguin)
If you can’t stand living with the scarlet letter of an eternal watermark or want the comfort of knowing Microsoft won’t disown your PC’s OS should you call for help, you’ll have to buy a Windows 10 key. And while some turn to Microsoft for this purchase, there are third-party websites selling keys for much cheaper than Microsoft. For example, at the time of writing, Kinguin sells Windows 10 Home for about $40, Amazon charges $129.99, Newegg’s pushing it for $89.99 and even Walmart has it for $99.95, as well as a Pro OEM version.
Now, let’s address the elephant in the room. While we can’t vouch for all of them, websites selling lower-priced Windows keys are likely selling legitimate codes. One popular site, Kinguin, has 37 merchants worldwide selling Windows keys. Mark Jordan, Kinguin’s VP of communications, told me that their merchants acquire the codes from wholesalers who have surplus copies of Windows they don’t need.
“It’s not a gray market. It would be like buying Adidas or Puma or Nike from a discounter, from TJ Maxx,” Jordan said. “There are no legal issues with buying it from us. It’s just another marketplace.”
According to Jordan, Kinguin’s merchants have sold “several hundred thousand” keys and are not one-time sellers posting listings for codes they don’t want. As part of its fraud protection, a Kinguin employee randomly buys a key “every now and then” to make sure they’re legitimate, he said. Jordan added that it’s rare for a customer to get a key that’s been resold, but if they did, customer support would help them get a new one for free.
“If there’s ever a problem with a key being already activated or something like that, our customer support team helps you get a new key… And that merchant would be in deep trouble, so they are very careful with it,” Jordan said.
It’s worth noting that we’ve encountered reports of customer dissatisfaction, including from users who wanted a specific type of key (like non-OEM only), ended up with something different (like an OEM version) and could only get a refund, rather than the type of key they originally tried to buy.
You’ll have to enter a key to activate Windows, but you won’t have a problem doing that if you bought your key from a place like Kinguin (or Amazon, Newegg, etc.). In fact, Microsoft still offers 24/7 technical support online and via phone even if you got your Windows 10 key from somewhere other than Microsoft.
If you do opt to get your key for less, make sure it’s from a legitimate site. A hint will be if that key is too cheap — i.e. free or close to free. And, as with anything else, if you haven’t heard of a seller, check their ratings or go elsewhere.
No matter where you get your product key, you shouldn’t download Windows 10 from anyone besides Microsoft. As noted on Microsoft’s website: “When buying Microsoft software as a digital download, we recommend that you avoid auction sites and peer-to-peer (P2P) file sharing sites. At the moment there are a limited number of sites where you can legally purchase digital downloads of Microsoft software.”
“Genuine Windows is published by Microsoft, properly licensed and supported by Microsoft or a trusted partner. Non-genuine software results in a higher risk of malware, fraud, public exposure of your personal information and a higher risk for poor performance or feature malfunctions,” Microsoft added in a statement to Tom’s Hardware.
Downsides of Cheap Keys
These non-Microsoft websites have varying return policies for software key purchases. While Kinguin seems to have an open return policy, Amazon and Newegg both have no-refund policies for software keys. Amazon claims all keys sold on its site are genuine, and any gripes you have with your key must be handled by the individual vendors. If a key you bought from Newegg doesn’t work, you’ll have to contact Newegg’s product support team to get a new key.
Still, most, if not all, sites seem willing to accommodate you should you get a key that’s already been used or doesn’t work. Again, just make sure you’re buying your key from a legitimate source. For that reason we don’t recommend buying Windows 10 keys from individual sellers (or illegally).
This final downside is only applicable if you want to equip your PC with Windows 10 Pro for Workstations. While I was able to find Windows 10 Home on a number of genuine key-selling websites and Windows 10 Pro on some (although fewer) websites, I couldn’t find a place to download a key for Windows 10 Pro for Workstations anywhere besides Microsoft (Amazon sells it to ship for $293.83). The most advanced and pricey ($309) member of the Windows 10 clan, Windows 10 Pro for Workstations offers “support for the next generation of PC hardware, up to four CPUs and 6TB of memory,” according to Microsoft’s website. But it’s unlikely you’ll need the juggernaut of Windows 10 for your personal machine.
Buy a Windows Key From Microsoft: $139+
Want a version of Windows 10 where you can enjoy dynamic slideshows on your home screen and vibrant red, green, pink, or purple taskbars? Do you enjoy the thrills of a watermark-free screen and the comfort of knowing you can call Microsoft support if you have any problems? Then you need a key, which, as discussed, you can get from various retailers. But if you want to avoid any chance of getting an unusable key or want the guaranteed ability to get a full refund even if there’s no problem with the key, your best bet is buying from Microsoft.
In addition to selling keys for Windows 10 Home and Pro, Microsoft is the only place you can get a key for Windows 10 Pro for Workstations. Additionally, Microsoft offers the Assure Software Support Plan for an extra $99 (£95/ AU$120). This plan is valid for a year after activating Windows 10. It’s applicable for up to five devices and entitles you to online and phone support and one-on-one in-store training. One caveat: Microsoft says the plan is “for purchase and activation only in the region in which it was acquired.”
Downsides of Buying from Microsoft
Microsoft charges the most for Windows 10 keys. Windows 10 Home goes for $139 (£119.99 / AU$225), while Pro is $199.99 (£219.99 /AU$339). Despite these high prices, you’re still getting the same OS as if you bought it from somewhere cheaper, and it’s still only usable for one PC.
Plus, the premium price doesn’t entitle you to any support perks. Microsoft’s 24/7 basic phone and online support is available to anyone with a Windows 10 key, even those who didn’t get it from Microsoft. After already investing time and money building a PC, it can be difficult to convince yourself to spend over $100 for an OS that you can get with the same specs and support for cheaper.
What’s the Best Way to Get Windows 10?
If you have an old Windows key you can get Windows 10 free by carrying that key over from a previous build — that’s your best option.
If you don’t have a key on hand, you need to decide whether you’re comfortable using an unactivated version of Windows 10, which limits your customization options, has an ugly watermark and leaves you ineligible for Microsoft support. Many would argue that downloading Windows without paying for or already owning a product key is ethically wrong. That said, Microsoft has made this process easier over various Windows iterations and lessened the limitations and nagging that happens when you don’t activate. The company isn’t trying to close this loophole, probably because it’s more interested in driving user numbers. I’ve even seen well-known vendors and Microsoft partners do press presentations with watermarks on their desktop.
If you must buy a Windows 10 key, you can save a lot with a low-cost seller such as Kinguin, although customer service may be lacking. Microsoft’s price is astronomically high and doesn’t offer any significant benefits. You can save $100 or more by buying a key from one of these third-party sites, which is money you can spend on one of the best graphics cards, a roomier SSD, or a few AAA games for your new PC.
MORE: Running Windows 10 on Raspberry Pi
MORE: PC Building Tips for Beginners MORE: How to Build A PC
MORE: How to Factory Reset a Windows 10 PC MORE: How to Set Up RAID In Windows 10
The AMD Radeon RX 6800 XT and Radeon RX 6800 have arrived, joining the ranks of the best graphics cards and making some headway into the top positions in our GPU benchmarks hierarchy. Nvidia has had a virtual stranglehold on the GPU market for cards priced $500 or more, going back to at least the GTX 700-series in 2013. That’s left AMD to mostly compete in the high-end, mid-range, and budget GPU markets. “No longer!” says Team Red.
Big Navi, aka Navi 21, aka RDNA2, has arrived, bringing some impressive performance gains. AMD also finally joins the ray tracing fray, both with its PC desktop graphics cards and the next-gen PlayStation 5 and Xbox Series X consoles. How do AMD’s latest GPUs stack up to the competition, and could this be AMD’s GPU equivalent of the Ryzen debut of 2017? That’s what we’re here to find out.
We’ve previously discussed many aspects of today’s launch, including details of the RDNA2 architecture, the GPU specifications, features, and more. Now, it’s time to take all the theoretical aspects and lay some rubber on the track. If you want to know more about the finer details of RDNA2, we’ll cover that as well. If you’re just here for the benchmarks, skip down a few screens because, hell yeah, do we have some benchmarks. We’ve got our standard testbed using an ‘ancient’ Core i9-9900K CPU, but we wanted something a bit more for the fastest graphics cards on the planet. We’ve added more benchmarks on both Core i9-10900K and Ryzen 9 5900X. With the arrival of Zen 3, running AMD GPUs with AMD CPUs finally means no compromises.
Update: We’ve added additional results to the CPU scaling charts. This review was originally published on November 18, 2020, but we’ll continue to update related details as needed.
AMD Radeon RX 6800 Series: Specifications and Architecture
Let’s start with a quick look at the specifications, which have been mostly known for at least a month. We’ve also included the previous generation RX 5700 XT as a reference point.
Graphics Card | RX 6800 XT | RX 6800 | RX 5700 XT
GPU | Navi 21 (XT) | Navi 21 (XL) | Navi 10 (XT)
Process (nm) | 7 | 7 | 7
Transistors (billion) | 26.8 | 26.8 | 10.3
Die size (mm^2) | 519 | 519 | 251
CUs | 72 | 60 | 40
GPU cores | 4608 | 3840 | 2560
Ray Accelerators | 72 | 60 | N/A
Game Clock (MHz) | 2015 | 1815 | 1755
Boost Clock (MHz) | 2250 | 2105 | 1905
VRAM Speed (MT/s) | 16000 | 16000 | 14000
VRAM (GB) | 16 | 16 | 8
Bus width | 256 | 256 | 256
Infinity Cache (MB) | 128 | 128 | N/A
ROPs | 128 | 96 | 64
TMUs | 288 | 240 | 160
TFLOPS (boost) | 20.7 | 16.2 | 9.7
Bandwidth (GB/s) | 512 | 512 | 448
TBP (watts) | 300 | 250 | 225
Launch Date | Nov. 2020 | Nov. 2020 | Jul. 2019
Launch Price | $649 | $579 | $399
When AMD fans started talking about “Big Navi” as far back as last year, this is pretty much what they hoped to see. AMD has just about doubled down on every important aspect of its architecture, plus adding in a huge amount of L3 cache and Ray Accelerators to handle ray tracing ray/triangle intersection calculations. Clock speeds are also higher, and — spoiler alert! — the 6800 series cards actually exceed the Game Clock and can even go past the Boost Clock in some cases. Memory capacity has doubled, ROPs have doubled, TFLOPS has more than doubled, and the die size is also more than double.
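As a quick sanity check on the table above, the boost TFLOPS figures follow directly from the shader count and boost clock (two FP32 operations per core per clock). This small sketch, with numbers taken from the spec table and a helper function of our own, reproduces them:

```typescript
// FP32 throughput estimate: cores × 2 ops per clock × boost clock (MHz).
function teraflops(gpuCores: number, boostClockMHz: number): number {
  return (gpuCores * 2 * boostClockMHz * 1e6) / 1e12;
}

console.log(teraflops(4608, 2250).toFixed(1)); // RX 6800 XT -> "20.7"
console.log(teraflops(3840, 2105).toFixed(1)); // RX 6800    -> "16.2"
```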
Support for ray tracing is probably the most visible new feature, but RDNA2 also supports Variable Rate Shading (VRS), mesh shaders, and everything else that’s part of the DirectX 12 Ultimate spec. There are other tweaks to the architecture, like support for 8K AV1 decode and 8K HEVC encode. But a lot of the underlying changes don’t show up as an easily digestible number.
For example, AMD says it reworked much of the architecture to focus on a high speed design. That’s where the greater than 2GHz clocks come from, but those aren’t just fantasy numbers. Playing around with overclocking a bit — and the software to do this is still missing, so we had to stick with AMD’s built-in overclocking tools — we actually hit clocks of over 2.5GHz. Yeah. I saw the supposed leaks before the launch claiming 2.4GHz and 2.5GHz and thought, “There’s no way.” I was wrong.
AMD’s cache hierarchy is arguably one of the biggest changes. Besides a shared 1MB L1 cache for each cluster of 20 dual-CUs, there’s a 4MB L2 cache and a whopping 128MB L3 cache that AMD calls the Infinity Cache. It also ties into the Infinity Fabric, but fundamentally, it helps optimize memory access latency and improve the effective bandwidth. Thanks to the 128MB cache, the framebuffer mostly ends up being cached, which drastically cuts down memory access. AMD says the effective bandwidth of the GDDR6 memory ends up being 119 percent higher than what the raw bandwidth would suggest.
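To put AMD's claim in absolute numbers, here is a quick sketch that derives the raw GDDR6 bandwidth from the 256-bit bus and 16 GT/s data rate in the spec table and applies the quoted 119 percent uplift. The helper name is ours, and the uplift is AMD's own figure rather than something we measured:

```typescript
// Raw GDDR6 bandwidth: bus width (bits) / 8 × data rate (GT/s) = GB/s.
function rawBandwidthGBs(busWidthBits: number, dataRateGTs: number): number {
  return (busWidthBits / 8) * dataRateGTs;
}

const raw = rawBandwidthGBs(256, 16); // 512 GB/s, matching the spec table
const effective = raw * 2.19;         // AMD's claimed +119 % with Infinity Cache
console.log(`${raw} GB/s raw, ~${Math.round(effective)} GB/s effective`);
// -> 512 GB/s raw, ~1121 GB/s effective
```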
The large cache also helps to reduce power consumption, which all ties into AMD’s targeted 50 percent performance per Watt improvements. This doesn’t mean power requirements stayed the same — RX 6800 has a slightly higher TBP (Total Board Power) than the RX 5700 XT, and the 6800 XT and upcoming 6900 XT are back at 300W (like the Vega 64). However, AMD still comes in at a lower power level than Nvidia’s competing GPUs, which is a bit of a change of pace from previous generation architectures.
It’s not entirely clear how AMD’s Ray Accelerators stack up against Nvidia’s RT cores. Much like Nvidia, AMD is putting one Ray Accelerator into each CU. (It seems we’re missing an acronym. Should we call the ray accelerators RA? The sun god, casting down rays! Sorry, been up all night, getting a bit loopy here…) The thing is, Nvidia is on its second-gen RT cores that are supposed to be around 1.7X as fast as its first-gen RT cores. AMD’s Ray Accelerators are supposedly 10 times as fast as doing the RT calculations via shader hardware, which is similar to what Nvidia said with its Turing RT cores. In practice, it looks as though Nvidia will maintain a lead in ray tracing performance.
That doesn’t even get into the whole DLSS and Tensor core discussion. AMD’s RDNA2 chips can do FP16 via shaders, but they’re still a far cry from the computational throughput of Tensor cores. That may or may not matter, as perhaps the FP16 throughput is enough for real-time inference to do something akin to DLSS. AMD has talked about FidelityFX Super Resolution, which it’s working on with Microsoft, but it’s not available yet, and of course, no games are shipping with it yet either. Meanwhile, DLSS is in a couple of dozen games now, and it’s also in Unreal Engine, which means uptake of DLSS could explode over the coming year.
Anyway, that’s enough of the architectural talk for now. Let’s meet the actual cards.
Meet the Radeon RX 6800 XT and RX 6800 Reference Cards
(Image gallery: Tom’s Hardware)
We’ve already posted an unboxing of the RX 6800 cards, which you can see in the above video. The design is pretty traditional, building on previous cards like the Radeon VII. There’s no blower this round, which is probably for the best if you’re worried about noise levels. Otherwise, you get a similar industrial design and aesthetic with both the reference 6800 and 6800 XT. The only real change is that the 6800 XT has a fatter heatsink and weighs 115g more, which helps it cope with the higher TBP.
Both cards are triple fan designs, using custom 77mm fans that have an integrated rim. We saw the same style of fan on many of the RTX 30-series GPUs, and it looks like the engineers have discovered a better way to direct airflow. Both cards have a Radeon logo that lights up in red, but it looks like the 6800 XT might have an RGB logo — it’s not exposed in software yet, but maybe that will come.
(Image gallery: Tom’s Hardware)
Otherwise, you get dual 8-pin PEG power connections, which might seem a bit overkill on the 6800 — it’s a 250W card, after all, why should it need the potential for up to 375W of power? But we’ll get into the power stuff later. If you’re into collecting hardware boxes, the 6800 XT box is also larger and a bit nicer, but there’s no real benefit otherwise.
The one potential concern with AMD’s reference design is the video ports. There are two DisplayPort outputs, a single HDMI 2.1 connector, and a USB Type-C port. It’s possible to use four displays with the cards, but the most popular gaming displays still use DisplayPort, and very few options exist for the Type-C connector. There also aren’t any HDMI 2.1 monitors that I’m aware of, unless you want to use a TV for your monitor. But those will eventually come. Anyway, if you want a different port selection, keep an eye on the third party cards, as I’m sure they’ll cover other configurations.
And now, on to the benchmarks.
Radeon RX 6800 Series Test Systems
(Image gallery: Tom’s Hardware)
It seems AMD is having a microprocessor renaissance of sorts right now. First, it has Zen 3 coming out and basically demolishing Intel in every meaningful way in the CPU realm. Sure, Intel can compete on a per-core basis … but only up to 10-core chips without moving into HEDT territory. The new RX 6800 cards might just be the equivalent of AMD’s Ryzen CPU launch. This time, AMD isn’t making any apologies. It intends to go up against Nvidia’s best. And of course, if we’re going to test the best GPUs, maybe we ought to look at the best CPUs as well?
For this launch, we have three test systems. First is our old and reliable Core i9-9900K setup, which we still use as the baseline and for power testing. We’re adding both AMD Ryzen 9 5900X and Intel Core i9-10900K builds to flesh things out. In retrospect, trying to do two new testbeds may have been a bit too ambitious, as we have to test each GPU on each testbed. We had to cut a bunch of previous-gen cards from our testing, and the hardware varies a bit among the PCs.
For the AMD build, we’ve got an MSI X570 Godlike motherboard, which is one of only a handful that supports AMD’s new Smart Memory Access technology. Patriot supplied us with two kits of single bank DDR4-4000 memory, which means we have 4x8GB instead of our normal 2x16GB configuration. We also have the Patriot Viper VP4100 2TB SSD holding all of our games. Remember when 1TB used to feel like a huge amount of SSD storage? And then Call of Duty: Modern Warfare (2019) happened, sucking down over 200GB. Which is why we need 2TB drives.
Meanwhile, the Intel LGA1200 PC has an Asus Maximus XII Extreme motherboard, 2x16GB DDR4-3600 HyperX memory, and a 2TB XPG SX8200 Pro SSD. (I’m not sure if it’s the old ‘fast’ version or the revised ‘slow’ variant, but it shouldn’t matter for these GPU tests.) Full specs are in the table below.
Anyway, the slightly slower RAM might be a bit of a handicap on the Intel PCs, but this isn’t a CPU review — we just wanted to use the two fastest CPUs, and time constraints and lack of duplicate hardware prevented us from going full apples-to-apples. The internal comparisons among GPUs on each testbed will still be consistent. Frankly, there’s not a huge difference between the CPUs when it comes to gaming performance, especially at 1440p and 4K.
Besides the testbeds, I’ve also got a bunch of additional gaming tests. First is the suite of nine games we’ve used on recent GPU reviews like the RTX 30-series launch. We’ve done some ‘bonus’ tests on each of the Founders Edition reviews, but we’re shifting gears this round. We’re adding four new/recent games that will be tested on each of the CPU testbeds: Assassin’s Creed Valhalla, Dirt 5, Horizon Zero Dawn, and Watch Dogs Legion — and we’ve enabled DirectX Raytracing (DXR) on Dirt 5 and Watch Dogs Legion.
There are some definite caveats, however. First, the beta DXR support in Dirt 5 doesn’t look all that different from the regular mode, and it’s an AMD promoted game. Coincidence? Maybe, but it’s probably more likely that AMD is working with Codemasters to ensure it runs suitably on the RX 6800 cards. The other problem is probably just a bug, but AMD’s RX 6800 cards seem to render the reflections in Watch Dogs Legion with a bit less fidelity.
Besides the above, we have a third suite of ray tracing tests: nine games (or benchmarks of future games) and 3DMark Port Royal. Of note, Wolfenstein Youngblood with ray tracing (which uses Nvidia’s pre-VulkanRT extensions) wouldn’t work on the AMD cards, and neither would the Bright Memory Infinite benchmark. Also, Crysis Remastered had some rendering errors with ray tracing enabled (on the nanosuits). Again, that’s a known bug.
Radeon RX 6800 Gaming Performance
We’ve retested all of the RTX 30-series cards on our Core i9-9900K testbed … but we didn’t have time to retest the RTX 20-series or RX 5700 series GPUs. The system has been updated with the latest 457.30 Nvidia drivers and AMD’s pre-launch RX 6800 drivers, as well as Windows 10 20H2 (the October 2020 update to Windows). It looks like the combination of drivers and/or Windows updates may have dropped performance by about 1-2 percent overall, though there are other variables in play. Anyway, the older GPUs are included mostly as a point of reference.
We have 1080p, 1440p, and 4K ultra results for each of the games, as well as the combined average of the nine titles. We’re going to dispense with the commentary for individual games right now (because of a time crunch), but we’ll discuss the overall trends below.
9 Game Average
(Benchmark charts: Tom’s Hardware)
Borderlands 3
(Benchmark charts: Tom’s Hardware)
The Division 2
(Benchmark charts: Tom’s Hardware)
Far Cry 5
(Benchmark charts: Tom’s Hardware)
Final Fantasy XIV
(Benchmark charts: Tom’s Hardware)
Forza Horizon 4
(Benchmark charts: Tom’s Hardware)
Metro Exodus
(Benchmark charts: Tom’s Hardware)
Red Dead Redemption 2
(Benchmark charts: Tom’s Hardware)
Shadow of the Tomb Raider
(Benchmark charts: Tom’s Hardware)
Strange Brigade
(Benchmark charts: Tom’s Hardware)
AMD’s new GPUs definitely make a good showing in traditional rasterization games. At 4K, Nvidia’s 3080 leads the 6800 XT by three percent, but it’s not a clean sweep: AMD comes out on top in Borderlands 3, Far Cry 5, and Forza Horizon 4, while Nvidia gets modest wins in The Division 2, Final Fantasy XIV, Metro Exodus, Red Dead Redemption 2, and Shadow of the Tomb Raider, with its largest lead coming in Strange Brigade. But that’s only at the highest resolution, where AMD’s Infinity Cache may not be quite as effective.
Dropping to 1440p, the RTX 3080 and 6800 XT are effectively tied — again, AMD wins several games, Nvidia wins others, but the average performance is the same. At 1080p, AMD even pulls ahead by two percent overall. Not that we really expect most gamers forking over $650 or $700 or more on a graphics card to stick with a 1080p display, unless it’s a 240Hz or 360Hz model.
Flipping over to the vanilla RX 6800 and the RTX 3070, AMD does even better. On average, the RX 6800 leads by 11 percent at 4K ultra, nine percent at 1440p ultra, and seven percent at 1080p ultra. Here the 8GB of GDDR6 memory on the RTX 3070 simply can’t keep pace with the 16GB of higher clocked memory — and the Infinity Cache — that AMD brings to the party. The best Nvidia can do is one or two minor wins (e.g., Far Cry 5 at 1080p, where the GPUs are more CPU limited) and slightly higher minimum fps in FFXIV and Strange Brigade.
But as good as the RX 6800 looks against the RTX 3070, we prefer the RX 6800 XT from AMD. It only costs $70 more, which is basically the cost of one game and a fast food lunch. Or put another way, it’s 12 percent more money, for 12 percent more performance at 1080p, 14 percent more performance at 1440p, and 16 percent better 4K performance. You also get AMD’s Rage Mode pseudo-overclocking (really just increased power limits).
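If you prefer to see that trade-off as cost per unit of performance, here’s a quick back-of-the-envelope sketch. The prices are the launch prices implied in the text and the uplift percentages are the ones quoted above; performance is normalized to the RX 6800 rather than being literal frame rates.

```python
# Rough cost-per-performance comparison using the relative numbers quoted above.
# Performance is normalized to the RX 6800 at each resolution (not literal fps).
cards = {
    # name: (price_usd, {resolution: performance relative to the RX 6800})
    "RX 6800":    (579, {"1080p": 1.00, "1440p": 1.00, "4K": 1.00}),
    "RX 6800 XT": (649, {"1080p": 1.12, "1440p": 1.14, "4K": 1.16}),
}

for name, (price, perf) in cards.items():
    for res, rel in perf.items():
        print(f"{name:10s} {res:>5s}: ${price / rel:6.1f} per unit of relative performance")
```

Run it and the 6800 XT’s cost per unit of performance comes out essentially even at 1080p and slightly better at 1440p and 4K, which is the point of the paragraph above.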
Radeon RX 6800 CPU Scaling and Overclocking
Our traditional gaming suite is due for retirement, but we didn’t want to toss it out at the same time as a major GPU launch — it might look suspicious. We didn’t have time to do a full suite of CPU scaling tests, but we did run 13 games on the five most recent high-end/extreme GPUs on our three test PCs. Here’s the next series of charts, again with commentary below.
[Charts: 13-Game Average, Assassin’s Creed Valhalla, Borderlands 3, The Division 2, Dirt 5, Far Cry 5, Final Fantasy XIV, Forza Horizon 4, Horizon Zero Dawn, Metro Exodus, Red Dead Redemption 2, Shadow of the Tomb Raider, Strange Brigade, Watch Dogs Legion — tested on all three CPUs, stock and overclocked. Image credit: Tom’s Hardware]
These charts are a bit busy, perhaps, with five GPUs and three CPUs each, plus overclocking. Take your time. We won’t judge. Nine of the games are from the existing suite, and the trends noted earlier basically continue.
Looking just at the four new games, AMD gets a big win in Assassin’s Creed Valhalla (it’s an AMD promotional title, so future updates may change the standings). Dirt 5 is also a bit of an odd duck for Nvidia, with the RTX 3090 actually doing quite badly on the Ryzen 9 5900X and Core i9-10900K for some reason. Horizon Zero Dawn ends up favoring Nvidia quite a bit (but not the 3070), and lastly, we have Watch Dogs Legion, which favors Nvidia a bit (more at 4K), but it might have some bugs that are currently helping AMD’s performance.
Overall, the 3090 still maintains its (gold-plated) crown, which you’d sort of expect from a $1,500 graphics card that you can’t even buy right now. Meanwhile, the RX 6800 XT mixes it up with the RTX 3080, coming out slightly ahead overall at 1080p and 1440p but barely trailing at 4K. Meanwhile, the RX 6800 easily outperforms the RTX 3070 across the suite, though a few games and/or lower resolutions do go the other way.
Oddly, my test systems ended up with the Core i9-10900K and even the Core i9-9900K often leading the Ryzen 9 5900X. The 3090 did best with the 5900X at 1080p, but then went to the 10900K at 1440p and to both the 9900K and 10900K at 4K. The other GPUs also swap places, though usually the difference between CPUs is pretty negligible (and a few results just look a bit buggy).
It may be that the beta BIOS for the MSI X570 board (which enables Smart Access Memory) still needs more tuning, or that the differences in memory came into play. I didn’t have time to check performance without the large PCIe BAR feature enabled either. But these are mostly very small differences, and any of the three CPUs tested here is sufficient for gaming.
As for overclocking, it’s pretty much what you’d expect. Increase the power limit, GPU core clocks, and GDDR6 clocks and you get more performance. It’s not a huge improvement, though. Overall, the RX 6800 XT was 4-6 percent faster when overclocked (the higher results were at 4K). The RX 6800 did slightly better, improving by 6 percent at 1080p and 1440p, and 8 percent at 4K. GPU clocks were also above 2.5GHz for most of the RX 6800’s testing, and its lower default boost clock gave it a bit more room for improvement.
Radeon RX 6800 Series Ray Tracing Performance
So far, most of the games haven’t had ray tracing enabled. But that’s the big new feature for RDNA2 and the Radeon RX 6000 series, so we definitely wanted to look into ray tracing performance more. Here’s where things take a turn for the worse because ray tracing is very demanding, and Nvidia has DLSS to help overcome some of the difficulty by doing AI-enhanced upscaling. AMD can’t do DLSS since it’s Nvidia proprietary tech, which means to do apples-to-apples comparisons, we have to turn off DLSS on the Nvidia cards.
That’s not really fair because DLSS 2.0 and later actually look quite nice, particularly when using the Balanced or Quality modes. What’s more, native 4K gaming with ray tracing enabled is going to be a stretch for just about any current GPU, including the RTX 3090 — unless you’re playing a lighter game like Pumpkin Jack. Anyway, we’ve looked at ray tracing performance with DLSS in a bunch of these games, and performance improves by anywhere from 20 percent to as much as 80 percent (or more) in some cases. DLSS may not always look better, but a slight drop in visual fidelity for a big boost in framerates is usually hard to pass up.
We’ll have to see if AMD’s FidelityFX Super Resolution can match DLSS in the future, and how many developers make use of it. Considering AMD’s RDNA2 GPUs are also in the PlayStation 5 and Xbox Series S/X, we wouldn’t count AMD out, but for now, Nvidia has the technology lead. Which brings us to native ray tracing performance.
[Charts: 10-game DXR Average, 3DMark Port Royal, Boundary Benchmark, Call of Duty Black Ops Cold War, Control, Crysis Remastered, Dirt 5, Fortnite, Metro Exodus, Shadow of the Tomb Raider, Watch Dogs Legion. Image credit: Tom’s Hardware]
Well. So much for AMD’s comparable performance. AMD’s RX 6800 series can definitely hold its own against Nvidia’s RTX 30-series GPUs in traditional rasterization modes. Turn on ray tracing, even without DLSS, and things can get ugly. AMD’s RX 6800 XT does tend to come out ahead of the RTX 3070, but then it should — it costs more, and it has twice the VRAM. But again, DLSS (which is supported in seven of the ten games/tests we used) would turn the tables, and even the DLSS quality mode usually improves performance by 20-40 percent (provided the game isn’t bottlenecked elsewhere).
Ignoring the often-too-low framerates, overall, the RTX 3080 is nearly 25 percent faster than the RX 6800 XT at 1080p, and that lead only grows at 1440p (26 percent) and 4K (30 percent). The RTX 3090 is another 10-15 percent ahead of the 3080, which is very much out of AMD’s reach if you care at all about ray tracing performance — ignoring price, of course.
The RTX 3070 comes out with a 10-15 percent lead over the RX 6800, but individual games can behave quite differently. Take the new Call of Duty: Black Ops Cold War. It supports multiple ray tracing effects, and even the RTX 3070 holds a significant 30 percent lead over the 6800 XT at 1080p and 1440p. Boundary, Control, Crysis Remastered, and (to a lesser extent) Fortnite also have the 3070 leading the AMD cards. But Dirt 5, Metro Exodus, Shadow of the Tomb Raider, and Watch Dogs Legion have the 3070 falling behind the 6800 XT at least, and sometimes the RX 6800 as well.
There is a real question about whether the GPUs are doing the same work, though. We haven’t had time to really dig into the image quality, but Watch Dogs Legion for sure doesn’t look the same on AMD compared to Nvidia with ray tracing enabled. Check out these comparisons:
Apparently Ubisoft knows about the problem. In a statement to us, it said, “We are aware of the issue and are working to address it in a patch in December.” But right now, there’s a good chance that AMD’s performance in Watch Dogs Legion at least is higher than it should be with ray tracing enabled.
Overall, AMD’s ray tracing performance looks more like Nvidia’s RTX 20-series GPUs than the new Ampere GPUs, which was sort of what we expected. This is first gen ray tracing for AMD, after all, while Nvidia is on round two. Frankly, looking at games like Fortnite, where ray traced shadows, reflections, global illumination, and ambient occlusion are available, we probably need fourth gen ray tracing hardware before we’ll be hitting playable framerates with all the bells and whistles. And we’ll likely still need DLSS, or AMD’s Super Resolution, to hit acceptable frame rates at 4K.
Radeon RX 6800 Series: Power, Temps, Clocks, and Fans
We’ve got our usual collection of power, temperature, clock speed, and fan speed testing using Metro Exodus running at 1440p, and FurMark running at 1600×900 in stress test mode. While Metro is generally indicative of how other games behave, we loop the benchmark five times, and there are dips where the test restarts and the GPU gets to rest for a few seconds. FurMark, on the other hand, is basically a worst-case scenario for power and thermals. We collect the power data using Powenetics software and hardware, which uses GPU-Z to monitor GPU temperatures, clocks, and fan speeds.
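For readers curious what that logging loop looks like in practice, here’s a minimal sketch of the general approach. The poll_gpu() helper is a hypothetical placeholder standing in for the Powenetics/GPU-Z readout, not a real API, and the CSV layout is an assumption for illustration.

```python
import csv
import time

POLL_INTERVAL_S = 0.1  # ~100 ms granularity, matching the cadence described above


def poll_gpu():
    """Hypothetical stand-in for the Powenetics/GPU-Z readout used in the article."""
    # A real harness would return board power (W), core clock (MHz),
    # temperature (C), and fan speed (RPM) from the measurement hardware.
    raise NotImplementedError


def log_run(path, duration_s=600):
    """Write one sample per polling interval to a CSV file for later charting."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["t_s", "power_w", "clock_mhz", "temp_c", "fan_rpm"])
        start = time.time()
        while time.time() - start < duration_s:
            sample = poll_gpu()  # (power_w, clock_mhz, temp_c, fan_rpm)
            writer.writerow([round(time.time() - start, 3), *sample])
            time.sleep(POLL_INTERVAL_S)
```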
GPU Total Power
AMD basically sticks to the advertised 300W TBP on the 6800 XT with Metro Exodus, and even comes in slightly below the 250W TBP on the RX 6800. Enabling Rage Mode on the 6800 XT obviously changes things, and you can also see our power figures for the manual overclocks. Basically, Big Navi can match the RTX 3080 when it comes to power if you increase the power limits.
FurMark pushes power on both cards a bit higher, which is pretty typical. If you check the line graphs, you can see our 6800 XT OC starts off at nearly 360W in FurMark before it throttles down a bit and ends up at closer to 350W. There are some transient power spikes that can go a bit higher as well, which we’ll discuss more later.
GPU Core Clocks
Looking at the GPU clocks, AMD is pushing some serious MHz for a change. This is now easily the highest clocked GPU we’ve ever seen, and when we manually overclocked the RX 6800, we were able to hit a relatively stable 2550 MHz. That’s pretty damn impressive, especially considering power use isn’t higher than Nvidia’s GPUs. Both cards also clear their respective Game Clocks and Boost Clocks, which is a nice change of pace.
GPU Core Temp
GPU Fan Speed
Temperatures and fan speeds are directly related to each other. Ramp up the fan speed (which we did for the overclocked 6800 cards) and you can get lower temperatures, at the cost of higher noise levels. We’re still investigating overclocking as well, as there’s a bit of odd behavior so far. The cards will run fine for a while, and then suddenly drop into a weak performance mode where performance might be half the normal level, or even worse. That’s probably due to the lack of overclocking support in MSI Afterburner for the time being. By default, though, the cards strike a good balance between cooling and noise. We’ll get exact SPL readings later (still benchmarking a few other bits), but it’s interesting that all of the new GPUs (RTX 30-series and RX 6000) have lower fan speeds than the previous generation.
[Charts: RX 6800 XT total board power over time at stock settings, with Rage Mode, and manually overclocked. Image credit: Future]
We observed some larger-than-expected transient power spikes with the RX 6800 XT, but to be absolutely clear, these transient power spikes shouldn’t be an issue — particularly if you don’t plan on overclocking. However, it is important to keep these peak power measurements in mind when you spec out your power supply.
Transient power spikes are common but are usually of such short duration (in the millisecond range) that our power measurement gear, which records measurements at roughly a 100ms granularity, can’t catch them. Typically you’d need a quality oscilloscope to measure transient power spikes accurately, but we did record several spikes even with our comparatively relaxed polling.
The charts above show total power consumption of the RX 6800 XT at stock settings, overclocked, and with Rage Mode enabled. In terms of transient power spikes, we don’t see any issues at all with Metro Exodus, but we see brief peaks during FurMark of 425W with the manually overclocked config, 373W with Rage Mode, and 366W with the stock setup. Again, these peaks were measured within one 100ms polling cycle, which means they could certainly trip a PSU’s over power protection if you’re running close to its maximum power delivery.
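To illustrate why even those single-sample readings probably understate the true peaks, here’s a toy calculation (synthetic numbers loosely based on the figures above, not additional measurements) showing how a millisecond-scale spike gets averaged away inside one 100ms sample window.

```python
# Toy illustration: a 5 ms, 425 W spike riding on a ~300 W baseline gets averaged
# down within a single 100 ms sample window. Numbers are illustrative only.
baseline_w = 300.0   # steady-state board power (assumed)
spike_w = 425.0      # true transient peak (assumed)
spike_ms = 5.0       # spike duration (assumed)
window_ms = 100.0    # polling granularity used for the measurements above

avg_in_window = (spike_w * spike_ms + baseline_w * (window_ms - spike_ms)) / window_ms
print(f"Reported for that window: {avg_in_window:.1f} W (true peak: {spike_w:.0f} W)")
# -> Reported for that window: 306.2 W (true peak: 425 W)
```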
[Charts: power draw split by source (PCIe slot, PEG1, PEG2) at stock and overclocked settings. Image credit: Future]
To drill down on the topic, we split out our power measurements from each power source, which you’ll see above. The RX 6800 XT draws power from the PCIe slot and two eight-pin PCIe connectors (PEG1/PEG2).
Power consumption over the PCIe slot is well managed during all the tests (as a general rule of thumb, this value shouldn’t exceed 71W, and the 6800 XT is well below that mark). We also didn’t catch any notable transient spikes during our real-world Metro Exodus gaming test at either stock or overclocked settings.
However, during our FurMark test at stock settings, we see a power consumption spike to 206W on one of the PCIe cables for a very brief period (we picked up a single measurement of the spike during each run). After overclocking, we measured a simultaneous spike of 231W on one cable and 206W on the other for a period of one measurement taken at a 100ms polling rate. Naturally, those same spikes are much less pronounced with Rage Mode overclocking, measuring only 210W and 173W. A PCIe cable can easily deliver ~225W safely (even with 18AWG), so these transient power spikes aren’t going to melt connectors, wires, or harm the GPU in any way — they would need to be of much longer duration to have that type of impact.
But the transient spikes are noteworthy because some CPUs, like the Intel Core i9-9900K and i9-10900K, can consume more than 300W, adding to the total system power draw. If you plan on overclocking, it would be best to factor the RX 6800 XT’s transient power consumption into the total system power.
Power spikes of 5-10ms can trip the overcurrent protection (OCP) on some multi-rail power supplies because they tend to have relatively low OCP thresholds. As usual, a PSU with a single 12V rail tends to be the best solution because they have much better OCP mechanisms, and you’re also better off using dedicated PCIe cables for each 8-pin connector.
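As a rough way to sanity-check a power supply against these numbers, here’s a back-of-the-envelope sketch. The CPU and GPU figures are the ones quoted above; the 75W rest-of-system allowance and 20 percent headroom factor are assumptions for illustration, not vendor guidance.

```python
# Back-of-the-envelope PSU sizing using the figures quoted above.
cpu_peak_w = 300        # e.g., Core i9-10900K under heavy load, per the text
gpu_transient_w = 425   # overclocked RX 6800 XT FurMark spike measured above
rest_of_system_w = 75   # drives, fans, motherboard, RAM (assumed)
headroom = 1.20         # assumed safety margin against OCP trips

recommended_psu_w = (cpu_peak_w + gpu_transient_w + rest_of_system_w) * headroom
print(f"Suggested minimum PSU rating: ~{recommended_psu_w:.0f} W")
# -> Suggested minimum PSU rating: ~960 W
```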
Radeon RX 6800 Series: Prioritizing Rasterization Over Ray Tracing
It’s been a long time since AMD had a legitimate contender for the GPU throne. The last time AMD was this close … well, maybe Hawaii (Radeon R9 290X) was competitive in performance at least, while using quite a bit more power. That’s sort of been the standard disclaimer for AMD GPUs for quite a few years. Yes, AMD has some fast GPUs, but they tend to use a lot of power. The other alternative was best illustrated by one of the best budget GPUs of the past couple of years: AMD isn’t the fastest, but dang, look how cheap the RX 570 is! With the Radeon RX 6800 series, AMD is mostly able to put questions of power and performance behind it. Mostly.
The RX 6800 XT ends up just a bit slower than the RTX 3080 overall in traditional rendering, but it costs less, and it uses a bit less power (unless you kick on Rage Mode, in which case it’s a tie). There are enough games where AMD comes out ahead that you can make a legitimate case for AMD having the better card. Plus, 16GB of VRAM is definitely helpful in a few of the games we tested — or at least, 8GB isn’t enough in some cases. The RX 6800 does even better against the RTX 3070, generally winning most benchmarks by a decent margin. Of course, it costs more, but if you have to pick between the 6800 and 3070, we’d spend the extra $80.
The problem is, that’s a slippery slope. At that point, we’d also spend an extra $70 to go to the RX 6800 XT … and $50 more for the RTX 3080, with its superior ray tracing and support for DLSS, is easy enough to justify. Now we’re looking at a $700 graphics card instead of a $500 graphics card, but at least it’s a decent jump in performance.
Of course, you can’t buy any of the Nvidia RTX 30-series GPUs right now. Well, you can, if you get lucky. It’s not that Nvidia isn’t producing cards; it’s just not producing enough cards to satisfy the demand. And, let’s be real for a moment: There’s not a chance in hell AMD’s RX 6800 series are going to do any better. Sorry to be the bearer of bad news, but these cards are going to sell out. You know, just like every other high-end GPU and CPU launched in the past couple of months. (Update: Yup, every RX 6800 series GPU sold out within minutes.)
What’s more, AMD is better off producing more Ryzen 5000 series CPUs than Radeon RX 6000 GPUs. Just look at the chip sizes and other components. A Ryzen 9 5900X packs two roughly 80mm² compute dies plus a 12nm IO die into a relatively compact package, and AMD is currently selling every single one of those CPUs for $550 (or $800 for the 5950X). The Navi 21 GPU, by comparison, is made on the same TSMC N7 wafers but measures 519mm², plus it needs GDDR6 memory, a beefy cooler and fan, and all sorts of other components. And it still only sells for roughly the same price as the 5900X.
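To make that silicon-economics point concrete, here’s a rough dies-per-wafer sketch using a standard approximation. The die areas are the ones mentioned above, while the 300mm wafer size is a generic assumption, yield is ignored entirely, and the IO die (built on a different process) is left out.

```python
import math

# Rough candidate-dies-per-wafer estimate using the common approximation:
#   dies ~= pi*(d/2)^2 / A  -  pi*d / sqrt(2*A)
WAFER_DIAMETER_MM = 300  # standard wafer size (assumed)

def dies_per_wafer(die_area_mm2, d=WAFER_DIAMETER_MM):
    return math.pi * (d / 2) ** 2 / die_area_mm2 - math.pi * d / math.sqrt(2 * die_area_mm2)

navi21 = dies_per_wafer(519)    # Navi 21 area, per the text
zen3_ccd = dies_per_wafer(80)   # Zen 3 compute die area, per the text

print(f"Navi 21 (~519 mm^2): ~{navi21:.0f} candidate dies per wafer")
print(f"Zen 3 CCD (~80 mm^2): ~{zen3_ccd:.0f} candidate dies per wafer")
# A 5900X needs two compute dies, so the same wafer yields roughly zen3_ccd / 2
# CPUs versus navi21 GPUs, before counting memory, cooler, and board costs.
```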
Which isn’t to say you shouldn’t want to buy an RX 6800 card. It’s really going to come down to personal opinions on how important ray tracing will become in the coming years. The consoles now support the technology, but even the Xbox Series X can’t keep up with an RX 6800, never mind an RTX 3080. Plus, while some games like Control make great use of ray tracing effects, in many other games, the ray tracing could be disabled, and most people wouldn’t really miss it. We’re still quite a ways off from anything approaching Hollywood levels of fidelity rendered in real time.
In terms of features, Nvidia still comes out ahead. Faster ray tracing, plus DLSS — and whatever else those Tensor cores might be used for in the future — seems like the safer bet long term. But there are still a lot of games forgoing ray tracing effects, or games where ray tracing doesn’t make a lot of sense considering how it causes frame rates to plummet. Fortnite in creative mode might be fine for ray tracing, but I can’t imagine many competitive players being willing to tank performance just for some eye candy. The same goes for Call of Duty. But then there’s Cyberpunk 2077 looming, which could be the killer game that ray tracing hardware needs.
We asked earlier if Big Navi, aka RDNA2, was AMD’s Ryzen moment for its GPUs. In a lot of ways, it’s exactly that. The first generation Ryzen CPUs brought 8-core CPUs to mainstream platforms, with aggressive prices that Intel had avoided. But the first generation Zen CPUs and motherboards were raw and had some issues, and it wasn’t until Zen 2 that AMD really started winning key matchups, and Zen 3 finally has AMD in the lead. Perhaps it’s better to say that Navi, in general, is AMD trying to repeat what it did on the CPU side of things.
RX 6800 (Navi 21) is literally a bigger, enhanced version of last year’s Navi 10 GPUs. It has up to twice the CUs and twice the memory, and it’s at least a big step closer to feature parity with Nvidia. If you can find a Radeon RX 6800 or RX 6800 XT in stock any time before 2021, it’s definitely worth considering. RX 6800 and Big Navi aren’t priced particularly aggressively, but they do slot in nicely just above and below Nvidia’s competing RTX 3070 and 3080.
by Mattia Speroni, published on 23 November 2020 at 18:41
Canon EOS R5 officially receives firmware 1.2.0, which doesn’t bring any sensational new features but does deliver a number of small and large improvements that let users get the most out of the Japanese manufacturer’s latest full-frame mirrorless camera.
The series of firmware updates for the Canon EOS R5 continues: after version 1.1.1, the camera now officially receives version 1.2.0. There are no striking new features, but small improvements and corrections make the camera easier to use (especially for those who work with external flashes).
Canon EOS R5 firmware news
The firmware for the Canon EOS R5 is available for download for Windows and Mac users on the official Canon Italy website. Below is the complete list of changes.
When using high-speed or low-speed continuous shooting in Drive mode, the visibility of the subject within the frame has been improved when shooting moving objects: during continuous shooting, black frames are inserted between the frames in the viewfinder and the live view, which improves the visibility of moving subjects.
Adds an [Auto] setting to the [Viewfinder brightness] menu that brightens or dims the viewfinder according to the ambient light conditions.
Enables second-curtain synchronization during radio transmission wireless flash shooting when the Speedlite EL-1 flash is attached to the camera.
Allows manual flash output (excluding high-speed sync and optical transmission wireless flash shooting) to be selected and configured down to 1/8192 from the camera menu screen when the Speedlite EL-1 flash is attached to the camera.
Improves the compatibility of HEIF images recorded in the camera with MIAF (Multi-Image Application Format).
Adds support for AF and shutter release during zoom operations for some RF and EF lenses.
On 21 November 1995, Spencer Kimball and Peter Mattis release the first public beta version of Gimp for Linux, Solaris, and Unix. The program was developed as part of a semester project at the University of California, Berkeley. Gimp stands for “General Image Manipulation Program”; in the BDSM scene, though, it also refers to a submissive person. The developers took the name from the 1994 Tarantino film Pulp Fiction.
The first official version, published in January 1996, bears the number 0.54. From then on, Gimp is unstoppable and embarks on an unprecedented triumph. After a meeting with GNU founder Richard Stallman the following year, Kimball and Mattis change the name of their program to “GNU Image Manipulation Program” without changing the acronym. It still bears that name today.
Own format for Gimp 1.0
Not until June 1998 does Gimp reach version 1.0, gaining a memory management system that allows large image files to be opened. In addition, Gimp 1.0 can store files in its own XCF format, including layers, and execute scripts in the Script-Fu language. The program is now also available for Windows and macOS. However, it is a time that demands a lot from users, who have to compile the installer for their platform themselves and install the GTK+ GUI toolkit manually, for example.
Gimp 2.0 is released in December 2004, but doesn’t bring any earth-shattering innovations. It can import and export SVG files and adds simple functions for the CMYK color model and prepress.
Gimp 2.4 brings an ICC color management system and pressure simulation. When opening files, the program asks whether it should apply or discard embedded ICC profiles.
Good things take time: GEGL
In October 2008, with Gimp 2.6, the developers lay the foundation for the switch to the new GEGL graphics library. It promises complete ICC color management and image processing at 32-bit color depth per channel, optionally using floating-point operations. For the time being, Gimp 2.8 continues to work with only 8-bit color depth per channel. Gimp 2.8 also comes with an optional single-window mode that combines the three floating palettes that were usual up to that point into one dock.
Gimp 2.8 can run in the optional single-window mode for the first time.
It takes another ten years, until 2018, for the switch to GEGL to be completed with Gimp 2.10, the direct successor of Gimp 2.8, by the way (odd version numbers are reserved for developer builds). The long-awaited high color depth finally becomes a reality. In addition, GEGL brings immediate previews of filters such as Gaussian blur or unsharp masking in the document window. The current developer version 2.99 promises support for HiDPI monitors, improved support for graphics tablets, and a new plug-in API.
Gimp 2.10 fully implements GEGL and, among other things, shows the effect of filters live in the document window.
Compiled installers are now, of course, available for download at Gimp.org, and countless books, articles, and videos explain how to use the program. Gimp has become more user-friendly with age: the free image editor has a permanent place on many PCs and has become an integral part of the open source world. We congratulate and look forward to the next 25 years.
(akr)