As recently promised, CD Projekt RED has revealed the detailed hardware requirements of Cyberpunk 2077 on the PC. We first saw the minimum and recommended configurations for 1080p (1920 × 1080 pixels) in mid-September. On the one hand, many players were pleasantly surprised by the relatively low requirements; on the other hand, some felt disappointed. Companies like Ubisoft have gotten us used to spec sheets that also cover higher resolutions and graphics settings, so the same was expected from one of the most anticipated games in recent years. Since yesterday the upcoming RPG has been making noise again, and CD Projekt RED has decided to strike while the iron is hot, providing updated hardware requirements for Cyberpunk 2077.
We have learned the detailed hardware requirements of Cyberpunk 2077. CD Projekt RED has provided the recommended configurations for 1440p, 4K and ray tracing. For maximum detail you will need a GeForce RTX 3080.
Cyberpunk 2077 with a rich list of graphics settings in the PC version
The creators have released the hardware requirements of Cyberpunk 2077 for 1440p, 4K and ray tracing. The minimum and recommended configurations refer to low and high details at Full HD resolution. Now it turns out that for Ultra settings at 1440p and 4K you will need a processor comparable to those recommended for 1080p, but paired, of course, with a much more powerful graphics card: a GeForce RTX 2060 or Radeon RX 5700 XT for 1440p, and a GeForce RTX 2080 Super, RTX 3070 or Radeon RX 6800 XT for 4K. To run Cyberpunk 2077 with ray tracing set to Ultra (Psycho?), a GeForce RTX 3070 or RTX 3080 will be required, depending on the resolution. Here are the full PC hardware requirements for Cyberpunk 2077:
Cyberpunk 2077 minimum hardware requirements (1080p, low settings)
- Operating system: 64-bit Windows 7 or 10
- Processor: Intel Core i5-3570K 3.4 GHz or AMD FX-8310 3.4 GHz

Cyberpunk 2077 recommended hardware requirements (1080p, high settings)
- Operating system: 64-bit Windows 10
- Processor: Intel Core i7-4790 3.6 GHz or AMD Ryzen 3 3200G 3.6 GHz

Cyberpunk 2077 recommended hardware requirements (1440p, Ultra settings)
- Operating system: 64-bit Windows 10
- Processor: Intel Core i7-4790 3.6 GHz or AMD Ryzen 3 3200G 3.6 GHz

Cyberpunk 2077 recommended hardware requirements (4K, Ultra settings)
- Operating system: 64-bit Windows 10
According to CD Projekt RED, to enjoy the ray-tracing-based effects on medium settings, all you need is the weakest GeForce of the RTX family, the RTX 2060 with 6 GB of VRAM. The game will offer four effects based on ray tracing; one of them, lighting, will have several quality levels, while the rest will have to be turned off in case of lower performance. We remind you that ray tracing in Cyberpunk 2077 is based on the DXR standard, so it should work on both Nvidia GeForce RTX cards and the newly released Radeon RX 6000 cards. Admittedly, the example of Godfall shows that it does not have to be that way. The new AMD graphics chips are missing from the list above, but the reason may be prosaic: Cyberpunk 2077 with RT has not yet been optimized for the Radeon RX 6000. How the revealed hardware requirements of Cyberpunk 2077 will translate into reality, we will see on December 10 this year.
Matthew Wilson 3 hours ago Featured Tech News, General Tech
Back in January, Google announced plans to end support for its Chrome browser on Windows 7. At the time, the support period was due to end in January 2021, but Google has since opted to extend the window.
Windows 7 already reached the end of its extended support period with Microsoft earlier this year, so other application developers are following suit. While Google was due to end Chrome support for Windows 7 in January, the company will now offer support for an additional six months.
As pointed out by TechSpot, this follows on from a Google-commissioned report into how many enterprise organisations are still relying on Windows 7. As it turns out, Google’s investigation found that 21 percent of businesses were still in the ‘process of migrating’ to Windows 10, while a tiny 1% were planning to transition ‘soon’.
Due to COVID-19, it is assumed that the transition away from Windows 7 will have slowed down for many companies. As a result, Chrome will be supported on Windows 7 until July 2021 and security updates will continue until January 2022.
KitGuru Says: Given the pandemic, it makes sense for Google to extend its deadlines here. Does your place of work still happen to use Windows 7? Do you know if you’ll be upgrading to Windows 10 soon?
InXile’s Frostpoint VR: Proving Grounds releases next month
Matthew Wilson 2 days ago Software & Gaming, Virtual Reality
Just ahead of the release of Wasteland 3, InXile revealed that they’ve been quietly working on a VR shooter. Things have been a little quiet since then, but this week, we learned that Frostpoint VR: Proving Grounds will be releasing in December.
This is a PC VR game, so it should work with the Oculus Rift, HTC VIVE, Valve Index and Windows VR headsets. The game is shipping on both Steam and the Oculus Store on the 1st of December. Presumably, that means Quest owners will also be able to play via Oculus Link.
“Set in an abandoned military base in Antarctica, each team will need to gear up and set out to dominate the opposing team while protecting themselves from terrifying biomechanical creatures.”
The game promises highly detailed weapons and a unique sci-fi setting. While PvP is the main focus, there is also a “unique PvE twist” involving an “otherworldly biomechanical threat”.
KitGuru Says: With InXile now under Microsoft, I wouldn’t necessarily expect more VR games from them, but it will be interesting to see what they came up with here. Will many of you be giving this a try next month? It could be a nice change of pace from Pavlov, Contractors and Onward in the VR PvP shooter space.
Apple has commented on the subject of “Windows on ARM Macs” for the first time: the new Macs with Apple’s M1 chip are “undoubtedly very capable” of running the ARM version of Windows natively, emphasized Apple’s software chief Craig Federighi in an interview, but the support is “really in Microsoft’s hands”. This would also mean that x86 Windows software could be used on the ARM Macs.
No Windows on ARM for end customers

Apple provides all the necessary core technologies to allow Microsoft to implement it, Federighi explained to Ars Technica. The decision lies with Microsoft, also with regard to licensing: the ARM version of Windows has so far not been sold to end customers at all.
Microsoft has only just started the test phase of x64 emulation in Windows on ARM, which enables the execution of resource-hungry applications developed for devices with Intel processors. Windows on ARM already supports 32-bit x86 software as well as native 32- and 64-bit ARM apps.
Will there be a Boot Camp 2?

Whether Apple would return to the work of writing Windows drivers for its own hardware and support Windows on ARM in Boot Camp, for example, is not clear from the interview. Windows can be installed and booted on Intel Macs, with drivers maintained by Apple. After announcing the big Mac switch from Intel to in-house, ARM-based processors, Federighi stated only that boot options for alternative operating systems will no longer be offered on ARM Macs; virtualization is the way forward.
Common virtualizers for macOS do not yet run on Macs with Apple’s M1 chip; Parallels Desktop is in preparation. Here, too, a license change by Microsoft would be required to virtualize Windows on ARM. So far Parallels has only said it is “thrilled” that Microsoft will support x64 apps in its ARM version of Windows, without any further announcement about Windows support. The Windows compatibility layer CrossOver brings Windows apps to ARM Macs, but it currently still relies on Apple’s translation layer Rosetta 2 and does not support all Windows software.
Secure Shell for beginners: administering computers on the network via SSH

SSH, the “Secure Shell”, is a cryptographic network protocol for the encrypted, and thus tamper-proof and tap-proof, transmission of data over insecure networks. With SSH you can conveniently carry out administrative tasks from a terminal, as it makes the console of a remote computer available on the local workstation: the protocol sends keyboard input from the local to the remote system and redirects the remote system’s text output back to the local terminal window. In this article we show you how to establish SSH connections to remote computers for remote maintenance, generate and manage keys, manage files and directories, and compress data traffic.
SSH versus ssh

To avoid confusion: SSH is not a Telnet implementation with encryption, and SSH (in capital letters) is also not the program (ssh) that is used in the terminal to establish SSH connections. In the following we use “SSH” when referring to the protocol and the technology as such, while “ssh” refers to the command in the terminal or on the command line.
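To illustrate the distinction, here are a few typical invocations of the ssh client. This is a minimal sketch: the user name and host are placeholders you would replace with your own, and the remote host is assumed to be running an SSH daemon.

```shell
# Log in to a remote machine (placeholder user/host) and get an
# interactive shell; leave it again with "exit" or Ctrl-D.
ssh alice@server.example.com

# Run a single command on the remote system instead of opening a shell;
# its output appears in the local terminal.
ssh alice@server.example.com uname -a

# Connect to an SSH daemon listening on a non-standard port.
ssh -p 2222 alice@server.example.com

# Enable compression (-C) to reduce traffic on slow links, and
# increase verbosity (-v) to debug connection problems.
ssh -C -v alice@server.example.com
```

The `-C` switch is the simplest way to realize the traffic reduction mentioned above; it compresses all data in the SSH stream, which pays off mainly on low-bandwidth connections.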
All full-blown operating systems, from Windows through GNU/Linux and the BSDs including macOS to IBM’s AIX and HP’s HP-UX, use OpenSSH from the OpenBSD team, which we also use for this article. The OpenSSH package consists of several components. “sshd”, the SSH server running as a daemon, is essential. What administrators and users invoke in the terminal is the SSH client “ssh”, which replaces old tools such as telnet, rlogin and rsh. “scp” (replacing rcp) is used for copying via SSH, more rarely “sftp” as a substitute for ftp. With “ssh-keygen”, SSH generates or checks the RSA, DSA or elliptic-curve keys that are responsible for user and system authentication. With “ssh-keyscan” the public keys of a list of hosts can be collected. Finally, “ssh-add” and “ssh-agent” are used to keep keys in memory and thus make logins on other systems more convenient.
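The components listed above can be sketched in a short session. Again a hedged example, not a complete recipe: host, user and file names are placeholders, and the ed25519 key type shown is just one of several that ssh-keygen supports.

```shell
# Generate a key pair: a private key plus a .pub public key.
# -N "" sets an empty passphrase; use a real passphrase in practice.
ssh-keygen -t ed25519 -N "" -f ~/.ssh/id_ed25519_demo

# Install the public key on the remote host so that future logins
# use the key instead of a password.
ssh-copy-id -i ~/.ssh/id_ed25519_demo.pub alice@server.example.com

# Copy a local file to the remote host, and a remote directory
# back again, both over the encrypted SSH channel.
scp report.pdf alice@server.example.com:/tmp/
scp -r alice@server.example.com:/var/log/myapp ./logs

# Keep the decrypted key in memory for this session, so further
# logins do not ask for the passphrase again.
eval "$(ssh-agent -s)"
ssh-add ~/.ssh/id_ed25519_demo
```

ssh-copy-id appends the public key to `~/.ssh/authorized_keys` on the remote side; the private key never leaves the local machine.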
Aldi Nord will add three notebooks to its range from December 3, 2020. The cheapest device costs just 200 Euro and is intended for schoolchildren. For 600 Euro, an attractively priced Tiger Lake model is in the starting blocks.
Inside the cheap notebook

In the Akoya E11202, Intel’s Celeron N3450 mobile processor with four Atom cores (1.1 to 2.2 GHz) should be enough for office tasks. In addition, there are 4 GB of RAM and 64 GB of eMMC flash storage; the manufacturer Medion pre-installs Windows 10 Home in S mode. The latter only allows the installation of apps from the Windows Store.
Drop-proof and waterproof

The plastic housing looks a bit old-fashioned, but according to Medion it should withstand falls from a height of one meter. Keyboard and trackpad are also resistant to up to 71 cubic centimeters of water. The TN display measures 11.6 inches and shows 1366 × 768 pixels.
Two USB 3.2 Gen 1 (5 GBit/s) ports are available, including a Type-C port, plus one USB 2.0 port, an audio combo jack for connecting a headset, and an HDMI output. The mass storage can be expanded via a microSD card reader. The Akoya E11202 transmits via Wi-Fi 5 (WLAN 802.11ac) and Bluetooth 4.2 and weighs just under 1.2 kg.
Tiger Lake notebook with a good price-performance ratio

For 600 Euro, Aldi offers one of the cheapest notebooks with Intel’s four-core Tiger Lake processor Core i5-1135G7. A full-fledged Windows 10 Home comes pre-installed with the Akoya S15450 on a 512 GB SSD, plus 8 GB of DDR4-3200 RAM (dual-channel, two SO-DIMMs). The IPS display measures 15.6 inches, shows 1920 × 1080 pixels (Full HD) and lights up with 300 cd/m²; most other screens in this price range only offer 250 cd/m².
Medion does without Thunderbolt 4 in the Akoya S15450; instead there is USB 3.2 Gen 2 (10 GBit/s) with a Type-C connector and DisplayPort Alt Mode. There are also two USB 3.2 Gen 1 Type-A ports (5 GBit/s), an HDMI output, an audio combo jack and an SD card reader. Wi-Fi 6 (WLAN 802.11ax) and Bluetooth 5.1 are on board. The aluminum housing holds a permanently installed 42.6 Wh battery; the notebook weighs 1.8 kg.
Competing models with the Core i7-1165G7 only start at 750 Euro. Cheaper Tiger Lake notebooks rely on the slower Core i3-1115G4 or Core i5-1135G7 processors, and some come with smaller SSDs.
One performance class higher up, Aldi Nord will sell the Erazer Defender 10 on December 3 for 950 Euro. The notebook combines Intel’s four-core Core i5-10300H with Nvidia’s GeForce GTX 1650 (4 GB GDDR6), a 1 TB PCI Express SSD and 16 GB of DDR4-2933 RAM. Among other things, Wi-Fi 6 and USB 3.2 Gen 2 Type-C (10 GBit/s) are included.
Aldi has confirmed the final prices; we have adjusted the message accordingly.
[Update, 23.11.2020, 3:00 p.m.:] Correction: Medion installs Intel’s Core i5-1135G7 instead of the Core i7-1165G7. The former has 4 MB less level 3 cache and reaches a maximum of 4.2 instead of 4.7 GHz. In addition, Aldi provided information on the display brightness on request: up to 250 cd/m², which is good for the price range. (mma)
(Pocket-lint) – Noise-canceling headphones have become almost an essential everyday item for many of us, whether they’re used to isolate your listening pleasure on your morning commute or drown out the drone of a jet engine.
And with many more of us working from home, there’s every reason why you should get a pair.
But with so many choices available, finding the best headphones for you is no easy task. Some pairs will have more effective noise cancellation, while others will sound better – so finding the middle ground can be tricky.
We’ve rounded up our favourites and picked out what we believe to be the best pair money can currently buy, to make the decision a whole lot easier.
Our pick of the best ANC headphones
Bose Noise Cancelling Headphones 700
Think of noise-canceling headphones and chances are you’ll think of Bose. Despite the unimaginative name, these over-ear cans deliver very imaginative sound quality to rival the best competition out there. And the multi-level noise-canceling is class-leading.
There’s also smart assistant integration for the big three (Google, Amazon, Apple), a solid app for various customizations (but no EQ, sadly), and well-integrated touch-based controls on the right earcup.
We can think of no other pair we’d rather take on our travels. Bose is the boss when it comes to noise-canceling.
Bose Noise Cancelling Headphones 700 review: Bose is the ANC boss
Sony WH-1000XM4
Sony has updated the 1000X headphones once again, taking an approach as aggressive with product launches as it is with cutting out external noise. Design tweaks add quality and refinement to these headphones, while a new-and-more-powerful chip provides the grunt to cancel out more noise.
The results are sensational, with the 1000XM4 not only sounding great as a set of headphones but also being some of the most effective at combatting external noise through more selectable levels.
Sony WH-1000XM4 review: The best just got better
Bowers & Wilkins PX7
Bowers and Wilkins is no stranger to the audio game; the British-based company has been going since 1966, and its older ANC pair of headphones, the PX, was a mainstay on this list. So when we heard it was updating them with new tech, safe to say our ears certainly pricked up. When we actually got them onto our ears, our expectations were, amazingly, exceeded once again.
The PX7 headphones look great, and they sound even better. B&W’s noise cancellation is at the top of its class, and adding features like aptX Adaptive support makes for smoother listening experiences and better future-proofing. Really impressively, too, the headphones are even more comfortable than ever before, making for a dreamy user experience.
While they might not be sitting on the very top of our list, these are a seriously impressive pair of headphones, make no mistake, and you’ll be certain to like them if you pick them up.
Bowers & Wilkins PX7 review: Leading ANC audio performance with aptX Adaptive for good measure
Beats Studio 3 Wireless
The Beats Studio 3 Wireless are likely to appeal to iPhone owners more than those using an Android phone, only because they’re the latest pair to benefit from Apple’s W1 chip (which has been replaced with the H1 chip, moving forward). This means these cans automatically try to pair with an iOS device when within distance, and once paired are available to instantly connect to from all other Apple devices using the same iCloud account.
The Studio 3 Wireless have some very clever noise-cancellation technology onboard too. They constantly measure the sounds around you – up to 50,000 times per second – and adjust both the noise-cancellation and sound profile accordingly, to make sure you’re getting the most effective sound blasted into your ears.
The sound is less bass-tastic than you might expect from this headphone company too, yet still impactful, while the battery life just goes on and on and on.
Beats Studio 3 Wireless review: Smart sounding, ultra long-lasting headphones
Bang and Olufsen BeoPlay H9i
The Bang and Olufsen BeoPlay H9i is one of the more expensive pairs of noise-canceling headphones to grace our ears, but in return, these cans provide a supreme level of comfort, thanks to high-quality materials.
Complementing the fantastic build quality is the incredible sound quality – and these headphones evolve beyond the original H9s. The H9i boosts the noise-canceling ability while shrinking the ear cups a little, making them a little more practical.
With a wealth of competition at lower prices, the BeoPlay H9i need to do a lot to justify their asking price, but we’d spend the extra for the build and comfort. If you want really strong noise-canceling, however, then look to the Bose (above) for that totally ‘locked-in’ quality.
B&O Beoplay H9i review: Pricey but near perfect over-ear headphones
Sennheiser HD 450BT
Most of the cans on this list so far have something in common – they’re seriously pricey. Now, Sennheiser’s great HD 450BT aren’t exactly cheap, but they’re more affordable and offer superb sound at a reduced price point. You get really long battery life to go with that nicely balanced listening, and they’re really comfortable to wear, too.
Plus, the noise cancellation might not be adaptive, but it’s still effective and more than enough for most people to get lost in their music with. We’re really impressed by the HD 450BT, and are confident they’d make a great pick for anyone with a slightly tighter budget.
Sennheiser HD 450BT review: Noise-cancelling cans at an affordable price
Microsoft Surface Headphones
Ok, so we don’t like the big Windows-like logos on the side of these headphones… but Microsoft has otherwise done a stellar job with its first bash at a pair of headphones, thanks to a variety of great features.
First up, these cans are comfortable on the ears for long periods of wear. Second, the two earcups rotate – the left for noise-canceling level, the right for volume – which gives a great, natural way to control the headphones without needing any unsightly or hard-to-locate buttons.
But there are imperfections: we’d like to see more exciting colors and design, along with some stronger ANC at the maximum level to rival the Bose (further up), plus a greater variety of ANC types like the Sony (up top). That said, if you want comfortable, long-lasting and easy-to-control ANC headphones then don’t overlook Surface – whether or not you use a Microsoft laptop or 2-in-1!
Surface Headphones review: Is Microsoft’s first bash at over-ear cans any good?
Sony WF-1000XM3
Sony has applied its audio expertise to noise-canceling in-ear headphones too. We’ve seen what the company can do with a pair of over-ears, so we had mighty high expectations for the entirely wire-free in-ear model – now in its third-gen form.
These wireless in-ears deliver a well-balanced sound that’s neither too bassy nor too bright; we found it to be just right. The noise-cancellation is just as accomplished, too, effectively blocking out the general humdrum of everyday life, as well as plane and train noises.
Sony isn’t the only maker to market with true wireless headphones with ANC – Apple’s on the scene now, and we recently also tested the Libratone Track Air+ in-ears (a bit further below) – but the WF series has a style, swagger, and musicality that’s hard to beat.
Sony WF-1000XM3 review: True wireless in-ears with class-leading ANC
Apple AirPods Pro
In some ways, Apple took its time getting to noise cancellation, although the huge success of its AirPods without the feature might have given it some time to work with. The AirPods Pro adds the functionality at last, though, and in one fell swoop has solved what was likely to be most people’s two biggest issues with its earbuds.
Firstly, they can now fit a far wider range of ears, with three earbud sizes to pick from rather than the older AirPods “hope they fit” approach. Secondly, the superb ANC Apple’s used means that you can actually rely on the AirPods Pro to be audible even on the busiest of commutes.
With slightly smaller stems than previously, they’re also less obvious than ever, design-wise, and make a great choice, especially if you’re an iPhone user. That quick and reliable pairing is as useful as ever.
Apple AirPods Pro review: Silence is golden
Libratone Track Air+
As in-ear headphones go, this product sounds truly exceptional, is comfortable to wear, offers a sweat-proof build for those active sessions, and has a noise-cancellation system that’s genuinely smart.
In a world where the so-so AirPods seem to get all the attention, or far pricier Sony and Sennheiser products receive some of the loudest shout-outs, Libratone has done its utmost to stand out from the crowd.
The price alone will be a massive lure. But that’s not the sole reason to buy the Air+ – no, you’ll want to don these in your ears because everything on offer, from sound to comfort to capability, is delivered at the highest level.
Libratone Track Air+ review: Wireless ANC in-ears at a great price
Writing by Mike Lowe. Editing by Max Freeman-Mills.
When Apple announced the transition of Macs to its own system-on-chips (SoCs) earlier this year, the company made it clear that while it will provide x86 emulation for macOS programs developed for x86 CPUs, it will not offer a version of Boot Camp with x86 emulation to enable Windows 10 to run on Apple Silicon-based Macs. But in a recent interview with Ars Technica, Craig Federighi, Senior Vice President of Software Engineering at Apple, says that Microsoft has everything it needs to make Windows 10 for Arm run natively on SoCs like the M1.
Up to Microsoft
“That’s really up to Microsoft,” said Federighi. “We have the core technologies for them to do that, to run their ARM version of Windows, which in turn of course supports x86 user mode applications. But that’s a decision Microsoft has to make, to bring to license that technology for users to run on these Macs. But the Macs are certainly very capable of it.”
Being a primary supplier of business and productivity software, Microsoft is currently developing its Office suite for Arm-powered Macs. That version of Office will run natively on these systems, which means that Microsoft knows how to build software for the latest Macs. The latter can of course run many operating systems and applications developed for 64-bit Arm architectures. Microsoft has a version of Windows designed for Qualcomm’s Snapdragon and other ARMv8 SoCs, so doing something similar for the new Macs should be quite possible. There are a number of uncertainties though.
In order to make Apple Silicon-based Macs offer the same features and performance in Windows and MacOS, Microsoft will need to make Apple’s IP, including GPU, neural engine, and special-purpose accelerators, work in Windows, which requires compatible APIs (application programming interfaces) and drivers. Apple’s MacOS exclusively uses Metal API for graphics processors, which is why Apple’s GPUs are developed with Metal in mind. There are several ways to make Apple GPUs work under Windows, but at present it is unclear what Microsoft can do with Apple’s hardware in terms of APIs and drivers. Another intriguing aspect is how Apple’s M1 and its successors will perform in Windows. Preliminary benchmark results of the latest Macs based on the M1 show that the chip can beat its x86 rivals, albeit in MacOS and in select applications/workloads.
Windows on Arm on Mac?
There are people who use Macs for most of their workloads, but require Windows for applications that are not available for MacOS. Without doubt, Intel-based Macs have gotten quite popular among such people over the last 14 years. For several years down the road, they will likely continue to use x86 Macs, but at some point, those PCs will get outdated or will simply break down, which is when they will have to replace them.
Assuming that Microsoft has everything it needs to make Windows 10 for Arm work natively on the latest Arm-based Macs, the question is what the software giant gains by enabling its operating system to run on Apple’s hardware. Apple controls about 10% of the PC market, and the percentage of Mac users who need Windows is hardly significant, so from a volume perspective Microsoft hardly gains anything tangible.
But making Windows 10 work properly on Apple Silicon-powered Macs could inspire the market to more actively adopt Arm processors for Windows laptops, which would make the platform more versatile. Meanwhile, if Apple’s M1 and its successors can beat x86 CPUs in a Windows environment, this may have an impact on the PC market in general, and for obvious reasons Microsoft is going to want to be a part of it.
Last week on The Vergecast, the crew spent some time discussing Apple’s announcement of new Mac computers with its own Arm-based processor chip, which Apple is calling the M1. This week, Vergecast co-hosts Nilay Patel and Dieter Bohn got their hands on the new MacBook Air and MacBook Pro with the M1 chips for review and brought their findings to this week’s episode.
Nilay and Dieter also bring in deputy editor Dan Seifert and editor Chris Welch to discuss their experiences with the new computers in regards to performance, battery life, and running iOS apps on macOS.
But before they got into all of that, The Vergecast starts the discussion with an interview with legendary tech reviewer Walt Mossberg, who was able to try out the new MacBook Air for himself. Walt gives his first impressions as well as some background behind what he calls “one of the most seminal products of the last 20 years.”
Below is a lightly edited excerpt from that conversation.
Nilay Patel: We have obviously just reviewed the new MacBook Air with Apple’s M1 chip, and you have had one as well. I’m dying to know what you think of this sort of major internal change to the Mac even though the outside kind of looks the same.
Walt Mossberg: Okay, so I just want to say one thing first: I actually believe that the MacBook — which you know and I know was the thing that the Windows guys were trying to copy for years and years — doesn’t get enough credit for being one of the seminal products of the last 20 years.
I mean, it’s not as seminal as the iPhone or the first Android phone or whatever. But it was a seminal product because light, thin laptops tended to be cramped before the MacBook Air, and there were a lot of compromises. And the MacBook Air also kind of almost tied — I think it maybe was a month before one of the ThinkPads — in having an SSD. There were a bunch of things about it that were notable. And then the 2010 one was a huge improvement, and then the 2013 one was too, ironically, given what’s happening now, because of an Intel chip — the Haswell chip at the time — which was specifically meant to give a huge leap in battery life. In my test — I looked it up — I got 12 hours on it, and they were only claiming 10. Different battery tests, of course.
So I think the MacBook Air is a really important consumer tech product. And I think this one is — I mean, I agree with Dieter’s review — I think it’s sort of amazing. I mean, we’ve all known they were going to do this. I’m no chip expert, processor expert, but from what I can understand, they’ve pulled off something amazing in a relatively short time. And by that, I mean it takes a long time to develop a processor. It’s not like doing some other things.
So in my use of this, I think the single thing that has impressed me most, the battery life has impressed me. I’ve not plugged this in in about 36 hours, and I have 75 percent battery life. So that’s fantastic. It’s buttery smooth. It’s fast as can be.
But the thing that really impresses me is their translation layer. This thing called Rosetta 2. They had a Rosetta when they made another processor change some years ago. And what it does is it takes apps that have not been written for this processor that were written for the Intel — which is most of the third-party apps so far — and it runs them. And I got to tell you, they run fast. They run normally. I mean, fast. If you were doing a blind test and you didn’t know this was originally written for Intel, still written for Intel, and it was running through this Rosetta thing, you would never know it. At least that’s been my experience. I don’t know if it’s been yours.
Dieter Bohn: Yeah, no, completely. To me, that’s one of the most impressive parts. Maybe if you were staring at how many times it bounced in the dock when it launched for the first time after reboot or something, you might be able to tell the difference. But otherwise, just exactly the same. And it’s my understanding that they co-developed this processor hardware with Rosetta 2 in mind and vice versa.
So I think part of what’s happening here is this whole thing where Apple says we are able to integrate hardware and software that makes our stuff really special — sometimes that seems like a true claim, and sometimes it seems like it’s more marketing than reality. I think in this particular case with Rosetta 2, it’s absolutely reality that I don’t think their translation software would have worked as well if the chip team didn’t know it was coming and they didn’t know what was on the chip. My hunch is there’s some sort of thing inside the M1 chip that is just there just to make Rosetta fast.
You can listen to the entire Vergecast discussion here or in your preferred podcast app.
If you buy something from a Verge link, Vox Media may earn a commission. See our ethics statement.
Not only is the Dell XPS 13 one of Dell’s best-known laptops, but it’s one of the best-known laptops, period. If you know anything about Windows laptops, you’ve probably heard its name. It’s great in pretty much every way, and it just keeps getting better.
So it’s forgivable to hear about a “Dell XPS 13 2-in-1” and assume it’s a run-of-the-mill XPS 13 that you can flip around. But the XPS 13 2-in-1 is very much a laptop of its own, with a different set of trade-offs and considerations from its clamshell counterpart. In some areas (like graphics performance), the convertible is a step ahead of the standard XPS; in others (like the keyboard), it’s compromising a bit. The best model for you depends on the type of tasks you’re looking to do.
Edge-to-edge keyboard, large touchpad — the works.
This year’s 2-in-1 starts at $1,099.99 for a Core i3-1115G4 processor, 8GB of RAM, and a 256GB SSD. You can spec that all the way up to $2,249.99 (listed now at $1,999.99) for a 3840 x 2400 display, a Core i7-1165G7 CPU, 32GB of RAM, and a 1TB SSD. You can also pay $50 extra for the white (“frost”) color rather than the silver and black model and $60 extra to upgrade to Windows 10 Pro.
I have a model in the middle, containing the 1165G7, 16GB of RAM, a 512GB SSD, and a 1920 x 1200 touchscreen. That costs $1,949.99 (but is currently listed at $1,749.99 on Dell’s website). The regular XPS 13 with the same processor has just 8GB of RAM and is said to cost $1,499.99, but it’s currently listed at $1,349.99. That means you’re paying $400 extra for 8GB more memory, and the other benefits of the 2-in-1.
Before getting into that, I’ll give you a brief rundown of what’s new from the last XPS 13 2-in-1. It’s mostly one thing: the processor. This 2-in-1 (the 9310) includes Intel’s newest 11th Gen “Tiger Lake” mobile chips. The CPUs bring Intel’s best-in-class Iris Xe integrated graphics, which have been the talk of the town since Tiger Lake’s launch, as well as the Thunderbolt 4 standard. This 9310 is actually certified through Intel’s Evo program (denoted by a small sticker on the right palm rest), which is supposed to guarantee that a laptop meets the needs of an everyday home or office user, in categories from connectivity to battery life and performance.
The ports support up to two 4K displays and data transfers up to 40Gbps.
Apart from that, the new 2-in-1 has largely the same look as its predecessor and the clamshell XPS, complete with a slim and sturdy aluminum chassis, a 1920 x 1200 (16:10) Gorilla Glass screen, and four ports (two USB-C with Thunderbolt 4, DisplayPort, and Power Delivery, one headphone jack, and one microSD card reader). Dell has made a few tweaks as well: the webcam now supports Windows Hello, there’s a new “frost” color option, RAM has been bumped from 3733MHz to 4267MHz, the touchpad is quieter, and there’s an updated microSD reader that Dell says will deliver better performance. That’s all well and good, but the processor is the star of the show here — and it is a star.
For general performance, the 1165G7 handled my office workload, which includes a dozen-ish Chrome tabs, a few other apps like Slack and Spotify, and some downloading, file copying, and other office-y stuff in the background, with no problem. I never heard the fans spin up or felt any heat unless I was running an intense program. This is the experience you’ll have with any machine with an i7, but you certainly aren’t sacrificing any performance for this convertible form factor.
Heavier media work is where this system starts to stand out. The 2-in-1 took 10 minutes and 5 seconds to export our 5-minute, 33-second 4K video in Adobe Premiere Pro. That’s 38 seconds faster than the regular XPS 13 — effectively comparable. Both systems are faster than any 10th Gen Ice Lake laptop with integrated graphics, and they beat the Asus ZenBook 14 with the same processor (which took 11 and a half minutes). They also both lose out to the Arm-powered MacBook Air (which doesn’t even have a fan) and MacBook Pro, which finished the job in 8:15 and 7:39, respectively.
Where the 2-in-1 really differentiates itself from the clamshell, though, is gaming performance. You can actually play a fair number of games on this laptop without needing to bump the resolution down. The 2-in-1 smoked the clamshell on Rocket League’s maximum settings (an average of 120fps, to the clamshell’s 111fps) and League of Legends (226fps to the clamshell’s 205fps). You’ll only see 60fps on either XPS 13 since that’s the maximum their screens can display, but those results show how much higher Dell has clocked the 2-in-1 over the standard XPS.
The 2-in-1 is putting up such impressive numbers that there are actually games where you’ll see better performance than you will on the clamshell. It breezed through Overwatch’s Ultra settings, averaging 71fps. That beats the standard XPS 13, which averaged 48fps on the same preset — a difference you’ll notice in gameplay. It also beats Lenovo’s IdeaPad Slim 7 with AMD’s eight-core Ryzen 7 4800U, which put up 46fps and actually isn’t too far off a system I tested with the most powerful Tiger Lake processor (the Core i7-1185G7) which averaged 89fps.
The really exciting news is that you can actually play Shadow of the Tomb Raider on this machine at 1080p. The 2-in-1 averaged 36fps on the game’s built-in benchmark (at the lowest-possible settings). That’s just two off from the newest MacBook Air with Apple’s M1 chip (38fps), which has been outperforming 1165G7 systems across the board. Now, you may very well not want to play demanding titles like Shadow of the Tomb Raider at 36fps, but you can play them on this machine without feeling like you’re watching a flipbook. That’s a big accomplishment for these integrated graphics, especially considering that the standard XPS 13 only averaged 22fps on the same benchmark.
Similar to that of the clamshell XPS, however, the convertible’s cooling system (including two fans, hidden vents in the hinge, and what Dell calls “an ultra-thin vapor chamber”) is being pushed to its limit during these high-intensity tasks. During the Premiere Pro export and my gaming sessions, the CPU spent some time in the high 90s (Celsius) and even hit 100 a few times. This may cause problems for the XPS form factor down the line if Intel doesn’t make significant gains in efficiency. On the plus side, the keyboard and touchpad never got hot at all. (The keyboard got toasty on the clamshell.)
So the 2-in-1 gives you better graphics performance than the standard XPS (as well as a convertible form factor) for just a slightly higher price. What’s the catch? There are two worth considering.
The Dell logo is slightly larger than it was on the last 2-in-1.
The first is battery life. I averaged eight hours and 50 minutes using the 2-in-1 as my primary work driver with the screen at 200 nits of brightness. That’s quite good among the Tiger Lake systems we’ve seen so far, and it means you should get a full workday from one charge. But it’s a bit worse than the standard XPS, where I usually saw nine hours and 15 minutes — a small difference but one that could be important to students or frequent business travelers who are looking for every ounce of juice they can get.
The second trade-off to consider is the keyboard. The standard XPS 13 has one of my all-time favorite laptop keyboards; it’s snappy, quiet, and comfortable, with a really nice texture. The 2-in-1 has a different keyboard that Dell calls the “Next Gen MagLev keyboard.” It has wider keycaps with just 0.7mm of travel. It feels similar to using the old low-profile butterfly keyboard on the 2019 MacBook Pro. Personally, I hate this. Typing on the convertible feels like slamming my fingers onto flat plastic. But I begrudgingly acknowledge that some people (including Verge deputy editor Dan Seifert) prefer these kinds of keys.
Overall, the XPS 13 2-in-1 9310 is an excellent Windows convertible. It keeps the slim, sturdy, premium build quality that makes the XPS line the best of the best while also delivering some of the best performance you can get from an ultraportable laptop. It’s a formidable competitor to Apple’s groundbreaking MacBooks, especially if you’re looking for a touchscreen and a convertible form factor.
A few steps forward, a few steps back.
If you’re deciding whether to buy the XPS 13 or the XPS 13 2-in-1, the differences are simple — but they’re also significant and worth thinking about. The convertible form factor is the most obvious distinction, but I would argue it’s not the most important one (unless your job requires tablet use). You’ll be using the keyboard a lot, so you’ll want to figure out which one you prefer (if you’ve typed on both MacBook butterfly keyboards and standard MacBook keyboards, those are a rough approximation of the two). You should also consider the sorts of tasks you’ll be putting your system through and whether a significant increase in graphics performance (especially with demanding games) is worth giving up a bit of battery life. And of course, there’s the $400 price difference.
Anyone who’s considering an XPS 13 and thinks they might prefer a convertible should definitely consider the 2-in-1. Just don’t assume they’re the exact same package.
The PC revolution started off life 35 years ago this week. Microsoft launched its first version of Windows on November 20th, 1985, to succeed MS-DOS. It was a huge milestone that paved the way for the modern versions of Windows we use today. While Windows 10 doesn’t look anything like Windows 1.0, it still has many of its original fundamentals like scroll bars, drop-down menus, icons, dialog boxes, and apps like Notepad and MS Paint.
Windows 1.0 also set the stage for the mouse. If you used MS-DOS then you could only type in commands, but with Windows 1.0 you picked up a mouse and moved windows around by pointing and clicking. Alongside the original Macintosh, the mouse completely changed the way consumers interacted with computers. At the time, many complained that Windows 1.0 focused far too much on mouse interaction instead of keyboard commands. Microsoft’s first version of Windows might not have been well received, but it kick-started a battle between Apple, IBM, and Microsoft to provide computing to the masses.
Microsoft co-founder Bill Gates with a boxed copy of Windows.Carol Halebia
Back in 1985, Windows 1.0 required two floppy disks, 256 kilobytes of memory, and a graphics card. If you wanted to run multiple programs, then you needed a PC with a hard disk and 512 kilobytes of memory. You couldn’t run anything on a modern machine with just 256 kilobytes of memory, but those basic specifications were just the beginning. While Apple had been ahead in producing mouse-driven GUIs at the time, it remained focused on the combination of hardware and software. Microsoft had already created its low-cost PC DOS operating system for IBM PCs, and was firmly positioned as a software company.
With Windows 1.0, Microsoft took the important step of focusing on apps and core software. IBM held onto the fundamentals of the PC architecture for a few years, but Microsoft made it easy for rivals and software developers to create apps, ensuring that Windows was relatively open and easy to reconfigure and tweak. PC manufacturers flocked to Windows, and the operating system attracted support from important software companies. This approach to providing software for hardware partners to sell their own machines created a huge platform for Microsoft. It’s a platform that allows you to upgrade through every version of Windows, as a classic YouTube clip demonstrates.
Windows has now dominated personal computing for 35 years, and no amount of Mac vs. PC campaigns have come close to changing that, but they’ve certainly been entertaining. Microsoft has continued to tweak Windows and create new uses for it across devices, in businesses, and now with the move to the cloud. It’s only now, with the popularity of modern smartphones and tablets, that Windows faces its toughest challenge yet. Microsoft may yet weather its mobile storm, but it will only do so by rekindling its roots as a true software company. In 2055, it’s unlikely that we’ll be celebrating another 35 years of Windows in quite the same fashion, so let’s look back at how Microsoft’s operating system has changed since its humble beginnings.
Where it all began: Windows 1.0 introduced a GUI, mouse support, and important apps. Bill Gates headed up development of the operating system, after spending years working on software for the Mac. Windows 1.0 shipped as Microsoft’s first graphical PC operating system with a 16-bit shell on top of MS-DOS.
Windows 1.0 (1985)
Windows 2.0 continued 16-bit computing with VGA graphics and early versions of Word and Excel. It allowed apps to sit on top of each other, and desktop icons made Windows feel easier to use at the time of the 2.0 release in December, 1987. Microsoft went on to release Windows 2.1 six months later, and it was the first version of Windows to require a hard disk drive.
Windows 2.0 (1987)
Windows 3.0 continued the legacy of a GUI on top of MS-DOS, but it included a better UI with new Program and File managers. Minesweeper, a puzzle game full of hidden mines, also arrived with the Windows 3.1 update.
Windows 3.0 (1990)
Windows NT 3.5 was the second release of NT, and it really marked Microsoft’s push into business computing with important security and file sharing features. It also included support for TCP/IP, the network communications protocol we all use to access the internet today.
Windows NT 3.5 (1994)
Windows 95 was where the modern era of Windows began. It was one of the most significant updates to Windows. Microsoft moved to a 32-bit architecture and introduced the Start menu. A new era of apps emerged, and Internet Explorer arrived in an update to Windows 95.
Windows 95 (1995)
Windows 98 built on the success of Windows 95 by improving hardware support and performance. Microsoft was also focused on the web at its launch, and bundled apps and features like Active Desktop, Outlook Express, Frontpage Express, Microsoft Chat, and NetMeeting.
Windows 98 (1998)
Windows ME focused on multimedia and home users, but it was unstable and buggy. Windows Movie Maker first appeared in ME, alongside improved versions of Windows Media Player and Internet Explorer.
Windows ME (2000)
Windows 2000 was designed for client and server computers within businesses. Based on Windows NT, it was designed to be secure with new file protection, a DLL cache, and hardware plug and play.
Windows 2000 (2000)
Windows XP really combined Microsoft’s home and business efforts. Built on the Windows NT kernel, it brought the stability of the business line to home users, alongside the colorful new “Luna” interface. It went on to become one of the longest-lived and most widely used versions of Windows.
Windows XP (2001)
Windows Vista was poorly received like ME. While Vista introduced a new Aero UI and improved security features, Microsoft took around six years to develop Windows Vista and it only worked well on new hardware. User account control was heavily criticized, and Windows Vista remains part of the bad cycle of Windows releases.
Windows Vista (2007)
Windows 7 arrived in 2009 to clean up the Vista mess. Microsoft did a good job of performance, while tweaking and improving the user interface and making user account control less annoying. Windows 7 is now one of the most popular versions of Windows.
Windows 7 (2009)
Windows 8 was a drastic redesign of the familiar Windows interface. Microsoft removed the Start menu and replaced it with a fullscreen Start Screen. New “Metro-style” apps were designed to replace aging desktop apps, and Microsoft really focused on touch screens and tablet PCs. It was a little too drastic for most desktop users, and Microsoft had to rethink the future of Windows.
Windows 8 (2012)
Back to the Start: Windows 10 brings back the familiar Start menu, and introduces some new features like Cortana, Microsoft Edge, and Xbox One game streaming to PCs. It’s more thoughtfully designed for hybrid laptops and tablets, and Microsoft has switched to a Windows as a service model to keep it regularly updated in the future.
Windows 10 (2015)
Windows 10 hasn’t changed drastically over the past five years. Microsoft has been tweaking various parts of the operating system to refine it. More system settings have moved from the traditional Control Panel over to the new Settings app, and the Start menu has a less blocky look to it now. We’re still waiting to see what Windows 10X (originally designed for dual-screen devices) will bring, but Microsoft has also been improving the system icons for Windows 10. 2021 could bring an even bigger visual refresh to Windows 10.
Windows 10 (2020)
Editor’s note: This story was originally published in 2015 to mark the 30th anniversary of Windows. It has been updated and republished for 35 years of Windows.
A new DLC (downloadable content) for the creative game “Minecraft” takes players into the “Star Wars” universe: the official expansion pack includes not only a map with twelve different planets, but also matching textures, objects, skins, and a licensed “Star Wars” soundtrack.
Trailer for the “Star Wars” DLC for “Minecraft” (Source: Microsoft)
X-wings, meanwhile, are introduced into the game as drivable vehicles. According to the announcement, the package mainly contains content from the original trilogy, but the successful Disney+ series “The Mandalorian” is also on board: a trailer shows Baby Yoda, for example.
The “Star Wars” DLC can be purchased in the Minecraft store. It costs 1,340 Minecoins – around 8 euros. It is already the second “Star Wars” add-on in the history of the game phenomenon “Minecraft”: in 2014, Microsoft released a skin pack for 3 euros.
Minecraft remains successful: Over ten years after its release, “Minecraft” remains one of the most played video games ever. In October, the Microsoft-owned development studio Mojang announced that “Minecraft” is being actively played by 130 million players. The number of multiplayer sessions in particular has risen significantly during the coronavirus crisis.
In contrast, the recent decision to make a Microsoft account compulsory for the Java version of “Minecraft” proved controversial. Starting in spring 2021, all players of the Java variant have to replace their “Minecraft” account with a Microsoft account.
“Minecraft” is offered in two versions on the PC: players can choose between the Java and Bedrock editions. In contrast to the Bedrock version, the Java edition is available outside the Windows Store, supports significantly more mods, and could previously be used without a Microsoft account. Because of this freedom, the Java version is the preferred edition for many “Minecraft” fans.
With the concept study “Vision Urbanaut”, BMW presents its idea of the individual transport of the future. It will not arrive at dealerships in this form, but it is not completely far-fetched either. “In the second half of the decade we could very well imagine such a model in our range,” says Mini design director Oliver Heilmer.
At 4.46 meters, the study is longer than any current Mini and slightly larger than a BMW X1. But the electric drive enables a completely new division of the proportions: almost the entire footprint benefits the interior. With the large windows and the bright interior colors, it looks quite generous. In addition, BMW promises trim made from sustainable materials – the brand has already gained experience in this area with the i3. The predominant material in the interior is knitted textile.
Mini Vision Urbanaut exterior (7 pictures): At 4.46 meters, the Mini Vision Urbanaut is almost as long as a VW Touran.
Moments: The study, so the designers promise, should be highly variable inside and out and able to adapt to different usage scenarios. For this purpose, three “moments” have been devised, which can also be combined. In the “Chill” moment, the vehicle becomes a place of retreat for relaxation or for concentrated work on the go. With “Vibe”, the vehicle opens up to the maximum and focuses on spending time with other people. The third moment is, in all seriousness, called “Wanderlust” – here the focus is on driving.
Beyond the statement that it uses a battery-electric drive, little else has been revealed about the powertrain. On the one hand that is unusual for this group; on the other hand it is not surprising given the far-off and by no means certain series production. BMW does reveal this much: the study should be able to drive autonomously, although which level remains open.
Mini Vision Urbanaut interior (5 images): The battery-electric drive opens up new freedom for vehicle design.
(mfz)
Attackers could slip malware onto systems via Firefox and Thunderbird and thus execute their own commands. Overall, the risk posed by the security vulnerabilities is rated “high”.
Mozilla has not only closed loopholes but also built in a function to increase security when surfing with Firefox – which, however, is not active by default. The patched versions are Firefox 83, Firefox ESR 78.5, and Thunderbird 78.5, available for all systems.
Security advisories: closed vulnerabilities in Thunderbird 78.5, Firefox ESR 78.5, and Firefox 83.
Only surf encrypted: Firefox 83 now offers an HTTPS-only mode. If this mode is active, the web browser always establishes an encrypted HTTPS connection to websites. This happens automatically, for example, if you enter a URL starting with HTTP or click on an HTTP link.
If a website does not support HTTPS, Firefox displays a warning that third parties could see data – such as credit card information – that users enter on such sites. Anyone who is aware of the risk can still visit the site; HTTPS-only is then temporarily deactivated for that page. In some cases, websites load resources via HTTP (mixed content), which can lead to display errors on those pages.
Activate HTTPS-only: To use HTTPS-only mode, you have to activate it in the Firefox settings under “Privacy & Security”, at the very bottom. If desired, the mode can be enabled in all windows or only in private windows. If HTTPS-only causes problems on a page, you can deactivate it for that page by clicking the lock icon in the address bar.
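For users who manage Firefox via configuration files, the same switches can also be flipped in `about:config` or a `user.js` file. A minimal sketch, assuming the preference names Mozilla uses for this feature in Firefox 83 (verify them in `about:config` on your build before relying on them):

```js
// Enforce HTTPS-only mode in all windows (Firefox 83+).
user_pref("dom.security.https_only_mode", true);

// Or enforce it only in private browsing windows instead.
user_pref("dom.security.https_only_mode_pbm", true);
```

These correspond to the same options found under “Privacy & Security” in the settings UI.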
AMD presents the first two “Big Navi” graphics cards with the Navi 21 graphics chip: the Radeon RX 6800 XT for 650 euros takes on Nvidia’s GeForce RTX 3080, while the further slimmed-down Radeon RX 6800 for 580 euros goes up against the GeForce RTX 3070. The top model Radeon RX 6900 XT with a fully enabled GPU is to follow in December as an RTX 3090 competitor.
The Navi 21 GPU packs 26.8 billion transistors onto a chip area of 519 mm², manufactured in a 7-nanometer process at the foundry TSMC. “Big Navi” is thus more than twice as complex as the Navi 10 GPU of the Radeon RX 5700 XT (10.3 billion transistors on 251 mm²). AMD spends some of those transistors on a design tuned for higher clock frequencies – the Radeon RX 6800 XT runs at more than 2000 MHz ex works. There is also a huge 128 MByte SRAM cache that relieves the memory interface: a 256-bit bus delivering 512 GByte/s to the 16 GByte of GDDR6, keeping the shader cores well fed. You can read more about the RDNA 2 architecture used here:
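The quoted 512 GByte/s follows directly from the bus width and the per-pin GDDR6 data rate (16 Gbit/s on these cards); a quick sanity check:

```python
# Memory bandwidth = (bus width in bits / 8) * per-pin data rate in Gbit/s
bus_width_bits = 256    # Navi 21's memory interface
data_rate_gbps = 16     # GDDR6 per-pin data rate on the RX 6800 series
bandwidth_gbytes_per_s = bus_width_bits / 8 * data_rate_gbps
print(bandwidth_gbytes_per_s)  # → 512.0
```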
The test shows that AMD’s architecture and clock improvements bear fruit: for the first time in generations, Nvidia’s GeForce graphics cards no longer hold the performance crown alone – provided you consider classic rasterization performance without ray-tracing effects.
Game benchmarks: In 3D games, sometimes the GeForce RTX 3080 is in front, sometimes the Radeon RX 6800 XT. The two action adventures “Assassin’s Creed Odyssey” and “Shadow of the Tomb Raider” run smoothly on both graphics cards in 4K resolution (3840 × 2160 pixels) at ultra graphics settings, with a pleasant frame rate of well over 60 fps.
AMD’s reference model, the Radeon RX 6800 XT with its 300 watt power limit, need not hide from a factory-overclocked GeForce RTX 3080 – here MSI’s Gaming X Trio with a 340 watt power limit. The differences are mostly measurable, but not particularly noticeable. The smaller offshoot Radeon RX 6800 can clearly distance itself from overclocked versions of the GeForce RTX 3070, in some cases by 25 percent (“Assassin’s Creed Odyssey”, WQHD).
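That 25 percent figure is easy to verify from the benchmark averages; a quick check using the WQHD numbers for “Assassin’s Creed Odyssey” (91 fps for the RX 6800 vs. 73 fps for the overclocked RTX 3070):

```python
# Average fps at 2560×1440 in "Assassin's Creed Odyssey"
rx_6800_fps = 91
rtx_3070_oc_fps = 73

# Relative lead of the Radeon RX 6800 over the overclocked GeForce RTX 3070
lead_percent = (rx_6800_fps - rtx_3070_oc_fps) / rtx_3070_oc_fps * 100
print(round(lead_percent, 1))  # → 24.7
```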
Game benchmarks, measured under Windows 10 (2004), Ryzen 9 5900X, 32 GByte DDR4-3600 RAM, Radeon 20.45.01.12-11.6 beta driver, GeForce 457.30 WHQL. “Assassin’s Creed Odyssey” (ACO): preset “Ultra”, AA “High”, 16x AF, DX11. “Shadow of the Tomb Raider” (SotTR): preset “Ultra high”, SMAA, 16x AF, DX12. Values are average fps / 1% percentile fps.

| Graphics card | ACO (2560 × 1440) | SotTR (2560 × 1440) | ACO (3840 × 2160) | SotTR (3840 × 2160) |
| --- | --- | --- | --- | --- |
| RX 6800 XT | 95 / 69 | 150 / 115 | 68 / 54 | 83 / 68 |
| RX 6800 | 91 / 68 | 129 / 99 | 59 / 47 | 71 / 58 |
| RX 5700 XT | 62 / 49 | 79 / 64 | 37 / 29 | 42 / 34 |
| RX Vega 64 | 50 / 41 | 67 / 53 | 32 / 26 | 37 / 29 |
| RTX 3080 (OC) | 88 / 67 | 150 / 117 | 65 / 52 | 85 / 71 |
| RTX 3070 (OC) | 73 / 58 | 116 / 93 | 50 / 41 | 63 / 53 |

Special case ray tracing: It looks worse for AMD if you turn on ray-tracing graphics effects such as global illumination, reflections, or shadows. With the current Radeon driver – AMD provided us with a preliminary version – these can be activated from day one in DirectX 12 games via the DirectX Raytracing API (DXR). So far this only worked with Nvidia’s GeForce graphics cards.
All Radeon models with Navi 21 GPUs accelerate ray-tracing effects in hardware using so-called ray accelerators, which test acceleration structures (bounding volume hierarchies, BVH) for intersections with rays. The process costs more performance than Nvidia’s approach, so with ray tracing activated the Radeon RX 6800 XT always lies behind the GeForce RTX 3080, and the Radeon RX 6800 falls behind the GeForce RTX 3070.
Game benchmarks, measured under Windows 10 (2004), Ryzen 9 5900X, 32 GByte DDR4-3600 RAM, Radeon 20.45.01.12-11.6 beta driver, GeForce 457.30 WHQL. “Shadow of the Tomb Raider” (SotTR): preset “Ultra high”, SMAA, 16x AF, DX12, ray-traced shadows “Ultra”. “Control”: preset “RTX”, 16x AF, DX12. Values are average fps / 1% percentile fps.

| Graphics card | SotTR (2560 × 1440) | Control (2560 × 1440) | SotTR (3840 × 2160) | Control (3840 × 2160) |
| --- | --- | --- | --- | --- |
| RX 6800 XT | 82 / 53 | 63 / 44 | 45 / 30 | 36 / 26 |
| RX 6800 | 70 / 45 | 54 / 37 | 38 / 25 | 31 / 22 |
| RTX 3080 (OC) | 101 / 72 | 79 / 52 | 55 / 40 | 47 / 31 |
| RTX 3070 (OC) | 76 / 53 | 61 / 40 | 39 / 28 | 35 / 23 |

The difference depends on how many ray-tracing effects are used. In “Shadow of the Tomb Raider”, where only the shadows are rendered with virtual rays, the Radeon RX 6800 XT trails the GeForce RTX 3080 by just under 20 percent. In “Control”, the difference in the lower “RTX” presets grows to up to 25 percent. If you maximize the graphics settings there, the Radeon RX 6800 XT falls as much as 40 percent behind the competing model – but then even the GeForce RTX 3080 manages smooth frame rates only at 1440p.
The higher ray-tracing performance of Nvidia’s RTX 3000 series is also reflected in the 3DMark Port Royal graphics benchmark, where the Radeons fall behind. In the older DirectX 11 benchmark Fire Strike Extreme (without ray tracing) they are still ahead. AMD’s first generation of ray-tracing acceleration is sufficient for ray-tracing titles at 1440p resolution with fluid frame rates.