If you’re finding that background noise is disrupting voice or video calls made from your computer, then a new piece of software from Nvidia might help (provided you have the necessary hardware to run it). Released in April 2020, RTX Voice uses the hardware found in Nvidia’s RTX (and more recently, GTX) GPUs to process your incoming and outgoing audio and eliminate almost all background noise.
Below, you’ll find a quick demonstration I recorded to show how it works. This was recorded from a Blue Snowball microphone using the built-in call recording functionality in Zoom. When I don’t have the software enabled, you can hear the loud clacking of my mechanical keyboard in the background of the call. But when I turn on RTX Voice, the sound completely disappears.
As well as processing your microphone’s input so that the people you’re speaking to can’t hear any background noise around you, you can also set the software to eliminate background noise coming in from other people. So you can save yourself from your colleagues’ loud keyboard as well as protecting them from your own. It’s a win-win.
How to use RTX Voice to reduce background noise
RTX Voice is pretty simple to use, but the big caveat is that you need the right hardware. In order to run it, you’ll need an Nvidia GeForce or Quadro RTX or GTX graphics card since the software uses this hardware to process your audio. That means you’re out of luck if you’ve got a Mac, or a Windows machine without a dedicated GPU.
As well as hardware requirements, the other thing to note about RTX Voice is that since the processing is being done by your graphics card, it might take system resources away from any games or other graphically intensive applications you’re running. I ran some quick and dirty benchmarks to try to gauge the performance impact and found that running RTX Voice on my Discord microphone input reduced Unigine’s Heaven benchmark result by just over 3fps, or around 6 percent, rising to over 8fps, or 14 percent, if I used the software to process incoming audio as well. That more or less tracks with YouTuber EposVox’s report of a 4 to 10 percent reduction when using it on his microphone, rising to 20 percent with both mic and speakers.
I think that makes RTX Voice a much better option for calls where you’re unlikely to be running something graphically intensive at the same time, like a work conference call, rather than while you’re running a game simultaneously. If you’re looking for something more gaming-specific, Discord recently launched its own noise suppression feature, which might be a better alternative.
RTX Voice can be set up in just a couple of minutes.
First, update the driver software of your graphics card if it’s not already running on version 410.18 or above
Download RTX Voice from Nvidia’s website and install it
Once the software is installed, you can configure it to improve your incoming audio, outgoing audio, or both. Nvidia recommends only turning it on for your input device (read: microphone) to minimize the impact the audio processing will have on the performance of your system. You can also select how much noise suppression you want. I left it at 100 percent, but you might want to play around to find what works best for you.
Once installed, “Nvidia RTX Voice” will appear as an audio input and / or output device for your PC. That means you can go into your voice chat app of choice and select it as though you’d plugged an extra microphone or set of speakers into your PC. Check out Nvidia’s site for specific instructions on how to configure the software for individual applications; here’s what the setting looks like in Zoom.
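If you want to double-check that the virtual device registered properly, one quick option is to enumerate your audio devices from a script. Here’s a minimal sketch using the third-party Python sounddevice library (pip install sounddevice); this isn’t Nvidia tooling, and the exact device name string may differ slightly on your system.

```python
# List any audio devices that look like Nvidia's RTX Voice virtual device.
import sounddevice as sd

for index, device in enumerate(sd.query_devices()):
    if "RTX Voice" in device["name"]:
        roles = []
        if device["max_input_channels"] > 0:
            roles.append("input")   # appears to apps as a microphone
        if device["max_output_channels"] > 0:
            roles.append("output")  # appears to apps as speakers
        print(f"Device {index}: {device['name']} ({', '.join(roles)})")
```

If nothing prints, the virtual device hasn’t been created, and it’s worth re-running the RTX Voice installer or checking your driver version.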
Nvidia’s software isn’t unique. In addition to Discord’s feature, Microsoft also plans to add a similar piece of functionality to Teams later this year. The advantage of RTX Voice, however, is that it works across a much broader range of apps. Nvidia’s site lists 12 apps that it’s validated. However, I tested out audio recording app Audacity, which Nvidia doesn’t list as being supported, and found that RTX Voice worked just fine, so there are likely to be other unlisted apps that also work.
Not everyone will have the hardware to take advantage of this latest feature, and for others, the performance hit won’t be worth it. However if, like me, your gaming PC is mainly being used as a work computer these days, then using RTX Voice is a no-brainer.
Correction: This article originally stated that RTX Voice won’t work on a Windows machine with a dedicated GPU when it should have read that it won’t work on a Windows machine without a dedicated GPU. We regret the error.
Update 10:31AM, April 6th: Nvidia has extended RTX Voice support to earlier GTX, Quadro, and Titan-branded graphics cards, so we’ve updated this post with relevant info.
Capcom has published the official system requirements for the PC version on Steam. The sequel follows the saga of Ethan Winters, this time with some apparently very large vampire ladies. Based on what we’ve seen, you’ll benefit from having one of the best graphics cards along with something from our list of the best CPUs for gaming when the game arrives on May 7.
The eighth entry in the series (the “VIII” is hiding in “VIllage”), this will be the first Resident Evil to feature ray tracing technology. The developers have tapped AMD to help with the ray tracing implementation, however, so it’s not clear whether it will run on Nvidia’s RTX cards at launch, or if it will require a patch — and it’s unlikely to get DLSS support, though it could make for a stunning showcase for AMD’s FidelityFX Super Resolution if AMD can pull some strings.
We’ve got about a month to wait before the official launch. In the meantime, here are the official system requirements.
Minimum System Requirements for Resident Evil Village
Capcom notes that in either case, the game targets 1080p at 60 fps, though the framerate “might drop in graphics-intensive scenes.” While the minimum requirements specify using the “Prioritize Performance” setting, it’s not clear what settings are used for the recommended system.
The Resident Evil Village minimum system requirements are also for running the game without ray tracing, with a minimum requirement of an RTX 2060 (and likely future AMD GPUs like Navi 23), and a recommendation of at least an RTX 2070 or RX 6700 XT if you want to enable ray tracing. There’s no mention of installation size yet, so we’ll have to wait and see just how much of our SSD the game wants to soak up.
The CPU specs are pretty tame, and it’s very likely you can use lower spec processors. For example, the Ryzen 3 1200 is the absolute bottom of the entire Ryzen family stack, with a 4-core/4-thread configuration running at up to 3.4GHz. The Core i5-7500 also has a 4-core/4-thread configuration, but runs at up to 3.8GHz, and it’s generally higher in IPC than first generation Ryzen.
You should be able to run the game on even older/slower CPUs, though perhaps not at 60 fps. The recommended settings are a decent step up in performance potential, moving to 6-core/12-thread CPUs for both AMD and Intel, which are fairly comparable processors.
The graphics card will almost certainly play a bigger role in performance than the CPU, and while the baseline GTX 1050 Ti and RX 560 4GB are relatively attainable (the game apparently wants 4GB or more of VRAM), we wouldn’t be surprised if that’s with some form of dynamic resolution scaling enabled. Crank up the settings and the GTX 1070 and RX 5700 are still pretty modest cards, though the AMD card is significantly faster — not that you can find either in stock at acceptable prices these days, as we show in our GPU pricing index. But if you want to run the full-fat version of Resident Evil Village, with all the DXR bells and whistles at 1440p or 4K, you’re almost certainly going to need something far more potent.
[Full size images: RE Village RT On / RE Village RT Off]
AMD showed a preview of the game running with and without ray tracing during its Where Gaming Begins, Episode 3 presentation in early March. The pertinent section of the video starts at the 9:43 mark, though we’ve snipped the comparison images above for reference. The improved lighting and reflections are clearly visible in the RT-enabled version, but critically we don’t know how well the game runs with RT enabled.
We’re looking forward to testing Resident Evil Village on a variety of GPUs and CPUs next month when it launches on PC, Xbox, and PlayStation. Based on what we’ve seen from other RT-enabled games promoted by AMD (e.g. Dirt 5), we expect frame rates will take a significant hit.
But like we said, this may also be the debut title for FidelityFX Super Resolution, and if so, that’s certainly something we’re eager to test. What we’d really like to see is a game that supports both FidelityFX Super Resolution and DLSS, just so we could do some apples-to-apples comparisons, but it may be a while before such a game appears.
In a strange twist of fate, Nvidia quietly patched its RTX Voice app at an unknown time to support all GeForce GTX graphics cards supported under Nvidia’s 410.18 driver or newer. This means RTX Voice works with all products from the best graphics cards in the RTX 30-series down to the GTX 600-series.
Nvidia released RTX Voice a year ago as a new spin-off feature for RTX 20-series GPUs to improve audio communication by reducing unwanted background noise intelligently using AI. Nvidia claimed the app uses the Tensor cores built into its latest products to accomplish this feature.
But ironically, right after the app was released, a super-simple hack leaked that allowed you to run RTX Voice on Windows 7 and, best of all, on non-RTX GPUs.
So it’s not too surprising that Nvidia eventually patched RTX Voice itself to support GTX graphics cards. However, the RTX nomenclature becomes very misleading with the new change.
If you want to use RTX Voice on your GeForce GTX GPU, you can head here to download the app.
But if you own an RTX 20-series or RTX 30-series graphics card, you’re better off going with its successor, Nvidia Broadcast, which includes RTX Voice’s noise removal plus a webcam feature that allows you to set up virtual backgrounds when streaming or video chatting. Plus, it’s doubtful that RTX Voice will receive ongoing updates, unlike Nvidia Broadcast.
But our casual in-house testing has found both the older RTX Voice app and the newer Nvidia Broadcast to work surprisingly well. I’ve used both on my RTX 2060 Super, and it’s one of the best programs I’ve come across that accurately deletes background noise without killing or muting your voice. For more, see our Nvidia Broadcast noise removal demos on our YouTube channel.
No matter how many keys your keyboard has, you can always use a dedicated keypad with buttons for executing macros, launching your favorite apps or, if you’re a streamer, initiating functions in OBS. Many users swear by the Elgato Stream Deck lineup of macro keypads, but these devices are expensive.
With a Raspberry Pi Pico, some inexpensive hardware and the right script, you can create your own Stream Deck-like macro keypad, plug it in via USB and use it to make your life easier in OBS or for any other task. Once completed, the macro keypad will be seen as a USB keyboard by your operating system, allowing it to work with any computer, no drivers or special software required.
What you need to build a Raspberry Pi Pico-Powered Stream Deck
Raspberry Pi Pico
Mechanical key switches (e.g., Cherry MX Brown)
Keycaps (compatible with Cherry MX)
30-gauge wire
3D printed Case (using this design)
Setting Up Raspberry Pi Pico’s Firmware
To get our Raspberry Pi Pico-powered stream deck working, we will be using CircuitPython as the programming language, because it has a built-in USB HID library. To use CircuitPython on a Pico, you must first flash the appropriate firmware.
1. Download the CircuitPython UF2 file.
2. Push and hold the BOOTSEL button and plug your Pico into the USB port of your Raspberry Pi or other computer. Release the BOOTSEL button after your Pico is connected.
This will mount the Pico as a Mass Storage Device called “RPI-RP2”.
3. Copy the UF2 file to the RPI-RP2 volume.
Your Pico should automatically reboot and will be running CircuitPython.
Adding Code for Pico-Powered Stream Deck
I have written custom code to make the Pico act as a stream deck / macro keypad. Here’s how to install it.
1. Download the project zip file from the Novaspirit GitHub repository.
2. Transfer the contents of the zip file to the “CIRCUITPY” volume and overwrite the existing files.
3. Reboot the Pico and it should load the macro keys code.
3D Printing Pico-Powered Stream Deck Case
If you want to use our case, you need to 3D print it or have it printed by a service such as All3DP. Download our design files and use these Cura settings.
PLA
15% infill
3 line wall thickness
No Support needed
0.2mm layer height (use 0.1mm layer height for higher quality)
Print separately with two different colors
Assembling Your Pico-Powered Stream Deck
Now it’s time to assemble the stream deck / macro keypad and solder everything into place.
1. Start by placing the Cherry MX-compatible key switches on the top plate of the 3D-printed case.
2. You will connect the wires as follows, with more details in the steps below.
3. Connect all the top-left pins on the switches together with a single wire and connect it to Pin 36, the 3V3 pin on the Pico.
4. Solder a short wire to each one of the right pins to prep the connections we are going to make to individual GPIO pins.
5. Solder the required wires to the appropriate GPIO pins on the Raspberry Pi Pico.
6. Snap the case together.
Setting up the macro keys
The keys are set up to send Ctrl plus a function key, starting from Button 1 (top left), which sends Ctrl + F7, through Button 6 (bottom right), which sends Ctrl + F12. These mappings can be altered in code.py as needed, but I’m going to show you a few ways to utilize the default mapping with the examples below, for both program shortcuts and OBS.
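For reference, here’s roughly what that default mapping looks like in CircuitPython. This is a simplified sketch rather than Novaspirit’s actual code.py: it assumes six switches wired from the Pico’s 3V3 pin to six GPIO pins (the GP0 through GP5 pin names are placeholders, so swap in whichever pins you soldered to), and it needs the adafruit_hid library copied to the CIRCUITPY volume.

```python
# Minimal macro keypad sketch: six switches, each sending Ctrl + a function key.
import time
import board
import digitalio
import usb_hid
from adafruit_hid.keyboard import Keyboard
from adafruit_hid.keycode import Keycode

# Placeholder pins -- use the GPIO pins your switches are actually wired to.
PINS = (board.GP0, board.GP1, board.GP2, board.GP3, board.GP4, board.GP5)
# Button 1 (top left) = Ctrl+F7 ... Button 6 (bottom right) = Ctrl+F12.
FKEYS = (Keycode.F7, Keycode.F8, Keycode.F9,
         Keycode.F10, Keycode.F11, Keycode.F12)

keyboard = Keyboard(usb_hid.devices)

buttons = []
for pin in PINS:
    button = digitalio.DigitalInOut(pin)
    button.switch_to_input(pull=digitalio.Pull.DOWN)  # switch pulls the pin to 3V3
    buttons.append(button)

while True:
    for button, fkey in zip(buttons, FKEYS):
        if button.value:  # True while the switch connects the pin to 3V3
            keyboard.send(Keycode.CONTROL, fkey)  # press and release the combo
            while button.value:  # crude debounce: wait for the key to be released
                time.sleep(0.01)
    time.sleep(0.01)
```

Because the Pico enumerates as a plain USB keyboard, the host just sees ordinary Ctrl + F7 through Ctrl + F12 presses, which is why no drivers or companion software are needed.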
Setting Up Macro for Program Shortcuts
If you want to use a key on your Raspberry Pi Pico-powered stream deck to launch an app in Windows, here’s how.
1. Right click a shortcut and select “properties.”
2. Select the “Shortcut key” field in the Shortcut tab.
3. Press any of the macro keys and you’ll see its keyboard combo (ex: CTRL + F7 for key 1) appear in the box.
4. Press “OK” and your new macro has been assigned to the key pressed.
Setting Up Macros for OBS
1. Open OBS and navigate to “Settings.”
2. Select the “Hotkeys” setting and scroll down to the scene you want to assign a macro for.
3. Select “Switch to scene” on the scene you want to macro and press the appropriate key on your stream deck to assign it.
4. Press “OK” and the macro keys will be assigned to those scenes.
I’ve read plenty about what it’s like to use, hold, and type on an LG Gram before, but that didn’t take away from the impressive first impression it made when I used the new Gram 17 for the first time — especially this larger model. The Gram 17 has a grandiose 17-inch display, yet it’s only three pounds, which is light enough for me to carry around one-handed. Its keyboard is a joy to type on with a surprising amount of tactility and travel in the keys, and the battery life outlasts a whole day of work and much of a second one, too. It’s a quiet laptop; even under pressure, its fans were never loud enough for me to hear.
The Gram communicates its biggest selling points — lightness and longevity — so effectively that it outshines some persistent minor problems. Those include a keyboard layout that can be difficult to adjust to. For instance, the num lock is too easy to press accidentally, being right next to backspace, and the only function key is located too far away from the most essential function row buttons, making it a stretch to adjust the volume one-handed. Lastly, the large trackpad isn’t always good at palm rejection. These are important things for any laptop to get right, let alone one that has a bunch of extra real estate that should be used to avoid flaws like these.
This new model for 2021 is mostly a spec update, not a design overhaul compared to the 2020 version. But it’s a good update, at that. Inside of LG’s sole $1,799 Gram 17 configuration (it’s been available for $1,699 since late March), there’s now an 11th Gen Intel Core i7 quad-core processor that promises — and actually delivers — better performance and longer battery life than the 2020 model my colleague Monica Chin reviewed. Additionally, this model’s faster 4,266MHz LPDDR4X RAM, of which it has 16GB, likely plays a role in that speed boost. It’s not a drastically different computer to use than before, but it can hold its own more reliably this time around.
While running my usual collection of around 10 tabs in Microsoft Edge for work, with Slack and Spotify running in tandem, performance didn’t stutter at all. This is the bare minimum of competency tests for laptops, so for something more demanding, I exported a five-minute, 33-second test file from our Verge video team through Adobe Premiere Pro. Last year’s model took 30 minutes to do this, but this one gets it done in around 11 minutes. That doesn’t hold a candle to laptops that put more of an emphasis on power usually at the expense of heft, but it’s enough of an improvement to make the Gram 17’s price a little easier to justify. Razer’s Book 13 with the same processor fared just about a minute faster with this test, but the Gram is on par with the latest Dell XPS 13 and Asus ZenBook 14.
The battery is also mystifyingly good — and better than before. With that same batch of apps I mentioned earlier, the Gram 17 lasted an entire workday and well into the next, around 12 or so hours later. If you’re looking for a laptop that can go a full day of work without its charger, whether you have video calls or not, this is one for your shortlist. It features the same 80Wh battery as last year’s model, which is still impressive considering the Gram 17’s lightweight profile.
Also similar to the 2020 version is its USB-C charging, though LG now includes a 65W USB-C power adapter instead of the 48W charger that shipped with the previous model. The laptop recharges more quickly with the included brick (which is no bigger than a compact power bank), but it still takes a few hours to refill completely.
That sums up the biggest changes to this year’s Gram 17. There are a few smaller tweaks I liked, too. The arrangement of ports has been shifted around in a more logical layout. On the left side, there’s a Thunderbolt 4 port (it can be used for charging, data, or connecting to a display), one USB-C 4.0 Gen 3 port, a headphone combo jack, and an HDMI port. Over on the right is where you’ll find two USB Type-A 3.2 Gen 2 ports next to a microSD card slot and a Kensington lock.
If you’re shopping around for 17-inch laptops, you’ll be hard-pressed to find anything lighter than the Gram. It’s 2.98 pounds, which is just a little heavier than the 13-inch MacBook Air. The Dell XPS 17 is one of LG’s main competitors in this space, yet its baseline model weighs over a pound and a half more. If you get an XPS 17 model like the one we reviewed in July 2020, it’ll weigh almost as much as two Gram 17s at 5.53 pounds. That added weight does bring more power and a dedicated GPU in the Dell, but if you just want a big, portable screen for productivity, the Gram is more than capable.
As my colleague Monica Chin mentioned in her review of the 2020 LG Gram 17, this laptop isn’t a looker. It still doesn’t stack up next to the high-end design of the XPS 17, which features an aluminum chassis. The Gram has a tough magnesium alloy-clad body, but it looks and feels plasticky. That said, there’s technically nothing flawed about its design, and it seems better than most black aluminum laptops I’ve tried at resisting fingerprints. Some people might actually prefer that its design doesn’t stick out much, even when its backlit keyboard is on.
Something minor that I wish LG offered with this model is the option for a matte display. It’s rare for ultrabooks to have them, but I find it hard to stay focused on what’s happening on the screen when I can see a reflection of all my apartment’s happenings staring back at me. Wherever you use this laptop, glare could be a big problem, like it can be with a TV. This doesn’t take away from the Gram 17’s display being sharp and vivid. It’s a WQXGA (2560 x 1600) IPS non-touch panel from the company’s own display division, and it makes everything look excellent with 99 percent DCI-P3 color gamut coverage. If a touchscreen is important to you, LG’s Gram 2-in-1 laptops feature them. LG was one of the first Windows laptop makers to move to a 16:10 aspect ratio, and the Gram 17 has one, too. It gives you a little more vertical real estate to work with on the screen compared to 16:9 displays. It’s most beneficial for productivity (you see more info at once, so less scrolling is necessary), but you’ll have black letterboxing for most full-screen videos you watch.
The Gram is short on bloatware, which I love to see. It ships with Amazon’s Alexa built-in, though it requires activation before you can use the service. A few other preinstalled apps include McAfee LiveScan and a suite of creator tools from CyberLink. Compared to some other laptops I’ve used recently, like Acer’s Predator Triton 300 SE, the Gram doesn’t shove pop-up notifications in your face seemingly every time you use it.
There are few 17-inch laptops to choose from and even fewer models that are as lightweight as this one. This year’s LG Gram 17 is unique in the sense that it’s more powerful than ever, but it doesn’t give up its portability. Oddly enough, the only competition it faces at the moment comes from within LG. The 16-inch Gram is lighter and less expensive, yet it features the same design, screen size, port selection, battery capacity, and specs (aside from having significantly less storage) for $1,399. You can find one that has the same 1TB storage as the Gram 17 for $1,599. If the Gram 17’s $1,799 price is too expensive, at least you have an alternative that’ll likely deliver the same great results.
But if the price isn’t an issue and you want a surprisingly portable and powerful laptop with an oversized screen, the Gram 17 is in a class of its own.
Since 1996, spacejam.com has been an internet time capsule like few others still in existence — a largely pristine sample of the early World Wide Web and all the most advanced multimedia offerings available at the time, such as animated GIFs and Windows 95 screensavers. But 25 years later, it’s finally been supplanted; the new sequel Space Jam: A New Legacy, starring LeBron James, has taken over the URL to showcase the new movie’s very first trailer.
Here’s what spacejam.com looks like today:
But before you go boycotting the sequel, you should probably know that the original Space Jam website isn’t dead yet. In fact, it’s just one click away at spacejam.com/1996, and the new website lets you click that original Space Jam logo (in the upper-right-hand corner) to go back in time again.
Here’s that new trailer for the LeBron James version — featuring both a cartoon LeBron James, and upscaled CGI versions of the Looney Tunes crew as well. Looks like they’ll be breaking into the real world once again!
Will LeBron go Googling for the original Space Jam website in the film (or, perhaps, will Daffy or Bugs remark that it’s still there)? I think it’s a safe bet. The movie arrives on July 16th in theaters and on HBO Max simultaneously.
For some very awkward reason, Intel has not posted a version of its Xe-LP graphics driver for its Rocket Lake processors on its website. The drivers are also not available from Intel’s motherboard partners, which causes quite a bit of confusion, as this essentially renders the new Rocket Lake’s Intel UHD Graphics, featuring the Xe-LP architecture, useless. However, there is a workaround for those who need it.
Intel’s main angle with its Rocket Lake processors for desktops is gaming, which is why it promotes its Core i9-11900K and Core i7-11700K CPUs with a 125W TDP. Such systems rarely use integrated graphics, so enthusiasts do not really care about the availability of Xe-LP drivers for their chips. But the Rocket Lake family also includes multiple 65W and 35W processors that are designed for mainstream or low-power machines that use integrated GPUs.
For some reason, there are no drivers for Rocket Lake’s Intel UHD Graphics integrated GPU based on the Xe-LP architecture on Intel’s website, as noticed by AdoredTV. Intel’s motherboard partners do offer different versions of Intel’s Graphics Driver (which adds to the confusion) released in 2021, but none of them officially supports Rocket Lake’s integrated graphics, according to VideoCardz.
The absence of the Xe-LP drivers for Rocket Lake processors from official websites is hardly a big problem, as there is an easy workaround. Instead of using the automatic driver installation wizard that comes in .exe format, you can download a .zip file with the same driver (version 27.20.100.9316 at press time), then install it using Windows 10’s Update Driver feature with the Have Disk option and hand-pick the Intel Iris Xe Graphics entry.
Since Rocket Lake’s integrated GPU is based on the same architecture as Tiger Lake’s GPU, the graphics core will work just as it should. This option will work for experienced DIYers, but it might be tricky for an average user.
Unlike do-it-yourselfers, OEMs and PC makers will not use a workaround as the latest official driver has never been validated for the Rocket Lake family. Fortunately, at least some OEMs have access to ‘official’ Rocket Lake graphics drivers.
“We have drivers flowing to OEMs for a while now,” said Lisa Pierce, Intel vice president and director of graphics software engineering, in a Twitter post on April 2. “The delay was in a public posting with our unified graphics driver flow and we will work it post ASAP.”
She did not elaborate on when exactly the driver will be posted to Intel.com and whether it needs to pass WHQL validation before that. Meanwhile, on April 1 Pierce said that Rocket Lake drivers were several weeks away.
Microsoft has updated its events website to reveal the dates of Microsoft Build 2021. Per the new listing, the annual conference will take place from May 25th to 27th this year. It will also be held virtually, like many of Microsoft’s 2021 events.
A Microsoft spokesperson confirmed the dates to The Verge. “Please stay tuned for more information to come,” the spokesperson said.
“Microsoft Build is where developers, architects, start-ups, and students learn, connect, and code together, sharing knowledge and expanding their skillset, while exploring new ways of innovating for tomorrow,” the website reads.
Microsoft’s Build-specific website still includes information about Build 2020 and does not appear to have been updated with the new dates yet.
Microsoft Build 2020 was also held remotely. The 48-hour event was free of charge; workshops and keynotes were live-streamed. Build is primarily targeted at developers and is often Microsoft’s opportunity to showcase upcoming changes to Windows, Office, Edge, and other services. At last year’s conference, the company exhibited its Fluid Framework, PowerToys Run launcher, and Project Reunion, among other new products.
The announcement comes as daily reported COVID-19 cases are still trending upward in the US. About 30 percent of the US population has received at least one dose of a COVID-19 vaccine so far, but the Centers for Disease Control and Prevention is still recommending that all people avoid medium and large gatherings, regardless of vaccine status.
“The safety of our community is a top priority,” a Microsoft spokesperson told The Verge in a statement regarding last year’s announcement. “We look forward to bringing together our ecosystem of developers in this new virtual format to learn, connect and code together.”
Through its GeForce 465 driver update, NVIDIA formally introduced the PCI-Express Resizable BAR feature to its GeForce RTX 30-series “Ampere” graphics cards. This feature was invented by PCI-SIG, custodians of the PCI-Express bus, but only became relevant for the PC when AMD decided to re-brand it as “AMD Smart Access Memory” (which we separately reviewed here) and introduce it with the Radeon RX 6000 series RDNA2 graphics cards. That’s probably when NVIDIA realized they too could implement the feature to gain additional performance for GeForce.
How Resizable BAR Works
Until now, your CPU could only see your graphics card’s memory through 256 MB apertures (that’s 256 MB at a time). Imagine you’re in a dark room with a tiny pocket flashlight that can only illuminate a small part of a page of a book. You can still read the whole page, but you’ll have to move the flashlight to wherever you’re looking. Resizable BAR is the equivalent of illuminating the whole room with a lamp.
This becomes even more important if you consider that with modern APIs, multiple CPU-to-GPU memory transfers can be active at the same time. With only a single, small aperture, these transfers have to be executed in sequence—if the whole VRAM is mapped, they can operate in parallel. Going back to our reading in the dark example, we now assume that there are multiple people trying to read a book, but they only have one flashlight. Everyone has to wait their turn, illuminate the book, read a bit of text and then pass the flashlight on to the next person. With Resizable BAR enabled, everybody can read the book at the same time.
The 256 MB size of the aperture is arbitrary and dates back to the 32-bit era when address space was at a premium. Even with the transition to x86-64, the limit stayed as newer 3D graphics APIs such as DirectX 11 relied less on mirroring data between the system memory and the video memory. Perhaps the main reason nobody bothered to implement Resizable BAR until now was that modern GPUs come with such enormous video memory bandwidths that the act of reading memory through apertures had minimal performance impact, and it’s only now that both NVIDIA and AMD feel the number-crunching power of their GPUs has far outpaced their memory bandwidth requirements.
To use Resizable BAR, a handful of conditions must be fulfilled. For starters, you need a modern processor that supports it. In the AMD camp, Ryzen 3000 “Zen 2” and Ryzen 5000 “Zen 3” processors support it. In the Intel camp, hardware support technically dates back to the 4th Gen “Haswell,” but most motherboard vendors for whatever reason restricted their Resizable BAR-enabling BIOS updates to the 300-series chipset, or 8th Gen “Coffee Lake” (and later) architectures, along with X299, or 7th Gen “Skylake-X” HEDT (and later). You’ll also need a compatible graphics card—NVIDIA RTX 30-series or AMD RX 6000 series. Lastly, your PC must boot in UEFI mode with CSM disabled for UEFI GOP support. With these conditions met, you’ll need to enable Resizable BAR in your motherboard’s UEFI setup program.
There are multiple methods to check if Resizable BAR is enabled. The easiest is to use GPU-Z, which now shows the Resizable BAR status on its main screen. The other options are using NVIDIA’s Control Panel and Windows Device Manager.
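If you’d rather check from a script, the nvidia-smi utility that ships with NVIDIA’s drivers also reports the BAR1 aperture size alongside the frame buffer size. Below is a rough Python sketch of that comparison; the nvidia-smi output format can vary between driver versions, so treat the parsing as illustrative rather than definitive.

```python
# Compare the GPU's frame buffer (VRAM) size with its BAR1 aperture size.
# With Resizable BAR active, BAR1 should span roughly the whole frame buffer;
# without it, BAR1 is typically a small (e.g. 256 MiB) window.
import re
import subprocess

output = subprocess.run(
    ["nvidia-smi", "-q", "-d", "MEMORY"],
    capture_output=True, text=True, check=True,
).stdout

# "Total" appears once in the FB Memory Usage section and once under BAR1.
totals = [int(x) for x in re.findall(r"Total\s+:\s+(\d+)\s+MiB", output)]
if len(totals) >= 2:
    fb_mib, bar1_mib = totals[0], totals[1]
    print(f"VRAM: {fb_mib} MiB, BAR1 aperture: {bar1_mib} MiB")
    print("Resizable BAR looks", "enabled" if bar1_mib >= fb_mib else "disabled")
else:
    print("Unexpected nvidia-smi output format; check GPU-Z instead.")
```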
In this review, we will be testing four NVIDIA GeForce RTX 30-series Ampere models—RTX 3090, RTX 3080, RTX 3070, and RTX 3060 Ti, all Founders Edition cards. Each of these will have Resizable BAR enabled and disabled, across our entire test-suite of 22 games with a rich diversity of game engines and APIs.
Microsoft has shut down its Cortana app for iOS and Android. It’s the latest in a series of moves to end support for Cortana across multiple devices, including Microsoft’s own Surface Headphones. The Cortana app for iOS and Android is no longer supported, and Microsoft has removed it from both the App Store and Google’s Play Store.
“As we announced in July, we will soon be ending support for the Cortana app on Android and iOS, as Cortana continues its evolution as a productivity assistant,” reads a Microsoft support note spotted by MacRumors. “As of March 31, 2021, the Cortana content you created–such as reminders and lists–will no longer function in the Cortana mobile app, but can still be accessed through Cortana in Windows.” Cortana reminders, lists, and tasks are now available in the Microsoft To Do app instead.
Microsoft first launched Cortana for iOS and Android in December 2015. The app was originally designed to connect Windows 10 PCs and mobile phones, but failed to gain traction despite a big redesign. Microsoft CEO Satya Nadella recognized the company’s difficulties in competing with other digital assistants a couple of years ago, revealing that Microsoft no longer saw Cortana as a competitor to Alexa and Google Assistant.
Microsoft had once envisioned a future full of Cortana-powered fridges, toasters, and thermostats. That dream came to an end earlier this month when the first and only Cortana-powered speaker dropped Microsoft’s digital assistant. Other devices like the Cortana-powered GLAS thermostat are also no longer powered by Microsoft’s digital assistant.
Cortana isn’t completely finished, though. Microsoft still sees value in conversational AI and the company is trying to reposition Cortana as an assistant that can improve Microsoft’s enterprise-focused offerings.
Discord is the latest company to introduce a Clubhouse-like feature that lets people easily broadcast live audio conversations to a room of virtual listeners. Discord says its take, called Stage Channels, is available now on all platforms where Discord is available, including Windows, macOS, Linux, iOS, Android, and the web.
If you’ve used Discord before, you might know that the app already offers voice channels, which typically allow everyone in them to talk freely. A Stage Channel, on the other hand, is designed to only let certain people talk at once to a group of listeners, which could make them useful for more structured events like community town halls or AMAs. However, only Community servers, which have some more powerful community management tools than a server you might share with a few of your buddies, can make the new Stage Channels.
The feature’s broad availability makes Discord the first app to offer an easy way to host or listen in on social audio rooms on most platforms. Clubhouse is still only available on iOS, though an Android version is in development. Twitter’s Spaces feature works on iOS and Android, but only some users have the ability to make audio rooms right now. (The company plans to let anyone host a Space starting in April.) LinkedIn, Mark Cuban, Slack, and Spotify are also working on live audio features, and Facebook reportedly has one in the works, too.
At the top of this post, you can see what a Discord Stage Channel looks like on desktop, and here’s what one looks like on mobile:
I got to participate in a Stage Channel to be briefed on the feature, and it was quite similar to using Clubhouse or Twitter Spaces. When I joined the Stage Channel, I was automatically put on mute and listed as an audience member. I could see who was speaking and who else was with me in the virtual crowd.
When I wanted to ask questions, I pressed a button to request to speak, and a Stage moderator brought me “on stage” so I could talk. Stage moderators can also mute speakers or even remove them from the room if they are being disruptive.
Microsoft is bringing 16 original Xbox and Xbox 360 games to the company’s Xbox Cloud Gaming (xCloud) service today. Titles like Elder Scrolls III: Morrowind, the original Banjo-Kazooie, and Fallout: New Vegas are all now available to stream to Android devices. Microsoft is also enabling touch controls for Jetpac Refuelled, Viva Piñata, and Viva Piñata TIP.
The 16 games will be available to Xbox Game Pass Ultimate members and streamable to Android phones and tablets. “We’ve listened to the feedback, going all the way back to our earliest cloud gaming preview, and making games from previous generations available on mobile devices has been one of the most requested features by the community,” says Microsoft.
Microsoft is also still planning to extend xCloud game streaming to iOS this spring. The software maker hasn’t revealed exactly when its iOS preview will go live, but Microsoft has previously promised both iOS and web browser streaming for xCloud in spring 2021. Game streaming will be available in the Xbox app for Windows 10 PCs, too.
We got an early look at how xCloud will work on Windows 10 last month, alongside an exclusive first look at the web version of the service. Microsoft is also testing 1080p streams for xCloud, and this will likely debut with the Windows version of the app.
For now, here are the original Xbox and Xbox 360 games making their way to Xbox Cloud Gaming today:
(Pocket-lint) – The LG Gram 16 is never going to make sense to some people. For many, a large-screen laptop has to be a super-powered desktop-replacer. And if it’s not, why does it exist?
LG’s Gram series has quietly challenged that view for the last few years. And the LG Gram 16 should make this concept less of a leap for those still struggling.
The pitch: the LG Gram 16 costs around a grand less than the MacBook Pro 16, but still has a big screen, a colour-rich display and long battery life. Oh, and it weighs 800g less and has a better keyboard, for some tastes at least.
Suddenly LG’s weirdo huge-but-light Gram laptops don’t sound so strange. Indeed, this 16-inch version is quite the stunning proposition.
Design
Dimensions: 313.4 x 215.2 x 16.8mm / Weight: 1.19kg
Magnesium alloy casing
Interested now? Let’s start by slapping the LG Gram 16 down to earth with one of the big issues you need to accept.
While the LG Gram 16 is a nicely made laptop, it doesn’t feel like a four-figure slab of the future when you pick it up. Carry it around like a notepad, give it a light squeeze between thumb and finger, and the base and lid panels will flex a bit.
LG has not made the Gram 16 on a shoestring budget. But large, low-weight body panels come with compromises. And you feel them each time you pick the laptop up like this.
The LG Gram 16’s casing is magnesium alloy, which is the best material for the job. It’s lighter than aluminium for the same level of strength, and a lot nicer than plastic. Just don’t expect the dense unibody feel of plenty of 13-inch laptops at this price level.
The issue is all about feel, not utility. The LG Gram 16’s touchpad doesn’t stop clicking because you lift it by one corner of the base. You can’t stop the internal fans spinning by pressing down on part of the keyboard surround. And, yes, we’ve seen these issues in laptops smaller and heavier than the LG Gram 16.
Its keyboard panel, the most important of the lot, is pretty rigid – if not immaculately so. A little outer panel flex is only a big issue if you think it is.
Despite being a new entry in this series, the LG Gram 16 nicks its style from its siblings. This is a very plain, serious-looking laptop that isn’t out to dazzle eyes or fingers with flashy finishes. All panels are matte black with a very light texture similar to a soft-touch finish.
There’s a kind of confidence to a lifestyle laptop this plain, one that weaves a style out of sharp-cornered keyboard keys and a semi-distinctive font. If anything, LG could actually go plainer on this key typeface, which looks a little close to that of a gaming laptop.
But the aim is pretty clear: the LG Gram 16 is a laptop that can fit in just about anywhere. You can take it anywhere too, as the 1.19kg weight is lower than that of the average 13-inch portable.
The footprint isn’t tiny, of course, but it couldn’t get all that much smaller considering the 16-inch display has fairly small borders on all four sides.
Display
16.0-inch LCD panel, 2560 x 1600 resolution
99.1% DCI P3 colour coverage (as tested)
Glossy plastic finish
The LG Gram 16’s screen also helps keep the shape sensible, as this is a 16:10 aspect display, one taller than the standard widescreen style. This maxes out the perception of space when you use apps, rather than video. There’s no issue with the quality of the panel either.
Colour depth is truly excellent, matching what you get in a top MacBook Pro. Brightness is strong enough for outdoors use, which is pretty impressive considering the sheer square inch count the LG Gram 16 has to light up.
Contrast is not the best around, but is still good for an LCD-based screen. And the resolution is, well, the one LG should have chosen. It’s 2560 x 1600 pixels, sitting above Full HD but a good way below 4K.
The MacBook Pro 16 has a sharper screen still, at 3072 x 1920 pixels. But the LG Gram 16 still adds the crucial pixel density it needs to avoid the obvious pixellation that can happen in a larger display like this.
If you use a 13-inch laptop at the moment, that supersize boost is the first thing you’ll notice. The LG Gram 16 makes it seem so much more like you’re using a monitor that happens to be hooked onto a laptop, rather than a laptop screen. That’s great for dull work apps, even for games.
However, the actual character of the screen doesn’t quite make the most of the top-quality panel underneath, because of another concession made for size: its plastic screen coating. Plastic is often used in matte finish laptops, to scatter reflections. But this is a glossy screen, telling us weight is the issue here. Glass is the usual choice, but glass isn’t that light.
The plastic film is also far less rigid than glass, causing reflections to distort at the corners a little. And if there’s meant to be a reflection-busting coating here, it’s not a very good one. There’s also no touchscreen, and the hinge only folds back to around 130 degrees, to stop the thing tumbling off your knees through weight imbalance.
Like the flexy lid and bottom panels, the plastic surface is one you’ll have to suck up for the sake of low weight. But does the LG Gram 16 have a high quality screen with plenty of space that you can use outdoors? Absolutely.
Keyboard & Touchpad
Textured glass touchpad
Two-level backlight
1.65mm key travel
The LG Gram 16’s keyboard fills out the appeal of this laptop for us. We type all day, every day, more or less. Keyboard quality matters, and this is a keyboard made for that sort of work.
Key travel is excellent, and not just if you limit your comparisons to ultra-light laptops. The keyboard plate feels rigid, even if – sure – you can get it to flex slightly under significant finger pressure. And springy resistance offers good feedback with each depress.
We also like that LG has thinned-down the NUM pad, which lets the main set of keys sit more towards the centre of the laptop. Being shunted too far to the left rarely feels good. Here there’s just a mild lurch leftwards. Think universal healthcare, not a state-led redistribution of all wealth.
The LG Gram 16 also has a two-level backlight and a fingerprint scanner hidden in the power button, just above the NUM pad.
Plenty of space in the keyboard plate leaves plenty of room for a giant touchpad. This thing is huge – and you probably can’t appreciate it from photos alone, where it seems in proportion with the rest.
The LG Gram 16’s touchpad has a smooth glass surface, zero floaty wobble, and an easy-to-depress yet well-defined clicker. It’s on the loud side, but that’s it for negative points to note.
A larger laptop opens the doors to a different approach to the keyboard. But apparently it doesn’t allow for a better webcam. The LG Gram 16 has the same sort of stodgy 720p video call camera we see in most other high-end laptops.
Its speakers aren’t even close to those of the MacBook Pro 16 either. LG uses familiar-sounding drivers with just the tiniest hint of low-frequency output and only moderate max volume. Their tone is pleasant, we could watch a movie using them happily enough, but it would be good to see LG improve this area in future generations.
The main grilles for the treble drivers also sit on the underside, giving them just a couple of millimetres of clearance provided by the tiny rubber feet. Put the LG Gram 16 on a thick carpet or your bed and the treble is attenuated, although it does seem impossible to block the sound fully, which is good.
Performance
Intel i7-1165G7 processor
16GB LPDDR4X RAM
1TB NVMe SSD
The LG Gram 16 is an Intel Evo laptop. This is a new standard introduced by Intel to make laptops with its processors seem more attractive than those with AMD or Apple CPUs. It’s marketing, but not useless marketing, as it means you know you get standards like Thunderbolt 4, an 11th Gen processor, and at least nine hours of battery life (if the screen is a 1080p one).
Our Gram 16 has Intel’s Core i7-1165G7 CPU, 16GB RAM, and 1TB of very fast SK Hynix SSD storage.
Some CPU overclockers who design their own water cooling systems will disagree, but we think this is enough to make the LG Gram 16 a viable desktop replacement for the vast majority of people.
Windows 10 feels great, and there’s more than enough power to run apps like Photoshop well. So why would you buy a MacBook Pro 16 with a more power-hungry 9th Gen CPU? Or a much heavier Windows laptop with an Intel Core i7-10750H?
The top-end Mac has around 60 per cent additional CPU power, in part because it has an “i9” equivalent processor. A Core i7 alternative made for the more traditional desktop-replacing laptop offers around 20 per cent more power, and these processors are designed to hold max power for longer. Because chunky laptops tend to have fans that can shift more air.
But if you’re not sure if the LG Gram 16 has enough power or not, and you don’t use apps that make your current laptop slow down during exports, imports – whatever procedures they do – then it probably does have enough to satisfy.
The LG Gram 16 also gets Intel’s Xe graphics, which is a fantastic addition for a laptop like this. It turns slim laptops from poor gaming machines to at least acceptable ones. GTA V? No problem. The Witcher 3? Sure, even at 1080p if you play with the settings a bit. Alien Isolation runs well at just below Full HD resolution with a mix of Medium and High settings.
Absolutely loads of stuff is playable with Intel Xe graphics, because it gets to the level of separate entry-level gaming hardware from the last generation. And that’s not too shabby: it’s gaming skills you seem to get ‘for free’. If you buy an LG Gram 16 and find games don’t run as well as you hoped, make sure to try them at different resolutions. Intel Xe graphics chips may have a bit of punch to them, but 2560 x 1600 pixels is a bit much to ask in most console-grade titles.
There’s more good news. The LG Gram 16 is almost silent under all workloads, even if you max out the CPU for half an hour. There is a fan, but it’s barely audible if you play something through the speakers even at 30 per cent volume. This is probably the quietest laptop we’ve reviewed with one of these 11th Gen Core i7 processors. It’s another benefit of all that extra room inside: better airflow.
Battery Life
80Wh battery – up to 22 hour battery life (claimed)
65W charger
LG doesn’t sacrifice battery life for low weight either. More brownie points for LG’s engineers. The Gram 16 has an 80Wh battery, far larger than the 56Wh standard battery of the Dell XPS 15, if smaller than the more power-hungry (and powerful) 100Wh MacBook Pro 16.
Match that sort of capacity with a processor already fairly light on the battery drain and you are guaranteed good results. The LG Gram 16 lasts roughly 14 hours 30 minutes when streaming video at moderate screen brightness.
LG claims 22 hours, but this is one of those cheeky claims that involves using a benchmark from 2014 – and letting it sit in standby mode half the time.
Still, it’s excellent real-world stamina for light work, and way above the nine hours the Intel Evo sticker guarantees. That guarantee only applies to a lower screen resolution than you get here too.
Use it with the display maxed and the CPU pushed to its limits the whole time and the LG Gram 16 will last around three hours and 25 minutes. Which still isn’t bad – a gaming laptop wouldn’t give you a third of that.
Want to know about the LG Gram 16’s connections? There are two Thunderbolt USB-C ports, and one is taken up by the charger while plugged-in. You get two classic USB ports, a microSD slot, a full-size HDMI, and a headphone jack too. So it doesn’t demand you keep a USB adapter handy, and you can plug it right into your TV or a monitor. Bliss.
Verdict
It’s a wonder the LG Gram concept hasn’t been nicked more times already. The LG Gram 16 is a large-screen laptop that’s genuinely light enough to carry with you everywhere, every day.
There are barely any substantive compromises involved. The LG Gram 16 is as powerful as smaller laptops that weigh more, it lasts as long off a charge as some of the best Intel-powered laptops, and the keyboard is no lightweight either.
You don’t get the ultra-dense metallic feel of some of the smaller-screened alternatives at a similar price. And, sure, the Gram 16 uses a low voltage processor designed to minimise heat and save battery life, not for blistering power. However, it has enough of it to work perfectly as a desktop-replacer for most people.
Sure, a 13-inch laptop is better for some. A more powerful, thicker one will be better for others. But the LG Gram 16 takes some elements from both and, through clever design, makes it work far better than you’d imagine. For the right user it’s a stunning proposition.
Also consider
LG Gram 17
Want something even bigger? LG has made a 17-inch Gram for a few years now. The appeal is the same: low weight, good screen, good keyboard. Battery life is slightly shorter as it has a bigger screen and the same battery capacity. But the choice is all about the screen size you’d prefer. We think 16-inch is a more accommodating size for the masses.
MacBook Pro 16
The 16-inch MacBook Pro isn’t really in the same category if you look right up close. It has a more powerful processor and weighs about 800g extra. Oh, and it costs a grand more. Ouch. However, the MacBook seems a more expensive laptop as it has that amazing Apple build, which feels like perfection. The glossy glass screen finish looks better too, making the most of its similarly brilliant colour depth.
Yesterday, AMD released a new Adrenalin driver to the public, version 21.3.2, with support for several new titles including Dirt 5, along with several bug fixes. Specifically, driver 21.3.2 adds support for Dirt 5‘s new DirectX Raytracing (DXR) update.
Dirt 5 originally launched late last year, and Codemasters worked with AMD on the title. Not long after launch, AMD provided the press with early access to a beta DXR branch of the game, with the promise that DXR support would eventually get rolled into the public build. It took longer than expected, but with the latest update you can now try Dirt 5‘s ray tracing feature on AMD’s current RX 6000 series GPUs. (It also works with Nvidia RTX GPUs.) We’re planning a more extensive look at the state of ray tracing in games in the coming weeks, both to see how much DXR and ray tracing impact performance, as well as how much ray tracing improves the look of various games.
AMD added support for the new Outriders RPG and Evil Genius 2: World Domination as well. There’s no indication of major performance improvements or bug fixes for those games, but the latest drivers are game ready.
Bug Fixes
Besides the above, here are the five bugs squashed in this update:
The Radeon RX 6700 will no longer report incorrect clock values in AMD’s software.
Shadow corruption is fixed in Insurgency: Sandstorm when running on RX 6000 series hardware.
There is no longer an issue where the desktop resolution in Windows may change when turning a monitor off then back on again.
The start and cancel buttons should no longer disappear when resizing the Radeon Software.
You should no longer get a black screen when enabling Radeon FreeSync and setting a game to borderless fullscreen/windowed mode on RX 6000 series GPUs.
Nvidia just released a new Game Ready Driver, version 465.89, earlier this morning. This driver update comes with several new features, plus an assortment of bug fixes and more game support.
The highlight of the new driver is the addition of Resizable BAR support on all desktop RTX 30 series products. Before, this technology was limited to just the RTX 3060 12GB, but now it has filtered down to all Ampere products. (Fingers crossed that it will also land on RTX 20 series products in the future, but that might be hoping for too much.)
Resizable BAR is a feature built into the PCIe protocol that allows the CPU to access all of a discrete GPU’s frame buffer or VRAM, whereas before, CPUs were limited to accessing small chunks (256MB) of VRAM at a time. This optimization can improve frame rates in some games by up to 10%, but currently game support is limited, as the tech has only recently been adopted by Nvidia, AMD, and Intel.
Another feature added in this driver is ‘Windows Virtual Machine Beta’ support on all of Nvidia’s currently supported GPUs. Simply put, this feature allows your physical Nvidia GPU to interface directly with a virtual machine as if the card were directly connected to the VM. This allows for full GPU acceleration in a VM environment, letting you do things like run older video games in, say, a Windows 7 virtual machine if a game can’t run on a native Windows 10 device.
Game Ready
Also coming in this new Game Ready driver are a bunch of updates to game support. Nvidia notes DLSS support for the upcoming Outriders RPG, and an update to Rainbow Six Siege that adds Nvidia’s Reflex technology to the game.
There’s also support for Dirt 5‘s new ray-tracing update, support for Evil Genius 2: World Domination and Kingdom Hearts.
Bug Fixes
Here are all the additional bug fixes addressed in this driver update. For the rest of the updates pertaining to this driver, check out Nvidia’s blog post here.
Pixelated Smoke in Rainbow Six Siege running the Vulkan API.
A game crash on RTX 30 series when playing X4 Foundations with the Vulkan API.
Blue screen crashes when pairing Samsung’s Odyssey G9 with a HDMI TV.
Another blue screen crash on the RTX 2060 specifically, when gaming and watching YouTube simultaneously.
Random green display corruption in Sunset Overdrive when enabling depth of field.
Realtek’s DisplayPort to HDMI 2.1 protocol converter clock limited to 600MHz.
GPU power consumption may be high in idle conditions when using some high refresh rate G-SYNC monitors.
Screenshots taken with GeForce Experience are washed out when HDR is enabled.