The Raspberry Pi Pico’s RP2040 SoC is turning up in the most unlikely of places, one of which is a redesigned mechanical keyboard from Pimoroni. Keybow 2040 is the latest version of Pimoroni’s Keybow keyboard add-ons, and it is the first to feature the RP2040 SoC as the brains of the project. We reached out to Pimoroni, and we have all the confirmed details on this most unusual RP2040 board.
It’s so beautiful, I think I might cry, @pimoroni 🥲 #Keybow2040 #RP2040 pic.twitter.com/V3igoXviwr — February 8, 2021
Pimoroni’s Keybow has been around for some time. It was developed to use a Raspberry Pi Zero as a simple USB gadget that would emulate keyboard keys or run macros. Keybow 2040 has the new “Pi Silicon” SoC at the heart of the board: not a Raspberry Pi Pico, but the RP2040 SoC itself, embedded directly on the custom-designed board and used to emulate a USB HID device.
It sports a 4 x 4 matrix of Cherry MX-compatible mechanical keys, all of which have an RGB LED underneath controlled by an IS31FL3731 PWM LED matrix driver. Of course, each of these LEDs can be independently controlled to produce colorful animations or to indicate function / status of a key.
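To give a sense of what independent per-key control makes possible, here is a minimal, hypothetical sketch (plain Python, no hardware required) that computes one frame of a rainbow animation for a 4 x 4 key matrix — the kind of per-key color data a driver like the IS31FL3731 would be fed on real hardware. The `rainbow_frame` function and its layout are our own illustration, not Pimoroni’s API.

```python
import colorsys

def rainbow_frame(rows=4, cols=4, offset=0.0):
    """Compute one frame of a rainbow animation for a rows x cols key matrix.

    Each key gets a hue based on its position, shifted by `offset` so that
    successive frames animate. Returns a list of (r, g, b) tuples, 0-255.
    """
    frame = []
    for row in range(rows):
        for col in range(cols):
            hue = ((row * cols + col) / (rows * cols) + offset) % 1.0
            r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
            frame.append((int(r * 255), int(g * 255), int(b * 255)))
    return frame

frame = rainbow_frame()
print(len(frame))   # 16 keys
print(frame[0])     # first key at offset 0.0 is pure red: (255, 0, 0)
```

Calling `rainbow_frame(offset=t)` with a slowly increasing `t` each tick would scroll the rainbow across the keys.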
Image 1 of 3
Image 2 of 3
Image 3 of 3
Power and data connectivity are handled via a USB-C connector. At launch there will be MicroPython and C/C++ libraries with examples that can be tweaked to serve your needs. Pimoroni aims to have keyboard emulation ready at launch, which would introduce the feature to the Raspberry Pi Pico’s own version of MicroPython. At present, keyboard emulation on the Pico is only possible with C or with CircuitPython from Adafruit.
The anticipated release date is the week commencing February 22, but this depends on Pimoroni receiving its allocation of RP2040 SoCs. The price looks to be around £50, or about $68. Tom’s Hardware will shortly be receiving its review unit, and it will be added to our everything-we-know page for the Raspberry Pi Pico.
(Pocket-lint) – In the recent past there was a moment when Huawei – then the champion Chinese export – looked poised to strike as the next big brand (sure, it’s still huge, just less global right now). But the tables turned fast when it was locked out of Google services – largely down to tumultuous political wars – which left the door somewhat open in Europe.
Pushing its foot through that gap with keen assertion is Xiaomi. No, this other Chinese tech mega-company is no stranger to phone-making – having revealed some of the earliest near bezel-free devices to the market – but it’s now at a point, in design terms at least, where it sits at the very cutting edge.
The Mi 11, complete with its curved screen design and fresh take on cameras, is a visual delight (to us it somewhat echoes Huawei’s P40 Pro, hence the comparison).
But the Mi 11 is also the first phone to ever deploy Qualcomm’s Snapdragon 888 top-tier processor, showing Xiaomi is a step ahead in the hardware stakes too. So are we looking at the next big brand that’s about to blow up?
Design & Display
Dimensions: 164.3 x 74.6 x 8.06mm / Weight: 196g
Frosted glass finishes: Midnight Gray, Horizon Blue
Display: 6.81-inch AMOLED quad-curved panel
Resolution: 3200 x 1440 (WQHD+)
Refresh rate: adaptive 120Hz
In-screen fingerprint scanner
Sound by Harman Kardon
“It’s just a phone”, right? Sure, there’s only so far you can push the mold when it comes to creating a rectangular interactive screen, but the Mi 11 is refined at every turn.
Whether it’s the subtly curved glass edges, the soft gradient and light-catching properties of the frosted glass rear, or the deftly cut punch-hole camera to the front (it’s way neater than most others), the Mi 11 is poised to perfection; a chiselled model on a flagship phone catwalk.
Wrapped into that design is a lot of top quality specification too. The screen, a 6.81-inch AMOLED panel, is large but proportioned so it’s not ridiculous for thumb-reach across it (the aspect ratio is 20:9). The always-on panel can glow to those subtle curved edges as a not-too-intrusive alert mechanism, too, which looks wonderful.
Stadia’s troubles, Garmin Instinct Solar review and more – Pocket-lint Podcast 89
By Rik Henderson
·
As is often the case with OLED panel balancing, however, when the screen is dimmed it crushes the blacks somewhat. Not nearly as badly as, say, the Oppo Find X2 Pro, but it’s definitely there. And, to some degree, the software seems a bit too keen to push the brightness down a notch – one of a number of quirks to Xiaomi’s MIUI software (here reviewed as 12.0.1, but 12.5 is expected very soon – and that could largely change things up).
The screen’s spec doesn’t stop reaching for the stars there either. It’s got a 2K resolution, with an adaptive 120Hz refresh rate to help smooth out animations and gameplay (oddly the refresh rate page in the settings calls 120Hz ‘Medium’ compared to 60Hz’s ‘Standard’, with no ‘High’ option – it’s not very well termed).
As ever with refresh rate, it impacts battery life, as does the resolution chosen. But the Mi 11 can run WQHD+ (that’s 3200 x 1440 pixels) at the 120Hz rate – which is as good as things get at this moment in time. There’s also FHD+ (2400 x 1080) and automatic switching options to help save that battery life all the more.
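A rough, back-of-envelope way to see why those settings matter for battery life: the pixel throughput the display pipeline must sustain scales with resolution times refresh rate. A quick sketch (our own arithmetic, not Xiaomi’s figures):

```python
def pixel_throughput(width, height, hz):
    """Pixels the display pipeline must push per second."""
    return width * height * hz

wqhd_120 = pixel_throughput(3200, 1440, 120)  # maximum-quality setting
fhd_60   = pixel_throughput(2400, 1080, 60)   # battery-saver combination

print(wqhd_120)           # 552960000 pixels per second
print(wqhd_120 / fhd_60)  # roughly 3.6x the work
```

So running WQHD+ at 120Hz asks the display pipeline to move roughly three and a half times the pixel data of FHD+ at 60Hz, which is why the automatic switching options exist.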
The high-spec screen is matched with high-spec innards too. As the first device to sport the Qualcomm Snapdragon 888 platform – here with 8GB RAM, there’s supposedly a 12GB option that we don’t anticipate for global markets – it puts heaps of core power at your fingertips.
And a fair bit of heat to match – a powerful processor, even a 5nm platform such as the SD888, can’t exactly run cool, so expect some hand-warming (which, given it’s snowing at the time of writing here in the UK, hasn’t been a point of complaint).
So while the setup will run your favourite games and apps at their best – enhanced by the available refresh rate and resolution of that screen – it will of course impact battery life. Inside the Mi 11 there’s a 4,600mAh cell which is reasonable enough – and also sports 55W fast-charging and 50W wireless charging – but will drain faster than your average on account of the high-end specification.
But we’re not talking a problematic level: we ran from 9am to 1am, including four hours of gaming off and on, and those 16 hours of use took the battery into its final 15 per cent. So even with pretty solid usage – the above meant nearly seven hours of screen time – the Mi 11 will get you through the day.
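Those figures imply an average drain rate you can sanity-check with simple arithmetic (a rough estimate from our own usage numbers; real draw varies with screen-on time and load):

```python
battery_mah = 4600    # Mi 11 cell capacity
used_fraction = 0.85  # from 100% down to the final 15%
hours = 16            # 9am to 1am

drain_mah = battery_mah * used_fraction
print(round(drain_mah))          # about 3910 mAh consumed
print(round(drain_mah / hours))  # about 244 mAh per hour on average
```

At that average rate the cell would last just under 19 hours, which squares with the day ending inside the final 15 per cent.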
Part of the reason for this is the fairly high impact of Xiaomi’s MIUI software. There are lots of options, a number of alerts to suggest limiting certain functions to retain battery, and a lot of per-app permissions that you’ll need to tinker with to ensure everything runs as you please.
When we reviewed the Redmi Note 9T, which was initially running the same MIUI 12.0.1 software as this review Mi 11, we hit walls and walls of problems. The Mi 11, however, hasn’t suffered the same – being far more stable. That said, we’ve found some off-and-on issues with notifications not being immediate at all times – likely a tucked-away battery-saving technique? – and other little quirks.
Now, the Mi 11 is expected to launch globally with MIUI 12.5, which should bring an updated and fresher approach. How much difference this will make and what tweaks it will bring is yet to be seen. But as we find the tinkering nature of MIUI to be the biggest hurdle of this device, seeing the software advance to a more natural, usable state would be great to see.
On the cameras front the Mi 11 takes a different, rather refreshing approach: yes there’s quite a number of lenses here – three, count ’em – but none are there for the heck of it.
Each lens has its own task: the main camera is super high-resolution; there’s an ultra-wide to cram more into a scene; while the 5-megapixel telemacro is the best we’ve seen yet for close-up shooting (although it’s still not quite perfect).
That’s refreshing compared to the various phone camera setups that appear with four or five lenses, many of which do little or nothing at all. The Mi 11’s only real absence is that there’s no optical zoom of any kind – which seems like an oddity at this level, but then the expected €749 starting price goes a long way to explaining that.
Anyway, back to the cameras themselves. The main lens is 108 megapixels, but it uses four-in-one pixel processing to produce 27-megapixel images. Those are, inevitably, still massive – 6016 x 4512 pixels in 4:3 aspect ratio – but there’s heaps of detail crammed in. It’s a decent optic with good results, including from low-light conditions.
Indeed, the night mode does a grand job of long exposure handheld shots too, aided by the optical image stabilisation system to keep things steady.
The telemacro, however, doesn’t feature any stabilisation – which can make it a bit tricky to use. You’ll get some great close-ups, but there’s not the same degree of accomplishment with sharpness or detail – partly because it’s 5-megapixels only, partly because the autofocus feedback here is limited and not entirely helpful (but, hey, at least it’s a macro lens with autofocus – something you’ll barely see elsewhere).
All in all, despite the absence of proper optical zoom lenses, the Mi 11’s take on cameras is pretty strong. The main lens is great, the wide-angle accomplished, and the telemacro actually useful for creating unusual close-up shots. Here’s hoping the alleged Pro and Ultra models – if they come to fruition – expand on this already great camera setup and make it better yet.
First Impressions
So is Xiaomi about to blow up as the next big thing? Well, it could suffer a similar fate as Huawei – what with the US Administration adding it to its blacklist – which would be a shame as the Mi 11 is a potentially storming flagship.
Not only does the Mi 11 hit a number of firsts – such as introducing the Qualcomm Snapdragon 888 to the world – it’s also attractively priced, with €749 set as the opening sum. For all that’s on offer here that’s super value.
The Mi 11’s fresh take on cameras – there aren’t lenses here for the heck of it – and accomplished design are among its highlights. Sure, the MIUI software has its quirks, which we think is the most questionable part of the package, but it’s stable and tinkerable enough to shape into a strong overall experience.
The Xiaomi Mi 11 is packed full of potential. At this price, it’s certainly worth you taking a moment of your time to mull it over as your next Android flagship.
Xiaomi is announcing the international version of its Mi 11 flagship phone today after an earlier release in China. No wild waterfall displays or sci-fi wireless charging here — this device is very much in keeping with the Mi series’ MO of offering high specs at a competitive price.
The Mi 11 has a Snapdragon 888 processor, making it the first phone to launch with Qualcomm’s latest high-end chip (though Samsung’s Galaxy S21 series beat it to market outside China). The 888 has 5G connectivity built in, of course, and the phone has 8GB of RAM and 128GB or 256GB of storage.
The screen appears to be the same panel as the one found in the Galaxy S21 Ultra, or a very similar one. It’s a slightly curved 6.8-inch 1440p OLED with a refresh rate of 120Hz and peak brightness of 1,500 nits, matching Samsung’s phone spec-for-spec. I don’t have the S21 Ultra to compare side-by-side, but I can tell you that the Mi 11’s screen is extremely good.
The Mi 11 has a 108-megapixel primary camera backed by a 13-megapixel ultrawide and a 5-megapixel “telemacro” camera. The selfie camera is 20 megapixels and tucked inside a small holepunch cutout at the top left of the screen.
The battery is 4,600mAh and can be charged at up to 55W with a cable and up to 50W wirelessly. The Mi 11 also features reverse wireless charging at up to 10W. It runs MIUI 12, based on Android 11.
Xiaomi hasn’t provided a full list of regions or launch dates just yet, but says the Mi 11 will be priced at €749 (~$900) for the 8GB/128GB model. XDA Developers notes the 8GB/256GB model will retail for €799 (~$960). Stay tuned for a full review.
A billion years ago last summer, people were getting antsy about vaccines. They wanted things to move faster, maybe skip a few steps in clinical trials to speed things up. This was, at the time, generally considered a bad idea that would result in less accurate data and cause people to lose trust in the vaccines. Now, exactly 1.589 trillion years later, recklessly speeding things up for speed’s sake remains a poor choice, but some people on Twitter, like journalist Nate Silver, seem to think we should do it anyway.
They take particular issue with the fact that while Johnson & Johnson submitted the data from its giant clinical trial this week, the Food and Drug Administration will take until February 26th to review it.
Three weeks can feel like an eternity during the pandemic, with hospitals crowded and deaths still climbing. It’s easy to be flip about the process and want things to Just. Go. Faster. But the 22 days is not that much longer than the 20 days the agency took to review data for Pfizer/BioNTech’s vaccine or the 17 days for Moderna’s candidate.
Here’s what’s going to happen during those days. Researchers at the FDA will have to review the data from the 43,783 people who participated in the trial. This will entail looking at the cases across all study sites — here in the US, in Latin America and in South Africa, where a new coronavirus variant is dominant. The typical review process for a vaccine can take months. Instead, it will happen in a few weeks.
If the process is anything like what the FDA planned for the earlier vaccine candidates, those weeks will be filled with a lot of late nights and workers doing everything that they can to reasonably speed things up. “Groups have been working in shifts, nights and weekends, looking in parallel at issues of clinical effectiveness and safety, and of levels of antibodies to confirm the way the vaccine is working,” the Wall Street Journal reported in December.
Why do all that work? Right now, the information that we have about the vaccine comes from the company. That information is promising, and shows that it will probably be a good vaccine. But there are reasons that the FDA doesn’t just take a company at its word.
Let’s turn to noted scientific historian Billy Joel, and the “children of thalidomide.”
Thalidomide was a sedative that was given to pregnant people in the 1960s as a cure for morning sickness. It caused birth defects in thousands of children across the world. In the US, pregnant women were given the drug in clinical trials, but unlike other countries, it wasn’t approved for sale at the time, thanks to Frances Kelsey. Kelsey was a drug reviewer at the FDA who looked over data from the company trying to sell the drug and found it unconvincing. The incident led to new laws that let the FDA determine a drug’s safety and effectiveness.
Taking the time to review a vaccine during a pandemic might seem like rearranging deck chairs on the Titanic — but it’s actually inspecting the lifeboats before you leave port. We have procedures and protocols for a reason. When we launch something big, like a rocket, engineers don’t just push a button and send it soaring. They go through detailed pre-flight checklists, making sure that every bit of a spaceship is sound. We’ve learned the hard way that disregarding safety procedures can cost lives.
Pushing out a third vaccine quickly might help save lives, yes. But only if people are willing to take it. Some healthcare workers are already hesitant to take the vaccine. They worry that the process is rushed. Rushing the process more isn’t likely to convince them — people who generally are not opposed to vaccines — that the process is safe and secure.
I get it. Waiting sucks. But when you’re injecting people with a new treatment and the trust of billions of people is on the line, sometimes it’s worth taking the time to double-check your work.
Here’s what else is happening this week.
Research
Scientists want to know if vaccinated people can still become COVID-19 long-haulers Data from trials has shown that COVID-19 vaccines have done a stellar job at preventing severe cases of disease. But it’s still unclear whether they can prevent chronic COVID-19 symptoms. (Nicole Wetsman/The Verge)
The Pandemic Broke the Flu The flu appears to have taken the year off. While COVID-19 has dominated the planet, our regular seasonal virus appears to have mostly stayed home. It’s not clear what will happen next. (Katherine J Wu/The Atlantic)
Indigenous Americans dying from Covid at twice the rate of white Americans One in every 475 Native Americans has died of COVID-19, a rate that is higher than any other community in the US. The toll is especially brutal for smaller communities, who face disproportionate losses. (Nina Lakhani/The Guardian)
Development
New Vaccine Puzzle: Who Should Get Which Shot? Some places in the world are already juggling three different vaccines — and distributing them is getting very complicated. (Benjamin Mueller and Rebecca Robbins/The New York Times)
With a seductive number, AstraZeneca study fueled hopes that eclipsed its data A study of Astra Zeneca’s vaccine found that people who were vaccinated were less likely to carry the virus. That statistic got misinterpreted as proof that the vaccine decreased transmission of the virus. It might, but that hasn’t been proven yet. (Matthew Herper and Helen Branswell/STAT)
So you got the vaccine. Can you still infect people? Pfizer is trying to find out. People are still trying to understand how vaccines affect transmission, but it’s incredibly complicated. Here’s why. (Antonio Regalado/MIT Tech Review)
After a Rocky Start, Novavax Vaccine Could Be Here by Summer Novavax’s vaccine candidate has gotten off to a slower start than many of its competitors, but it is now well on its way. (Katie Thomas/The New York Times)
Perspectives
I had my espresso with foamed milk, even though coffee is disgusting to me now. I do not enjoy the taste. But I have a cup of coffee almost every morning. It’s like ritual, right? I enjoy the process of making it, and the warmth, and the caffeine. So I keep doing it, and I keep hoping that it will taste good to me at some point. I feel like I’m using my imagination when I eat, trying to use my memory of how things smell and taste to recreate the experience, because otherwise I would not want to eat.
—Meema Spadola tells Eater’s Jenny Zhang. Zhang interviewed people whose sense of taste and smell had not recovered after contracting COVID-19
More than numbers
To the more than 105,485,261 people worldwide who have tested positive, may your road to recovery be smooth.
To the families and friends of the 2,301,169 people who have died worldwide — 459,571 of those in the US — your loved ones are not forgotten.
ASRock has used a gear pattern as part of the design on its flagship Taichi motherboards for a few generations, but as you can see in the video below, now the gear actually spins on the Z590 iteration of the motherboard.
As Chinese publication XFastest demonstrated in its Z590 Taichi review, the gear on the I/O cover rotates in a clockwise fashion. ASRock even added a special option inside the motherboard’s firmware so you can control the spinning interval. Surprisingly, ASRock doesn’t brag about this little design detail on the Z590 Taichi’s product page, so it could just be a gimmick for the review unit. As far as we can tell, the gear serves no practical purpose, and it certainly isn’t going to help you hit higher overclocks.
Either way, at least ASRock is thinking outside of the box and doing something truly different other than simply adding more Christmas lights to the motherboard. The Z590 Taichi also has a set of gears on the passive heatsink for the Z590 PCH – maybe those will be next in line for some spinning action.
The new Z590 Taichi brings a couple of improvements over the Z490 model. Although the Z590 Taichi has lost a power phase in its power delivery subsystem (14 phases vs 15 phases), the new power chokes are rated for 90A instead of the 60A ones on the Z490 Taichi.
Of course, there’s also the PCIe 4.0 M.2 ports and PCIe x16 expansion slots on the Z590 Taichi and the upgraded Wi-Fi 6E and Bluetooth 5.2 connectivity that aren’t present on the Z490 Taichi.
The Z590 Taichi hasn’t landed at retailers yet, since Intel’s 11th Generation Rocket Lake-S processors aren’t out either. Nevertheless, the Z590 Taichi is expected to debut with a $429.99 price tag. For comparison, the Z490 Taichi normally sells for $369.99, so ASRock is slapping a $60 premium on the Z590 Taichi compared to the previous motherboard. Considering the feature set, though, ASRock isn’t asking too much for the Z590 model.
And, of course, there’s the spinning gear. As you would imagine, it probably doesn’t serve a practical purpose, but it might be appealing to enthusiasts that like to show off their rigs.
Thumpy, warm sound combined with very good microphone performance and lusciously soft ear cups make the MSI Immerse GH61 a winning choice.
For
Effective DAC and AMP
Immersive virtual surround sound
Quality microphone
Super soft ear cups
Against
Ear cups get warm after a while
Bass lacking at max volume
We all like our audio a little different. Thankfully, gaming headset vendors love playing with EQ curves to create different sound profiles that can do things like boost your in-game awareness or make your music thump a little bit louder. If you like cans that are heavy on the bass, MSI has a headset for you.
The MSI Immerse GH61 may be one of the best gaming headsets for combining comfort and shameless bass. The ear cups boast baby-soft protein leather, while the drivers deliver distortion-free audio with thunderous bass worthy of my old dance club days. At $109.99 it’s a win for value seekers. The GH61 is cross-platform compatible with PC, Mac, PS5, PS4, Xbox and Nintendo Switch, and you can connect via either USB or 3.5mm. It comes with a DAC that boosts the audio, lets you turn the 7.1 surround sound on or off, mute the mic and raise or lower the volume.
MSI Immerse GH61 Specs
Driver Type
40mm neodymium magnet
Impedance
32 Ohms
Frequency Response
20 Hz – 40 kHz
Microphone Type
Unidirectional, retractable
Connectivity
Dual 3.5mm (Consoles)
USB Type-A (PC)
Weight
0.6 pounds (300g)
Cord Length
USB Type-A cable: 3.9 feet (1.2m)
3.5mm cable: 3.2 feet (1m)
Lighting
None
Software
Nahimic for Headset
Design and Comfort of MSI Immerse GH61
With the dragon logo on the ear cups and angular plastic accents throughout, you can tell MSI designed the Immerse GH61 with gamers in mind. The sleek, angular styling of the swivel-mounted ear cups brings style without going overboard. On the right side, above where the ear cup and headband meet but stealthily located where no one else will see it, is the Onkyo logo, representing the company behind the Immerse GH61’s drivers. The headset’s left ear cup holds the retractable microphone that smoothly slides in and out of the unit.
Although the Immerse GH61 is mostly lightweight plastic, it doesn’t feel cheap or easily breakable. The plastic also helps keep the headset trim at 0.6 pounds. For comparison among other USB/3.5mm headsets, the Corsair HS70 Bluetooth is 0.7 pounds, and the XPG Precog is 0.8 pounds.
You get a very warm and super soft faux leather covering the memory foam ear cups. The adjustable metal headband is covered in the same memory foam and protein leather for an overall feel that’s oh so soft and luscious. The ear cups snuggle cozily against the ears and feel much better than the fabric you’ll find on some other gaming headsets. However, because the headset uses leatherette, you’ll start to feel warm after wearing them for a while. Thankfully, MSI includes cloth ear cup covers to swap in if that’s your preference, a thoughtful touch.
The Immerse GH61 can lie flat, thanks to its swivel-mounted cups, but you’ll enjoy retiring them to the included tailor-made carrying pouch — another value point for MSI.
You can connect the Immerse GH61 via its integrated 3.5mm cable, which is handy for consoles, or via USB on PC. For a PC connection you can use either the 3.5mm cable or the USB cable, which includes an ESS Sabre-branded digital-to-analog converter (DAC) and amp. You’ll need the DAC (and, therefore, a USB connection) to use the headset’s virtual 7.1 surround sound feature. The DAC also provides handy controls over volume and the mic that relieve you from having to fuss around with your ear cups and, potentially, introduce noise into outgoing audio.
Audio Performance of MSI Immerse GH61
MSI used 40mm neodymium magnet drivers made and tuned by Japanese audio vendor Onkyo. In general, they produce silky smooth, warm audio with fine clarity, depth and ample volume. No matter how high I cranked the volume, there was zero distortion. Bass, on the other hand, seemed to have a healthy limit to prevent distortion when you have the volume maxed out, but the highs never get tinny.
But if you want the Immerse GH61 at its finest, you’ll want to use its DAC. The ESS Sabre DAC and amp is said to increase the cans’ dynamic range from 90dB to 121dB using ESS’s HyperStream technology. It also boosts the signal-to-noise ratio (SNR) from 100dB to 121dB, while total harmonic distortion plus noise (THD+N) decreases from 0.001% to 0.00017%, meaning less distortion.
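Those decibel and percentage figures are related by the standard 20·log10 conversion between amplitude ratios and dB. A quick sketch checking the quoted THD+N numbers against the dB scale (our own arithmetic, not MSI’s published math):

```python
import math

def ratio_to_db(ratio):
    """Convert an amplitude ratio to decibels."""
    return 20 * math.log10(ratio)

# THD+N is quoted as percentages; convert to plain ratios first.
without_dac = ratio_to_db(0.001 / 100)    # 0.001%   -> -100 dB
with_dac    = ratio_to_db(0.00017 / 100)  # 0.00017% -> about -115 dB

print(round(without_dac, 1))  # -100.0
print(round(with_dac, 1))     # -115.4
```

In other words, the DAC’s quoted figure puts residual distortion and noise roughly 15dB further below the signal than the headset manages on its own.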
To test the cans’ gaming prowess, I set them to gaming mode via software. There was a subtle difference compared to the out-of-box settings, and the virtual 7.1 surround sound seemed to work well in creating an immersive atmosphere with this mode.
With the DAC, I felt a heightened sense of the sounds around me in Batman: Arkham Knight. In a fight I could hear a goon’s feet shuffle to the left of me as I spun around to punch him and enjoy the audio reproduction of Batman’s jaw-breaking punches. I could even tell which direction combatants were coming from, thanks to the virtual surround sound, which allowed me to turn quickly in response. Even Catwoman’s voice came through very clearly and distinctly from a distance, and I was able to tell how far away she was based on the sound.
The Sabre DAC and amp really helped make the audio experience lovely. Listening to the heavenly violin mastery of Julia Fischer playing Tchaikovsky’s Violin Concerto in D major was beyond exquisite. When the orchestra’s bass strings came into play, the creaminess of the bass was simply delightful. Julia’s violin strings were hauntingly melodic as the Immerse GH61 picked up every note change. The occasional wind instrument floated in to deliver gravitas and tonal changes that the DAC’s virtual 7.1 surround sound enhanced. Music often doesn’t take well to virtual surround sound, but on the Immerse GH61 pressing the 7.1 button took me into a concert hall. Without it, the experience was still high quality but lacked the fine-tuned concert hall feel.
The inviting, slow, melodious guitar solo that kicked off System of a Down’s “Toxicity” was reproduced harmoniously with 7.1 surround on. When the driving guitar kicked in, the Immerse GH61 handled the quick switch from sweet harmony to driving heavy metal angst with aplomb and joy. Serj Tankian’s rangy powerful voice belting out poetic political truths came through the swivel mounted, plastic ear cups. The 40mm drivers made me feel like I was in a live concert.
Without virtual surround sound, “Toxicity” still sounded fierce and powerful but, again, without that amazing inside-a-concert-venue feel, where it felt like music was bouncing off the walls, massaging the sound and energizing the crowd.
The Immerse GH61’s drivers also support Hi-Res audio, which is audio that has a higher sampling frequency and/or bit depth than a CD, which is at 16-bit / 44.1 kHz. There isn’t much in the way of gaming that supports this audio format, but audiophiles will appreciate the inclusion, especially at this price.
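To put Hi-Res in perspective, the raw data rate of uncompressed stereo PCM is just sample rate × bit depth × channels. A quick comparison of CD quality against one common Hi-Res format (24-bit/96kHz is our example here; the Hi-Res badge covers several rates):

```python
def pcm_kbps(sample_rate_hz, bit_depth, channels=2):
    """Raw bit rate of uncompressed PCM audio, in kilobits per second."""
    return sample_rate_hz * bit_depth * channels / 1000

cd     = pcm_kbps(44_100, 16)  # CD quality: 16-bit / 44.1 kHz
hi_res = pcm_kbps(96_000, 24)  # a common Hi-Res format

print(cd)      # 1411.2 kbps
print(hi_res)  # 4608.0 kbps
```

So a 24-bit/96kHz stream carries more than three times the raw data of a CD, which is what headsets supporting Hi-Res audio are specced to reproduce.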
I needed no more convincing that the MSI GH61 was worth every penny of its $109.99. I will close by saying I listened to Prince’s “Purple Rain” and ended up standing and dancing, fully enjoying the entire experience. MSI really made a smart move by combining the Onkyo speakers with the DAC and Nahimic software.
Microphone on MSI Immerse GH61
One of the Immerse GH61’s more unique features is its retractable microphone housed in its left ear cup. At first I was worried that this would be a failure, either due to durability issues or by introducing noise into my audio. But I used the mic all day during many CES Zoom and Google Meet calls and to record a podcast, and was pleasantly surprised.
I learned the hard way that retracting the microphone does not mute it, as my daughter heard me spew a choice word at an incoming news story on my screen. Then I had to explain to my daughter that it’s not funny and not language to be repeated, to which she replied, “I’m almost 12 years old and I hear worse at the supermarket.” So please remember to mute, unless you want an unruly child picking up new, colorful imprecations.
Many attached mics do not pick up subtle tone changes very well, but the Immerse GH61’s did a nice job of doing just that when I recorded a podcast appearance. The mic caught all the bass and nuances in my voice as I bounced around from topic to topic, changing my voice levels to suit the mood. There was no distortion to report and, once again, the Nahimic software was also helpful, which I’ll get into next.
The mic is specced for a frequency response of 100-10,000 Hz with a sensitivity of -38 dB. It’s well engineered and operated very smoothly with no hiccups. I must’ve slid it in and out 100 times in a row to see if there would be any catches, but there never were. Again, nice work, team. Maybe next time make it able to retract and extend automatically? OK, maybe I’m being a little lazy.
Features and Software of MSI Immerse GH61
Image 1 of 3
Image 2 of 3
Image 3 of 3
The Immerse GH61 works with Nahimic for Headset, which is a very user-friendly and simple application. The user interface is nicely designed with warm neon-like aqua colored tones. You can use the app to tweak the bass and treble levels, as well as select from presets for music, movie, communication and gaming modes. You can also choose to turn the effects off altogether and adjust the microphone settings.
Nahimic for Headset offers adjustments for the mic gain, and you can also minimize variations in volume by using the Voice Stabilizer section. There is also an excellent static noise suppression section, which improves communication clarity when you’re shouting commands in Call of Duty or on Zoom calls. It also removes a fair amount of background noise, including computer fans.
Bottom Line
The MSI Immerse GH61 arrives with baby-bottom-soft ear cups, a simple yet stylish design and excellent audio and microphone capabilities. It adds value with an excellent pouch for storing the lightweight thumpers. The cans are also somewhat versatile, offering both 3.5mm and USB connections and virtual surround sound, making them great for console and PC gamers alike.
At max volume, bass takes a hit, and it’d be great if the mic would mute when retracted. But overall, there’s not a lot missing here. (Most of us can still live without RGB on our headsets, right?)
If you want a gaming headset at a good price that’s thumpy and warm and offers wonderful spatial quality, the MSI Immerse GH61 should be high on your list.
In November 2020, Apple announced M1. By the end of the year, it announced three devices — the MacBook Air, 13-inch MacBook Pro, and the Mac Mini — that ditched Intel’s processors.
Those devices received largely positive reviews based on benchmark performance and battery life. But Intel has also released its 11th Gen “Tiger Lake” processors, and after several months of silence, it’s now firing back at Apple. Slides from the Santa Clara, Calif.-based chipmaker show how it tested, and why it thinks Windows 10 laptops can beat back Apple’s ARM-based solution.
Below, we are publishing the slides in full (minus a title slide, be sure to look through the galleries), as well as our analysis. Intel shared benchmarks for the chips, but as with all vendor-provided benchmarks, take them with a grain of salt.
Intel’s Performance Claims
For pure productivity performance, Intel’s testing eschews typical benchmarks. Sure, it used Principled Technologies’ WebXPRT 3, but the Microsoft Office 365 tests appear to be based on Intel’s internal RUG (real-world usage guideline) tests. Intel claims the 11th-Gen system, an internal whitebox with an Intel Core i7-1185G7 and 16GB of RAM, is 30% faster overall in Chrome and faster in every Office task. This largely goes against what we saw in our 13-inch MacBook Pro with M1 review, where benchmarks showed M1 to be largely on the same level, if not better.
For what it’s worth, most laptop makers have opted for the Core i7-1165G7; we’ve only seen the 1185G7 in one production laptop, the MSI Prestige 14 Evo.
Intel also claims that the i7-1185G7 is six times faster than M1 in AI tools from Topaz Labs and in Adobe Premiere, Photoshop and Lightroom functions. (Again, using the company’s internal RUG tests.)
Gaming was a mix, with Intel and Apple trading blows with integrated graphics. But Intel also got a little snarky, placing Apple at 0 frames per second for a number of games that don’t currently work on macOS and the M1 CPU. Apple’s ecosystem hasn’t been a hardcore gaming platform for years now, especially after 32-bit app support was cut in macOS 10.15 Catalina.
It’s unclear how many people are playing some of the listed games, like Microsoft Flight Simulator 2020, Halo: The Master Chief Collection, Crysis Remastered or Red Dead Redemption 2 on Intel’s integrated Xe graphics, but yes, the point is made – Windows PCs have far larger collections of triple-A games.
Intel Evo vs. Apple M1
When Intel revealed Evo, its second-generation upgrade to Project Athena aimed at making the best portable devices, it included a number of experiences from studies that it believed would create the best notebooks. So when evaluating M1, it used those tests.
Intel claims that the M1 in the MacBook Pro it tested failed eight out of 25 tests it uses, including “Switch to Calendar” in Outlook, “start video conference” in Zoom, and “Select picture Menu” in PowerPoint. Intel’s workloads don’t explain how these are run, but they’re also simple tasks that work quite well on just about any modern processor, so they’re odd choices. (I had plenty of Zoom conferences while testing the MacBook Pro with no issue.)
Interestingly, in the configurations document at the end of the slides, Intel shows that it switched to a MacBook Pro with 8GB of RAM, rather than the 16GB model it tested for performance.
In battery life, Intel switched to a notebook with the Intel Core i7-1165G7, the Acer Swift 5, rather than sticking with the Core i7-1185G7 in the whitebox it used for performance testing. It also tested a MacBook Air rather than the MacBook Pro. Both systems ran a Netflix stream alongside open browser tabs, and the MacBook Air came out ahead, but only by six minutes.
Intel didn’t list battery life for the MacBook Pro.
In our own testing, the M1 MacBook Pro posted battery life that beat Intel PCs by hours.
The Form Factor Argument
There has been a long-running debate among Mac users about whether Apple should add a touchscreen to MacBooks. It hasn’t, leaving touch input to the iPad.
Per Intel’s slides, a Windows machine offers more choice, including 2-in-1s, desktops, small form-factor desktops, desktops with touchscreens, and even easels. This is somewhat odd, considering Apple does offer a small desktop (the Mac Mini), as well as the iMac and the Mac Pro, and Apple has promised that its own chips will land there, too. Touchscreens and convertible 2-in-1s are the big areas where Apple lacks.
The second slide about choice shows the various form factors and configurations. And yes, Apple’s laptops are limited to clamshells. Interestingly, Intel only includes the MacBook Pro on this list, and not the MacBook Air, which starts at $999 with an M1, 8GB of RAM and 256GB of storage. That’s less than the $1,499 Dell XPS 13 Intel lists, despite the Air’s higher display resolution. However, Intel is right that the MacBook Pro gets expensive at higher configurations, and that port selection on the 13-inch MacBook Pro and the MacBook Air is lacking.
Intel also took a dig at the M1’s display capabilities. The slide is right – both the M1 MacBook Pro and MacBook Air only support one external display, up to 6K at 60 Hz. (This isn’t the case for the Mac Mini desktop, which also has an HDMI 2.0 port.)
Some users have found a workaround by using DisplayLink drivers and docks, but it is a weak point, especially for the Pro-branded notebook.
Compatibility
Apple includes Rosetta 2 to emulate x86 software on the Mac, but some software just doesn’t support M1. Intel includes games, again, as a weak point, as well as a lack of support for Boot Camp.
It also suggests many accessories won’t work. This is somewhat true. The M1 laptops don’t support external graphics docks, and some software won’t work on the Mac. (For instance, Razer recently announced a docking station that doesn’t have RGB lighting control because Synapse doesn’t currently work on the Mac).
Perhaps the Xbox controller wasn’t fully supported when Intel tested, but PS5 and Xbox Series X/S controller support showed up in the beta for macOS 11.3, so it’s on the way.
It does show a disadvantage of early adoption, though many people use headphones, hard drives and other accessories that don’t require software at all.
Intel has made a similar argument about software. To a degree, again, this is true; not all software works. In my experience, I found anything that ran through Rosetta 2 seemed fairly seamless. Since then, more native software has become available or announced. For instance, Box, which is listed as incompatible, has called the issue a “High priority investigation.”
The other angle here is that the Mac has a devoted league of developers that make software only for Apple’s platform. So, in that case, people using M1 are likely to use some of that software, or Apple’s alternatives. Others, like Google Drive, are also available on the web.
On the Adobe front, Lightroom currently runs natively on M1, while the company has promised native versions of its other software.
So Intel does make some points here, but it seems far less about the M1’s capability and more about being an early adopter.
Notes and Disclaimers
Intel included these, so we’re including them here for the sake of transparency.
The company makes some good points about the current state of Apple’s chip initiative, especially if you demand a specialized form factor or play games casually.
Intel’s performance claims need to be taken with a certain grain of salt, as they’re in Intel-created tests and not industry-standard benchmarks. The fact that it switched out between the Pro and the Air for battery life (as well as the Core i7-1185G7 and Core i7-1165G7) also shows an incomplete picture.
Intel’s thoughts on software and compatibility get a bit tricky. Early adopters may feel a bit of a sting, but it’s been rapidly improving, and much of the software that doesn’t work at all may be counteracted with Apple software.
The slides paint two pictures: Yes, Apple has work to do in this transition, and the touchscreen, multi-display support, and limited port selection need to be fixed. But the fact that Intel went through putting these slides together also shows that it sees a formidable opponent worth comparing its chips against, suggesting a competitive future for notebooks.
Apple’s iOS 14.5 beta is out, and with it comes the ability to have your Apple Watch unlock your Face ID-protected phone if you happen to be wearing a mask. If you really want this feature right now, you’ll have to download the latest iOS and watchOS betas — something that does come with at least a little risk, as it’s unfinished software. There are bugs, and features might change between updates. There have been reports in the past of watches being bricked by betas (though I didn’t find any for this release).
With all that said, I’ve been using the betas since they came out, and I haven’t noticed anything acting particularly buggy. So if you’re willing to throw caution to the wind to get access to the new Apple Watch unlock feature, here’s how you can get the betas.
First, start by going to beta.apple.com on your iPhone. If you’ve never participated in a public beta before, you’ll have to tap on the sign up button, and if you have, you can tap on the sign in button.
Once you’re logged in, tap the down arrow on the header, and go to Enroll Your Devices. I highly recommend following Apple’s advice to make an archived backup of your device, as iCloud backups won’t necessarily be accessible if you have to switch back from a beta. Apple explains how to make one on the Enroll page. Go ahead, I’ll be here when you get back.
After you’ve backed up your device, you can scroll down and tap the Download profile button, and your phone will let you know that you have to review the profile before it installs.
Before we do that, though, let’s grab the profile for the watch by scrolling up to the top and choosing watchOS from the list. There, you can tap the Download profile button, then press Allow.
This will automatically open the Watch app, where you can press Install. This will reboot your watch, so while that’s happening, let’s install the phone’s profile by going to Settings, then Profile Downloaded.
There, you can tap the Install button, and your phone will restart.
After you’ve got the profiles downloaded and your devices rebooted, you’ll have to update your phone, then your watch. Unfortunately, you have to wait until the phone is updated before you can even start on the watch, so you can’t multitask.
Once you’re officially running iOS 14.5 and watchOS 7.4, we can finally enable the unlock with watch feature.
Go to Settings > Face ID & Passcode, and scroll down to the new Unlock With Apple Watch option. Toggling it on enables the feature; there’s nothing you have to do on the watch.
Now that you have it installed, here’s what you can expect. First, and most important to note: your phone isn’t looking for your face with a mask; it’s looking for a face with a mask. With this feature on, my wife, wearing a mask, was able to unlock my phone with no problem whenever I was within three or so feet.
Apple’s mitigation is that whenever your Apple Watch is used to unlock your phone, it buzzes you with a notification saying your phone has been unlocked, along with a button to lock it. Pressing that lock button immediately locks your phone and requires a passcode on the next unlock.
With that caveat out of the way, so far I’ve had great success with the feature. I tried it with a variety of masks, and it worked with all of them for me. It is worth noting that, again, it is still looking for a face with a mask. This feature won’t help you when your phone is lying on the table and you want to unlock it without the Face ID camera being able to see you.
Still, for me, not having to put in my passcode every time I want to check my grocery list while shopping is a huge benefit and worth what I consider to be relatively minor security trade-offs. The feature may not be for those with super-secret info on their phones, but for everyone else it’ll be a nice quality-of-life improvement, whether you decide to go for it right now or wait for the official release.
Today we’re looking at our first custom 3070 card, the Asus GeForce RTX 3070 TUF Gaming OC. Like all the other recent GPUs, Nvidia’s GeForce RTX 3070 continues to be highly sought after — by gamers and miners alike. Originally revealed with a $500 base price, with performance relatively close to the previous generation RTX 2080 Ti, the GPU looked to land right in the sweet spot. The theoretical price easily earns the card a place on our best graphics cards list, and it sits in seventh place in our GPU benchmarks hierarchy (not including the Titan RTX). What does the Asus card bring to the table? Less and more, depending on your perspective.
Here’s a quick comparison of the reference 3070 Founders Edition with the Asus 3070 TUF Gaming. All of the core features and specs are the same, so the only real change is in clock speeds and the card’s design.
Nvidia GeForce RTX 3070 Specifications Comparison

| Specification | Asus RTX 3070 TUF Gaming | RTX 3070 Founders Edition |
| --- | --- | --- |
| Architecture | GA104 | GA104 |
| Process Technology | Samsung 8N | Samsung 8N |
| Transistors (Billion) | 17.4 | 17.4 |
| Die size (mm^2) | 392.5 | 392.5 |
| SMs / CUs | 46 | 46 |
| GPU Cores | 5888 | 5888 |
| Tensor Cores | 184 | 184 |
| RT Cores | 46 | 46 |
| Boost Clock (MHz) | 1845 | 1725 |
| VRAM Speed (Gbps) | 14 | 14 |
| VRAM (GB) | 8 | 8 |
| VRAM Bus Width | 256 | 256 |
| ROPs | 96 | 96 |
| TMUs | 184 | 184 |
| TFLOPS FP32 (CUDA) | 21.7 | 20.3 |
| TFLOPS FP16 (Tensor) | 87 (174) | 81 (163) |
| RT TFLOPS | 42.4 | 39.7 |
| Bandwidth (GBps) | 448 | 448 |
| TDP (watts) | 275 | 220 |
| Dimensions (LxHxW mm) | 300x127x51.7 | 242x112x38 |
| Weight (g) | 1096 | 1034 |
| Launch Price | $549 ($649) | $499 |
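For what it’s worth, the FP32 throughput figures above fall straight out of the core count and boost clock, since each CUDA core can execute one FMA (two floating-point operations) per cycle. A quick sketch of that arithmetic in Python, using the numbers from the spec comparison:

```python
def fp32_tflops(cuda_cores: int, boost_mhz: int) -> float:
    """Theoretical FP32 throughput: each CUDA core does one FMA
    (2 floating-point ops) per clock cycle."""
    return 2 * cuda_cores * boost_mhz * 1e6 / 1e12

# Numbers from the spec comparison above
print(round(fp32_tflops(5888, 1845), 1))  # Asus TUF OC Mode: 21.7 TFLOPS
print(round(fp32_tflops(5888, 1725), 1))  # Founders Edition: 20.3 TFLOPS
```

The only difference between the two cards is the 120 MHz factory overclock, which is where the roughly 7% theoretical advantage comes from.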
As with most Asus graphics cards, the RTX 3070 TUF Gaming OC has multiple clock speed options. A switch on the top of the card can toggle between ‘quiet’ and ‘performance’ modes (reboot required), but that’s not the full story. The OC Mode has a boost clock of 1845 MHz, compared to 1815 MHz in the default Gaming mode, and 1785 MHz in Quiet Mode. However, you can only use the OC Mode if you install the Asus GPU Tweak II software (see below) — otherwise, you’ll get the slightly lower Gaming Mode clocks.
Asus is basically straddling the fence with this approach. It gets to claim higher boost clocks, but we suspect a lot of users won’t bother installing GPU Tweak and will end up with (slightly) lower performance — and lower power draw as well. Realistically, most people won’t notice the difference either way, but cutting power use by 25W and dropping temperatures a bit are both desirable things with PC hardware. We’ve opted to run the performance tests with OC Mode engaged, but we also collected power and temperature data running in Gaming Mode.
The RTX 3070 TUF’s design is nearly identical to that of the Asus RTX 3080 TUF, with a few minor adjustments. The 3070 has the same dimensions as the more potent 3080 and 3090 cards, but it weighs around 300g less. That’s because the GPU and GDDR6 memory won’t run as hot, so the heatsink isn’t quite as bulky. The overall appearance is nearly the same as the higher-end Asus TUF models as well, though there are a few small differences in the backplate (there are a few extra cutouts on the 3070).
While the reference 3070 has an official TGP (Total Graphics Power) of 220W, Asus doesn’t explicitly list a TGP and instead recommends at least a 750W power supply. The 3070 TUF still requires dual 8-pin power connectors, just like the 3080 and 3090 variants, which is a bit interesting to see. Based on our power testing, it will be challenging to push the card beyond 300W, and an 8-pin plus 6-pin setup would have been sufficient, but it was probably easier to just keep the dual 8-pin connections used on other models.
RGB lighting is present, but it’s very tame compared to other GPUs. The TUF logo on the top of the card lights up, and there’s a small RGB strip on the front edge of the card (linked to the same lights as the logo), and that’s it. If you’re after more bling, Asus has the Strix line for that. TUF is the more mainstream approach to design and aesthetics. Naturally, the Strix models cost more than the TUF models, with slightly higher factory overclocks and better cooling in addition to the extra RGB lighting.
The 3070 TUF has three 90mm fans, and they’re the new style with an integrated rim that increases static pressure and helps improve airflow at the same RPM. Considering we saw very good results from the cooling on the higher power RTX 3080 TUF, the fans should be more than sufficient for the 3070 card. Asus also rotates the center fan clockwise, with the side fans spinning counterclockwise, which it says reduces turbulence and noise. Our testing (see below) generally confirms these claims.
We used GPU Tweak II during testing, setting it to OC Mode. We also did some manual overclocking, which showed similar results to what we’ve seen with other Ampere GPUs. We maxed out the power limit and managed to add 750 MHz to the GDDR6 base clock (15.5Gbps effective speed), but we could only add 75 MHz to the GPU core clocks before we encountered instability. We also ramped up fan speeds quite a bit — using the stock fan profile, we could only get around 50 MHz extra on the GPU core and a 600 MHz memory overclock.
In other words, we consider our OC’d results to be closer to the maximum you should expect to achieve, and we’re being aggressive on fan speeds to get there. If you run one of these cards with the fans usually spinning at 50-75%, the bearings are likely to wear out quicker, and we feel you’re better off just sticking with the default OC Mode for long-term use. Redlining a card for an extra 5% performance isn’t really a great idea, but YMMV.
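As a sanity check on those memory numbers: GDDR6 transfers two bits per pin per clock, so a 14 Gbps effective rate corresponds to a 7,000 MHz data clock, and the +750 MHz offset lands at the 15.5 Gbps figure quoted above. This sketch assumes GPU Tweak reports the offset in those same double-data-rate terms, as most overclocking utilities do:

```python
def effective_gbps(data_clock_mhz: float, offset_mhz: float = 0) -> float:
    """GDDR6 moves two bits per pin per clock (double data rate)."""
    return 2 * (data_clock_mhz + offset_mhz) / 1000

def bandwidth_gbs(rate_gbps: float, bus_width_bits: int) -> float:
    """Aggregate bandwidth in GB/s across the full memory bus."""
    return rate_gbps * bus_width_bits / 8

stock = effective_gbps(7000)        # 14.0 Gbps, as shipped
oc = effective_gbps(7000, 750)      # 15.5 Gbps with our +750 MHz offset
print(oc, bandwidth_gbs(oc, 256))   # 15.5 496.0 (up from 448 GB/s stock)
```

In theory, that memory overclock alone is worth about 10% more bandwidth, which is why it tends to matter more than the modest 75 MHz core bump.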
Asus RTX 3070 TUF Gaming: 1080p Gaming Performance
TOM’S HARDWARE GPU TEST PC
We’ve only tested one custom 3070 so far, which we’ll highlight in bright red, with the reference 3070 Founders Edition in a darker shade. We’ve included both ‘stock’ (using the OC Mode) results and performance running our maximum manual overclock in the charts (we didn’t run benchmarks using the Gaming Mode or Silent Mode). We didn’t run the same overclocking tests on the Founders Edition back when we first tested it, but in the tests that we did run, we saw performance slightly higher than the Asus card gets using the OC Mode.
Our test PC is the same Intel Core i9-9900K we’ve been using for over a year now, with full details to the right. The Core i9-10900K and Ryzen 9 5900X may be slightly faster, depending on the game used and other settings. However, we’ve enabled XMP memory profiles for our GPU testbed, which seems to narrow the gap quite a bit, particularly with the RTX 3070. We’re running the RAM at DDR4-3600 with 16-18-18 timings, compared to the officially supported DDR4-2666 memory speed.
1080p continues to be the most popular resolution, according to the Steam Hardware Survey, though we figure anyone buying an RTX 3070 likely has their sights set a bit higher. However, some people prefer running a higher refresh rate display over resolution, in which case 1080p results are still important.
Despite the low resolution, there’s still a fairly large gap between the RTX 3070 and RTX 3080, thanks to the game selection and ultra quality settings. Overall, the Asus 3070 TUF ends up beating the reference 3070 FE by just four percent, while the 3080 leads the Asus card by 15 percent. If you have a choice between a heavily factory-overclocked 3070 and a reference-clocked 3080 for roughly the same price, you’ll be better off with the 3080 in every case. Not that you can find either one in stock right now.
Our gaming selection also illustrates one of the pain points with chasing higher frame rates: At maximum quality, even top tier GPUs can struggle to break 144 fps in many games, and 240 fps is basically out of the question. Unless you play Strange Brigade or other lighter fare like CS:GO, Overwatch, and League of Legends, in which case a 240Hz or even 360Hz monitor might be useful. Alternatively, you can drop the quality settings to boost performance, though some games (e.g., Assassin’s Creed Valhalla) will never get much above 120 fps.
Interestingly, the manual overclock is just enough to put the Asus card on equal footing with AMD’s reference RX 6800 (which can, of course, be overclocked for an additional 5-10% boost in performance). Some games strongly favor AMD’s RX 6800 (Valhalla, Borderlands 3, Dirt 5, The Division 2, and Forza Horizon 4). In contrast, other games favor the RTX 3070 (Far Cry 5, FFXIV sort of, Metro Exodus, Strange Brigade, and Watch Dogs Legion — along with every game that supports DXR, aka DirectX Raytracing and DLSS). Still, overall it’s a relatively close match.
Asus RTX 3070 TUF Gaming: 1440p Gaming Performance
Running at 2560×1440 is generally the best balance between resolution and frame rate, especially since 144Hz 1440p displays are relatively affordable — you can even get FreeSync and G-Sync Compatible IPS displays for around $300-$400, which is what we recommend for most people. Performance drops on average by approximately 20 percent compared to 1080p, but all of the games continue to run at more than 60 fps, outside of the two games where we’ve enabled DXR (Dirt 5 and Watch Dogs Legion — though WDL does have the option to use DLSS, which we haven’t done here.)
The factory overclock on the Asus 3070 TUF Gaming gives it a 5 percent lead over the 3070 FE, which isn’t particularly significant. Manually overclocking the Asus card also puts it (barely) ahead of the stock RX 6800 again, with a similar set of wins and losses in the individual games. This is about as far as we’d recommend pushing the RTX 3070 for most gamers.
Technically (see below), you can run at 4K as well, and with the right combination of game and settings, you might even break 60 fps still. However, 1440p 144Hz gaming simply feels much smoother than 4K gaming, even if you have a high-end 4K monitor. But let’s see the actual numbers.
Asus RTX 3070 TUF Gaming: 4K Gaming Performance
As we noted in our RTX 3070 Founders Edition review, it’s basically as fast as the previous generation RTX 2080 Ti. That’s despite the 8GB VRAM limitation — and it’s definitely a limitation. For example, Watch Dogs Legion really doesn’t seem to care for 8GB cards when all the settings are maxed out. DLSS helps, but we had to run the benchmark numerous times for the results shown in the gallery, as often we’d get stuck with extremely low performance. Overall, 4K remains viable, but it’s just not the same experience as 1440p 144Hz.
The overall rankings don’t change compared to the lower resolutions, though individual games may show a few position swaps. The gap between the 3070 and 3080 meanwhile continues to grow. It was only 15 percent at 1080p, then 20 percent at 1440p, and now it’s 30 percent at 4K. Part of that is due to CPU bottlenecks at lower resolutions, but the extra 2GB definitely helps the 3080 in some games at 4K. We’re also curious to see whether Nvidia will actually do a 3070 Ti with 16GB (or 3070 Super or whatever it decides to call it). Still, considering the ongoing GDDR6 shortages, that may not happen for a while.
Anyway, about half of the 13 games we’ve tested (six) average 60 fps or more at 4K ultra. The other half ranges from just slightly below 60 fps, where G-Sync would still make them feel smooth (Borderlands 3, Division 2, Metro Exodus, and Red Dead Redemption 2) to games that are more like the 30-45 fps console experience (Dirt 5 and Assassin’s Creed Valhalla). And then there’s Watch Dogs Legion, which sits at sub-20 fps rates and only barely reaches 30 fps with DLSS in performance mode — at least with DXR enabled. So you can’t plan on running every game maxed out at 4K ultra on the 3070, but most games are easily playable at 4K with a judicious mix of settings.
Asus RTX 3070 TUF Gaming: Power, Clocks, Thermals, Fan Speeds and Noise
For our power, thermal, etc., testing, we’ve tested the Asus card in Gaming Mode, OC Mode, and with our manual overclock. We run Metro Exodus at 1440p ultra (no DLSS or DXR) and FurMark running at 1600×900 in stress test mode. Each one loops a test sequence for about 10 minutes, and we use Powenetics software for in-line power measurement, with GPU-Z tracking clocks, temps, and fan speeds. Unfortunately, while HWInfo64 now reports GDDR6X memory temperatures, that doesn’t apply to vanilla GDDR6 memory. Presumably, it runs cooler since it’s only clocked at 14Gbps, but we weren’t able to check.
Compared to the reference model, the Asus 3070 TUF consumes about 30-35W more power using its default Gaming Mode settings. Engage OC Mode and power use jumps another 20W, while our manual overclocking only managed an additional 8-9W. That’s not too surprising, considering the maximum power limit in GPU Tweak is 108%. Using a baseline of 250W, that would give a maximum of 270W, and adding a bit of extra leeway accounts for the last bit of power.
As far as where the power comes from, the card’s peak power was only a few watts higher than what you see in the charts, and all three power sources are easily within spec. Even at our maximum manual overclock, the PCIe slot only provided 62W, the first PEG connector provided 127W, and the second PEG connector provided 98W.
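Those readings are easy to sanity-check against the nominal PCIe power delivery limits (75W from the slot, 150W per 8-pin PEG connector); a minimal sketch, with the measured peaks from our maximum manual overclock:

```python
# Nominal PCIe power delivery limits in watts (slot, per 8-pin PEG connector)
LIMITS = {"slot": 75, "peg1": 150, "peg2": 150}

# Peak readings from our maximum manual overclock
measured = {"slot": 62, "peg1": 127, "peg2": 98}

for rail, watts in measured.items():
    assert watts <= LIMITS[rail], f"{rail} exceeds spec"

print(sum(measured.values()))   # 287 W total board power
print(round(250 * 1.08))        # 270 W -- the 108% GPU Tweak ceiling on a 250W baseline
```

Every rail sits comfortably under its limit, which matches our observation that pushing the card much past 300W isn’t really possible.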
The Asus RTX 3070 TUF comes with a boost clock of 1845 MHz in OC mode, 120 MHz higher than the Founders Edition. As usual, we saw substantially higher clocks in games, with the card averaging 1.96GHz during our Metro Exodus test — note that we’re only averaging clock speeds when the GPU load is above 95%, so the dips you see in the line charts aren’t included. Our manual overclock pushed clock speeds even higher, to an average of 2.08GHz. On the other hand, Furmark hits much higher power use per MHz, so clocks drop about 250MHz (give or take).
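The load-gating described above, where a clock sample only counts toward the average when the GPU is above 95% load, can be sketched like this (the log samples here are hypothetical, not from our actual GPU-Z capture):

```python
def avg_loaded_clock(samples, load_threshold=95):
    """Average GPU clock only over samples where load exceeds the
    threshold, so idle dips between test runs don't skew the mean."""
    loaded = [mhz for mhz, load in samples if load > load_threshold]
    return sum(loaded) / len(loaded) if loaded else 0.0

# (clock_mhz, gpu_load_percent) pairs -- hypothetical log excerpt
log = [(1960, 99), (1975, 98), (600, 12), (1950, 97), (300, 5)]
print(round(avg_loaded_clock(log)))  # 1962 -- the idle dips are ignored
```

Without that filter, the brief drops to idle clocks between benchmark passes would pull the reported average well below what the card actually sustains under load.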
Temperatures are directly linked to fan speeds, so higher RPMs means lower temps. Even in OC Mode, the cooling on the 3070 TUF proves more than adequate, staying below 70C. More importantly, those temperatures come with the fans coasting along at their minimum speed of around 1000 RPM. (The average is a bit lower as the fans don’t even turn on until the GPU hits 50C.) Our maximum OC used a far more aggressive fan curve, which might be a bit overkill, but it allowed the testing to complete without issues. Thanks to the high fan speeds, temperatures while overclocked were actually lower than at stock, but some tuning to find a happy medium is possible.
Besides affecting thermals, fans also make noise — the higher the RPMs, the more noise. Like most modern GPUs, the Asus 3070 keeps the fans off until the GPU hits 50C, which means for office use and lighter tasks, the card doesn’t make any noise at all. Without manually overclocking, fan noise is only slightly above the noise floor of our test setup. At idle, with the SPL meter 15cm away from the GPU, the noise floor is 34 dB — 4 dB above the limit of the SPL meter, thanks to the CPU cooler fans and pump. During our stress testing, noise levels increased just a hair to 35.4 dB. That’s with an open testbed, so GPU noise will be even less noticeable if you have a decent case.
Asus RTX 3070 TUF Gaming: A Solid Offering
Some people love flashy cards with tons of RGB lighting and other extras. The Asus RTX 3070 TUF Gaming skips most of that (other than a relatively subdued RGB logo) and instead focuses on delivering great performance and cooling. We haven’t tested Asus’s higher-end 3070 ROG Strix OC card, which boasts a boost clock of 1935MHz on the top model, but we have difficulty believing it will be much faster than the TUF Gaming OC. If you want to eke out the last few MHz from a GPU, that’s fine, but for most people, it’s better to get a more reasonably priced card rather than one packing all the bells and whistles.
With the Asus 3070 TUF, you get excellent cooling and better than reference performance, nothing more, nothing less. The original launch price was $550, just $50 more than the 3070 Founders Edition, and we can easily get behind that sort of offering. Unfortunately, in the current market, it’s more difficult to say how much you should be willing to spend. The Asus Store lists the 3070 TUF OC at $650 now, supposedly due to the increased tariffs on graphics card imports from China. Obviously, it’s also due to the extreme demand for any reasonably potent graphics card, and you won’t find the card readily available at anywhere close to $650 right now.
We feel like a broken record, but until things settle down, it’s not a great time for PC gaming enthusiasts hoping to pick up a new graphics card. And with Ethereum hitting all-time highs, coin miners are only exacerbating the situation. Nvidia, AMD, and all of their AIB partners are trying to get cards out to the market as quickly as possible, but we’ve gone from thinking it would only take a few months for things to get better to wondering if we’ll see cards at MSRP at all during 2021. Hopefully, but with shortages on many of the other components that go into a graphics card (GDDR6 memory and other materials), we don’t expect major improvements until June or July at best.
This isn’t to say you shouldn’t try to buy the Asus RTX 3070 TUF Gaming OC. If you can find one in stock — either via a waiting list or a Newegg lottery or just getting lucky at a brick and mortar store — and it’s priced reasonably (under $650), this is a great card. There are other GPUs slated to launch in the coming months as well, including the RTX 3060 12GB and the RX 6700 XT. The more options we get, the more likely we will start seeing less of a crush of people trying to buy the higher-end components. But until supply improves and coin mining profitability drops, it’s going to be a tough slog for anyone trying to buy a new high-end graphics card.
Mad Catz is a gaming peripherals company located in Hong Kong that has been around since 1989. In one form or another, the R.A.T. 8 has existed for many years now, and the R.A.T. 8+ ADV brings several improvements under the hood. Much like previous iterations, it allows for extensive customization, including adjustments to the weight, back, and side panels. The sensor is PixArt’s PMW3389, capable of up to 20,000 CPI, and the main buttons are outfitted with switches rated for 60 million clicks. The R.A.T. 8+ ADV has eleven buttons in total, all of which can be programmed to your liking in the software, along with the RGB lighting options, among other items.
Johnson & Johnson announced yesterday that it had officially asked the Food and Drug Administration to authorize its COVID-19 vaccine, kicking off a process that will send its one-shot immunization through a gauntlet of analysts and experts.
If all goes as expected and the agency agrees that the benefits of the vaccine outweigh the risks, the United States could have a third shot available by the end of February. Experts hope that this vaccine — which only takes one dose and can be stored in the refrigerator — could accelerate the United States’ vaccination campaign.
Over the next few weeks, FDA officials will work at breakneck speeds to review the data submitted by Johnson & Johnson. Then, they’ll write up a report and submit it to the FDA’s independent vaccine advisory committee. That group will meet on February 26th, evaluate the data, and decide if they think the FDA should authorize the vaccine. (Their recommendation isn’t binding, but the agency usually agrees with it.) A day or so later, the FDA could release its decision.
Three weeks may seem like a long time to wait before the committee meets and the FDA makes its decision. And in the context of the pandemic, three weeks is a long time — thousands of people could die in that gap — but it’s actually remarkably fast given the enormity of the task at hand. The agency will use every second of that time to make sure there aren’t any safety concerns with the vaccine and that it can actually do what it claims to do. Skipping steps could erode already fragile trust in vaccination.
“FDA staff feel the responsibility to move as quickly as possible through the review process,” former FDA Commissioner Stephen Hahn said in December, just after Pfizer and BioNTech filed for authorization of their vaccine. “However, they know that they must carry out their mandate to protect the public health and to ensure that any authorized vaccine meets our rigorous standards for safety and effectiveness.”
Rather than just look at the reports done by the pharmaceutical company, the agency takes the raw data and runs its own analysis. Normally, that process takes around a year. For the COVID-19 vaccines, the agency dramatically shortened that timeline. It took 21 days from the time Pfizer and BioNTech filed for authorization for the vaccine to get the green light from the FDA. Agency experts worked in shifts, on nights and weekends, to crank through the data as thoroughly and quickly as possible, Hahn told The Wall Street Journal in December.
The data for this vaccine could be trickier to evaluate than the data for the Moderna and Pfizer / BioNTech vaccines, which were overwhelmingly effective against COVID-19. Johnson & Johnson’s vaccine was less effective overall but still kept anyone who took it alive and out of the hospital, even if they caught COVID-19. It was tested in the United States, Latin America, and South Africa. In South Africa, it encountered the coronavirus variant that appears to reduce vaccine efficacy. Understanding how well the vaccine works will take teasing apart the geographic differences in the trial.
The US has agreements for 100 million doses of the Johnson & Johnson vaccine, enough to protect 100 million people against the deadly effects of COVID-19. By the middle of the month, half a million people in the US will have died from the disease. The one-shot, logistically easy vaccine can’t come fast enough — but for now, it’s coming as fast as it can.
(Pocket-lint) – Cambridge Audio has been around for donkey’s years and has built something of a tradition in the world of home audio equipment. But in the world of headphones, it doesn’t quite draw the attention as much as the likes of Sony, Bose or Sennheiser. So when it launched a pair of true wireless earbuds, the Melomania 1, that was something of a surprise.
Even more of a surprise, perhaps, was that the Melomania 1 was both great-sounding and very affordable. So the follow-up pair had a high bar to meet. Can Cambridge Audio blow us away with great, affordable buds for a second time? Here’s how the Melomania Touch fares…
Design
Touch controls
IPX4 water resistance
Faux leather coated case
Available in black or white
3x ear fin and 3x ear tip sizes for fit
Cambridge Audio launched its first pair of true wireless earbuds in 2018, and it stood out from the market for a couple of reasons. One of those was the design, which reminded us of little foam-tipped bullets. Of course, this shape wasn’t necessarily the most ergonomic or practical for in-ear fit, but the light weight and easy-to-wear finish helped counter that.
The Melomania Touch looks nothing like the first-gen model and doesn’t feel the same in the ears either. It’s a huge change in direction, and one that has its benefits and its drawbacks.
The new warped teardrop-shape design of the Melomania Touch means these ‘buds are designed to fit and almost fill the inner part of your ear, holding onto the middle ridge with an in-ear fin. That means, by their very nature, the individual ‘buds are a much more secure fit than the looser-fitting predecessors.
Because of the various sizes of fin and ear tips, we did find it took a couple of tries to get the right fit for us. Trying out a couple of different combinations we eventually settled on one that was comfortable but with a decent seal and with minimal pressure. Essentially just stepping down a size from the default fit. As we talk about in the sound section later on, getting this right fit is essential for good audio.
Even with a better fit for this generation you can still tell those tips are in your ear. As the tips are the standard shape and size for earbuds, you can always feel them in there and don’t quite get to that almost undetectable level you’ll find with wider cone-shaped tips. The Touch’s feel isn’t uncomfortable though, so you’ll be fine for a couple of hours at a time – too much longer and you will start to feel some sensitivity.
As we’re sure you gathered from the name ‘Touch’, these in-ears’ outer surface is touch-sensitive, so you can use it to control various features, such as playing and pausing music or skipping tracks. Like most earbuds with this feature, it’s useful when you need to use it on purpose. Most of the time we interacted with the Touch, however, it was by accident.
The problem with such a large area being touch-sensitive is that if you try to adjust the fit, reach to remove the buds, or frankly do anything that involves touching them, it’s quite finicky trying to avoid that touch-sensitive area and inadvertently playing or pausing music.
The charging case is a nice upgrade for this second-gen model. Rather than looking like a small pack of dental floss made from the beige plastic cast-off from a 1990s desktop PC, this case is pill-shaped and coated in a soft faux leather. It’s a lovely-looking and lovely-feeling case, although we found the docks for each earbud could do with stronger magnets, to ensure that each ‘bud sits in exactly the right position to charge.
Performance, features and voice calls
Bluetooth 5.0 and AAC
Qualcomm aptX, TrueWireless Stereo Plus
7 hours music playback (33 hrs in case)
50 hours total maximum battery life (in low power mode)
Cambridge Audio has equipped the Melomania Touch with a lot of the modern tech you’d hope to find in true wireless earbuds. There’s aptX for lag-free connection with most Android phones. In addition there’s Qualcomm’s other tech: TrueWireless Stereo Plus. This connects each earbud to the Android phone independently and to each other, rather than use one ‘bud as the primary and have the other feed from it.
There’s also Bluetooth 5.0 and AAC support, so Apple iPhone users are catered for. The Touch even uses Qualcomm’s tech for enhancing the clarity of voice calls, so while Cambridge Audio is something of a traditional British audiophile company, it’s had the sense to try and utilise expertise from available tech to make these ‘buds convenient for the day-to-day user who will want to use them for calls.
However, we did struggle at times with the wireless connectivity. It started with the initial setup, where we struggled with pairing and with getting the ‘buds to be discoverable – to the point where we went through the factory reset guide. Even then we could only get one ‘bud to pair – something that a firmware fix attempt couldn’t sort.
Given that all controls, including the reset process, are activated using that shiny touch-sensitive surface with no physical feedback, it’s not exactly easy to perform such tasks. We’d much rather the Touch adopted a similar approach to the companies that put a single physical pairing button on the case itself, rather than asking you to press-and-hold a curved, naturally slippery touch-sensitive area on two earbuds simultaneously that may or may not respond as it’s supposed to. It’s finicky to say the least.
So we ended up seeking a replacement pair of the Melomania Touch just to make sure everything checked out. Which, generally speaking, has been the case. Once paired with replacement ‘buds, our connection has been reliable during our testing. With music playing, we’ve had no issues with the audio cutting out once it’s got going.
But connectivity didn’t seem to be quite as on the ball in all areas: upon initial connection, having removed the ‘buds from the case, the music would start in one ear before the other by a second or so. That wait isn’t the norm these days.
Battery life is strong, even in its normal usage mode. Up to seven hours out of the case at a time is more than enough for anyone, even if you’re taking a long journey. Cambridge Audio says you can get up to 50 hours total battery time if you’re happy switching to low power mode, but the process on how to do that isn’t exactly obvious or easy within the app, and really we’re just not sure it’s worth the hassle.
Getting up to 33 hours of total battery – including the charges in the case – is more than the average from most true wireless ‘buds, so that’ll do just fine in our book. Plus, you don’t have to put up with the lower quality sound you get from low power mode – and sound is actually this pair of ‘buds’ biggest plus point.
Sound quality
7mm drivers
Melomania app for EQ
As we’ve mentioned, the sound is highly dependent on the fit. For instance, if you have too snug a fit the bass will get a little too ‘boomy’ – particularly in songs where there’s some significant bass or bass drum powering the rhythm. As examples, the bass in Hoping by X Ambassadors or the kick drum in Dopeness by Black Eyed Peas. Some people might really like that high impact bass though. In a lot of songs it is highly enjoyable, but in others it just gets a tiny bit too much for our taste.
Thankfully, there’s a manual equaliser (EQ) to adjust the sound to your own preference. This is found within the app, where you can also enable and adjust the transparency mode to let in external audio, so that you’re not completely blocked off from the world.
Use a fit that’s less pressured and the sound changes a little to become a lot more natural and less bass heavy. Bass is still quite prominent, but it doesn’t detract from the rest of the frequencies. In fact, the 7mm drivers in these ‘buds are some of the most detailed you’ll hear at this price point.
So with the right fit you get loud and prominent bass, but also all of the subtleties elsewhere in the mix. Jangly piano is still bright and clear, as is subtle guitar string plucking, while vocals are delivered with clarity. Nothing is ever drowned out by those punchy bass notes. So all in all, it’s a dynamic sound that’s impressive at this end of the earbud market. And that’s what really matters.
Verdict
After loving the sound that came from the original Cambridge Audio Melomania 1, we had high hopes for the follow-up pair. And there’s no denying, the audio from the Melomania Touch is super – vibrant, punchy and hugely enjoyable.
But the earbuds suffer from connection issues and a design that’s just not hugely practical. Whether it’s the frustrating pairing process, or the fact that – at times – the connection to one of the ‘buds failed or was delayed, the experience lacked the polish we’d come to expect given the success of the first outing.
Still, once you have the Melomania Touch in your ears and are listening to music – and not touching them, because that touch-sensitive panel is easy to hit by accident – the music is so good.
In this price range you’re unlikely to find anything that sounds as dynamic and clear as these. We’re just wary given the connectivity ups and downs.
Also consider
Jabra Elite 75t
As reliable a pair of true wireless ‘buds as you’ll find. These in-ears are small, comfortable to wear and deliver a solid sound.
Read our review
Sony WF-SP800N
These sporty in-ears offer a lot of Sony’s smart ambient sound control and the noise-cancellation tech is the real star. Battery life is only average though, which is surprising given the (massive) size of the case. Overall these ‘buds sound great and offer plenty of customisation.
If you’re shopping for 144 Hz and 25 inches, the BenQ EX2510 is one of the best IPS panels we’ve seen for under $300. There’s no extended color, but it delivers top-notch gaming and surprisingly good HDR. Users seeking a high performance-to-price ratio should definitely check it out.
For
Good contrast and color accuracy
Decent HDR
Good gaming performance
Strong build quality
Against
No extended color
No dynamic contrast in HDR
Features and Specifications
The price of a good gaming monitor is generally dictated by screen size, resolution and refresh rate. Other gaming features, like Adaptive-Sync, are pretty much a given for any display marketed to enthusiasts. And color accuracy and build quality do not necessarily go hand-in-hand with cost.
BenQ may not be as well known for its gaming screens as brands like Asus or Acer, but it offers products that deliver performance, quality and value. Lately, it has brought out new models with interesting names like Zowie and Mobiuz. But these creative monikers don’t attempt to make up for any shortfall. The new Mobiuz EX2510 is a great example. It’s a 25-inch, 1080p resolution IPS panel with a 144 Hz refresh rate, FreeSync and G-Sync compatibility and HDR with BenQ’s HDRi emulation mode. At publication time, it’s selling for around $250, making it much more affordable than many of the market’s best gaming monitors.
BenQ Mobiuz EX2510 Specs
Panel Type / Backlight
IPS / W-LED, edge array
Screen Size / Aspect Ratio
24.5 inches / 16:9
Max Resolution & Refresh Rate
1920 x 1080 @ 144 Hz
FreeSync: 48-144 Hz
Native Color Depth & Gamut
8-bit / sRGB; HDR10
Response Time (GTG)
2ms
Brightness
400 nits
Contrast
1,000:1
Speakers
2x 2.5w treVolo audio w/DSP
Video Inputs
1x DisplayPort 1.2
2x HDMI 2.0
Audio
3.5mm headphone output
USB 3.0
None
Power Consumption
15.8w, brightness @ 200 nits
Panel Dimensions WxHxD w/base
22 x 15.5-20.5 x 8.5 inches (559 x 394-521 x 216mm)
Panel Thickness
2 inches (51mm)
Bezel Width
Top/sides: 0.3 inch (7mm)
Bottom: 0.8 inch (21mm)
Weight
12.4 pounds (5.6kg)
Warranty
3 years
The 25-inch gaming monitor category is filled with super-fast 1080p resolution models running above 240 Hz and priced at the premium level. The 360 Hz Asus ROG Swift PG259QN is a perfect example. It’s a 25-inch, 1080p IPS monitor that costs an eye-watering $700. But if you’re OK with 144 Hz, you can save quite a bit of money. In fact, the BenQ EX2510 is a great alternative to 27-inch 1080p and 1440p monitors that typically cost about $75-100 more.
The EX2510’s 144 Hz refresh rate is achieved without overclock. The monitor’s AMD FreeSync-certified, and we were also able to run Nvidia G-Sync on it, even though it’s not certified (to do this yourself, check out our How to Run G-Sync on a FreeSync Monitor tutorial).
Though it’s compatible with HDR10 signals, the EX2510 does not include an extended color gamut. Color depth is a true 8 bits, achieved without Frame Rate Control (FRC), and the backlight is flicker-free.
Assembly and Accessories of BenQ Mobiuz EX2510
The EX2510’s build quality is apparent when you unpack its three parts. The base is nicely finished in silver with an orange rubber accent across the front. The upright is quite heavy and solid. Just attach it to the base with a captive bolt. The panel then snaps in place. A 100mm VESA mount is included for aftermarket hardware.
In the box, you’ll find an HDMI cable and IEC power cord for the internal power supply. There’s also a snap-on cover for the input panel. You can pass the cables through a hole in the upright for a tidier look.
BenQ Mobiuz EX2510: Product 360
Image 1 of 4
Image 2 of 4
Image 3 of 4
Image 4 of 4
The BenQ EX2510’s styling is somewhat blocky and understated for a gaming monitor, but it’s all about function. Straight lines dominate its shape with the only curve being a smooth taper across the back. The bezel is flush and free of physical framing, but you can see a thin 7mm border when the power’s on. The bottom trim is 21mm wide and features the BenQ logo and an HDRi button (more on that later). The front anti-glare layer is free of grain and presents a sharp image without reflecting any room light.
The stand features 5 inches of height adjustment plus 20-degree swivel each way and -5/20 degrees of tilt. There is no portrait mode. Movements are very firm and solid with no play at all. Even if you shake up your desk during intense frag sessions the EX2510 will stay put.
In addition to the HDRi button on the front, there are two more keys in the back-right corner, plus a joystick for menu navigation. One key toggles power, its status indicated by a white LED, and the other changes the signal source. The controls click firmly and respond quickly to user input.
In the bezel’s center, you can see a small protrusion that houses the sensor for BenQ’s Brightness Intelligence Plus (BI+) feature. It works with two of the HDRi modes to adjust brightness and color temperature to the environment. It responds quickly to changes and, in most cases, you won’t see it working.
The input panel includes two HDMI 2.0 ports and a DisplayPort version 1.2 (for help picking one, check out our HDMI vs DisplayPort article). You also get a 3.5mm audio jack for headphones or powered speakers.
OSD Features of BenQ Mobiuz EX2510
The on-screen display (OSD) appears when you press the joystick button and scroll down to the menu option. You can configure the quick menu to allow easy access to commonly used options like brightness or picture mode.
The EX2510 includes three HDR and seven SDR presets. Standard is the default and most accurate choice with good out-of-box color and access to all picture options, like gamma and color temp. Other features include Light Tuner, which changes highlight and shadow detail levels. You can also access a low blue light mode from the Eye Care menu, along with the aforementioned BI+ feature, which engages the light sensor to change brightness and color temp automatically for different ambient lighting conditions.
There are five gamma options and three preset color temps, plus a very precise user mode, which we used to calibrate the EX2510 to a high standard. Also here is AMA, BenQ’s term for overdrive. It worked well on the highest of three settings to curb motion blur without ghosting artifacts. If you want to try the backlight strobe for blur reduction, you’ll have to turn off FreeSync.
For HDR content, the EX2510 will automatically switch to its default HDR mode, which is the best of the three. Cinema and Game turn the color temp quite blue, though the effect varies if you use the BI+ sensor. For testing purposes, we left these automatic enhancements off.
To engage the bezel’s light sensor, turn it on in the Eye Care menu. It can vary brightness and color temp in the HDRi modes, as well as adjust itself over time to prevent eye fatigue. BenQ also includes red and green filters with 20 steps each to compensate for varying levels of user color blindness.
BenQ Mobiuz EX2510 Calibration Settings
In its Standard picture mode, the EX2510 is very accurate in the sRGB color space with no need for calibration. But a few gains are possible with adjustment of the RGB sliders in the user color temp mode. Gamma is spot-on with no adjustment necessary. Other picture modes are less accurate but may appeal to users playing specific game types. Using the HDRi emulations makes SDR content punchier but at the expense of some clarity in shadow and highlight areas. On page four, we’ll show you its effect with a few measurement charts.
Below are our recommended calibration settings for the BenQ Mobiuz EX2510 and SDR content. They produce perfect gamma with very accurate grayscale and color tracking.
Picture Mode
Standard
Brightness 200 nits
56
Brightness 120 nits
25
Brightness 100 nits
18
Brightness 80 nits
10 (min. 53 nits)
Contrast
50
Gamma
3
Color Temp User
Red 97, Green 99, Blue 100
For HDR, the best picture comes in the default mode.
Gaming and Hands-on with BenQ Mobiuz EX2510
One of the EX2510’s most distinctive features is its HDRi emulation mode. It’s accessed by a button on the front panel and can give an HDR look to SDR content. HDRi works by manipulating gamma to increase perceived contrast. While not strictly accurate, it may appeal to some.
BenQ simplified the EX2510’s image options by eliminating any sort of dynamic contrast feature and leaving the HDRi modes the task of altering contrast for SDR content. We tested the three HDRi modes — HDR, Game HDRi and Cinema HDRi — with Windows apps and various games. It’s also possible to use the Cinema and Game modes with HDR-encoded content.
In SDR mode, we booted up Tomb Raider, and all three HDRi modes degraded the image to varying degrees. HDR was the least offensive but darkened the picture too much overall. Brighter scenes looked about the same, but dimly lit indoor areas were too hard to make out. Game and Cinema HDRi made the effect worse and created a blue tint over everything. With these observations in mind, we recommend avoiding the HDRi HDR emulations unless the content is predominantly bright, like a sports game or animated movie.
Turning on HDR in the Windows Control Panel had a positive effect. Very few HDR monitors actually look good running things like word processors and spreadsheets, but the EX2510 is an exception. By default, the monitor is set to 100% brightness with HDR content. That isn’t as harsh as you might think, given that it’s peaking at around 450 nits. Small highlights in photos and YouTube videos popped nicely, and the overall picture was very pleasing to look at.
Our only complaint is elevated black levels. Though perceived contrast is very good, dark material looked a bit too gray and washed out. A Harry Potter film, for example, looked murky. You’re better off watching your HDR movies in SDR mode. Since there’s no extended color gamut, you won’t see any difference in saturation between SDR and HDR. But in terms of color overall, the monitor has very accurate color tracking, so we didn’t miss the DCI-P3 color space too much.
With Windows HDR on, we played a few rounds of Call of Duty: WWII. This title makes great use of HDR, which is why we use it for testing. Bright cutscenes looked incredibly lifelike with sharp highlights and loads of detail on the EX2510. Darker areas were a bit gray, but detail was still easy to see.
In all cases, we had no trouble with video processing. You can set overdrive to its highest value without ghosting, and blur was a non-issue. Adaptive-Sync worked perfectly on both AMD and Nvidia platforms (even though it’s not G-Sync-certified) with or without HDR. Frame rates stayed maxed at 144 frames per second (fps) in all the games we played. Input lag was also a non-issue with snappy control response and no stuttering or flicker. At this price, it’s hard to imagine finding superior gaming performance.
BenQ put extra effort into its audio by tuning the built-in speakers with a technology called TreVolo. It’s a digital signal processor (DSP) devoted to tweaking the frequency response and phase of the speakers. There are three sound modes, and though they don’t deliver thumping bass at ear-bleeding levels, they sounded better than the average monitor speakers.
Another bonus feature is in the ability to engage a sensor to alter brightness and color temperature to better suit the room’s lighting. This is also something that will deviate from accepted imaging parameters but won’t degrade the picture.
The Razer BlackWidow V3 is a good performer that offers more customization than some of its competitors. However, for fans of pink there are cosmetic issues that make the Quartz Edition hard to recommend over the black model, especially with some of the default lighting effects.
For
Solidly built
Comfortable typing
Against
Similar cosmetic issues to other BlackWidow V3 keyboards
Very large
The best gaming keyboards come in many shapes, sizes and styles. With RGB now basically expected of gaming clackers, how can a keyboard stand out in the looks department? Premium media controls can help. You can also go for a nice wrist rest or fancy keycaps and more. But nothing quite makes a keyboard stand out the way a pink color scheme does.
The Razer BlackWidow V3 ($140 as of writing) comes in black but is also available in pink, dubbed Quartz Edition. But not all pinks are made the same, and there’s, of course, more to a keyboard than its looks. Let’s find out if there’s a quality keyboard under that unique pigment, or if Razer simply decided to put lipstick on a pig.
Razer BlackWidow V3 Specs
Switches
Razer Green (clicky) or Razer Yellow (linear)
Lighting
Per-key RGB
Onboard Storage
5 profiles
Media Keys
Yes
Interface
USB 2.0 Type-A
Cable
Attached, rubber
Additional Ports
None
Key Caps
Doubleshot ABS plastic
Construction
Aluminum top plate, plastic base
Software
Razer Synapse
Dimensions (LxWxH)
17.8 x 6.1 x 1.7 inches (45.2 x 15.5 x 4.3cm)
Weight
2.2 pounds (997.9g)
Design of the Razer BlackWidow V3 Quartz Edition
Image 1 of 6
Image 2 of 6
Image 3 of 6
Image 4 of 6
Image 5 of 6
Image 6 of 6
Razer’s products often feature a black, green and grey color palette that looks striking to some and dull to others. The BlackWidow V3 can match this theme with a black hue, but the BlackWidow V3 Quartz Edition we’re reviewing distinguishes itself from the rest of the lineup with its powder pink pigment. This color will prove even more divisive (which is probably why most keyboard manufacturers opt for ol’ reliable), but there’s no denying that it makes Razer’s Quartz collection distinctive.
Underneath that pink color scheme lies a full keyboard that seems massive in comparison to the Razer BlackWidow V3 Tenkeyless, especially with the optional wrist rest attached. If the tenkeyless model is a teacup pig, the Quartz edition BlackWidow V3 is a warthog. We expected this take on the keyboard to be bigger than the TKL model, but we weren’t expecting this large a difference. Without the wrist rest, this behemoth is 17.8 inches long, 6.1 inches deep and 1.7 inches tall, compared to the tenkeyless model’s 1.62 x 14.26 x 6.1 inches, and weighs 2.2 pounds to its smaller counterpart’s 1.85 pounds. Where the TKL felt reassuringly dense thanks to its aluminum top plate, this model is simply big and it demands attention.
Compared to other full-sized keyboards with dedicated media controls, though, the BlackWidow V3 is of a comparable size, if a bit long. For example, the Redragon K580 Vata (17.6 x 7.6 x 1.5 inches) is slightly less long but deeper and not as tall, while the HyperX Alloy Elite 2 (17.5 x 6.9 x 1.5 inches) is a little less long and tall, but deeper, than our review focus.
Nor were we expecting Razer to replace the base model BlackWidow V3’s volume roller. The black model’s roller features grooves that make it easier to, well, roll. This version replaces those grooves with a bunch of spikes that make the Quartz edition BlackWidow V3 look like something one might find in an accessories store that caters specifically to teenage girls. That one change had a dramatic effect on the keyboard’s style.
Similar to the BlackWidow V3 Tenkeyless, this full-sized version has some (in my opinion, ugly) keycaps that make the RGB backlighting look worse than it should. They’re doubleshot ABS plastic, which means they should be a bit more durable than your standard keycaps and the writing won’t fade. But the RGB comes through unevenly and looks smudged. That problem was frustrating on the TKL, but at least the issue was limited to the lighting. On the BlackWidow V3 Quartz Edition, the uneven RGB makes the pink coloring look washed out and kind of sickly.
This might seem overly critical of a keyboard’s color scheme, but that color palette is this keyboard’s raison d’etre. People are supposed to buy this version of the BlackWidow V3 because it’s pink, and that aesthetic choice will probably be weighed more heavily than anything else about the keyboard’s design. It’s a shame that such a potentially compelling look was let down by the same issues as other BlackWidow V3 models.
Of course, you can opt for the black version, but we expect the RGB to look uneven there too, like it did on the black version of the BlackWidow V3 Tenkeyless we tested.
Typing Experience on the Razer BlackWidow V3 Quartz Edition
Razer offers the BlackWidow V3 with its clicky green or linear yellow mechanical switches. Our review model came equipped with Razer Green switches that boast a 50g actuation force, 1.9mm actuation point, and a 0.4mm difference between the actuation and reset point with a total travel distance of 4mm. The yellow switches change those specs to 45g, 1.2mm and 0mm, respectively, with a total travel distance of 3.5mm. Razer Green and Yellow switches can also be found in other BlackWidow V3 models, such as the wireless Razer BlackWidow V3 Pro or aforementioned TKL version.
We ran through 10fastfingers.com’s typing test with the BlackWidow V3, a Logitech G Pro with Romer-G switches and the Apple Magic Keyboard for iPad Pro to get as close to an objective look at our performance on the keyboard as we could get. (The test isn’t perfect, and there’s bound to be variance, but at least it’s quantifiable.) We did the test three times on each keyboard and took the average for the final result.
The results: 114.66 words per minute (wpm) with 95.44% accuracy on the BlackWidow V3, 116.3 wpm with 97.47% accuracy on the G Pro and 114 wpm with 97.24% accuracy on the Apple Magic Keyboard. Razer’s offering performed well considering our familiarity with the other keyboards—we’ve used the G Pro for several years and spend a lot of time using the iPad Pro. Still, the head-to-head tests highlighted some of the BlackWidow V3’s flaws.
The biggest issue was notable pinging on many keys when they’re struck with enough force to bang out more than 100 wpm. The space bar offered a hollow “thud” in between words, too, which made the tests a bit maddening. There’s an important distinction between the pleasant “clack” of a clicky mechanical switch and the unpleasant “ping” of an unhappy spring. We found ourselves typing much slower in normal usage to avoid the latter.
It was also difficult to get the BlackWidow V3 in a comfortable position. The keyboard itself is laid out well—we didn’t notice any undue discomfort during everyday usage, the typing tests, or the writing of this review. But it’s a massive keyboard that we struggled to make room for on our desk in a way that made it easy to reach the mouse as well. (More on that in a moment.) Whether or not the number pad is worth that much space is subjective.
With its clicky Razer Green switches, the BlackWidow V3 will do fine during everyday use, despite pinging issues when subjected to particularly forceful typing. Intrepid buyers could probably solve that problem with a bit of lube, too.
Gaming Experience on the Razer BlackWidow V3 Quartz Edition
This BlackWidow V3 proved as responsive as desired in-game. Something as simple as peeking after a flash in a game like Valorant requires a lot of key presses: Q to prep the flash, A to peek the corner, D to counter-strafe in time to make an accurate shot, Ctrl to crouch when you need to commit to a spray, etc. It never felt like that sequence was messed up because of the keyboard. (Let’s just say that our typing speed doesn’t always translate to in-game key presses.)
Our fingertips didn’t slide from the doubleshot ABS keycaps, it was comfortable to hover like a claw over that all-important WASD cluster and it didn’t take us any time to adapt to the layout when we needed to reach additional keys. We missed having an identifier on the “W” key, like what you’ll find on the Roccat Vulcan TKL Pro, but aside from that, Razer’s offering was pretty standard on the gaming keyboard front. It didn’t make us any better, nor did it make us any worse.
But this is where the inclusion of a number pad becomes more divisive. TKL keyboards have become increasingly popular, in part because people who play first-person shooters, action games and other genres that don’t rely on that cluster of keys want as much room as possible for their best gaming mouse and mousepad.
Having a behemoth keyboard like the BlackWidow V3 directly violates that principle. We simply couldn’t find a way to make the keyboard fit next to our large-sized Razer Gigantus V2 mousepad in a way that A) was ergonomically viable and B) didn’t look absolutely ridiculous to onlookers.
How much this matters will depend on the games you like to play. Tactical shooters like Valorant and Counter-Strike: Global Offensive reward low in-game sensitivities, which means having a spacious, ergonomic setup is vital. Games that don’t require precise aim can make do with a smaller area of the desk. That isn’t to say ergonomics should ever be overlooked — everyone should try to make their setups as comfortable as possible — but it does mean space is relative.
Those who prioritize desk space would probably be better served by the TKL version of the keyboard, however, and those seeking maximum responsiveness should probably opt for linear Razer Yellow switches.
Features and Software on the Razer BlackWidow V3 Quartz Edition
The BlackWidow V3 relies on Razer Synapse 3 and the Chroma Studio add-on for its customization. Synapse 3 offers the ability to disable the Windows key by toggling Gaming Mode, determine when the keyboard should go to sleep and choose from a list of default RGB effects. You can use Chroma Studio to set per-key RGB lighting in the 16.8 million color spectrum. We don’t necessarily love having to install multiple apps to access those settings, but it’s become the status quo for Razer, so anyone who’s purchased the company’s peripherals before has probably already decided they’re content to install whatever they need to along the way.
Razer’s software also includes the Razer Hypershift feature that allows every key to perform a secondary function–such as launching apps or running a macro–while a designated Hypershift key is held down. Note that the keyboard also allows for on-the-fly macro recording.
This customization also extends to the volume knob and media buttons mentioned earlier. Those descriptions apply to the inputs’ default functions, but Synapse 3 can be used to customize the “multi-function roller wheel” to serve a variety of purposes, while the button next to it can be programmed just like any other key on the keyboard.
Unlike the BlackWidow V3 TKL, this version of the keyboard has on-board storage with support for up to five profiles. People who create Razer accounts can sync profiles across devices using Synapse 3.
Bottom Line
There’s no denying that the Razer BlackWidow V3 has a strong foundation. We’ve liked the other BlackWidow V3 models we’ve reviewed, and they share a lot of similarities, so it’s not surprising we like this one as well. Objectively speaking, Razer made a quality keyboard, though it doesn’t exactly break the mold.
If you opt for the pink SKU, the BlackWidow V3 Quartz Edition looks more unique. But you probably shouldn’t commit to it without seeing it in person, because it can look very different in real life than it does in photos (it’s hard to take a decent photo that captures how off-putting the keyboard looks at times). We were excited to test the Quartz Edition because we thought it’d stand out, but the pink here is washed out and marred by other cosmetic issues.
If you’re looking for a keyboard in this price range with a striking look, the HyperX Alloy Elite 2 is worth considering, and the Thermaltake Level 20 is a juggernaut of its own. Meanwhile, the Patriot Viper V765 brings an RGB deck and is much cheaper than the BlackWidow V3 currently.
But with a solid construction and cozy typing experience, the BlackWidow V3 is a worthy full-sized competitor, if you can find a hue you like.
Along with news, features, opinions, and tech reviews, video has become an increasingly important part of The Verge’s content. But to make great, involving videos, you’ve got to have staff with the expertise to create that video — along with the tools that allow those staff members to let their imaginations soar.
Alix Diaconis is one of the directors who helps make video magic for The Verge. We talked to Alix about what she does and what tools she uses.
Alix, what do you do for The Verge?
I’m one of the video directors for The Verge. I get to work every day with my three co-workers (but really, friends) to create the videos on The Verge’s YouTube channel. Sometimes deadlines are fast because tech and news are fast, but our team has been working together for years, so even live events feel seamless and fun. We each shoot, take photos, and edit; then the video gets treated by our sound and graphics wizards. Then bam, on to the next one!
What hardware and software tools are needed to produce a video for a site like The Verge?
It really varies video to video. For some videos, we’ll pull out all the stops, while for others, we need to do quick and light. Heck, I think we’ve shot videos with just a GoPro.
When we go to a press event, we’ll keep it very light with a monopod, lavalier microphone, and a camera we feel most comfortable with. And then I’ll edit at the event on my MacBook Pro.
But most of the time when we’re shooting on location, we’ll bring a bigger kit with an HD monitor, a slider (which helps you do tracking shots), maybe a drone. And when we’re making the big stuff, like a flagship phone review, we like to bring out everything, including a probe lens like the Venus Optics Laowa to make intro shots like this.
The opening shot on this video was created using a probe lens.
Since we’re uploading videos for our job, good internet upload speeds make life a lot easier. We also have a shared server so we have access to our terabytes and terabytes of footage at all times.
Oh, and also teamwork. Lots and lots of teamwork.
What specific hardware tools do you use for your work?
For shooting, I prefer to use the Canon EOS C200 — I think it looks really cinematic — and my preferred lens is the Canon EF 70-200mm (for B-roll at least). Sometimes I’ll use the Sony A7S II or III, which looks extra crisp, but I’m not a big fan of Sony menus. For sound, I’ll typically use a Sennheiser G3 lavalier or a Zoom H6 recorder. For photos, I use the Canon 50D.
For post-production in The Verge offices, I would edit on a 27-inch iMac, which is due for an upgrade. At home, though, I have a more powerful editing PC that my producer built for me. It has an AMD Ryzen 7 3700X 8-core processor, 2TB NVMe drive, a Radeon RX 580 series video card, 32GB RAM, and an Asus 28-inch 4K display. Of course, there are always technical issues — it’s part of editing — but the PC is the best editing machine I’ve personally owned. (Thank you, Phil!) I do miss the beautiful iMac display though.
Also, since video takes up a lot of space, I’ll sometimes use an additional SSD for projects. And as for headphones, I use the Sony MDR-7506, which are the only headphones I can wear comfortably all day.
And then there’s the fun, random gear: a GoPro Hero 8, an Insta360 panoramic video camera (which we recently used for this e-bike video), a Zhiyun Crane, a DJI Mavic Pro drone… and whatever else we can get our hands on.
This video was created using an Insta360 panoramic video camera.
What software tools do you use for your work?
All Adobe everything. Premiere Pro for editing, After Effects for basic graphics, and Photoshop for the video thumbnails. You can do a lot in Premiere, but it does have its bugs, and it’s not always optimized for Apple’s hardware.
What tools do you use for your own projects?
I’ve been teaching myself DaVinci Resolve to color footage. I still barely understand the program, but it makes footage look 100x better than coloring it in Premiere. And purely for fun, I shoot 35mm film on my dad’s old Minolta camera.
What hardware and software tools would you recommend for somebody just starting out?
Premiere is very common for editing. But if you want to try something free and you have an iPhone or iPad, there’s the Splice app. It’s really intuitive, but you’re limited to clips you have on your device. There’s also DaVinci Resolve, which is free and as advanced as most paid editing software.
As for cameras, just get one that you feel comfortable using! And for a computer, invest in a good one if you see yourself editing for a long time; iMacs and Windows PCs are both good, and the specs will just depend on how big your projects will be. I haven’t had a chance to use Apple’s new M1 MacBook Air or Pro yet, but both seem like good choices if you’d prefer a laptop.