AMD will reveal a new Radeon RX 6000 graphics card during Episode 3 of its “Where Gaming Begins” event on March 3 at 11 AM US Eastern. Although the chipmaker didn’t specify which model, it’s likely going to be the much-awaited Radeon RX 6700 XT. Following AMD’s Big Navi release pattern, the Radeon RX 6700 XT is the next SKU in line, after all.
AMD’s render of the RDNA 2 graphics card aligns with a previously leaked design of what the Radeon RX 6700 could look like. On an aesthetic level, the Radeon RX 6700 XT shares similar traits with the reference design for other Big Navi models, such as the Radeon RX 6900 XT, RX 6800 XT and RX 6800. However, the Radeon RX 6700 XT features a less robust cooling system with a dual-slot design.
The Radeon RX 6700 XT emerges with a shorter cooler and only two cooling fans. A quick glimpse at the front of the graphics card reveals three DisplayPort 1.4a outputs and a single HDMI 2.1 port. It would seem that AMD has removed the USB Type-C connector on the Radeon RX 6700 XT. While the USB Type-C port has its uses, it never really took off, so many consumers will be pleased to know that it has been replaced with an extra DisplayPort 1.4a output instead.
“On March 3rd, the journey continues for #RDNA2. Join us at 11AM US Eastern as we reveal the latest addition to the @AMD Radeon RX 6000 graphics family. https://t.co/5CFvT9D2SR” (@AMD via Twitter, February 24, 2021)
The Radeon RX 6700 XT will be gunning after Nvidia’s mid-range Ampere-based graphics cards, such as the GeForce RTX 3060 that launches tomorrow. The specifications for the new Big Navi (I guess this is really Medium Navi) graphics card are still blurry, but we expect to see a full Navi 22 (codename Navy Flounder) die, which houses 40 Compute Units (CUs). As AMD has done in the past, it’s reasonable to think that the chipmaker would also put out a Radeon RX 6700, which would probably leverage a cut-down version of the Navi 22 silicon.
The rumors are painting the Radeon RX 6700 XT and RX 6700 with 2,560 and 2,304 Stream Processors (SPs), respectively. Assuming that the SP count is accurate, the XT variant will have 40 Ray Accelerators at its disposal, while the non-XT variant should be equipped with 36 of them. (RDNA 2 pairs one Ray Accelerator with each 64-SP Compute Unit, so 2,560 SPs works out to 40 CUs and 2,304 SPs to 36.)
On the memory front, Gigabyte has registered multiple custom Radeon RX 6700 XT graphics cards with the EEC (Eurasian Economic Commission), all with 12GB of GDDR6 memory. Similarly, ASRock has submitted a couple of Radeon RX 6700 SKUs with 6GB of GDDR6 memory.
Pricing and performance are important, but availability has arguably become the bigger concern these days, given the graphics card shortages, crypto-mining boom and scalpers. AMD has made it clear that it’ll announce a Radeon RX 6000 graphics card on March 3. However, it’ll be interesting to see whether the card goes on sale sooner rather than later.
AMD has announced a new hardware event for next month, where the company plans to unveil the next GPU in its Radeon RX 6000 line of cards. The presentation will air on March 3rd at 8AM PT / 11AM ET.
Like other GPUs in the RX 6000 series, this new model will use the RDNA 2 architecture, including real-time, hardware-accelerated ray tracing and variable rate shading. AMD’s announcement of a new GPU presentation comes just one day before sales kick off for Nvidia’s affordable RTX 3060 GPU at select retailers.
AMD debuted the Radeon RX 6000 line of graphics cards in late October, with the GPUs serving as a direct competitor to Nvidia’s RTX 30 cards. Currently, the RX 6000 consists of three GPUs: the flagship RX 6900 XT, the $649 RX 6800 XT, and the RX 6800, which is the most affordable of the trio at $579.
With the announcement of a new RX 6000 card coming, we anticipate that, like other GPUs in this series, it will sell out quickly. In January, AMD told The Verge that within the first quarter of 2021, it expects to sell more of its own RX 6000 cards through its website, which is bittersweet news considering the RX 6000 has been difficult to buy.
After updating the ThinkPad X1 line at CES, Lenovo is now opening the floodgates and releasing refreshed versions of its more mainstream ThinkPad T, X and L series, as well as some updates to the P-series workstations.
The new T-series, among the most popular ThinkPads around, are getting a slew of changes. There are Intel 11th Gen Core versions: the ThinkPad T14 i, T14s i and T15; and AMD Ryzen 5000 options: the ThinkPad T14 and T14s.
| Laptop | Starting Price | Available | Intel or AMD-based |
| --- | --- | --- | --- |
| ThinkPad T14s i | $1,499 | March 2021 | Intel |
| ThinkPad T14s | $1,279 | May 2021 | AMD |
| ThinkPad T14 i | $1,379 | March 2021 | Intel |
| ThinkPad T14 | $1,159 | May 2021 | AMD |
| ThinkPad T15 | $1,379 | March 2021 | Intel |
| ThinkPad X13 Yoga | $1,379 | April 2021 | Intel |
| ThinkPad X13 i | $1,299 | March 2021 | Intel |
| ThinkPad X13 | $1,139 | May 2021 | AMD |
| ThinkPad L14 | $689 | May 2021 | AMD |
| ThinkPad L15 | $689 | May 2021 | AMD |
| ThinkPad P14s i | $1,389 | March 2021 | Intel |
| ThinkPad P14s | $1,169 | May 2021 | AMD |
| ThinkPad P15s | $1,389 | March 2021 | Intel |
The 14-inch models are getting aluminum chassis, rather than silver paint over the typical black shell. For connectivity, Lenovo will offer optional 5G (sub-6 GHz) or 4G. Most Intel models will use Wi-Fi 6E (though some will use Wi-Fi 6), while AMD options will use regular Wi-Fi 6 exclusively.
But the T14s models are the only ones with an FHD IR webcam, which should make for superior video chatting, especially while so many people are working from home. On the T14 and T15 versions, it’s still standard 720p (with or without IR).
Lenovo’s more portability-focused X13 line gets an Intel-based ThinkPad X13 i and X13 Yoga, while the AMD version is the ThinkPad X13. They’re getting 13.3-inch, 16:10 aspect ratio displays, joining a trend of taller screens.
There’s a similar split with the more affordable ThinkPad L series: The Intel-based ThinkPad L14 i and L15 i will have 11th Gen Core vPro, while the AMD-based L14 and L15 will get Ryzen 5000.
The T14s, T14, X13 and X13 Yoga are getting new aluminum chassis options, while the T14s, X13 and X13 Yoga will all feature “Human Presence Detection” in order to lock when you’re not around. That latter feature can work with Windows Hello and the infrared camera to make it so you can log in and then lock the screen, all without ever entering passwords or even touching the machine.
For those who prefer fingerprint readers to facial recognition, the T14s, X13 and X13 Yoga will all have a reader built into the power button.
Additionally, Lenovo is refreshing its workstations, with the P14s using Ryzen and the P14s i and P15s running 11th Gen Core vPro. The Intel models will offer up to Nvidia T500 graphics or Intel Iris Xe, while the AMD models will use integrated Radeon graphics. Again, the Intel laptops will have Wi-Fi 6E while the AMD versions have Wi-Fi 6.
The Intel models are scheduled to start shipping next month, while the AMD models are coming later in the spring. We’ll see how they stack up and if any vie for the title of best ultrabook as they come out.
Another hint at AMD’s unannounced mid-range RDNA 2 cards has been spotted online: This time, preliminary support for AMD’s future Radeon RX 6700 and RX 6600 GPUs has appeared in a new version of GPU-Z, a popular GPU monitoring tool developed by TechPowerUp.
We’ve heard rumors for the past six months that an RX 6700 and RX 6600 (XT) are at least planned by AMD. The rumors started last year when Navy Flounder (Navi 22) and Dimgrey Cavefish (Navi 23) showed up in a leak in the Mesa 3D graphics library. We believe Navi 22 and Navi 23 will be the dies used to produce RX 6700- and RX 6600-series devices.
More recently, there have been filings with the EEC on multiple occasions regarding an RX 6700 XT 12GB, RX 6700 6GB and an RX 6600 XT 12GB from Gigabyte, PowerColor and ASRock, possibly indicating that AMD could be close to production with these SKUs.
Now GPU-Z has official preliminary support for AMD’s Radeon RX 6700 and 6600 series GPUs, so we are almost certain that these cards are coming soon.
But with the massive silicon shortage going on right now, it seems almost unreasonable to produce another SKU at this time. However, AMD has no mid-range or entry-level competitor to Nvidia’s Ampere products so far. If it wants to maintain a competitive edge, as it has with the RX 6800/XT and RX 6900 XT, it’ll need to figure out some way to build these cards and maintain decent production.
But at least we now know that mid-range RDNA 2 is in the works at AMD, with GPU-Z officially listing the RX 6600 and RX 6700 series. Now it’s a question of when these cards will come out, and whether there will be another “paper” launch. We’ll have to wait and see.
When it comes to sharp image quality, 4K resolution is where it’s at in 2021. Sure, there are 8K screens and even more modest 6K ones. And lower resolutions deliver higher frame rates on even the best graphics cards. But if we’re being realistic about what our eyes need and can perceive, how big of a screen we can fit, our budget and the media available, 3840 x 2160 sits on the upper echelon of premium viewing experiences, whether you’re gaming, watching a movie, surfing the web or getting work done. And with one of the best budget 4K monitors, you can get there without going broke.
For a while, 4K was a luxury that wasn’t quite reasonable for a PC monitor. But as these high-res screens have become more common and the bleeding edge has turned to higher pixel counts, a market segment of budget 4K monitors now allow you to take the Ultra HD experience to your desktop.
Below are the best budget 4K monitors we’ve tested. All usually go for about $400 or less.
When shopping for the best budget 4K monitor, remember the following:
What size do you need? For a budget screen, 32 inches is a good sweet spot, giving you ample space while still being able to fit on your desktop. 28-inch and 27-inch screens are also common in this price range and will be cheaper. They’re good for productivity, but you probably won’t want to share a screen that size.
Decide the monitor’s main purpose. If it’s gaming, higher refresh rates and Adaptive-Sync (AMD FreeSync or Nvidia G-Sync) are priorities, as is a beefy graphics card. You should have a minimum of a GTX 1070 Ti or RX Vega 64 for medium settings or, for high or better settings, an RTX-series or Radeon VII. For general productivity or entertainment, look for high contrast for high image quality. Creatives should strive for accuracy. For more, see How to Buy a PC Monitor and our Best Gaming Monitors page.
Errors under 3 Delta E (dE) are generally invisible to the naked eye, while a monitor with a 5dE color error probably has colors that look visibly off. Accuracy matters more for creative work. (For the curious, there’s a quick sketch of how Delta E is computed after these tips.)
Do you need HDR? A 4K monitor with the right HDR implementation makes 4K/HDR content look significantly better than it would on a regular, or SDR, monitor. While many 4K monitors support HDR, few budget ones do it with noticeable impact. If you want a monitor that makes the HDR upgrade worth it, consider upping your budget to stay in 4K or opting for a lower resolution to save money. See our How to Choose the Best HDR Monitor article for more.
Consider ports and other features. Do you need HDMI 2.1 or the latest DisplayPort (1.4)? Are USB Type-A ports important, and do you want USB-C for charging or a single-cable setup? Speakers and the stand’s ability to tilt, swivel or rise are also factors.
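Delta E itself is nothing exotic: in its simplest (CIE76) form, it’s just the straight-line distance between two colors in CIELAB space. Here’s a minimal Python sketch using hypothetical sample readings, not measurements from any monitor on this page:

```python
import math

def delta_e_cie76(lab1, lab2):
    """CIE76 Delta E: straight-line distance between two CIELAB colors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical values: the color a test pattern requests vs. what a meter reads.
target = (53.2, 80.1, 67.2)    # L*, a*, b* of the requested color
measured = (52.8, 78.0, 65.9)  # L*, a*, b* measured off the screen

print(f"Delta E: {delta_e_cie76(target, measured):.2f}")  # ~2.50, under 3, so invisible
```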
The Samsung UR59C is the best budget 4K monitor, offering a 32-inch VA panel with accuracy and curves. Image quality is superb with bold, accurate colors and clear text — after calibration, that is. When we tested in sRGB mode, we recorded a color error of 4.3dE with visible errors, but our calibration (see our recommended settings on page 1 of the review) got it down to 0.9dE. Your web and games should look as intended. The UR59C also offers fantastic contrast, as expected from a VA panel, hitting an impressive 2,590.5:1 after calibration.
Ultrawide screens typically offer more noticeable curves, but despite its 16:9 aspect ratio, the UR59C’s 1500R curve is noticeable and beneficial, allowing us to keep more windows in view.
This monitor isn’t fit for serious gaming, but casual players can make it work. The UR59C has a 60 Hz refresh rate, 4ms response time and no FreeSync or G-Sync to fight screen tears; even a 75 Hz screen would deliver noticeably better response times and input lag scores. But with its high contrast and the pixel density of a 32-inch, 4K screen, games didn’t look bad. If you’re playing games that aren’t graphically intense or playing at lower settings, and you have a speedy enough graphics card to consistently hit 60 frames per second (fps), you can enjoy blur-free gaming on the UR59C.
You’ll have to pay a hefty price for a monitor that can push 8.3 million pixels at a 144 Hz refresh rate. The best budget 4K gaming monitor, the Asus TUF Gaming VG289Q, is a slower 60 Hz but fights screen tears with FreeSync. Yes, input lag is significantly larger than what you’ll find on a 144 Hz monitor, as is response time. But if you’re working with a budget graphics card and want your games to look detailed and realistic, this is a great option. SDR games looked extra colorful on the VG289Q, and dynamic contrast brought subtle visual benefits, like added dimension. There are screens on this page with better contrast though.
HDR isn’t as fantastic as you’ll find on a monitor with a full-array local dimming (FALD) backlight or even an edge array backlight, but shadows and highlights looked more distinct, and we enjoyed the boost in color.
For more premium high-res gaming screens, check out our Best 4K Gaming Monitors round-up.
If the best budget 4K monitor for you is in the 32-inch range, check out the LG 32UN500-W. Contrast is a top consideration when it comes to image quality, and the 32UN500-W’s VA panel didn’t disappoint in our benchmarks, hitting 2,353.9:1 out of the box. The 32UN500-W’s native color gamut is P3, and it covers that color space accurately without any visible errors.
Again, as a budget 4K monitor, the 32UN500-W isn’t winning any HDR prizes. Color lacks the expected pop, and overall the image doesn’t provide a noticeable boost over SDR.
But the 32UN500-W also thoughtfully includes two 5W speakers and even AMD FreeSync to fight screen tears during casual gaming. In general, it delivered popping colors with deep blacks, making it a great fit for your favorite 4K movie and the like.
The Dell S2721QS earns the title of best 27-inch budget 4K monitor with a bright screen, reliably accurate image and useful add-ons. Those bonus add-ons include the ability to connect multiple PCs and view them simultaneously via picture-in-picture or picture-by-picture and an optional app that makes it easy to calibrate the screen or arrange up to 6 windows in various preset layouts. The latter is a productivity boon.
HDR isn’t this monitor’s strong suit. We recorded undersaturated color in this mode, as well as visible grayscale errors. And this monitor doesn’t have the speed or Adaptive-Sync (FreeSync or G-Sync) to make it an appropriate gaming screen.
But in terms of image quality, this is a bright screen, hitting 393 nits in our testing, along with strong contrast for an IPS monitor (1,101:1). You can also expect accurate colors: We recorded just a 2.6dE error with sRGB color.
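For context on what those contrast numbers mean: a contrast ratio is simply peak white luminance divided by black luminance. A quick sketch with hypothetical nit values chosen to land near the VA and IPS figures quoted above:

```python
# Contrast ratio = white luminance / black luminance.
# Hypothetical values: a panel showing 200-nit white with two measured black levels.
white_nits = 200.0
for black_nits in (0.077, 0.18):        # ~VA-class vs. ~IPS-class black level
    print(f"black {black_nits} nits -> {white_nits / black_nits:,.0f}:1 contrast")
# 0.077 nits -> ~2,597:1 (VA territory); 0.18 nits -> ~1,111:1 (typical IPS)
```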
If you’re doing professional work, you should probably opt for a professional monitor. Pro monitors are known for offering exceptional accuracy for a premium price. But with monitors continuously improving, we’re at a point where you can find monitors with pro-level accuracy in key areas, like color, just without the pro-level price tag.
The HP U28 is one such screen and the best budget 4K monitor for creatives. None of the monitors on this page are color slouches, but the U28 stands out with its ability to accurately cover both the sRGB and P3 color spaces with just a switch in the OSD and no calibration. You also get an adjustable stand that allows height and swivel adjustments and the ability to flip into portrait mode, offering plenty of flexibility for creative work.
HP’s U28 comes at a premium though. While not as pricey as professional monitors, the U28 is the most expensive monitor on this list as of this writing.
Still, with a USB-C port letting you charge laptops (or other devices), you may be able to reduce cable clutter, and there are many other ports here too. With that bonus in mind and creative-level accuracy, the U28 is great for feeding your hobby or even career.
The Philips 558M1RY represents a price breakthrough in the jumbo gaming monitor category. Though it leaves out HDMI 2.1, it brings everything else to the gaming table with 120 Hz, adaptive sync and accurate DCI-P3 and sRGB color. Add in killer HDR and you have a winner for a relatively low price.
For
Good contrast
Color accurate
Bright
Perfect Adaptive-Sync at 120 Hz
Unbeatable audio quality
Against
No HDMI 2.1
No streaming apps
Features and Specifications
Go big or go home. Size matters. The bigger, the better. Whatever your favorite cliché, games are more fun when you play on a big screen. That’s part of the reason the PlayStation 5 (PS5), Xbox Series X and console gaming in general is so popular. It’s not just cost; consoles also make it easier to play on the large TV in the living room, rather than the desk-sized screen sitting in the home office.
But PCs have a huge performance advantage over consoles. You’re not going to hit 144 fps on a console, and you’re definitely not going to find a DisplayPort on a console or TV. For those committed to high performance and speed in one of the best 4K gaming monitors, the question is how much you’re willing to spend on a jumbo monitor.
If around $1,500 is within your budget, the Philips Momentum 558M1RY may be for you. It’s a 55-inch VA monitor specced for up to 1,200 nits brightness, HDR, AMD FreeSync and a 120 Hz refresh rate. If that’s not enough, the 558M1RY includes a high-quality soundbar from Bowers & Wilkins. Yes, that B&W. That’s a premium package at a premium price.
Philips Momentum 558M1RY Specs
| Spec | Value |
| --- | --- |
| Panel Type / Backlight | VA / W-LED, edge array |
| Screen Size / Aspect Ratio | 54.5 inches / 16:9 |
| Max Resolution & Refresh Rate | 3840×2160 @ 120 Hz; FreeSync: 48-120 Hz |
| Native Color Depth & Gamut | 10-bit / DCI-P3; DisplayHDR 1000, HDR10 |
| Response Time (GTG) | 4ms |
| Max Brightness | SDR: 750 nits; HDR: 1,200 nits |
| Contrast | 4,000:1 |
| Speakers | B&W 40-watt ported soundbar: 2x tweeters, 2x mid, 1x sub |
| Video Inputs | 1x DisplayPort 1.4; 3x HDMI 2.0 |
| Audio | 3.5mm headphone output |
| USB 3.2 | 1x up, 4x down |
| Power Consumption | 53.5w, brightness @ 200 nits |
| Panel Dimensions WxHxD w/base | 48.5 x 32.8 x 12.1 inches (1232 x 833 x 307mm) |
| Panel Thickness | 4 inches (102mm) |
| Bezel Width | Top/sides: 0.4 inch (10mm); Bottom: 0.9 inch (22mm) |
| Weight | 58.3 pounds (26.5kg) |
| Warranty | 4 years |
We’ve looked at a few jumbo monitors in the past, like the HP Omen X 65 Emperium and the Alienware 55 OLED panel. Both perform admirably but cost a fortune. In terms of jumbo gaming monitors, Philips’ 558M1RY is the least expensive we’ve seen yet.
There’s no mistaking the 558M1RY for a TV: It has no tuner and no smart TV apps. But despite Philips advertising the monitor as offering “new-level console gaming,” there is no HDMI 2.1 to support the new PS5 and Xbox consoles’ fastest frame rates. If you use the monitor with a console, you’ll be limited to a 60 Hz refresh rate unless you drop down to 1440p resolution, where you can reach 120 Hz. For 4K at 120 Hz, you have to use the DisplayPort connection, which, of course, is only found on PCs.
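That 60 Hz ceiling is a simple matter of link bandwidth. Here’s a rough back-of-the-envelope sketch; the payload rates and ~5% blanking overhead are approximations, so treat it as illustrative rather than exact:

```python
def video_gbps(width, height, refresh_hz, bits_per_pixel=24, blanking=1.05):
    """Approximate video bandwidth in Gbit/s, padded ~5% for reduced-blanking timings."""
    return width * height * refresh_hz * bits_per_pixel * blanking / 1e9

# Usable payload rates after 8b/10b encoding (raw rates: HDMI 2.0 = 18, DP 1.4 HBR3 = 32.4).
links = {"HDMI 2.0": 14.4, "DisplayPort 1.4 (HBR3)": 25.92}

for hz in (60, 120):
    need = video_gbps(3840, 2160, hz)
    fits = [name for name, cap in links.items() if need <= cap]
    print(f"4K @ {hz} Hz needs ~{need:.1f} Gbps -> fits: {', '.join(fits)}")
# 4K60 (~12.5 Gbps) squeezes into HDMI 2.0; 4K120 (~25.1 Gbps) only fits DisplayPort 1.4.
```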
A VA panel promises high native contrast, and there’s an extended color gamut plus an LED edge-array backlight that Philips says is good for 750 nits brightness with SDR content and a whopping 1,200 nits with HDR. That surpasses VESA’s highest HDR certification, DisplayHDR 1000, which requires 1,000 nits. The 558M1RY certainly has the tools to produce a stunning 4K image.
Gamers will enjoy the 558M1RY’s seamless AMD FreeSync Premium Pro implementation. Compared to standard FreeSync and FreeSync Premium, FreeSync Premium Pro adds HDR support and low latency in HDR mode. We also got Nvidia G-Sync to run on the screen, even though it’s not G-Sync Compatible-certified. (To learn how, see our How to Run G-Sync on a FreeSync Monitor tutorial.) We verified that both kinds of Adaptive-Sync work over a 48-120 Hz range with or without HDR through DisplayPort 1.4.
Assembly and Accessories on Philips Momentum 558M1RY
You’ll need a friend to help you unbox the Philips Momentum 558M1RY because it arrives fully assembled with stand and soundbar already fixed in place. The package weighs around 65 pounds in total. As with a desktop monitor, you get an IEC power cord plus HDMI, DisplayPort and USB cables. A tiny remote is also included which makes menu navigation a lot easier. If you plan to wall mount, there’s a 200mm VESA pattern in back with four large bolts included in the box.
Philips Momentum 558M1RY Product 360
It’s hard to appreciate the scale of the Philips Momentum 558M1RY from the photos, because it’s styled just like a standard PC monitor. The first clue that this is an extreme display is the soundbar firmly attached to the panel. These B&W speakers are covered with a burlap-like wool-blend fabric in dark gray. Meanwhile, the base and upright are very solid and stable and offer a tilt adjustment like a desktop display. The generally low-key design makes sense, considering that this will likely sit in a living or family room for all to see. There are no gaming cues in sight — until you activate the 558M1RY’s colorful lighting effect.
The 558M1RY features what Philips calls Ambiglow, a lighting feature across the sides and top of the panel’s backside. It can glow a single color with adjustable brightness, or you can set it to change according to what’s currently on the screen. That effect adds an interesting motion element you won’t find on any other gaming monitor. It works particularly well if you have a neutral-colored wall behind the screen.
From a side view, the 558M1RY looks chunky, with angles and straight lines making up its shape. In back, you can see a heat vent across the top and a tiny Philips logo. The small dots around the perimeter are the Ambiglow LEDs. On the soundbar, you can see a port on one side that extends the bass response lower.
Perfect for a living room, the 558M1RY even comes with a 6-inch-long wand-shaped remote that easily controls all monitor functions. You get a power toggle, plus mute, input and menu up top. After the four-way nav pad is picture mode and return. Two rockers at the bottom adjust brightness and volume.
The input panel is up under the upright and fairly hard to reach. You get three HDMI 2.0 inputs and a single DisplayPort 1.4. USB is version 3.2 and includes one upstream and four downstream ports. Two of them can charge or power devices when the 558M1RY is off.
OSD Features on Philips Momentum 558M1RY
With 12 logically arranged submenus, the on-screen display (OSD) is exactly like the one found in all the Philips monitors we’ve reviewed. You can access it with a joystick on the back-right side of the panel or the handy remote control.
First up is Ambiglow, the LED lighting effect. You can set any color to a steady state and adjust its brightness with a slider, or choose a random rotation of colors. The coolest feature is image match, where the colors change with the onscreen content. It sounds gimmicky, but in practice, it added an interesting dimension to both gameplay and video.
A Game Setting menu offers aiming crosshairs, a low input lag mode, which can be left on all the time, and a three-level overdrive. The speediest overdrive setting, Fastest, works well at reducing motion blur without leaving ghosting artifacts.
Most of the image controls are in the Picture menu, where you get brightness and contrast, along with sharpness and color saturation. At the top, SmartImage offers 7 different picture modes, which are task-specific. One of the modes focuses on improved screen uniformity. It delivered but reduced contrast in the process. Our sample didn’t need that feature though.
There are also five gamma presets in the Picture menu.
The Color menu offers color temp adjustments by Kelvin value or with RGB sliders. The Philips Momentum 558M1RY measures well out of the box and doesn’t need calibration, but there are slight gains available with a few adjustments. Here also is the sRGB mode, which effectively renders that gamut with decent gamma and grayscale tracking. Color purists will be happy to have this feature available.
Philips Momentum 558M1RY Calibration Settings
The 558M1RY has a native DCI-P3 color gamut that it uses for all content unless you engage the sRGB mode in the Color menu. Since sRGB mode can’t be calibrated, we calibrated the 558M1RY via the User Define color temp and left SmartImage off. With slight changes to the RGB sliders and a switch in gamma from 2.2 to 2.4, we achieved excellent results.
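If you’re wondering what that gamma switch actually does: gamma maps each normalized input level to output brightness as output = input^gamma, so a higher value darkens midtones. A quick illustration:

```python
# Gamma maps a normalized input level to output brightness: out = in ** gamma.
# A higher gamma darkens midtones, which reads as deeper shadows and richer color.
for gamma in (2.2, 2.4):
    mid = 0.5 ** gamma
    print(f"gamma {gamma}: 50% input -> {mid:.3f} output")
# gamma 2.2: 0.218 | gamma 2.4: 0.189 -- the same midtone renders about 13% darker
```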
You’ll notice in the table below that we could only turn the backlight down to 105 nits minimum. That’s a bit bright for gaming in a completely dark room, but you can get some relief by turning on Ambiglow and setting it to a dim white.
Here are the calibration settings we used for SDR mode on the 558M1RY:
| Setting | Value |
| --- | --- |
| Picture Mode | SmartImage Off |
| Brightness (200 nits) | 54 |
| Brightness (120 nits) | 12 |
| Brightness (minimum) | 105 nits |
| Contrast | 50 |
| Gamma | 2.4 |
| Color Temp User | Red 99, Green 99, Blue 99 |
When an HDR signal is applied, you get five additional presets. For the brightest presentation, choose DisplayHDR 1000. For the best HDR image, choose Personal.
Gaming and Hands-on with Philips Momentum 558M1RY
Using a 55-inch monitor for workday tasks is a bit unusual, but if you sit around 6 feet away, it works. The Philips Momentum 558M1RY’s stand raises the screen a bit less than 5 inches from the desktop, so to put your viewpoint in the center, you’ll have to raise your chair or use a lower-than-typical piece of furniture. Unlike a TV, the Philips’ stand has a 10-degree tilt function, so there is some flexibility there. The soundbar moves with the panel, so its sound is always focused on the user.
With font scaling set to 300%, Windows apps were easy to use from up to 10 feet away. Small text was easily readable, so if you want to sit on the couch and browse the web, the 558M1RY can oblige. Contrast is superb with a nearly 5,000:1 contrast ratio available in SDR mode. We used the extended color gamut for most productivity apps, except Photoshop, where we switched to the sRGB mode.
Movie watching is a pleasure with such a large screen. You can sit close and have a very immersive experience, both visually and sonically. The impact of good audio cannot be overstated either. In the AV world, the most commonly given — and most commonly ignored — advice is to spend twice as much on audio as video. This is hard to do because we all want the largest possible display. But when sound is as good as the B&W soundbar included with the 558M1RY, you’re getting a huge value-add.
With two tweeters, two 10W midranges and a 20W woofer for bass, the Philips Momentum 558M1RY has some of the best built-in sound we’ve ever heard. The full frequency spectrum is represented and only the very deepest bass, below 80 Hz, is a little weak.
For gaming and movies, the soundbar is a huge asset. Not only is the audio crystal clear, but also the sound stage is much wider than the bar’s physical size. Higher partials, like female voices and finely detailed ambient effects, came through in perfect balance with no trace of sibilance or harshness.
Of course, the Philips Momentum 558M1RY is at heart a gaming monitor, and for that, it excels. SDR games, like Tomb Raider, look fantastic when running at 120 frames per second (fps) at 4K resolution with max detail. We paired the monitor with a system running a GeForce RTX 3090. One of the best graphics cards, it has no trouble keeping frame rates high. We also tested the monitor with a Radeon RX 5700 XT-equipped machine. In either case, we got Adaptive-Sync to run perfectly. And in general, response and input lag were low enough not to call attention to themselves.
Contrast was particularly impressive in the dark areas of the game environment where blacks were true and shadow detail was rich. The depth afforded by a quality VA panel like this makes the suspension of disbelief far more palpable.
Color also stood out, thanks to an accurate gamut. We played Tomb Raider in sRGB mode, where it looked great, and with the full DCI-P3 gamut engaged, where it looked even better. Though purists like us prefer to use the mastered color spec whenever possible, there’s no denying the impact of a little more saturation.
Switching to HDR in Windows worked seamlessly, and thanks to the availability of the contrast slider in the HDR Personal mode, we were able to dial down the extreme brightness to make the desktop less fatiguing to look at. With the HDR title Call of Duty: WWII, however, we enjoyed 750-nit highlights that made the picture really pop. It never looked too bright, even in daylight outdoor scenes.
Video processing was also perfect with HDR engaged. 120 Hz and Adaptive-Sync worked flawlessly on both AMD and Nvidia platforms with HDR content.
The BenQ Zowie XL2546K leaves out HDR and extended color but has DyAc+, which is the best blur reduction feature we’ve ever seen. The monitor delivers smooth and responsive gameplay. With a few tweaks, it delivers excellent color too. It’s definitely worth a look if a 240 Hz monitor is on your radar.
For
Saturated color with calibration
Low input lag
Excellent blur reduction
Against
Below-average contrast
Poor color and gamma out of the box
No HDR
No extended color
Features and Specifications
In the early days of video gaming, competition took place in computer labs, and the prizes were things like magazine subscriptions or special parking privileges at the local university. Today, eSports is a major spectator sport with millions of loyal fans and professional players who earn a living competing in virtual arenas. With that meteoric rise in skill level comes a need for better tools and that’s where the best gaming monitors come in.
Once, 144 Hz was enough to earn a monitor eSports status, but 240 Hz is quickly becoming the new standard for gaming monitors and is no longer an exclusive refresh rate. You’ll still pay a premium to go that fast though. Case in point: BenQ’s Zowie XL2546K. It sells for around $500, which is a median price in this category.
For that price, you get a 25-inch (24.5-inch viewable) TN panel with 1080p resolution and AMD FreeSync Premium. Though its out-of-the-box image quality could be better, the BenQ Zowie XL2546K offers a strong gaming experience with minimal input lag and fantastic blur reduction.
BenQ Zowie XL2546K Specs
| Spec | Value |
| --- | --- |
| Brand & Model | BenQ Zowie XL2546K |
| Panel Type & Backlight | TN / W-LED, edge array |
| Screen Size & Aspect Ratio | 24.5 inches / 16:9 |
| Max Resolution & Refresh | 1920×1080 @ 240 Hz; AMD FreeSync Premium: 48-240 Hz; G-Sync Compatible |
| Native Color Depth & Gamut | 8-bit (6-bit+FRC) / sRGB |
| Response Time (GTG) | 0.5 ms |
| Brightness (mfr) | 320 nits |
| Contrast (mfr) | 1,000:1 |
| Speakers | None |
| Video Inputs | 1x DisplayPort 1.2; 3x HDMI 2.0 |
| Audio | 3.5mm headphone output |
| USB 3.0 | None |
| Power Consumption | 19.4w, brightness @ 200 nits |
| Panel Dimensions WxHxD w/base | 22.5 x 14.5-20.7 x 7.9 inches (572 x 368-526 x 191mm) |
| Panel Thickness | 2.2 inches (55mm) |
| Bezel Width | Top/sides: 0.5 inch (13mm); Bottom: 0.7 inch (17mm) |
| Weight | 13.7 pounds (6.2kg) |
| Warranty | 3 years |
The BenQ Zowie XL2546K is somewhat old school with a TN panel running at FHD resolution. The pixel count isn’t unusual for this class, but the TN screen is. It’s no longer necessary for a fast monitor to be TN. IPS has evolved to 240 Hz and beyond. Witness the two 360 Hz IPS monitors we recently covered, Asus’ ROG Swift PG259QN and Alienware’s AW2521H. While they both sell for over $700, they’re proof that you don’t need TN to go fast.
BenQ offers the XL2546K as a no-frills gaming monitor by leaving out HDR and extended color. While these things are not necessary in a competition gaming tool, they are nice to have for the rest of us. Granted, this category doesn’t see a lot of DCI-P3 color gamuts, but our recent experience with the AW2521H also demonstrated that good HDR is possible with a fast display.
AMD FreeSync Premium is the featured Adaptive-Sync tech. Compared to standard FreeSync, it includes low framerate compensation (LFC). The XL2546K isn’t Nvidia-certified, but we got it to run Nvidia G-Sync too. See our How to Run G-Sync on a FreeSync Monitor article for instructions.
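LFC is conceptually simple: when the frame rate falls below the panel’s minimum variable-refresh rate (48 Hz here), the driver shows each frame multiple times so the effective refresh stays inside the supported window. This sketch is just the idea, not AMD’s actual driver logic:

```python
def lfc_refresh(fps, vrr_min=48):
    """Repeat each frame just enough to lift the effective refresh into the VRR window."""
    multiplier = 1
    while fps * multiplier < vrr_min:
        multiplier += 1
    return fps * multiplier, multiplier

for fps in (30, 45, 60):
    hz, n = lfc_refresh(fps)
    print(f"{fps} fps -> panel refreshes at {hz} Hz (each frame shown {n}x)")
# 30 fps -> 60 Hz (2x), 45 fps -> 90 Hz (2x), 60 fps -> 60 Hz (1x)
```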
Assembly and Accessories of BenQ Zowie XL2546K
After bolting the upright and base together, the XL2546K’s panel snaps in place. If you’d rather use a monitor arm, a 100mm VESA pattern is included with large-head bolts already installed.
The stand is completely wobble-free once assembled. Rigid shades click in place on the sides, but there is no light blocking piece for the top. The controller for the on-screen display (OSD) comes out of its own little box and connects to a special Mini-USB port. You also get a DisplayPort cable and an IEC power cord. Everything is neatly and carefully packed as a premium product should be.
Product 360
BenQ bakes in its usual solid build quality and functionality with a wired OSD controller and light blocking shades for the panel’s sides. Along with a beefy stand, the XL2546K is ready for competition or just to satisfy a casual enthusiast’s lust for speed.
The XL2546K is the first monitor in our recent memory to be devoid of any logos or graphics on the front. The base and upright are similarly unadorned, but around back, you’ll find a Zowie logo in red. The same symbol is molded into the hinged light shutters. Red trim lines a large hole in the upright through which you can pass cables.
The OSD puck controller, which BenQ calls the S Switch, has five buttons and a scroll wheel that makes menu navigation quick and intuitive. The bezels are always visible but aren’t too thick at 13mm on the top and sides and 17mm at the bottom.
In case you need to lug the screen about, the XL2546K features a metal handle that’s more than up to the task. To the side is a flip-out headphone hook, and at the bottom are OSD controls, namely a joystick and two buttons. The third key there is a power toggle.
From the side, you can see that there are no USB ports. The input panel underneath doesn’t have them either. The stand has a small red arrow that you slide into your preferred position to recall the height setting. A similar feature is in evidence on the base via tick marks indicating swivel angle. Adjustments include 6 inches of height, 45-degree swivel to either side, -5/23-degree tilt and a portrait mode. Movements exude the quality of a premium display.
The input panel features three HDMI 2.0 ports and a single DisplayPort 1.2. A 3.5mm jack accommodates headphones or external audio. There are no internal speakers, but you can adjust volume in the OSD. The HDMI ports will accommodate the 120 Hz refresh rate from the new Xbox Series X and PS5 consoles.
OSD Features of BenQ Zowie XL2546K
A quick menu appears when you press any key on the BenQ Zowie XL2546K’s panel or on the S Switch puck controller. The S Switch is very handy, particularly since you can program four of its functions. This means you can change settings quickly and conveniently without going through the OSD’s full menu.
Once you get into the OSD, you’ll find many options to tailor both image and performance. There are eight picture modes, all of which are fully adjustable. Settings save to each mode individually and by input, so the number of possible combinations is vast. The default mode is FPS1, which takes some less-than-attractive liberties with color and gamma; we’ll show you its effects in the image tests. Standard is the better choice, as it comes close to the mark without calibration.
To tweak the Zowie XL2546K’s image, BenQ provides three color temps, plus a user mode with RGB sliders. They work extremely well and deliver very accurate color in the sRGB gamut. You also get five gamma presets, black equalizer for enhancing shadow detail, color vibrance, which adjusts overall saturation, low blue light for reading and a color weakness feature for color blind users deficient in either red or green.
The Picture menu has the brightness and contrast sliders along with DyAc+ (more on this in the Hands-on section), BenQ’s name for its blur reducing backlight strobe. DyAc+ has two settings, which vary the LED pulse width. The lesser of the two is called High and is enough to remove any visible blur.
BenQ also offers overdrive, which it calls AMA. This option is best left turned off because it produced visible ringing when we played games using Adaptive-Sync. The artifact isn’t as obvious with DyAc+ but doesn’t improve the image either.
BenQ Zowie XL2546K Calibration Settings
If you do nothing else, we strongly recommend switching your BenQ Zowie XL2546K to Standard mode. The default, FPS1, alters color and gamma unattractively. Accurate color is always the best choice.
You don’t absolutely need to calibrate the Standard mode, but a few changes resulted in a visible improvement. We improved grayscale with adjustments to the RGB sliders. Perceived contrast also increased with a change from gamma 3 to gamma 4, and we reduced the contrast control by 18 steps to fix a color clipping issue, a tweak that also bumped up the color saturation. We’ll talk about all of that on page three.
Our recommended settings for the BenQ Zowie XL2546K are below.
| Setting | Value |
| --- | --- |
| Picture Mode | Standard |
| Brightness (200 nits) | 67 |
| Brightness (120 nits) | 32 |
| Brightness (100 nits) | 23 |
| Brightness (80 nits) | 15 |
| Brightness (50 nits) | 3 (min. 45 nits) |
| Contrast | 32 |
| Gamma | 4 |
| Color Temp User | Red 96, Green 100, Blue 97 |
Gaming and Hands-on with BenQ Zowie XL2546K
The BenQ Zowie XL2546K gave us a few surprises when we sat down for some gaming. After our calibration (see our recommended settings above), we wondered how our contrast setting, which seemed extreme, would look. The answer is very good. Though the panel doesn’t show great native contrast, changing the gamma from 3 to 4 and lowering the contrast slider makes a huge difference in color saturation and shadow depth. Those tweaks made the BenQ equal to the better IPS screens we’ve reviewed.
The second, and greater, surprise came via the XL2546K’s blur reduction feature that BenQ calls DyAc+. Blur reduction usually means a brightness reduction, but BenQ managed to avoid this pitfall with some clever engineering. We measured the two DyAc+ settings (High and Premium) with the brightness control set to the same value, and light output did not change. This is a first in our experience.
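For background on why that’s unusual: a strobing backlight only lights the LEDs for a fraction of each refresh (the duty cycle), so perceived brightness normally scales with pulse width. The sketch below illustrates that relationship; how BenQ compensates is our assumption (presumably higher LED drive current during the pulse), since the company hasn’t published specifics:

```python
# Backlight strobing: the LED lights for only a fraction of each refresh (duty cycle),
# so perceived brightness normally scales with pulse width.
refresh_hz = 240
period_ms = 1000 / refresh_hz            # ~4.17 ms per refresh cycle
for duty in (1.0, 0.5, 0.25):            # always-on vs. two shorter strobe pulses
    pulse_ms = period_ms * duty
    print(f"duty {duty:.0%}: {pulse_ms:.2f} ms pulse -> ~{duty:.0%} of steady brightness")
# How BenQ avoids the dimming isn't documented; boosting LED drive current during the
# shorter pulse is the usual engineering approach (assumption, not confirmed).
```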
This is the first monitor we’ve played on where the backlight strobe produced better motion resolution and video processing quality than Adaptive-Sync. FreeSync and G-Sync both worked perfectly with two systems: one equipped with a GeForce RTX 3090 and the other a Radeon RX 5700 XT. Frame rates were maxed at 240 frames per second (fps) in all the games we played, so tearing did not occur, even with Adaptive-Sync off. Since there’s no reduction in brightness, we recommend using DyAc+ instead of Adaptive-Sync. And that’s something we thought we’d never say.
In either case, input lag was a complete non-issue. Some gamers prefer backlight strobes to Adaptive-Sync because they believe input lag is lower. We can’t confirm this with testing, but at 240 Hz, no one is going to perceive a 1ms or 2ms difference. Think about a control input and the BenQ Zowie XL2546K responds. It is certainly fast and responsive enough for competitive gaming. And DyAc+ is the best-implemented backlight strobe we’ve seen yet.
Color and contrast are excellent for gaming. With a little bonus saturation in the primary colors, on-screen environments are vibrant and three-dimensional. There is plenty of light output to complement the darker gamma we chose, and the resulting picture is much better than the test numbers suggest. This is also an unusual thing in our experience, but there’s no denying that the BenQ Zowie XL2546K plays games well and looks great doing it.
It also looks great performing workday tasks. Some might prefer higher pixel density, but at 24.5 inches, 1080p works out to about 90 pixels per inch, which is enough to resolve small fonts and details. Photo editing isn’t this monitor’s strong suit, but its accuracy is sufficient for the demands of color-critical work. The XL2546K is a solid all-around display.
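That pixel-density figure is easy to check yourself: PPI is just the diagonal pixel count divided by the diagonal screen size.

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch: diagonal resolution over diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(f"{ppi(1920, 1080, 24.5):.1f} PPI")  # ~89.9 for the XL2546K's panel
```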
Whether you’re a student, a professional or just want to stay connected and productive, a laptop is one of the most important tools of the trade. But some are better than others, with wide differences in keyboards, battery life, displays and design. If you’re looking for a powerful laptop that easily fits in your bag and doesn’t break your back, you want an ultrabook.
The “ultrabook” moniker was originally coined by Intel in 2012 and used to refer to a set of premium, super-thin laptops that met the chipmaker’s predefined standards. However, just as many folks refer to tissues as Kleenexes or web searching as Googling, the term ultrabook commonly refers to any premium ultraportable laptop, whether it carries Intel’s seal of approval or not.
Of course, there’s always new tech coming down the pipe. Intel has announced its 11th Gen Core “Tiger Lake” processors with Iris Xe graphics and Thunderbolt 4, with laptops shipping in time for the holiday season. And it’s likely that an AMD Ryzen refresh won’t be far behind, bringing USB 4 to laptops. That’s in addition to the possibility of Apple’s first Arm-powered MacBook coming this fall.
Get a good keyboard: Whether you’re using an ultrabook to browse the web, send emails, code, write or do other productivity work, the keyboard is one of your primary ways of interacting. Get something with responsive keys that aren’t mushy. Low-travel is ok if the keys have the right feel to them, but the last thing you want to do is “bottom out” while typing.
Consider what you need in a screen: At a minimum, your laptop should have a 1920 x 1080 screen. Some laptops offer 4K options, though it’s sometimes harder to see the difference at 13 inches or below. And while 4K may be more detailed, 1080p screens give you much longer battery life.
Some laptops can be upgraded: While CPUs and GPUs are almost always soldered down, some laptops let you replace the RAM and storage, so you can buy cheaper now and add more memory and a bigger hard drive or SSD down the road. But the thinnest laptops may not have that option.
Battery life is important: Aim for something that lasts for 8 hours or longer on a charge (gaming is an exception). For productivity, many laptops easily surpass this number. But be wary of manufacturer claims, which don’t always use strenuous tests. Some laptops are starting to add fast charging, which is a nice bonus.
The HP Spectre x360 14 is everything a modern ultrabook should be. This laptop has an attractive design, but isn’t about form over function. It has both Thunderbolt 4 over USB Type-C, as well as a microSD card reader, all in a thin chassis.
But what really wows is the display. The 3:2 aspect ratio is tall and shows more of your work or web pages, and is also more natural for tablet mode. The OLED model we reviewed also offered vivid colors, though you would likely get longer battery life with the non-OLED, lower-resolution panel.
The other big plus is the Spectre x360’s keyboard, which is clicky and comfortable. Sure, it’s no desktop mechanical keyboard, but for a laptop, it’s very responsive and feels great to use.
The Dell XPS 13 has long been celebrated for both its form and function. The laptop is tiny, but packs a punch with Intel’s Tiger Lake processors and adds some extra screen real estate with a tall, 16:10 display (many laptops have a 16:9 screen).
We also like the XPS 13’s keyboard, with a snappy press and slightly larger keycaps than previous designs. The screen is bright, and we shouldn’t take its thin bezels for granted, as Dell continues to lead on that front.
Admittedly, the XPS 13 is short on ports, opting for a pair of Thunderbolt 4 ports for both charging and accessories. Its performance, portability and long battery life are likely to make up for that for those on the go.
Read: Dell XPS 13 (9310) review
3. MacBook Pro 13-inch (M1)
The Best Mac
CPU: Apple M1 | GPU: 8-core GPU on SOC | Display: 13.3-inch, 2560 x 1600, True Tone | Weight: 3.0 pounds / 1.4 kg
M1 is powerful and fast
Runs cool and quiet
Apps just work, even if emulated
Long-lasting battery life
Strong audio
Limited ports and RAM options
Touch Bar isn’t very useful
Poor webcam
While some people may still want the power, large display and port selection of the 16-inch MacBook Pro, Apple has proved with the 13-inch version that its own home-grown M1 chip is capable of meeting the needs of plenty of people. This is Apple’s first step in breaking away from Intel, and it is extremely impressive.
The 13-inch MacBook Pro runs cool and quiet, while the chip is faster than its competition in most cases. It’s also efficient and ran for more than 16 and a half hours on our battery test.
Many apps run natively on the Arm processor, and those that don’t rely on Apple’s Rosetta 2 software for emulation. Even then, users will barely know that emulation is being used at all. Everything just works.
The big difference between the Pro and the Air, which also uses M1, is that the Pro has a fan. Those who aren’t doing intensive work may be able to save a bit and get a very similar machine by going with the Air, and they will get function keys instead of the MacBook Pro’s Touch Bar.
Read: Apple MacBook Pro 13-inch (M1) review
4. MSI GE66 Raider
The Best Overall Gaming Laptop
CPU: Intel Core i9-10980HK | GPU: Nvidia GeForce RTX 2080 Super Max-Q | Display: 15.6 inches, 1920 x 1080, 300 Hz | Weight: 5.3 pounds (2.4 kg)
Great gaming performance
300 Hz display
Well-executed RGB light bar
High-end build
Cramped keyboard
Tinny audio
The MSI GE66 Raider is a gaming laptop, and it says so loudly with a massive RGB light bar. Its new look is aggressive, but it’s not just talk, with options going up to an Intel Core i9-10980HK and Nvidia GeForce RTX 2080 Super Max-Q.
For those looking for esports-level performance in games like League of Legends or Overwatch, there’s an option for a 300 Hz display.
And while it’s not the slimmest laptop around (or even MSI’s thinnest), it does feel remarkably portable considering the power inside, and we can’t help but appreciate high-end build quality.
Lenovo’s ThinkPads have always been favorites, and the ThinkPad X1 Carbon (Gen 8) continues that trend with a slim design, excellent keyboard and an excellent selection of ports to keep you connected to all of your peripherals.
If you get the 1080p option, you can count on all-day battery life (the 4K model we tested didn’t fare as well, but that’s often the tradeoff for higher resolution among ultrabooks).
Of course, the ThinkPad X1 Carbon also attracts one other audience: fans of the TrackPoint nub in the center of the keyboard.
Read: Lenovo ThinkPad X1 Carbon (Gen 8) review
6. Asus ZenBook Duo 14 UX482
Best Dual Screen Laptop
CPU: Intel Core i7-1165G7 | GPU: Intel Iris Xe | Display: 14-inch 1080p (1920 x 1080) touchscreen, 12.6 inch (1920 x 515) ScreenPad Plus | Weight: 3.5 pounds / 1.6 kg
$999 starting price with an i5
Very good battery life
Loud speakers
Improved hinge mechanism and keyboard layout
Keyboard/touchpad are awkward
8GB of RAM in lower configurations
Asus has begun to refine the dual screen laptop. Sure, there’s a more powerful version, but for a laptop with two screens, this one is fairly light, and ran for over 10 and a half hours on a charge.
Windows 10 doesn’t yet natively support dual-screen software, but Asus’ ScreenPad Plus launcher has improved since launch, with easy flicks and drags to move apps around the display. For Adobe apps, there’s custom dial-based software.
The keyboard and mouse placement are the big compromises, as there isn’t a wrist rest and they can feel cramped. But if you want two screens, this is as good as it gets for now.
If you’re going for a big screen, the Dell XPS 17 shines. The display on the laptop is bright and colorful, especially on the 4K+ option that we tested, and with minimal bezels around it, your work (or play) is all that’s in focus.
With up to an Intel Core i7 and an Nvidia GeForce RTX 2060 Max-Q, there’s plenty of power here. While it’s not on our list of best gaming laptops, you can definitely play video games on it, including intensive games that use ray tracing.
All of that comes in an attractive design similar to the XPS 13 and XPS 15, though the trackpad takes advantage of the extra space. It’s a luxurious amount of room to navigate and perform gestures.
Read: Dell XPS 17 (9700) review
| Laptop | CPU | GPU | RAM | Storage | Display |
| --- | --- | --- | --- | --- | --- |
| HP Spectre x360 14 | Up to Intel Core i7-1165G7 | Intel Iris Xe (integrated) | Up to 16GB LPDDR4-3733 | Up to 2TB M.2 PCIe NVMe SSD | 13.5-inch touchscreen, up to 3000 x 2000 resolution, OLED |
| Dell XPS 13 (9310) | Up to Intel Core i7-1165G7 | Intel Iris Xe (integrated) | Up to 16GB LPDDR4x-4276 | Up to 512GB M.2 PCIe NVMe SSD | 13.4-inch touchscreen, 1920 x 1200 resolution |
| MacBook Pro (16-inch) | Up to Intel Core i9-9980HK | Up to AMD Radeon Pro 5500M | Up to 64GB DDR4 | Up to 8TB SSD | 16 inches, 3072 x 1920 |
| Asus ROG Zephyrus G14 | Up to AMD Ryzen 4900HS | Nvidia GeForce RTX 2060 with ROG Boost | Up to 16GB DDR4-3200 (8GB on-board, 8GB SODIMM) | 1TB PCIe 3.0 M.2 NVMe | 14 inches, 1920 x 1080, 120 Hz |
| Lenovo ThinkPad X1 Carbon (Gen 8) | Up to Intel Core i7-10610U | Intel UHD Graphics | Up to 16GB LPDDR3 | Up to 1TB PCIe NVMe SSD | 14 inches, up to 4K with Dolby Vision and HDR400 |
| Asus ZenBook Duo UX481 | Up to Intel Core i7-10510U | Nvidia GeForce MX250 | Up to 16GB DDR3 | 1TB PCIe NVMe SSD | 14-inch 1080p (1920 x 1080) touchscreen, 12.6-inch (1920 x 515) ScreenPad Plus |
Unless you’re a backlight strobe purist, the Porsche Design AOC Agon PD27 has no flaws. It delivers premium gaming performance with 240 Hz, Adaptive-Sync, HDR and a high-contrast VA panel with 1440p resolution. And its physical aesthetic is unmatched at any price.
For
240 Hz
Wide, accurate color gamut
High contrast
Excellent HDR
Build quality, styling
Against
Backlight strobe causes some smearing
Expensive
Features and Specifications
Most players are looking for a gaming monitor that provides maximum performance for the money. But many premium and mid-priced models include some kind of style element, usually a lighting feature, to set themselves apart. Occasionally, a company goes all out on aesthetics with a product that sits well above the rank and file. AOC has done this by partnering with a company that needs no introduction, Porsche Design.
The Porsche Design AOC Agon PD27 is one of the most attractive monitors we’ve seen, more stylish than most of the best gaming monitors with a unique stand and completely customizable lighting with glowing colors and a pattern that projects on your desk. There’s also an all-metal wireless remote that looks like something from a modern art museum.
But don’t think the PD27 places style over substance. Under the hood, there’s a 1440p resolution VA panel with 3,000:1 contrast, HDR, extended color, 240 Hz, Adaptive-Sync and a backlight strobe for fighting motion blur. There’s even strong sound, via two 5W internal speakers and DTS tuning with multiple sound effect modes.
Expensive? Of course ($750 as of writing), but if you want style, there’s nothing else like it.
Porsche Design AOC Agon PD27 Specs
| Spec | Value |
| --- | --- |
| Panel Type / Backlight | VA / W-LED, edge array |
| Screen Size, Aspect Ratio & Curve | 27 inches / 16:9; Curve radius: 1000mm |
| Max Resolution & Refresh | 2560×1440 @ 240 Hz; AMD FreeSync Premium Pro: 48-240 Hz |
| Native Color Depth & Gamut | 8-bit / DCI-P3; HDR10, DisplayHDR 400 |
| Response Time (MPRT) | 0.5 ms |
| Brightness (mfr) | 550 nits |
| Contrast (mfr) | 2,500:1 |
| Speakers | 2x 5w, DTS-tuned |
| Video Inputs | 2x DisplayPort 1.4; 2x HDMI 2.0 |
| Audio | 3.5mm headphone output |
| USB 3.2 | 1x up, 4x down |
| Power Consumption | 34.6w, brightness @ 200 nits |
| Panel Dimensions WxHxD w/base | 23.9 x 17.1-23.1 x 12.7 inches (606 x 434-587 x 322mm) |
| Bezel Width | Top/sides: 0.3 inch (8mm); Bottom: 1.1 inches (28mm) |
| Weight | 19.6 pounds (8.9kg) |
| Warranty | 4 years |
This isn’t AOC’s first collaboration with Porsche Design. In 2018, we reviewed the PDS271, an enterprise monitor that brought a minimalist style and merely average performance to the table. The PD27 ups the ante considerably.
A 27-inch VA panel with a 1000R curve claims an honest 3,000:1 contrast ratio. Its Adaptive-Sync of choice is AMD FreeSync Premium Pro, which adds low framerate compensation (LFC) and HDR support compared to standard FreeSync. It’s not Nvidia-certified, but we also got G-Sync to run (see: How to Run G-Sync on a FreeSync Monitor).
HDR10 signals are supported with either Adaptive-Sync or blur reduction up to 240 Hz over the two DisplayPort 1.4 inputs. Alternatively, you can use either of the two HDMI 2.0 ports for refresh rates as high as 144 Hz (see: DisplayPort vs HDMI: Which is Better?).
Assembly and Accessories of the AOC Agon PD27
The AOC Agon PD27 ships in rubbery foam, not the crumbly kind, and is packaged beautifully, worthy of an unboxing video. The stand and panel are permanently mated, so there’s no option for an aftermarket arm or bracket, not that you’d want one.
Accessories are in a nice presentation-style box and include HDMI, DisplayPort and USB cables along with a large external power supply.
Product 360
Style? Oh yeah, there’s lots o’ dat. Right from the start, the eye is drawn to the AOC Agon PD27’s unique metal stand, making the lack of VESA mounting understandable. The stand is made from chrome-plated tubing meant to evoke a racecar’s roll cage. While it doesn’t have visible weld marks like the real article, it does remind us of something automotive. It’s quite beautiful and substantial. Not only is it rock-solid, but the footprint is quite deep at nearly 13 inches. You’ll need to clear a little more space on your desk than you would for the average 27-inch monitor.
In front, the Porsche Design logo is proudly displayed in shiny metal letters against a brushed finish. The top and side bezels are thin and flush, just 8mm wide. If you reach around the right side, you’ll find the joystick, which integrates all monitor controls, including power. You can tell power status via a tiny LED: orange for standby and white for powered.
The chrome-plated stand is one of the most solid pieces out there, and it offers full ergonomics. You get nearly 6 inches of height adjustment, along with 150-degree swivel either way and 23-degree tilt. Movements are as good as it gets, with just the right resistance level and easily controlled positioning.
Gamers will enjoy not only the PD27’s advanced video processing and 240 Hz refresh rate, but also its extensive lighting features. The lighting effect comes in two flavors. An LED ring surrounds the stand/panel pivot point and casts a large glow off the back of the AOC Agon PD27. A projector is also included that casts the Porsche Design logo on your desktop. There are dozens of preset effects or you can customize the color with RGB values for both sets of lights in the OSD.
Tying it all together is one of the coolest remotes we’ve ever seen. Not too many 27-inch monitors include remotes, but AOC has created a slick metal piece that works wirelessly. It has full OSD navigation, plus three programmable keys for preset modes. Like the rest of the Agon PD27, it’s quite substantial and definitely worthy of a premium display like this.
The input panel is well-stocked and split across two sections. Video inputs include two each of HDMI 2.0 (144 Hz) and DisplayPort 1.4 (240 Hz). Both interfaces support Adaptive-Sync and HDR. A 3.5mm audio jack is right next to the DPs. On the other side are USB 3.2 ports, one upstream and four down.
OSD Features of the AOC PD27
The PD27’s unique styling carries all the way down to its OSD, with a cascading card look to denote the eight sub-menus. Once you dig in, you’ll find it arranged like that of other AOC monitors and easily controllable.
The Game Setting menu has three game-specific picture modes and three user memories apart from the image presets in the Luminance menu. To keep things simple, we left Game Mode off and did all testing in the Standard Eco mode. This menu also has Shadow Control, which makes dark detail brighter. Its five settings are very coarse and even one click will turn the lovely deep blacks into a murky gray. MBR is the PD27’s blur reduction feature. It operates at up to 240 Hz and cancels Adaptive-Sync. We found a few issues there, which we’ll tell you about in the Hands-on section.
Overdrive has three levels and works well at its Medium setting. If you want to monitor frame rates, turn on the FrameCounter.
The Audio menu has much more than just volume and mute controls. There are five sound modes that use phase and frequency manipulation to create different sound stages. Or choose Off and set the equalization yourself over five bands from 200 Hz to 7 kHz.
LightFX is where you’ll find the many options for the main LED ring around the stand’s pivot. There are preset modes, or you can change intensity, color and pattern manually. Color is controlled with RGB sliders, so, in theory, there are millions of possibilities. Oddly, the control for the logo projector is in another menu called Extra. It too has user selectable intensity and color options.
Porsche Design AOC Agon PD27 Calibration Settings
You won’t need to calibrate the AOC Agon PD27 if you do one thing: change the color temp to User Define. That single adjustment provides superb accuracy in all metrics within the DCI-P3 color gamut. If you prefer to play your SDR games in sRGB color, there’s an option for that in the color temp menu, and it, too, is quite accurate. We tweaked the RGB sliders just a bit for ego’s sake. There are three gamma presets, but again, the default setting is best.
Below are our recommended calibration settings for the Porsche Design AOC Agon PD27 and SDR content.
Eco Mode: Standard (Game Mode Off)
Brightness 200 nits: 70
Brightness 120 nits: 21
Brightness 100 nits: 7 (min. 89 nits)
Contrast: 50
Gamma: 1
Color Temp User: Red 49, Green 50, Blue 49
When you apply an HDR signal, the AOC Agon PD27 switches over automatically and offers four additional picture modes. DisplayHDR is the default and best choice. There are no adjustments for HDR content, but none are needed, as you’ll see on page four.
Gaming and Hands-on with the AOC Agon PD27
The curve is pronounced with a tight 1000mm radius. That’s as tight as it gets these days. You’ll see some distortion of things like spreadsheets and documents on the PD27, but moving images look natural as they cross the screen. We’ve experienced this extreme curve before in the Samsung Odyssey G7 32-Inch. It’s a real asset to gaming and movie watching, though a 32-inch screen makes better use of the radius than the AOC Agon PD27.
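If you’re curious what 1000R actually means geometrically, the panel lies on a circle with a 1000mm radius, so a little arithmetic shows how much of that circle a given screen wraps around you. The sketch below is our own back-of-the-envelope illustration, not an AOC or Samsung spec:

```python
import math

RADIUS_MM = 1000  # "1000R" means the panel lies on a 1000mm-radius circle

def wrap_angle_deg(diagonal_in, aspect=(16, 9)):
    """Approximate arc angle a curved panel subtends at its rated radius."""
    w, h = aspect
    width_mm = diagonal_in * 25.4 * w / math.hypot(w, h)  # panel width in mm
    return math.degrees(width_mm / RADIUS_MM)             # arc angle = s / r

print(f"27-inch: {wrap_angle_deg(27):.0f} degrees of arc")  # ~34
print(f"32-inch: {wrap_angle_deg(32):.0f} degrees of arc")  # ~41
```

At the same radius, a 32-inch panel wraps several more degrees of arc around the viewer, which is a big part of why the larger Samsung feels more immersive.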
At 27 inches, the AOC Agon PD27 is a bit small for its extreme curvature when balancing workday tasks and gaming. 1000R is as curvy as it gets, and while we’ve tried this spec before with the Samsung G7, we were spoiled by its 32-inch size and extra height compared to the PD27. But the AOC has excellent color with a bright, sharp picture, so it is certainly capable of pushing through spreadsheets and word processing.
Watching video is a pleasure with a screen that wraps this much. We felt a greater sense of immersion when checking out movies and TV shows through various streaming services. And the audio quality was well beyond what we usually hear from internal monitor speakers. Sound comes from two 5W drivers with DTS tuning and multiple effects modes, and the result is rich, full audio with a feeling of spaciousness. Or you can tailor the sound yourself with the OSD’s five-band equalizer, spanning 200 Hz to 7 kHz.
The DTS tuning created a far larger space with sounds clearly coming from different parts of the screen. Some phased effects even seemed to come from behind, giving a decent approximation of surround sound. There isn’t a ton of bass, but dialogue sounded crystal-clear and never chesty.
Gaming is clearly the PD27’s forte. Both FreeSync and (unofficial) G-Sync worked perfectly with or without HDR. An RTX 3090 video card had no trouble maintaining 240 frames per second (fps) in both Tomb Raider and Call of Duty: WWII with detail set to maximum. Our Radeon RX 5700 XT managed about 200 fps with the same games. The AOC Agon PD27 can keep up with your best graphics cards and favorite games. HDR only cost us about 10 fps and definitely enhanced the Call of Duty experience with clearer shadow detail and bright highlights.
Color and contrast were excellent in both SDR and HDR modes. The large and accurate gamut was apparent, especially in bright and colorful material, which looked vibrant and three-dimensional. VA contrast is on full display here as the PD27 has a very broad native dynamic range. If AOC ever wished to add dynamic contrast to the monitor’s HDR mode, it would look even better. But we had no complaints; HDR looked better than it does on most of the IPS screens we’ve tested.
When monitors run at 240 Hz or faster, it’s feasible to choose the backlight strobe over Adaptive-Sync. We did this successfully in our recent review of the Alienware AW2521H, where ultra-low motion blur (ULMB) at 240 Hz provided smoother motion than G-Sync at 360 Hz. The PD27, however, is not in that category. Its blur reduction (AOC calls it MBR) doesn’t work all that well. It looked fine when running BlurBusters test patterns, but in games, there was a slight smearing effect that can’t be solved with the overdrive.
The best choice is to leave Adaptive-Sync turned on and run the overdrive at its Medium setting. That keeps detail clear when the action gets intense. If you are a backlight strobe purist, the AOC Agon PD27 is not for you. But its implementation of adaptive sync and overdrive is flawless.
What’s the best mining GPU, and is it worth getting into the whole cryptocurrency craze? Bitcoin and Ethereum mining are making headlines again; prices and mining profitability are way up compared to the last couple of years. Everyone who didn’t start mining last time is kicking themselves for their lack of foresight. Not surprisingly, the best graphics cards and those chips at the top of our GPU benchmarks hierarchy end up being very good options for mining as well. How good? That’s what we’re here to discuss, as we’ve got hard numbers on hashing performance, prices, power, and more.
We’re not here to encourage people to start mining, and we’re definitely not suggesting you should mortgage your house or take out a big loan to try and become the next big mining sensation. Mostly, we’re looking at the hard data based on current market conditions. Predicting where cryptocurrencies will go next is even more difficult than predicting the weather, politics, or the next big meme. Chances are, if you don’t already have the hardware required to get started on mining today (or really, about two months ago), you’re already late and won’t see the big gains that others are talking about. Like the old gold rush, the ones most likely to strike it rich are those selling equipment to the miners rather than the miners themselves.
If you’ve looked for a new (or used) graphics card lately, the current going prices probably caused at least a raised eyebrow, maybe even two or three! We’ve heard from people who have said, in effect, “I figured with the Ampere and RDNA2 launches, it was finally time to retire my old GTX 1070/1080 or RX Vega 56/64. Then I looked at prices and realized my old card is selling for as much as I paid over three years ago!” They’re not wrong. Pascal and Vega cards from three or four years ago are currently selling at close to their original launch prices — sometimes more. If you’ve got an old graphics card sitting around, you might even consider selling it yourself (though finding a replacement could prove difficult).
Ultimately, we know many gamers and PC enthusiasts are upset at the lack of availability for graphics cards (and Zen 3 CPUs), but we cover all aspects of hardware — not just gaming. We’ve looked at GPU mining many times over the years, including back in 2011, 2014, and 2017. Those are all times when the price of Bitcoin shot up, driving interest and demand. 2021 is just the latest in the crypto coin mining cycle. About the only prediction we’re willing to make is that prices on Bitcoin and Ethereum will change in the months and years ahead — sometimes up, and sometimes down. And just like we’ve seen so many times before, the impact on graphics card pricing and availability will continue to exist. You should also be aware that, based on past personal experience that some of us have running consumer graphics cards 24/7, it is absolutely possible to burn out the fans, VRMs, or other elements on your card. Proceed at your own risk.
The Best Mining GPUs Benchmarked, Tested and Ranked
With that preamble out of the way, let’s get to the main point: What are the best mining GPUs? This is somewhat on a theoretical level, as you can’t actually buy the cards at retail for the most part, but we have a solution for that as well. We’re going to use eBay pricing — on sold listings — and take the data from the past seven days (for prices). We’ll also provide some charts showing pricing information from the past three months (90 days) from eBay, where most GPUs show a clear upward trend. How much can you make by mining Ethereum with a graphics card, and how long will it take to recover the cost of the card using the currently inflated eBay prices? Let’s take a look.
For this chart, we’ve used the current difficulty and price of Ethereum — because nothing else is coming close to GPU Ethereum for mining profitability right now. We’ve tested all of these GPUs on our standard test PC, which uses a Core i9-9900K, MSI MEG Z390 ACE motherboard, 2x16GB Corsair DDR4-3600 RAM, a 2TB XPG M.2 SSD, and a SeaSonic 850W 80 Plus Platinum certified PSU. We’ve tuned mining performance using either NBminer or PhoenixMiner, depending on the GPU, with an eye toward minimizing power consumption while maximizing hash rates. We’ve used $0.10 per kWh for power costs, which is much lower than some areas of the world but also higher than others. Then we’ve used the approximate eBay price divided by the current daily profits to come up with a time to repay the cost of the graphics card.
It’s rather surprising to see older GPUs at the very top of the list, but that’s largely based on the current going prices. GTX 1060 6GB and RX 590 can both hit modest hash rates, and they’re the two least expensive GPUs in the list. Power use isn’t bad either, meaning it’s feasible to potentially run six GPUs off a single PC — though then you’d need PCIe riser cards and other extras that would add to the total cost.
Note that the power figures for all GPUs are before taking PSU efficiency into account. That means actual power use (not counting the CPU, motherboard, and other PC components) will be higher. For the RTX 3080 as an example, total wall outlet power for a single GPU on our test PC is about 60W more than what we’ve listed in the chart. If you’re running multiple GPUs off a single PC, total waste power would be somewhat lower, though it really doesn’t impact things that much. (If you take the worst-case scenario and add 60W to every GPU, the time to break even only increases by 4-5 days.)
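To make the break-even arithmetic concrete, here is a minimal sketch of the calculation described above, written in Python. The card price, gross daily revenue, wattage and helper names are hypothetical placeholders of our own; only the $0.10 per kWh electricity rate and the roughly 60W worst-case wall-outlet overhead come from the methodology discussed in this article.

```python
# Minimal sketch of the break-even math; the card-specific numbers below
# are hypothetical placeholders, not our test data.
KWH_PRICE = 0.10         # dollars per kilowatt-hour (our assumed rate)
WALL_OVERHEAD_W = 60     # worst-case extra wall power per GPU

def break_even_days(card_price, daily_revenue, gpu_watts, include_overhead=False):
    """Days of mining needed to repay the card's purchase price."""
    watts = gpu_watts + (WALL_OVERHEAD_W if include_overhead else 0)
    daily_power_cost = watts / 1000 * 24 * KWH_PRICE  # kW * hours * rate
    return card_price / (daily_revenue - daily_power_cost)

# Hypothetical example: a $900 card earning $8.00/day gross at 120W.
print(f"{break_even_days(900, 8.00, 120):.0f} days")        # ~117 days
print(f"{break_even_days(900, 8.00, 120, True):.0f} days")  # ~119 days
```

As the two results show, folding in the wall-outlet overhead only nudges the break-even point by a few days, which matches what we saw in our own numbers.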
It’s also fair to say that our test results are not representative of all graphics cards of a particular model. The RTX 3090 and RTX 3080 can run into high GDDR6X temperatures without some tweaking, but if you do make the effort, the 3090 can potentially do 120-125MH/s. That would still only put the 3090 at third from the bottom in terms of time to break even, but it’s quite good in terms of power efficiency, and it’s the fastest GPU around. There’s certainly something to be said for mining with fewer, higher-efficiency GPUs if you can acquire them.
Here’s the real problem: nothing in the above table can predict the price of Ethereum or the mining difficulty. Guessing at the price is like guessing at the value of any other commodity: It may go up or down, and Ethereum, Bitcoin, and other cryptocurrencies are generally more volatile than even the most volatile of stocks. On the other hand, mining difficulty tends to increase over time and rarely goes down, as the rate of increased difficulty is directly tied to how many people (PCs, GPUs, ASICs, etc.) are mining.
So, the above is really a best-case scenario for when you’d break even on the cost of a GPU. Actually, that’s not true. The best-case scenario is that the price of Ethereum doubles or triples or whatever, and then everyone holding Ethereum makes a bunch of money. Until people start to cash out and the price drops, triggering panic sells and a plummeting price. That happened in 2018 with Ethereum, and it’s happened at least three times during the history of Bitcoin. Like we said: Volatile. But here we are at record highs, so everyone is happy and nothing could possibly ever go wrong this time. Until it does.
Still, there are obviously plenty of people who believe in the potential of Ethereum, Bitcoin, and blockchain technologies. Even at today’s inflated GPU prices, which are often double the MSRPs for the latest cards, and higher than MSRP for just about everything, the worst cards on the chart (RTX 3090 and RX 6900 XT) would still theoretically pay for themselves in less than seven months. And even if the value of the coins drops, you still have the hardware that’s at least worth something (provided the card doesn’t prematurely die due to heavy mining use). Which means, despite the overall rankings (in terms of time to break even), you’re generally better off buying newer hardware if possible.
Here’s a look at what has happened with GPU pricing during the past 90 days, using tweaked code from Michael Driscoll’s eBay price-tracking project (more on that below), followed by our quick takes on the best mining GPUs:
GeForce RTX 3060 Ti: The newest and least expensive of the Ampere GPUs, it’s just as fast as the RTX 3070 and sometimes costs less. After tuning, it’s also the most efficient GPU for Ethereum right now, using under 120W while breaking 60MH/s.
Radeon RX 5700: AMD’s previous generation Navi GPUs are very good at mining, and can break 50MH/s while using about 135W of power. The vanilla 5700 is as fast as the 5700 XT and costs less, making it a great overall choice.
GeForce RTX 2060 Super: Ethereum mining needs a lot of memory bandwidth, and all of the RTX 20-series GPUs with 8GB end up at around 44MH/s and 130W of power, meaning you should buy whichever is cheapest. That’s usually the RTX 2060 Super.
Radeon RX 590: All the Polaris GPUs with 8GB of GDDR5 memory (including the RX 580 8GB, RX 570 8GB, RX 480 8GB, and RX 470 8GB) end up with relatively similar performance, depending on how well your card’s memory overclocks. The RX 590 is currently the cheapest (theoretically), but all of the Polaris 10/20 GPUs remain viable. Just don’t get the 4GB models!
Radeon RX Vega 56: Overall performance is good, and some cards can perform much better — our reference models used for testing are more of a worst-case choice for most of the GPUs. After tuning, some Vega 56 cards might even hit 45-50MH/s, which would put this at the top of the chart.
Radeon RX 6800: Big Navi is potent when it comes to hashing, and all of the cards we’ve tested hit similar hash rates of around 65MH/s and 170W power use. The RX 6800 is generally several hundred dollars cheaper than the others and uses a bit less power, making it the clear winner. Plus, when you’re not mining, it’s a very capable gaming GPU.
GeForce RTX 3080: This is the second-fastest graphics card right now, for mining and gaming purposes. The time to break even is only slightly worse than the other GPUs, after which profitability ends up being better overall. And if you ever decide to stop mining, this is the best graphics card for gaming — especially if it paid for itself! At around 95MH/s, it will also earn money faster after you recover the cost of the hardware (if you break even, of course).
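Several of the picks above lean on hash rate per watt, so here is a quick sketch that ranks them by that metric using the approximate tuned figures quoted in this list. Treat these as ballpark values rather than lab-grade data:

```python
# Approximate tuned figures quoted in the list above: (MH/s, watts).
cards = {
    "GeForce RTX 3060 Ti": (60, 120),
    "Radeon RX 5700": (50, 135),
    "GeForce RTX 2060 Super": (44, 130),
    "Radeon RX 6800": (65, 170),
}

# Rank by hashes per watt, most efficient first.
for name, (mhs, watts) in sorted(cards.items(),
                                 key=lambda kv: kv[1][0] / kv[1][1],
                                 reverse=True):
    print(f"{name:22s} {mhs / watts:.2f} MH/s per watt")
```

Run that and the RTX 3060 Ti comes out on top at around 0.50 MH/s per watt, which is exactly why we called it the most efficient Ethereum GPU after tuning.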
What About Ethereum ASICs?
One final topic worth discussing is ASIC mining. Bitcoin (SHA256), Litecoin (Scrypt), and many other popular cryptocurrencies have reached the point where companies have put in the time and effort to create dedicated ASICs — Application Specific Integrated Circuits. Just like GPUs were originally ASICs designed for graphics workloads, ASICs designed for mining are generally only good at one specific thing. Bitcoin ASICs do SHA256 hashing really, really fast (some can do around 25TH/s while using 1000W — that’s trillions of hashes per second), Litecoin ASICs do Scrypt hashing fast, and there are X11, Equihash, and even Ethereum ASICs.
The interesting thing with hashing is that many crypto coins and hashing algorithms have been created over the years, some specifically designed to thwart ASIC mining. Usually, that means creating an algorithm that requires more memory, and Ethereum falls into that category. Still, it’s possible to optimize hardware to hash faster while using less power than a GPU. Some of the fastest Ethereum ASICs (e.g. Innosilicon A10 Pro) can reportedly do around 500MH/s while using only 1000W. Judged against our tuned GPU results (an RTX 3060 Ti manages 60MH/s at under 120W), that’s roughly eight times the hash rate of a single card at comparable efficiency per watt, packed into one machine. Naturally, the cost of such ASICs is prohibitively expensive, and every big miner and their dog wants a bunch of them. They’re all sold out, in other words, just like GPUs.
Ethereum has actually tried to deemphasize mining, but obviously that didn’t quite work out. Ethereum 2.0 was supposed to put an end to proof of work hashing, transitioning to a proof of stake model. We won’t get into the complexities of the situation, other than to note that Ethereum mining very much remains a hot item, and there are other non-Ethereum coins that use the same hashing algorithm (though none are as popular / profitable as ETH). Eventually, the biggest cryptocurrencies inevitably end up being supported by ASICs rather than GPUs — or CPUs or FPGAs. But we’re not at that point for Ethereum yet.
Finding the best graphics card at anything approaching reasonable pricing has become increasingly difficult. Just about everything we’ve tested in our GPU benchmarks hierarchy is sold out unless you go to eBay, but current eBay prices will take your breath away. If you thought things were bad before, they’re apparently getting even worse. No doubt, a lot of this is due to the recent uptick in Ethereum mining’s profitability on GPUs, compounded by component shortages, and it’s not slowing down.
A couple of weeks back, we wrote about Michael Driscoll tracking scalper sales of Zen 3 CPUs. Driscoll also covered other hot ticket items like RTX 30-series GPUs, RDNA2 / Big Navi GPUs, Xbox Series S / X, and PlayStation 5 consoles. Thankfully, he provided the code to his little project, and we’ve taken the opportunity to run the data (with some additional filtering out of junk ‘box/picture only’ sales) to see how things are going in the first six weeks of 2021. Here’s how things stand for the latest AMD and Nvidia graphics cards:
That’s … disheartening. Just in the past month, prices have increased anywhere from 10% to 35% on average. The increase is partly due to the recent graphics card tariffs, and as you’d expect, the jump in prices is more pronounced on the lower-priced GPUs.
For example, the RTX 3060 Ti went from average prices of $690 in the first week of January to $920 in the past week. It also represents nearly 3,000 individual sales on eBay, after filtering out junk listings — and these are actual sales, not just items listed on eBay. RTX 3080 saw the next-biggest jump in pricing, going from $1,290 to $1,593 for the same time periods, with 3,400 listings sold.
Nvidia’s RTX 3070 represents the largest number of any specific GPU sold, with nearly 5,400 units, but prices have only increased 17% — from $804 in January to $940 in February. The February price is interesting because it’s only slightly higher than the RTX 3060 Ti price, which suggests strongly that it’s Ethereum miners snapping up most of these cards. (The 3060 Ti hits roughly the same 60MH/s as the 3070 after tuning since they both have the same 8GB of GDDR6 memory.)
Wrapping up Nvidia, the RTX 3090 accounts for 2,291 units sold on eBay, with pricing increasing 14% since January. For the most expensive GPU that already had an extreme price, it’s pretty shocking to see it move up from $2,087 to a new average of $2,379. I suppose it really is the heir to the Titan RTX throne now.
We see a similar pattern on the AMD side, but at far lower volumes in terms of units sold. The RX 6900 XT had 334 listings sold, with average pricing moving up just 8% from $1,458 to $1,570 during the past six weeks. Considering it delivers roughly the same mining performance as the less expensive Big Navi GPUs, that makes sense from the Ethereum mining perspective.
Radeon RX 6800 XT prices increased 11% from $1,179 in January to $1,312 in February. It’s also the largest number of GPUs sold (on eBay) for Team Red, at 448 graphics cards. Not far behind is the RX 6800 vanilla, with 434 units. It saw the biggest jump in pricing over the same period, from $865 to $1,018 (18%). That strongly correlates with expected profits from GPU mining.
That’s both good and bad news. The good news is that gamers are most likely being sensible and refusing to pay these exorbitant markups. The bad news is that as long as mining remains this profitable, stock and pricing of graphics cards isn’t likely to recover. It’s 2017 all over again, plus the continuing effects of the pandemic.
As usual, Intel’s poised for a busy year. The company has already launched its new 11th Generation Tiger Lake H35 mobile chips, and 11th Gen Rocket Lake should blast into the market this year to take on the likes of AMD Ryzen 5000. This week during The Tom’s Hardware Show, Intel also discussed the role resizable BAR is playing in its efforts to boost performance for gamers opting for those chips.
Resizable BAR is an advanced PCIe feature that lets the CPU address the GPU’s entire frame buffer rather than the traditional small window (typically 256MB), transferring data, like shaders and textures, when needed and, if there are multiple requests, simultaneously. This should boost gaming performance by allowing the CPU to “efficiently access the entire frame buffer,” as Nvidia put it. AMD already tackles this with its Smart Access Memory (SAM) feature available with Radeon RX 6000 graphics cards, while Nvidia added support for RTX 30-series mobile cards in January, with desktop graphics card support beginning in March.
Intel’s GM of premium and gaming notebook segments, Fredrik Hamberger, discussed support for resizable BAR on The Tom’s Hardware Show, saying Intel collaborated with graphics card makers, namely Nvidia and AMD, on implementation. The goal, he said, was a “standard solution” that could be compatible with multiple vendors.
Intel’s H35-series mobile chips, which target ultraportable gaming laptops, already support resizable BAR, as do all of Intel’s Comet Lake-H series chips and upcoming H45 series, Hamberger said. It’s just up to the laptop and graphics card makers to make the feature usable.
“The final drivers, from our side, it’s already there,” Hamberger told Tom’s Hardware. “Some of the OEMs are working on finalizing exact timing on when they have the driver from the graphics vendors, so I think you’d have to ask them on the exact timing.”
The exec also pointed to some games seeing performance gains of 5-10%.
“It is a pretty nice boost by just turning on this pipeline and, again, standard implementation versus trying to do something custom and proprietary,” Hamberger said.
Of course, the more games that support resizable BAR, the better. But Hamberger has confidence that we’ll see a growing number of game developers make that possible.
“It’s a pretty late feature that … is being turned on, but since it’s following a standard, I think that the nice thing is if you’re a developer you don’t have to worry about it being like, ‘Hey, [only] these three systems have it.’ It’s gonna be available both on notebooks … it’s part of our Rocket Lake platform as well on the desktop side,” Hamberger said.
“Our expectation is that you’ll see more and more developers turn on the ability to use this, and we’ll continue to scale it.”
You can enjoy this week’s episode of The Tom’s Hardware Show via the video above, on YouTube, Facebook, or wherever you get your podcasts.
The Gigabyte M27Q is a very capable and speedy gaming monitor with few flaws. Though it has a huge color gamut, red is a little under-saturated, and it doesn’t offer extra contrast in HDR mode. But you do get superb gaming performance with 170 Hz and super-low input lag. As a value choice, it’s hard to beat.
For
170 Hz
Low input lag
Large color gamut
Accurate sRGB mode
KVM switch
Against
Aim Stabilizer causes ghosting
Lackluster HDR
DCI-P3 red is slightly under-saturated
Features and Specifications
Performance-to-price ratio is something we talk about often. While there are many seeking the lowest priced components and some for whom price is no object, most want the highest possible performance for the money.
Every computer component has a market sweet spot where you get most of the speed and power of top-level components for a lot less than the premium price, and it is no different for PC gaming monitors. We’re talking about the elements that gamers shop for: speed, resolution and screen size.
The Gigabyte M27Q ($330 as of writing) packs 1440p resolution into an IPS panel running at a speedy 170 Hz. The picture quality quotient is upped by a wide color gamut and HDR support. But is it the best gaming monitor for value-seekers?
Gigabyte M27Q Specs
Panel Type / Backlight: Super Speed IPS / W-LED, edge array
Screen Size / Aspect Ratio: 27 inches / 16:9
Max Resolution & Refresh Rate: 2560×1440 @ 170 Hz; AMD FreeSync Premium: 48-170 Hz
Native Color Depth & Gamut: 8-bit / DCI-P3; DisplayHDR 400; HDR10
Response Time (GTG): 0.5 ms
Brightness (mfr): 400 nits
Contrast (mfr): 1,000:1
Speakers: 2x 2W
Video Inputs: 1x DisplayPort 1.2; 2x HDMI 2.0; 1x USB-C
Audio: 3.5mm headphone output
USB 3.0: 2x up, 2x down
Power Consumption: 21W, brightness @ 200 nits
Panel Dimensions WxHxD w/base: 24.2 x 15.8-21 x 8 inches (615 x 401-533 x 203mm)
Panel Thickness: 1.7 inches (43mm)
Bezel Width: Top/sides: 0.3 inch (8mm); Bottom: 0.8 inch (21mm)
Weight: 12.1 pounds (5.5kg)
Warranty: 3 years
High-contrast VA panels make for amazing image quality on gaming monitors, but speedy IPS implementations are quickly moving to a position of domination in the speediest part of the genre. The M27Q opts for Super Speed (SS) IPS, Gigabyte’s branding for IPS tech that achieves lower response times by using a thinner liquid crystal layer and higher driving voltage than standard IPS screens. Our review focus runs at a 170 Hz refresh rate without overclock and supports AMD FreeSync Premium. It’s not an official G-Sync Compatible monitor, but we got the M27Q to run G-Sync (see our How to Run G-Sync on a FreeSync Monitor tutorial). A claimed 0.5 ms response time puts it in company with most 240 Hz monitors.
The backlight is a flicker-free white LED in an edge array that’s specced to deliver over 400 nits brightness for both SDR and HDR content. It also advertises a “Super Wide Color Gamut” on the box, and we confirmed that claim — although there’s a caveat that we’ll explain on page three.
For the price, the M27Q promises a lot of gaming performance and plenty of features for the enthusiast. Let’s dive in and see if it lives up to the spec sheet.
Assembly and Accessories
Unpacking the substantial carton reveals a panel already bolted to an upright. Just attach the large base with a captive bolt, and you’re ready to make connections. The power supply is a small external brick. Bundled cables include HDMI, DisplayPort and USB 3.0. Despite having a USB-C input, the M27Q does not include a USB-C cable.
Product 360
To keep the price low (and it is under $350), there are few frills in the M27Q’s design. The monitor doesn’t include an RGB effect, and styling is understated. Build quality, however, is in keeping with higher-priced monitors, and you get Gigabyte’s usual suite of gaming features, like aiming points and timers.
The M27Q is unassuming from the front with just a Gigabyte logo and a tiny white LED adorning the bottom trim strip. The remainder of the bezel is flush mounted with an 8 mm frame around the image. The anti-glare layer is the same 3H-hardness part found on almost all computer monitors. Here, it provides a sharp, bright image with no apparent grain or optical distortion.
There are a few styling cues around back with a shiny polished strip across the top underlined with a thin grill. “M27Q” is molded in below that and in the same gloss finish. The rest of the plastic cover is matte finished in two different textures. Futuristic-looking lines are set around a button for activating KVM mode, which lets you control two PCs connected to the monitor with one keyboard and mouse, and the joystick for controlling the on-screen display (OSD). The upright can be removed if you’d rather use the 100mm VESA mount for a monitor arm.
The stand is very solid with firm movements. The vertical movement has subtle detents, which make it even more positive. You get a 5.2-inch height adjustment plus -5 and 20-degree tilts. There is no swivel or portrait mode. Thankfully, we didn’t encounter any play or wobble when moving the M27Q around. It is very well-built.
The side view shows the M27Q to be a touch thinner than most 27-inch monitors. There are no USB or headphone jacks here. Instead, they’re on the bottom input panel, which includes two HDMI 2.0 ports, one DisplayPort 1.2, one USB-C and three USB 3.0 ports, one upstream and two down. Input labels are easy to see, making connections easier.
OSD Features
Outside of the monitor’s integrated on-screen display (OSD), the M27Q is controllable via the Windows desktop if you download the OSD Sidekick app. You can also create up to three custom reticles in the app. The OSD, however, offers the full-featured menu.
The M27Q’s OSD looks just like the one found on all Gigabyte and Aorus monitors with a large rectangular window and four columns making up the menu tree. There are seven sub-menus, plus a reset all function. The top portion always shows signal information and the status of various settings at a glance.
The first menu is for gaming and includes Aim Stabilizer, Gigabyte’s term for backlight strobe-based blur reduction. Engaging it means turning off Adaptive-Sync and overdrive. It doesn’t affect peak brightness, like most backlight strobes do, but in our tests, it introduced significant ghosting around moving objects. Aim Magnifier enlarges the center of the screen, just the thing for sniping. Unfortunately, it also requires losing Adaptive-Sync and overdrive.
Further adjustments include Black Equalizer, which brightens shadow areas, and Super Resolution, which adds edge enhancement. Display Mode contains aspect ratio options and has a FreeSync toggle. The overdrive feature here is interesting in that you can’t completely turn it off. It has three levels (Balance is the best choice), plus Auto. In our tests, Auto corresponded to the Balance choice. At this setting, overdrive reduced blur nicely without ghosting.
The Picture menu offers seven picture modes, plus three custom memories for user settings. You can store more configurations on your PC by using the OSD Sidekick app. The best mode is Standard, as it offers good out-of-box accuracy and calibrates to a high standard. It locks the user into the full native color gamut, which, as we found out, is very large at over 100% of DCI-P3. The sRGB mode is completely usable too, with accurate grayscale, gamma and color gamut rendering. That’s the choice for SDR content if you’re a color purist.
You can get to the Game Assist menu by pressing the joystick once, then clicking right. The monitor has one crosshair included, but you can create three more of your own using the aforementioned OSD Sidekick app. Game Info offers timers that count up or down and a frame rate indicator. Dashboard requires a USB connection and displays CPU and GPU temps, fan speeds and usage stats in an on-screen box that can be placed anywhere you like. If you plan to use multiple M27Qs, this menu has alignment marks available too.
Gigabyte M27Q Calibration Settings
In the Standard picture mode, the M27Q is accurate enough to satisfy most. The native color space is DCI-P3, but you can use the sRGB mode for an accurate display of that gamut. Its only available adjustment is brightness.
For calibration though, the Standard mode offers five gamma presets and three color temps plus a user mode. We left gamma alone but tweaked the RGB sliders for excellent grayscale and gamma tracking.
Here are our recommended calibration settings for enjoying SDR content on the Gigabyte M27Q and what we used for our calibrated benchmarks:
Picture Mode: Standard
Brightness 200 nits: 41
Brightness 120 nits: 19
Brightness 100 nits: 14
Brightness 80 nits: 9
Brightness 50 nits: 1 (min. 48 nits)
Contrast: 48
Gamma: 3
Color Temp User: Red 95, Green 98, Blue 100
When it comes to HDR signals, the only adjustment available is brightness. We found the best HDR quality by leaving that slider maxed.
Gaming and Hands-on
One unique feature of the M27Q is its KVM (keyboard, video, mouse) switch. The ability to control two PCs connected to the monitor with one keyboard and mouse isn’t that common among PC monitors and is almost always found in general-use/productivity monitors rather than gaming ones. In a gaming monitor, a KVM switch makes it easy to toggle from your best gaming laptop, for example, over to your work-sanctioned PC without unplugging and replugging all your peripherals. The M27Q’s OSD includes a wizard to easily assign video inputs and then switch between them with a dedicated button above the OSD joystick. The USB-C input can serve as both a video connection and a USB upstream port.
With the M27Q calibrated to 200 nits brightness, the Windows desktop looked bright and sharp. Our office has a moderate light level with filtered sunlight coming in one window. We never had trouble with glare or other environmental factors affecting the image. Color looked well-saturated but not overly so. Greens and blues are especially vibrant. Pictures of sky and grass radiated with brilliant hues. Skin tones looked natural and robust without excessive warmth. Detail in tiny fonts and icons was well-resolved, thanks to the screen’s 109 pixel per inch (ppi) pixel density — right at our sweet spot.
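That 109 ppi figure falls straight out of the resolution and screen size; here is the quick arithmetic as a sketch:

```python
import math

def pixels_per_inch(width_px, height_px, diagonal_in):
    """Pixel density: pixels along the diagonal divided by diagonal inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# The M27Q's 2560 x 1440 panel stretched across 27 inches:
print(f"{pixels_per_inch(2560, 1440, 27):.0f} ppi")  # ~109
```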
Turning on HDR brightened the M27Q’s image considerably, but you can compensate with the brightness slider if it seems too harsh. We only used HDR for gaming and video, not for workday tasks. It offers no benefit when editing spreadsheets. Switchover is automatic and rapid when you select the HDR option in Windows’ Display Settings.
With HDR on we played a bit of Call of Duty: WWII. Comparing HDR to SDR in this game showed a brighter overall environment for HDR but better detail and color saturation in SDR mode. Your selection will come down to user preference. We preferred playing all games in SDR mode. Other titles, like Tomb Raider, looked fantastic with deeply detailed shadows, vivid color and defined textures in this mode.
The M27Q’s video processing was visually perfect in every game we tried when paired with high frame rates. Our GeForce RTX 3090 drove the frames per second (fps) counter to 170 every time. At this speed, there is no hesitation or stutter at all. Frame tears were non-existent, and the monitor responded instantly to control inputs. Blur was also a non-issue.
On a machine running a Radeon RX 5700 XT graphics card, the same games ran at around 120 fps and delivered a similar experience. To casual gamers, that additional 50 fps makes little difference, but more skilled players will appreciate the M27Q’s extra speed. That performance was reliably delivered and never wavered in quality.
Our final takeaway was that this Gigabyte is a serious gaming monitor for an attractive price. Its performance-to-price ratio yielded favorable results on the battlefield.
Today, we bring you our first review of a custom-design Radeon RX 6900 XT graphics card: the MSI RX 6900 XT Gaming X Trio. When AMD originally announced the RX 6000 series “Big Navi,” with the RX 6900 XT release set for its own exclusive date, the company hadn’t made up its mind on whether to enable custom-design RX 6900 XT boards, which explains why it took some time for board partners to come up with custom designs. The Gaming X Trio is MSI’s flagship red-team graphics card, designed to square off against NVIDIA’s fastest, such as the GeForce RTX 3080 or even RTX 3090. It supercharges the fully-unlocked “Big Navi” silicon with a custom-design PCB bolstered by a stronger VRM design, triple power inputs, and the company’s latest Tri-Frozr cooling solution.
The Radeon RX 6900 XT by AMD is the company’s fastest GPU from this generation, and the flagship product based on the new RDNA2 graphics architecture that debuted on next-gen consoles, before making it to the PC. This common architecture enables easy optimization of games to the PC platform, as they’re already optimized for the console hardware. RDNA2 is AMD’s first graphics architecture with full DirectX 12 Ultimate readiness, including real-time raytracing through Ray Accelerators, fixed-function hardware. The RX 6900 XT is based on the same 7 nm “Navi 21” silicon as the RX 6800 series, but maxes it out, with all its 5,120 stream processors enabled, as well as 80 Ray Accelerators, 320 TMUs, and 128 ROPs.
Real-time raytracing is the holy grail of consumer 3D graphics, and today’s GPU vendors have figured out how to combine conventional raster 3D with certain real-time raytraced elements, such as lighting, shadows, reflections, etc., to significantly increase realism. Even this much raytracing demands enormous amounts of compute power. AMD’s approach has been to deploy fixed-function hardware for the most compute-intensive part of the raytracing pipeline, while relying on a mighty SIMD setup for other raytracing-related tasks, such as denoising. A by-product of this approach is vastly improved raster 3D performance. Not only are the stream processors doubled over the previous generation RDNA, but they also run at significantly higher engine clocks.
AMD has also doubled the amount of memory to 16 GB and uses the fastest JEDEC-standard 16 Gbps GDDR6 memory, although the bus width is still 256-bit, yielding 512 GB/s memory bandwidth. AMD has worked around the bandwidth problem by deploying a fast on-die level 3 cache directly on the GPU, which it calls Infinity Cache. This 128 MB scratchpad for the GPU, when combined with the GDDR6 memory, belts out an effective bandwidth of 2 TB/s. AMD has also taken the opportunity to update the multimedia acceleration and display I/O capabilities of their GPUs.
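The raw 512 GB/s figure follows directly from the bus width and memory data rate, as the short sketch below shows. The 2 TB/s “effective” number, by contrast, is AMD’s own claim; it folds in Infinity Cache hits and can’t be recomputed from the bus specs alone.

```python
def memory_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Raw bandwidth: bus width times per-pin data rate, in GB/s."""
    return bus_width_bits * data_rate_gbps / 8  # divide by 8: bits -> bytes

# RX 6900 XT: 256-bit bus with 16 Gbps GDDR6.
print(f"{memory_bandwidth_gbs(256, 16):.0f} GB/s")  # 512
```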
MSI takes things a step ahead of AMD by giving the RX 6900 XT a powerful VRM solution that pulls power from three 8-pin PCIe power connectors, and using its premium Tri-Frozr cooling solution deployed across all Gaming X Trio graphics cards from both the RX 6000 and NVIDIA RTX 30 series. This cooler features a chunky aluminium fin-stack heatsink, the company’s latest generation TorX fans, a blinding amount of RGB bling, and other innovative features, such as a mechanism that counteracts PCB bending. MSI’s MSRP for the RX 6900 XT isn’t known, but we doubt it’s anywhere close to AMD’s original MSRP. We’re expecting this card to sell for $1800 or higher—that’s the price point of other premium-design RX 6900 XT cards on the market right now.
If you buy something from a Verge link, Vox Media may earn a commission. See our ethics statement.
Many people who used to commute to an office every day have passed the six-month mark as remote workers, and we are now working from home offices, dining room tables, desks set up in the corners of bedrooms, or living room sofas. The folks here at The Verge are no different, and we thought it might be interesting to talk to some of our co-workers and find out how they’re coping with roommates, kids, spouses, and other distractions — not to mention having to find a place in their homes to double as workspaces.
For this article, we talk to The Verge video director Becca Farsace.
You’re one of The Verge’s video directors — what exactly do you do?
I am! How friggin’ cool is that? I drink a whole lot of coffee while simultaneously filming, hosting, and editing videos about technology for The Verge. It’s truly a dream.
Could we start by talking about the space? That’s a nice little nook you have.
When I was apartment hunting, the first thing I would look for was a space that this desk could fit in. This desk means a lot to me, so there was no way I was leaving it behind, but more on that later. In my current apartment, this is the only wall long enough to fit this six-foot-long slab of a tree, so I really didn’t have much of a choice. But it ended up working out well with a really nice view out to the yard on one side and Abe Lincoln on the other.
That desk is beautiful! I was wondering if you’d made it yourself.
Oh heck, yeah! I am always looking for ways to bring more natural elements into my home. The inspiration came from those really beautiful, long dining room tables you see in fancy log cabins. But I knew I would never have the space for that in a tiny Brooklyn apartment, and I also knew I couldn’t afford one of those.
So I went to the Big Reuse, my favorite secondhand store here in Brooklyn, where they sell unfinished, large pieces of wood, and decided I would make a desk out of one of these pieces instead. My partner at the time and I didn’t have too much money to spend, so we picked out a piece of wood that had a large crack in it, and was therefore discounted. We borrowed one of the Big Reuse’s rolling carts, and we rolled this beautiful six-foot-long piece of trunk home.
A workspace created by found objects.
We then played Legos with metal pipes on the floor of Lowe’s to create the legs. Everything about building this desk was a challenge, and there is certainly so much more to the story — like having to glue the crack together because I dropped the piece of wood in the backyard literally as soon as we got it home. But every day I get to work on the insides of a tree and at a desk that I made.
Cool! Now tell me about your tech.
Oh man, where do I begin? Since I am reviewing products, I have a lot of tech coming and going, often multiple pieces of one type of tech.
I have the most earbuds though, and I am constantly switching between them so that I can really understand what works best when and for what. For example, as I’m writing this it’s 11:30AM, and already today I have used Pixel Buds on a walk, AirPods on a video call, and the Galaxy Beans (Galaxy Buds Live) when I was making coffee. Those three are my daily drivers and all live on my desk when I’m not using them. (I have learned if they don’t have a home they will get lost.)
When the Vox office shut down for COVID-19, I agreed to do every supercut (that’s a video where we cut down live events to only the most important information for our viewers). But only if I could take home The Verge’s Mac Pro (a 3.2GHz 16-core Intel Xeon W processor, two Radeon Pro Vega II graphics cards, 96GB of RAM, and a 1TB SSD) and Pro Display XDR. And if I’m being honest, I’m going to be really sad when I have to go back to my late-2013 iMac. (I foresee a large update coming to that because I simply don’t know how I can go back.)
Anyhow, the Pro Display doesn’t have a webcam or a mic, so for video calls and working away from my desk, I have a 15-inch 2016 MacBook Pro with an i7 Intel processor, 16GB of RAM, and Radeon Pro 460 graphics card.
I have a 500GB Samsung SSD that I use for the projects I am currently working on and three other hard disk drives I back everything up on.
There is also almost always a camera I am reviewing on my desk, along with its lenses or mods. When this was written in October 2020, the GoPro Hero9 and Sony a7C were sitting up here with me.
That’s a great setup you have in that filing cabinet. How did you do it?
I have so many cords and dongles and just small pieces! So a filing cabinet just made sense to me: a large container that doesn’t need to be treated delicately and provides maximum storage. I also love the industrial look of filing cabinets, and they are easy and cheap to thrift. The blue dividers inside are a funnier story.
I really hate buying anything new and am a huge fan of upcycling. Which has led me to the very abundant dumpsters in NYC. (My parents get so mad when I tell people I dumpster dive — sorry, Mom!) There are large warehouses by my apartment that house many small businesses, and when COVID-19 started, sadly, many of them went out of business, which led to whole offices being dumped into dumpsters and driven away.
The best dumpster was from a clothing company that was really going for that Brooklyn outdoorsman vibe. They had tons of buttons and zippers and pieces of fabric, and along with all of those things, they had many, many dividers. I found those blue dividers deep in that dumpster before I even had the filing cabinet, but absolutely had an organization problem. Ugh, that was a great dumpster. (To be clear, I do not recommend diving into just any dumpster. There can be hella glass and pests that you really don’t want to come in contact with, but all of that is for another post.)
That laptop looks like it’s about to fall off. Do you ever lose tech to the floor?
You know, I don’t lose as much to the floor as you might think — knock on wood. I have broken enough tech to really make sure it is secure before I leave it. Plus, this is a standing desk, so a fall from this can truly be deadly.
Like me, your desk is littered with little toys and other tchotchkes. I count offhand: Must Go Hard water bottle, Little Panda, the rock pile, the plant, a troll doll, and more. Are there any that have stories that you could tell?
Play is extremely important to me. It is what inspires me, it is a large form of therapy for me, and it really informs my style. So I like to fill my desk with things to play with. These objects are constantly rotating but currently I have two favorites: the Little Panda piggy bank and the rocks.
The “Little Panda” comes out of the box and says “Hello!” when you put a coin on the white button, then he snatches the coin and says “Thank you!” A really close friend gave it to me too, so besides the fun it provides, I think of her when I use it, which always makes me smile. I keep a pile of coins next to him, it’s seriously so cute and fun!
Then, the rocks, I just move around in my hands endlessly. I would classify them as a “tinker toy” but also a subtle way to keep me close to nature and keep me grounded. I just love this Earth so much and am beyond grateful for the beauty it provides, so the rocks keep me close to that.
Is that a portrait of Lincoln on your wall?
Oh yeah, good ole Abe always watching over me. Most of the things I surround myself with have emotional value, but he was just a classic thrift find. I saw him and just knew I needed to see that every day. It’s actually a replica of Lincoln’s portrait that hangs in the White House. The funny part is I brought him home and put him on a hook that a previous tenant had hung and he just fit so perfectly. It truly feels like he was meant to be there. And I named my Wi-Fi after him because he is right above the router.
Toys, notes, photos, and other stuff.
How do you keep the world out while you’re working — or do you need to?
If I’m being honest, I’m not great at keeping the world out. There are simply so many fun things in my home I could be messing with, but I know this about myself and actively set boundaries and rules to stay focused. A timer has become my greatest friend and worst enemy. I usually set the timer in one- or two-hour increments during which I cannot leave my desk or stop working until the time runs out. Quite literally, I need to chain myself to my desk, but I have found that once I get into a flow I don’t have trouble staying in it, I just need to get into that flow first. And tying myself to things gets me flowing with it.
Is there anything you’d like to change or add to the current setup?
I have been wanting a stool with a back for awhile now. I stand 75 percent of my day, but when I do sit I find my stool to not be very comfy, and I think I would be a bit more productive if I was more comfortable in that 15 percent of time.
I also have this piece of wall that has nothing on it above and to the left of my monitor. I try to let walls decorate themselves with things that naturally find their way into my life, like Abe, but this has been white for a few months now, and it’s starting to bother me. Whatever it ends up being, I know it can’t be too big or it will make the wall feel small, and I know that it can’t be in a beefy frame that might compete with Abe. I think about this at least once a day and have been eyeing Oxford Pennant’s website for awhile now… Suggestions are welcome!
Is there anything we haven’t covered that you’d like to tell us about?
I just wanna say thank you to everyone reading this. I feel so grateful for being able to wake up and create every day, and without everyone tuning in I simply wouldn’t be able to do that. It’s weird times man, but the support I have received from y’all has really been such a source of light. Okay! That’s all. Sappy Becca out. Be well, buds!
Update February 9th, 2021, 10:30AM ET: This article was originally published on October 1st, 2020; the prices have been updated.