Matthew Wilson 3 days ago Featured Tech News, General Tech, Operating Systems
We’ve been hearing about Windows 10X for a few years now. The lighter version of Windows was supposed to be Microsoft’s answer to ChromeOS and boost sales of entry-level and student laptops. Unfortunately, development has reportedly been shelved.
According to a report from Petri, Windows 10X will no longer release this year and may never see the light of day in its current form. Originally, Windows 10X was created for dual-screen devices, like Microsoft’s unreleased Surface Neo tablet. Since then, the company has shifted gears in an effort to compete with ChromeOS, focusing the OS around single-screen devices.
Microsoft has reportedly shifted resources away from Windows 10X and back to Windows 10 proper. As we know, Microsoft is currently making an effort to update the current version of Windows 10 with new UI elements and design changes.
Some of the technologies originally built for Windows 10X may still see the light of day within Windows 10. This is apparently due to conversations with customers, who wanted some of those features Microsoft was developing, but didn’t want them in an entirely separate operating system.
Microsoft has yet to comment on these reports publicly.
KitGuru Says: It sounds like Windows 10X is meeting a similar fate to Windows 10 S, Microsoft’s earlier attempt at delivering a lighter version of Windows for students and entry-level PCs.
(Pocket-lint) – Oppo has enjoyed some successes recently; with some players faltering, such as Huawei, it’s clear that Oppo is attempting to step into the vacuum that’s been left behind.
The Oppo Find X3 Pro received rave reviews as a flagship. It’s also flanked by a couple of devices that share its name: the Find X3 Neo is, basically, built on the previous year’s flagship hardware, while the cheapest of the bunch is this, the Find X3 Lite.
Despite the ‘Lite’ name, however, good performance continues, with plenty to enjoy in this mid-range phone.
Design & Build
Dimensions: 159.1 x 73.4 x 7.9mm / Weight: 172g
3.5mm headphone socket
If you’re a follower of Oppo phones, you might get caught off guard by the shuffle in naming convention. The Find X3 Lite effectively rivals much of what the Find X2 Neo offered, but does make a couple of sacrifices to achieve its price point.
One area that doesn’t seem to have been sacrificed, however, is the build. The Find X3 Lite is a quality device, with Gorilla Glass 5 on the front and rear to help protect against scratches, and an aluminium frame holding everything together. There’s a clear case in the box too, to keep things looking fresh.
As is often the case on affordable devices, there’s a 3.5mm headphone socket. However, there’s no stereo speaker offering: it’s a mono affair, with the speaker on the bottom of the phone providing the power – and it’s easily blocked when holding the phone in landscape orientation, such as when playing games.
The Oppo Find X3 family has differing designs, so there’s no sculpted bump on the rear for the Lite’s cameras; it’s a lot more conventional – but we like the looks, especially on this Starry Black version, where the camera unit is slightly less prominent than some.
In line with the Lite name, there’s no waterproofing on this model, unlike elsewhere in the range.
Display
6.4-inch AMOLED panel, 2400 x 1080 resolution, 90Hz refresh
There’s a flat display on the Find X3 Lite, with minimal bezels for a smart look. A punch-hole sits in the top left-hand corner for the front camera, a convenient position for those playing games in landscape, as this corner generally is covered by your left hand, so you don’t have a hole getting in the way of your game.
It’s an AMOLED display, measuring 6.4 inches on the diagonal, with a Full HD+ resolution. That’s become the average for this size and type of device, with many flagships now sticking to a similar resolution for the sake of battery life.
There’s a 90Hz refresh rate, helping to smooth out some of your scrolling content, with the option to switch back to 60Hz if you prefer – although this is fairly buried within the settings so we doubt that anyone will bother to make that change. Again, it’s a typical setting for this level of device, with an increasing number of devices over the past 12 months offering a faster refresh.
You’ll note that the touch sampling rate is 180Hz, slower than many of the top devices, and while this doesn’t matter to a lot of people, it’s one area where Oppo is keeping a tight check on things to deliver at this price point.
The display is vibrant, delivering a great palette of colours, looking great whether you’re browsing online, gaming or watching movies. It’s not the brightest display around, so it struggles a little in brighter outdoor conditions and you may have to bump the brightness up or down a little to suit the conditions you’re in.
There’s a fingerprint scanner under the display too, which provides fast unlocking and has proven generally reliable, although it only takes a little dust or water to disturb it.
Hardware and performance
Qualcomm Snapdragon 765G 5G, 8GB RAM
4,300mAh battery, 65W fast-charging
128GB storage
The hardware loadout fits with those great mid-range devices from 2020. The Qualcomm Snapdragon 765G found here is good solid hardware that’s delivered many great phones in the recent past. Technically, that’s now been replaced with the Snapdragon 780G, but the Find X3 Lite was launched before that hardware was available.
That’s not a huge loss: while there might be some incremental improvements to performance, you’re still getting a great device for the asking price. Indeed, the Motorola Moto G100 uses that newer hardware, but is quite a bit more expensive than the X3 Lite.
Performance-wise, there’s little to complain about. We’ve been playing a full run of games on the Find X3 Lite and they play perfectly smoothly, while everything else is slick and fast. There’s not really anything in performance terms that fits the Lite name – it’s a great experience.
There’s no microSD card support, however, so the 128GB of internal storage is all you get.
Where Oppo is adding some excitement is with 65W charging. That’s thanks to the SuperVOOC 2.0 technology and the chunky charger that you’ll find in the box. What this means is you’ll be able to recharge the phone’s battery at blistering speed – from zero to full in around 35 minutes.
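As a rough sanity check on that 35-minute figure, the back-of-envelope maths works out as follows – note the 3.85V nominal cell voltage is our assumption, not a published Oppo spec:

```python
# Back-of-envelope charge-time estimate for a 4,300mAh battery on a 65W charger.
# The 3.85V nominal Li-ion cell voltage is an assumption, not an Oppo figure.
battery_mah = 4300
nominal_voltage = 3.85
charger_watts = 65

energy_wh = battery_mah / 1000 * nominal_voltage  # ~16.6 Wh of stored energy
ideal_minutes = energy_wh / charger_watts * 60    # ~15 min if 65W were sustained

print(round(energy_wh, 1), round(ideal_minutes, 1))
```

In practice the charger only holds its peak power for part of the cycle and tapers off as the cell fills (plus some energy is lost as heat), which is why the real-world time is closer to 35 minutes than the theoretical 15.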
There is battery management software that will attempt to control the charging speeds to preserve battery health though, so that short time-frame isn’t always feasible. This software monitors your usage patterns and, if you’re in the habit of charging overnight, will slow the charging so the battery lasts longer over an extended period of ownership.
However, this can be irritating at times – especially if you only have a short window to charge overnight. The automatic system doesn’t seem to recognise the difference between plugging in at your normal time and plugging in six hours later, so you can wake up to a phone that isn’t fully charged if it doesn’t get the full eight hours on the charger it would normally expect.
We also found that this setting had a habit of turning itself back on, even when we’d turned it off. The best solution, in reality, is a short quick charge during the day and leaving your phone off the charger at night. That should work out for most people, because the battery life of the Find X3 Lite is good, easily lasting through the day, including a few hours of gaming.
Cameras
Quad rear camera system:
Main: 64-megapixel, f/1.7 aperture
Ultra-wide: 8MP, f/2.4
Macro: 2MP, f/2.4
Mono: 2MP, f/2.4
Front: 32MP, f/2.4
Oppo plays the typical 2021 mid-range phone game, plastering the rear of the phone with sensors so it can claim a “quad camera”. There’s the appearance of the low-resolution macro sensor – which isn’t anything to get excited about – and there’s also a 2-megapixel “mono camera”.
This mono lens notionally feeds data into the portrait system to improve its performance. Yet portrait mode is offered on the front camera from a single lens, which suggests to us that the extra sensor is simply unnecessary.
The portrait performance isn’t especially good anyway, with the edge detection a little crude. The bokeh effect needs to be set at the time of taking the photo, because you can’t fully adjust it once the picture is taken: there are options to increase the blur afterwards, but unlike the options from Samsung or Google Pixel, for example, you can’t reduce the level of blur if you find the effect too strong.
The Lite’s front camera is reasonable, though we can’t fathom why it’s a 32-megapixel sensor, as that doesn’t deliver any real benefits. There’s no pixel binning, so it pumps out 32-megapixel images which just take up more storage and need more data to share. It will give you a decent shot in good conditions, but you’ll need to use the night mode in low light, as images get noisy rather quickly in less-than-perfect situations.
The rear camera sees a headline 64-megapixel main sensor, which is par for the course. This is very much about appearing to keep up with rivals rather than actually delivering better images – but again, it’s typical for this level of phone.
Here there is some pixel binning, with 16-megapixel images as a result by default. If you want to shoot at full 64-megapixel resolution you have the option to turn that on in normal photo mode; there’s also an Extra HD mode which oversamples to give a 108-megapixel image.
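Pixel binning of this sort simply averages each 2×2 group of sensor pixels into one output pixel, which is how a 64-megapixel sensor yields 16-megapixel images. A minimal NumPy sketch of the idea – illustrative only, since real sensors bin like-coloured pixels on the Bayer mosaic before demosaicing:

```python
import numpy as np

def bin_2x2(img):
    """Average each 2x2 block of pixels into one - a 4-to-1 'bin'."""
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

sensor = np.arange(16, dtype=float).reshape(4, 4)  # toy 4x4 "sensor"
binned = bin_2x2(sensor)                           # 2x2 output
print(binned)  # each value is the mean of one 2x2 block
```

The payoff in a real camera is noise reduction: averaging four noisy photosite readings produces one cleaner pixel, at the cost of resolution.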
Visually, those images basically look the same (the Extra HD mode loses the AI scene optimisation), but greater resolution gives the potential to zoom and crop – although the detail is rather mushy and we can’t see anyone really wanting to do this.
With all that said, the main camera puts in a decent performance for this price of phone and you’ll get decent photos from it in most conditions. There’s no zoom, however, which is a slight limitation, only offering digital zoom.
The ultra-wide camera puts in an average performance, although we like the options this introduces from a usability perspective. However, there is a colour shift between this and the main camera, as well as blurring as you move out of the centre of the frame.
As with many phones in this price category, the Lite will probably do everything you want it to do – as well as a whole load of stuff you don’t want it to do. Just don’t fall for the “quad camera” marketing: stick to the main lens and there’s a perfectly usable single camera on the rear.
Software
Google Android 11 OS
Oppo ColorOS 11
Like many brands, Oppo goes to town customising Google’s Android operating system with its own ColorOS setup. ColorOS has seen great improvements over recent years to make it a lot more usable and approachable – and the offering on the Find X3 Lite isn’t too bad.
There isn’t too much bloat or duplication – except for photos, music, and a relaxation app you’ll likely never use – but with Google Messages, Gboard and Chrome all in place, there’s not too much messing around needed to get to the services you want. Access to Google Discover from the home screen is welcome too.
But beneath this, ColorOS changes the look and feel of many areas of Android 11. It gives you plenty of options for customisation, but some things fall through the cracks too.
Notifications seem to be particularly irksome: some applications have failed to deliver notifications consistently, and we also found that “bedtime mode” – part of the digital wellbeing suite – took about a week to figure out the schedule we gave it.
Some of these might just be teething troubles, but the experience doesn’t feel quite as slick as the software on the Samsung Galaxy A52 5G, which is a close rival to this phone.
At the same time, we’ve not found the software to get in the way: once you’re in your favourite app or game things run very much as they should.
Verdict
There’s a lot that’s interesting about the Oppo Find X3 Lite: the core hardware is solid, the display is good, and fast battery charging is a real benefit.
The niggles are also fairly minor: the over-sell on the cameras, the single speaker that’s easily blocked, and some software quirks that seem to block notifications. Despite this running on year-old hardware, it’s still a capable phone, and the Lite naming is perhaps an undersell considering how much you get for your money.
But over the past 12 months, this has emerged as the most competitive smartphone segment: there are better camera performers on this hardware (Pixel 4a 5G), there are better displays in this position (Samsung Galaxy A52 5G) and lots of options besides.
Also consider
Samsung Galaxy A52 5G
Samsung’s budget offering sits a little lower in the hardware stakes, but offers waterproofing – which is rare at this level – as well as a great 120Hz display.
Redmi Note 10 Pro
Redmi offers blistering value for money, although this is a 4G-only model on slightly lower-end hardware – but you still get a lot of phone for the price.
Microsoft has indefinitely delayed Windows 10X, its lightweight operating system for low-spec systems and foldables, according to reports from Windows Central and Petri. Instead, the company will turn its focus to the existing desktop experience.
A Microsoft spokesperson told Tom’s Hardware that “Microsoft has nothing to share at this time.”
Windows 10X was introduced in 2019 at a Surface-based event as a version of the Windows 10 operating system designed for dual-screen devices, like the Surface Neo. But Windows 10X was delayed to spring 2021 after a shift to single screen devices to service people’s needs during the Covid-19 pandemic. The Surface Neo was delayed, and no date was ever given for a release. The device was also removed from Microsoft’s website.
In theory, not only would Windows 10X power foldables, but also rival Google’s Chrome OS with support for low-power computers. The Surface Neo, for instance, was going to run on Intel’s Lakefield platform.
The company is reportedly focusing on the existing Windows 10 experience now. Its Sun Valley update, which will include a visual overhaul, will likely see the benefits of Windows 10X later this year.
Windows 10X was not Microsoft’s first attempt to rejuvenate Windows 10. Windows 10 S showed up with the Surface Laptop in 2017, but was seen as limited and later turned into a locked-down mode that users could easily switch out of. And let’s not forget Windows RT, which launched alongside the original Surface tablet in 2012, only to be discontinued a year later, although technically that was in the days of Windows 8.
Recently, Microsoft announced that 1.3 billion active devices are running Windows 10. It appears that Microsoft is focusing on those experiences instead.
In what feels like an increasingly rare occurrence, Google announced that it’s updating Wear OS with a new feature today — but don’t get too excited, it’s just a keyboard. In this case, Google is porting over Gboard, the swipeable, predictive text-powered keyboard that it offers for Android, iOS, and Android TV, as spotted by 9to5Google.
Like the larger phone version, Gboard for Wear OS supports multiple input methods on a keyboard that’s slightly more finger-friendly than the old version the operating system used. You can type by tapping, swiping, or speaking. The keyboard also has easier access to emoji and “enhanced suggestions” above the keyboard.
Along with the new keyboard, Google says it’s also introducing multi-language support for all of the languages offered on Wear OS. To switch languages, Gboard for Wear OS has a language shortcut at the bottom of the keyboard that will pull up a menu with options.
The update is minor, but Google is at least paying attention to an operating system that’s mainly received minor updates for a while. Google opened up Tiles on the OS to third-party developers earlier this year, and before that, promised to improve CPU performance in August 2020. With Google I/O 2021 — the company’s developer conference — on the horizon, there’s a possibility Google has more improvements to share for Wear OS, but until then, be happy you can at least swipe to type on your wrist.
Microsoft is finally preparing to refresh its Windows 95-era icons. The software giant has been slowly improving the icons it uses in Windows 10, as part of a “sweeping visual rejuvenation” planned for later this year. We saw a number of new system icons back in March, with new File Explorer, folder, Recycle Bin, disk drive icons, and more. Microsoft is now planning to refresh the Windows 95-era icons you still sometimes come across in Windows 10.
Windows Latest has spotted new icons for the hibernation mode, networking, memory, floppy drives, and much more as part of the shell32.dll file in preview versions of Windows 10. This DLL is a key part of the Windows Shell, which surfaces icons in a variety of dialog boxes throughout the operating system. It’s also a big reason why Windows icons have been so inconsistent throughout the years. Microsoft has often modernized other parts of the OS only for an older app to throw you into a dialog box with Windows 95-era icons from shell32.dll.
Hopefully this also means Windows will never ask you for a floppy disk drive when you dig into Device Manager to update a driver. That era of Windows, along with these old icons, has been well and truly over for more than a decade now.
All of this work to improve the consistency of Windows is part of Microsoft’s design overhaul to Windows 10, codenamed Sun Valley. The visual changes are expected to appear in the Windows 10 21H2 update that should arrive in October. Microsoft has not officially detailed its Sun Valley work, but a job listing earlier this year teased a “sweeping visual rejuvenation of Windows.”
Microsoft has so far revealed new system icons for Windows 10, alongside File Explorer icon improvements, and more colorful Windows 10 icons that appeared last year. Rounded corners will also be a big part of Sun Valley, alongside changes to built-in apps and the Start menu.
We’re expecting to hear more about Sun Valley at Microsoft’s Build conference later this month, or as part of a dedicated Windows news event.
Adobe Flash is reaching the very end of its life, and the final nail in its coffin comes from Microsoft. Even though Adobe officially ended support for Flash on the very last day of 2020, Flash remains a component of Windows 10 – that is, until Microsoft releases the 21H1 update. Rollout for this update begins this month, and it will remove the Flash component from the operating system.
The Verge spotted this change in an update posted to the Windows Blog. Titled “Update on Adobe Flash Player End of support”, it outlines that a component called ‘Update for the removal of Adobe Flash Player’ will be included in cumulative updates for Windows 10, starting from version 1507, in July. Machines on Windows 8.1, Windows Server 2012, and Windows Embedded 8 Standard will also receive this component via the Monthly Rollup and Security Only Update.
The post also notes that when you update to Windows 10, version 21H1 or later, Flash will be removed.
Adobe Flash was once used to run interactive multimedia applications, like games or programs, right from the web browser in the late 90s and early 2000s. Open web standards like HTML5, WebGL, and WebAssembly became replacements for Flash Player. This, in addition to increasing security vulnerabilities, led to the obsolescence of the once-popular multimedia web platform.
Resident Evil Village is the latest addition to the long-running horror series, and just like last year’s Resident Evil 3 remake, it is built on Capcom’s RE Engine. We test over 25 GPUs at 1080p, 1440p and 4K to find out what sort of hardware you need to run this game at maximum settings, while also looking at the performance and visual quality of the game’s ray tracing options.
In terms of visual settings, there are a number of options in the display menu. Texture and texture filtering settings are on offer, as well as variable rate shading, resolution, shadows, and so on. There’s also a selection of quick presets, and for our benchmarking today we opted for the Max preset, but with V-Sync and CAS disabled.
One interesting thing about the Max preset is the default ambient occlusion setting – FidelityFX CACAO, which stands for Combined Adaptive Compute Ambient Occlusion, a technology optimised for RDNA-based GPUs. To make sure this setting wouldn’t unfairly penalise Nvidia GPUs, we tested CACAO vs SSAO with both the RX 6800 and RTX 3070:
Both GPUs only lost 3% performance when using CACAO instead of SSAO, so we were happy to use the former setting for our benchmarking today.
Driver Notes
AMD GPUs were benchmarked with a pre-release driver provided by AMD for Resident Evil Village.
Nvidia GPUs were benchmarked with the 466.27 driver.
Test System
We test using a custom-built system from PCSpecialist, based on Intel’s Comet Lake-S platform.
CPU: Intel Core i9-10900K (overclocked to 5.1GHz on all cores)
Motherboard: ASUS ROG Maximus XII Hero Wi-Fi
Memory: Corsair Vengeance DDR4 3600MHz (4 x 8GB), CL 18-22-22-42
Graphics Card: Varies
System Drive: 500GB Samsung 970 Evo Plus M.2
Games Drive: 2TB Samsung 860 QVO 2.5″ SSD
Chassis: Fractal Meshify S2 Blackout Tempered Glass
CPU Cooler: Corsair H115i RGB Platinum Hydro Series
Power Supply: Corsair 1200W HX Series Modular 80 Plus Platinum
Operating System: Windows 10 2004
Our 1-minute benchmark pass came from quite early on in the game, as the player descends into the village for the first time. Over the hour or so that I played, the results do seem representative of wider gameplay, with the exception of intense combat scenes, which can be a bit more demanding. Those are much harder to benchmark accurately though, as there’s more variation from run to run, so I stuck with this outdoor scene.
1080p Benchmarks
1440p Benchmarks
2160p (4K) Benchmarks
Closing Thoughts
After previously looking at the Resident Evil 3 remake last year, a game which is also built on Capcom’s RE Engine, I wasn’t too surprised to see that overall performance is pretty similar between both games.
That’s certainly a good thing though, as the game plays very well across a wide range of hardware. At the lower end, weaker GPUs like the GTX 1650, or older cards like the GTX 1060 6GB, still deliver a very playable experience at 1080p max settings. Village also scales very well, so if you have a higher-end GPU, you will be rewarded with significantly higher frame rates.
AMD does see the benefit of its partnership with Capcom for this one, as RDNA-based GPUs do over-perform here compared to the average performance we’d expect from those cards. The RX 6700 XT matches the RTX 3070, for instance – when we’d typically expect it to be slower – while the RX 6900 XT is 7% faster than the RTX 3090 at 1440p.
In terms of visual fidelity, I don’t think the RE Engine delivers a cutting edge experience like you’d get from Cyberpunk 2077 or Red Dead Redemption 2 when using Ultra settings, but it still looks good and I am particularly impressed with the detailed character models.
The only negative point for me is that the ray tracing is pretty underwhelming. As we demonstrate in the video above, it doesn’t really deliver much extra from a visual perspective, at least in my opinion. Overall though, Resident Evil Village looks good and runs well on pretty much any GPU, so it definitely gets a thumbs up from me.
KitGuru says: Capcom’s newest game built on the RE Engine delivers impressive performance and looks good while doing so.
The Apple Watch Series 3 was first released in September 2017, bringing fitness improvements and a faster processor. Nearly four years later, in 2021, Apple is still selling the Series 3 as its entry-level Apple Watch model starting at $199, an $80 savings compared to the more recent Apple Watch SE. Only, as I’ve recently learned, “still selling” and “supporting in a reasonable manner” are two very different things, and updating an Apple Watch Series 3 in 2021 is a nightmare of infuriating technological hoops to jump through.
Normally, updating an Apple Watch is an annoyingly long but straightforward process: you charge your Watch up to 50 percent, plug it in, and wait for the slow process of the update transferring and installing to your smartwatch.
But the non-cellular Apple Watch Series 3 has a tiny 8GB of internal storage, a fair chunk of which is taken up by the operating system and other critical software. So installing a major update — like the recently released watchOS 7.4 — goes something like this:
Unpair and wipe your Apple Watch to factory settings
Set up the Apple Watch again and restore from backup
Realize you weren’t supposed to restore from your backup yet
Watch an episode or two of Brooklyn Nine-Nine while you wait for the backup to finish restoring
Start from step one again — but as a brand-new Apple Watch, without restoring from an existing backup
Update completely fresh Apple Watch, which now has enough free memory to update
Consider how much you actually want to use this face unlocking feature everyone keeps hyping up in the first place
Unpair and wipe the Apple Watch a third time
Restore from your backup and finally use normally
And the issue seems to apply whether you’ve installed a pile of apps or not. Apple’s support website doesn’t even recommend that Series 3 owners bother trying to clear up space — it just advocates that they go straight to the aforementioned reset cycle.
It’s clear that the current process is untenable.
I’m an editor at a technology news site and willing to put in the comical amount of time and energy to manage this, frustrating as it might be. But if you’re a more casual user — the same one who is likely to own an older, outdated Watch in the first place — why on earth would you bother with the worst update mechanism since GE’s instructions for resetting a smart bulb? And being able to update your hardware’s software is important: the just-released watchOS 7.4.1, for example, patches a critical security flaw. But with it being so difficult to install, there’s a good chance that plenty of Series 3 owners won’t bother.
I know that Apple loves to claim support for as many older hardware generations as it can with each new update. It’s one of the biggest appeals of Apple products, compared to the lackluster pace of updates on competing Android phones (like the just-deprecated Galaxy S8).
But the miserable update process for the Series 3 is a strong argument that Apple is being a little too generous with what it considers “current” hardware. Keeping the Series 3 around this long was always a money grab: a way for Apple to clear out old inventory and take advantage of mature manufacturing processes that have long since broken even, while appealing to users who really can’t afford the extra $80 for the markedly better Apple Watch SE. It’s a similar trend to the inexplicably still-on-sale Apple TV HD, which is almost six years old and costs just $30 less than the brand-new 4K model. (Much like the Series 3, don’t buy a new Apple TV HD in 2021 either.)
But hopefully, with the announcement of watchOS 8 almost assuredly around the corner at WWDC this June, the company takes into account the basic functionality of its hardware when considering what it does and doesn’t offer support for. Because if Apple is going to insist on selling a product this old in the future, it’s going to need to be a lot more mindful of just how it actually handles its software support.
Apple is planning to add a new HiFi tier to its Apple Music streaming service, and it could coincide with the launch of its AirPods 3 true wireless earbuds, sources say. Hits Daily Double quotes music label sources as saying the HiFi tier will allow high-fidelity music streaming (we’re assuming CD-quality) and cost the same $9.99 (£9.99, AU$9.99) as the current Individual tier.
Both the new tier and the AirPods 3 will launch “in the coming weeks”, the sources say. Apple’s annual WWDC developers conference starts on 7th June, so there’s a chance we could see a launch around then.
An Apple Music HiFi tier would compete directly with Spotify, which is due to launch its own CD-quality Spotify HiFi tier later in the year.
The two firms are arch-rivals. Apple was recently reprimanded by the European Commission for taking a cut of any music subscriptions bought from within apps running on its iOS operating system. It was also criticised for not allowing firms to advertise other – potentially cheaper – ways of subscribing on their iOS apps. The case followed a complaint from Spotify in 2019.
Last week, Spotify raised some of its prices, which didn’t go down particularly well in some quarters. It also launched podcast subscriptions, another area in which it will compete directly with Apple.
The third-gen AirPods have been rumoured for a while now. The AirPods 2 launched in 2019, so the earbuds are due a refresh. The next model is expected to look similar to the AirPods Pro, but without Pro-only features like active noise cancellation (ANC).
Amazon currently offers a high-fidelity service called Amazon Music HD. At £12.99 ($12.99) a month, it costs a little more than the standard Amazon Music Unlimited tier but it does include hi-res versions of a number of tracks. Whether Apple wants to go down the hi-res audio route remains to be seen, but it looks like all could be revealed in the next few weeks.
The University of Minnesota’s path to banishment was long, turbulent, and full of emotion
On the evening of April 6th, a student emailed a patch to a list of developers. Fifteen days later, the University of Minnesota was banned from contributing to the Linux kernel.
“I suggest you find a different community to do experiments on,” wrote Linux Foundation fellow Greg Kroah-Hartman in a livid email. “You are not welcome here.”
How did one email lead to a university-wide ban? I’ve spent the past week digging into this world — the players, the jargon, the university’s turbulent history with open-source software, the devoted and principled Linux kernel community. None of the University of Minnesota researchers would talk to me for this story. But among the other major characters — the Linux developers — there was no such hesitancy. This was a community eager to speak; it was a community betrayed.
The story begins in 2017, when a systems-security researcher named Kangjie Lu became an assistant professor at the University of Minnesota.
Lu’s research, per his website, concerns “the intersection of security, operating systems, program analysis, and compilers.” But Lu had his eye on Linux — most of his papers involve the Linux kernel in some way.
The Linux kernel is, at a basic level, the core of any Linux operating system. It’s the liaison between the OS and the device on which it’s running. A Linux user doesn’t interact with the kernel, but it’s essential to getting things done — it manages memory usage, writes things to the hard drive, and decides what tasks can use the CPU when. The kernel is open-source, meaning its millions of lines of code are publicly available for anyone to view and contribute to.
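That liaison role is easiest to see at the system-call boundary: a user program never touches the disk itself, it just hands the work to the kernel. A trivial sketch (the filename is arbitrary):

```python
import os

# os.write() wraps the write(2) system call: the program hands bytes to the
# kernel, and the kernel decides how and when they reach the physical disk.
fd = os.open("demo.txt", os.O_WRONLY | os.O_CREAT | os.O_TRUNC)
os.write(fd, b"hello, kernel\n")
os.close(fd)

# Reading the file back goes through the kernel too - read(2) this time.
with open("demo.txt", "rb") as f:
    print(f.read())
```

Every layer above this – the C library, language runtimes, applications – ultimately funnels its I/O, memory, and scheduling requests through the same kernel interface, which is why a bug in the kernel can affect everything running on the machine.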
Well, “anyone.” Getting a patch onto people’s computers is no easy task. A submission needs to pass through a large web of developers and “maintainers” (thousands of volunteers, who are each responsible for the upkeep of different parts of the kernel) before it ultimately ends up in the mainline repository. Once there, it goes through a long testing period before eventually being incorporated into the “stable release,” which will go out to mainstream operating systems. It’s a rigorous system designed to weed out both malicious and incompetent actors. But — as is always the case with crowdsourced operations — there’s room for human error.
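The staged gating described above can be sketched as a toy model. Everything here (the stage names, the `advance` helper) is invented for illustration and is not real kernel tooling; the point is simply that a patch must clear every stage in order before it ships to users.

```python
# Toy model of the kernel's staged review pipeline (illustrative only).
STAGES = ["maintainer review", "mainline testing", "stable release"]

def advance(stages_patch_would_pass):
    """Return the stages a patch clears, stopping at the first failure.

    stages_patch_would_pass: set of stage names the patch would survive.
    """
    cleared = []
    for stage in STAGES:
        if stage not in stages_patch_would_pass:
            break
        cleared.append(stage)
    return cleared

# A patch that fools a maintainer but fails mainline testing never
# reaches the stable release that mainstream operating systems pick up.
print(advance({"maintainer review"}))  # ['maintainer review']
print(advance(set(STAGES)))            # all three stages
```

The gate order matters: human error at one stage is survivable as long as a later stage catches the problem, which is why the community treats deliberate attempts to slip past reviewers as an attack on the whole chain.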
Some of Lu’s recent work has revolved around studying that potential for human error and reducing its influence. He’s proposed systems to automatically detect various types of bugs in open source, using the Linux kernel as a test case. These experiments tend to involve reporting bugs, submitting patches to Linux kernel maintainers, and reporting their acceptance rates. In a 2019 paper, for example, Lu and two of his PhD students, Aditya Pakki and Qiushi Wu, presented a system (“Crix”) for detecting a certain class of bugs in OS kernels. The trio found 278 of these bugs with Crix and submitted patches for all of them — the fact that maintainers accepted 151 meant the tool was promising.
On the whole, it was a useful body of work. Then, late last year, Lu took aim not at the kernel itself, but at its community.
In “On the Feasibility of Stealthily Introducing Vulnerabilities in Open-Source Software via Hypocrite Commits,” Lu and Wu explained that they’d been able to introduce vulnerabilities into the Linux kernel by submitting patches that appeared to fix real bugs but also introduced serious problems. The group called these submissions “hypocrite commits.” (Wu didn’t respond to a request for comment for this story; Lu referred me to Mats Heimdahl, the head of the university’s department of computer science and engineering, who referred me to the department’s website.)
The explicit goal of this experiment, as the researchers have since emphasized, was to improve the security of the Linux kernel by demonstrating to developers how a malicious actor might slip through their net. One could argue that their process was similar, in principle, to that of white-hat hacking: play around with software, find bugs, let the developers know.
But the loudest reaction the paper received, on Twitter and across the Linux community, wasn’t gratitude — it was outcry.
“That paper, it’s just a lot of crap,” says Greg Scott, an IT professional who has worked with open-source software for over 20 years.
“In my personal view, it was completely unethical,” says security researcher Kenneth White, who is co-director of the Open Crypto Audit Project.
The frustration had little to do with the hypocrite commits themselves. In their paper, Lu and Wu claimed that none of their bugs had actually made it to the Linux kernel — in all of their test cases, they’d eventually pulled their bad patches and provided real ones. Kroah-Hartman, of the Linux Foundation, contests this — he told The Verge that one patch from the study did make it into repositories, though he notes it didn’t end up causing any harm.
Still, the paper hit a number of nerves among a very passionate (and very online) community when Lu first shared its abstract on Twitter. Some developers were angry that the university had intentionally wasted the maintainers’ time — which is a key difference between Minnesota’s work and a white-hat hacker poking around the Starbucks app for a bug bounty. “The researchers crossed a line they shouldn’t have crossed,” Scott says. “Nobody hired this group. They just chose to do it. And a whole lot of people spent a whole lot of time evaluating their patches.”
“If I were a volunteer putting my personal time into commits and testing, and then I found out someone’s experimenting, I would be unhappy,” Scott adds.
Then, there’s the dicier issue of whether an experiment like this amounts to human experimentation. It doesn’t, according to the University of Minnesota’s Institutional Review Board. Lu and Wu applied for approval in response to the outcry, and they were granted a formal letter of exemption.
The community members I spoke to didn’t buy it. “The researchers attempted to get retroactive Institutional Review Board approval on their actions that were, at best, wildly ignorant of the tenants of basic human subjects’ protections, which are typically taught by senior year of undergraduate institutions,” says White.
“It is generally not considered a nice thing to try to do ‘research’ on people who do not know you are doing research,” says Kroah-Hartman. “No one asked us if it was acceptable.”
That thread ran through many of the responses I got from developers — that regardless of the harms or benefits that resulted from its research, the university was messing around not just with community members but with the community’s underlying philosophy. Anyone who uses an operating system places some degree of trust in the people who contribute to and maintain that system. That’s especially true for people who use open-source software, and it’s a principle that some Linux users take very seriously.
“By definition, open source depends on a lively community,” Scott says. “There have to be people in that community to submit stuff, people in the community to document stuff, and people to use it and to set up this whole feedback loop to constantly make it stronger. That loop depends on lots of people, and you have to have a level of trust in that system … If somebody violates that trust, that messes things up.”
After the paper’s release, it was clear to many Linux kernel developers that something needed to be done about the University of Minnesota — previous submissions from the university needed to be reviewed. “Many of us put an item on our to-do list that said, ‘Go and audit all umn.edu submissions,’” said Kroah-Hartman, who was, above all else, annoyed that the experiment had put another task on his plate. But many kernel maintainers are volunteers with day jobs, and a large-scale review process didn’t materialize. At least, not in 2020.
On April 6th, 2021, Aditya Pakki, using his own email address, submitted a patch.
There was some brief discussion from other developers on the email chain, which fizzled out within a few days. Then Kroah-Hartman took a look. He was already on high alert for bad code from the University of Minnesota, and Pakki’s email address set off alarm bells. What’s more, the patch Pakki submitted didn’t appear helpful. “It takes a lot of effort to create a change that looks correct, yet does something wrong,” Kroah-Hartman told me. “These submissions all fit that pattern.”
So on April 20th, Kroah-Hartman put his foot down.
“Please stop submitting known-invalid patches,” he wrote to Pakki. “Your professor is playing around with the review process in order to achieve a paper in some strange and bizarre way.”
Maintainer Leon Romanovsky then chimed in: he’d taken a look at four previously accepted patches from Pakki and found that three of them added “various severity” security vulnerabilities.
Kroah-Hartman hoped that his request would be the end of the affair. But then Pakki lashed back. “I respectfully ask you to cease and desist from making wild accusations that are bordering on slander,” he wrote to Kroah-Hartman in what appears to be a private message.
Kroah-Hartman responded. “You and your group have publicly admitted to sending known-buggy patches to see how the kernel community would react to them, and published a paper based on that work. Now you submit a series of obviously-incorrect patches again, so what am I supposed to think of such a thing?” he wrote back on the morning of April 21st.
Later that day, Kroah-Hartman made it official. “Future submissions from anyone with a umn.edu address should be default-rejected unless otherwise determined to actually be a valid fix,” he wrote in an email to a number of maintainers, as well as Lu, Pakki, and Wu. Kroah-Hartman reverted 190 submissions from Minnesota affiliates — 68 couldn’t be reverted but still needed manual review.
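The audit behind those numbers amounts to sorting one institution's commits into clean revert candidates and ones that need manual review. A minimal sketch of that triage follows; the commit records are invented for illustration (a real audit runs over the actual git history), but the split mirrors the 190-reverted / 68-manual-review outcome described above.

```python
# Invented commit records; "revertable" marks whether a clean revert applies.
commits = [
    {"sha": "a1b2c3d", "author": "dev@umn.edu",     "revertable": True},
    {"sha": "d4e5f6a", "author": "dev@example.com", "revertable": True},
    {"sha": "b7c8d9e", "author": "lab@umn.edu",     "revertable": False},
]

def flag_for_review(commits, domain):
    """Split a domain's commits into revert candidates and manual reviews."""
    flagged = [c for c in commits if c["author"].endswith("@" + domain)]
    to_revert = [c["sha"] for c in flagged if c["revertable"]]
    manual = [c["sha"] for c in flagged if not c["revertable"]]
    return to_revert, manual

to_revert, manual = flag_for_review(commits, "umn.edu")
print(to_revert)  # ['a1b2c3d']
print(manual)     # ['b7c8d9e']
```

Filtering by email domain is crude but cheap, which is why "go and audit all umn.edu submissions" was a tractable, if tedious, to-do item for volunteer maintainers.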
It’s not clear what experiment the new patch was part of, and Pakki declined to comment for this story. Lu’s website includes a brief reference to “superfluous patches from Aditya Pakki for a new bug-finding project.”
What is clear is that Pakki’s antics have finally set the delayed review process in motion; Linux developers began digging through all patches that university affiliates had submitted in the past. Jonathan Corbet, the founder and editor in chief of LWN.net, recently provided an update on that review process. Per his assessment, “Most of the suspect patches have turned out to be acceptable, if not great.” Of over 200 patches that were flagged, 42 are still set to be removed from the kernel.
Regardless of whether their reaction was justified, the Linux community gets to decide if the University of Minnesota affiliates can contribute to the kernel again. And that community has made its demands clear: the school needs to convince them its future patches won’t be a waste of anyone’s time.
What will it take to do that? In a statement released the same day as the ban, the university’s computer science department suspended its research into Linux-kernel security and announced that it would investigate Lu’s and Wu’s research method.
But that wasn’t enough for the Linux Foundation. Mike Dolan, Linux Foundation SVP and GM of projects, wrote a letter to the university on April 23rd, which The Verge has viewed. Dolan made four demands. He asked that the school release “all information necessary to identify all proposals of known-vulnerable code from any U of MN experiment” to help with the audit process. He asked that the paper on hypocrite commits be withdrawn from publication. He asked that the school ensure future experiments undergo IRB review before they begin, and that future IRB reviews ensure the subjects of experiments provide consent, “per usual research norms and laws.”
Two of those demands have since been met. Wu and Lu have retracted the paper and have released all the details of their study.
The university’s status on the third and fourth counts is unclear. In a letter sent to the Linux Foundation on April 27th, Heimdahl and Loren Terveen (the computer science and engineering department’s associate department head) maintain that the university’s IRB “acted properly,” and argue that human-subjects research “has a precise technical definition according to US federal regulations … and this technical definition may not accord with intuitive understanding of concepts like ‘experiments’ or even ‘experiments on people.’” They do, however, commit to providing more ethics training for department faculty. Reached for comment, university spokesperson Dan Gilchrist referred me to the computer science and engineering department’s website.
Meanwhile, Lu, Wu, and Pakki apologized to the Linux community this past Saturday in an open letter to the kernel mailing list, which contained some apology and some defense. “We made a mistake by not finding a way to consult with the community and obtain permission before running this study; we did that because we knew we could not ask the maintainers of Linux for permission, or they would be on the lookout for hypocrite patches,” the researchers wrote, before going on to reiterate that they hadn’t put any vulnerabilities into the Linux kernel, and that their other patches weren’t related to the hypocrite commits research.
Kroah-Hartman wasn’t having it. “The Linux Foundation and the Linux Foundation’s Technical Advisory Board submitted a letter on Friday to your university,” he responded. “Until those actions are taken, we do not have anything further to discuss.”
From the University of Minnesota researchers’ perspective, they didn’t set out to troll anyone — they were trying to point out a problem with the kernel maintainers’ review process. Now the Linux community has to reckon with the fallout of their experiment and what it means about the security of open-source software.
Some developers rejected University of Minnesota researchers’ perspective outright, claiming the fact that it’s possible to fool maintainers should be obvious to anyone familiar with open-source software. “If a sufficiently motivated, unscrupulous person can put themselves into a trusted position of updating critical software, there’s honestly little that can be done to stop them,” says White, the security researcher.
On the other hand, it’s clearly important to be vigilant about potential vulnerabilities in any operating system. And for others in the Linux community, as much ire as the experiment drew, its point about hypocrite commits appears to have been somewhat well taken. The incident has ignited conversations about patch-acceptance policies and how maintainers should handle submissions from new contributors, across Twitter, email lists, and forums. “Demonstrating this kind of ‘attack’ has been long overdue, and kicked off a very important discussion,” wrote maintainer Christoph Hellwig in an email thread with other maintainers. “I think they deserve a medal of honor.”
“This research was clearly unethical, but it did make it plain that the OSS development model is vulnerable to bad-faith commits,” one user wrote in a discussion post. “It now seems likely that Linux has some devastating back doors.”
Corbet also called for more scrutiny around new changes in his post about the incident. “If we cannot institutionalize a more careful process, we will continue to see a lot of bugs, and it will not really matter whether they were inserted intentionally or not,” he wrote.
And even for some of the paper’s most ardent critics, the process did prove a point — albeit, perhaps, the opposite of the one Wu, Lu, and Pakki were trying to make. It demonstrated that the system worked.
Eric Mintz, who manages 25 Linux servers, says this ban has made him much more confident in the operating system’s security. “I have more trust in the process because this was caught,” he says. “There may be compromises we don’t know about. But because we caught this one, it’s less likely we don’t know about the other ones. Because we have something in place to catch it.”
To Scott, the fact that the researchers were caught and banned is an example of Linux’s system functioning exactly the way it’s supposed to. “This method worked,” he insists. “The SolarWinds method, where there’s a big corporation behind it, that system didn’t work. This system did work.”
“Kernel developers are happy to see new tools created and — if the tools give good results — use them. They will also help with the testing of these tools, but they are less pleased to be recipients of tool-inspired patches that lack proper review,” Corbet writes. The community seems to be open to the University of Minnesota’s feedback — but as the Foundation has made clear, it’s on the school to make amends.
“The university could repair that trust by sincerely apologizing, and not fake apologizing, and by maybe sending a lot of beer to the right people,” Scott says. “It’s gonna take some work to restore their trust. So hopefully they’re up to it.”
VMware updated us on its progress on making Fusion compatible with Apple’s M1 chip this week. The company said it’s committed to “delivering a Tech Preview of VMware Fusion for macOS on Apple silicon this year,” but it’s not clear if that version of the tool will support Windows 10 on Arm, because of Microsoft’s licensing terms.
This isn’t the first time VMware has warned against M1-equipped Mac owners running Windows 10 on Arm. VMware product line manager Michael Roy said earlier this month that “It’s uncharted waters, so everyone is treading lightly… Like, you can’t even BUY Windows for ARM, and folks using it who aren’t OEMs could be violating EULA… we’re not into doing that for the sake of a press release…”
So don’t expect VMware to follow Parallels in enabling Windows 10 on Arm support for M1-equipped Macs until Microsoft gives it the go-ahead. Roy said in the official announcement that VMware has “reached out to Microsoft for comment and clarification on the matter,” and that the company is “confident that if Microsoft offers Windows on Arm licenses more broadly, we’ll be ready to officially support it.”
For its part, Microsoft seems content not to commit to bringing Windows to the latest Macs. Apple said in November 2020 that its silicon is ready for Windows; it’s simply up to Microsoft to update the operating system to natively support the M1 chip. Now we have two leading virtualization software makers either moving forward without Microsoft (Parallels) or publicly calling for a verdict on the issue (VMware).
But this week’s announcement wasn’t all about Windows. The next major update to VMware Fusion is set to support Linux-based operating systems, and that progress appears to be going well. Roy said that he could boot seven Arm-based VMs—two command-line interfaces and five full desktops “configured with 4CPU and 8GB of RAM”—on a battery-powered MacBook Air that doesn’t even include a fan.
“Of course, just booting a bunch of VMs that are mostly idle isn’t quite a ‘real world experience’, nor is it the same as doing some of the stress testing that we perform in the leadup to a release,” Roy said. “Even with that said, and note that I’m using ‘debug’ builds which perform slower, in my 12 years at VMware I’ve never seen VMs boot and run like this. So we’re very encouraged by our early results, and seriously can’t wait to get it on every Apple silicon equipped Mac out there.” (Emphasis his.)
But there are some caveats. VMware Fusion doesn’t “currently have things like 3D hardware accelerated graphics,” Roy said, “and other features that require Tools which Fusion users on Intel Macs have come to expect.” The company also doesn’t plan to offer x86 emulation via Fusion—which means M1-equipped Mac owners won’t be able to install Windows or Linux .ISOs meant for the architecture.
Roy said VMware plans to release a preview of an M1-compatible version of Fusion “before the end of this year.” The company should offer more information about its progress toward supporting Apple silicon via the VMware Technology Network and Twitter “in the coming months.” Maybe that will give Microsoft enough time to publicly decide whether it wants to make it easier to run Windows on the latest Macs.
Earlier today, Microsoft released a blog post sharing the company’s thoughts on the gaming industry and its focus for gaming in 2021 and beyond. The Xbox team’s goal has shifted from focusing on a single platform (i.e. the Xbox consoles) to supporting multiple platforms, specifically the PC.
Microsoft’s end goal is to be a ‘player first’ company, focusing on the game rather than the platform it’s on. This means we’ll be seeing more and more features aimed at the PC platform, not just the Xbox consoles.
The good news for PC gamers is that Microsoft is now focusing more than ever on the PC gaming ecosystem, which hopefully means more gaming optimizations for the PC and more cross-play potential, something Microsoft also highlighted in its blog post.
For example, Microsoft shared more details on Halo Infinite, which is getting a significant amount of PC development time and will fully support features such as ultrawide (21:9) and super-ultrawide (32:9) screens, triple keybinds, and higher-fidelity graphics that will be PC-exclusive.
Plus, Halo Infinite will also have cross-play between PC and the latest Xbox Series X/S consoles.
Microsoft is also expanding its cloud gaming service, which lets players stream over 100 console games to their devices. A few days ago, Microsoft announced the beta for the service, which will work on both Windows 10 devices and Apple iOS devices through web browsers such as Edge, Chrome, and Safari.
As for developers, Microsoft is adding features to make it easier to build games for PC. There’s a new DirectX 12 feature called the Agility SDK, which lets developers ship the latest DirectX 12 features and updates with their games without requiring the end user (player) to update their operating system.
Microsoft is also working on other features like Auto HDR, and is continuing work on DirectStorage, which was an Xbox-exclusive technology but is now being developed for the PC.
These are just some of the features Microsoft addressed in its blog post, but overall it’s great to see Microsoft focusing on the PC player experience just as much as on the Xbox consoles.
The Moto G Play is the least expensive of four phones that Motorola introduced for the US market earlier this year. Introduced at $169, it’s already enjoying an apparently permanent $10 mark down, placing it firmly in budget territory.
Most of what you’ll find on the G Play’s spec list makes a lot of sense given that price point: a 6.5-inch 720p LCD with standard refresh rate, Snapdragon 460 processor with 3GB of RAM, and a generous 5,000mAh battery.
There’s another factor that might sway some shoppers toward the G Play, too: as LG leaves the budget phone space and its remaining stock disappears from retailers’ shelves, Motorola’s budget-friendly phone will be one of a smaller number of options. In the US, we haven’t seen the kind of sub-$150 devices from Motorola that the company has introduced in other markets this year, so for now, the G Play is a likely candidate for the budget-oriented phone shopper’s consideration.
A couple of aspects of the phone’s feature set feel underwhelming even considering the price: a paltry 32GB of storage (though it can be expanded after the fact) and a 13-megapixel main camera at a time when higher-resolution pixel-binning sensors are becoming the norm at every price point. But in short, the G Play performs just fine for its price. Just know that you’ll need patience and acceptance of a few shortcomings, and that spending an extra $50 to $100, if your budget allows, would get you some meaningful upgrades.
Moto G Play performance and screen
The G Play uses a Snapdragon 460 chipset which was introduced over a year ago and is an entry-level Qualcomm processor. Coupled with 3GB of RAM, it manages to keep up with routine tasks like jumping between apps and scrolling social media, albeit with subtle-but-noticeable stuttering along the way. Heavier tasks like starting and stopping Google Maps navigation take an extra couple of beats. It’s not as frustrating as my experience with the LG Stylo 6 was, but it’s a step down from the kind of performance you can expect from a phone that’s $50-100 more expensive, including those in Motorola’s own G-series lineup.
The screen, likewise, gets the job done but doesn’t particularly shine. Its 720p resolution is stretched thin across the 6.5-inch display and images are noticeably pixelated. Colors are on the cooler side and the screen is a bit dim unless you max out the brightness. Even with the brightness cranked all the way up I had a hard time seeing it outdoors. We spend so many hours looking at our devices that this is one area where it might be worth upgrading. That said, there isn’t exactly anything wrong with the display — it’s just not very nice.
There’s better news on the battery front. The G Play includes a 5,000mAh battery, which is significantly bigger than the typical 4,000 or 4,500mAh found in other comparably sized Android phones. Motorola claims it will get three days of battery life, which is probably true if you’re a light user and conservative with your screen brightness. I had no problem getting two days on a charge with a couple of hours each day of screen-on time. A full day and then some of heavy use is definitely reasonable to expect.
The Moto G Play includes just 32GB of storage — about as low as it gets in 2021. Considering roughly half of that will be taken up with operating system files, it’s just not enough. Storage is expandable via microSD card, so plan on that extra $10-15 as a necessary part of the purchase if you don’t already have one.
The G Play ships with Android 10 installed. While it’s on Motorola’s list to receive an Android 11 update, the timing is unclear and, given the company’s track record, could be many months away. The phone will receive security updates until January 2023. That’s an unfortunately short life span, so you might want to count on trading it in or cashing in your upgrade with your carrier after a couple of years.
Moto G Play camera
The G Play has just one 13-megapixel rear-facing camera (accompanied by a 2-megapixel depth sensor) and a 5-megapixel selfie camera. That’s it. Even in the budget class, that’s not many cameras in 2021. I don’t think anyone (myself included) will miss having the low-quality macro camera that manufacturers keep putting on their devices these days, but not having an ultrawide is a bummer considering it’s not hard to find a phone that offers one at this price.
If nothing else, the G Play’s camera offerings are very straightforward. There are just two main shooting modes in the native camera app: photo and video. Portrait mode and a few other extras are available in the shooting menu, but there’s no night mode here.
Outside in good lighting, this 13-megapixel camera does okay. Overall exposures are balanced and the camera doesn’t try to do too much HDR-ing, which I appreciate, but you don’t have to look too closely to notice that details in grass and leaves are smoothed over. Things go downhill quickly in dimmer light — the G Play just isn’t up to low-light photography. The selfie camera is also guilty of aggressive over-smoothing at its default “Face Beauty Auto” setting, which made my face look like a glazed donut. Thankfully, you can turn this off.
Clearly the G Play has its share of shortcomings — at $160, it has to. The question is whether these are trade-offs you can live with for a couple of years. If you enjoy a very casual relationship with your phone, the G Play will do all of the things you need it to do.
Everyday performance for the basics — light web browsing, social media, email, music — is sufficient. If you just use your phone camera for quick snapshots out and about and don’t expect too much from it, the G Play will do fine.
If you suspect that you need a little more from your phone, or that you want the experience of using your phone to be a little more enjoyable, then I’d strongly encourage spending a bit more on something like Motorola’s own Moto G Power for a better camera. Samsung and OnePlus have recent entries in the sub-$200 class that are worth looking at, too; I haven’t tested them, but they’re specced competitively.
If your relationship with technology is less complicated than those of us who spend hours each day of our precious time on this Earth staring at the little glowing screen in our hands, jumping between social media apps, and pushing our phone cameras to their limits, then you’ll get along fine with the Moto G Play. Just spare a thought for the rest of us, please?
Best Media Streamers Buying Guide: Welcome to What Hi-Fi?’s round-up of the best media streamers you can buy in 2021.
No BT Sport or ESPN+ app on your TV? You need to get yourself a media streamer, but which one? Not all media streamers are the same. The best media streamers will provide a total and endless supply of TV shows, films and music, but there are performance differences too. Some look and sound better than others.
Whether it’s Netflix, Prime Video, Apple TV, Google Play Movies & TV, a service dedicated to skateboarding or free 1970s kung fu films, it’s a media streamer’s job to deliver them.
Few smart TVs cover all the apps and a media streamer will put that right without you having to spend big. It’s a media streamer’s raison d’être to make sure that they’re stacked with services. With more competition in the market than ever, prices are low, standards are high and any gaps in their app offerings could be a killer weakness.
TV streaming devices are pretty much foolproof, too. All you do is plug them into your flatscreen, connect them to your home wi-fi network and get watching. Despite their ease of use, though, prices vary considerably. More advanced models that offer 4K, HDR and voice controls cost more, but there are plenty of simple streaming sticks for those on tight budgets too.
Before you choose, bear in mind that to enjoy HD and 4K content, you’ll need a fast broadband connection. Netflix recommends a steady connection of 25Mbps or higher for 4K video, for example.
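As a rough sketch of that guidance: only the 25Mbps figure for 4K comes from Netflix's recommendation quoted above; the HD and SD thresholds below are common rules of thumb, not official numbers.

```python
# Minimum steady connection speeds per tier. Only the 4K figure is from
# Netflix's stated recommendation; HD and SD are assumed rules of thumb.
THRESHOLDS_MBPS = {"4K": 25, "HD": 5, "SD": 3}

def best_quality(connection_mbps):
    """Return the highest tier a steady connection can support, or None."""
    for tier in ("4K", "HD", "SD"):
        if connection_mbps >= THRESHOLDS_MBPS[tier]:
            return tier
    return None

print(best_quality(30))  # 4K
print(best_quality(10))  # HD
```

Note these are sustained rates, not headline speeds: a shared or congested connection that averages below the threshold will force the stream down a tier.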
You should also check which services each device offers, especially as exclusive, original TV shows and films are all the rage. So whether you want to use Apple’s library or Google’s, watch the latest Netflix, Disney Plus or Amazon Prime Video TV show, independent films on MUBI, or live sport courtesy of Now TV, our round-up of the best media streamers has got you covered.
1. Chromecast with Google TV
Google’s cracked it this time.
SPECIFICATIONS
Max resolution: 4K | Audio: Dolby Atmos | Output: HDMI | HDR: HDR10, HLG, HDR10+, Dolby Vision | Dimensions (HWD): 12.5 x 6 x 16cm
Reasons to Buy
Lots of apps
Excellent HDR picture
Dolby Vision and Atmos
Reasons to Avoid
No Apple content available
Rivals sound more dynamic
Google was in Amazon’s media streaming shadow until the arrival of the most recent Chromecast and, specifically, the Google TV user-interface that comes with it. While this streamer and the Fire TV Stick 4K are an even match for performance, it’s Google’s superior operating system that wins the day.
Google TV is the successor to Android TV and is beginning its roll-out across smart TVs from 2021. It’s better looking, more intuitive, more searchable and, crucially, excellent at making suggestions of what to watch next.
That’s best underlined in the way that it presents search results with an even hand. Top suggestions will always be from streaming services to which you already subscribe, and in top quality where possible, rather than Google trying to sell you content to which you already have access.
While, natively, it doesn’t have quite as many apps as some rivals, you can make up for that by casting anything it doesn’t have from your mobile or tablet instead. The only caveat is that it won’t bring access to Apple TV or Apple Music. If you need those, then try Amazon or the more expensive Apple TV 4K instead.
Read the full Chromecast with Google TV review
2. Amazon Fire TV Stick 4K
Amazon’s excellent streaming stick is amazing value.
Amazon’s 4K streaming stick is as worthy a no.2 as you’ll find. It offers unbeatable value, 4K streaming, support for multiple HDR formats and all with the Alexa voice-activated personal assistant.
Amazon Prime Video comes as standard (of course), alongside Netflix, the terrestrial catch-up services (BBC iPlayer, ITV Hub, All 4 and My5), Now TV, BT Sport, Apple TV, Disney Plus and the bonus of music services Spotify, Deezer, Apple Music and Tidal. The only minor omissions are an official Rakuten app and Google Play Movies & TV.
The main reason this device doesn’t score quite as well as the Chromecast above is its operating system. It’s too Amazon-focused and doesn’t perform quite as well for suggestions. That said, it’s worth noting that Amazon’s latest Fire TV OS will arrive on the Fire TV Stick 4K in the coming months, and that could be a game-changer for usability. Watch this space or, specifically, the one just above.
Read the full review: Amazon Fire TV Stick 4K
3. Apple TV 4K
Not cheap, but up there with the best streaming devices around.
This box of tricks offers the typically slick experience we’ve come to expect from Apple. Voice controls come courtesy of Siri – Apple’s personal assistant – while 4K and HDR are all part and parcel of the package. There’s plenty to watch too, thanks to Apple’s extensive catalogue of 4K and HDR content.
And with the arrival of the Apple TV+ streaming service that’s only got better. Netflix, iPlayer and Amazon Prime Video are offered with Now TV and All 4 both present now too. It’s not cheap – it’s positively exorbitant compared to some on this list – but if you’re happy with life in the Apple ecosystem and you can afford it, it’s money well spent for the home streaming enthusiast.
Read the full review: Apple TV 4K
4. Google Chromecast (2018)
A cheap, quick and convenient media streamer.
SPECIFICATIONS
Max resolution: 1080p | Audio: Dolby Digital 5.1, Dolby Digital Plus 7.1 | Output: HDMI | HDR: n/a | Dimensions (HWD): 5.2×1.38×1.38cm
Reasons to Buy
Affordable
Casting is neat
Good video and sound
Reasons to Avoid
Little new of note
No dedicated remote
At just £30/$30, this is one of the cheapest video streaming devices around. Chromecast is a decent little device and if you don’t have a 4K TV, its 1080p resolution is all you need. You can ‘cast’ Netflix, BBC iPlayer, ITV Player, All 4, My5 and Now TV, along with Google Play Movies and YouTube TV. On the music front, Spotify, TuneIn and Tidal are all catered for. Amazon Prime Video is now included too.
You have to control Google Chromecast from your phone or tablet, so it’s a different proposition from most of the streamers here. But it does what it does very well indeed.
Read the full review: Google Chromecast (2018)
5. Amazon Fire TV Stick with Alexa
This sophisticated streaming device is a joy to use.
SPECIFICATIONS
Max resolution: 1080p | Audio: Dolby Digital 5.1, Dolby Digital Plus 7.1 | Output: HDMI | HDR: n/a | Dimensions (HWD): 3×8.6×1.3cm
Reasons to Buy
Alexa voice control
Responsive UI
Multiple streaming services
Amazon’s cheaper streaming stick loses the 4K and HDR, but retains the Alexa personal assistant for voice control. All the major streaming services are supported, apart from Now TV, and the sound quality is impressive for such a cheap device. If you’re looking for a cheap and easy way to start streaming, this might be the one for you.
Read the full review: Amazon Fire TV Stick with Alexa
6. Roku Streaming Stick+
An excellent, all-round video streamer with a tempting price tag.
Roku might not be as well known in some parts, but it’s a big global player in the streaming market and this device is a solid bet. It’s affordable, boasts 4K and HDR (albeit limited formats for the latter) and doesn’t need mains power to run. Because Roku doesn’t make its own shows, there’s no hard sell as to what to watch, as there is with Amazon devices, and all the major streaming services are supported, including Now TV (which you won’t find on an Amazon device).
Read the full review: Roku Streaming Stick+
7. Now TV Smart Stick
Sky content streamed via a stick, without the subscription.
SPECIFICATIONS
Max resolution: 1080p | Audio: Dolby Digital 5.1 and 7.1 | Output: HDMI | HDR: n/a | Dimensions (HWD): 8.4×2.3×1.3cm
Reasons to Buy
Inexpensive
Easy-to-use interface
No contract or dish
Reasons to Avoid
Limited app selection
Sky content limited to 720p
At under £20, this is one of the cheapest ways to turn your old TV into a smart TV. Most of the main streaming services are here, except for Amazon Prime Video, and as you’d expect, it gently nudges you towards Sky’s Now TV streaming service at every turn. While it can stream in 1080p, Now TV tops out at 720p. If you can put up with these limitations, it’s a bargain, and a great way to get Sky TV without a subscription.
Read the full review: Now TV Smart Stick
MORE:
30 of the best TV shows to watch on Netflix
21 of the best TV shows to watch on Amazon Prime Video
AMD’s Threadripper consumer HEDT processors continue to be praised strongly for their excellent compute performance and connectivity options. But what if you want more than 256GB of memory? What if you want your RAM to run in 8-channel mode? What if you want more than 64 PCIe Gen 4 lanes? Well… that’s where Threadripper Pro comes in.
Watch via our Vimeo Channel (Below) or over on YouTube at 2160p HERE
Video Timestamps:
00:00 Start
00:15 Some details/pricing
01:15 Star of the show – Threadripper Pro 3975WX
03:20 The CPU cooler
03:46 Memory setup / weird plastic shrouds with fans
05:27 AMD Radeon Pro W5700 GPU
07:00 Motherboard
08:55 Storage options
09:41 1000W PSU (Platinum) and custom setup
10:32 Luke’s thoughts and I/O panels
11:22 The Chassis
11:40 Cooling and tool-less design
12:35 Summary so far
14:02 Performance tests
16:49 System temperatures, power and noise testing
19:05 System under idle conditions – ‘rumbling’ noise we experienced
19:22 Pros and Cons / Closing thoughts
Primary Specifications:
32-core AMD Threadripper Pro 3975WX processor
128GB of 3200MHz ECC DDR4 memory in 8-channel mode
AMD Radeon Pro W5700 graphics card with 8GB GDDR6 VRAM
WD SN730 256GB NVMe SSD
1kW 80Plus Platinum PSU
We are examining the Lenovo ThinkStation P620 workstation that is built around Threadripper Pro and its 8-channel memory support. Lenovo’s website offers a few choices for the base processor, including 12, 16, 32, and 64-core options. Specifically, we are looking at the 32-core Threadripper Pro 3975WX chip and we are hoping that Lenovo can keep it running at the rated 3.5-4.2GHz speeds beneath that modestly sized CPU cooler.
Partnering this 280W TDP monster with its 128 PCIe Gen 4 lanes is 128GB of 8-channel DDR4 3200MHz ECC memory. While a 128GB installation is merely small-fry for Threadripper Pro, the 3200MHz modules running in 8-channel mode should allow for some excellent results in bandwidth-intensive tasks. Plus, you get a 1600MHz Infinity Fabric link for the Zen 2 cores.
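As a rough sanity check on that bandwidth claim, peak theoretical DDR4 throughput is simply channels × transfer rate × channel width (each DDR4 channel is 64 bits, i.e. 8 bytes, wide). This is a back-of-the-envelope sketch, not a measured figure; real-world efficiency and ECC overhead will bring the achievable number down:

```python
# Back-of-the-envelope peak memory bandwidth for the P620's 8-channel DDR4-3200.
def ddr4_peak_bandwidth_gbps(channels: int, mt_per_s: int, bus_bytes: int = 8) -> float:
    """Theoretical peak in GB/s (decimal): channels * MT/s * bytes per transfer."""
    return channels * mt_per_s * bus_bytes / 1000  # MB/s -> GB/s

print(ddr4_peak_bandwidth_gbps(8, 3200))  # 204.8 GB/s for Threadripper Pro's 8 channels
print(ddr4_peak_bandwidth_gbps(4, 3200))  # 102.4 GB/s for standard quad-channel Threadripper
```

That doubling over consumer Threadripper is exactly where the “excellent results in bandwidth-intensive tasks” should come from.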
I will, however, emphasise my dislike for Lenovo’s decision to deploy a 40mm fan and shroud to cool each DIMM bank. This seems unnecessary for a 128GB installation and merely adds additional noise and points of failure. Metal heatspreaders on the DIMMs would have been better, if enhanced cooling is deemed necessary.
Graphics comes in the form of an 8GB Radeon Pro W5700 blower-style card which we have already reviewed on KitGuru. That makes this an all-AMD system as far as the key components go. Another key benefit is ISV certification for the Lenovo P620. That point will be music to the ears of system buyers in a business environment with users who run software on the guaranteed support list.
Another point that will garner particular attention from prospective buyers is the display output connectivity. On its ‘pro-grade’ card, AMD deploys five Mini-DisplayPort 1.4 connections and one USB-C port. That gives you convenient access to six display outputs in total, which is superb. As highlighted in our review of the Radeon Pro W5700, you can power five 4K monitors or three 5K alternatives, making this an excellent workstation proposition.
Lenovo uses its own WRX80 motherboard to house the sWRX8 Threadripper Pro CPU. The power delivery solution looks competent and Lenovo’s use of proper finned VRM heatsinks with passive cooling is to be commended. Six total PCIe Gen 4 slots are provided by the motherboard – four x16 bandwidth and two x8. However, only two x16 slots remain usable due to the slot spacing, and the top one will likely interfere with the RAM fan’s header.
It is actually disappointing to see Lenovo offering up sub-par expansion slot capability. There is no clear way to use the 128-lane capability of Threadripper Pro. That is especially disappointing for users who will want multiple graphics cards alongside high-bandwidth networking and storage devices. However, the limited expandability is a clear compromise stemming from Lenovo’s use of a compact chassis with just a couple of 80mm fans for intake and exhaust airflow.
At least you do get dual, cooled M.2 slots on the motherboard. One of those is occupied by a 256GB WD SN730 SSD in our install. Clearly, most users will want to adjust the storage configuration. But this is clearly a very subjective requirement, so I respect Lenovo for offering a basic, cheap drive for the baseline configuration.
Power is delivered by a 1kW 80Plus Platinum unit. Lenovo highlights 92% efficiency on the configurator page, but this is likely a mistake for 230/240V UK customers given the more stringent 80Plus Platinum requirements for those operating voltages. The PSU’s tool-less design is absolutely superb and works very well; a single connector port feeds power from the unit through the motherboard where it is then distributed accordingly, including via break-out cables for PCIe and SATA connectors.
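To put that efficiency quibble in perspective, here is a quick sketch of what the rating difference means at the wall at 50% load (the 94% figure is the commonly cited 80Plus Platinum threshold at 50% load on 230V supplies; the 500W load is an illustrative assumption):

```python
# AC power drawn from the socket to deliver a given DC load to the components.
def wall_draw_w(dc_load_w: float, efficiency: float) -> float:
    """Wall draw in watts: DC output divided by PSU efficiency."""
    return dc_load_w / efficiency

print(round(wall_draw_w(500, 0.92), 1))  # 543.5 W at the quoted 92%
print(round(wall_draw_w(500, 0.94), 1))  # 531.9 W at 230V Platinum's 94% (50% load)
```

A difference of roughly 12W at the wall is small, but it is why the 92% figure on the configurator page looks like a 115V number rather than a UK one.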
Connectivity for the system is just ‘OK’. You get 10GbE Aquantia AQC107 networking onboard, but a secondary network adapter is disappointingly omitted. I would have liked to see a few more USB ports on the rear IO, including some in Type-C form and preferably rated at 20Gbps. However, the front IO is excellent with four 10Gbps USB connections, two of which are Type-C. I also appreciated the system’s built-in audio speaker when using the unit without a proper set of speakers.
The chassis build quality is good and the system feels very solid given its compact form. Manhandling the hefty system is easy thanks to the front handle, and the internal tool-less design is excellent. Lenovo’s configurator offers an upgrade to a key-locking side panel to prevent unauthorised access, which is good to see.
With that said, cooling certainly looks to be limited with just two 80mm intake fans on the chassis. The graphics card, CPU, PSU, and (annoyingly) RAM also have fans to take care of their own cooling. If you are thinking of adding a second high power GPU, though, the internals are likely to get very toasty.
Priced at around £5.5-6K inc. VAT in the UK (depending on the graphics card situation given current shortages), we are keen to see how Threadripper Pro performs in this reasonably compact workstation.
Detailed Specifications
Processor: AMD Threadripper Pro 3975WX (32 cores/64 threads, 3.5/4.2GHz, 280W TDP, 144MB L2+L3 cache, 128 PCIe Gen 4 lanes, up to 2TB 8-channel DDR4-3200 ECC memory support)
Motherboard: Lenovo WRX80 Threadripper Pro Motherboard
Memory: 128GB (8x16GB) SK Hynix 3200MHz C24 ECC DDR4, Octa-channel
Graphics Card: 8GB AMD Radeon Pro W5700 (RDNA/Navi GPU, 36 compute units, 2304 stream processors, 205W TDP, 1183MHz base clock, 1750MHz GDDR6 memory on a 256-bit bus for 448GBps bandwidth)
System Drive: 256GB WD SN730 PCIe NVMe SSD
CPU Cooler: Lenovo dual-tower heatsink with 2x 80mm fans
Power Supply: 1000W 80Plus Platinum PSU
Case: Lenovo ThinkStation P620 Workstation
Networking: Aquantia AQC107 10GbE onboard
Operating System: Windows 10 Pro
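The 448GBps figure in the graphics card line above can be reproduced from the other quoted numbers. GDDR6 transfers 8 bits per pin per memory-clock cycle, so the 1750MHz memory clock gives 14Gbps per pin across the 256-bit bus (a quick sanity-check sketch, not vendor-supplied maths):

```python
# Reproduce the Radeon Pro W5700's quoted 448 GB/s memory bandwidth.
def gddr6_bandwidth_gbps(mem_clock_mhz: float, bus_bits: int) -> float:
    """Theoretical peak in GB/s: 8 transfers per clock per pin, bits -> bytes."""
    gbps_per_pin = mem_clock_mhz * 8 / 1000  # 1750 MHz -> 14 Gbps per pin
    return gbps_per_pin * bus_bits / 8       # across the bus, divided by 8 bits/byte

print(gddr6_bandwidth_gbps(1750, 256))  # 448.0, matching the spec table
```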
Become a Patron!