Nvidia today announced at Computex 2021 that it’s partnered with Valve to bring its Deep Learning Super Sampling (DLSS) graphics tech to Linux via Steam Proton. Now people who game on Linux systems should be able to put their Nvidia graphics cards—including the new GeForce RTX 3080 Ti and RTX 3070 Ti—to even better use.
DLSS is Nvidia’s solution to the problem of improving a game’s performance without having to compromise too much on presentation. The first version of the technology debuted in September 2018; the second version was released in March 2020. Both versions were limited to RTX graphics cards used to play games on Windows.
That’s about to change. Nvidia said in a press release that it, Valve, and “the Linux gaming community are collaborating to bring NVIDIA DLSS to Proton – Linux gamers will be able to use the dedicated AI cores on GeForce RTX GPUs to boost frame rates for their favorite Windows Games running on the Linux operating system.”
Proton is Valve’s open-source tool “for use with the Steam client which allows games which are exclusive to Windows to run on the Linux operating system,” as it’s described on GitHub, with some assistance from the Wine utility that Linux users have relied on to run Windows programs since its debut in 1993.
Valve said that Proton is built into the Linux Steam Client Beta; the open-source project is meant to give “advanced users” more control over their experience. Presumably, the upcoming DLSS support will be part of the core Linux Steam Client Beta, but it could also be implemented as an optional feature, at least to start.
Nvidia didn’t offer many other details about its partnership with Valve or to whom it was referring when it said “the Linux gaming community.” But it did make it clear that Linux gamers won’t have to wait long for DLSS: “Support for Vulkan titles is coming this month,” the company said, “with DirectX support coming in the Fall.”
The continued expansion of DLSS arrived shortly after AMD announced that its FidelityFX Super Resolution technology, which promises similar features but will be available on many hardware platforms, will be available on June 22. At least now Nvidia can say that DLSS will be available on multiple operating systems, right?
Intel briefly demoed an Alder Lake laptop at Computex 2021 and confirmed that the company already has mobile versions of its new hybrid chips shipping to its customers and partners. Ultimately the demo was little more than Intel showing the laptop playing back a video, but it is an important milestone because it confirms that the mobile Alder Lake variants are deep in the development process.
It’s well known that Intel’s 12th-Gen Alder Lake will bring the company’s hybrid architecture, which combines larger high-performance Golden Cove cores with smaller high-efficiency Gracemont cores, to desktop x86 PCs for the first time. However, Intel is going all-in: it will reunify its desktop and mobile lines with Alder Lake, bringing its new 10nm architecture and leading-edge connectivity options, like PCIe 5.0 and DDR5, to laptops.
We’ve already pieced together plenty of information about Alder Lake, which you can read here. Here’s the brief rundown:
Qualification and production in the second half of 2021
Hybrid x86 design with a mix of big and small cores (Golden Cove/Gracemont)
Up to 16 cores
10nm Enhanced SuperFin process
LGA1700 socket requires new motherboards
PCIe 5.0 and DDR5 support rumored
Five variants: -S for desktop PCs, -P for mobile, -M for low-power devices, -L Atom replacement, -N educational (probably Chromebooks)
Gen12 Xe integrated graphics
New hardware-guided operating system scheduler tuned for high performance
Intel hasn’t released official specifications for the Alder Lake processors, but a recent update to the SiSoft Sandra benchmark software, along with listings in the open-source Coreboot (a lightweight motherboard firmware option), have given us plenty of clues to work with.
The Coreboot listing outlines various combinations of the big and little cores in different chip models, with some models even using only the larger cores (possibly for high-performance gaming models). The information suggests four configurations with -S, -P, -N, and -M designators, and an -L variant has also emerged:
Alder Lake-S: Desktop PCs (Both LGA and BGA models)
Alder Lake-P: High-performance notebooks
Alder Lake-M: Low-power devices
Alder Lake-L: Listed as “Small Core” Processors (Atom)
Alder Lake-N: Educational and consumer client (Chromebook-class devices)
Naturally, Intel didn’t say which flavor of the mobile processor it unveiled today, but it appears there will be four different mobile variants to choose from. Intel did confirm that Alder Lake laptops will arrive later this year, so we won’t have to wait long for further details.
Intel kicked off Computex 2021 by adding two new flagship 11th-Gen Tiger Lake U-series chips to its stable, including a new Core i7 model that’s the first laptop chip for the thin-and-light segment that boasts a 5.0 GHz boost speed. As you would expect, Intel also provided plenty of benchmarks to show off its latest silicon.
Intel also teased its upcoming Beast Canyon NUCs that are the first to accept full-size graphics cards, making them more akin to a small form factor PC than a NUC. These new machines will come with Tiger Lake processors. Additionally, the company shared a few details around its 5G Solution 5000, its new 5G silicon for Always Connected PCs that it developed in partnership with MediaTek and Fibocom. Let’s jump right in.
Intel 11th-Gen Tiger Lake U-Series Core i7-1195G7 and i5-1155G7
Intel’s two new U-series Tiger Lake chips, the Core i7-1195G7 and Core i5-1155G7, slot in as the new flagships for the Core i7 and Core i5 families. These two processors are UP3 models, meaning they operate in the 12-28W TDP range. These two new chips come with all the standard features of the Tiger Lake family, like the 10nm SuperFin process, Willow Cove cores, the Iris Xe graphics engine, and support for LPDDR4x-4266, PCIe 4.0, Thunderbolt 4 and Wi-Fi 6/6E.
Intel expects the full breadth of its Tiger Lake portfolio to span 250 designs by the holidays from the usual suspects, like Lenovo, MSI, Acer and ASUS, with 60 of those designs using the new 1195G7 and 1155G7 chips.
Intel Tiger Lake UP3 Processors

| Processor | Cores / Threads | Graphics (EUs) | Operating Range (W) | Base Clock (GHz) | Single-Core Turbo (GHz) | Max All-Core (GHz) | Cache (MB) | Graphics Max Freq (GHz) | Memory |
|---|---|---|---|---|---|---|---|---|---|
| Core i7-1195G7 | 4C / 8T | 96 | 12 – 28 | 2.9 | 5.0 | 4.6 | 12 | 1.40 | DDR4-3200, LPDDR4x-4266 |
| Core i7-1185G7 | 4C / 8T | 96 | 12 – 28 | 3.0 | 4.8 | 4.3 | 12 | 1.35 | DDR4-3200, LPDDR4x-4266 |
| Core i7-1165G7 | 4C / 8T | 96 | 12 – 28 | 2.8 | 4.7 | 4.1 | 12 | 1.30 | DDR4-3200, LPDDR4x-4266 |
| Core i5-1155G7 | 4C / 8T | 80 | 12 – 28 | 2.5 | 4.5 | 4.3 | 8 | 1.35 | DDR4-3200, LPDDR4x-4266 |
| Core i5-1145G7 | 4C / 8T | 80 | 12 – 28 | 2.6 | 4.4 | 4.0 | 8 | 1.30 | DDR4-3200, LPDDR4x-4266 |
| Core i5-1135G7 | 4C / 8T | 80 | 12 – 28 | 2.4 | 4.2 | 3.8 | 8 | 1.30 | DDR4-3200, LPDDR4x-4266 |
| Core i3-1125G4* | 4C / 8T | 48 | 12 – 28 | 2.0 | 3.7 | 3.3 | 8 | 1.25 | DDR4-3200, LPDDR4x-3733 |
The four-core eight-thread Core i7-1195G7 brings the Tiger Lake UP3 chips up to a 5.0 GHz single-core boost, which Intel says is a first for the thin-and-light segment. Intel has also increased the maximum all-core boost rate up to 4.6 GHz, a 300 MHz improvement.
Intel points to additional tuning for the 10nm SuperFin process and tweaked platform design as driving the higher boost clock rates. Notably, the 1195G7’s base frequency declines by 100 MHz to 2.9 GHz, likely to keep the chip within the 12 to 28W threshold. As with the other G7 models, the chip comes with the Iris Xe graphics engine with 96 EUs, but those units operate at 1.4 GHz, a slight boost over the 1165G7’s 1.35 GHz.
The 1195G7’s 5.0 GHz boost clock rate also comes courtesy of Intel’s Turbo Boost Max Technology 3.0. This boosting tech works in tandem with the operating system scheduler to target the fastest core on the chip (‘favored core’) with single-threaded workloads, thus allowing most single-threaded work to operate 200 MHz faster than we see with the 1185G7. Notably, the new 1195G7 is the only Tiger Lake UP3 model to support this technology.
Surprisingly, Intel says the 1195G7 will ship in higher volumes than the lower-spec’d Core i7-1185G7. That runs counter to our normal expectations that faster processors fall higher on the binning distribution curve — faster chips are typically harder to produce and thus ship in lower volumes. The 1195G7’s obviously more forgiving binning could be the result of a combination of the lower base frequency, which loosens binning requirements, and the addition of Turbo Boost Max 3.0, which only requires a single physical core to hit the rated boost speed. Typically all cores are required to hit the boost clock speed, which makes binning more challenging.
The four-core eight-thread Core i5-1155G7 sees more modest improvements over its predecessor, with boost clocks jumping an additional 100 MHz to 4.5 GHz, and all-core clock rates improving by 300 MHz to 4.3 GHz. We also see the same 100 MHz decline in base clocks that we see with the 1195G7. This chip comes with the Iris Xe graphics engine with 80 EUs that operate at 1.35 GHz.
Intel’s Tiger Lake Core i7-1195G7 Gaming Benchmarks
Intel shared its own gaming benchmarks for the Core i7-1195G7, but as with all vendor-provided benchmarks, you should view them with skepticism. Intel didn’t share benchmarks for the new Core i5 model.
Intel put its Core i7-1195G7 up against the AMD Ryzen 7 5800U, but the chart lists an important caveat here — Intel’s system operates between 28 and 35W during these benchmarks, while AMD’s system runs at 15 to 25W. Intel conducted these tests on the integrated graphics for both chips, so we’re looking at Iris Xe with 96 EUs versus AMD’s Vega architecture with eight CUs.
Naturally, Intel’s higher power consumption leads to higher performance, giving the company the lead across a broad range of triple-A 1080p games. However, that extra performance comes at the cost of more heat generation. Intel also tested using its Reference Validation Platform with unknown cooling capabilities (we assume they are virtually unlimited), while testing the Ryzen 7 5800U in the HP ProBook 455.
Intel also provided benchmarks with DirectX 12 Ultimate’s new Sampler Feedback feature. This new DX12 feature reduces memory usage while boosting performance, but it requires GPU hardware-based support in tandem with specific game engine optimizations. That means this new feature will not be widely available in leading triple-A titles for quite some time.
Intel was keen to point out that its Xe graphics architecture supports the feature, whereas AMD’s Vega graphics engine does not. UL has a new 3DMark Sampler Feedback benchmark under development, and Intel used the test release candidate to show that Iris Xe graphics offers up to 2.34X the performance of AMD’s Vega graphics with the feature enabled.
Intel’s Tiger Lake Core i7-1195G7 Application Benchmarks
Here we can see Intel’s benchmarks for applications, too, but the same rules apply — we’ll need to see these benchmarks in our own test suite before we’re ready to claim any victors. Again, you’ll notice that Intel’s system operates at a much higher 28 to 35W power range on a validation platform while AMD’s system sips 15 to 25W in the HP Probook 455 G8.
As we’ve noticed lately, Intel now restricts its application benchmarks to features that it alone supports at the hardware level. That includes AVX-512 based benchmarks that leverage the company’s DL Boost suite that has extremely limited software support.
Intel’s benchmarks paint convincing wins across the board. However, be aware that the AI-accelerated workloads on the right side of the chart aren’t indicative of what you’ll see with the majority of productivity software. At least not yet. For now, unless you use these specific pieces of software very frequently in these specific tasks, these benchmarks aren’t very representative of the overall performance deltas you can expect in most software.
In contrast, the Intel QSV benchmarks do have some value. Intel’s Quick Sync Video is broadly supported, and the Iris Xe graphics engine supports hardware-accelerated 10-bit video encoding. That’s a feature that Intel rightly points out also isn’t supported with MX-series GPUs, either.
Intel’s support for hardware-accelerated 10-bit encoding does yield impressive results, at least in its benchmarks, showing a drastic ~8X reduction in a Handbrake 4K 10-bit HEVC to 1080P HEVC transcode. Again, bear in mind that this is with the Intel chip running at a much higher power level. Intel also shared a chart highlighting its broad support for various encoding/decoding options that AMD doesn’t support.
Intel Beast Canyon NUC
Intel briefly showed off its upcoming Beast Canyon NUC that will sport 65W H-Series Tiger Lake processors and be the first NUC to support full-length graphics cards (up to 12 inches long).
The eight-litre Beast Canyon certainly looks more like a small form factor system than what we would expect from the traditional definition of a NUC, and as you would expect, it comes bearing the Intel skull logo. Intel’s Chief Performance Strategist Ryan Shrout divulged that the system will come with an internal power supply. Given the size of the unit, that means there will likely be power restrictions for the GPU. We also know the system uses standard air cooling.
Intel is certainly finding plenty of new uses for its Tiger Lake silicon. The company recently listed new 10nm Tiger Lake chips for desktop PCs, including a 65W Core i9-11900KB and Core i7-11700KB, and told us that these chips would debut in small form factor enthusiast systems. Given that Intel specifically lists the H-series processors for Beast Canyon, it doesn’t appear these chips will come in the latest NUC. We’ll learn more about Beast Canyon as it works its way to release later this year.
Intel sold its modem business to Apple back in 2019, leaving a gap in its Always Connected PC (ACPC) initiative. In the interim, Intel has worked with MediaTek to design and certify new 5G modems with carriers around the world. The M.2 modules are ultimately produced by Fibocom. The resulting Intel 5G Solution 5000 is a 5G M.2 device that delivers up to five times the speed of the company’s Gigabit LTE solutions. The solution is compatible with both Tiger and Alder Lake platforms.
Intel claims that it leads the ACPC space with three out of four ACPCs shipping with LTE (more than five million units thus far). Intel’s 5G Solution 5000 is designed to extend that to the 5G arena with six designs from three OEMs (Acer, ASUS and HP) coming to market in 2021. The company says it will ramp to more than 30 designs next year.
Intel says that while it will not be the first to come to market with a 5G PC solution, it will be the first to deliver them in volume, but we’ll have to see how that plays out in the face of continued supply disruptions due to the pandemic.
(Pocket-lint) – When ZTE told us the Axon 30 Ultra 5G was en route for review, we got that fuzzy feeling inside. That’s because the older Axon 20 5G was the first device we’d ever seen with an under-display selfie camera – so surely the Axon 30 Ultra would take this technology to the next level?
Um, nope. Instead, the Axon 30 Ultra has a more traditional punch-hole selfie camera front and centre, so that fuzzy feeling quickly dissipated. Without such a ‘magic camera’ on board, what then is the appeal of this flagship?
The Axon 30 Ultra is all about power and affordability. It crams a top-tier Qualcomm Snapdragon 888 processor into a slender body with a 6.67-inch AMOLED display that can push its refresh rate to a class-leading 144Hz. All for just £649 in the UK and $749 in the USA. So is that as exceptional value as it sounds or are there hidden compromises?
Having just moved on from the gigantic Xiaomi Mi 11 Ultra, we found the ZTE’s more slender frame and trim 20:9 aspect ratio a revelation by comparison. It’s not that the Axon 30 Ultra is small, per se, but it’s well balanced in scale.
The model we have in review is apparently black – that’s what the box says anyway – but the phone’s rear has a much softer metallic appearance about it, with some degree of blue to its colour balance. Really we’d call it a metallic grey. It looks pleasant, while fingerprint smears aren’t a massive problem thanks to the soft-touch material.
The camera unit on the rear is a fairly chunky protrusion, but that’s because there’s a 5x zoom periscope housed within that frame. It’s a relatively elegant block of cameras, though, and even with the phone flat against a desk it doesn’t rock about unwantedly.
The screen is the big selling point though. It’s a 6.67-inch AMOLED panel, the kind we’ve seen in the Redmi Note 10 Pro, for example, except the ZTE goes all-out when it comes to refresh rate by offering up to 144Hz. You can pick from 60Hz/90Hz/120Hz too, with the option to display the refresh rate in the upper left corner.
Having a faster refresh rate means smoother visuals, especially when it comes to moving content. You’re more likely to notice it when scrolling through emails than much else, though, so we’ve found our preference for balancing rate to battery life has meant settling on 90Hz. A more dynamic software approach would be better, or the option to designate specific apps to function at specific frame rates – especially games.
Are you really going to tell the difference between 144Hz and 120Hz? No. But the simple fact that the Axon 30 Ultra can do it shows its worth: it’s got more power credentials than many less adept phones at this price point.
Otherwise the screen hits all the right notes. It’s got ample resolution. Colours pop. Blacks are rich thanks to the AMOLED technology. It’s slightly curved to the edges too, but only subtly to help hide away the edge bezel from direct view – and we haven’t found this to adversely affect use due to accidental touches and such like.
There’s also an under-display fingerprint scanner tucked beneath the screen’s surface, which we’ve found to be suitably responsive for sign-ins. Or you can sign-up to face unlock instead to make things even easier.
Having that scanner in such a position, rather than over the power button, leaves the Axon 30 Ultra’s edges to be rather neat. Other than the on/off and volume up/down rocker to the one side, and USB-C port, single speaker and SIM tray to the bottom edge, there’s nothing to disrupt the phone’s form. That keeps it looking neat and tidy. It also means no 3.5mm headphone jack, but that’s hardly a surprise.
Performance & Battery
Processor: Qualcomm Snapdragon 888, 8GB/12GB RAM
Storage: 128GB/256GB/1TB, no microSD card slot
Battery: 4600mAh, 66W fast-charging
Software: ZTE MyOS 11 (Android 11)
Elegant looks complement an elegant operation, too, largely down to the power that’s available on tap. With Qualcomm’s Snapdragon 888 processor on board, coupled with 8GB of RAM, there’s little more powerful that you can buy. Indeed, the Axon 30 Ultra is knocking on the door of gaming phone territory given that 144Hz refresh rate screen.
Navigating around the interface is super smooth and speedy, apps open quickly, and there’s no downturn in performance if you happen to open a whole bunch. Games are a breeze, too, as you’d expect from this kind of hardware – although we’d like a game centre to prevent over-screen notifications and such like.
But it’s not perfectly smooth sailing, on account of ZTE’s own software, which here is MyOS 11 over the top of Google’s Android 11 operating system. It’s a common problem among Chinese makers, so we probably sound like a broken record, but there are definite issues with notifications. WhatsApp might take a couple of hours to notify you of a message, for example, but there’s never a fixed delay – other times it’s immediate. The Outlook mail app rarely, if ever, notified us of new mail in the inbox.
A lot of this is down to software management. Because there’s rather a lot of it in MyOS. Under battery settings is an ‘Apps AI-control’, which is said to intelligently manage apps to save power. Except, as we’ve highlighted above, this can stifle some apps inappropriately. It can be turned off for manual control, where individual apps can have their auto-start and background running characteristics specified.
All of this is an attempt to aid overall battery life. Because, as you can imagine, cranking out gaming sessions at 144Hz on the top-end engine of Qualcomm’s SD888 eats away at the supply pretty rapidly. The 4,600mAh cell on board isn’t as capacious as some competitors’ and, as a result, a heavy-use day will only just scrape through 15 hours. It’ll manage, but only just.
Another oddity we’ve experienced with the Axon 30 Ultra is that Wi-Fi connectivity seems to be a little up and down. With a weaker signal our Zwift Companion app was very choppy in updating its data – something that hasn’t been an issue with other phones we’ve compared in the same environment. We suspect that’s because the ‘a/b/g/n/ac/6e’ designation caters for higher frequencies (‘ac’ is 5GHz only, for example, whereas ‘ax’ caters for both 2.4GHz and 5GHz, while the newly adopted ‘6e’, i.e. 6GHz, isn’t widely supported yet).
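For reference, those 802.11 letter designations map to radio bands roughly as follows. This is a simplified summary sketched in code (the helper function is ours; real-world support depends on the specific radio, driver and region):

```python
# Simplified reference mapping of 802.11 designations to radio bands.
# Wi-Fi 6E is 802.11ax extended into the 6 GHz band, and it requires
# support on both the router and the client.
WIFI_BANDS = {
    "a":  ("5 GHz",),
    "b":  ("2.4 GHz",),
    "g":  ("2.4 GHz",),
    "n":  ("2.4 GHz", "5 GHz"),           # Wi-Fi 4
    "ac": ("5 GHz",),                     # Wi-Fi 5
    "ax": ("2.4 GHz", "5 GHz"),           # Wi-Fi 6
    "6e": ("2.4 GHz", "5 GHz", "6 GHz"),  # Wi-Fi 6E
}

def supports_6ghz(designations):
    """True if any advertised designation reaches the 6 GHz band."""
    return any("6 GHz" in WIFI_BANDS.get(d, ()) for d in designations)
```

So a phone advertising ‘a/b/g/n/ac/6e’ nominally reaches 6GHz, but only if the access point does too.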
On the rear the Axon 30 Ultra houses an apparent four lenses: a 64-megapixel main; a 0.5x ultra-wide (also 64MP); a 5x periscope zoom lens (just 8MP); and what we would call a ‘portrait lens’ with 2x zoom (also 64MP).
It’s a bit of a mish-mash when it comes to results though. The main camera, at its best, is really great. It snaps into focus quickly, reveals heaps of detail – as you can see from the main flower shot below – but isn’t the most subtle when you look in detail, as images are over-sharpened.
The ability to zoom in the camera app is actioned on a slider to the side, but you don’t really ever know which lens you’re using – until there’s a clear ‘jump’ between one visualisation and the next, because, for example, the 5x periscope zoom is far poorer in its delivery. It’s only 8-megapixels, for starters, so there’s not nearly the same clarity revealed in its images. Plus the colour balance looks far out of sync with the main lens. Really this periscope is overoptimistic.
The 2x portrait zoom lens we also can’t really work out. Sometimes zoom shots are great, sometimes they’re quite the opposite – all mushy and, again, over-sharpened. It seems to depend which sensor/lens the camera is using at that particular moment – because the image of a horse in a field that we captured (within gallery above) looks fine, whereas the sheep in a field (shown in our wide-to-main-to-zoom-to-periscope gallery, below) is miles off the mark.
There’s potential here overall. The specifications read rather well, but somehow the Axon 30 Ultra gets away from itself a little. It needs to rein in the offering really, simplify things, and deliver a more detailed app that explains specifically what kit you’re shooting with. That said, the main lens will please plenty, while close-up macro work – with the artificial intelligence ‘AI’ activated – snaps into focus really well.
Verdict
To answer our opening question: what compromises do you have to accept if looking to buy the ZTE Axon 30 Ultra 5G? Relatively few at this price point. There are some irks, though, such as the software causing notification problems (by which we mean absences), the battery being a little stretched, and the cameras getting away from their potential somewhat – despite the main lens being perfectly decent.
Otherwise ZTE has crammed one heck of a lot into the Axon 30 Ultra. Its screen is commendable and having that headline-grabbing 144Hz refresh rate is sure to bring attention. The subtlety of the design is elegant, too, delivering a well-balanced scale that’s comfortable to hold and fairly fingerprint-resistant on the rear. And there’s bundles of power from the top-end Qualcomm Snapdragon 888 platform, ensuring apps and games run a treat.
There might be less ‘wow factor’ than if there was an under-display front-facing camera to captivate prospective customers (like there was in the Axon 20), but given the Axon 30 Ultra 5G’s price point undercuts the big-dog Samsung, that’ll be enough of a lure to many.
Also consider
Samsung Galaxy S20 FE
The ‘Fan Edition’ Galaxy might be a year older than the ZTE, but it’s a similar price, has more stable software in our experience – and that makes all the difference to everyday use.
Asahi Linux developer Hector Martin has revealed a covert channel vulnerability in the Apple M1 chip that he dubbed M1RACLES, and in the process, he’s gently criticized the way security flaws have started to be shared with the public.
Martin’s executive summary for M1RACLES sounds dire: “A flaw in the design of the Apple Silicon ‘M1’ chip allows any two applications running under an OS to covertly exchange data between them, without using memory, sockets, files, or any other normal operating system features. This works between processes running as different users and under different privilege levels, creating a covert channel for surreptitious data exchange. […] The vulnerability is baked into Apple Silicon chips, and cannot be fixed without a new silicon revision.“ (Emphasis his.)
He also noted that this was the result of an intentional decision on Apple’s part. “Basically, Apple decided to break the ARM spec by removing a mandatory feature, because they figured they’d never need to use that feature for macOS,” he explained. “And then it turned out that removing that feature made it much harder for existing OSes to mitigate this vulnerability.” The company would have to make a change on the silicon level with its followup to the M1 to mitigate this flaw.
But he also made it clear in the FAQ that Mac owners shouldn’t be particularly worried about M1RACLES, because the covert channel is only two bits wide. It can be expanded, and Martin said that transfer rates over 1 MB/s are possible “without much optimization,” but any malicious apps that might take advantage of such methods would be far more likely to share information via other channels. Calling this a two-bit vulnerability would be both technically and linguistically correct. It’s a real security flaw, sure, but it’s unlikely to pose a real threat to Apple’s customers.
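To see why even a two-bit channel can move real data, here is an illustrative Python simulation. This is not the actual M1RACLES exploit (which abuses an Apple-specific CPU system register); it just shows that a sender and receiver sharing nothing but two bits of state and an agreed protocol can exchange arbitrary bytes:

```python
# Illustrative sketch of a 2-bit covert channel: the shared "register"
# below stands in for the two software-writable bits of the real CPU
# register. Each byte is shuttled across in four 2-bit chunks, MSB first.

register = [0, 0]  # stands in for the 2 shared bits

def send(data: bytes):
    """Generator: write the message into the register two bits at a time,
    yielding after each write to 'hand the CPU' to the receiver."""
    for byte in data:
        for shift in (6, 4, 2, 0):
            register[0] = (byte >> (shift + 1)) & 1
            register[1] = (byte >> shift) & 1
            yield

def receive(n_bytes: int, channel):
    """Read the register after every sender step, reassembling bytes."""
    out = bytearray()
    byte, nbits = 0, 0
    for _ in channel:
        byte = (byte << 2) | (register[0] << 1) | register[1]
        nbits += 2
        if nbits == 8:
            out.append(byte)
            byte, nbits = 0, 0
            if len(out) == n_bytes:
                break
    return bytes(out)

msg = b"covert"
assert receive(len(msg), send(msg)) == msg
```

The real-world channel swaps the cooperative generator hand-off for timed reads and writes of the register on two scheduled processes, which is where the quoted 1 MB/s figure comes from.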
So why bother coming up with a catchy name, drawing up a logo, and setting up a website in the first place? Martin addressed that in the FAQ: “Poking fun at how ridiculous infosec click-bait vulnerability reporting has become lately. Just because it has a flashy website or it makes the news doesn’t mean you need to care,” he wrote. “If you’ve read all the way to here, congratulations! You’re one of the rare people who doesn’t just retweet based on the page title 🙂 […] Honestly, I just wanted to play Bad Apple!! over an M1 vulnerability. You have to admit that’s kind of cool.“
It has become increasingly common for vulnerability disclosures to include all the elements Martin parodied with M1RACLES. Nobody cares about CVE identifiers; they care about names like Heartbleed, Meltdown, and Spectre. Researchers didn’t just say there were problems with drivers from Intel, Nvidia, AMD, and many other companies; they called their report Screwed Drivers. Early malware targeting the M1 wasn’t simply called M1_Malware_1; it was dubbed Silver Sparrow. Honestly, it’s kind of surprising researchers haven’t started selling T-shirts alongside their reports.
M1RACLES does mean, in some ways, that we’ve reached a sort of meta-branding, where a catchy name, logo, and website created ironically are still effective. But at least we all have our tongues planted firmly in our cheeks. More information about the flaw should be available in the MITRE listing for CVE-2021-30747 at some point in the future. Martin’s effort to bring Linux to the M1 via Asahi Linux (whose m1n1 “experimentation playground for Apple Silicon” was used to discover this flaw) can also be followed via the project’s website.
Google started developing its Fuchsia OS as an open-source operating system for many platforms back in 2016, but for the longest time there hasn’t been any news about it.
Well, Fuchsia is now the official OS for Google’s original Nest Hub, formerly the Google Home Hub. The team behind Fuchsia is replacing the Nest Hub’s limited Cast OS with Fuchsia, but on the surface the device should look and behave as before.
The move from Cast OS to Fuchsia OS will take several months and begin with users in the Preview Program first.
Fuchsia OS is designed to support smart devices such as Chromebooks, smartphones and others and Google has described it as a secure, updatable, inclusive and pragmatic operating system.
It appears Google wants to test the OS thoroughly, hence the months-long rollout to the Nest Hub.
You don’t ship a new operating system every day, but today is that day.
— Petr Hosek (@petrh) May 25, 2021
Still, this move leaves questions as to what exactly is the point of Fuchsia OS on the Nest Hub, given that it will look identical to Cast OS. And it doesn’t offer any meaningful insight into Google’s plans for Fuchsia.
Google previously said that Fuchsia isn’t a replacement for Android, but it will be able to run Android apps natively. The main difference between Fuchsia and Android is that the former isn’t based on a Linux kernel, but a microkernel of its own, called Zircon.
We’ll have to wait and see what Google does with Fuchsia going forward.
Microsoft claims that as of a new release this week, its Edge browser will be the “best performing browser on Windows 10.” The announcement was made at the company’s annual Build developer conference, being held virtually due to the COVID-19 pandemic.
When Edge version 91 releases, it will include two new features, startup boost and sleeping tabs, that should improve performance. Startup boost makes the browser launch more quickly: Microsoft says “core” Microsoft Edge processes will run in the background and won’t need more resources when you open additional windows. This should, Microsoft says, make for far faster launching.
The second feature, “sleeping tabs” sounds like it will address a bigger issue in the browser market. It aims to boost performance of the browser by “freeing up system resources from unused tabs,” including putting ads to sleep in background tabs. This month, Microsoft intends to enhance the feature to allow for up to 82% memory savings, per its internal testing using preview builds of the browser.
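Conceptually, a sleeping-tabs feature is a periodic idle sweep over open tabs. The sketch below is purely illustrative and not Edge’s actual implementation (the class and function are ours, the memory figures are made up, and the two-hour timeout mirrors Edge’s default setting):

```python
# Illustrative sleeping-tabs policy: tabs untouched for longer than a
# timeout are suspended, and their working memory is reclaimed until the
# user returns to them.
import time

SLEEP_AFTER_S = 2 * 60 * 60  # assumed default: sleep tabs idle for 2 hours

class Tab:
    def __init__(self, url, memory_mb):
        self.url, self.memory_mb = url, memory_mb
        self.last_active = time.monotonic()
        self.sleeping = False

def sweep(tabs, now=None):
    """Put idle tabs to sleep; return the MB reclaimed by this pass."""
    now = time.monotonic() if now is None else now
    reclaimed = 0
    for tab in tabs:
        if not tab.sleeping and now - tab.last_active > SLEEP_AFTER_S:
            tab.sleeping = True           # render process suspended,
            reclaimed += tab.memory_mb    # working set released
    return reclaimed
```

The quoted 82% figure would then be the reclaimed memory measured across a typical session’s backlog of idle tabs, per Microsoft’s internal testing.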
Since last year’s Build, Microsoft has made more than 5,300 commits to the open-source Chromium project, so other browsers built on the project can also benefit from improvements made for Edge. Microsoft has also added Progressive Web Apps, or “PWAs”, built on Edge to the Microsoft Store.
Microsoft Edge is taking on an increasingly important role as part of Windows 10. Microsoft is retiring Internet Explorer on June 15, 2022, for most versions of the operating system. At Build, the company is pushing developers to transition away from IE11 websites and apps, though Edge’s Internet Explorer mode is expected to last through at least 2029.
At Build, Microsoft will discuss the WebView2 embedded web control and Edge in a session about apps for hybrid work, while the Edge team will also have a session to take questions directly from attendees.
Other Windows-based announcements include the ability to set Windows Terminal as the default terminal emulator, along with a “Quake mode” that opens a new terminal with a keyboard shortcut. Additionally, there will be GUI app support on the Windows Subsystem for Linux. More will be announced at Build throughout the week.
Microsoft isn’t talking about its big Windows plans at Build 2021 this week, and that’s because the company is preparing to detail what’s next for its PC operating system separately. Microsoft CEO Satya Nadella teased this announcement during his Build keynote this morning, revealing he has been testing “the next generation of Windows” in recent months.
“Soon we will share one of the most significant updates to Windows of the past decade to unlock greater economic opportunity for developers and creators. I’ve been self-hosting it over the past several months, and I’m incredibly excited about the next generation of Windows. Our promise to you is this: we will create more opportunity for every Windows developer today and welcome every creator who is looking for the most innovative, new, open platform to build and distribute and monetize applications. We look forward to sharing more very soon.”
Microsoft has been working on a new app store for Windows in recent months, alongside some significant UI changes to the operating system. Nadella appears to reference the store changes here, with a promise to unlock a better economy for developers and creators within Windows itself.
This will likely include some significant changes to the Windows Store, allowing developers to submit any Windows application — including browsers like Chrome or Firefox. Rumors have suggested Microsoft may even allow third-party commerce platforms in apps, so developers could avoid Microsoft’s own 15 percent cut on apps and 12 percent cut on games.
Nadella’s specific mention of a “next generation of Windows” is interesting, too. Microsoft typically refers to everything as “Windows 10,” and this language could suggest the company is preparing a more significant shift with Windows branding than just the user interface alone.
Microsoft confirmed last week that Windows 10X, its OS originally built for dual-screen devices, will no longer ship. The software maker is now bringing the best bits of Windows 10X, a simplified version of Windows, into the main version of Windows 10 instead. We’re expecting to see some significant UI changes to Windows under something codenamed “Sun Valley.”
Some of that work has already started, with new system icons, File Explorer improvements, and the end of Windows 95-era icons. Microsoft is also focusing on improving the basic foundations of Windows, with fixes for a rearranging apps issue on multiple monitors, the addition of the Xbox Auto HDR feature, and improvements to Bluetooth audio support.
Nadella says we’ll hear more about the future of Windows “very soon,” so we’d expect some type of announcement or event in the coming weeks.
In a new blog post, iFixit heavily criticizes Samsung’s recently announced Galaxy Upcycling program (via Ars Technica), an initiative the repair specialists helped launch in 2017. It’s a damning look at how the initiative morphed from its ambitious origins to a “nearly unrecognizable” final form, completely sidelining iFixit in the process.
Here’s how iFixit describes the original plan:
The original Upcycling announcement had huge potential. The purpose was twofold: unlock phones’ bootloaders—which would have incidentally assisted other reuse projects like LineageOS—and foster an open source marketplace of applications for makers. You could run any operating system you wanted. It could have made a real dent in the huge and ever-growing e-waste problem by giving older Samsung devices some value (no small feat, that). It was a heck of a lot more interesting than the usual high-level pledges from device makers about carbon offsets and energy numbers.
You can see this original vision on display in a Samsung trailer from 2017 (embedded below). Samsung outlined how an old smartphone could be turned into a sensor for a fish tank, simultaneously re-using an old phone while at the same time helping to stop people from needing to buy a dedicated single-use device. Other potential ideas included turning old phones into smart home controllers, weather stations and nanny cams.
It sounds like a cool initiative, and iFixit was initially heavily involved. It lent its branding to the launch, and its CEO, Kyle Wiens, helped announce the project onstage at Samsung’s developer conference. iFixit had even planned to expand its support pages and spare parts program for Samsung phones had the project shipped, but…
Instead, we heard crickets. The actual software was never posted. The Samsung team eventually stopped returning our emails. Friends inside the company told us that leadership wasn’t excited about a project that didn’t have a clear product tie-in or revenue plan.
So what’s the problem with the program in its 2021 form? Two things: it only goes back three years, to the Galaxy S9, and it only gives those phones basic smart home functionality. Less, in other words, than what’s possible with a cheap $40 Raspberry Pi.
So instead of an actually-old Galaxy becoming an automatic pet feeder, full-fledged Linux computer, retro game console, a wooden-owl Alexa alternative, or anything else that you or a community of hackers can dream of, the new program will take a phone you can still sell for $160 and turn it into something like a $30 sensor.
Most will have probably just shrugged and moved on when they saw Samsung’s upcycling announcement in January. But it’s disappointing to realize that the project could have been so much more. iFixit’s post is well worth reading in its entirety.
Google’s long-awaited Fuchsia OS is starting to quietly roll out on its first consumer device, the first-generation Nest Hub, 9to5Google reports. Google’s work on Fuchsia OS first emerged in 2016, and the open-source operating system is notable for not being based on a Linux kernel, instead using a microkernel called Zircon. “You don’t ship a new operating system every day, but today is that day,” tweeted a Google technical lead on the Fuchsia OS project, Petr Hosek.
While the rollout on the Nest Hub (which originally released as the Google Home Hub before being renamed) begins today, the whole release process will take several months. It’ll come to users in the Preview Program first, before slowly releasing more broadly. We’ve known for a while that the operating system has been tested on the Nest Hub, and earlier this month more evidence for a release emerged thanks to a Bluetooth SIG listing that showed the Nest Hub running Fuchsia 1.0.
You don’t ship a new operating system every day, but today is that day.
— Petr Hosek (@petrh) May 25, 2021
Although the Nest Hub will swap its current Cast OS for Fuchsia OS, 9to5Google notes that the experience is likely to be almost identical, and most users are unlikely to even notice the switch.
All of this raises the question of what exactly Fuchsia OS is meant to achieve. Google calls it a “production-grade operating system that is secure, updatable, inclusive, and pragmatic.” We know that the OS could eventually power laptops and smartphones (Google was spotted testing it on the Pixelbook back in 2018, and more recently it proposed a solution for how it could run Android and Linux apps), but Fuchsia is not meant to be a one-for-one replacement of Android or Chrome OS.
“Fuchsia is about just pushing the state of the art in terms of operating systems and things that we learn from Fuchsia we can incorporate into other products,” Android and Chrome chief Hiroshi Lockheimer said cryptically in 2019. Google’s smart display is unlikely to be the last device or even form-factor to receive an update to Fuchsia OS. But the exact implications for the switch might take longer to emerge.
Radxa has announced that it has updated its Raspberry Pi alternative, the Rock Pi 4 line of single-board computers, with the Rockchip OP1 processor, onboard eMMC storage, and a pre-installed version of Twister OS to create the new Rock Pi 4 Plus family of products. Via CNX-Software.
Camera connector (possibly compatible with the official Raspberry Pi camera)
Gigabit Ethernet with PoE support (Model B and additional HAT required)
Dual-band 802.11ac WiFi 5, Bluetooth 5.0 (Model B)
2 x USB 3.0 ports
2 x USB 2.0 ports
40 Pin GPIO
Real Time Clock
USB C PD
There are two models of the Rock Pi 4 Plus at launch, the Model A and Model B, that can both be configured with either 2GB of LPDDR4 memory and 16GB of onboard eMMC storage or 4GB of LPDDR4 memory and 32GB of onboard eMMC storage.
We noticed on the AliExpress listing that the primary difference between the Model A and Model B is that the latter offers wireless connectivity out of the box and PoE support via a HAT.
The inclusion of the community-created Twister OS is an interesting addition. Twister OS has been with us for around a year and has seen some success as an alternative to Raspberry Pi OS. It is a solid operating system that comes with plenty of extras.
Radxa did say these new models will be 11% faster than their predecessors thanks to the OP1. It’s not clear how that would be the case, however, because OP1 appears to be a brand name for the RK-3399 SoC used in the original models. Radxa may have upgraded the Rock Pi 4 Plus to the RK-3399Pro, which adds a 2.4 TOPS NPU to the base SoC, but that doesn’t mesh with the company’s claim that the “OP1 brings faster performance on both CPU and GPU” to the new models.
Radxa said all of the original Rock Pi 4 accessories will be compatible with the Rock Pi 4 Plus. The new models are supposed to be available via AliExpress, Allnet, and Amazon, but at time of writing the storefronts only offer the Rock Pi 4 Plus Model B.
The latest iPad Air gets more than just a makeover – it’s a brilliant all-rounder and all the tablet most people could ever need
For
Great picture and sound
Attractive design
Excellent user experience
Against
Imperfect front-facing camera
Touch ID button is awkward
Not the cheapest tablet around
It’s not every day an Apple product gets what you could call a major revamp. On many occasions in the past, there has been a slight change here and a minor tweak there, leaving the tech world slightly underwhelmed. However, by Apple’s standards, it has positively gone to town on the iPad Air (2020).
Not only does the fourth generation iPad Air boast a brand new design, complete with a new Touch ID sensor and speaker layout, there’s also a bigger screen, more powerful processor and improved main camera. Everything is set up for the iPad Air (2020) to make quite the splash, but where does it rank when it comes to the best iPads you can buy?
Pricing
The fourth-generation iPad Air slots between the entry-level iPad and the flagship iPad Pro (2021). It’s available with either 64GB or 256GB of storage and prices start at £579 ($599, AU$899) for the entry-level 64GB Wi-Fi-only model and £729 ($749, AU$1129) for the Wi-Fi/Cellular model.
That makes the latest generation iPad Air around £100 ($100, AU$120) more expensive than the previous model. If Apple had just touched up the design and kept the status quo, you’d probably consider that a big jump in price. But the new model is a clear improvement on iPad Airs of old.
Build
Nowhere are the changes more obvious and apparent than with the new iPad Air’s exterior. It has been redesigned to mirror the iPhone 12, and if you like the look and feel of that smartphone, you’re going to love the iPad Air 4.
Apple iPad Air (2020) tech specs
Screen size 10.9in
Resolution 2360×1640 pixels
Storage 64GB/256GB
Finishes x5
Battery life 10 hours
Cameras 12MP rear / 7MP front
Dimensions (hwd) 24.8 x 17.9 x 0.6cm
Weight 458g
Those flat sides and crisp edges give the tablet a more purposeful appearance from the off. It makes for quite the contrast switching from the smooth, curved edges of the previous version, but it doesn’t feel uncomfortable, and those flat sides make it easier to grip when you’re watching in portrait or landscape.
It’s similar in look and feel to its big brother, the iPad Pro, although the Air is the first iPad to be made available in a range of optional colours. There are Rose Gold, Green and Sky Blue variants to choose from, in addition to the more traditional Silver and Space Grey. We find the Green finish of our review sample particularly easy on the eye.
Run your eyes around those flat edges and you’ll also spot a couple of new additions. The first is the presence of speaker grilles on the top and bottom of the tablet. Instead of offering speakers along just the bottom, the iPad Air 4 now offers landscape stereo speakers. That’s right, no longer does audio sound lopsided.
The iPad Air is now fitted with a USB-C port instead of Lightning, which makes one wonder why Apple is persisting with Lightning on the iPhone. Perhaps we’ll see the socket on the iPhone 13 when it makes an appearance later in the year?
The iPad Air’s volume buttons remain in the same location, as does the power button, although it’s slightly larger and longer, likely because it now handles Touch ID duties. We find that this takes some time to get used to and is at times a little more awkward to operate than the dedicated front-mounted Touch ID button of previous models. We can’t help but think a fingerprint sensor built under the screen, which is already used by smartphones such as the Samsung Galaxy S21 range, might work better.
Features
The big news on the screen front is that the new iPad Air is bigger than ever. At 10.9in, it’s marginally larger than the previous model (10.5in), but you only notice the difference when viewing the two tablets together. The effect is emphasised by the slimmer bezels at the top and bottom, which have been achieved by ditching the fingerprint scanner/home button. It all makes for a streamlined viewing experience.
Resolution is 2360 x 1640 (vs 2224 x 1668 on the iPad Air 3) with a pixel density of 264ppi and a maximum brightness of 500 nits.
It’s still a wide colour display with True Tone, so the iPad Air can adjust the balance of its screen based on ambient lighting conditions. The only thing it doesn’t have compared to its more expensive Pro sibling is a 120Hz refresh rate, which would be nice, but not vital.
The new iPad Air (2020) has the brains to match its beauty too. It is powered by Apple’s A14 Bionic chip, the same silicon that drives the entire iPhone 12 line.
In terms of CPU performance, Apple claims the iPad Air 2020 is 40 per cent faster than the previous generation A12 Bionic chip, while its GPU performance is supposedly up 30 per cent, too, for faster graphics processing.
Although it can’t match the specs of the iPad Pro 2021, both in terms of processing power and storage, Apple still claims the Air is more than powerful enough to be able to edit 4K video on and it’s fully compatible with the Apple Pencil 2, which will come in handy for creative types.
So how do Apple’s claimed performance percentage increases translate into real life? The iPad reacts extremely well to multiple apps being open and even the rigours of gaming. In fact, the iPad Air arguably turns the iPad Pro into even more of a niche product. For most people, the iPad Air 2020 will be a powerful enough tool.
Apps such as Netflix and Amazon Music boot up without hesitation, and even if you have more than a handful of apps running in the background, the iPad Air won’t struggle to cope. Navigating between apps via a series of simple swipes is quick and hassle-free, and once again, Apple’s intuitive iPadOS operating system delivers a smooth and class-leading user experience.
As far as cameras are concerned, the iPad Air 2020 sports a 12MP snapper on the rear (up from 8MP on the previous version) while it sticks with the old 7MP FaceTime HD camera on the front. The Air can record in 4K resolution at 24, 25, 30 or 60fps and capture slow-mo video in 1080p at 120fps or 240fps.
Apple has stripped down the accessories included in the box for the iPhone, but you still get a 20W charger to go alongside the USB-C charging cable. With a full battery, the iPad Air 2020 should be good for up to 10 hours of battery life under average use. As an occasional web browser and viewing device for the odd episode of The Crown, you should be more than covered.
Sound
One of the more exciting changes to the iPad Air’s design from an AV perspective relates to its speakers. On the previous iPad Air, they were positioned on one side, on the edge beneath the Touch ID sensor. Here, the speakers have been repositioned to fire out from either end of the tablet, so you can be treated to proper stereo audio with both sides of your iPad contributing equally.
Not having the audio offset to one side makes a big difference. It’s a better fit for watching programmes in landscape mode, especially while bingeing episodes of your favourite series on Netflix. The most obvious improvement is a wider spread of sound, which helps give it a more cinematic and immersive feel. It’s not exactly surround sound, but it is better than it was previously.
The sound coming out of the speakers is more solid and defined too. There’s extra weight to dialogue, and although the vibrations through the iPad’s chassis can be pretty distracting at higher volumes, they never seem to muddy the clarity of what you’re hearing. Stick to normal volume levels and you’ll be just fine.
You still need to be a little careful about hand placement if you’re holding the iPad in landscape mode, although it is much improved on the older model.
Switch to playing tracks through a pair of wireless headphones and Apple’s trademark musicality is there to enjoy. The iPad makes quick work of Radiohead’s 15 Steps and its attempts to trip the tablet up, displaying an excellent sense of rhythm, and there’s a real snap to the claps that helps keep the track on course. There’s plenty of precision to the percussion, including a solid, weighty kick drum.
Screen
Apple’s tablets have a reputation for delivering excellent images when watching video and the iPad Air 2020 doesn’t let the side down. It’s punchy and bright, but also throws in a great level of subtlety when the scene demands. Compared with the previous model, the latest iPad Air appears a bit sharper, slightly better detailed and capable of great subtlety in dark scenes.
Playing the second episode of Jupiter’s Legacy on Netflix, as everyone lines up at the funeral of their fallen comrades, the detail and definition in each character’s suit really captures the eye. Blacks are deep and rich, but there’s subtlety around creases and where light casts a shadow on certain areas. There’s a great general sense of depth to the scene too.
The iPad peers into the nooks and crannies and paints different gradations of black and grey with great care and attention. It also picks out subtle differences in the intensity of the white shirts worn by some of the characters. Skin tones also appear natural. As Sheldon, Walter, Grace and Brandon sit down for dinner, the bulbs in the chandelier emit a welcoming, warm glow.
Verdict
If you want the ultimate iPad experience, Apple would probably point you in the direction of its Pro range. But the iPad Air 4 (2020) is all the iPad most people will ever need. It’s such a solid and capable all-rounder that very few will feel the need to spend the extra for the iPad Pro.
The design is superb, the user experience is tough to beat and both sound and picture quality are on point. It’s an excellent tablet, and even with a slight price increase, we still feel it’s worth every penny.
SCORES
Picture 5
Sound 5
Features 5
MORE:
Read our guide to the best tablets
Read our Samsung Galaxy Tab S7+ review
Everything you need to know about the new Apple iPad Pro
Today we are back with another extensive performance analysis, as we check out the recently-released Days Gone. As the latest formerly PlayStation-exclusive title to come to the PC, we test thirty graphics cards in this game to find out exactly what sort of GPU you need to play at maximum image quality settings. Has this game launched in a better state than when Horizon Zero Dawn first came to PC? Let’s find out.
Watch via our Vimeo channel (below) or over on YouTube at 2160p HERE
The first thing to know about Days Gone is that it is developed by Sony’s Bend Studio, and is built on Unreal Engine 4. Interestingly though, it uses DirectX 11, and there’s no option for DX12. That means there’s no ray tracing or DLSS features in Days Gone, something which is becoming more unusual these days.
In terms of visual settings, there are a number of options in the display menu. Textures, lighting, shadows and more can all be adjusted, while it’s great to see a field of view (FOV) slider as well as a render scale setting. There’s also a selection of quick presets – Low, Medium, High and Very High – and for our benchmarking today we opted for the Very High preset, with V-Sync of course disabled.
Driver Notes
AMD GPUs were benchmarked with the 21.5.2 driver.
Nvidia GPUs were benchmarked with the 466.47 driver.
Test System
We test using a custom-built system from PCSpecialist, based on Intel’s Comet Lake-S platform. You can read more about it over HERE, and configure your own system from PCSpecialist HERE.
CPU
Intel Core i9-10900K
Overclocked to 5.1GHz on all cores
Motherboard
ASUS ROG Maximus XII Hero Wi-Fi
Memory
Corsair Vengeance DDR4 3600MHz (4 X 8GB)
CL 18-22-22-42
Graphics Card
Varies
System Drive
500GB Samsung 970 Evo Plus M.2
Games Drive
2TB Samsung 860 QVO 2.5″ SSD
Chassis
Fractal Meshify S2 Blackout Tempered Glass
CPU Cooler
Corsair H115i RGB Platinum Hydro Series
Power Supply
Corsair 1200W HX Series Modular 80 Plus Platinum
Operating System
Windows 10 2004
Our 1-minute benchmark pass came from quite early on in the game, as Deacon is riding on the back of Boozer’s motorbike, headed to Crazy Willie’s. This represents a reasonably demanding section of the game based on the first hour or so that I played through, and it is also highly repeatable which makes it great for benchmarking multiple GPUs.
1080p Benchmarks
1440p Benchmarks
2160p (4K) Benchmarks
Closing Thoughts
By and large, Days Gone is an impressive PC port that almost everyone will be happy with. I say almost everyone, as currently my main issue with the game is related to visible stuttering when using an RDNA 2 GPU. This didn’t happen for other AMD cards though, or Nvidia GPUs, so hopefully it is a quick fix for AMD’s driver team or the game’s developers.
As a DX11 title built on Unreal Engine 4, if we had to guess before testing the game, we would’ve predicted that Nvidia GPUs would perform best, and that proved to be the case. The RTX 2070 Super is significantly faster than the RX 5700 XT, for example, while the RTX 3070 also beats out the RX 6800 across the board, which isn’t something we usually see.
Even then, the game does run well across a wide variety of hardware. GTX 1060 and RX 580, for instance, aren’t far off from hitting 60FPS at 1080p with maximum image quality settings, with just a few small tweaks to the IQ needed to hit that figure. VRAM doesn’t appear to be in high demand either, with both the 4GB and 8GB versions of the RX 5500 XT performing almost identically.
If you do want to drop down some image quality settings, the game’s options scale well. We found that the High preset offered 35% more performance than Very High (which is more than enough to get a GTX 1060 averaging over 60FPS at 1080p), while you can almost double frame rates using the Low preset when compared to Very High.
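To put those scaling figures in concrete terms, here is a minimal sketch of the arithmetic; the 48FPS Very High baseline for a GTX 1060 at 1080p is an assumed figure for illustration, not a number taken from our benchmark results:

```python
# Rough sketch of the preset scaling described above.
# very_high_fps is an assumed GTX 1060 baseline, not a measured figure.
very_high_fps = 48

high_fps = very_high_fps * 1.35  # High preset: ~35% more performance
low_fps = very_high_fps * 2.0    # Low preset: roughly double Very High

print(round(high_fps))  # 65 -> comfortably over the 60FPS target
print(round(low_fps))   # 96
```

The same multipliers apply whatever the starting frame rate, which is why a card that falls just short at Very High clears 60FPS so easily one preset down.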
The only other issue I noticed is what appears to be an animation hitching problem in the game, which is particularly noticeable when riding a motorbike – the game feels like it is slowing down but then correcting itself by speeding up again. This wasn’t a game breaker for me but it was most noticeable when frame rates were below 60FPS – the higher the frame rate, the less I noticed the issue.
Discuss on our Facebook page HERE.
KitGuru says: Days Gone is definitely in a better state at launch than what we saw when Horizon Zero Dawn hit PCs in 2020. There’s a couple of issues to be fixed, but by and large this game performs well across a good range of graphics cards.
Android has been around for over a decade at this point and has grown tremendously during that time. Google’s mobile operating system has now set a new record, with Android being used on over 3 billion active devices.
Since Android is open source, smartphone makers have been free to adopt it and even make changes to help differentiate their devices. This has been a successful approach, with the vast majority of major smartphone makers using Android instead of their own custom operating system.
Back in 2014, Google reached 1 billion active Android devices for the first time and by 2019, that number had grown to 2.5 billion. Now, the number of active Android devices has surpassed the 3 billion milestone.
Google I/O returned this week after a break in 2020. During the event, Google’s Vice President of Product Management, Sameer Samat, announced the new milestone. With three billion devices actively used, Android’s user base now dwarfs Apple’s iOS platform, which has an active device base of 1 billion as of this year.
Breaking down the numbers, this means that an additional 500 million Android devices have been activated since 2019 and 1 billion since 2017.
KitGuru Says: Android has come a long way over the years. What was the first Android device that you owned?
Shopping for a laptop can be stressful — doubly stressful if you or your children will be learning online for the first time. Kids of different ages have a range of different laptop use cases and different needs. And as the choices for best laptop and best Chromebook evolve, so do students’ needs. So I spoke to some experts on the subject: students themselves.
My recommendations here are meant to accommodate a variety of preferences and price ranges. But they’re a jumping-off point rather than an exhaustive list: every student is different. Before making a decision, you’ll want to make sure you read reviews and try out devices yourself if you can. I’ll do my best to keep this article up to date with items that are in stock.
Best laptop for students
Best laptop for elementary school
For younger students, a touchscreen device is easier to use than a keyboard and touchpad, says Michelle Glogovac. Glogovac’s five-year-old son uses an iPad for Webex meetings with his kindergarten class. He’s gotten the hang of it; Glogovac says he’s already learned how to mute and unmute himself, “a skill that many adults aren’t familiar with.”
That said, it may be worth investing in a keyboard case if you go the tablet route. Glogovac has to type her son’s meeting codes and passwords for him, which can be cumbersome on the iPad’s flat screen.
As kids get older, their best laptop choice will vary depending on their needs. As a parent, it’s important that you and your child are in sync about how they intend to use the laptop and how much storage the programs they want will take up.
Kristin Wallace purchased a budget HP laptop for her daughter, Bella, but didn’t realize how quickly the nine-year-old would fill up 32GB of storage. “It is really slow and has no space for games. I need a computer with more storage space,” said Bella, who uses the laptop to Zoom with friends and take virtual guitar lessons and math enrichment classes. Wallace plans to buy Bella a better device in the next few weeks.
Audio quality is an important consideration for kids’ laptops. Lisa Mitchell, an elementary library media specialist, says her students use their devices to watch YouTube videos in addition to their online classes. Battery life is also a plus, even for distance learners who may not be far from a wall outlet. Bella likes to use her laptop all around the house and doesn’t want to bring the cord with her.
Durability is also worth paying for, according to Mitchell. If you’re using a tablet, get a protective case. “If a reasonably-priced insurance or replacement policy is available, it’s usually worth the extra expense.”
Check out:
Amazon Fire HD 10 Kids Edition ($199): a colorful, fast tablet with kid-friendly content
Lenovo Chromebook Duet ($279): a tiny 10-inch Chromebook with a detachable keyboard
Apple 10.2-inch iPad ($329): a great budget tablet that supports the Apple Pencil
Microsoft Surface Go 2 ($399): a solid Windows tablet with a built-in kickstand
Google Pixelbook Go ($649): a sturdy touchscreen Chromebook
Best laptop for middle school
The middle school students I spoke to don’t use their laptops for much more than web-based schoolwork and browsing. Don’t be too concerned about power — prioritize a machine that’s comfortable and easy for your child to use.
“We just got the most basic Chromebook and it is totally perfect,” says Gabrielle Hartley, an attorney and mother of three children who take a mix of in-person and online classes. “The most basic Chromebook serves all the needs of the basic middle schooler.”
Hartley’s son Max, who is in eighth grade, agrees. “I would really like a gaming PC or gaming laptop that can plug into a monitor and run video games with 120fps, but I really don’t need that,” Max says. “Most eighth graders aren’t going to be running any video games on their laptops or any software that requires a lot of power.”
Max mostly uses his laptop for Google Classroom applications, including Gmail, Slides, Google Docs, and Google Sheets. They’re very easy to use on his device, which he describes as “a run-of-the-mill Samsung Chromebook.” That said, if your child is starting middle school this year, it could be worth checking with their teachers to see what operating system is most compatible with their workflow. Caspian Fischer Odén, a ninth grader from Sweden, told me he has trouble with his Chromebook because his school has blocked downloading apps from the Google Play Store.
Even kids with more demanding hobbies think a budget device can get the job done. Sam Hickman, an eighth grader from the UK who uses his laptop for photo and video editing, says, “For most middle schoolers, any processor developed within the last two years will be able to handle any tasks they can throw at it.”
So, what is worth paying for? A comfortable keyboard, several students told me. Many middle school kids aren’t used to typing for long periods of time. You should also look for a device that’s compact and easy for them to carry around, particularly if they’re preparing for in-person school. Shoot for an 11- to 13-inch model — certainly nothing larger than 15 inches.
Check out:
HP Chromebook x360 ($279): an affordable Chromebook with great battery life
Lenovo Flex 3 Chromebook ($350): a small but sturdy laptop made for students
Lenovo 300e ($378): a durable 2-in-1 with a stylus
Acer Aspire 5 ($466): a portable option for kids who need a 15-inch screen
Microsoft Surface Laptop Go ($549): an attractive, light Windows laptop
Best laptop for high school
High schoolers’ laptop needs can vary based on their interests, but most don’t need powerful machines with lots of bells and whistles — especially if they come with glitches or serious downsides that could interfere with schoolwork. Miles Riehle, a student at Laguna Beach High School, has a high-end Surface Pro 7 but finds it overwhelming. “There is so much other stuff that I don’t use very often,” he said. “Something simpler might be a little more straightforward.”
The best operating system may depend on what your child is used to. Aryan Nambiar, a student at Barrington High School in Illinois, has an iMac at home and enjoys using an iPad for his schoolwork. Riehle says he would prefer a Chromebook because he has an Android phone and often uses Google services.
But almost every student I spoke to agreed that the most important feature of a high school laptop is portability. Kids attending in-person classes may be carrying their device around for much of the day with a stack of other books. Look for a 13- or 14-inch screen, or a lighter 15- to 17-inch model.
Students also recommend something sturdy. “Most high schoolers I’ve seen will throw their laptop in their bag without too much care,” says Moses Buckwalter, a student at Penn Manor High School. Backpacks can be jostled in the hallway as well. Distance learners can still run into trouble at home. “Anything can happen,” says Aadit Agrawal, a high school student from India. “My own brother scratched my laptop with his nails.”
Battery life is another key feature. “It can be a real struggle to find a place to charge while in class,” says Cas Heemskerk, a sophomore from the Netherlands. Unlike college students, many high schoolers don’t have frequent breaks to juice up their devices, so try to find something that can last a full day.
Many students recommend a touchscreen with stylus support. Nambiar uses the feature for his biology class, where he does a lot of visual modeling. “The touchscreen is always a bonus for drawing diagrams, whereas if you’re using a laptop it’d be a whole process to submit a diagram you drew,” Nambiar says. Riehle uses a Surface Pen to fill out school forms and annotate PDFs. Agrawal finds it useful to take notes on the same screen as his online lessons.
Depending on the broadband situation in your area, you may also want a laptop with multiple connectivity options. Agrawal’s online classes are sometimes interrupted by power cuts, so he recommends an LTE model. Matej Plavevski, a junior at Yahya Kemal College in North Macedonia, recommends looking for an Ethernet port in case slower connections disrupt meetings. That’s hard to find on smaller laptops, but there’s a slew of affordable dongles and docks to consider.
Check out:
Acer Chromebook Spin 513 ($349): a convertible Chromebook with all-day battery
Apple iPad Air ($599): a powerful tablet with a great screen
Acer Chromebook Spin 713 ($629): a fantastic Chromebook that’s not too pricey
Dell XPS 13 ($931): a solid clamshell Windows laptop
Surface Laptop 4 ($999): an excellent, light laptop that’s comfortable to use
Best laptop for college
College kids are justified in spending a bit more money than other age groups. Some (especially in STEM courses) can expect to do some fairly demanding work. Assad Abid, an electrical engineering undergrad from Pakistan, needs to run simulation software for his assignments. Aakash Chandra, a student at New Horizon College of Engineering in India, does a lot of coding, in addition to creative work in Premiere Pro and Photoshop, and gaming. Students also noted that it’s worthwhile to pay for a laptop that will last for a few years after graduation. That means you won’t have to worry about finding and financing your next device until you’re (hopefully) settled into a job.
But among high-end, capable devices there’s still a wide range of options. Students stressed that a college laptop should be light. Expect to bring it between classes, meals, meetings, the library, and other locations on campus. “It’s a boon that I can carry my laptop as well as some notebooks without feeling like I’m carrying too much for six hours a day,” says Haseeb Waseem, a senior at Villanova University.
Another universally lauded feature: battery life. Waseem, who uses an HP Spectre, says the all-day juice gives him “the flexibility to study in a bunch of different locations, and even outside.”
Speakers and webcams are often overlooked, even in top-end devices. But students say it’s worth looking for good ones if you’re starting college this year. Zoom will be a large part of university life this semester: many kids will be attending virtual classes, while others will still be meeting with clubs, study groups, and professors as well as hanging out with friends online. Waseem isn’t satisfied with his laptop’s audio and picture quality, which he says have made it difficult to pay attention in class and to engage with other students.
Many students will need to invest more in areas tailored to their interests and schoolwork needs. Chandra’s dream laptop would include a stylus and touchscreen for his creative work as well as a high-end GPU. Waseem, who uses his laptop for a hodgepodge of activities, including streaming, coding, social media, video chatting, and Microsoft Office work, would prefer to prioritize a large screen to keep track of his multitasking.
Check out:
Acer Swift 3 ($613): a super light laptop that performs well
HP Envy x360 13 ($689): a fast and stylish 2-in-1
Dell XPS 13 ($930): a solid clamshell Windows laptop
Surface Laptop 4 ($999): an excellent, light laptop that’s comfortable to use
HP Spectre x360 14 ($1,269): a premium convertible with standout battery life