iPad Pro (2021) review: dream screen

How much do you care about having a great screen?

That, really, is the only question that matters with the new 12.9-inch iPad Pro. It has a new kind of display so good I think it is the best thing for watching movies that isn’t a high-end television. It starts at $1,099 for a 128GB version, but increased storage and accessories like a keyboard or the Apple Pencil can shoot the price up fairly quickly.

Both the 12.9 and the smaller 11-inch iPad Pro (which starts at $799) feature Apple’s M1 processor and some other updated specs, all of which are excellent. But even that fancy processor — the same as you’ll find in the new iMac, MacBook Air, MacBook Pro, and Mac mini — doesn’t fundamentally change the story of what the iPad Pro is and what it can do.

It is an iPad, after all.

But the 12.9-inch version of the iPad Pro is an iPad with a very beautiful display. And so again, the question is what that screen means to the experience of using an iPad, especially since the price has jumped $100 compared to the last model. How much do you care about having a great screen?

Here is a very brief, wildly incomplete, and necessarily oversimplified education on flat panel screen technology. (Chris Welch has a longer one.)

There are two basic types you usually see, LCD and OLED. Both have pixels that combine red, green, and blue subpixels to create colors, but in order for you to actually see those colors the display pixels need to be lit up. OLED pixels are self-lit; LCD panels light up the display pixels by putting one, several, or many LED backlights behind them.

The benefit of LCD panels with LED backlighting is that they’re relatively inexpensive, long-lasting, bright, and unlikely to burn in. The benefit of OLED is that the black pixels are not lit at all, meaning you get superb contrast, but they are relatively expensive and don’t get as bright. Each technology’s strength is the other’s weakness.

Mini LED, the technology powering the 12.9-inch iPad Pro display, is designed to bring the LCD panel as close as possible to OLED’s contrast and black levels. Its display pixels are not self-lit, but instead lit from behind. The trick is that they’re lit by 10,000 tiny LED lights split up by software into 2,500 local dimming zones. It’s almost like the backlight itself is a lower-resolution screen behind the screen, tracking the image and making sure the black parts of the picture aren’t lit up.
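
To make that “screen behind the screen” idea concrete, here’s a minimal Python sketch of how full-array local dimming can work in principle. The zone grid and the drive-to-peak rule are illustrative assumptions, not Apple’s actual algorithm.

```python
# Illustrative simulation of full-array local dimming (not Apple's actual
# algorithm). The backlight is a coarse grid of zones; each zone tracks the
# brightest content above it, so black regions of the image stay unlit.

ZONES_X, ZONES_Y = 50, 50  # ~2,500 zones, as on the 12.9-inch iPad Pro

def zone_backlight(luminance, width, height):
    """luminance: 2D list (height x width) of per-pixel values in [0.0, 1.0]."""
    zone_w, zone_h = width // ZONES_X, height // ZONES_Y
    backlight = [[0.0] * ZONES_X for _ in range(ZONES_Y)]
    for zy in range(ZONES_Y):
        for zx in range(ZONES_X):
            # Drive each zone to the peak luminance of the pixels it covers.
            backlight[zy][zx] = max(
                luminance[y][x]
                for y in range(zy * zone_h, (zy + 1) * zone_h)
                for x in range(zx * zone_w, (zx + 1) * zone_w)
            )
    return backlight

# A white square on a black background: only zones under the square light up.
W = H = 500
frame = [[0.0] * W for _ in range(H)]
for y in range(200, 300):
    for x in range(200, 300):
        frame[y][x] = 1.0

lit = sum(v > 0 for row in zone_backlight(frame, W, H) for v in row)
print(lit, "of", ZONES_X * ZONES_Y, "zones lit")  # 100 of 2500 zones lit
```

Blooming, the halo effect that comes up later in this review, is what happens when a whole zone has to light up for a single small bright object.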

I would never call the display on the 11-inch iPad Pro bad, because it’s a stellar display. But because it uses a more traditional LCD backlight system with fewer dimming zones, you can see that the blacks are actually just a little gray. On the 12.9-inch version, Mini LED lets blacks be truly black, offers a high contrast ratio, and can also get very bright.

Apple is calling this screen the “Liquid Retina XDR display.” And it has all the benefits of Apple’s previous iPad Pro displays: it’s very high resolution, color-accurate, and it has fairly good viewing angles. It supports ProMotion, Apple’s term for a variable refresh rate to increase smoothness and match the frame rate of videos. (“Liquid Retina,” as far as Apple has ever told us, refers to the Apple-specific method of making round corners on an LCD.)

The funny thing about the 12.9-inch iPad is that it is very easy to miss the benefits of Mini LED in normal day-to-day use. At first you don’t see it.

Sure, there’s great contrast when you’re browsing the web, texting, playing games, and so on, but really it’s not very different from any other iPad. Apple still limits the max brightness in most scenarios to 600 nits, which is bright enough but not eye-popping (the iPad and iPad Air max out at 500 nits).

The magic kicks in when you are viewing videos or photos in full-screen. When you do that, the iPad Pro switches into a different HDR mode (or in Apple’s parlance, XDR, for “Extreme Dynamic Range”) that really is stunning. The overall max brightness of the screen jumps up to a powerful 1,000 nits, and peak brightness for HDR highlights can hit 1,600 nits.

You don’t see it until you see it — but then you see it.

The iPad Pro with the Liquid Retina XDR display
Photo by Vjeran Pavic / The Verge

The joke I’ve been telling people is that the display is so good that Tenet actually makes sense when you watch it on this iPad Pro. HDR content is incredible on this screen. I am not a display quality enthusiast, but this screen is functionally equivalent to a high-end OLED TV to my eyes, especially in a dark room.

If you are also not a display quality enthusiast, you might be left unimpressed with descriptions of nits and contrast ratios. I get it, but there are intangibles to the screen that I struggle to describe and have struggled even harder to capture in photos and video. For example, some colors just look better and more accurate to me, especially textured yellows. It just does a better job showing fine detail in situations where dark and light elements get mixed together, like with hair or a building reflecting sunlight.

Both Kindle and Apple Books in dark mode, in a very dark room, have a weird gray halo around text that isn’t due to normal blooming. (Photo edited to best replicate what I saw with my eyes.)
Dieter Bohn / The Verge

The display isn’t perfect, of course. If you run a local dimming test you will see blooming around brightly lit pixels against a black background. I only noticed this when running tests that are specifically designed to surface blooming, though. In regular use, everything looked great, sharp, and evenly lit across the entire screen.

There was one odd bug I experienced. Putting either the Kindle app or Apple Books into dark mode and viewing them in a near-pitch-black room, I noticed a strange gray haze around all of the text blocks. It’s too big to be blooming; it’s more like the local dimming algorithms got a little confused. It’s a minor thing that I hope gets fixed.

For me, the quality of the display when watching video on the 12.9-inch iPad Pro is impressive, but it’s also not at the top of my list of priorities when picking a computer. I care a little more about portability, weight, and — yes — functionality.

So let’s talk about that M1 chip.

The iPad Pro is great for video editing
Photo by Kayla Murkison / The Verge

What does it really mean that the iPad Pro now has the same chip that powers Apple’s latest Mac computers?

It does not mean that iPads will be able to run Mac apps now. While Apple is happy to let the Mac run iPad apps and generally let you do whatever you want on it (except touch the screen), the iPad Pro continues to be a more, shall we say, curated experience.

One interesting consequence of the M1 is that for the first time in the history of iOS devices, Apple is publicly disclosing how much RAM these devices have: 8GB on models with 512GB or less of storage, and 16GB on models with 1TB or more. Whether or not that’s actionable information is another matter.

The M1 is obviously fast, and in benchmarks it’s faster than the A12Z Bionic that Apple put in the previous iPad Pro models. But in my usage, I didn’t actually perceive any speed improvements in any of the apps that I use — because everything was already very fast on the iPad Pro. I got the exact same export speeds in Premiere Rush on the brand-new 12.9-inch iPad Pro with the M1 as on my 2018 iPad Pro.

Both the RAM and the M1 processor are specs that won’t make an appreciable difference to the vast majority of iPad users. They’re specs that will matter to certain “pro” users who have found specific apps and workflows that push the limit of what an iPad can do.

Apple touts soon-to-be-released capabilities in apps like LumaFusion and AR effects, and I have no doubt that there are benefits for power users of those apps. For the rest of us, the reasons to get an iPad Pro are less about speeds and feeds and more about the overall experience.

Another consequence of the M1 is that the USB-C port now supports Thunderbolt accessories. In theory, that’s great. I plugged my iPad into my very fancy CalDigit TS3 Plus Thunderbolt dock and was gratified to see my monitor light up right away. From there, though, I ran into the same old iPad problems.

I have a USB microphone interface hooked into the dock, and for whatever reason I was unable to get any audio out of it on the iPad, just silence. I also tested out some admittedly old LaCie Thunderbolt 2 drives with an adapter and couldn’t get them to show up in the Files app. Oh and just to remind you: the monitor still can only mirror the iPad Pro — it can’t serve as a second display.

Similarly, the dock has an audio-out so it shows up as a speaker. On the Mac, I can easily change settings to let my computer know to play audio out of its own speakers since I don’t have anything hooked up to the dock for sound. Nothing doing on the iPad Pro — if there’s a setting that would let me move the audio back to the iPad’s own excellent speakers, I couldn’t find it. (Long pressing on the AirPlay icon in Control Center only listed the dock as an option.)

Seemingly every new iPad Pro inspires an admittedly exhausting but also necessary discussion about whether or not iPadOS is actually capable enough to justify the price of the hardware that runs it.

The M1 processor sharpens that discussion. To me, the biggest difference between the Mac and the iPad at this point isn’t the touchscreen, it’s Apple’s approach to the operating system. On the iPad, Apple would rather not offer a feature than have it work in a non-iPad way. That’s noble, but it means the company has committed itself to reinventing a lot of wheels in computing: files, peripheral support, multi-window interfaces, and all the rest have to be re-thought and re-done.

Sometimes that reinvention results in some genuinely great features. The iPad’s “windowing” system takes some getting used to and has its limitations, but it can be a joy to use and makes organizing your digital stuff a bit easier. The problem is that all that reinvention is taking a very long time — it’s been nearly six years since the original iPad Pro.

Center Stage works very well to keep you in frame. And as a bonus, the camera doesn’t blank out when you multitask in Zoom calls anymore.
GIF: Dieter Bohn

One new invention I love is the Center Stage feature. It zooms and follows human faces to keep them centered in the frame of the iPad’s wide-angle front facing camera. It works in any video conferencing app without the need for setup and it performs very well, better than similar features on smart displays like the Echo Show or Facebook Portal.
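
Apple hasn’t said how Center Stage is implemented, but the basic follow-and-reframe behavior is easy to sketch. Here’s a hypothetical version in Python: given a detected face position each frame, a virtual crop window is steered smoothly toward it, so the reframing looks like a deliberate camera move rather than a jump cut. All the numbers are made up for illustration.

```python
# Hypothetical sketch of a Center Stage-style auto-framing loop (not Apple's
# implementation). A wide-angle camera captures the full frame; we pan a
# smaller virtual crop toward the detected face with exponential smoothing,
# so the reframing is gradual. Face detection is assumed to happen elsewhere.

SENSOR_W, SENSOR_H = 4000, 3000  # full wide-angle frame (illustrative sizes)
CROP_W, CROP_H = 1920, 1080      # region actually sent to the video call
SMOOTHING = 0.15                 # fraction of remaining distance moved per frame

def clamp(v, lo, hi):
    return max(lo, min(hi, v))

def step_crop(crop_cx, crop_cy, face_cx, face_cy):
    """Move the crop center a little toward the face center each frame."""
    crop_cx += SMOOTHING * (face_cx - crop_cx)
    crop_cy += SMOOTHING * (face_cy - crop_cy)
    # Never let the crop window leave the physical sensor.
    return (clamp(crop_cx, CROP_W / 2, SENSOR_W - CROP_W / 2),
            clamp(crop_cy, CROP_H / 2, SENSOR_H - CROP_H / 2))

# Simulate a face drifting right; the crop follows smoothly behind it.
cx, cy = SENSOR_W / 2, SENSOR_H / 2
for frame in range(1, 61):
    cx, cy = step_crop(cx, cy, SENSOR_W / 2 + frame * 20, SENSOR_H / 2)
print(f"after 60 frames the crop center sits at ({cx:.0f}, {cy:.0f})")
```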

I’d love it even more if the front-facing cameras on the iPad Pro weren’t still in the wrong spot when attached to a keyboard — off to the side instead of centered on top. It’s great that the camera can keep my face centered in frame; it’s not so great that I am literally giving my coworkers the side-eye because I’m looking off to the right of where the cameras are to see their faces.

One more note: Apple has said that the original 12.9-inch Magic Keyboard “may not precisely fit when closed” as the new iPad Pro is slightly thicker. But when I tested it, I couldn’t discern any difference between the fit on the original and the new, white Magic Keyboard. Both worked — and closed — fine.

In any case, let’s set aside the “What’s a computer” argument and be more pragmatic. A 256GB 12.9-inch iPad Pro with a Magic Keyboard costs $1,548. A 256GB MacBook Air with the same processor costs $999. And just to be realistic: most people can get more done on the Mac than on the iPad. Taken strictly as a work machine, the Mac wins out on both price and functionality.

That includes battery life, by the way. Both iPad Pros have good battery life, but it’s not significantly improved over previous iPads. And as many people have discovered during the pandemic, if you actually use the iPad for work all day (especially if you do a lot of Zoom calls), the iPad Pro can conk out in eight hours or less. The MacBook Air edges it out.

Last and certainly not least, Apple’s refusal to offer multi-user support on the iPad has gone from being mystifying to obstinate. The company clearly intends this to be a single-user device, despite the fact that it would theoretically make for an even more compelling family computer than the pastel-colored iMacs that share the same processor.

Typing on the iPad Pro
Photo by Kayla Murkison / The Verge

But to give Apple the benefit of the doubt here, if you’re looking strictly at the iPad Pro as a work machine, you’re probably missing the point. The iPad Pro is simply a more beautiful, more premium object than even Apple’s own laptops.

It’s easy to take for granted, but the hardware in this tablet really is amazing: Face ID, dual rear cameras that are quite good and paired with LiDAR, quad speakers with superb sound and decent volume, excellent microphones, support for the Apple Pencil, the best screen you can get on a portable device, and on and on.

The reason to get the iPad Pro 12.9 (or even the 11) is simply to get the best, nicest iPad. Unless you can specifically answer right now which app in your workflow is slowed down by the specs on a lesser iPad, the $599 iPad Air or even the $329 base iPad offer the same core features that most people really use.

Except for a slim minority of people, the justification for getting an iPad Pro isn’t its feature set, it’s the experience of using a well-made, high-end object. Until I hit the limits of iPadOS (which I hit regularly), I enjoy using an iPad Pro more than I do any other computer.

The wonderful Mini LED display on the 12.9-inch iPad Pro doesn’t change any of those equations; it just makes the nicest iPad Pro even nicer. And so my yearly refrain about the iPad Pro remains. If you want the very best iPad, this is the very best iPad.

Just remember, it’s an iPad.

Android 12 preview: first look at Google’s radical new design

There are new features, but it’s the biggest design update in years

Google is announcing the latest beta for Android 12 today at Google I/O. It has an entirely new design based on a system called “Material You,” featuring big, bubbly buttons, shifting colors, and smoother animations. It is “the biggest design change in Android’s history,” according to Sameer Samat, VP of product management, Android and Google Play.

That might be a bit of hyperbole, especially considering how many design iterations Android has seen over the past decade, but it’s justified. Android 12 exudes confidence in its design, unafraid to make everything much larger and a little more playful. Every big design change can be polarizing, and I expect Android users who prefer information density in their UI may find it a little off-putting. But in just a few days, it has already grown on me.

There are a few other functional features being tossed in beyond what’s already been announced for the developer betas, but they’re fairly minor. The new design is what matters. It looks new, but Android by and large works the same — though, of course, Google can’t help itself and again shuffled around a few system-level features.

I’ve spent a couple of hours demoing all of the new features and the subsequent few days previewing some of the new designs in the beta that’s being released today. Here’s what to expect in Android 12 when it is officially released later this year.


Android 12’s new design includes a lot of color and customization
Image: Google

Material You design and better widgets

Android 12 is one implementation of a new design system Google is debuting called Material You. Cue the jokes about UX versus UI versus… You, I suppose. Unlike the first version of Material Design, this new system is meant to mainly be a set of principles for creating interfaces — one that goes well beyond the original paper metaphor. Google says it will be applied across all of its products, from the web to apps to hardware to Android. Though as before, it’s likely going to take a long time for that to happen.

In any case, the point is that the new elements in Android 12 are Google’s specific implementations of those principles on Pixel phones. Which is to say: other phones might implement those principles differently or maybe even not at all. I can tell you what Google’s version of Android 12 is going to look and act like, but only Samsung can tell you what Samsung’s version will do (and, of course, when it will arrive).

The feature Google will be crowing the most about is that when you change your wallpaper, you’ll have the option to automatically change your system colors as well. Android 12 will pull out both dominant and complementary colors from your wallpaper automatically and apply those colors to buttons and sliders and the like. It’s neat, but I’m not personally a fan of changing button colors that much.

You can customize the system colors to match your wallpaper in Android 12.
GIF: Google
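
Google hasn’t detailed the extraction algorithm, but the core idea (quantize the wallpaper’s pixels, then treat the most common bucket as the dominant color) can be sketched in a few lines of Python. This is a toy illustration, not Material You’s actual color science.

```python
# Toy sketch of wallpaper-based theming (not Google's actual color science).
# Quantize every pixel into coarse RGB buckets and treat the most common
# bucket as the dominant color; accents could be derived from runners-up.

from collections import Counter

def dominant_color(pixels, bucket=32):
    """pixels: iterable of (r, g, b) tuples, 0-255 per channel."""
    counts = Counter((r // bucket, g // bucket, b // bucket) for r, g, b in pixels)
    qr, qg, qb = counts.most_common(1)[0][0]
    # Report the center of the winning bucket as the theme color.
    half = bucket // 2
    return (qr * bucket + half, qg * bucket + half, qb * bucket + half)

# A toy "wallpaper": mostly teal, with some sand-colored pixels mixed in.
wallpaper = [(20, 140, 150)] * 700 + [(230, 200, 160)] * 300
print("dominant color:", dominant_color(wallpaper))  # (16, 144, 144), a teal
```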

The lock screen is also set for some changes: the clock is huge and centered if you have no notifications and slightly smaller but still more prominent if you do. It also picks up an accent color based on the theming system. I especially love the giant clock on the always-on display.

Android’s widget system has developed a well-deserved bad reputation. Many apps don’t bother with them, and many more haven’t updated their widget’s look since they first made one in days of yore. The result is a huge swath of ugly, broken, and inconsistent widgets for the home screen.

New widget designs in Android 12.
GIF: Google

Google is hoping to fix all of that with its new widget system. As with everything else in Android 12, the widgets Google has designed for its own apps are big and bubbly, with a playful design that’s not in keeping with how most people might think of Android. One clever feature is that when you move a widget around on your wallpaper, it subtly changes its background color to be closer to the part of the image it’s set upon.
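
That per-position tinting presumably just samples the wallpaper underneath the widget. A minimal sketch of that assumed mechanism, averaging the pixels under the widget’s rectangle:

```python
# Minimal sketch of a widget tinting itself to match the wallpaper beneath it
# (an assumed mechanism, not Google's implementation): average the wallpaper
# pixels under the widget's rectangle and use that as the background color.

def region_average(wallpaper, x, y, w, h):
    """wallpaper: 2D list of (r, g, b) rows; returns the mean color of a rect."""
    pixels = [wallpaper[yy][xx] for yy in range(y, y + h) for xx in range(x, x + w)]
    return tuple(sum(p[i] for p in pixels) // len(pixels) for i in range(3))

# Left half of this toy wallpaper is dark, right half is light; moving the
# widget across the screen shifts its background tint accordingly.
row = [(30, 30, 60)] * 50 + [(220, 210, 190)] * 50
wall = [list(row) for _ in range(100)]
print("widget on the left: ", region_average(wall, 5, 5, 20, 20))
print("widget on the right:", region_average(wall, 70, 5, 20, 20))
```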

I don’t have especially high hopes that Android developers will rush to adopt this new widget system, so I hope Google has a plan to encourage the most-used apps to get on it. Apple came very late to the home screen widget game on the iPhone, but it’s already surpassed most of the crufty widget abandonware you’ll find from most Android apps.

Bigger buttons and more animation

As you’ve no doubt gathered already from the photos, the most noticeable change in Android 12 is that all of the design elements are big, bubbly, and much more liberal in their use of animation. It certainly makes the entire system more legible and perhaps more accessible, but it also means you’re just going to get fewer buttons and menu items visible on a single screen.

That tradeoff is worth it, I think. Simple things like brightness and volume sliders are just easier to adjust now, for example. As for the animations, so far, I like them. But they definitely involve more visual flourish than before. When you unlock or plug in your phone, waves of shadow and light play across the screen. Apps expand out clearly from their icon’s position, and drawers and other elements slide in and out with fade effects.

More animations mean more resources and potentially more jitter, but Samat says the Android team has optimized how Android displays core elements. The window manager and package manager use 22 percent less CPU time, the system server uses 15 percent less of the big (read: more powerful and battery-intensive) core on the processor, and interrupts have been reduced, too.

Android has another reputation: solving for jitter and jank by just throwing ever-more-powerful hardware at the problem, like faster chips and higher-refresh-rate screens. Hopefully none of that will be necessary to keep these animations smooth on lower-end devices. On my Pixel 5, they’ve been quite good.

One last bit: there’s a new “overscroll” animation — the thing the screen does when you scroll to the end of a page. Now, everything on the screen will sort of stretch a bit when you can’t scroll any further. Maybe an Apple patent expired.

Shuffling system spaces around

It wouldn’t be a new version of Android without Google mucking about with notifications, Google Assistant, or what happens when you press the power button. With Android 12, we’ve hit the trifecta. Luckily, the changes Google has made mostly represent walking back some of the changes it made in Android 11.

The combined Quick Settings / notifications shade remains mostly the same — though the huge buttons mean you’re going to see fewer of them in either collapsed or expanded views. The main difference in notifications is mostly aesthetic. Like everything else, they’re big and bubbly. There’s a big, easy-to-hit down arrow for expanding them, and groups of notifications are put together into one bigger bubble. There’s even a nice little visual flourish when you begin to swipe a notification away: it forms its own roundrect, indicating that it has become a discrete object.

The quick settings and notification shade have gotten a facelift in Android 12.
GIF: Google

The thing that will please a lot of Android users is that after just a year, Google has bailed on its idea of creating a whole new power button menu with Google Wallet and smart home controls. Instead, both of those things are just buttons inside the quick settings shade, similar to Samsung’s solution.

Holding down the power button now just brings up Google Assistant. Samat says it was a necessary change because Google Assistant is going to begin to offer more contextually aware features based on whatever screen you’re looking at. I say the diagonal swipe-in from the corner to launch Assistant was terrible, and I wouldn’t be surprised if it seriously reduced how much people used it.

I also have to point out that it’s a case of Google adopting gestures already popular on other phones: the iPhone’s power button brings up Siri, and a Galaxy’s button brings up Bixby.

New privacy features in Android 12.
Image: Google

New privacy features for camera, mic, and location

Google is doing a few things with privacy in Android 12, mostly focused on three key sensors it sees as trigger points for people: location, camera, and microphone.

The camera and mic will now flip on a little green dot in the upper-right of the screen, indicating that they’re on. There are also now two optional toggles in Quick Settings for turning them off entirely at a system level.

When an app tries to use one of them, Android will pop up a box asking if you want to turn it back on. If you choose not to, the app thinks it has access to the camera or mic, but all Android gives it is a black nothingness and silence. It’s a mood.
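
In effect, the system swaps the real sensor feed for an empty one. A hypothetical illustration of that observable contract (this is not Android’s actual plumbing):

```python
# Hypothetical illustration of the sensor kill switches: with the toggle off,
# an app still "reads" the camera and mic but receives black frames and
# silence instead of real data. Not Android's actual plumbing.

sensors_enabled = False  # the Quick Settings toggle

def read_camera_frame(real_frame):
    """Return the real frame, or an all-black frame of the same dimensions."""
    if sensors_enabled:
        return real_frame
    return [[(0, 0, 0)] * len(real_frame[0]) for _ in real_frame]

def read_mic_samples(real_samples):
    """Return the real audio, or the same number of silent samples."""
    return real_samples if sensors_enabled else [0] * len(real_samples)

frame = [[(128, 64, 32)] * 4 for _ in range(3)]  # a tiny stand-in "image"
print(read_camera_frame(frame)[0][:2])   # [(0, 0, 0), (0, 0, 0)]
print(read_mic_samples([17, -4, 250]))   # [0, 0, 0]
```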

For location, Google is adding another option for what kind of access you can grant an app. Alongside the options to limit access to one time or just when the app is open, there are settings for granting either “approximate” or “precise” locations. Approximate will let the app know your location with less precision, so it theoretically can’t guess your exact address. Google suggests it could be useful for things like weather apps. (Note that any permissions you’ve already granted will be grandfathered in, so you’ll need to dig into settings to switch them to approximate.)
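
Google hasn’t said exactly how coarse “approximate” is, but the general technique is to snap coordinates to a grid before an app ever sees them. A toy sketch under that assumption, with a made-up cell size:

```python
# Toy illustration of "approximate" location: snap real coordinates to a
# coarse grid before an app ever sees them. The 0.02-degree cell (roughly
# 2 km north-south) is a made-up value, not Android's actual coarseness.

def approximate(lat, lon, cell_deg=0.02):
    """Round a coordinate pair to the nearest point on a coarse grid."""
    snap = lambda v: round(round(v / cell_deg) * cell_deg, 6)
    return snap(lat), snap(lon)

precise = (45.5243, -122.675)  # somewhere in Portland
print("precise:    ", precise)
print("approximate:", approximate(*precise))  # (45.52, -122.68)
```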

Google is also creating a new “Privacy Dashboard” specifically focused on location, mic, and camera. It presents a pie chart of how many times each has been accessed in the last 24 hours along with a timeline of each time it was used. You can tap in and get to the settings for any app from there.

The Android Private Compute Core

Another new privacy feature is the unfortunately named “Android Private Compute Core.” Unfortunate, because when most people think of a “core,” they assume there’s an actual physical chip involved. Instead, think of the APCC as a sandboxed part of Android 12 for doing AI stuff.

Essentially, a bunch of Android machine learning functions are going to be run inside the APCC. It is walled-off from the rest of the OS, and the functions inside it are specifically not allowed any kind of network access. It literally cannot send or receive data from the cloud, Google says. The only way to communicate with the functions inside it is via specific APIs, which Google emphasizes are “open source” as some kind of talisman of security.

Talisman or no, it’s a good idea. The operations that run inside the APCC include Android’s feature for ambiently identifying playing music. That needs to have the microphone listening on a very regular basis, so it’s the sort of thing you’d want to keep local. The APCC also handles the “smart chips” for auto-reply buttons based on your own language usage.

An easier way to think of it is if there’s an AI function you might think is creepy, Google is running it inside the APCC so its powers are limited. And it’s also a sure sign that Google intends to introduce more AI features into Android in the future.

No news on app tracking — yet

Location, camera, mic, and machine learning are all privacy vectors to lock down, but they’re not the kind of privacy that’s on everybody’s mind right now. The more urgent concern in the last few months is app tracking for ad purposes. Apple has just locked all of that down with its App Tracking Transparency feature. Google itself is still planning on blocking third-party cookies in Chrome and replacing them with anonymizing technology.

What about Android? There have been rumors that Google is considering some kind of system similar to Apple’s, but there won’t be any announcements about it at Google I/O. However, Samat confirmed to me that his team is working on something:

There’s obviously a lot changing in the ecosystem. One thing about Google is it is a platform company. It’s also a company that is deep in the advertising space. So we’re thinking very deeply about how we should evolve the advertising system. You see what we’re doing on Chrome.

From our standpoint on Android, we don’t have anything to announce at the moment, but we are taking a position that privacy and advertising don’t need to be directly opposed to each other. That, we don’t believe, is healthy for the overall ecosystem as a company. So we’re thinking about that working with our developer partners and we’ll be sharing more later this year.

The Android TV remote in Android 12.
Image: Google

A few other features

Google has already announced a bunch of features in earlier developer betas, most of which are under-the-hood kind of features. There are “improved accessibility features for people with impaired vision, scrolling screenshots, conversation widgets that bring your favorite people to the home screen” and the already-announced improved support for third-party app stores. On top of those, there are a few neat little additions to mention today.

First, Android 12 will (finally) have a built-in remote that will work with Android TV systems like the Chromecast with Google TV or Sony TVs. Google is also promising to work with partners to get car unlocking working via NFC and (if a phone supports it) UWB. It will be available on “select Pixel and Samsung Galaxy phones” later this year, and BMW is on board to support it in future vehicles.

Android 12 will add some new integrations between Android phones and Chromebooks.
GIF: Google

For people with Chromebooks, Google is continuing the trend of making them work better with Android phones. Later this year, Chrome OS devices will be able to immediately access new photos in an Android phone’s photo library over Wi-Fi Direct instead of waiting for them to sync up to the Google Photos cloud. Google still doesn’t have anything as good as AirDrop for quickly sending files across multiple kinds of devices, but it’s a good step.

Android already has fast pairing for quickly setting up Bluetooth devices, but it’s not built into the Bluetooth spec. Instead, Google has to work with individual manufacturers to enable it. A new one is coming on board today: Beats, which is owned by Apple. (Huh!) Ford and BMW cars will also support one-tap pairing.

Android Updates

As always, no story about a new version of Android would be complete without pointing out that the only phones guaranteed to get it in a timely manner are Google’s own Pixel phones. However, Google has made some strides in the past few years. Samat says that there has been a year-over-year improvement in the “speed of updates” to the tune of 30 percent.

A few years ago, Google changed the architecture of Android with something called Project Treble. It made the system a little more modular, which, in turn, made it easier for Android manufacturers to apply their custom versions of Android without mucking about in the core of it. That should mean faster updates.

Some companies have improved slightly, including the most important one, Samsung. However, it’s still slow going, especially for older devices. As JR Raphael has pointed out, most companies are not getting updates out in what should be a perfectly reasonable timeframe.

Beyond Treble, there may be some behind-the-scenes pressure happening. More and more companies are committing to providing updates for longer. Google also is working directly with Qualcomm to speed up updates. Since Qualcomm is, for all intents and purposes, the monopoly chip provider for Android phones in the US, that should make a big difference, too.

That’s all heartening, but it’s important to set expectations appropriately. Android will never match iOS in terms of providing timely near-universal updates as soon as a new version of the OS is available. There will always be a gap between the Android release and its availability for non-Pixel phones. That’s just the way the Android ecosystem works.

Android 12.
Photo composite by Amelia Holowaty Krales and Vjeran Pavic / The Verge

That’s Android 12. It may not be the biggest feature drop in years, but it is easily the biggest visual overhaul in some time. And Android needed it. Over time and over multiple iterations, lots of corners of the OS were getting a little crufty as new ideas piled on top of each other. Android 12 doesn’t completely wipe the slate clean and start over, but it’s a significant and ambitious attempt to make the whole system feel more coherent and consistent.

The beta that’s available this week won’t get there — the version I’m using lacks the theming features, widgets, and plenty more. Those features should get layered in as we approach the official release later this year. Assuming that Google can get this fresh paint into all of the corners, it will make Google’s version of Android a much more enjoyable thing to use.

Apple Music lossless supported devices: what will (and won’t) play lossless and Spatial Audio, and why

Apple Music is being upgraded in a big, big way. From June, the music streaming service will support CD-quality and hi-res lossless audio as well as Dolby Atmos-powered Spatial Audio, offering subscribers much higher quality, immersive playback. 

Both lossless and Spatial Audio will be available to Apple Music users at no extra cost. Apple describes these two additions as Apple Music’s “biggest advancement ever in sound quality” – which we’d have to agree with. “Excellent news. Well done, Apple!”, we thought upon hearing the news. 

But here comes the (rather large) downside.

Apple’s own headphones don’t support lossless audio. None of them. That means even if you’ve spent £549 ($549, AU$899) on a pair of AirPods Max, you can’t listen to Apple Music in the highest quality. Miffed? We don’t blame you. 

That’s not the whole story, though, and the AirPods aren’t the only Apple devices unable to take advantage of the new Apple Music features. Let’s take a look at which devices can benefit from hi-res audio, which can’t, and why…

What is Apple Music Lossless?

Essentially, it’s Apple embracing hi-res audio. Apple’s lossless streams use ALAC (Apple Lossless Audio Codec) to offer more detail and information in a recording. 

Apple offers three tiers of higher resolution audio: CD quality (16-bit/44.1kHz), Apple Music Lossless (24-bit/48kHz), and Hi-Res Lossless (up to 24-bit/192kHz). You can choose your quality through the Settings > Music > Audio Quality section of Apple Music.
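
Those bit depths and sample rates translate directly into raw data rates: bits per sample × samples per second × number of channels. A quick back-of-the-envelope calculation (these are uncompressed PCM ceilings; ALAC’s lossless compression brings actual stream sizes down):

```python
# Raw (uncompressed PCM) data rates for Apple Music's three quality tiers:
# bits per sample x samples per second x channels. ALAC compresses losslessly,
# so real-world streams are smaller; these are the ceilings.

tiers = {
    "CD quality":      (16, 44_100),
    "Lossless":        (24, 48_000),
    "Hi-Res Lossless": (24, 192_000),
}
CHANNELS = 2  # stereo

for name, (bits, rate) in tiers.items():
    mbps = bits * rate * CHANNELS / 1_000_000
    print(f"{name:>15}: {mbps:.3f} Mbps raw")

# CD quality:      1.411 Mbps
# Lossless:        2.304 Mbps
# Hi-Res Lossless: 9.216 Mbps
```

That roughly 9Mbps ceiling for Hi-Res Lossless also hints at why Bluetooth, where codecs like AAC operate in the hundreds of kilobits per second, isn’t up to the job.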

As of next month, all of Apple Music’s 75-million-strong catalogue will be available in CD quality or Apple Music Lossless. At launch, 20 million tracks will be accessible in the highest-quality Hi-Res Lossless format, with the whole catalogue following “by the end of 2021”.

Apple Music isn’t the first service to offer lossless streaming, of course. Tidal, Qobuz and Amazon Music HD all offer CD-quality and hi-res listening, while Deezer offers the former. The Spotify HiFi lossless tier is due to offer CD-quality streams later in the year too.

What is Spatial Audio?

Another new addition to Apple Music is Spatial Audio, an Apple technology designed to provide “multidimensional sound and clarity” – in other words, to deliver surround sound and 3D audio via your headphones.

Spatial Audio was initially launched as part of iOS 14 and iPadOS 14, and the newer Spatial Audio with Dolby Atmos feature for Apple Music is coming as part of the iOS 14.6 and iPadOS 14.6 updates that are due in June 2021.

The original Spatial Audio is a slightly different beast to this new Dolby Atmos-powered Spatial Audio for Apple Music, in that it also utilises the sensors in Apple’s own headphones to enable head-tracking. Because the Apple Music implementation of spatial audio is sound-only, there’s no head-tracking involved. 

“Thousands” of Apple Music tracks will be available in Spatial Audio with Dolby Atmos at launch, with more being added regularly.

Which Apple devices work with Apple Music lossless?

The big news is that no model of AirPods will support lossless audio. In the case of the AirPods and AirPods Pro, that’s not exactly surprising, seeing as they’re both completely wireless and Apple only supports the AAC (rather than ALAC) codec over Bluetooth – that’s a step up from MP3, but nowhere near the quality of lossless.

The AirPods Max can be wired to an iPhone, so one might hope that that could be a way to get lossless audio. But they actually only work with analogue audio sources in wired listening mode, which again means no lossless listening.

Apple’s HomePod range of smart speakers also won’t support lossless – that’s both the now-discontinued HomePod and the still-very-much-current HomePod Mini.

Apple’s iPhones (since the iPhone 7) natively support lossless – but only Apple Music Lossless, and not the highest quality Hi-Res Lossless (which delivers up to 24-bit/192kHz). If you want to listen to Apple Music tracks above 24-bit/48kHz on your iPhone, you’ll need to connect an external DAC and use a wired pair of headphones. Check out our guide for how to listen to hi-res audio on an iPhone.

The same is true of the Apple TV and iPad families, which are listed as supporting Apple Lossless, with no mention of Hi-Res Lossless.

Which Apple devices support Spatial Audio?

Apple devices are much better represented when it comes to Spatial Audio through Apple Music. Some – like the AirPods Pro and Max – already support it, after all. And it will come to the standard AirPods 2 in due course.

In fact, it will be available on all AirPods and Beats headphones with an H1 or W1 chip. (That’s the AirPods, AirPods Pro, AirPods Max, BeatsX, Beats Solo3 Wireless, Beats Studio3, Powerbeats3 Wireless, Beats Flex, Powerbeats Pro, and Beats Solo Pro.) But you don’t have to line Apple’s pockets to hear the Dolby Atmos tracks: it will also work on any headphones connected to an iPhone or iPad. You just have to enable it manually.

To do so, head to Settings on your iPhone or iPad and then to Music – once the update has landed next month, a new Dolby Atmos option will be available. This will be set to Automatic by default, which means Dolby Atmos tracks will play correctly when you’re listening via any W1- or H1-enabled pair of Apple or Beats headphones, but not when you’re using third-party headphones. However, if you switch this option to Always On, even non-Apple headphones will play back Dolby Atmos tracks in all their sonic glory.

However, this only applies to Dolby Atmos Spatial Audio tracks on Apple Music, and not Spatial Audio content from other apps like TV. (Remember, Spatial Audio in the TV app is a slightly different beast in that it also utilises the sensors in Apple’s own headphones to enable head-tracking.)

The HomePod and HomePod Mini also support Spatial Audio, so you can fill your room with virtual 3D sound from a single device. As do the iPhone 11 onwards and iPad Pro (but not iPad, iPad Mini or iPad Air).

Playing from an Apple TV 4K into a Dolby Atmos soundbar or system will work too.

MORE:

Check out the best wireless headphones around

3D sound from a soundbar: Best Dolby Atmos soundbars

Try 30 Apple Music tips, tricks and features

Need new music? 10 Apple Music playlists to listen to right now