tens-of-thousands-of-verkada-cameras-were-easily-accessible-to-employees-as-well-as-hackers

Tens of thousands of Verkada cameras were easily accessible to employees as well as hackers

Employees of cloud-based surveillance firm Verkada had widespread access to feeds from customers’ cameras, according to new reports from Bloomberg and The Washington Post.

Verkada’s systems were recently breached by a “hacktivist” collective which gained access to more than 150,000 of the company’s cameras in locations ranging from Tesla factories to police stations, gyms, schools, jails, and hospitals. The group, who call themselves Advanced Persistent Threat 69420, stumbled across login credentials for Verkada’s “Super Admin” accounts online. They publicized their findings, saying they were motivated by “lots of curiosity, fighting for freedom of information and against intellectual property, a huge dose of anti-capitalism, a hint of anarchism — and it’s also just too much fun not to do it.”

Now, anonymous Verkada employees say the same “Super Admin” accounts that the hackers accessed were also widely shared in the company itself. More than 100 employees had Super Admin privileges, reports Bloomberg, meaning that these individuals could browse the live feeds from tens of thousands of cameras around the world at any time. “We literally had 20-year-old interns that had access to over 100,000 cameras and could view all of their feeds globally,” one former senior-level employee told the publication.

Verkada, meanwhile, says access was limited to employees who needed to fix technical problems or address user complaints. “Verkada’s training program and policies for employees are both clear that support staff members were and are required to secure a customer’s explicit permission before accessing that customer’s video feed,” said the Silicon Valley firm in a statement given to Bloomberg.

The Washington Post, though, cites the testimony of surveillance researcher Charles Rollet, who says individuals with close knowledge of the company told him that Verkada employees could access feeds without customers’ knowledge. “People don’t realize what happens on the back-end, and they assume that there are always these super-formal processes when it comes to accessing footage, and that the company will always need to give explicit consent,” said Rollet. “But clearly that’s not always the case.”

Another former employee told Bloomberg that although Verkada’s internal systems asked workers to explain why they were accessing a customer’s camera, this documentation was not taken seriously. “Nobody cared about checking the logs,” said the employee. “You could put whatever you wanted in that note; you could even just enter a single space.”

Verkada’s cameras offer AI-powered analytics, including facial recognition and the ability to search footage for specific individuals.
Image: Verkada

Verkada’s cloud-based cameras were sold to customers in part on the strength of their analytical software. One feature called “People Analytics” let customers “search and filter based on many different attributes, including gender traits, clothing color, and even a person’s face,” said Verkada in a blog post. The same cloud-based systems that gave customers easy access to their cameras’ feeds also enabled the breach.

The hacker collective Advanced Persistent Threat 69420 (the name is a nod to the taxonomy used by cybersecurity companies to catalog state-sponsored hackers combined with the meme numbers 69 and 420) say they wanted to inform the public of the dangers of such ubiquitous surveillance. The breach “exposes just how broadly we’re being surveilled, and how little care is put into at least securing the platforms used to do so, pursuing nothing but profit,” one member of the group told Bloomberg. “It’s just wild how I can just see the things we always knew are happening, but we never got to see.”

i-guess-i-have-to-watch-ads-everywhere-on-my-$1,500-lg-tv-now

I guess I have to watch ads everywhere on my $1,500 LG TV now

This afternoon, I was updating the streaming apps on my 2020 LG CX OLED TV, something I do from time to time, but today was different. Out of nowhere, I saw (and heard) an ad for Ace Hardware start playing in the lower-left corner. It autoplayed with sound without any action on my part.

Now I’m fully aware that it’s not unusual to see ads placed around a TV’s home screen or main menu. LG, Samsung, Roku, Vizio, and others are all in on this game. We live in an era when smart TVs can automatically recognize what you’re watching, and TV makers are building nice ad businesses for themselves with all of the data that gets funneled in.

But this felt pretty egregious even by today’s standards. A random, full-on commercial just popping up in LG’s app store? Is there no escape from this stuff? We’re just going to cram ads into every corner of a TV’s software, huh? Imagine if an autoplay ad started up while you were updating the apps on your smartphone.

The Ace spot wasn’t particularly annoying — it was over in 15 seconds — nor did it feel targeted at me or creepy. It’s really the placement that feels like a step too far.

This stuff can come off as invasive, but it’s also partially what’s steadily brought the prices down on even high-end TVs. I got this 55-inch CX on sale for like $1,400, and it’s pretty much the best TV on the market for next-gen gaming. But even if this beautiful panel came cheaper than it might have without ads plastered in random places, the level of ad infiltration on display here is still disheartening to see. LG recently announced it will be licensing webOS to other TV brands, so maybe the company is trying to see how far it can push things.

I guess I can always cut the TV’s internet connection and stick to a streaming stick or my Xbox Series X if the autoplay commercials keep popping up everywhere. Or maybe I can opt out of a setting somewhere to end the barrage. Some people aren’t bothered by this stuff, but if you are, check out this excellent Reddit thread, which can help you fight back and block some of the domains that these TVs phone home to for their ads. A lot of people on Twitter also pointed me to Pi-hole as a fix.

gpu-test-system-update-march-2021

GPU Test System Update March 2021

Introduction

TechPowerUp is one of the most highly cited graphics card review sources on the web, and we strive to keep our testing methods, game selection, and, most importantly, test bench up to date. Today, I am pleased to announce our newest March 2021 VGA test system, which marks several firsts for TechPowerUp. This is our first graphics card test bed powered by an AMD CPU: the Ryzen 7 5800X, an 8-core processor based on the “Zen 3” architecture. The new test setup fully supports the PCI-Express 4.0 x16 bus interface to maximize performance of the latest generation of graphics cards from both NVIDIA and AMD. The platform also enables PCI-SIG’s Resizable BAR feature, allowing the processor to see the whole video memory as a single addressable block, which could potentially improve performance.

A new test system means completely re-testing every single graphics card used in our performance graphs. It allows us to retire some of the older graphics cards and game tests to make room for newer cards and games. It also lets us refresh our OS and testing tools, update games to their latest versions, and explore new game settings, such as real-time raytracing and newer APIs.

A VGA rebench is a monumental task for TechPowerUp. This time, I’m testing 26 graphics cards in 22 games at 3 resolutions, or 66 game tests per card, which works out to 1,716 benchmark runs in total. In addition, we have doubled our raytracing testing from two to four titles. We also made some changes to our power consumption testing, which is now more detailed and more in-depth than ever.
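For scale, the arithmetic behind those figures is simple enough to sanity-check; here is a quick sketch (the card, game, and resolution counts are the ones quoted above):

```python
# Quick sanity check of the benchmark workload described above.
cards = 26        # graphics cards being re-tested
games = 22        # titles in the updated suite
resolutions = 3   # resolutions tested per game

tests_per_card = games * resolutions      # 66 game tests per card
total_runs = cards * tests_per_card       # 1,716 benchmark runs overall

print(f"{tests_per_card} tests per card, {total_runs:,} runs in total")
```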

In this article, I’ll share some thoughts on what was changed and why, while giving you a first look at the performance numbers obtained on the new test system.

Hardware

Below are the hardware specifications of the new March 2021 VGA test system.

Test System – VGA 2021.1
Processor: AMD Ryzen 7 5800X @ 4.8 GHz (Zen 3, 32 MB Cache)
Motherboard: MSI B550-A Pro (BIOS 7C56vA5 / AGESA 1.2.0.0)
Memory: Thermaltake TOUGHRAM, 16 GB DDR4 @ 4000 MHz 19-23-23-42 1T, Infinity Fabric @ 2000 MHz (1:1)
Cooling: Corsair iCue H100i RGB Pro XT 240 mm AIO
Storage: Crucial MX500 2 TB SSD
Power Supply: Seasonic Prime Ultra Titanium 850 W
Case: darkFlash DLX22
Operating System: Windows 10 Professional 64-bit, Version 20H2 (October 2020 Update)
Drivers: AMD 21.2.3 Beta / NVIDIA 461.72 WHQL


The AMD Ryzen 7 5800X has emerged as the fastest processor we can recommend to gamers for play at any resolution. We could have gone with the 12-core Ryzen 9 5900X or even maxed out this platform with the 16-core 5950X, but neither would be faster at gaming, and both would be significantly more expensive. AMD certainly wants to sell you the more expensive (overpriced?) CPU, but the Ryzen 7 5800X is actually the fastest option because of its single-CCD architecture. Our goal with GPU test systems over the past decade has consistently been to use the fastest mainstream-desktop processor. Over the years, this meant a $300-something Core i7 K-series LGA115x chip making room for the $500 i9-9900K. The 5900X doesn’t sell for anywhere close to that mark, and we’d rather not use an overpriced processor just because we can. You’ll also notice that we skipped upgrading to the 10-core “Comet Lake” Core i9-10900K from the older i9-9900K because we saw only negligible gaming performance gains, especially considering the large overclock on the i9-9900K. The additional two cores do squat for nearly all gaming situations, which is the second reason, besides pricing, that we decided against the Ryzen 9 5900X.

We continue using our trusted Thermaltake TOUGHRAM 16 GB dual-channel memory kit that has served us well for many years. 32 GB isn’t anywhere close to necessary for gaming, and I didn’t want to suggest otherwise, especially to less experienced readers checking out the test system. We’re running at the most desirable memory configuration for Zen 3 to reduce latencies inside the processor: Infinity Fabric at 2000 MHz, memory clocked at DDR4-4000, in 1:1 sync with the Infinity Fabric clock. Timings are at a standard CL19 configuration that’s easily found on affordable memory modules—spending extra for super-tight timings is usually overkill and not worth it for the added performance.
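For readers unfamiliar with Zen 3 memory tuning, here is a minimal sketch of the clock relationship described above (DDR4-4000 and a 2000 MHz Infinity Fabric are the settings used on this bench):

```python
# Zen 3 memory/fabric clock relationship for the configuration above.
ddr_rating_mt_s = 4000            # DDR4-4000: 4000 mega-transfers per second
mclk_mhz = ddr_rating_mt_s / 2    # DDR transfers twice per clock -> 2000 MHz memory clock
fclk_mhz = 2000                   # Infinity Fabric clock set in the BIOS

print(f"FCLK:MCLK = {fclk_mhz / mclk_mhz:.0f}:1")  # 1:1 avoids the desync latency penalty
```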

The MSI B550-A PRO was an easy choice for a motherboard. We wanted a cost-effective board for the Ryzen 7 5800X and don’t care at all about RGB or other bling. The board can handle the CPU and memory settings we wanted for this test bed, and the VRM barely gets warm. It also doesn’t come with any PCIe gymnastics—a simple PCI-Express 4.0 x16 slot wired to the CPU without any lane switches along the way. The slot is metal-reinforced and looks like it can take quite some abuse over time. Even though I admittedly swap cards hundreds of times each year, probably even 1,000+ times, that has never been an issue—insertion force just gets a bit softer over time, which I actually find nice.

Software and Games

  • Windows 10 was updated to 20H2
  • The AMD graphics driver used for all testing is now 21.2.3 Beta
  • All NVIDIA cards use 461.72 WHQL
  • All existing games have been updated to their latest available version

The following titles were removed:

  • Anno 1800: old, not that popular, CPU limited
  • Assassin’s Creed Odyssey: old, DX11, replaced by Assassin’s Creed Valhalla
  • Hitman 2: old, replaced by Hitman 3
  • Project Cars 3: not very popular, DX11
  • Star Wars: Jedi Fallen Order: horrible EA Denuvo makes hardware changes a major pain, DX11 only, Unreal Engine 4, of which we have several other titles
  • Strange Brigade: old, not popular at all

The following titles were added:

  • Assassin’s Creed Valhalla
  • Cyberpunk 2077
  • Hitman 3
  • Star Wars Squadrons
  • Watch Dogs: Legion

I considered Horizon Zero Dawn, but rejected it because it uses the same game engine as Death Stranding. World of Warcraft and Call of Duty won’t be tested because of their always-online nature, which enforces game patches that can mess with performance at any time. Godfall is a bad game, an Epic exclusive, and a commercial flop.

The full list of games now consists of Assassin’s Creed Valhalla, Battlefield V, Borderlands 3, Civilization VI, Control, Cyberpunk 2077, Death Stranding, Detroit Become Human, Devil May Cry 5, Divinity Original Sin 2, DOOM Eternal, F1 2020, Far Cry 5, Gears 5, Hitman 3, Metro Exodus, Red Dead Redemption 2, Sekiro, Shadow of the Tomb Raider, Star Wars Squadrons, The Witcher 3, and Watch Dogs: Legion.

Raytracing

We previously tested raytracing using Metro Exodus and Control. For this round of retesting, I added Cyberpunk 2077 and Watch Dogs Legion. While Cyberpunk 2077 does not support raytracing on AMD, I still felt it’s one of the most important titles to test raytracing with.

While Godfall and DIRT 5 support raytracing, too, neither has had sufficient commercial success to warrant inclusion in the test suite.

Power Consumption Testing

The power consumption testing changes have been live for a couple of reviews already, but I still wanted to detail them a bit more in this article.

After our first Big Navi reviews, I realized that something was off about the power consumption testing method I’d been using for years without issue. It seemed the Radeon RX 6800 XT was just SO much more energy efficient than NVIDIA’s RTX 3080. It definitely is more efficient, thanks to the 7 nm process and AMD’s monumental improvements to the architecture, but the lead just didn’t look right. After further investigation, I realized that the RX 6800 XT was getting CPU-bottlenecked in Metro: Last Light even at the higher resolutions, whereas the NVIDIA card ran without a bottleneck. This of course meant NVIDIA’s card consumed more power in this test because it could run faster.

The problem here is that I used the power consumption numbers from Metro for the “Performance per Watt” results under the assumption that the test loaded the card to the max. The underlying reason for the discrepancy is AMD’s higher DirectX 11 overhead, which only manifested itself enough to make a difference once AMD actually had cards able to compete in the high-end segment.
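To make the distortion concrete, here is a minimal sketch with made-up numbers (purely illustrative, not measured data) of how a CPU bottleneck inflates a performance-per-watt figure:

```python
# Illustrative only: a CPU bottleneck lowers both FPS and power draw,
# which can make a card look more "efficient" than it really is.
def perf_per_watt(fps: float, watts: float) -> float:
    return fps / watts

gpu_bound = perf_per_watt(fps=100, watts=300)  # card fully loaded
cpu_bound = perf_per_watt(fps=90, watts=220)   # card waiting on the CPU, clocks and power drop

print(f"GPU-bound: {gpu_bound:.3f} FPS/W, CPU-bound: {cpu_bound:.3f} FPS/W")
# The CPU-bound run looks roughly 23% more efficient even though nothing about the card changed,
# which is why the efficiency metric now comes from a run that is never CPU limited.
```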

While our previous physical measurement setup was better than what most other reviewers use, I always wanted something with a higher sampling rate, better data recording, and a more flexible analysis pipeline. Previously, we recorded at 12 samples per second, but could only store minimum, maximum, and average. Starting and stopping the measurement process was a manual operation, too.

The new data acquisition system uses professional lab equipment and collects data at 40 samples per second, which is four times faster than even NVIDIA’s PCAT. Every single data point is recorded digitally and stashed away for analysis. Just like before, all our graphics card power measurements are “card only”—not the “whole system” or the “GPU chip only” number displayed in the AMD Radeon Settings control panel.
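As a simple illustration of what recording every sample makes possible (a sketch, not our actual tooling), reducing a card-only power trace captured at 40 samples per second to the usual summary numbers looks something like this:

```python
# Sketch: summarizing a card-only power trace sampled at 40 Hz (one reading every 25 ms).
SAMPLE_RATE_HZ = 40

def summarize(trace_watts):
    return {
        "min_w": min(trace_watts),
        "avg_w": sum(trace_watts) / len(trace_watts),
        "max_w": max(trace_watts),
        "duration_s": len(trace_watts) / SAMPLE_RATE_HZ,
    }

# Made-up readings in watts, not measured data:
print(summarize([212.4, 305.1, 318.7, 322.0, 316.5, 298.2]))
```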

Having all data recorded means we can finally chart power consumption over time, which makes for a nice overview. Below is an example data set for the RTX 3080.

The “Performance per Watt” chart has been simplified to “Energy Efficiency” and is now based on the actual power and FPS achieved during our “Gaming” power consumption testing run (Cyberpunk 2077 at 1440p, see below).

The individual power tests have also been refined:

  • “Idle” testing is now measured at 1440p, whereas it previously used 1080p. This follows the increasing adoption of high-resolution monitors.
  • “Multi-monitor” is now 2560×1440 over DP + 1920×1080 over HDMI—to test how well power management works with mixed resolutions over mixed outputs.
  • “Video Playback” records power usage of a 4K30 FPS video that’s encoded with H.264 AVC at 64 Mbps bitrate—similar enough to most streaming services. I considered using something like madVR to further improve video quality, but rejected it because I felt it to be too niche.
  • “Gaming” power consumption is now using Cyberpunk 2077 at 1440p with Ultra settings—this definitely won’t be CPU bottlenecked. Raytracing is off, and we made sure to heat up the card properly before taking data. This is very important for all GPU benchmarking—in the first seconds, you will get unrealistic boost rates, and the lower temperature has the silicon operating at higher efficiency, which screws with the power consumption numbers.
  • “Maximum” uses Furmark at 1080p, which pushes all cards into its power limiter—another important data point.
  • Somewhat as a bonus (I wasn’t sure how useful it would be), I added another run of Cyberpunk at 1080p, capped to 60 FPS, to simulate a “V-Sync” usage scenario. Running at V-Sync not only removes tearing, but also reduces the power consumption of the graphics card, which is perfect for slower single-player titles where you don’t need the highest FPS and would rather conserve some energy and have less heat dumped into your room. Just to clarify, we’re technically running a 60 FPS soft cap so that weaker cards that can’t hit 60 FPS (GTX 1650S and GTX 1660) won’t drop to 60/30/20 FPS V-Sync steps, but instead go as fast as they can; see the sketch after this list.
  • Last but not least, a “Spikes” measurement was added, which reports the highest 20 ms spike recorded in this whole test sequence. This spike usually appears at the start of Furmark, before the card’s power limiting circuitry can react to the new conditions. On RX 6900 XT, I measured well above 600 W, which can trigger the protections of certain power supplies, resulting in the machine suddenly turning off. This happened to me several times with a different PSU than the Seasonic, so it’s not a theoretical test.
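
Here is the sketch referenced in the V-Sync item above: a minimal, purely illustrative comparison of a 60 FPS soft cap against classic double-buffered V-Sync (the 60/30/20 steps mentioned), not our actual test harness:

```python
import math

# Illustration: why a 60 FPS soft cap differs from hard (double-buffered) V-Sync
# for cards that cannot reach 60 FPS.
def hard_vsync_fps(native_fps: float, refresh_hz: int = 60) -> float:
    # A late frame waits for the next refresh, so output snaps to 60, 30, 20, 15 ...
    if native_fps >= refresh_hz:
        return refresh_hz
    return refresh_hz / math.ceil(refresh_hz / native_fps)

def soft_cap_fps(native_fps: float, cap: int = 60) -> float:
    # A frame-rate cap only limits the maximum; slower cards run as fast as they can.
    return min(native_fps, cap)

for fps in (144, 58, 45):
    print(f"{fps} FPS native -> hard V-Sync {hard_vsync_fps(fps):.0f}, soft cap {soft_cap_fps(fps):.0f}")
# A card managing 58 FPS would fall to 30 FPS under hard V-Sync,
# but keeps its 58 FPS with the soft cap used in this test.
```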

Radeon VII Fail

Since we’re running with Resizable BAR enabled, we also have to boot with UEFI instead of CSM. When it was time to retest the Radeon VII, I got no POST, and it seemed the card was dead. Since there’s plenty of drama around Radeon VII cards suddenly dying, I already started looking for a replacement, but first wanted to give the card another chance in another machine, where it worked perfectly fine. WTF?

After some googling, I found our article detailing the lack of UEFI support on the Radeon VII. So that was the problem: the card simply didn’t have the BIOS update AMD released after our article. Well, FML, the page with the BIOS update no longer exists on AMD’s website.

Really? Someone on their web team made the decision to just delete the pages that contain an important fix to get the product working, a product that’s not even two years old? (launched Feb 7 2019, page was removed no later than Nov 8 2020).

Luckily, I found the updated BIOS in our VGA BIOS collection, and the card is working perfectly now.

Performance results are on the next page. If you have more questions, please do let us know in the comments section of this article.

twitter-tries-to-fix-problematic-image-crops-by-not-cropping-pictures-anymore

Twitter tries to fix problematic image crops by not cropping pictures anymore

Twitter has devised a potential solution to its problematic image cropping issue: no more cropping. The company said on Wednesday it’s now testing a “what you see is what you get” image preview within the tweet compose box and experimenting with displaying full-frame images. That way, images will show up in the Twitter timeline looking just as they did when the user was composing the tweet.

“Now testing on Android and iOS: when you Tweet a single image, how the image appears in the Tweet composer is how it will look on the timeline –– bigger and better,” the company wrote in its announcement tweet on the new feature test. Twitter also says it’s testing new 4K image uploading on Android and iOS as part of a broader push “to improve how you can share and view media on Twitter.”

Now testing on Android and iOS: when you Tweet a single image, how the image appears in the Tweet composer is how it will look on the timeline –– bigger and better. pic.twitter.com/izI5S9VRdX

— Twitter Support (@TwitterSupport) March 10, 2021

With the new image preview change, there should be fewer algorithmic surprises — like the ones several users brought attention to last fall that showed how the company’s automated cropping tool quite often favored white faces over Black ones. In many of those cases, irregularly sized images shared on Twitter were automatically cropped behind the scenes using an AI-powered algorithm, but in ways that raised some troubling questions about how the software prioritized skin color and other factors.

Twitter at the time said the neural network it uses for automated image cropping was tested for racial bias, and the company claims it found none. But it also admitted it needed to perform more analysis and refine its approach to avoid situations like this where even the appearance of bias was a possibility.

“It’s clear that we’ve got more analysis to do. We’ll open source our work so others can review and replicate,” wrote Twitter communications lead Liz Kelley in the aftermath of the controversy going viral. “Just because a system shows no statistical bias, doesn’t mean it won’t cause harm.” Kelley said Twitter would rely “less on auto-cropping so more often the photo you see in the Tweet composer is what it will look like in the Tweet.”

we are going to rely less on auto-cropping so more often the photo you see in the Tweet composer is what it will look like in the Tweet =

— liz kelley (@lizkelley) October 1, 2020

Twitter’s Parag Agrawal, the company’s chief technology officer, later wrote a blog post delving into the issue at length, saying at the time that Twitter would be conducting “additional analysis to add further rigor to our testing” and that it was “committed to sharing our findings and… exploring ways to open-source our analysis so that others can help keep us accountable.”

Now, it looks like Twitter’s proposed solution is here, at least in a test phase. While tweets in standard aspect ratios will be identical when previewed in the compose window and displayed in the timeline, Twitter’s design chief Dantley Davis says extra-wide or tall images will be center cropped for those included in the test. Twitter has not shared a concrete timeline for when this change may be pushed live for all users.
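For context, “center cropped” simply means trimming equal amounts from both sides of the longer dimension until the image fits a maximum aspect ratio. Here is a minimal sketch of the general technique (the 2:1 limit is an arbitrary placeholder, and this is not Twitter’s actual code):

```python
# Illustrative center crop to a maximum aspect ratio; not Twitter's implementation.
def center_crop_box(width: int, height: int, max_aspect: float = 2.0):
    """Return (left, top, right, bottom) of a centered crop whose
    long-side / short-side ratio does not exceed max_aspect."""
    if max(width, height) / min(width, height) <= max_aspect:
        return (0, 0, width, height)              # already within limits, no crop
    if width > height:                            # too wide: trim left and right equally
        new_w = int(height * max_aspect)
        left = (width - new_w) // 2
        return (left, 0, left + new_w, height)
    new_h = int(width * max_aspect)               # too tall: trim top and bottom equally
    top = (height - new_h) // 2
    return (0, top, width, top + new_h)

print(center_crop_box(4000, 1000))  # extra-wide image -> (1000, 0, 3000, 1000), a centered 2000x1000 crop
```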

With this test, we hope to learn if this new approach is better and what changes we need to make to provide a “what you see is what you get” experience for Tweets with images.

— Dantley Davis (@dantley) March 10, 2021

asus’-new-rog-phone-5-ultimate-has-18gb-of-ram-and-a-rear-facing-oled-screen

Asus’ new ROG Phone 5 Ultimate has 18GB of RAM and a rear-facing OLED screen

Asus is going big with its latest gaming phones. The ROG Phone 5 lineup will start shipping this month across the globe, costing 799 euros (around $950) for the base configuration with 8GB of RAM and 256GB of fast UFS 3.1 storage. Every configuration has a 6.78-inch FHD+ OLED screen with a 144Hz refresh rate and a 300Hz touch sampling rate. Also, the headphone jack has made a comeback after being absent from the ROG Phone 3, this time with a quad DAC in tow for hi-res audio. (In case you’re wondering where the ROG Phone 4 went, Asus skipped over the number four, like OnePlus did, due to its similarities with the word “death” in some Asian languages.)

The most notable changes from the last generation are exclusive to some even more expensive configurations: the ROG Phone 5 Pro and Ultimate (which I reviewed), releasing in April for 1,199 euros (approximately $1,420) and in May for 1,299 euros (about $1,583), respectively. Both of these models have double the storage and more RAM (starting at 16GB in the Pro and going all the way up to 18GB in the Ultimate); come in limited edition colors; and have two more ultrasonic touch sensors than the standard model, located near where your ring fingers might rest while holding the phone in landscape mode. You’ll also get a case and a clip-on AeroActive Cooler 5 fan attachment with the purchase of either the Pro or Ultimate phone (the fan adds two more physical buttons).

The new Asus gaming phones aren’t huge departures from their predecessors, though the hardware and software are more refined. I reviewed the ROG Phone 5 Ultimate, and while it delivers on its promises to be a spec and feature juggernaut in some clever ways, paying $1,580 for it seems steep. Even the $950 base configuration isn’t what I’d consider affordable.

The Ultimate and Pro include “ROG Vision,” a feature that pushes premade or custom text or graphics to its rear-facing OLED screen. It’s a spin on Asus’ “Anime Matrix” effect used in the Zephyrus G14 gaming laptop, allowing you to personalize your phone if you want. The standard ROG Phone 5 simply has a backlit ROG logo, which some might find to be just enough pizazz.

I go into all of the features in the review. But if you’re just passing by and want to know all about the specs, I’ve attached a handy table just for you.

Asus ROG Phone 5 lineup specs

Comparison | ROG Phone 5 Ultimate | ROG Phone 5 Pro | ROG Phone 5
Colors | Matte white | Glossy black | Phantom black or Storm white
Price | 1,299 Euros (approx. $1,583) | 1,199 Euros (approx. $1,420) | Starts at 799 Euros (approx. $950)
Processor | Snapdragon 888 | * | *
OS | Android 11 with ROG UI | * | *
Display | 6.78-inch 2448 x 1080 OLED with 144Hz refresh rate | * | *
RAM | 18GB LPDDR5 | 16GB LPDDR5 | 8GB, 12GB or 16GB LPDDR5
Storage | 512GB UFS 3.1 | * | 256GB UFS 3.1
Extra touch sensors | Yes | * | No
Rear-facing cameras | 64-megapixel with F/1.8 aperture, 13-megapixel 125-degree ultra-wide with F/2.4 aperture, and a 5-megapixel macro lens with F/2.0 | * | *
Front-facing camera | 24-megapixel with F/2.45 aperture | * | *
ROG Vision support | Yes, monochromatic | Yes, color | No
Battery | 6,000mAh | * | *
Included charger | 65W | * | *
Dimensions | 172.8 x 77.2 x 10.29 mm | * | *
Weight | 238 grams | * | *
Connectivity | LTE and sub-6GHz 5G on AT&T and T-Mobile, Wi-Fi 6E, Bluetooth 5.2 | * | *
Included accessories | AeroActive Cooler 5, Aero case | * | No cooler included
* represents the same spec as the Ultimate

Photography by Cameron Faulkner / The Verge

apple-iphone-11-vs-iphone-11-pro-vs-iphone-11-pro-max:-which-should-you-buy?

Apple iPhone 11 vs iPhone 11 Pro vs iPhone 11 Pro Max: Which should you buy?

(Pocket-lint) – Apple announced the iPhone 11, 11 Pro and 11 Pro Max in September 2019. They were then succeeded by the iPhone 12 mini, iPhone 12, iPhone 12 Pro and iPhone 12 Pro Max in September 2020.

If you’re in the market for a new iPhone but you don’t want the latest models, you’re in the right place. 

The iPhone 11 sits above the iPhone XR and is still available to buy from Apple alongside the iPhone 12 models. The iPhone 11 Pro and iPhone 11 Pro Max took the place of the iPhone XS and XS Max, but while all are discontinued through Apple, you might be able to get hold of them elsewhere.

Here is how the three 2019 iPhones compare to help you work out which is the right one for you.

You can also read our separate features on how the iPhone 11 compares to the iPhone XR and how the iPhone 11 Pro models compare to the iPhone XS models.

What’s the same across the iPhone 11 series?

  • Processor
  • No 3D Touch
  • Storage options
  • Software

The Apple iPhone 11, iPhone 11 Pro and iPhone 11 Pro Max all run on the same processor, like the iPhone XR, XS and XS Max.

For 2019, that processor was the A13 Bionic chip with a third-generation neural engine (it’s also in the iPhone SE (2020)). The three models also come in the same storage capacities – 64GB, 256GB and 512GB – and none of them offers microSD expansion.

None of the devices have 3D Touch on board, with all opting for Haptic Touch like the iPhone XR and the iPhone 12 models, but all offer True Tone technology and a wide colour gamut. 

All three models also come with an improved front camera compared to their predecessors offering a 12-megapixel lens, next-generation Smart HDR for photos and Portrait Mode, as well as Portrait Lighting. 

Face ID is on board all three models (as you would expect), and it too was improved compared to older models, with more angles supported. All models launched on iOS 13 and now support iOS 14, delivering the same user experience and the same new features.

What’s different between the iPhone 11 series?

Whilst the three iPhone 11s share numerous similarities – including power, software and a similar (though not identical) design – there are a few differences to consider before you make your choice.

Camera capabilities

  • iPhone 11: Dual camera
  • iPhone 11 Pro: Triple camera
  • iPhone 11 Pro Max: Triple camera

One of the main differences between the iPhone 11 series is their camera capabilities. The iPhone 11 comes with a dual camera, while the iPhone 11 Pro models come with a triple rear camera. 

The iPhone 11 has dual 12-megapixel ultra-wide and wide cameras with the ultra-wide lens offering an aperture of f/2.4, while the wide lens has an aperture of f/1.8. The iPhone 11 Pro models have a triple 12-megapixel sensor setup, with the same two lenses as the iPhone 11, along with a telephoto lens offering an aperture of f/2.0. 

All models have Night Mode, Auto Adjustments, Portrait mode with advanced bokeh and Depth Control, Portrait Lighting with six effects and next-generation Smart HDR for photos. The Night Mode is very good, offering much better low light capabilities across all three devices.

The iPhone 11 has 2x optical zoom out, digital zoom up to 5x, while the iPhone 11 Pro models have 2x optical zoom in, 2x optical zoom out and 10x digital zoom. The optical zoom out refers to the ultra-wide-angle lens, allowing you to get more in the shot. Only the Pro models have 2x optical zoom in thanks to the third telephoto lens. The Pro models also have dual optical image stabilisation, while the iPhone 11 has standard optical image stabilisation.

Display

  • iPhone 11: 6.1-inch, LCD, 1792 x 828, True Tone, Haptic Touch, 625nits
  • iPhone 11 Pro: 5.8-inch, OLED, HDR, 2436 x 1125, True Tone, Haptic Touch, 800nits
  • iPhone 11 Pro Max: 6.5-inch, OLED, HDR, 2688 x 1242, True Tone, Haptic Touch, 800nits

Display sizes differ between the three iPhone 11 models, as they did for 2018’s iPhone XR, XS and XS Max, and resolutions differ too, with the iPhone 11 offering a pixel density of 326ppi and the iPhone 11 Pro models offering 458ppi. The 11 Pro models are also brighter. You’ll notice these differences if you’re looking at the iPhone 11 and iPhone 11 Pro models side by side, but otherwise the iPhone 11’s display will be more than sufficient for most users.
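Those pixel-density figures follow directly from each panel’s resolution and diagonal; here is a quick sketch (the diagonals are rounded marketing numbers, so the results land near, not exactly on, Apple’s quoted values):

```python
import math

# Pixel density (PPI) from resolution and diagonal size in inches.
def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1792, 828, 6.1)))    # iPhone 11         -> ~324 (Apple quotes 326)
print(round(ppi(2436, 1125, 5.8)))   # iPhone 11 Pro     -> ~463 (Apple quotes 458)
print(round(ppi(2688, 1242, 6.5)))   # iPhone 11 Pro Max -> ~456 (Apple quotes 458)
```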

The iPhone 11 Pro has a 5.8-inch screen like the iPhone XS, the iPhone 11 has a 6.1-inch screen like the iPhone XR and the iPhone 11 Pro Max has a 6.5-inch display like the iPhone XS Max. 

The Pro models also have OLED displays like their predecessors, allowing for punchier colours and blacker blacks than the iPhone 11 and its LCD screen, but again, this is only really noticeable if you place the devices together. The Pro models do have HDR support though, which the iPhone 11 does not, meaning you’ll see less detail on the standard iPhone when watching HDR-compatible content.

All models have True Tone technology and Haptic Touch.  

Physical footprint

  • iPhone 11 Pro: 144 x 71.4 x 8.1mm, 188g 
  • iPhone 11: 150.9 x 75.7 x 8.3mm, 194g
  • iPhone 11 Pro Max: 158 x 77.8 x 8.1mm, 226g

As with the display sizes, the physical footprint between the three 2019 iPhones differs.

The iPhone 11 Pro is the smallest and lightest, followed by the iPhone 11 and then the iPhone 11 Pro Max. With the frosted glass finish on the iPhone 11 Pro models though, the iPhone 11 Pro Max doesn’t look as big as the iPhone XS Max did. It’s an optical illusion of course, but if you wanted the larger model in 2018 and thought it looked too big, you might find yourself thinking differently here.

The iPhone 11 is a great in-between device in terms of size though.

Design

  • iPhone 11: Dual camera, aluminium frame
  • iPhone 11 Pro models: Triple camera, stainless steel frame

While the design is similar across the three 2019 iPhones, with all offering a notch on the front at the top of the display, there are differences on the rears, as well as material choice.

The iPhone 11 Pro models have a square camera housing with three camera lenses, while the iPhone 11 has a dual camera. All models have an IP68 water and dust resistance rating, but the Pro models can be submerged up to four-metres for 30 minutes, while the iPhone 11 can only be submerged up to two-metres for 30 minutes.

The Pro models also have a textured matte glass and stainless steel design, while the iPhone 11 is made from aluminium and standard glass. In the flesh, the Pro models are really beautiful, especially in the green and gold colour options. They look more premium than the iPhone 11, but this is something you will only notice when they are next to each other. Otherwise, the iPhone 11 is a lovely, solid device in its own right.

Battery capacities

  • iPhone 11: Up to 17-hours, wireless charging
  • iPhone 11 Pro: Up to 18-hours, wireless charging
  • iPhone 11 Pro Max: Up to 20-hours, wireless charging

Batteries were claimed to have improved for the 2019 iPhone models when they first launched, and while Apple doesn’t detail specific capacities, they did improve in our experience. The iPhone 11 is said to last up to 17 hours, while the iPhone 11 Pro is said to last up to 18 hours and the iPhone 11 Pro Max is said to last up to 20 hours.

We were really impressed with the battery life of the 2019 devices during our testing. Both the iPhone 11 and iPhone 11 Pro Max will see you through a day and evening without a problem in our experience.

All three models offer wireless charging but none have reverse wireless charging on board. All three models are also fast-charge capable, but only the Pro models come with 18W fast chargers in the box. 

Colour options

  • iPhone 11: 6 colours
  • iPhone 11 Pro models: 3 colours

Colour options vary between the standard iPhone 11 and the iPhone 11 Pro models.

The iPhone 11 was available in Purple, Yellow, Green, Black, White and Product(RED) when it first arrived. The colours are more muted than they were on the iPhone XR, and lovely as a result.

The iPhone 11 Pro models were available in Midnight Green, Silver, Space Grey and Gold. The Midnight Green and Gold are fabulous and really stand out, especially with the matte rear.

Price

  • iPhone 11: From $699/£729
  • iPhone 11 Pro models: From $999/£1049

Pricing between the iPhone 11 and iPhone 11 Pro models unsurprisingly differs. 

The iPhone 11 started at $699/£729 when it first launched, the iPhone 11 Pro started at $999/£1049 and the iPhone 11 Pro Max started at $1099/£1149. As mentioned, only the iPhone 11 is available through Apple now, and it is cheaper, but you might still find the 11 Pro and 11 Pro Max at a good price elsewhere now they have been succeeded.

Conclusion

The Apple iPhone 11 is the cheaper option of the three 2019 iPhones and it’s great value. For many, it will be the one to buy from this trio of handsets.

The iPhone 11 Pro models offer some great features, specifically camera capabilities, design materials and better displays, but they are also particularly pricey compared to the standard iPhone 11.

The iPhone 11 offers more colours than the Pro models, even if it isn’t as premium in design, and it sits in the middle of the three in terms of size. While it misses out on a couple of features compared to the Pro models, such as the telephoto optical zoom and a punchier display, it still offers a great camera with Night Mode and brilliant results.

Writing by Britta O’Boyle.

mips-starts-risc-v-cpu-development

MIPS Starts RISC-V CPU Development

(Image credit: Panasonic)

Wave Computing and its subsidiary MIPS Technologies, the developer of the MIPS processor architecture, recently emerged from Chapter 11 bankruptcy protection, renamed itself MIPS, and changed its business model. As reported by Electronic Engineering Journal, the new company will focus on the development of RISC-V CPU cores and will abandon further development of its own MIPS architecture.

“Going forward, the restructured business will be known as MIPS, reflecting the company’s strategic focus on the groundbreaking RISC-based processor architectures which were originally developed by MIPS,” a statement by the company reads. “MIPS is developing a new industry-leading standards-based 8th generation architecture, which will be based on the open source RISC-V processor standard.”

MIPS does not get much publicity these days mostly because it is no longer used for game consoles or supercomputers. Yet, the architecture is still among the most popular in the industry. It is widely used for various microcontrollers, consumer electronics SoCs, communication equipment, and a variety of low-power devices. Hundreds of millions of such products are sold every year and for now MIPS cores are good enough for their applications.

It is hard for MIPS to compete against Arm Holdings, which has a wider choice of cores and whose architecture powers more applications than any other nowadays. It is close to impossible for MIPS to catch up with Arm when it comes to performance and industry support, so the new MIPS decided to change its business model.

Previously, MIPS Technologies and later Wave Computing licensed their architecture and cores to processor developers, which put them in direct competition with Arm Holdings, which has the same business model. Since the MIPS architecture belongs to one company, that company is solely responsible for the whole ecosystem (which includes hardware and software), and supporting it alone is hard.

From now on, MIPS will develop a RISC-V-based architecture and appropriate CPU cores that it will license to others. In general, the licensing nature of MIPS’s business will not change, but since RISC-V is an open-standard architecture supported by dozens of companies, MIPS will not have to support the ecosystem alone, which will make its business more sustainable.

MIPS has a lot of expertise in CPU development as well as a broad portfolio of CPU patents and IP. All of these assets will inevitably be used for the upcoming RISC-V-based architecture as well as CPU cores, so it is logical to expect MIPS to be one of the leading RISC-V developers. Whether or not that architecture will be competitive against Arm’s offerings is something that remains to be seen.