
Tens of thousands of Verkada cameras were easily accessible to employees as well as hackers

Employees of cloud-based surveillance firm Verkada had widespread access to feeds from customers’ cameras, according to new reports from Bloomberg and The Washington Post.

Verkada’s systems were recently breached by a “hacktivist” collective which gained access to more than 150,000 of the company’s cameras in locations ranging from Tesla factories to police stations, gyms, schools, jails, and hospitals. The group, who call themselves Advanced Persistent Threat 69420, stumbled across log-in credentials for Verkada’s “Super Admin” accounts online. They publicized their findings, saying they were motivated by “lots of curiosity, fighting for freedom of information and against intellectual property, a huge dose of anti-capitalism, a hint of anarchism — and it’s also just too much fun not to do it.”

Now, anonymous Verkada employees say the same “Super Admin” accounts that the hackers accessed were also widely shared in the company itself. More than 100 employees had Super Admin privileges, reports Bloomberg, meaning that these individuals could browse the live feeds from tens of thousands of cameras around the world at any time. “We literally had 20-year-old interns that had access to over 100,000 cameras and could view all of their feeds globally,” one former senior-level employee told the publication.

Verkada, meanwhile, says access was limited to employees who needed to fix technical problems or address user complaints. “Verkada’s training program and policies for employees are both clear that support staff members were and are required to secure a customer’s explicit permission before accessing that customer’s video feed,” said the Silicon Valley firm in a statement given to Bloomberg.

The Washington Post, though, cites the testimony of surveillance researcher Charles Rollet, who says individuals with close knowledge of the company told him that Verkada employees could access feeds without customers’ knowledge. “People don’t realize what happens on the back-end, and they assume that there are always these super-formal processes when it comes to accessing footage, and that the company will always need to give explicit consent,” said Rollet. “But clearly that’s not always the case.”

Another former employee told Bloomberg that although Verkada’s internal systems asked workers to explain why they were accessing a customer’s camera, this documentation was not taken seriously. “Nobody cared about checking the logs,” said the employee. “You could put whatever you wanted in that note; you could even just enter a single space.”

Verkada’s cameras offer AI-powered analytics, including facial recognition and the ability to search footage for specific individuals.
Image: Verkada

Verkada’s cloud-based cameras were sold to customers in part on the strength of their analytical software. One feature called “People Analytics” let customers “search and filter based on many different attributes, including gender traits, clothing color, and even a person’s face,” said Verkada in a blog post. The same cloud-based systems that give customers easy access to their cameras’ feeds also enabled the breach.

The hacker collective Advanced Persistent Threat 69420 (the name is a nod to the taxonomy used by cybersecurity companies to catalog state-sponsored hackers combined with the meme numbers 69 and 420) say they wanted to inform the public of the dangers of such ubiquitous surveillance. The breach “exposes just how broadly we’re being surveilled, and how little care is put into at least securing the platforms used to do so, pursuing nothing but profit,” one member of the group told Bloomberg. “It’s just wild how I can just see the things we always knew are happening, but we never got to see.”


Apex Legends is out on the Switch, but it’s missing a key feature: cross-progression

Apex Legends is the latest major cross-platform Switch port. After years of availability on Xbox, PlayStation, and PC, EA has finally brought the battle royale shooter to Nintendo’s handheld console, adding a fresh wave of players to the mix and a new, on-the-go option for existing Apex Legends fans. But there’s a glaring issue with the Switch port: right now, there’s no cross-progression, making it effectively a nonstarter for dedicated players.

At launch, the new Nintendo Switch version of Apex Legends offers cross-platform gameplay — meaning that you can play with and against players on Xbox, PlayStation, and PC in addition to other Switch players. But any progress or purchases that players have made on those other platforms won’t carry over. Effectively, Apex Legends players on the Switch are starting from scratch.

Despite the “Season 8” branding that covers Apex Legends, there’s no continuity for players on the Switch version — so much so that players have to replay the tutorial before they’ll actually be able to drop into a full match.

In an interview with Nintendo Life, Chad Grenier (Respawn’s game director for Apex Legends) said that cross-progression is planned for the future, but with the caveat that “we’re a ways out from being able to offer that.”

Grenier explains that a mix of issues is preventing Respawn from offering cross-progression, with contractual, legal, and technical problems that need to be sorted out. “It’s a complex challenge of multiple accounts existing for various users that we have to resolve or merge, there are legal and contractual things to navigate with purchasing on other platforms and having those carryover and also some technical challenges.”

Apex Legends is by no means the first mainstream game to run into this issue. Unfortunately, the lack of cross-progression is more common than not for cross-platform games, both on the Switch and on other platforms.

Overwatch, for example, has been struggling with the lack of cross-platform progression and gameplay for years, despite the emphasis that Blizzard puts on cosmetic content unlocks.

Control has been ported to plenty of platforms since its launch, including Amazon’s Luna, a cloud-based version for the Nintendo Switch, and a next-gen version for the PS5 and Xbox Series X / S. But there’s no crossover for saves between those titles — if you started Control on a PS4, then that’s where your save is stuck forever, even if you want to try streaming it from an internet service or playing with fancier graphics on a next-gen console.

Obviously, there are real technical and legal issues here. Overwatch’s skins are heavily tied to its loot box economy, and those boxes are all purchased through platform-specific stores, which can complicate things. Control’s lack of next-gen saves is tied to updates to the game engine that prevented Remedy from offering continuity for existing players.

But there’s also a wealth of games in 2021 that show that a better way is possible. Fortnite and Rocket League are the gold standard here: simply log into Epic’s free-to-play games on your platform of choice, and all your stuff is there waiting for you. You can play with friends on any platform (well, except iOS), from any platform, with all of your skins, emotes, items, and unlocks.

And even recent Ubisoft games have added cross-play and cross-progression through Ubisoft Connect, letting players start playing sprawling RPGs like Assassin’s Creed Valhalla on one system and continue on another.

In 2021, offering cross-progression and cross-play is increasingly becoming table stakes for major games. With massive titles that can take dozens, if not hundreds of hours of players’ time, locking down progression to a single console or platform just doesn’t make sense.

And that’s doubly true for free-to-play games like Apex Legends, which live or die on the strength and size of their multiplayer community and the money that they can make off selling cosmetic items. When your game is free to download on any platform, it’s critical that the time and money that players invest into getting those digital rewards be consistent across those platforms, because the collection of those items is the main reward structure of those games.

Bungie figured that out a while ago, back when it transitioned Destiny 2 to a free-to-play title — it now allows players to sync their in-game items to whatever platform they’re playing on (even if Bungie is still working out cross-platform gameplay).

The whole point of putting a game like Apex Legends on the Switch is to offer players another avenue to play the game. Sure, it may attract some new players, but for many others it’s a way to spend even more time with a game they already love. But by locking things like hero characters or items that players have painstakingly unlocked through time or money to a single platform, the game is still stuck in an outdated model of game design.

Players have a finite amount of time. And when you pick up your Switch, why would you open Apex Legends to unlock all your old gear again when you could instead make progress on your Fortnite battle pass — a far more substantial unlock that isn’t tethered to a single system?


Photoshop now runs natively on Apple’s M1 Macs

Lightroom was the first Adobe creative app to make the leap to Apple Silicon, and now the much-anticipated release of Photoshop is here. According to the company, Photoshop for M1 Macs completes most tasks 1.5 times faster than when running on Intel. But the speed improvements extend beyond actual editing; Adobe says a lot about Photoshop should now feel faster — including how quickly the app opens up.

Photoshop for Apple Silicon was previously in beta, but now it’s being widely rolled out to Creative Cloud customers with an M1 Mac: those include the MacBook Air, entry-level 13-inch MacBook Pro, and Mac mini. “These great performance improvements are just the beginning, and we will continue to work together with Apple to further optimize performance over time,” Adobe’s Pam Clark wrote in a blog post.

In this case, “just the beginning” also means there are a small number of Photoshop features and tricks that haven’t yet made the move to the Apple Silicon version. According to Clark, these include recent additions like invite to edit cloud documents and preset syncing. “However, the performance gains across the rest of the application were so great we didn’t want to hold back the release for everyone while the team wraps up work on these last few features,” she added, noting that customers can always switch over to using the Intel build of Photoshop (with Rosetta 2) if they urgently need those features.

Adobe is also bringing new features to Photoshop for iPad: cloud documents version history and the ability to work on cloud files while offline. Cloud documents version history lets you revert to an old version of a file dating back as far as 60 days.


Asus ROG Zephyrus G15 review: AMD and Nvidia at their best

The Asus ROG Zephyrus G15 is the gaming laptop to beat.

When I tested Asus ROG’s Zephyrus G14 a year ago, I was blown away. Not only was it just over 3.5 pounds — a weight unheard of for a system with both a powerful processor and a discrete GPU — but it ran even the most demanding games at much better frame rates than any gaming laptop we’d ever seen at that size. And then everything else about it — the keyboard, the touchpad, the audio, the battery life — was also great. The G14 wasn’t just better than other gaming laptops in those areas: it was better than most other laptops at its price point, period.

Given the G14’s resounding success, it was only a matter of time before Asus put it in a 15-inch chassis. The formula wasn’t broken, and Asus didn’t fix it — Asus just made it bigger. While I had some questions when I heard the G15 was on the way (could it deliver the same combination of portability, battery life, and performance as a 14-inch product? Could it do that without costing over $2,000?), what’s become clear throughout my testing period is that the device isn’t just as good as its 14-inch counterpart; it’s somehow even better. Asus and AMD have done it again.

Gaming on Cloud 9.

The G15’s secret weapon is its processor. All models have AMD’s monstrous eight-core Ryzen 9 5900HS. My test model, priced at $1,799.99, pairs that chip with Nvidia’s new GeForce RTX 3070 (an 80W version, with dynamic boost up to 100W), as well as 16GB of RAM and 1TB of storage. This configuration is a step above the base model, which includes an RTX 3060 and 512GB of storage. There are also two RTX 3080 models — one with 16GB of RAM for $1,999.99 and one with 32GB of RAM for $2,499.99. (I think my test model hits a sweet spot: 512GB of storage isn’t a lot for a gaming laptop, and it seems like the RTX 3080 models are fairly low-clocked and don’t perform hugely better than the lower-tier options.)

Asus says the G15 has a “desktop-inspired layout” with separate keys to control the volume, toggle the microphone, and pull up Armoury Crate.

Another highlight, consistent across all models, is the G15’s 165Hz QHD display. We’re finally starting to see 15-inch laptops with QHD screens en masse this year, indicating that this is the first year that manufacturers think mobile hardware is powerful enough to take advantage of them. Traditionally, mobile gamers have had the option of a 1080p display or a 4K display. (Not only is the latter quite expensive, but very few laptops can run demanding games at playable frame rates in 4K.)

You get over 3.5 million pixels from this QHD display.
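
A quick check of that figure, assuming the panel uses the standard QHD resolution of 2560 x 1440 (the exact resolution isn’t restated in the caption):

```latex
2560 \times 1440 = 3{,}686{,}400 \approx 3.7\text{ million pixels}
\qquad \text{vs.} \qquad
1920 \times 1080 = 2{,}073{,}600 \approx 2.1\text{ million for 1080p}
```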

So, the big question: Can the Zephyrus G15 run games at QHD resolution? The answer is an emphatic yes.

Some raw numbers to start. The G15 averaged 178fps on CS:GO at maximum settings — dust particles, fires, and other graphically intensive effects looked just fine. Red Dead Redemption II, also at maximum settings, averaged 58fps. (Come on, that’s basically 60.) Ray tracing was no problem for this machine: the system averaged 61fps on Shadow of the Tomb Raider with ray tracing on ultra, and a whopping 81fps with ray tracing off. Remember, the G15 is running these at QHD resolution, which is already a heavier lift than traditional 1080p.

Those frame rates mean you should be able to run whatever game you want in QHD without bumping down any settings. They put the G15 about on par with MSI’s GS66 Stealth with an Intel Core i7-10870H and a GeForce RTX 3080 Max-Q — the two laptops tied on Red Dead and were just one frame apart on Tomb Raider. MSI told us that the QHD GS66 model costs $2,599 — so the G15 with an RTX 3070 is getting the same frame rates for literally $800 less. The G15 also did better than the QHD / RTX 3070 Intel configuration of the Razer Blade 15 Base (53fps on Red Dead, 46fps on Tomb Raider), which costs $400 more. Those differentials should speak for themselves. Yes, the GS66 has a 240Hz screen, but that’s going to be excessive for most people at QHD resolution. If I didn’t already know where the G14 was priced last year, I would be emailing Asus to check if $1,799.99 was a typo. It’s an unbelievable value.

The games all looked great on this screen, which covers 100 percent of the sRGB gamut and 89 percent of AdobeRGB, and maxes out at 334 nits of brightness. At 165Hz, it isn’t the highest refresh-rate screen you can get — Razer’s Blade 15 Advanced has a 240Hz QHD model, as does MSI’s GS66 Stealth — but it’s still a significant step above the Zephyrus G14’s 120Hz display. While the G15 doesn’t deliver the best picture I’ve ever seen, it still looks great and certainly improves upon the G14’s 1080p panel. Movement was smooth, without a stutter in sight, and colors looked great. I saw a small amount of glare when using the device outdoors, but it was still quite usable at maximum brightness.

Cooling, while sometimes iffy on the G14, is stellar on this device. The G15’s “intelligent cooling” system includes two 84-blade fans and six heat pipes. It had no problem with any of the games I threw at it, spending the vast majority of its time between the mid-60s and mid-70s (Celsius) and never jumping above 80 degrees. That’s some of the best cooling performance I’ve ever seen from a gaming laptop, especially considering that this one was running heavy AAA titles, maxed out, at QHD resolution.

More impressively, the fans managed to do this without being deafeningly loud. I could certainly hear them while the machine was under load, but it was standard gaming-laptop noise, and I had no problem hearing game audio. You can also swap to the “Silent” profile in Asus’ Armoury Crate software. That toggle lived up to its name and completely silenced the fans, without causing any heat or performance problems that I observed.

Speaking of audio, the G15’s speakers also sound great. That’s to be expected — there are literally six of them, including two front-facing tweeters and force-canceling woofers under the palm rests. They deliver clear audio with very strong bass and powerful percussion. I don’t often get to say that about laptop audio, especially on gaming laptops. The G15 comes preloaded with Dolby Access, which you can use to jump between equalizer presets for gaming, movies, and music, and it makes a huge difference.

There are three microphones, which had no trouble picking up my voice. They also have presets for game streaming, music recording, and conference calls. Those are handy, but they’re not enough to make the G15 a good choice for remote work because it doesn’t include a webcam. The G14 also didn’t have a camera — Asus seems to have decided that webcams aren’t necessary on Zephyrus products. It’s the one significant knock against a device that is basically perfect otherwise. It’s also very odd to have such an advanced microphone setup and not have a webcam to go with it.

There are a couple other things to note about the G15’s chassis. Like many other Asus laptops, the G15 has an ErgoLift hinge, which folds under the deck when the laptop is open and lifts the keyboard above the ground. This is supposed to create a more ergonomic typing position, though I can’t say I ever noticed the difference. It does dig into your legs a bit if you’re using the laptop on your lap, though. The G15’s hinge isn’t as sharp as some other hinges, but as a frequent couch user, it’s still not my favorite feeling.

The keyboard and touchpad are both great as well. The G14 had one of my favorite keyboards of 2020, and the G15’s is quite similar. The click is comfortable, with 1.7mm of travel, and the dedicated volume keys (a Zephyrus staple) are quite convenient. There’s a fingerprint sensor built into the power button, which is on the top right of the keyboard deck.

The touchpad is massive, at 5.1 x 3.4 inches — 20 percent larger than that of the prior G15 generation. It’s so big that large portions of both my hands were resting on it when they weren’t typing, rather than on the palm rests. This was a bit annoying, but to the G15’s credit, it didn’t cause any palm-rejection issues. It’s also a bit loud and not the easiest or deepest click, but those are nitpicks — it’s a fine touchpad.

But what impressed me the most about the G15 is its battery life. This thing never dies. Using it as my daily driver with an office workload on Asus’ Silent profile at around 200 nits of brightness, I averaged eight hours and 32 minutes. That’s just under what I got from the G14, and the G15 has a larger and higher-resolution screen to fuel. The result puts the G15 right up there with its smaller sibling as one of the longest-lasting gaming laptops we’ve ever seen. It has a large 90Wh battery inside, but plenty of gaming rigs with comparable bricks can only make it a few hours on a charge.

Gaming significantly shortens the G15’s life span, of course. I got an hour and 21 minutes of Red Dead out of one charge. Impressively, though, the game was quite playable for much of that time, avoiding stutters and performance issues. The game didn’t drop below playable rates until the G15 was down to 10 percent with six minutes remaining. The 200-watt charger also juices the G15 decently fast — during very light Chrome use, it got the device up to 60 percent in 37 minutes. If you don’t want to carry that heavy brick around and aren’t doing GPU-intensive tasks, the G15 also supports 100W Type-C charging.

G15s are available in eclipse gray (like this model) and moonlight white.

At the end of the day, there are things I can nitpick about this device. In particular, the lack of a webcam is egregious. And there are reasons it won’t be for everyone. Folks who are looking for a higher refresh-rate screen may prefer to spend more on a Blade 15 Advanced or a GS66. Those who want a jazzier design may find Asus’ Strix Scar 15 a better fit. And while $1,799 is a great value for these specs, anyone on a tighter budget has options like Lenovo’s Legion 5 on the table.

But almost everything about this laptop is fantastic. And not only is it fantastic, but it’s fantastic for several hundreds of dollars less than its QHD competitors. If you are willing to use an external webcam and you don’t need a 240Hz screen, there’s really no reason you should be buying any other QHD laptop in the thin 15-inch class. The G15 is superior on battery life, superior on power, superior on weight, and superior on price. It’s just the best.


Moto G10 review: No longer the default budget choice?

(Pocket-lint) – It seems kinda mad that we’ve arrived here, but the Moto G is now up to number 10. It’s no surprise though, as the G series is Motorola’s most successful range and it has consistently delivered great-value, simple and reliable phones.

But for 2021, the numbering and naming system has changed – the lower the number, the lower down it sits in the ranks. Therefore the G10 is the entry-level affordable phone in a series that’s long looked a bit crowded.

That causes a bit of a self-inflicted issue for the Moto G10, however, as it’s no longer the default choice in the range. Why? Because for a little extra money the Moto G30 also exists.

Design

  • Dimensions: 165.2 x 75.7 x 9.2mm / Weight: 200g
  • Finishes: Aurora Gray, Iridescent Pearl
  • Rear positioned fingerprint scanner
  • Glass front, ribbed plastic back
  • 3.5mm headphone port
  • Single loudspeaker
  • microSD expansion

Moto G design has never been all that fancy or premium, which makes sense for a budget phone. Some corners need cutting to get it down to the right price. For this generation, Motorola has gone with something of an unusual finish: a ribbed back panel (it’s still better-looking than the G30’s odd colour choices though).

That wave pattern you see isn’t just a visual thing; it has texture too. It’s a little weird to begin with, but the texture has its merits. It definitely makes the phone feel less likely to slip out of your hand, and you’ll never find it randomly slipping off a surface like a completely glossy glass back might.

That’s not the only practical decision made here either. Unlike some more expensive phones, the Moto G10 is equipped with everything you could need. That means you get a 3.5mm headphone port at the top for plugging in your hands-free buds, or wired headphones.

There’s also a microSD card slot for expanding the storage. You might find that useful if you like to keep a physical copy of all your own media offline. And if you have the 64GB phone, you may just find you fill up the internal storage quite quickly.

So what else is there? Well, you’ll find three buttons up the right side. One is the usual power button, and there’s the volume rocker switch, but then curiously there’s also an additional button which – when pressed – will launch Google Assistant. Which is fine, but we can’t imagine it’s used by most people all that much. 

As for that fingerprint sensor on the back, usually we laud the appearance of physical scanners because they’re fast and reliable, but that’s not the case with this one. Most times it would take two or three goes before a successful scan, meaning it was often quicker just to type in the multi-digit PIN instead. 

The G10’s front is pretty standard too, with its relatively skinny bezel up the sides and the dewdrop-style notch at the top of the display, barely cutting into the available screen real estate. And while there’s only one loudspeaker, placed on the bottom edge, the speaker grille is long enough that we didn’t find it all that easy to completely block, meaning you can hear it whether you hold the phone in portrait or landscape.

Display

  • 6.5-inch IPS LCD display
  • 720 x 1600 resolution
  • 269 pixels per inch
  • 60Hz refresh rate
  • Android 11 

On to that display and – as with most affordable phones – this one uses a long aspect ratio HD+ resolution panel. That means, specifically, it’s IPS LCD and has 720 x 1600 pixels spread across that 6.5-inch diagonal.


Obviously that means it’s not super sharp, but it’s adequate for daily use and won’t leave you squinting. In fact, it’s pleasant enough when inside and watching movies, gaming and browsing the web. It’s not the most vivid panel around though – its dynamic range does suffer, but that’s almost to be expected from an LCD screen on a cheap smartphone such as this. 

The one place we did notice it struggle the most was outside in daylight. Trying to frame shots with the camera to shoot in sunlight was difficult. We could barely see what was on the screen, even with the brightness cranked right up. 

Performance and battery

  • Snapdragon 460 processor, 4GB RAM
  • 64GB or 128GB storage
  • 5000mAh battery

If what you’re after in a phone is really solid battery life, we’re happy to report the G10 delivers that – by the bucketload. Even in a phone with a high-end flagship processor and a top-of-the-line display, a 5,000mAh battery would be generous. So stick it in a phone with a low-power chip and only an HD resolution panel, and you get one of the longest-lasting phones on the market.


In testing we’d often get to the end of a second day and still have some juice left over, even after using it for testing the camera and playing a couple of hours of games each day. For most people we think this is a genuine two-day phone. You’ll never have to worry about it dying during the day if you’ve taken it off charge in the morning. It’s pretty epic. 

Moto takes care of battery life over the long term too. It has a couple of different tools in the battery settings designed to get the most out of the battery for as long as you own the phone.

Optimised charging learns your usual charging pattern and uses it to predict when you’ll need the battery to be fully charged. So if your alarm goes off at 7am, it’ll charge up to 80 per cent and hold there until it needs to add the final 20 per cent, in time for you to wake up.

There’s also overcharge protection. So if you’re a really light user and have a habit of just leaving your phone plugged in constantly for days at a time, it will limit the charge to 80 per cent once your phone has been plugged in continuously for three days.
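
Motorola doesn’t publish the exact logic, but here’s a rough sketch of how the two features described above plausibly behave. The 80 per cent hold level and the three-day threshold are the figures from this review; the top-up lead time, function names and everything else are assumptions for illustration only.

```python
from datetime import datetime, timedelta

HOLD_LEVEL = 80                      # per cent held before the final top-up (from the review)
TOPUP_LEAD = timedelta(minutes=45)   # assumed head start for the last 20 per cent
OVERCHARGE_DAYS = 3                  # continuous days on the charger before capping (from the review)

def optimised_charging(now: datetime, predicted_unplug: datetime, level: int) -> str:
    """Charge to 80 per cent, hold, then finish shortly before the predicted unplug time."""
    if level < HOLD_LEVEL:
        return "charge"
    if now >= predicted_unplug - TOPUP_LEAD:
        return "charge"              # final 20 per cent, in time for the alarm
    return "hold"

def overcharge_protection(plugged_in_since: datetime, now: datetime, level: int) -> str:
    """Cap the battery at 80 per cent once it has been plugged in for roughly three days straight."""
    if now - plugged_in_since >= timedelta(days=OVERCHARGE_DAYS) and level >= HOLD_LEVEL:
        return "hold"
    return "charge"

# Example: 2am, phone already at 80 per cent, alarm (predicted unplug) at 7am -> hold
print(optimised_charging(datetime(2021, 3, 11, 2), datetime(2021, 3, 11, 7), 80))
```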


Being 5,000mAh does mean charging times are a little slow, especially with the charging speeds maxing out at 10W. So it’s definitely one to plug in at night while you sleep. Thankfully, you’ll probably only have to do it once every other night. 

As for general performance, this is where the G10 slips up against its slightly more expensive sibling, the G30. The Snapdragon 400 series processor inside isn’t unusable by any means, but it does feel quite slow and laggy a lot of the time. Loading web pages, or backing up photos to Google Photos, seems to take longer than it should, while animations in the general interface appear quite stuttery.

In fact, Google Photos did – on a couple of occasions – just hang and crash, and then failed to upload our photos to the cloud. On a similar note, there were a couple of occasions where a chosen game would just freeze and crash too. It wasn’t just Google Photos getting up to these shenanigans.

The G30 just seems more reliable day-to-day in that regard, which is why we’d recommend that over this phone. It’s not that the G30 is super smooth and fast all the time, it just didn’t leave us hanging as much. Still, for most tasks, the G10 is fine, if unremarkable.



As for software, that’s the usual Moto style of having an almost Google Android stock experience with a couple of added extras from Moto. That means all your default apps are Google’s, and you get fun gestures like swiping down on the fingerprint sensor to get your notifications, or a chopping motion to switch on the flashlight. 

Camera  

  • Quad camera system:
    • Main (26mm focal length): 48-megapixel, f/1.7 aperture, 0.8µm pixel size, phase-detection autofocus
    • Ultra-wide (13mm): 8MP, f/2.2, 1.12µm
    • Macro: 2MP, f/2.4
    • Depth: 2MP, f/2.4
  • 8-megapixel front camera

As for camera quality, the quad system is led by a 48-megapixel primary camera, which is joined by an 8MP ultra-wide and a pair of low-resolution depth and macro sensors.

Stick to the main sensor and you’ll be mostly fine. In good daylight pictures will be sharp, colourful and feature decent depth. It’s not flagship level, naturally, but it’s good enough for social media use. 

The ultra-wide is just OK. It often struggles to focus, though, and frequently leaves colours looking unnatural – quite different from the main sensor’s output.

The macro lens can be useful for close-ups at times, but results are not consistent, and being a low resolution sensor means details aren’t that great either. 

So the G10 is yet another case of a budget phone having more cameras than it knows what to do with. Ignore the depth, macro and wide-angle and you’ve got a solid main camera – but that hardly lives up to the “quad camera” billing, does it?

Verdict

The G10 might be the first entry-level Moto G we don’t unequivocally recommend as an easy purchase. There’s nothing wrong with it, per se – indeed, the battery life, software and practical design make it more than good enough for most people – but there’s the Moto G30 to consider.

Our experience with the G30 was just better, especially when it comes down to daily performance, so if you can afford the little extra then we’d recommend opting for that one.

With all that said, the Moto G10 offers great battery life, so if you don’t need anything too taxing then it’s still a decent option considering its asking price.

Also consider


Moto G30


If you have the ability to stump up a little more cash, the G30 is the more sensible choice in Moto’s new G-series range. It has a smoother overall experience and is still great value for money. 

  • Read the review

Redmi Note 10 Pro


Few phones at this price point are as accomplished as the Redmi Note 10 Pro. It’s more expensive than the G10, but it’s more than worth it, if you can cope with inferior software.

  • Read the review

Writing by Cam Bunton. Editing by Mike Lowe.


Supermicro 1023US-TR4 Review: Powerful Performance in a Slim 1U Package


Supermicro’s 1023US-TR4 is a slim 1U dual-socket server designed for high-density compute environments in high-end cloud computing, virtualization, and enterprise applications. With support for AMD’s EPYC 7001 and 7002 processors, this high-end server packs up to two 64-core EPYC Rome processors, allowing it to cram 128 cores and 256 threads into one slim chassis.

We’re on the cusp of Intel’s Ice Lake and AMD’s EPYC Milan launches, which promise to reignite the fierce competition between the long-time x86 rivals. In preparation for the new launches, we’ve been working on a new set of benchmarks for our server testing, and that’s given us a pretty good look at the state of the server market as it stands today. 

We used the Supermicro 1023US-TR4 server for EPYC Rome testing, and we’ll focus on examining the platform in this article. Naturally, we’ll add in Ice Lake and EPYC Milan testing as soon as those chips are available. In the meantime, here’s a look at some of our new benchmarks and the current state of the data center CPU performance hierarchy in several hotly-contested price ranges. 

Inside the Supermicro 1023US-TR4 Server


The Supermicro 1023US-TR4 server comes in the slim 1U form factor. And despite its slim stature, it can host an incredible amount of compute horsepower under the hood. The server supports AMD’s EPYC 7001 and 7002 series chips, with the latter series topping out at 64 cores apiece, which translates to 128 cores and 256 threads spread across the dual sockets.

Support for the 7002 series chips requires a 2.x board revision, and the server can accommodate CPU cTDPs up to 280W. That means it can handle the beefiest of EPYC chips, which currently comes in the form of the 64-core EPYC 7H12 with its 280W TDP.

The server has a tool-less rail mounting system that eases installation into server racks and the CSE-819UTS-R1K02P-T chassis measures 1.7 x 17.2 x 29 inches, ensuring broad compatibility with standard 19-inch server racks. 

The front panel comes with standard indicator lights, like a unit identification (UID) light that helps with locating the server in a rack, along with drive activity, power, status light (to indicate fan failures or system overheating), and two LAN activity LEDs. Power and reset buttons are also present at the upper right of the front panel.

By default, the system comes with four tool-less 3.5-inch hot-swap SATA 3 drive bays, but you can configure the server to accept four NVMe drives on the front panel, and an additional two M.2 drives internally. You can also add an optional SAS card to enable support for SAS storage devices. The front of the system also houses a slide-out service/asset tag identifier card to the upper left. 


Popping the top off the chassis reveals two shrouds that direct air from the two rows of hot-swappable fans. A total of eight fan housings feed air to the system, and each housing includes two counter-rotating 4cm fans for maximum static pressure and reduced vibration. As expected with servers intended for 24/7 operation, the system can continue to function in the event of a fan failure. However, the remainder of the fans will automatically run at full speed if the system detects a failure. Naturally, these fans are loud, but that’s not a concern for a server environment.  

Two fan housings are assigned to cool each CPU, and a simple black plastic shroud directs air to the heatsinks underneath. Dual SP3 sockets house both processors, and they’re covered by standard heatsinks that are optimized for linear airflow. 

A total of 16 memory slots flank each processor, for a total of 32 memory slots that support up to 4TB of registered ECC DDR4-2666 with EPYC 7001 processors, or an incredible 8TB of ECC DDR4-3200 memory (via 256GB DIMMs) with the 7002 models, easily outstripping the memory capacity available with competing Intel platforms.
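
Those ceilings follow directly from the slot count and DIMM size (the 256GB DIMMs are stated above; the 128GB DIMM size behind the 7001-series limit is an inference):

```latex
32 \times 128\,\text{GB} = 4096\,\text{GB} = 4\,\text{TB}
\qquad
32 \times 256\,\text{GB} = 8192\,\text{GB} = 8\,\text{TB}
```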

We tested the EPYC processors with 16x 32GB DDR4-3200 Samsung modules for a total memory capacity of 512GB. In contrast, we loaded down the Xeon comparison platform with 12x 32GB SK hynix DDR4-2933 modules, for a total capacity of 384GB of memory.

The H11DSU-iN motherboard’s expansion slots consist of two full-height 9.5-inch PCIe 3.0 slots and one low-profile PCIe 3.0 x8 slot, all mounted on riser cards. An additional internal PCIe 3.0 x8 slot is also available, but this slot only accepts proprietary Supermicro RAID cards. All told, the system exposes a total of 64 lanes (16 via NVMe storage devices) to the user. 

As one would imagine, Supermicro has other server offerings that expose more of EPYC’s available 128 lanes to the user and also come with the faster PCIe 4.0 interface.


The rear I/O panel includes four gigabit RJ45 LAN ports powered by an Intel i350-AM4 controller, along with a dedicated IPMI port for management. Here we find the only USB ports on the machine, which come in the form of two USB 3.0 headers, along with a COM and VGA port. 

Two 1000W Titanium-Level (96%+) redundant power supplies provide power to the server, with automatic failover in the event of a failure, as well as hot-swappability for easy servicing.

The BIOS is easy to access and use, while the IPMI web interface provides a wealth of monitoring capabilities and easy remote management that matches the type of functionality available with Xeon platforms. Among many options, you can update the BIOS, use the KVM-over-LAN remote console, monitor power consumption, access health event logs, monitor and adjust fan speeds, and monitor the CPU, DIMM, and chipset temperatures and voltages. Supermicro’s remote management suite is polished and easy to use, which stands in contrast to other platforms we’ve tested. 

Test Setup

Processor | Cores / Threads | 1K Unit Price | Base / Boost (GHz) | L3 Cache (MB) | TDP (W)
AMD EPYC 7742 | 64 / 128 | $6,950 | 2.25 / 3.4 | 256 | 225
Intel Xeon Platinum 8280 | 28 / 56 | $10,009 | 2.7 / 4.0 | 38.5 | 205
Intel Xeon Gold 6258R | 28 / 56 | $3,651 | 2.7 / 4.0 | 38.5 | 205
AMD EPYC 7F72 | 24 / 48 | $2,450 | 3.2 / ~3.7 | 192 | 240
Intel Xeon Gold 5220R | 24 / 48 | $1,555 | 2.2 / 4.0 | 35.75 | 150
AMD EPYC 7F52 | 16 / 32 | $3,100 | 3.5 / ~3.9 | 256 | 240
Intel Xeon Gold 6226R | 16 / 32 | $1,300 | 2.9 / 3.9 | 22 | 150
Intel Xeon Gold 5218 | 16 / 32 | $1,280 | 2.3 / 3.9 | 22 | 125
AMD EPYC 7F32 | 8 / 16 | $2,100 | 3.7 / ~3.9 | 128 | 180
Intel Xeon Gold 6250 | 8 / 16 | $3,400 | 3.9 / 4.5 | 35.75 | 185

Here we can see the selection of processors we’ve tested for this review, though we use the Xeon Platinum 8280 as a stand-in for the less expensive Xeon Gold 6258R. These two chips are identical and provide the same level of performance, with the difference boiling down to the more expensive 8280 coming with support for quad-socket servers, while the Xeon Gold 6258R tops out at dual-socket support.

Server | Memory | Tested Processors
Supermicro AS-1023US-TR4 | 16x 32GB Samsung ECC DDR4-3200 | EPYC 7742, 7F72, 7F52, 7F32
Dell/EMC PowerEdge R460 | 12x 32GB SK hynix DDR4-2933 | Intel Xeon 8280, 6258R, 5220R, 6226R, 6250

To assess performance with a range of different potential configurations, we used the Supermicro 1023US-TR4 server with four different EPYC Rome configurations. We outfitted this server with 16x 32GB Samsung ECC DDR4-3200 memory modules, ensuring that both chips had all eight memory channels populated.

We used a Dell/EMC PowerEdge R460 server to test the Xeon processors in our test group, giving us a good sense of performance with competing Intel systems. We equipped this server with 12x 32GB SK hynix DDR4-2933 modules, again ensuring that each Xeon chip’s six memory channels were populated. These configurations give the AMD-powered platform a memory capacity advantage, but that comes as an unavoidable side effect of the capabilities of each platform. As such, bear in mind that memory capacity disparities may impact the results below.

We used the Phoronix Test Suite for testing. This automated test suite simplifies running complex benchmarks in the Linux environment. The test suite is maintained by Phoronix, and it installs all needed dependencies and the test library includes 450 benchmarks and 100 test suites (and counting). Phoronix also maintains openbenchmarking.org, which is an online repository for uploading test results into a centralized database. We used Ubuntu 20.04 LTS and the default Phoronix test configurations with the GCC compiler for all tests below. We also tested both platforms with all available security mitigations. 
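
For readers who want to reproduce this style of testing, the suite is driven from the command line; the sketch below simply wraps two representative test profiles in a Python loop. The profile names and the `benchmark` subcommand reflect standard Phoronix Test Suite usage, but treat this as an illustration rather than our exact harness or configuration.

```python
import subprocess

# Profiles named here are illustrative; 'pts/build-linux-kernel' and
# 'pts/compress-7zip' are standard Phoronix Test Suite profiles, but the
# exact set and settings used for this review differ.
TESTS = ["pts/build-linux-kernel", "pts/compress-7zip"]

for test in TESTS:
    # 'phoronix-test-suite benchmark <profile>' resolves dependencies,
    # runs the test, and offers to upload results to openbenchmarking.org.
    subprocess.run(["phoronix-test-suite", "benchmark", test], check=True)
```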

Linux Kernel and LLVM Compilation Benchmarks


We used the 1023US-TR4 for testing with all of the EPYC processors in the chart, and here we see the expected scaling in the timed Linux kernel compile test with the AMD EPYC processors taking the lead over the Xeon chips at any given core count. The dual EPYC 7742 processors complete the benchmark, which builds the Linux kernel at default settings, in 21 seconds. The dual 24-core EPYC 7F72 configuration is impressive in its own right — it chewed through the test in 25 seconds, edging past the dual-processor Xeon 8280 platform. 

AMD’s EPYC delivers even stronger performance in the timed LLVM compilation benchmark — the dual 24-core 7F72s even beat the dual 28-core 8280s. Performance scaling is somewhat muted between the flagship 64-core 7742 and the 24-core 7F72, largely due to the strength of the latter’s much higher base and boost frequencies. That impressive performance comes at the cost of a 240W TDP rating, but the Supermicro server handles the increased thermal output easily.

Molecular Dynamics and Parallel Compute Benchmarks


NAMD is a parallel molecular dynamics code designed to scale well with additional compute resources; it scales up to 500,000 cores and is one of the premier benchmarks used to quantify performance with simulation code. The EPYC processors are obviously well-suited for these types of highly-parallelized workloads due to their prodigious core counts, with the dual 7742 configuration completing the workload 28% faster than the dual Xeon 8280 setup. 

Stockfish is a chess engine designed for the utmost in scalability across increased core counts — it can scale up to 512 threads. Here we can see that this massively parallel code scales well with EPYC’s leading core counts. But, as evidenced by the dual 24-core 7F72s effectively tying the 28-core Xeon 8280s, the benchmark also responds well to the EPYC chips’ strong per-core performance. The dual 16-core 7F52 configuration also beat out both of the 16-core Intel comparables. Intel does pull off a win as the eight-core 6250 processors beat the 7F32s, though.

We see similarly impressive performance in other molecular dynamics workloads, like the Gromacs water benchmark that simulates Newtonian equations of motion with hundreds of millions of particles and the NAS Parallel Benchmarks (NPB) suite. NPB characterizes Computational Fluid Dynamics (CFD) applications, and NASA designed it to measure performance from smaller CFD applications up to “embarrassingly parallel” operations. The BT.C test measures Block Tri-Diagonal solver performance, while the LU.C test measures performance with a lower-upper Gauss-Seidel solver. 

Regardless of the workload, the EPYC processors deliver a brutal level of performance in highly-parallelized applications, and the Supermicro server handled the heat output without issue. 

Rendering Benchmarks


Turning to more standard fare, provided you can keep the cores fed with data, most modern rendering applications also take full advantage of the compute resources. Given the well-known strengths of EPYC’s core-heavy approach, it isn’t surprising to see the 64-core EPYC 7742 processors carve out a commanding lead in the C-Ray and Blender benchmarks. Still, it is impressive to see the 7Fx2 models beat the competing Xeon processors with similar core counts nearly across the board. 

The performance picture changes somewhat with the Embree benchmarks, which test high-performance ray tracing libraries developed at Intel Labs. Naturally, the Xeon processors take the lead in the Asian Dragon renders, but the crown renders show that AMD’s EPYC can offer leading performance even with code that is heavily optimized for Xeon processors. 

Encoding Benchmarks


Encoders tend to present a different type of challenge: As we can see with the VP9 libvpx benchmark, they often don’t scale well with increased core counts. Instead, they often benefit from per-core performance and other factors, like cache capacity. 

However, newer encoders, like Intel’s SVT-AV1, are designed to leverage multi-threading more fully to extract faster performance for live encoding/transcoding video applications. Again, we can see the impact of EPYC’s increased core counts paired with its strong per-core performance as the EPYC 7742 and 7F72 post impressive wins. 

Python and Sysbench Benchmarks


The Pybench and Numpy benchmarks are used as a general litmus test of Python performance, and as we can see, these tests don’t scale well with increased core counts. That allows the Xeon 6250, which has the highest boost frequency of the test pool at 4.5 GHz, to take the lead. 
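
For a sense of what “doesn’t scale with increased core counts” means here, the kind of work these suites time is a serial stream of Python and NumPy calls. A minimal, hypothetical stand-in (not a test from either suite):

```python
import time
import numpy as np

# A small, largely single-threaded workload: element-wise NumPy math doesn't
# fan out across cores, so per-core clock speed matters far more than core
# count -- which is why the 4.5 GHz Xeon 6250 leads this section.
a = np.random.rand(1_000_000)

start = time.perf_counter()
for _ in range(200):
    (np.sin(a) * np.cos(a)).sum()
print(f"elapsed: {time.perf_counter() - start:.2f} s")
```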

Compression and Security


Compression workloads also come in many flavors. The 7-Zip (p7zip) benchmark exposes the heights of theoretical compression performance because it runs directly from main memory, allowing both memory throughput and core counts to impact performance heavily. As we can see, this benefits the EPYC 7742 tremendously, but it is noteworthy that the 28-core Xeon 8280 offers far more performance than the 24-core 7F72 if we normalize throughput based on core counts. In contrast, the gzip benchmark, which compresses two copies of the Linux 4.13 kernel source tree, responds well to speedy clock rates, giving the eight-core Xeon 6250 the lead due to its 4.5 GHz boost clock. 
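
The gzip half of that comparison is easy to reproduce in miniature with Python’s zlib module, which uses the same single-threaded DEFLATE algorithm; this hypothetical timing is not the benchmark’s code, but it illustrates why a high boost clock wins here.

```python
import time
import zlib

# Compressible, source-code-like text; the real benchmark compresses two
# copies of the Linux 4.13 kernel source tree, but any large text body shows
# the same behaviour: DEFLATE runs on a single thread, so clock speed decides it.
data = b"static int example_function(struct device *dev);\n" * 1_000_000

start = time.perf_counter()
compressed = zlib.compress(data, level=6)
elapsed = time.perf_counter() - start
print(f"{len(data) / elapsed / 1e6:.0f} MB/s, ratio {len(compressed) / len(data):.3f}")
```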

The OpenSSL benchmark uses the open-source toolkit behind the SSL and TLS protocols to measure RSA 4096-bit performance. As we can see, this test favors the EPYC processors due to its parallelized nature, but offloading this type of workload to dedicated accelerators is becoming more common for environments with heavy requirements.
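
A minimal sketch of the same style of measurement — RSA-4096 signing throughput — written against Python’s third-party cryptography package rather than the OpenSSL command line. The loop count, message, and single-core framing are illustrative assumptions, not the benchmark’s own code; the real test runs one such loop per thread, which is what lets the high-core-count EPYC parts pull ahead.

```python
import time
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# RSA-4096 signing is the expensive private-key operation being measured.
key = rsa.generate_private_key(public_exponent=65537, key_size=4096)
message = b"benchmark payload"

N = 200
start = time.perf_counter()
for _ in range(N):
    key.sign(message, padding.PKCS1v15(), hashes.SHA256())
elapsed = time.perf_counter() - start
print(f"{N / elapsed:.1f} RSA-4096 signs/sec on one core")
```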

SPEC CPU 2017 Estimated Scores


We used the GCC compiler and the default Phoronix test settings for these SPEC CPU 2017 test results. SPEC results are highly contested and can be impacted heavily by various compilers and flags, so we’re sticking with a bog-standard configuration to provide as level a playing field as possible. It’s noteworthy that these results haven’t been submitted to the SPEC committee for verification, so they aren’t official. Instead, view the above tests as estimates based on our testing.

The multi-threaded portion of the SPEC CPU 2017 suite is of most interest for the purpose of our tests, which is to gauge the ability of the Supermicro platform to handle heavy extended loads. As expected, the EPYC processors post commanding leads in both the intrate and fprate subtests, and close monitoring of the platform didn’t find any thermal throttling during these extended-duration tests. The Xeon 6250 and 8280 processors take the lead in the single-threaded intrate tests, while the AMD EPYC processors post impressively strong single-core measurements in the fprate tests.

Conclusion

AMD has enjoyed a slow but steadily-increasing portion of the data center market, and much of its continued growth hinges on increasing adoption beyond hyperscale cloud providers to more standard enterprise applications. That requires a dual-pronged approach of not only offering a tangible performance advantage, particularly in workloads that are sensitive to per-core performance, but also having an ecosystem of fully-validated OEM platforms readily available on the market. 

The Supermicro 1023US-TR4 server slots into AMD’s expanding constellation of OEM EPYC systems and also allows discerning customers to upgrade from the standard 7002 series processors to the high-frequency H- and F-series models as well. It also supports up to 8TB of ECC memory, which is an incredible amount of available capacity for memory-intensive workloads. Notably, the system comes with the PCIe 3.0 interface while the second-gen EPYC processors support PCIe 4.0, but this arrangement allows customers that don’t plan to use PCIe 4.0 devices to procure systems at a lower price point. As one would imagine, Supermicro has other offerings that support the faster interface. 

Overall we found the platform to be robust, and out-of-the-box installation was simple thanks to a tool-less rail kit and an easily accessible IPMI interface that offers a cornucopia of management and monitoring capabilities. Our only minor complaints are that the front panel could use a few USB ports for easier physical connectivity, and that a faster embedded networking interface would free up an additional PCIe slot. Naturally, higher-end Supermicro platforms come with these features.

As seen throughout our testing, the Supermicro 1023US-TR4 server performed admirably and didn’t suffer from any thermal throttling issues regardless of the EPYC processors we used, which is an important consideration. Overall, the Supermicro 1023US-TR4 server packs quite the punch in a small form factor that enables incredibly powerful and dense compute deployments in cloud, virtualization, and enterprise applications. 
