The new Apple TV is said to boast support for 120Hz frame rates – a capability you won’t find on any current Apple TV models. Details are thin on the ground, but the development could herald smoother gaming and a more responsive user interface.
According to 9to5Mac, the beta version of Apple’s upcoming tvOS 14.5 software contains multiple references to ‘120HZ’ and ‘supports 120Hz’ – the current Apple TV 4K set-top box tops out at 4K with a 60Hz refresh rate.
Rumours of a new Apple TV have been swirling for some time, with some suggesting a launch as soon as this month. Recent leaks have tipped Apple’s next set-top box for a redesigned Siri remote control, HDMI 2.1, spatial audio, improved Apple Arcade integration and a speedier A14 Bionic chipset.
This latest development adds weight to the rumour that the new Apple TV will double up as a top-tier games console. In fact, some analysts claim Apple’s next set-top box could give the likes of the PlayStation 5 and Xbox Series X (both of which offer 4K@120Hz support) a run for their money.
Given that most of the best gaming TVs already support 120Hz and HDMI 2.1, it could be that the forthcoming flagship Apple TV is upping its gaming game, so to speak.
The new Apple TV isn’t the only Cupertino-designed device tipped for a higher refresh rate either. The iPhone 13, which is expected to break cover in September, is said to use a range of LTPO displays with 120Hz support.
LG has announced it will stop making phones. Once one of the top players in the smartphone market, the firm will bow out of its mobile operations globally in order to focus on other “growth areas”.
It lists such areas as electric vehicle components, connected devices, smart homes, robotics, artificial intelligence and business-to-business solutions, and “platforms and services”.
In 2013, LG was rated the third-biggest maker of smartphones in the world but, according to analysts IDC, it currently stands 11th. Its smartphone business has been loss-making for years, struggling to keep up with the popularity of Apple and Samsung handsets.
Worried that your shiny LG phone will soon be rendered an expensive brick? LG has said it will maintain service support and software updates for its phones “for a period of time”. Exactly how long is anyone’s guess. This will also vary by region. According to a document spotted by XDA Developers, this support will include Android 12 updates for certain smartphones. Again, this will vary by region.
Many will mourn LG’s exit from the smartphone market. Its phones may have been niche and quirky, but they were usually a bit different from the competition. Recently, the firm had pursued a strategy of eye-catching form factors like the swivelling, dual-screen LG Wing. Our favourite? The LG Chocolate from 2006.
MORE:
Check out the best Android phones currently available
These are the best smartphones for music and movies
Going Apple? Browse the best iPhones around right now
The Apple TV 4K hasn’t been updated for more than three and a half years, which is an eternity in technology. But 9to5Mac reports that the new tvOS 14.5 beta references support for 120Hz refresh rates — a capability that no currently available Apple TV models have — which could indicate Apple is working on a new version of its set-top box.
While it’s not clear what Apple may use a 120Hz refresh rate for in a new Apple TV, one of the more intriguing possibilities is for smoother gaming, like what’s offered with 120Hz support on the PlayStation 5 and Xbox Series X. Apple has been showing increased interest in gaming as of late, so it wouldn’t be too much of a surprise if the new Apple TV had more gaming-focused features.
For example, Apple just added a bunch of new games to its Apple Arcade gaming subscription service, which lets you play games on the iPhone, iPad, Mac, and Apple TV and carry your progress across platforms. And the iOS 14.5, macOS 11.3, and tvOS 14.5 betas also all include support for the PlayStation 5’s DualSense controller and the Xbox Series X controller — presumably, that support will carry over to the final software releases.
The rumored 120Hz support for Apple TV would also line up with a Bloomberg report from December, which said Apple was working on a new Apple TV for release sometime this year with a “stronger gaming focus.” (That report also said the new set-top box would have a redesigned remote, which will hopefully be easier to use.) While we don’t know exactly when Apple will announce this rumored device, or if it will announce it at all, if you’re in the market for a new Apple TV, you might want to wait just a bit.
How much power does your graphics card use? It’s an important question, and while the performance we show in our GPU benchmarks hierarchy is useful, one of the true measures of a GPU is how efficient it is. To determine GPU power efficiency, we need to know both performance and power use. Measuring performance is relatively easy, but measuring power can be complex. We’re here to press the reset button on GPU power measurements and do things the right way.
There are various ways to determine power use, with varying levels of difficulty and accuracy. The easiest approach is via software like GPU-Z, which will tell you what the hardware reports. Alternatively, you can measure power at the outlet using something like a Kill-A-Watt power meter, but that only captures total system power, including PSU inefficiencies. The best and most accurate means of measuring the power use of a graphics card is to measure power draw between the power supply (PSU) and the card, but it requires a lot more work.
We’ve used GPU-Z in the past, but it had some clear inaccuracies. Depending on the GPU, it can be off by anywhere from a few watts to potentially 50W or more. Thankfully, the latest generation AMD Big Navi and Nvidia Ampere GPUs tend to report relatively accurate data, but we’re doing things the right way. And by “right way,” we mean measuring in-line power consumption using hardware devices. Specifically, we’re using Powenetics software in combination with various monitors from TinkerForge. You can read our Powenetics project overview for additional details.
(Image: Tom’s Hardware GPU Testbed)
After assembling the necessary bits and pieces — some soldering required — the testing process is relatively straightforward. Plug in a graphics card and the power leads, boot the PC, and run some tests that put a load on the GPU while logging power use.
We’ve done that with all the legacy GPUs we have from the past six years or so, and we do the same for every new GPU launch. We’ve updated this article with the latest data from the GeForce RTX 3090, RTX 3080, RTX 3070, RTX 3060 Ti, and RTX 3060 12GB from Nvidia; and the Radeon RX 6900 XT, RX 6800 XT, RX 6800, and RX 6700 XT from AMD. We use the reference models whenever possible, which means only the EVGA RTX 3060 is a custom card.
If you want to see power use and other metrics for custom cards, all of our graphics card reviews include power testing. So for example, the RX 6800 XT roundup shows that many custom cards use about 40W more power than the reference designs, thanks to factory overclocks.
Test Setup
We’re using our standard graphics card testbed for these power measurements, and it’s what we’ll use on graphics card reviews. It consists of an MSI MEG Z390 Ace motherboard, Intel Core i9-9900K CPU, NZXT Z73 cooler, 32GB of Corsair DDR4-3200 RAM, a fast M.2 SSD, and the other various bits and pieces you see to the right. This is an open test bed, because the Powenetics equipment essentially requires one.
There’s a PCIe x16 riser card (which is where the soldering came into play) that slots into the motherboard, and then the graphics cards slot into that. This is how we accurately capture actual PCIe slot power draw, from both the 12V and 3.3V rails. There are also 12V kits measuring power draw for each of the PCIe Graphics (PEG) power connectors — we cut the PEG power harnesses in half and run the cables through the power blocks. RIP, PSU cable.
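To make the measurement arithmetic concrete, here’s a minimal Python sketch of how total board power falls out of those per-rail readings. The rail names and sample values are hypothetical, and this is only an illustration of the volts-times-amps sum over the monitored rails, not our actual logging code.

```python
# Hypothetical per-rail (volts, amps) readings of the kind in-line
# monitoring hardware reports: the PCIe slot's 12V and 3.3V rails,
# plus each PEG power connector.
rails = {
    "slot_12v": (12.07, 4.10),   # PCIe slot, 12V rail
    "slot_3v3": (3.31, 0.85),    # PCIe slot, 3.3V rail
    "peg1_12v": (12.05, 12.40),  # first 8-pin PEG connector
    "peg2_12v": (12.04, 8.90),   # second 8-pin PEG connector
}

# Power per rail is volts times amps; board power is the sum over rails.
per_rail = {name: volts * amps for name, (volts, amps) in rails.items()}
total_watts = sum(per_rail.values())

for name, watts in per_rail.items():
    print(f"{name}: {watts:6.1f} W")
print(f"total board power: {total_watts:.1f} W")
```

Log a reading like that every second and you have the raw data behind every chart in this article.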
Powenetics equipment in hand, we set about testing and retesting all of the current and previous generation GPUs we could get our hands on. You can see the full list of everything we’ve tested in the list to the right.
From AMD, all of the latest generation Big Navi / RDNA2 GPUs use reference designs, as do the previous gen RX 5700 XT, RX 5700, Radeon VII, Vega 64 and Vega 56 cards. AMD doesn’t do ‘reference’ models on most other GPUs, so we’ve used third-party designs to fill in the blanks.
For Nvidia, all of the Ampere GPUs are Founders Edition models, except for the EVGA RTX 3060 card. With Turing, everything from the RTX 2060 and above is a Founders Edition card — which includes the 90 MHz overclock and slightly higher TDP on the non-Super models — while the other Turing cards are all AIB partner cards. Older GTX 10-series and GTX 900-series cards use reference designs as well, except where indicated.
Note that all of the cards are running ‘factory stock,’ meaning no manual overclocking or undervolting is involved. Yes, the various cards might run better with some tuning and tweaking, but this is the way the cards will behave if you just pull them out of the box and install them in your PC. (RX Vega cards in particular benefit from tuning, in our experience.)
Our testing uses the Metro Exodus benchmark looped five times at 1440p ultra (except on cards with 4GB or less VRAM, where we loop 1080p ultra — that uses a bit more power). We also run Furmark for ten minutes. These are both demanding tests, and Furmark can push some GPUs beyond their normal limits, though the latest models from AMD and Nvidia both tend to cope with it just fine. We’re only focusing on power draw for this article, as the temperature, fan speed, and GPU clock results continue to use GPU-Z to gather that data.
GPU Power Use While Gaming: Metro Exodus
Due to the number of cards being tested, we have multiple charts. The average power use charts show average power consumption during the approximately 10-minute test. These charts do not include the time in between test runs, where power use dips for about 9 seconds, so they give a realistic view of the sort of power use you’ll see when playing a game for hours on end.
Besides the bar chart, we have separate line charts segregated into groups of up to 12 GPUs, and we’ve grouped cards from similar generations into each chart. These show real-time power draw over the course of the benchmark using data from Powenetics. The 12 GPUs per chart limit is to try and keep the charts mostly legible, and the division of what GPU goes on which chart is somewhat arbitrary.
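Computing those averages is straightforward. Here’s a minimal Python sketch, assuming a simple list of per-second watt samples; the idle cutoff used to skip the inter-run dips is our own illustration, not the actual Powenetics tooling.

```python
def average_gaming_power(samples, idle_cutoff=100.0):
    """Average power over a logged run, skipping the brief dips
    between benchmark loops (samples below idle_cutoff watts).

    samples: per-second power readings in watts.
    idle_cutoff: hypothetical threshold separating load samples from
    the ~9-second inter-run dips; it would need tuning per card.
    """
    loaded = [w for w in samples if w >= idle_cutoff]
    if not loaded:
        raise ValueError("no samples above the cutoff")
    return sum(loaded) / len(loaded)

# Example: three minutes of fake samples with one inter-run dip.
log = [320.0] * 90 + [45.0] * 9 + [318.0] * 90
print(f"{average_gaming_power(log):.1f} W")  # ~319 W; the dip is excluded
```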
Kicking things off with the latest generation GPUs, overall power use is relatively similar. The 3090 and 3080 use the most power (for the reference models), followed by the three Navi 21 cards. The RTX 3070, RTX 3060 Ti, and RX 6700 XT are all pretty close, with the RTX 3060 dropping power use by around 35W. AMD does lead Nvidia in pure power use when looking at the RX 6800 XT and RX 6900 XT compared to the RTX 3080 and RTX 3090, but then Nvidia’s GPUs are a bit faster, so it mostly equals out.
Step back one generation to the Turing GPUs and Navi 1x, and Nvidia had far more GPU models available than AMD. There were 15 Turing variants — six GTX 16-series and nine RTX 20-series — while AMD only had five RX 5000-series GPUs. Comparing similar performance levels, Nvidia Turing generally comes in ahead of AMD, despite using a 12nm process compared to 7nm. That’s particularly true when looking at the GTX 1660 Super and below versus the RX 5500 XT cards, though the RTX models are closer to their AMD counterparts (while offering extra features).
It’s pretty obvious how far AMD fell behind Nvidia prior to the Navi generation GPUs. The various Vega and Polaris AMD cards use significantly more power than their Nvidia counterparts. RX Vega 64 was particularly egregious, with the reference card using nearly 300W. If you’re still running an older generation AMD card, this is one good reason to upgrade. The same is true of the legacy cards, though we’re missing many models from these generations of GPU. Perhaps the less said, the better, so let’s move on.
GPU Power with FurMark
FurMark, as we’ve frequently pointed out, is basically a worst-case scenario for power use. Some GPUs tend to be more aggressive about throttling with FurMark, while others go hog wild and dramatically exceed official TDPs. Few if any games can tax a GPU quite like FurMark, though things like cryptocurrency mining can come close with some algorithms (but not Ethereum’s Ethash, which tends to be limited by memory bandwidth). The chart setup is the same as above, with average power use charts followed by detailed line charts.
The latest Ampere and RDNA2 GPUs are relatively evenly matched, with all of the cards using a bit more power in FurMark than in Metro Exodus. One thing we’re not showing here is average GPU clocks, which tend to be far lower than in gaming scenarios — you can see that data, along with fan speeds and temperatures, in our graphics card reviews.
The Navi / RDNA1 and Turing GPUs start to separate a bit more, particularly in the budget and midrange segments. AMD didn’t really have anything to compete against Nvidia’s top GPUs, as the RX 5700 XT only matched the RTX 2070 Super at best. Note the gap in power use between the RTX 2060 and RX 5600 XT, though. In gaming, the two GPUs were pretty similar, but in FurMark the AMD chip uses nearly 30W more power. Actually, the 5600 XT used more power than the RX 5700, but that’s probably because the Sapphire Pulse we used for testing has a modest factory overclock. The RX 5500 XT cards also draw more power than any of the GTX 16-series cards.
With the Pascal, Polaris, and Vega GPUs, AMD’s GPUs fall toward the bottom. The Vega 64 and Radeon VII both use nearly 300W, and considering the Vega 64 competes with the GTX 1080 in performance, that’s pretty awful. The RX 570 4GB (an MSI Gaming X model) actually exceeds the official power spec for an 8-pin PEG connector with FurMark, pulling nearly 180W. That’s thankfully the only GPU to go above spec, for the PEG connector(s) or the PCIe slot, but it does illustrate just how bad things can get in a worst-case workload.
The legacy charts are even worse for AMD. The R9 Fury X and R9 390 go well over 300W with FurMark, though perhaps that’s more of an issue with the hardware not throttling to stay within spec. Anyway, it’s great to see that AMD no longer trails Nvidia as badly as it did five or six years ago!
Analyzing GPU Power Use and Efficiency
It’s worth noting that we’re not showing or discussing GPU clocks, fan speeds or GPU temperatures in this article. Power, performance, temperature and fan speed are all interrelated, so a higher fan speed can drop temperatures and allow for higher performance and power consumption. Alternatively, a card can drop GPU clocks in order to reduce power consumption and temperature. We dig into this in our individual GPU and graphics card reviews, but we just wanted to focus on the power charts here. If you see discrepancies between previous and future GPU reviews, this is why.
The good news is that, using these testing procedures, we can properly measure the real graphics card power use and not be left to the whims of the various companies when it comes to power information. It’s not that power is the most important metric when looking at graphics cards, but if other aspects like performance, features and price are the same, getting the card that uses less power is a good idea. Now bring on the new GPUs!
Here’s the final high-level overview of our GPU power testing, showing relative efficiency in terms of performance per watt. The power data listed is a weighted geometric mean of the Metro Exodus and FurMark power consumption, while the FPS comes from our GPU benchmarks hierarchy and uses the geometric mean of nine games tested at six different settings and resolution combinations (so 54 results, summarized into a single fps score).
This table combines the performance data for all of the tested GPUs with the power use data discussed above, sorts by performance per watt, and then scales all of the scores relative to the most efficient GPU (currently the RX 6800). It’s a telling look at how far behind AMD was, and how far it’s come with the latest Big Navi architecture.
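For transparency, the table math works roughly like the Python sketch below. The GPU names and numbers are placeholders, and the 2:1 weighting toward the gaming result is our assumption for illustration; the exact weights used for the table may differ.

```python
import math

def weighted_geomean(values, weights):
    """Weighted geometric mean: exp(sum(w * ln(x)) / sum(w))."""
    return math.exp(
        sum(w * math.log(v) for v, w in zip(values, weights)) / sum(weights)
    )

# Placeholder inputs per GPU: (metro_watts, furmark_watts, fps_score).
# The fps score is itself a geometric mean of 54 benchmark results.
gpus = {
    "GPU A": (215.0, 228.0, 120.0),
    "GPU B": (220.0, 232.0, 118.0),
    "GPU C": (355.0, 365.0, 155.0),
}

# Performance per watt, with gaming power weighted 2:1 over FurMark
# (our assumption, purely for illustration).
perf_per_watt = {
    name: fps / weighted_geomean([metro, furmark], [2, 1])
    for name, (metro, furmark, fps) in gpus.items()
}

# Scale everything relative to the most efficient GPU (100%).
best = max(perf_per_watt.values())
for name, ppw in sorted(perf_per_watt.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {100 * ppw / best:.1f}%")
```

A geometric mean multiplies rather than adds, so a single outsized FurMark reading can’t dominate the combined power figure the way it would in a simple arithmetic average.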
Efficiency isn’t the only important metric for a GPU, and performance definitely matters. Note also that the performance data does not include newer technologies like ray tracing and DLSS.
The most efficient GPUs are a mix of AMD’s Big Navi GPUs and Nvidia’s Ampere cards, along with some first generation Navi and Nvidia Turing chips. AMD claims the top spot with the Navi 21-based RX 6800, and Nvidia takes second place with the RTX 3070. Seven of the top ten spots are occupied by either RDNA2 or Ampere cards. However, Nvidia’s GDDR6X-equipped GPUs, the RTX 3080 and 3090, rank 17th and 20th, respectively.
Given the current GPU shortages, finding a new graphics card in stock is difficult at best. By the time things settle down, we might even have RDNA3 and Hopper GPUs on the shelves. If you’re still hanging on to an older generation GPU, upgrading might be problematic, but at some point it will be the smart move, considering the added performance and efficiency offered by more recent cards.
You would think that something as critical as a town or county’s drinking water supply would be well-protected — you know, like how America’s nuclear armament was isolated from the internet and even relied on eight-inch floppy disks until just recently? And yet we’ve now had two instances where someone was able to remotely log into a municipal water supply in a way that could have seriously harmed people.
Remember the story of the Florida water treatment facility where someone was able to change the chemical levels? Something similar happened in March 2019 in Kansas’ Ellsworth County, too, where 22-year-old Wyatt Travnichek now stands accused of shutting down the region’s water cleaning system “with the intention of harming” it, according to a statement from the Department of Justice.
The wildest part is that in both cases, workers at these water resources left themselves wide open to tampering — they installed the remote access software themselves so employees could log in to monitor the systems! That’s what Travnichek was hired to do in Kansas, and authorities aren’t even accusing him of “hacking” the system in their indictment. He simply “logged in remotely” months after he left the job, began shutting things down, and is now facing up to 20 years in prison.
That sounds remarkably similar to what happened in Florida, where the water treatment plant never bothered to change the password or even remove an old piece of remote control software after they’d installed a newer one.
Maybe we should stop doing that. President Joe Biden is currently trying to push a $2 trillion infrastructure plan, including billions to deliver safe water and replace lead pipes, among other hazards. To keep the water safe, we also need to keep the water secure.
Cyberscoop spoke to a customer service rep at the Kansas water utility, who claimed the incident didn’t harm residents’ drinking water.
Outriders, the online shooting, looting, and superpower-slinging game from People Can Fly, finally has a way to pause, but to do it you’ll need to be using an Nvidia graphics card (via Kotaku). Despite working as a single-player game, Outriders requires an internet connection to play, which means pausing in the middle of a battle was impossible until this workaround. Even with your menu open, enemies could still attack you.
Using Ansel, a feature of Nvidia GeForce RTX graphics cards that enables a kind of photo mode even in games without one built in, you can “pause” Outriders by pressing Alt + F2 on the fly and get up to take care of business. Because Ansel is specific to Nvidia’s GeForce Experience software, pausing is limited to PC players, which means anyone playing on console or with a different brand of graphics card is out of luck.
The handling of pausing and single-player content in Outriders is similar to Destiny 2, with which it shares some aesthetic and mechanical similarities. Destiny 2 sells a battle pass and yearly expansions with new story content, and it justifies — at least in part — its online-only requirement with the promise of new weekly and monthly changes in the form of live events and other features.
The difference is that Outriders is very explicitly sold as a more traditional single-player game, with the game’s publisher Square Enix addressing the issue on its site: “Outriders is a complete experience out of the box,” it writes. For some reason, an internet connection is still required, which, beyond hindering a basic feature like pausing the game, also seemed to contribute to Outriders’ launch on April 2nd being kind of a mess. Players had issues connecting to the game’s servers in both single-player and multiplayer, which developer People Can Fly publicly acknowledged and apologized for.
The game seems to be working fine now, and this weird Nvidia loophole means the experience of playing single-player could be a little bit more comfortable, but Outriders definitely illustrates the ongoing problems of making a game online-only.
LG says it will continue to offer Android OS updates — including an upgrade to Android 12 — for some of its devices after it exits the smartphone business, but the company’s poor track record has us feeling skeptical it’ll follow through.
In its press release and on a US FAQ page, LG says only that it will continue to provide some security and OS updates, but a page on LG’s Korean website spotted by XDA Developers specifically mentions that an “Android 12 OS upgrade will also be provided for selected models.” This page also notes that this is subject to change based on product performance, and that update availability may vary by region.
LG US didn’t share anything more specific when contacted by The Verge, saying only that “LG may offer certain OS upgrades for select models,” and that “additional details will be provided in the near future on software updates.” An Android 11 update schedule hasn’t yet been confirmed for the US either, though details posted on LG’s German website could give a rough idea of what that might look like.
Unfortunately, LG wasn’t very reliable with updates even prior to its decision to get out of the phone business; OS upgrades came slowly, even for flagship devices. In 2018, the company attempted to remedy this by setting up a Software Upgrade division, but when it came time to offer Android 9 Pie updates in early 2019, not much had actually changed.
If the company does make good on its Android 12 promise, we expect that it would only come to a few high-profile devices, like the Wing and V60. And with Android 11 only scheduled to come to many of the company’s phones much later this year, even if that Android 12 update does arrive, it’ll likely be a long way off.
Amazon is bringing Zoom compatibility to more devices. The company announced today that it’s making the Echo Show 10 compatible with the popular video calling software in the US. Users who have linked their calendars to Alexa will have their meetings start automatically, while those who haven’t can say, “Alexa, join my meeting” or “Alexa, join my Zoom meeting.” This is the second Echo Show to gain Zoom access; the Echo Show 8 started supporting the videoconferencing platform in the US in December.
The Echo Show 10’s camera tracks users as they move throughout a room, meaning they stay in frame and can see the screen no matter where they sit or stand. Presumably, this functionality will work with Zoom, putting it on par with competitor devices like the Facebook Portal and Google Nest. Google pulls Zoom meetings from users’ calendars, while Facebook’s Portal camera tracks users around the room so that they’re always in frame, just like it does with Facebook Messenger and WhatsApp calls.
Although people might start commuting into the office more frequently, Zoom and other videoconferencing software will likely remain a standard for many workplace meetings, especially as workers shift to working from home more frequently.
If you’re reading this on a phone, chances are, LG didn’t make it. The Korean tech giant has been losing money and market share with its smartphone division for years, so it wasn’t a surprise when it finally announced plans to pull the plug today. You could be forgiven for shrugging.
But LG deserves to be remembered as more than just an also-ran. Its phones were rarely big hits, much less the kind of polished products we’d recommend over the competition. Despite this, LG introduced several features and innovations that the phone world would be worse off without. The company was the first to put ultrawide cameras on its phones, for example, and it pioneered the kind of all-screen, no-button smartphone designs that dominate the market today.
And particularly in the US, where Android competition is extremely low, the loss of LG will only further entrench the Apple-Samsung duopoly at the high end. LG is the third-biggest phone vendor in the US, with roughly 10 percent of market share, although much of that was midrange prepaid devices sold through carrier stores. LG might not have been at the top of your smartphone shopping list, but if you live in the US, that list just got a lot more boring.
LG did have some claim to being a tastemaker in the pre-smartphone world. Its Chocolate and enV phones were stylish devices that helped LG expand its brand recognition around the world. But after the iPhone and Android changed everything, LG struggled to adapt. I’m duty-bound here to mention the original LG Prada, which had a capacitive touchscreen and was technically announced just before the iPhone, but its true legacy is mostly people pointing that out in online comments.
LG’s early Android phones weren’t impressive. The 2011 Nitro HD, for example, was its first splashy flagship device in a long time, but it was saddled with outdated, clunky software and poor battery life. Its successor, the Optimus G, represented a degree of refinement, and by the time the G2 came along in 2013, LG’s new G-series was a fairly credible alternative to the likes of Samsung or HTC. The G2 was one of the first flagship smartphones to attempt to cut down on bezel size, for example, and LG made on-screen buttons a core part of its design long before most others.
It was also around this time that LG found a new partner in Google, releasing two Nexus phones in a row. The 2012 Nexus 4 was built around the guts of the Optimus G, and it had its fans despite its crippling lack of LTE, weak battery life, and unimpressive camera. The next year’s Nexus 5 found an even stronger cult following despite it too having a poor camera and bad battery life. (The red version did look great, and the $349 price didn’t hurt.)
LG’s mobile division kept ticking along, turning out respectable phones like the G3 and G4 without ever really challenging Samsung. The software was still a heavy-handed customization of Android, and LG continued to lag behind peers with its pace of updates, but the hardware was solid. It was the 2016 G5 where things really started to fall apart. Designed around a series of swappable modular accessories called “Friends,” the phone flopped, and LG quickly pretended it never happened. Suffice it to say that if you bought a camera grip or a Hi-Fi DAC audio accessory for your G5, it wouldn’t be able to make Friends with 2017’s G6.
It’s unfortunate that LG focused on gimmicks with the G5 because that phone did introduce one new feature that would become ubiquitous in the smartphone market years later: the ultrawide camera. Ultrawides on smartphones let people capture pictures that were previously restricted to camera gearheads, and it’s hard to imagine buying a new phone without one today. But it took a long time for other phone makers to figure out the utility; Apple introduced its first in 2019, for example.
The V20, released the same year as the G5, had another unique feature that would become a hallmark of the company’s phones for years: an honest-to-God headphone jack in the year that Apple decided to ditch it. And not just any headphone jack — one that worked with a built-in quad DAC designed to boost sound quality and appeal to audiophiles. Did this sell many phones? Well, no. But it became a hallmark of LG’s high-end devices ever since, providing an option for wired headphone enthusiasts who despaired as other phone makers followed Apple’s lead one by one.
The 2017 G6 got the G-series back on track. It was the first major smartphone released with a now-familiar taller aspect ratio, with an even stronger focus on eliminating bezels than ever before. Of course, not many people noticed as Samsung followed immediately with the similar but sleeker Galaxy S8 and its “Infinity Display.” Later that year, LG released the V30, which had a completely new (and very nice) design, but it’s always going to be a hard sell when your most differentiated feature is your (also very nice) haptics system.
From here on out, LG’s flagship phones mostly blurred into one. The G7 was a pretty good facsimile of an iPhone X, even winning an Editor’s Choice designation from Verge editor Dan Seifert. The V40 pioneered the now-common triple-camera setup. The G8X came with a dual-screen case that, in hindsight, Microsoft’s Surface Duo really didn’t improve much upon a year later. But all of these phones looked basically identical to each other, and none of their key features were viewed as much more than gimmicks at the time.
For every good idea LG had, there’d be something pointless like the G8’s vein-sensing “Hand ID” unlock. Despite the company making a big announcement about a new Software Upgrade Center to increase the pace of Android updates, nothing changed. And in the face of Samsung’s unstoppable marketing machine, LG’s best attempt at a brand identity was to add “ThinQ” to the name of each flagship phone.
In its final year, LG’s mobile division did move to address its problems. The Explorer Project was intended to produce more innovative designs, like the beautiful but underpowered Velvet and the oddball dual-screen Wing. At CES this year, the company announced a Rollable concept phone that it said it planned to take to market.
That’ll never happen now, and it’s hard to say it’s a huge loss with companies like Oppo and TCL likely to pick up the slack with their own versions. But in the context of the US phone market, there are going to be fewer choices, and whoever ends up accounting for LG’s lost market share is unlikely to be as creative a replacement.
LG’s phones were rarely, if ever, the best available, but the company did make a significant impact on the smartphone world at large. With its mobile division’s demise, the US market becomes even more homogenous.
Nintendo’s official Pro Controller for the Switch is generally a pretty useful accessory, but it has its problems: the D-pad is unreliable, and it doesn’t really offer any “pro-level” functionality. 8BitDo’s latest controller improves on both of those issues while coming in at a lower price.
The 8BitDo Pro 2 is an upgraded version of the SN30 Pro+, already a well-regarded Switch controller. It uses Bluetooth and also works with PCs and mobile devices; there’s a physical control for flipping between Switch, X-input, D-input, and Mac modes. You can use it as a wired controller with a USB-C cable, too. I did try using it with my PC, but I feel like it makes more sense on the Switch due to the Japanese-style button layout with B on the bottom and A on the right. Or maybe I’m just too used to using Xbox controllers on the PC.
Aesthetically, it looks kind of like a cross between a SNES pad and a PlayStation controller, with a lozenge-shaped body, two handles, and symmetrically aligned analog sticks. The unit I have is decked out in a PlayStation-inspired gray colorway, though there’s also an all-black option and a beige model that evokes the original Game Boy.
It’s not a huge controller, but it feels comfortable in my large hands, with easy access to all of the buttons and triggers. Just as importantly for me, the D-pad is good. It feels more or less like a SNES pad, and its placement above the left analog stick makes it more appropriate for games where it’s a primary input option. I’d much rather use the Pro 2 than Nintendo’s Pro Controller for just about any 2D game on the Switch.
The Pro 2’s key feature over its predecessor is the customizable back buttons that you can press with your middle finger. These are a common element of enthusiast-focused controllers today, from Microsoft’s Elite controllers to third-party offerings like the Astro C40 for the PS4. Sony also released an attachment that brings similar functionality to the DualShock 4.
These buttons are useful because they allow you to enter commands without taking your thumbs off the sticks. Most first-person shooters, for example, assign jumping to a face button, which means it can be awkward to activate while aiming at the same time. With controllers like the Pro 2, you can set a back button to work the same way as a given face button, freeing you up to design more flexible control schemes. The Pro 2 makes it much easier to manipulate the camera in the middle of a Monster Hunter Rise battle, which might be worth the asking price alone.
The back buttons on the Pro 2 are responsive and clicky, activating with a slight squeeze. You can assign them through 8BitDo’s Ultimate Software app, which is now available for the Pro 2 on iOS and Android as well as PCs. It’s not quite as simple as some pro controller setups that let you remap the buttons directly on the controller itself, but it does support multiple profiles and works well enough. Besides button assignments, the app can also be used to modify the controller’s vibration strength and stick sensitivity.
You do miss out on some of the Switch Pro Controller’s features with the 8BitDo Pro 2. While the rumble is solid, it doesn’t feel as precise as Nintendo’s HD Rumble in supported games. The Pro 2 also lacks an NFC reader, so it won’t work with Amiibo figurines. And it can’t be used to power the Switch on, a limitation common to most third-party controllers across various platforms.
For $49.99, though, those omissions are understandable. That’s $20 less than Nintendo’s equivalent option, let alone the pro controllers you’d find for the Xbox or PlayStation in the $180–$200 range. And all things considered, I’d take the 8BitDo Pro 2 over the official Nintendo controller most days of the week.
The 8BitDo Pro 2 will start shipping on April 12th.
Apple CEO Tim Cook rarely provides details on unannounced products, but he offered some hints about Apple’s thinking on augmented reality and cars in an interview with Kara Swisher for The New York Times this morning.
When it comes to augmented reality, he agreed with Swisher’s framing that the tech is “critically important” to Apple’s future and said it could be used to enhance conversations.
“You and I are having a great conversation right now. Arguably, it could even be better if we were able to augment our discussion with charts or other things to appear,” Cook said. He imagines AR being used in health, education, retail, and gaming. “I’m already seeing AR take off in some of these areas with use of the phone. And I think the promise is even greater in the future.”
Apple has been rumored for years to be working on an augmented reality headset, and the latest leaks suggested a mixed reality device could launch next year. Augmented reality features are already available on the iPhone and iPad, but outside of some fun Snapchat filters, augmented reality hasn’t become all that widely used yet.
Cook also talked broadly about Apple’s approach to products during a question about cars. Leaks from Apple have made it unclear if the company is developing self-driving tech that it could license to other companies or if Apple plans to develop an entire car by itself. Cook’s latest comments suggest the latter, assuming the project comes to fruition.
“We love to integrate hardware, software, and services, and find the intersection points of those because we think that’s where the magic occurs,” Cook said. “And so that’s what we love to do. And we love to own the primary technology that’s around that.”
Cook referred to “autonomy” as a “core technology” and said there are “lots of things you can do” with it in connection with robots. But he warned that not every Apple project eventually ships. “We investigate so many things internally. Many of them never see the light of day,” Cook said. “I’m not saying that one will not.”
Swisher also asked Cook about Elon Musk’s comments about a failed attempt to discuss selling Tesla to Apple around 2017. “You know, I’ve never spoken to Elon,” Cook said, “although I have great admiration and respect for the company he’s built.”
No matter how many keys your keyboard has, you can always use a dedicated keypad with buttons for executing macros, launching your favorite apps or, if you’re a streamer, initiating functions in OBS. Many users swear by the Elgato Stream Deck lineup of macro keypads, but these devices are expensive.
With a Raspberry Pi Pico, some inexpensive hardware and the right script, you can create your own Stream Deck-like macro keypad, plug it in via USB and use it to make your life easier in OBS or for any other task. Once completed, the macro keypad will be seen as a USB keyboard by your operating system, allowing it to work with any computer, no drivers or special software required.
What you need to build a Raspberry Pi Pico-Powered Stream Deck
Raspberry Pi Pico
Mechanical key switches (e.g. Cherry MX Brown)
Keycaps (compatible with Cherry MX)
30-gauge wire
3D printed Case (using this design)
Setting Up Raspberry Pi Pico’s Firmware
To get our Raspberry Pi Pico-powered stream deck working, we will be using CircuitPython as the programming language, because it has a built-in USB HID library. To use CircuitPython on a Pico, you must first flash the appropriate firmware.
1. Download the CircuitPython UF2 file.
2. Push and hold the BOOTSEL button and plug your Pico into the USB port of your Raspberry Pi or other computer. Release the BOOTSEL button after your Pico is connected.
This will mount the Pico as a Mass Storage Device called “RPI-RP2”.
3. Copy the UF2 file to the RPI-RP2 volume.
Your Pico should automatically reboot and will be running CircuitPython.
Adding Code for Pico-Powered Stream Deck
I have written custom code to make the Pico act as a stream deck / macro keypad. Here’s how to install it.
1. Download the project zip file from the Novaspirit GitHub.
2. Transfer the contents of the zip file to the “CIRCUITPY” volume, overwriting the existing files.
3. Reboot the Pico and it should load the macro key code.
3D Printing Pico-Powered Stream Deck Case
If you want to use our case, you need to 3D print it or have it printed by a service such as All3DP. Download our design files and use these CURA settings.
PLA
15% infill
3-line wall thickness
No supports needed
0.2mm layer height (use 0.1mm for higher quality)
Print the parts separately in two different colors
Assembling Your Pico-Powered Stream Deck
Now it’s time to assemble the stream deck / macro keypad and solder everything into place.
1. Start by placing the Cherry MX-compatible key switches on the top plate of the 3D-printed case.
2. You will connect wires as follows. More details below.
3. Connect all the top-left pins on the switches together with a single wire and connect it to Pin 36, the 3V3 pin on the Pico.
4. Solder a short wire to each one of the right pins to prep the connections we are going to make to individual GPIO pins.
5. Solder the required wires to the appropriate GPIO pins on the Raspberry Pi Pico.
6. Snap the case together.
Setting Up the Macro Keys
The keys are set up to send Ctrl + function-key combinations, from Button 1 (top left), which sends Ctrl + F7, through Button 6 (bottom right), which sends Ctrl + F12. These mappings can be altered in code.py as needed, but I’m going to show you a few ways to use the default mapping with the examples below, for both program shortcuts and OBS.
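If you do want to change the mappings, code.py is the place. As a rough guide, here is a minimal CircuitPython sketch of the same idea, built on the adafruit_hid library; the pin assignments are our own hypothetical choices, and the actual code in the Novaspirit repo may be organized differently.

```python
# Minimal CircuitPython macro-keypad sketch (illustrative; pin choices
# are hypothetical). Requires the adafruit_hid library on CIRCUITPY.
import time
import board
import digitalio
import usb_hid
from adafruit_hid.keyboard import Keyboard
from adafruit_hid.keycode import Keycode

kbd = Keyboard(usb_hid.devices)

# Button 1 (top left) sends Ctrl+F7 ... Button 6 (bottom right) Ctrl+F12.
# Edit this list to remap the keys.
KEYMAP = [Keycode.F7, Keycode.F8, Keycode.F9,
          Keycode.F10, Keycode.F11, Keycode.F12]

# One GPIO per switch; the other side of each switch goes to 3V3,
# so we use pull-downs and treat a high level as "pressed".
PINS = [board.GP2, board.GP3, board.GP4, board.GP5, board.GP6, board.GP7]
buttons = []
for pin in PINS:
    b = digitalio.DigitalInOut(pin)
    b.direction = digitalio.Direction.INPUT
    b.pull = digitalio.Pull.DOWN
    buttons.append(b)

was_pressed = [False] * len(buttons)
while True:
    for i, b in enumerate(buttons):
        if b.value and not was_pressed[i]:        # rising edge: key down
            kbd.send(Keycode.CONTROL, KEYMAP[i])  # press and release combo
        was_pressed[i] = b.value
    time.sleep(0.01)  # 10 ms poll doubles as crude debounce
```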
Setting Up Macros for Program Shortcuts
If you want to use a key on your Raspberry Pi Pico-powered stream deck to launch an app in Windows, here’s how.
1. Right-click a shortcut and select “Properties.”
2. Select the “Shortcut key” field in the Shortcut tab.
3. Press any of the macro keys and you’ll see its keyboard combo (e.g. Ctrl + F7 for key 1) appear in the box.
4. Press “OK” and your new macro will be assigned to the key you pressed.
Setting Up Macros for OBS
1. Open OBS and navigate to “Settings.”
2. Select the “Hotkeys” setting and scroll down to the scene you want to assign a macro for.
3. Select “Switch to scene” on the scene you want to macro and press the appropriate key on your stream deck to assign it.
4. Press “OK” and the macro keys will be assigned to those scenes.
AMD has published a whitepaper on a potential security vulnerability that affects the company’s latest Zen 3 processors. The side-channel exploit is similar to Spectre, which affected a plethora of Intel processors three years ago.
With Zen 3, AMD introduced a new technology called Predictive Store Forwarding (PSF), which helps improve code execution performance by predicting the relationship between loads and stores. In the majority of cases, PSF’s predictions are spot-on. However, there is still a slim chance that a prediction is inaccurate, which results in incorrect CPU speculation.
AMD’s CPU architects have discovered that bad PSF speculation is equivalent to Spectre v4. Software that relies on isolation or “sandboxing” is most at risk from incorrect speculation. AMD provided two scenarios in which an incorrect PSF prediction can occur.
“First, it is possible that the store/load pair had a dependency for a while but later stops having a dependency. This can occur if the address of either the store or load changes during the execution of the program.”
“The second source of incorrect PSF predictions can occur if there is an alias in the PSF predictor structure. The PSF predictor is designed to track stores/load pairs based on portions of their RIP. It is possible that a store/load pair which does have a dependency may alias in the predictor with another store/load pair which does not. This may result in incorrect speculation when the second store/load pair is executed.”
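To make the aliasing scenario easier to picture, here is a toy Python model of a PSF-style predictor. It is purely illustrative: the tag width, the lookup structure and the addresses are all our own inventions, but it shows how tracking store/load pairs by only a portion of their RIP lets two unrelated pairs collide.

```python
# Toy model of a PSF-style predictor that tracks store->load dependencies
# keyed on only the low bits of each instruction pointer (RIP).
TAG_BITS = 8  # hypothetical tag width; real predictors differ

def tag(rip):
    """Keep only the low TAG_BITS of an instruction address."""
    return rip & ((1 << TAG_BITS) - 1)

predictor = {}  # maps load tag -> store tag it is predicted to depend on

def train(store_rip, load_rip):
    """Record that this load was observed to depend on this store."""
    predictor[tag(load_rip)] = tag(store_rip)

def predicts_dependency(store_rip, load_rip):
    """Would the predictor speculatively forward the store to the load?"""
    return predictor.get(tag(load_rip)) == tag(store_rip)

# A genuinely dependent store/load pair is observed and trained.
train(store_rip=0x401104, load_rip=0x401123)
print(predicts_dependency(0x401104, 0x401123))  # True: correct prediction

# An unrelated pair whose RIPs happen to share the same low 8 bits
# aliases with the trained entry, so a dependency is wrongly predicted,
# which is the bad-speculation case AMD describes.
print(predicts_dependency(0x7f3204, 0x9a1023))  # True, despite no dependency
```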
AMD concludes that Predictive Store Forwarding helps improve application performance but also comes with security complications. Nevertheless, the chipmaker hasn’t seen any code that’s considered vulnerable to PSF misprediction, nor are there any reported cases of such an exploit. The security risk of Predictive Store Forwarding is low for most applications.
The official recommendation from AMD is to leave the Predictive Store Forwarding enabled. Since it’s a performance enhancement feature, we suspect that disabling PSF could bring a performance hit.
Consumers who work with software that employs sandboxing and are concerned about PSF have the choice to disable the functionality. AMD recently proposed Linux patches that would disable Predictive Store Forwarding as well.
EVGA GeForce GTX 1650 XC Black (Image credit: Nvidia)
The latest rumor coming out of China is that Nvidia is supplying its partners with more Turing silicon to deliver more GeForce GTX 1650 graphics cards to the market. The report specifically mentions mainland China, so it’s uncertain if Nvidia is doing this on a global scale. However, we’ve reached out to the chipmaker for clarification.
Nvidia has already rekindled the GeForce RTX 2060 and GTX 1050 Ti as a stopgap solution to the ongoing graphics card shortage, so it doesn’t surprise us that the GeForce GTX 1650 would get the same treatment. The Turing-powered graphics card does rank third on Steam’s Hardware & Software Survey for a reason.
According to the report, Nvidia was focusing more on its mobile graphics cards at the beginning of the year. As a result, there was a lack of supply of TU117 silicon for the desktop GeForce GTX 1650. However, the chipmaker will reportedly increase supply between the months of April and May.
The GeForce GTX 1650 isn’t a gaming monster, but it wields sufficient firepower to offer consumers a comfortable 1080p gaming experience. More importantly, the GeForce GTX 1650 isn’t proficient at mining cryptocurrency, meaning gamers have less competition buying it up. According to Minerstat, the GeForce GTX 1650 puts up a puny hash rate of 13.2 MH/s in Ethereum, so there are far better options out there for miners. For comparison, even Nvidia’s entry-level CMP 30HX is good for 26 MH/s.
The GeForce GTX 1650 arrived on the market with a $149 price tag, but that was two years ago, long before the pandemic and graphics card shortages. Nowadays, custom GeForce GTX 1650 models are selling for between $400 and $900. Despite Nvidia injecting more stock into the market, we don’t expect the pricing to improve anytime soon.
AMD’s next-generation Ryzen Threadripper might be coming soon, according to the latest patch notes for the popular HWiNFO diagnostic suite.
Realix, the developer behind HWiNFO, said earlier today that the upcoming version of the software will improve its work with AMD’s Ryzen Threadripper Pro as well as “next-generation Ryzen Threadripper” platforms. This is essentially one of the first public signs of AMD’s 4th Generation Threadripper, which is allegedly based on the Epyc ‘Milan’ design.
“Improved detection of AMD ThreadRipper Pro and next-generation ThreadRipper,” a line in the HWiNFO changelog reads.
Unfortunately, we’re still not certain if HWiNFO got word from AMD or is simply adapting its Milan knowledge to fit the new Threadripper.
That’s because, at this point, we don’t know much about AMD’s next-generation Ryzen Threadripper and what ‘improved detection’ means in its case. We are almost certain that the upcoming Ryzen Threadripper will be based on the 3rd Generation Epyc 7003-series ‘Milan’ design and will therefore feature Zen 3-based chiplets with a unified core complex and L3 cache architecture. We can also assume that these CPUs feature slightly different sensors, a new memory controller, and other changes. So, if HWiNFO can properly detect Epyc 7003-series, it should be able to detect most of the next-generation Threadripper’s features correctly without help from AMD.
Still, diagnostic software is also vital for hardware developers and enthusiasts that play with the latest parts. Therefore, hardware developers are eager to add support for their new and upcoming products to diagnostic software in a bid to make the lives of their partners a bit easier. That’s why it’s not uncommon to learn news about future products from various third-party software makers.
So, in the case of HWiNFO’s next-generation Threadripper announcement, we can’t confirm whether Realix got preliminary information from AMD or just learned how to use Epyc 7003-series ‘Milan’ information in context of the next-generation Ryzen Threadripper.
In any case, now that Milan is out, AMD’s 4th Generation Ryzen Threadripper is a bit closer to release too.