Apple’s first M1 MacBooks are here, and the world of laptops has changed overnight.
When Apple first announced that it would be transitioning its computers — specifically, the MacBook Air and entry-level 13-inch MacBook Pro, its most popular PCs — to a new and wildly different type of processor, there were plenty of reasons to be skeptical. Apple was making huge claims for battery life and performance, things that the first wave of Arm-based laptops from Qualcomm and Microsoft failed to deliver.
But deliver Apple did, with computers powered by a new M1 processor that aren’t just close to their previous Intel counterparts, but crush them in nearly every respect — and not just the base model Intel chips that the M1 purports to replace, either. In both early benchmarks and head-to-head comparisons for compiling code, Apple’s M1 chip appears to hold its own against even Intel’s most powerful Core i9 chip for laptops.
Keep in mind this comparison is deeply unfair: my 16-inch MacBook Pro was literally maxed out just a year ago – 8 cores, 64GB RAM, and much more, costing $6000.
In comparison the M1 costs just $2000 and manages to hammer the Intel machine with a quarter of the RAM.
— Paul Hudson (@twostraws) November 17, 2020
The conversation has flipped instantly: it’s no longer “why would you take a gamble on Apple’s new, unproven processor” but “how will competitors like Intel, AMD, and Qualcomm respond?”
For years, Intel and AMD have been playing a chess match, sniping back and forth with improvements in CPU performance, battery life, and onboard graphics. Apple appears to be playing an entirely different game on an entirely different level. The same interplay between hardware and software that has led to such huge successes on the iPhone and iPad has now come to the Mac.
It’s not just that Apple’s hardware is faster (although straight benchmarks would indicate that it is); it’s that Apple’s software is designed to make the most of that hardware, in a way that even the best optimization of macOS on an x86 system wasn’t. As John Gruber notes (citing Apple engineer David Smith) the new chips handle fundamental low-level macOS app tasks up to five times faster on the M1 than they do on Intel because Apple was able to design a chip from the ground up to specifically be good at those tasks. It’s why the new M1 Macs (and the existing iPhone and iPad lineups) are able to do more with comparatively less RAM than their Intel (and Android) counterparts.
Apple has also done incredible work with Rosetta 2, its translation layer for running legacy x86 applications on the M1. It’s another key part of how Apple’s software strategy pays off big dividends for the new hardware by making it seamless to run older software on the new Mac without any real hits to performance. Apple almost certainly has factored Rosetta 2 optimization into the M1’s design, benefiting from the same parallel development as the rest of the hardware. The result is that M1 laptops don’t make users choose between great performance on Arm-optimized apps at the expense of legacy x86 performance; instead, they run old apps well and new optimized apps even better.
Photo by Vjeran Pavic / The Verge
The most exciting — or frightening, if you’re a traditional PC chip company — part of Apple’s new chips is that the M1 is just the starting point. It’s Apple’s first-generation processor, designed to replace the chips in Apple’s weakest, cheapest laptops and desktops. Imagine what Apple’s laptops might do if the company can replicate that success on its high-end laptops and desktops or after a few more years of maturation for the M-series lineup.
Right now, the saving grace for traditional x86 laptops is that it’s only Apple, with its near-complete control over its hardware and software stack, that’s managed to accomplish this level of speed, software performance, and battery life on Arm.
It’s an open question whether companies like Qualcomm and Microsoft will be able to emulate Apple’s success with the next wave of Arm-based Windows machines. Certainly, it would take a much bigger restructuring of Windows, one that would impact a far greater number of customers than Apple’s changes. And while Microsoft does design its own Surface laptops — and even worked with Qualcomm on building Arm-based SQ1 and SQ2 chips for its Surface Pro X lineup — it’s still a far cry from the level of control that Apple maintains over its software / hardware ecosystem that allows so much of the M1’s success.
The new MacBook Air and MacBook Pro won’t be the perfect laptops for everyone, especially if you rely on huge, GPU-intensive tasks or specific developer tools. But when a $1,000 M1 laptop can outdo a maxed-out, $6,000 MacBook Pro with quadruple the RAM and Intel’s best chip, while also running cooler and quieter in a smaller and lighter form factor and with twice the battery life, where do competitors even go from here?
The NUC M15 is a premium productivity laptop meant to compete with the XPS and Spectre computers of the world
Intel is launching a new laptop. Yes, that’s right, Intel itself has a new laptop that it designed in-house and will be selling through various partners early in 2021. The NUC M15 is the latest computer in the company’s expanding Next Unit of Computing line, which is best known for making tiny desktop PCs.
You won’t actually see Intel’s name stamped on the lid, however. That’s because Intel is essentially supplying this laptop to boutique shops that will equip it with various storage configurations and brand it themselves (a process known in the industry as “white labeling”). This isn’t the first time Intel has done this: a little over a year ago, it produced the MAG-15, a gaming laptop that was sold by a number of smaller brands across the world, including Schenker in Europe and Eluktronics and Maingear in the US.
The NUC M15 is a different beast, however. Instead of targeting a gaming enthusiast crowd that is looking for impressive performance and cooling for an attractive price, the M15 is very much a premium productivity laptop. It’s got a 15.6-inch, 1080p IPS display (available with or without touch), a 73 watt-hour battery that Intel claims is good for up to 16 hours of use, and Intel’s 11th Gen Core i5-1135G7 or i7-1165G7 quad-core processor. Instead of a discrete graphics card from Nvidia, the M15 uses Intel’s Iris Xe integrated graphics. You’ll be able to get it with 8 or 16GB of RAM (soldered, so not upgradeable after purchase) and a variety of storage configurations, depending on which brand is selling it.
The M15 has an aluminum unibody and a 15.6-inch screen.
All of that is packed into an aluminum unibody that’s 14.9mm thick (0.59 in) and a stout 3.64 pounds (1.65 kg). The fit and finish are right up there with what you’d expect from a premium laptop, even if the visuals are a bit boring. (Intel says it is using Tongfang as its manufacturing partner for the M15, the same one it used for the MAG-15.)
A standard, well-spaced chiclet keyboard is centered under the display with a large glass Windows Precision trackpad just below it. There are two Thunderbolt 4 / USB 4 Type-C ports, two USB-A 10Gbps ports, a 3.5mm headphone jack, and a full-size HDMI port along the sides. The two USB-C ports are on opposite sides, and you can charge from either one of them, which is convenient. The only thing that’s missing is an SD card slot.
If those specs sound familiar, it’s because they are effectively the same as the Asus ZenBook 14 and Dell XPS 13 we recently reviewed, plus countless other thin-and-light productivity laptops released this fall. The major difference with the M15 is that it has a 15-inch display; most productivity laptops have 13- or 14-inch screens on them, while 15-inch models tend to be costlier and more performance-oriented.
There are some other slight differences, such as an LED light bar in the front that works with the Alexa app for Windows. The light bar will glow blue when it hears you say the Alexa voice command, just like an Echo smart speaker. Four microphones installed along the top edge of the lid help the M15 pick up your voice from across the room.
The M15 also has a Windows Hello-compatible webcam for facial login, plus presence detection that will wake the computer up as you approach it and log you in automatically. It will also keep the computer unlocked so long as you’re sitting in front of it. It’s similar to what we saw on the Dell Latitude 7400 last year.
Intel says its goal with this computer is to provide a premium-level laptop to smaller companies so that they can compete with the Dells and HPs of the world without having to invest in the level of R&D that those companies have. The company described the M15 to me as “a premium product above the mainstream, but still targeted towards the average user” and that it is “optimized for a variety of use cases.” It says it saw “an opportunity for higher end premium laptop with a larger screen, thin and light with unbelievable battery life” in the market, and it designed the M15 to fit that.
The M15 technically isn’t labeled with Evo branding, which denotes a certain level of performance and features, including over nine hours of battery life, fast charging, Thunderbolt 4, Wi-Fi 6, and instant wake. But it is built to meet that specification, and Intel expects its partners to submit their finalized, branded machines for Evo certification. As for driver support, Intel says its goal is to provide support for anything it is involved in, which takes another burden off of small companies with limited support resources.
The company also tells me that it plans to bring more NUC laptops to market in the future and that it won’t become a once-per-year type of thing. But it also says that it doesn’t expect to have a full-range product stack like Dell or HP and that any models it does design and sell will be targeted to specific use cases.
Intel isn’t divulging the brands that will eventually sell the M15 early next year, but it’s likely that many of the companies that sold last year’s gaming laptop will participate, and Intel has hinted that it expects even more boutique brands to carry the M15. Intel also says its partners will ultimately determine the selling price, but it expects prices for the M15 to range between $999 and $1,499, depending on configuration.
That pricing is important because, unlike last year’s gaming-focused laptop, there really isn’t much that makes the M15 stand out from the extremely crowded productivity laptop field. The design is best described as a reference model, with a heavy-handed influence from the 2012–2015 MacBook Pro; the specs are not any different from what you can get from countless other brands; and it can’t lay claim to the thinnest or lightest package you can get, an important quality for many laptop buyers in this segment. In some respects, such as its 16:9 display, the M15 already feels behind the curve, as many companies have shifted to taller 16:10 or 3:2 screens that are easier to work with tall documents or webpages on. It also has two fans, unlike Apple’s new MacBook Air that can handle professional work in complete silence.
Last year’s MAG-15 was far from perfect, and it had an equally generic design. But it was interesting to gaming enthusiasts because it had an advanced cooling system, excellent performance, great build quality, a light chassis, and shockingly good battery life for a gaming laptop. It’s hard to find that exact mix of qualities from the name brands in the gaming space. As a result, many enthusiasts were able to get past the fact that it wasn’t made by a known brand, such as Alienware or Razer, because they could get a unique mix of features and top-tier performance at a discount. (I should know; I personally bought a MAG-15 last year for this very reason.)
XPG is one of the brands that’s likely to sell the M15 once it hits retail. Schenker, Eluktronics, and Maingear are other likely candidates.
The productivity laptop market is wildly different from the enthusiast gaming world, though, and without any standout performance qualities aside from its slightly larger screen, it’s hard to see why anyone would buy the M15 from a brand they’ve never heard of instead of just getting a tried-and-true Dell XPS 13 or HP Spectre x360. The M15 is likely to be a perfectly competent laptop — there are no glaring faults that I can see from the list of specs and features, and the pre-production unit I was able to try out ahead of today’s announcement seems mostly fine — but that’s not likely to make the average person choose it over another model.
The pricing that Intel has set expectations for is premium level, but it is a little lower than similar configurations from the big names. It’s definitely lower than you typically pay for a premium 15-inch laptop, though those generally come with higher-end processors, discrete GPUs, and higher resolution screens than the M15 has.
We should have a better idea of how well the M15 fares in the near future, once we have the ability to put a unit through its paces. Until then, this will be something to watch.
Hopeful enthusiasts waited years for virtual reality (VR) to become accessible enough for the home. And with many of us suddenly stuck at home more, the idea of ‘leaving’ and entering a world of VR has become much more appealing.
But it’s not just boredom that’s made VR more enticing; it’s the tech too. Many things had to come together before at-home VR was plausible. Vendors needed to improve head-mounted displays (HMDs) so that VR gaming didn’t lead to nausea. We also needed headsets that were somewhat affordable. Of course, games and apps that make the next-gen tech worthwhile, like Half-Life: Alyx, are crucial. Today, it’s fair to say that VR gaming has all but arrived. We’re here to help you find the best VR headset for you so you can enjoy incredible, immersive games and experiences right at home.
VR has grown so much that there are now various ways to get into VR gaming. There are HMDs that connect to gaming desktops / gaming laptops or smartphones, as well as the PlayStation VR (PSVR), which connects to a gaming console. There are even standalone headsets, or HMDs that don’t need to connect to anything at all. Just strap one on, and you’re in VR. Plus, with distance learning growing, adding VR into the mix can help keep lessons immersive and engaging (Microsoft Flight Simulator counts, right?).
Below are the best VR headsets for PC and gaming that are actually worth escaping reality to enjoy. And if the VR headset you’re after doesn’t include a great set of headphones, be sure to check out our Best Gaming Headsets page so that sound quality and isolation isn’t the weakest link in your VR immersion.
When looking for the best VR headset for gaming, consider the following:
PC-connected VR has the best experience but requires an expensive system. The best VR gaming comes from headsets that you tether to a PC. But a VR-ready gaming PC starts at around $900 for a laptop, or a couple hundred less if you build your own PC. For more wallet-friendly VR, consider standalone HMDs that don’t connect to any system or alternatives that connect to your smartphone.
Is your PC / smartphone powerful enough for VR? Before buying a VR headset that relies on a PC or smartphone connection, you should ensure your device meets the headset’s minimum requirements. Steam has a free test for checking if your PC can handle VR, and we also test this in our gaming laptop reviews. If your PC or smartphone doesn’t meet the headset’s requirements, you might want to increase your budget or buy a standalone HMD instead.
When it comes to specs, bigger is better. In general, the greater the headset’s refresh rate, field of view (FOV), total resolution and pixel density (measured in pixels per inch or PPI), the smoother and sharper games will look.
Make sure your home has enough square footage. Depending on the headset, you may need a notable amount of physical space to properly game. For example, the Oculus Rift S recommends a 3 x 3-foot space minimum, and the PSVR recommends a 10 x 10-foot area.
Mind your glasses. You can usually wear glasses in VR, but some HMDs make this more comfortable than others. Check the headset’s IPD (interpupillary distance, the distance between the pupils in millimeters), which may be adjustable. Better yet, opt for a headset with a glasses spacer, like the Oculus Go or Rift S.
More options are coming. The HP Reverb G2 and XRSpace Mova are coming out this year. At CES 2020 in January, we saw upcoming HMDs hardcore enthusiasts may want to consider. The $450 Pimax Artisan is finally available, and we’re waiting on more news on the enterprise-focused, but incredibly interesting, Pico VR Glasses. And if you’re eagerly anticipating the PlayStation 5, note that the PSVR will work on the console, as Road to VR reported.
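One way to make the “bigger is better” point concrete is angular pixel density: a rough per-degree sharpness figure you can estimate from per-eye horizontal resolution and FOV. This is a simplified illustration with hypothetical numbers (it ignores lens distortion, so real headsets will differ):

```python
def pixels_per_degree(horizontal_pixels: int, fov_degrees: float) -> float:
    """Rough angular pixel density: per-eye horizontal pixels spread across the FOV.

    Ignores lens distortion, so treat the result as a ballpark figure only.
    """
    return horizontal_pixels / fov_degrees

# Hypothetical comparison: two common per-eye panel widths at the same 100-degree FOV.
print(pixels_per_degree(1440, 100))  # 14.4 pixels per degree
print(pixels_per_degree(1832, 100))  # 18.32 pixels per degree
```

All else being equal, the higher figure means a less visible screen-door effect; note that widening the FOV at the same resolution lowers it.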
Best VR Headsets You Can Buy Today
The Oculus Quest 2 is the best VR headset for most gamers today. (Image credit: Tom’s Hardware)
Officially available for purchase today at $399, the Oculus Quest 2 is the best VR headset for most, offering a great upgrade over the original Oculus Quest. Qualcomm’s modern Snapdragon XR2 (Snapdragon 865) SoC proved to be a powerful chip, delivering a fantastic VR experience even without any tethering to a powerful PC or even a smartphone. If you want, however, you can buy an Oculus Link cable for a PC connection.
Oculus bumped the Quest 2’s resolution up to 1832 x 1920 per eye compared to the Quest’s 1440 x 1600 per eye. There’s also a unified panel here instead of one for each eye, as well as the ability to hit up to a 90 Hz refresh rate once the apps arrive.
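To put that resolution bump in perspective, a quick back-of-the-envelope calculation with the per-eye figures above shows the Quest 2 pushes roughly half again as many pixels per eye as the original Quest:

```python
# Per-eye panel resolutions of the two headsets.
quest_pixels = 1440 * 1600   # original Oculus Quest
quest2_pixels = 1832 * 1920  # Oculus Quest 2

increase = quest2_pixels / quest_pixels - 1
print(f"Quest 2 renders {increase:.0%} more pixels per eye")  # about 53% more
```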
But while the HMD is an upgrade over the last generation, the new Touch controllers accompanying the Quest 2 are not. Due to their bulky shape, these Touch controllers are hard to grip and lack balance. Additionally, the Quest 2 is sporting a brand new color, but unfortunately that white gets dirty easily.
Oculus is so sold on standalone VR that it’s discontinuing the Rift lineup of PC-only HMDs, including the Oculus Rift S. So if you want to get into VR, the Quest 2 is the easiest and best way to do it — and at a good price too.
Read: Oculus Quest 2 review
The best VR headset for PC gaming is the Valve Index. (Image credit: Valve)
2. Valve Index
Best VR Headset for PC
Connectivity: PC | Display: 2x LCD, canted | Per-eye Resolution: 1440×1600 | PPI: ? | Refresh Rate: 80, 90, 120 or 144 Hz (experimental) | FOV: Up to 130 degrees | Weight: 1.78 pounds (807.4g)
RGB subpixel array eliminates screen-door effect
Wider FOV than comparable headsets
Excellent audio quality
Very heavy
Less comfortable than the HTC Vive Pro
Cushions are glued on
If you’re looking for the best possible VR experience at home, you should get an HMD that tethers to a PC. Today, the best VR headset for PC is the Valve Index. It comes from Valve, the company behind Steam and the Lighthouse tracking system used by the HTC Vive Pro and HTC Vive. The Index also uses Lighthouse base stations (including those Vive owners would already have), but is a step up for consumers from the Vive Pro.
The Index experience is quite customizable with canted lenses that allow for FOV adjustments of up to 10 degrees. There’s also mechanical IPD control. But the Index is less comfortable than the Vive Pro due to a less balanced distribution of its slightly heavier weight (1.8 pounds versus 1.7 pounds) and a bulky data cable.
Gaming on the Index offers your choice of refresh rate, allowing for up to 144 Hz as an experimental feature. This means you can pick your refresh rate based on your system’s capabilities, but you’ll need a pretty powerful graphics card to surpass 90 Hz. The most exciting part of the kit is the long-anticipated Index controllers, which secure to your hand with various adjustments and allow open-hand movements for in-game actions like picking something up. Additionally, the Index controllers have capacitive touch sensors for finger movements and pressure sensors that can tell a game how firm or light your grip is.
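That refresh-rate flexibility translates directly into frame-time budgets, which is why only fast graphics cards can go past 90 Hz. A quick sketch of the math:

```python
def frame_budget_ms(refresh_hz: float) -> float:
    """Milliseconds the GPU has to render each frame at a given refresh rate."""
    return 1000.0 / refresh_hz

# The Index's selectable refresh rates and their per-frame render budgets.
for hz in (80, 90, 120, 144):
    print(f"{hz} Hz -> {frame_budget_ms(hz):.2f} ms per frame")
```

At 144 Hz the entire render pipeline has less than 7 ms per frame, roughly half the budget of the 80 Hz mode.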
Read: Valve Index review
The Oculus Go is the best VR headset for entry-level VR.
While this is still a great headset, Oculus recently announced that it’s discontinuing the Oculus Go. Since the Go won’t be getting any new features or apps after December 4, its remaining life is limited. However, Oculus will keep providing this budget-friendly headset with security updates until 2022. If you’re looking to futureproof, the Oculus Quest 2 listed above is your next best option for more affordable VR. Sadly, any Oculus headset will require a Facebook login soon.
A quick, easy and affordable way to get into VR, the Oculus Go is the best VR headset for maintaining your budget. Like the Oculus Quest, the Go doesn’t need to connect to a PC or smartphone to work. Bonus: it’s great for glasses-wearers too.
On the other hand, the Go is the only headset here that has just 3 degrees of freedom (3-DoF) instead of 6-DoF. That means you’re not meant to walk around with it. In other words, don’t expect the same quality or level of immersion as you’d get from a PC-connected headset, like the Go’s more capable sibling, the Rift S.
Read: Oculus Go review
Best Windows Mixed Reality Headset: HP Reverb G2 (Image credit: Tom’s Hardware)
The HP Reverb G2 is the best Windows Mixed Reality (MR) headset for most; however, it struggles to compete with the other headsets on this page.
HP’s Reverb G2 does boast some nice improvements over the original HP Reverb, such as the move to antimicrobial materials and a boost in audio quality, thanks to HP using the same speakers found in the Valve Index. However, Windows MR tracking is still lacking. HP upgraded the HMD with two more cameras, but it still can’t match the tracking on other PC-connected HMDs, such as the Oculus Rift S. During testing, the headset would lose sight of our controllers if they were close to our chest or moving rapidly.
The plus side is that the Reverb G2 has fantastic image quality, with a very high per-eye resolution that makes everything from games to text easy to enjoy. If image quality is top of mind, the Reverb G2 tops the list. But for gaming and other apps where the ability to track controllers is imperative, you’ll want to look at other headsets on this list.
With its Radeon RX 6000 series, AMD introduces a feature called Smart Access Memory. The promise is that in specific use cases where the CPU needs to access a lot of the video memory, it can improve frame rates by up to 6%. Announced alongside the RX 6800 series, Smart Access Memory (SAM) is AMD’s branding for the Resizable BAR (Base Address Register) feature the PCI-SIG standardized years ago. AMD realized that this feature can be useful in improving gaming performance.
How it Works
Your processor can typically only access up to 256 MB of your graphics card’s dedicated video memory at any given time. This arbitrary limit dates back to the 32-bit era, when address space was at a premium, and, interestingly, carried over into the 64-bit era. Around this time, newer APIs, such as DirectX 11, relied less on mirroring data between system and video memory. Obviously, we want to be able to transfer data to all of the GPU’s memory, so a windowing mechanism is used whereby the GPU exposes 256 MB of its dedicated memory as a scratchpad through which CPU-bound data is juggled in and out.
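The cost of that window can be sketched in a few lines. This is only a conceptual model of the juggling described above, not how any real driver is implemented:

```python
WINDOW_MB = 256  # size of the classic CPU-visible VRAM aperture

def transfers_needed(buffer_mb: int, window_mb: int = WINDOW_MB) -> int:
    """How many window-sized remap-and-copy steps it takes to touch a whole buffer."""
    return -(-buffer_mb // window_mb)  # ceiling division

# Touching all 16 GB (16,384 MB) of VRAM through a 256 MB aperture:
print(transfers_needed(16384))         # 64 separate remap-and-copy steps
# With Resizable BAR, the entire VRAM is one addressable block:
print(transfers_needed(16384, 16384))  # 1
```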
Another reason nobody even saw this as a “problem” is the enormous amount of memory bandwidth at the disposal of GPUs (relative to system memory), which makes this jugglery essentially “free.” When it came to the Radeon RX 6800 series, which is up against RTX 30-series “Ampere” GPUs with wider memory buses and faster memory devices, the company finally bit the bullet and implemented the Resizable BAR feature as Smart Access Memory. Since this is a PCI-SIG feature that can be added at the driver level, NVIDIA announced that it intends to implement it as well, via a driver update.
Resizable BAR requires UEFI firmware support, and AMD has artificially segmented its support to just its Ryzen 5000 “Zen 3” processor + 500-series chipset combination, possibly as a means to promote the two. It’s likely that NVIDIA’s implementation is broader as it doesn’t have a CPU + chipset platform of its own, and AMD will follow.
Once enabled, the CPU sees the entire 16 GB of video memory on the RX 6800 series as one addressable block. AMD calculates that this helps with certain game engines that leverage the CPU in their 3D rendering stages (think certain kinds of post-processing, etc.). One possible explanation as to why AMD restricted SAM to its 500-series chipset platform is PCI-Express Gen 4. As it stands, PCI-Express 3.0 x16 bottlenecks next-gen GPUs by only a single-digit percentage, as shown in our RTX 3080 PCIe scaling article; AMD may have figured all that untapped PCIe Gen 4 bandwidth could be used by SAM without affecting the GPU’s performance during normal 3D rendering. But this doesn’t explain why you need a Ryzen 5000 processor, and why a Ryzen 3000 “Matisse” won’t do.
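For reference, the bandwidth headroom between the two links is roughly a factor of two. These are general, approximate PCIe figures (effective rates after 128b/130b encoding), not numbers from our testing:

```python
# Approximate effective bandwidth per lane in GB/s after 128b/130b encoding
# (8 GT/s for PCIe 3.0, 16 GT/s for PCIe 4.0).
GEN3_PER_LANE = 0.985
GEN4_PER_LANE = 1.969

lanes = 16
gen3 = GEN3_PER_LANE * lanes
gen4 = GEN4_PER_LANE * lanes
print(f"PCIe 3.0 x16: {gen3:.1f} GB/s, PCIe 4.0 x16: {gen4:.1f} GB/s")
```

If a Gen 4 GPU only needs a fraction of that for rendering, the remainder is free for CPU accesses through the enlarged BAR.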
To enable SAM, you need a 500-series chipset motherboard with the latest UEFI firmware supplied by your motherboard vendor, a Ryzen 5000 processor, and a Radeon RX 6800 series graphics card. Simply enable the “Resizable BAR Support” toggle in the “Advanced” PCIe settings of your UEFI setup program. For these toggles to be available, CSM has to be disabled. This also means that if you’ve been booting from an MBR partition, using CSM, you’ll have to reinstall Windows on a GPT partition. There’s also a conversion mechanism between MBR and GPT, but I haven’t tested that.
In this review, we’re testing using a 500-series chipset motherboard and a Ryzen 9 5900X processor to tell you if Radeon Smart Access Memory is worth the hype and whether it helps the RX 6800 XT gain more against the RTX 3080.
Test Setup
Test System
Processor:
AMD Ryzen 9 5900X
Motherboard:
ASRock X570 Taichi AMD X570, BIOS v3.59
Memory:
2x 8 GB DDR4-3900 CL16 Infinity Fabric at 1900 MHz
While Microsoft’s Project xCloud game streaming platform is already available for Android devices, we’ve known that the company is also working on a Windows 10 version. What isn’t known is when we can expect it to arrive, and this has been a question on the minds of PC gamers whose systems don’t have enough power to play more demanding titles.
And while we still don’t know when xCloud for the PC will be released, we can now at least see Project xCloud in action, thanks to Windows Central. Earlier today, the outlet released a short YouTube video that looks at Project xCloud running on Windows 10.
Interestingly, the video details xCloud running on a Microsoft Surface Pro X, which happens to be powered by an Arm processor, not a conventional x86 processor. When it comes to PC gaming, the market is dominated by x86, and the only way to play just about any PC game on the market with an Arm processor is to emulate x86.
Seeing Project xCloud play a game that otherwise wouldn’t be possible on the Surface Pro X is amazing in its own right. It means that if Microsoft continues to use Arm in upcoming hardware, those devices will still be viable for gaming via Project xCloud.
While I don’t have access to Project xCloud for Windows, I did enjoy a lengthy hands-on with the mobile beta before it went public. I see that a lot of the features in the mobile version made it into the Windows version.
This makes me believe the Windows version is essentially the mobile client brought over to the PC, especially given the end of the video, where we see the touchscreen controls. Granted, if you play on a Surface Pro X like Zac Bowden does in this video, then sure, touchscreen controls make sense. I’d imagine that most PC gamers, even those with touchscreens, would want to use a keyboard or a gamepad / controller. Now I’m curious to see if I can give this a spin and manage to do the same.
For everyone else who’s waiting on Project xCloud for Windows, you can check it out now for mobile devices. Just keep in mind that the service requires a good internet and WiFi connection. If you’re using an older router or access point in your home, it could interfere with the proper operation of Project xCloud.
The CPU: M1 against Goliath
Apple has achieved quite a bit with the M1 SoC. Both the CPU and the GPU offer excellent performance, while power consumption is so low that the chip fits in a passively cooled MacBook Air. New, more independent reviews of the M1 show the power of the chip. It turns out to be faster than the Intel CPUs found in current macOS devices.
AnandTech writes that the M1 in the Mac mini scores 1,522 points in Cinebench R23’s single-threaded test. That’s almost as much as the Tiger Lake-based Core i7-1165G7 tested on Windows, which should officially be able to dissipate 28 watts of thermal power. In the multi-threaded test, the chip scores 7,904 points, almost as much as the Ryzen 7 4800U with double the threads. That chip has to make do with a power consumption of 17 watts. It’s also faster than the aforementioned Core i7, which scores 4,980 points.
Notably, the M1 is faster than the Core i7 even when using the Rosetta 2 translation layer. Rosetta 2 allows users to run apps on the M1 that still use the x86 instruction set of Intel-based Macs. That means less than optimal performance for the Arm chip, but even then it scores 5,257 points, still just above the Tiger Lake CPU. Both chips have eight threads, but half of Apple’s cores are small, power-efficient variants.
In Geekbench, the M1 scores 1,745 points single-threaded, beating even the Ryzen 9 5950X, a newly introduced desktop chip with a TDP of 105 watts. Of course, the Core i9 and Ryzen 9 top models for the desktop beat the M1 in the multi-threaded test, but these are chips with more than double the number of threads, so a performance difference is to be expected.
The Verge has tested the MacBook Air as well as the MacBook Pro and the Mac mini. From those results it can be concluded that the three machines often perform the same, although there is a clear difference in heavier workloads such as the compute test in Geekbench 5.3 and the multi-core test in Cinebench. There the Air performs around 15 percent worse, which probably has to do with its thermal budget.
Apple M1 benchmarks in new Macs – The Verge

Benchmark                              MacBook Air   MacBook Pro   Mac mini
Cinebench R23 Multi                           6803          7729       7729
Cinebench R23 Single                          1494          1519       1520
Cinebench R23 Multi (30-minute loop)          5369          7729       7780
Geekbench 5.3 CPU Multi                       7510          7554       7711
Geekbench 5.3 CPU Single                      1730          1730       1754
Geekbench 5.3 OpenCL / Compute               18357         19211      19654
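From the Cinebench multi-core numbers in the table above, the gap between the fanless Air and the actively cooled machines can be quantified in a couple of lines:

```python
# Cinebench R23 multi-core scores from the table above.
air_single_run, mini_single_run = 6803, 7729  # one-off run
air_sustained, pro_sustained = 5369, 7729     # 30-minute loop

single_run_drop = 1 - air_single_run / mini_single_run
sustained_drop = 1 - air_sustained / pro_sustained
print(f"Air vs. mini, single run: {single_run_drop:.0%} slower")  # about 12%
print(f"Air vs. Pro, sustained:   {sustained_drop:.0%} slower")   # about 31%
```

The deficit roughly triples once the passively cooled Air has to sustain the load, which fits the thermal-budget explanation.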
Just as the CPU is the brains of your computer, the controller is the brains behind your SSD. Though many companies produce SSDs, most don’t make their own controllers. Phison is a leader in the SSD controller space and one of only a few companies that produce the hardware that manages your precious data on the latest flash.
Phison has spearheaded the PCIe Gen4 NVMe SSD market with its PS5016-E16 NVMe SSD controller and has enjoyed staying on top for quite a while. Samsung’s 980 PRO recently took the top-ranking title from Phison, but Phison’s next-gen PS5018-E18 NVMe SSD controller may lead the way back to victory, assuming the final firmware quirks get worked out.
The Prototype with a Speed Governor
Phison was gracious enough to send over an early engineering sample of the PS5018-E18 to play with. However, as exciting as early sampling is, ES units aren’t without drawbacks. The unfortunate part here is that the device is roughly 1-2 firmware revisions away from production and paired with slower than optimal flash. The company officially rates the PS5018-E18 to deliver throughput of up to 7.4 / 7.0 GBps read/write as well as sustain upwards of 1 million random read and write IOPS with next-gen flash.
(Image credit: Tom’s Hardware)
Our prototype comes with 2TB of Micron’s 512Gb B27B 96L TLC flash operating at 1,200 MTps rather than Micron’s recently announced 176L replacement gate TLC flash, capable of saturating the controller’s max interface speed. While this prototype won’t be nearly as fast as the final production units, it is interesting to see how it compares in testing at this point with the current generation flash. A recent news post shows that it is even capable of sustaining a hefty 1.2 million random write IOPS in the configuration we have in our hands today.
Architecture of PS5018-E18 SSD Controller
(Image credit: Tom’s Hardware)
Built from the ground up and produced on TSMC’s 12nm technology node, Phison’s PS5018-E18 is quite a capable PCIe 4.0 x4 SSD controller in terms of features and performance. Phison crammed five Arm Cortex-R5 CPU cores into this thing: three act as primary cores for the heavy lifting, while the other two are clocked lower and run the company’s Dual CoXProcessor 2.0 code to help offload some of the strain from the main cores.
(Image credit: Tom’s Hardware)
The controller interfaces with the NAND over eight NAND flash channels at up to 1,600 MTps and supports capacities of up to 8TB with 32 chip enables. There are eight packages on our sample, four on each side thanks to the small size of the controller that measures just 12 x 12mm. The design leverages a DRAM-based architecture, too, with our sample containing two SK hynix DDR4 chips, one on each side of the PCB.
Features of Phison PS5018-E18 SSD Controller
Phison’s PS5018-E18 meets the NVMe 1.4 spec and comes with a bunch of features. As per usual, it comes with support for both Trim and S.M.A.R.T. data reporting. Like other controllers, it supports Active State Power Management (ASPM), Autonomous Power State Transition (APST), and the L1.2 ultra-low power state. Thermal throttling is implemented, but isn’t of much concern as the new controller doesn’t get too hot in most use cases, and mind you, that is without a nickel integrated heat sink.
It also leverages the company’s fourth-generation LDPC ECC engine, SmartECC (RAID ECC), and End-to-End Data Path Protection for robust error correction and enhanced data reliability. It even supports hardware-accelerated AES 128/256-bit encryption that is TCG, Opal 2.0, and Pyrite compliant and comes with crypto erase capability.
Phison’s E18 supports fully-dynamic write caching like the E12S and E16 before it. Therefore, the SLC cache size spans 1/3rd of the drive’s available capacity when using TLC flash. The company also implemented SmartFlush, which helps to quickly recover the cache for predictable and consistent performance.
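The arithmetic behind fully dynamic caching is worth spelling out: a TLC cell programmed in SLC mode stores one bit instead of three, so a cache holding a third of the free capacity can occupy essentially all of the idle flash. A minimal sketch of that relationship, assuming only the 1/3 ratio stated above (the function name is ours, and real drives shrink the cache as they fill):

```python
def dynamic_slc_cache_gb(free_capacity_gb: float) -> float:
    """Upper bound on SLC cache size for a TLC drive whose cache
    spans one third of the currently free capacity.

    Each TLC cell used in SLC mode holds 1 bit instead of 3, so
    caching free/3 GB of data occupies roughly all of the free flash.
    """
    return free_capacity_gb / 3

# A 2 TB (2000 GB) drive that is half full:
print(dynamic_slc_cache_gb(1000))  # ~333 GB of SLC cache remains available
```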
Test Bench and Methodology
Asus X570 ROG Crosshair VIII Hero (Wi-Fi)
AMD Ryzen 5 3600X @ 4.3 GHz (all cores)
2x8GB Crucial Ballistix RGB DDR4 3600 MHz CL16
Sapphire Pulse Radeon RX570 4GB
Corsair RM850x
The initial results you see in this article are with the SSDs tested at 50% full capacity and with the operating system drive using Windows 10 Pro 1909. Also, note that while some of the new PCIe Gen4 SSDs are capable of 1 million IOPS, our lowly 6C/12T Ryzen 5 3600X can only sustain 650-700K IOPS at most. We will soon upgrade our test system’s CPU to a 12C/24T Zen 3 5900X to push next-gen storage to the max.
2TB Performance of Phison PS5018-E18 SSD Controller
We threw a few of the best SSDs into the mix to gauge the Phison PS5018-E18’s performance. We included two of the top dogs, WD’s Black SN850 and Samsung’s 980 PRO, as well as Adata’s XPG Gammix S50 Lite, an entry-level Gen4 performer based on SMI’s newest SM2267 NVMe controller and 1,200 MTps flash.
We included the Sabrent Rocket NVMe 4.0, which has Phison’s E16 controller and Kioxia’s 96L TLC operating at up to 800MTps, and we added in the Sabrent Rocket Q4, which features Micron’s cheaper 96L QLC flash. Additionally, we threw in Crucial’s P5, Samsung’s 970 EVO Plus, WD’s Black SN750, and AN1500 as some PCIe Gen3 competition.
Game Scene Loading – Final Fantasy XIV
Final Fantasy XIV Shadowbringers is a free real-world game benchmark that easily and accurately compares game load times without the inaccuracy of using a stopwatch.
(Image credit: Tom’s Hardware)
When it comes to game loading, the Phison PS5018-E18 proves more competitive than the E16 before it, but with the current flash, even Samsung’s 970 EVO Plus takes the lead over it in this test. The E18 is not quite as responsive as Samsung’s 980 PRO nor WD’s Black SN850, at least not yet.
Transfer Rates – DiskBench
We use the DiskBench storage benchmarking tool to test file transfer performance with our own custom blocks of data. Our 50GB dataset includes 31,227 files of various types, like pictures, PDFs, and videos. Our 100GB dataset consists of 22,579 files, with 50GB of them being large movies. We copy the datasets to new folders and then follow up with a read test of a newly-written 6.5GB zip file and a 15GB movie file.
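A file-copy throughput measurement of the kind DiskBench performs can be sketched in a few lines of Python. This is an illustrative stand-in, not DiskBench itself: it times a buffered copy through the OS page cache, whereas a proper storage benchmark would also flush or bypass that cache.

```python
import os
import shutil
import tempfile
import time

def copy_throughput(src: str, dst: str) -> float:
    """Copy a file and return the achieved throughput in MB/s."""
    size = os.path.getsize(src)
    start = time.perf_counter()
    shutil.copyfile(src, dst)
    elapsed = time.perf_counter() - start
    return size / elapsed / 1e6

# Example: write a 64 MB test file, then time a copy of it.
with tempfile.TemporaryDirectory() as d:
    src = os.path.join(d, "testfile.bin")
    dst = os.path.join(d, "copy.bin")
    with open(src, "wb") as f:
        f.write(os.urandom(64 * 1024 * 1024))
    print(f"{copy_throughput(src, dst):.1f} MB/s")
```

On a cached filesystem this will report unrealistically high numbers; the point is the measurement shape (bytes moved divided by wall-clock time), not the absolute figure.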
(Image credit: Tom’s Hardware)
When copying around datasets and reading large files, the Phison PS5018-E18 prototype delivered responsive performance, especially strong read performance, but it isn’t quite on par with the 1TB WD Black SN850 and Samsung 980 PRO. When copying our 50GB and 100GB datasets, the Phison PS5018-E18 ranked fourth place, outperforming most of the Gen3 competitors, but trailing WD’s mighty RAID 0 configured Black AN1500.
Trace Testing – PCMark 10 Storage Tests
PCMark 10 is a trace-based benchmark that uses a wide-ranging set of real-world traces from popular applications and everyday tasks to measure the performance of storage devices. The quick benchmark is more relatable to those who use their PCs for leisure or basic office work, while the full benchmark relates more to power users.
(Image credit: Tom’s Hardware)
While previous tests show minor gains over the E16, PCMark 10 quick results look to have degraded compared to the E16 and are a little on the low side. That’s a little strange considering there is now an additional core in its architecture. PCMark 10’s Full System Drive benchmark shows improvement, but the Phison PS5018-E18 is still ranking behind both the new Samsung and WD.
Trace Testing – SPECworkstation 3
Like PCMark 10, SPECworkstation 3 is a trace-based benchmark, but it is designed to push the system harder by measuring workstation performance in professional applications.
(Image credit: Tom’s Hardware)
When hit with harder workloads in SPECworkstation 3, Phison’s E18 delivered fast performance but didn’t eclipse its competition the way Samsung’s 980 PRO did. The company will need to work a bit harder to reach Samsung-like levels here.
Synthetic Testing – ATTO / iometer
iometer is an advanced and highly configurable storage benchmarking tool while ATTO is a simple and free application that SSD vendors commonly use to assign sequential performance specifications to their products. Both of these tools give us insight into how the device handles different file sizes.
(Image credit: Tom’s Hardware)
In ATTO, we tested Phison’s PS5018-E18 at a QD of 1, representing most day-to-day file access at various block sizes. Based on ATTO’s results, the E18 shows the fastest peak sequential results, but once we bumped up the QD, both the Samsung and WD inched ahead in reads.
The E18 came back and demonstrated very responsive write performance, however, peaking at 6.6 GBps. When it comes to random performance, ranking fourth in read performance and first in write performance, the E18 is fairly competitive with the current flash, but not as tuned as its competitors.
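An ATTO-style QD1 sweep — sequential reads at increasing block sizes — can be approximated in Python. The caveats apply here too: this sketch reads through the filesystem and page cache at queue depth 1, while ATTO and iometer issue I/O against the device with caching under control.

```python
import os
import tempfile
import time

def read_throughput(path: str, block_size: int) -> float:
    """Sequentially read a file in block_size chunks; return MB/s."""
    total = 0
    start = time.perf_counter()
    with open(path, "rb") as f:
        while chunk := f.read(block_size):
            total += len(chunk)
    return total / (time.perf_counter() - start) / 1e6

# Sweep a handful of block sizes at queue depth 1, ATTO-style.
with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "sweep.bin")
    with open(path, "wb") as f:
        f.write(os.urandom(32 * 1024 * 1024))
    for kib in (4, 64, 1024):  # 4 KiB up to 1 MiB blocks
        print(f"{kib:>5} KiB: {read_throughput(path, kib * 1024):.0f} MB/s")
```

Small blocks pay a per-call overhead on every read, which is why QD1 throughput climbs with block size — the same trend the ATTO charts show.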
First Thoughts on the PS5018-E18 Prototype
Phison’s PS5018-E18 NVMe SSD controller is impressive on paper and has some fast specs. With five CPU cores, it is just one shy of Crucial’s P5, but isn’t shackled down by a Gen3 PHY and runs much cooler thanks to TSMC’s 12nm technology node.
With our prototype using Micron’s 96L B27B TLC and operating at 1,200 MTps, the controller shows noticeable improvements over the company’s E16 in some workloads, but there are still some kinks to be worked out. Samsung’s 980 PRO and WD’s Black SN850 both have the upper hand for now.
(Image credit: Tom’s Hardware)
The Phison PS5018-E18’s performance will be a lot more interesting to analyze once we have finalized firmware and NAND configurations. With support for up to 1,600 MTps NAND flash, higher speeds are just around the corner and a lot of the performance gap will shrink.
In fact, while Micron announced the supporting NAND only days ago, Phison already has new prototypes with Micron’s faster 176L (B47R) flash in hand, and development is well underway. Retail products are roughly a month or two away.
In response to a new wave of attacks that have compromised standard approaches to Windows security, Microsoft announced its Pluton security processor that will reside inside of future consumer chips from both AMD and Intel – but it’s built using a technology that AMD pioneered with its custom processors for Xbox game consoles. It’s also based on a standard feature with AMD’s EPYC server processor chips. Now Intel will adopt the same approach to help secure PCs.
The new collaboration between Microsoft, AMD, Intel, and Qualcomm will enable more robust security that helps prevent physical attacks and encryption key theft while protecting against firmware attacks. Microsoft will also use the technology to streamline firmware updates via Windows Update.
The Pluton security processor comes as a result of new attack vectors that indirectly compromise the Trusted Platform Module (TPM), which has long been the preferred method of securing PCs from potential threats. The TPM, a small secondary chip inside the system that stores encryption keys for services like BitLocker and Windows Hello, is still robust enough to protect encryption keys, but malicious actors have learned how to penetrate the bus that connects the TPM to the CPU through physical attacks, thus compromising a system.
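On Linux, whether a system exposes a TPM (discrete, firmware, or in the future Pluton-emulated) can be checked at the kernel’s standard sysfs location. A small sketch; the helper name is ours, while `/sys/class/tpm` is the stock sysfs path the kernel’s TPM subsystem uses:

```python
import os

def tpm_devices(sysfs: str = "/sys/class/tpm") -> list[str]:
    """Return TPM device names (e.g. 'tpm0') exposed by the Linux kernel."""
    if not os.path.isdir(sysfs):
        return []
    return sorted(n for n in os.listdir(sysfs) if n.startswith("tpm"))

devices = tpm_devices()
print("TPM present:", bool(devices), devices)
```

An empty list means no TPM is visible to the OS; it says nothing about whether one could be enabled in firmware.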
(Image credit: Microsoft )
Microsoft says that security must be built directly into the processor to prevent those attacks, hence the Pluton security processor. The new approach to securing the system isn’t really new at all, though – AMD pioneered the in-built security processor with the AMD Security Processor (ASP) in the Xbox game console back in 2013. This in-built 32-bit ARM Cortex-A5 processor is sandboxed from the rest of the processor, thus protecting it from attacks with exploits like Spectre, and provides secure encryption key generation and management to enable a hardware root of trust.
AMD uses this same approach for its EPYC server chips and its commercial processors. For the Xbox, AMD’s secure processor runs Microsoft’s Pluton Security Processor firmware to enable tight integration between Microsoft’s software and AMD’s security hardware. AMD says it will be first to enable the same feature on all of its future client CPUs and APUs, though it hasn’t provided a specific timeline for the release.
Meanwhile, Intel says that it will continue to leverage its Hardware Shield feature in vPro, which only comes with specific commercial SKUs, but now also enable the Microsoft Pluton security processor to provide multiple root of trust options. Intel hasn’t provided a timeline for its adoption of the feature but says that it will support Pluton “as a choice.” It’s a bit unclear if Intel means that it hasn’t been forced to adopt the security processor approach or if the statement means the company will not enable the feature on all chips, thus providing customers with the ‘choice’ to purchase more expensive processors with the feature – much like with its vPro-enabled chips.
The Pluton processor, essentially Microsoft firmware/IP that runs on the sandboxed security processor, will emulate a TPM to maintain broad compatibility with APIs like BitLocker and System Guard. Pluton also uses a Secure Hardware Cryptography Key (SHACK) technology that prevents exposing cryptographic keys, even to the Pluton firmware itself, which will ultimately protect user information from physical attacks.
Finally, the Pluton processor secures the firmware updating process, streamlining Windows Update into a more unified and consistent method of updating system firmware. This has become more important as a slew of security vulnerabilities has necessitated a rapid cadence of new firmware releases to plug security holes like Meltdown and Spectre, but the current delivery system is fragmented. By building this functionality into the processor and using it to enable Windows Update to update firmware securely, Microsoft hopes that it, and the silicon vendors, can react to vulnerabilities more quickly.
Microsoft is creating a new security chip that’s designed to protect future Windows PCs. Microsoft Pluton is a security processor that is built directly into future CPUs and will replace the existing Trusted Platform Module (TPM), a chip that’s currently used to secure hardware and cryptographic keys. Pluton is based on the same security technologies used to protect Xbox consoles, and Microsoft is working with Intel, AMD, and Qualcomm to integrate it into future CPUs.
This new chip is designed to block new and emerging attack vectors that are being used to compromise PCs, including CPU security flaws like Spectre and Meltdown. Intel revealed back in 2018 that it was redesigning its processors to protect against future attacks, and Pluton is an even bigger step in securing CPUs and Windows PCs in general.
Existing TPMs are separated from CPUs, and attackers have also been developing methods to steal the data and information that flows between a TPM and CPU when they have physical access to a device. Just like you can’t easily hack into an Xbox One to run pirated games, the hope is that it will be a lot more difficult to physically hack into a Windows PC in the future by integrating Pluton into the CPU.
Pluton is the same security found in Xbox One consoles. Photo by James Bareham / The Verge
“We shipped the Xbox which has this physical attack protection, so people can’t just hack it for games etc,” explains David Weston, director of enterprise and OS security at Microsoft. “We learned principles of effective engineering strategies from that, and so we’re taking those learnings and partnering with Intel to build something for the PC that will stand up to that emerging attack vector.”
A number of firms sell kits, or 0-day vulnerabilities, that let attackers gain access to machines and literally crack open PCs to steal critical data that can unlock other ways to get into company systems or access personal information. “Our dream for the future is that’s just not possible on the PC platform,” says Weston.
Pluton is essentially the evolution of the TPM, baked directly into a CPU. “This is a better, stronger, faster, more consistent TPM,” explains Weston. “We provide the same APIs as TPM today, so the idea is that anything that can use a TPM could use this.” That means features like BitLocker encryption or Windows Hello authentication will transition over to using Pluton in the future.
Microsoft’s work with Intel, AMD, and Qualcomm also means that Pluton will be updated from the cloud. Updates will be issued monthly on the same Patch Tuesday that regular Windows fixes arrive. The hope is that this should improve system firmware updates for both consumers and businesses that run Windows PCs.
Windows Hello will transition to Pluton in the future. Photo by Tom Warren / The Verge
It’s not clear when PCs with Pluton chips will start shipping, but Intel, AMD, and Qualcomm are all committing to build this functionality into their future CPUs. You’ll still be able to build custom PCs with Pluton chips embedded inside, and there should even be support for Linux in the future, too.
“This is a future thing we’re going to build in,” says Mike Nordquist, director of strategic planning and architecture at Intel. “The idea is that you don’t have to look for a motherboard with a TPM chip… so you just get it.” Nordquist says Intel also supports choice for operating systems, and that it doesn’t “want to start doing different things for a bunch of different OS vendors.” There are no firm details on Linux support just yet, but Microsoft already uses Linux with Pluton in its Azure Sphere devices, so it’s likely to be available whenever these chips ship.
New chips and security do mean new fears about DRM, and the fact that processors will now call back to Microsoft’s cloud infrastructure for updates. “This is about security, it’s not about DRM,” explains Weston. “The reality is we’ll create an API where people can leverage it, it’s definitely possible for folks to use that for protection of content, but this is really about mainstream security and protecting identity and encryption keys.”
Microsoft, Intel, AMD, and Qualcomm all clearly believe that continually updated processors with security built in are the future for Windows PCs. Spectre and Meltdown were a wake-up call for the entire industry, and Pluton is a significant response to the complex security threats that modern PCs now face.
Adobe is releasing Arm versions of Photoshop for Windows and macOS today. The beta releases will allow owners of a Surface Pro X or Apple’s new M1-powered MacBook Pro, MacBook Air, and Mac mini to run Photoshop natively on their devices. Currently, Photoshop runs emulated on Windows on ARM, or through Apple’s Rosetta translation on macOS.
Native versions of Photoshop for both Windows and macOS should greatly improve performance, just in time for Apple to release its first Arm-powered Macs. While performance might be improved, as the app is in beta there are a lot of tools missing. Features like content-aware fill, patch tool, healing brush, and many more are not available in the beta versions currently.
Adobe lists a number of known issues for both macOS and Windows, but does note that new features will be added in the weeks ahead. The beta version isn’t officially supported for daily workloads just yet, and is only accessible from the beta apps tab in the Creative Cloud desktop app. Adobe hasn’t mentioned when other Creative Cloud apps will make the transition to ARM64, but Photoshop is a big boost for Arm-powered devices.
Alongside Photoshop’s Arm support, Blizzard also announced this week that World of Warcraft will run natively on Arm-powered Macs. The latest World of Warcraft includes native M1 support from day one, avoiding the Rosetta translation layer just in time for the launch of World of Warcraft: Shadowlands later this month.
An October update to Malwarebytes’ security software caused network printers on Windows computers to be blocked and reported to the system as offline. Malwarebytes has now released a beta update that makes network printers accessible again.
On October 20, 2020, a first user posted in the Malwarebytes forum complaining about a conflict with his network printer. The printer, addressed via the TCP/IP protocol, was blocked by Malwarebytes, and Windows reported the device as offline. As soon as the web protection feature in the Malwarebytes software was switched off, printing was possible again.
Other affected users confirmed in the Malwarebytes forum that the October 2020 update caused the problem with network printers addressed via TCP/IP. At the beginning of November 2020, a Malwarebytes employee stated in a forum post that the manufacturer was investigating the issue. Affected users could temporarily disable the “Web Protection” feature via the context menu for the duration of a print job.
During testing, one affected user found that deactivating the “SNMP Status Enabled” option in the network printer’s properties also worked around the problem.
Beta fix
Since the beginning of November 2020, Malwarebytes has been working to eliminate the error, releasing partial fixes for certain Windows versions as betas. On November 11, Malwarebytes released beta version 4.2.3.20 of its protection software with component package 1.0.1112. This update is intended to fix the offline-network-printer problem on all Windows systems. Bleeping Computer describes how Malwarebytes Premium users can obtain the beta via the update function.
This is not the first time a Malwarebytes update has caused problems: under Windows 10 version 2004 there were blue screens and performance issues, and in 2018 software updates led to significant problems as well.
Apple HomeKit is becoming mesh-capable: with the HomePod mini, the manufacturer is launching its first device with Thread support this week. The small speaker is not only a “home hub” through which HomeKit devices in the house can be controlled while on the move, but also serves as a “border router” for Thread: it connects the mesh network to the outside world so that the connected devices can be automated, grouped, and controlled via Siri voice commands or the Home app.
HomeKit via Thread at Eve and Nanoleaf
The HomePod mini only supports Thread on devices that also speak the HomeKit protocol, as Apple previously announced. With Eve and Nanoleaf, the first third-party manufacturers have now announced compatible products.
Eve plans to add Thread support via firmware update in November to its window and door contact sensors (Eve Door & Window) and its networked Eve Energy socket (EU version). Thread updates are to follow for other devices, including the Eve Thermo radiator thermostats. The updates will only be available for devices manufactured in 2020, as Eve announced; only these have the chipset required for Thread support.
Nanoleaf has announced new LED lamps (A19 bulbs and light strips) with Thread support; a specific sales date is still pending. Just like the Eve sensors, Nanoleaf’s accessories also continue to work via Bluetooth.
Mesh network for a better smart home
HomeKit via Thread promises to eliminate typical problems with networked home devices: the mesh network should make multiple hubs and/or Bluetooth extenders superfluous because the devices communicate directly with each other. Thread also promises faster response times and higher reliability.
The HomePod mini is currently the only way to use Thread with HomeKit hardware. Whether Apple will retrofit Thread support to the “large” HomePod or to existing home hubs such as the Apple TV via software, or only support it in new device generations, remains open for the time being.
As is well known, Linus Torvalds does not hold back his opinions and often participates in the forum of the Real World Technologies website. There he was asked for his opinion on the new MacBooks with “Apple Silicon,” i.e., the new Arm-based M1 processor. Torvalds “would love to” use such a notebook, but only if Linux runs on it. “I’ve been waiting for an ARM notebook with Linux for a long time,” he added.
Linux basically runs problem-free on systems-on-chip (SoCs) with Arm processor cores, such as the Raspberry Pi, Arm Chromebooks, and Arm servers. The iSH app even allows the use of a Linux shell on iOS devices.
Linux hurdles with the M1
But some special features of the new Apple M1 computers make booting Linux more difficult. So far there are no Linux drivers for Apple’s own graphics processors, which are also found in the M1. The M1 Macs also lack Boot Camp (among other things), which enables the parallel installation of alternative operating systems such as Windows or Linux on x86 Macs.
One reply to Torvalds’ forum post at realworldtech.com is particularly interesting because it comes from the hacking team that developed the iOS jailbreak “checkra1n.” The team is working on bringing Linux to Apple’s M1 systems and has identified other important hurdles.
Booting Linux should work in principle because Apple’s Secure Boot has a “Permissive” mode in which you can enroll your own cryptographic signatures.
However, like Apple’s A-series processors, the M1 does not boot with EFI firmware but with iBoot. According to the checkra1n team, it should nevertheless be possible to chainload a UEFI bootloader, including ACPI tables, in order to start Linux; Arm servers, for example, boot according to the SBSA specification.
Hypervisor solution
One idea is a slim hypervisor that loads Linux into a VM and emulates certain functions. This is necessary because Apple built in a special interrupt controller, the Advanced Interrupt Controller (AIC), rather than the common Arm Generic Interrupt Controller (GIC).
Forum participants also see open issues in the SoC’s power management, via the Power State Coordination Interface (PSCI) and the Collaborative Processor Performance Controls (CPPC). One participant estimates that it will take a long time before Linux boots on Apple M1 computers.
Sony PlayStation 5 DualSense Controller (Image credit: Sony PlayStation)
A recent Steam beta client update has added basic support for Sony’s PlayStation 5 DualSense controller.
One of the first questions I asked when Sony revealed its newest gaming controller for the PlayStation 5 was whether it would include native Windows/PC support. I eventually found out that the answer was no. Fast forward to the official release of the DualSense controller. I had one shipped to me, and the first thing I did was connect it to my Windows 10 PC via USB Type-C cable and Bluetooth. While the controller was recognized and usable to an extent, it was clear that it wasn’t natively supported. With no official word from Sony, there was no telling if or when we’d see any support for the DualSense when it came to PC gaming.
Thankfully, as with most things with PC gaming, there’s always a workaround or someone out there working on a solution. It turns out that for the DualSense controller, solutions are being worked on.
The first one comes via a recent Steam beta client update that has enabled preliminary support. This is just the initial push to get DualSense support added. It lacks many advanced features, such as trackpad, rumble, and gyro functionality, but it works.
Upon hearing about the Steam beta, I downloaded it, restarted Steam, and plugged in my DualSense controller. Sure enough, it was recognized… but as a PlayStation 4 controller. Taking a look over at the Steam Community Forums, that is the expected result. Everything looked to be in order, so I gave it a try in several games, including Streets of Rage 4, Tekken 7, Mortal Kombat 11, and Yakuza: Like a Dragon. All of them worked, though without rumble support.
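Hosts tell controllers apart by their USB vendor and product IDs, which is why a DualSense can show up under a DualShock 4 profile until the client learns the new ID. A sketch of that lookup; the IDs below are the commonly reported values for Sony pads and should be treated as reference assumptions, not an official list:

```python
# Commonly reported USB IDs for Sony controllers (reference values,
# not an official Sony list).
SONY_VID = 0x054C
SONY_PADS = {
    0x05C4: "DualShock 4 (v1)",
    0x09CC: "DualShock 4 (v2)",
    0x0CE6: "DualSense",
}

def identify_pad(vendor_id: int, product_id: int) -> str:
    """Map a USB vendor/product ID pair to a controller name."""
    if vendor_id != SONY_VID:
        return "unknown vendor"
    return SONY_PADS.get(product_id, "unknown Sony device")

print(identify_pad(0x054C, 0x0CE6))  # DualSense
```

A client that lacks the 0x0CE6 entry would fall through to its generic or closest-known profile, which matches the "recognized as a PS4 controller" behavior described above.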
The fact that Valve has already added this to Steam is a godsend. And while Valve is no longer supporting its own gaming controller, it’s nice to see that it’s made it a mission to support just about any and every gaming controller that can be connected to a PC. I fully expect Valve to get every function of the DualSense working with the Steam client, just as they did with the DualShock 4 controller.
While of course only time will tell, it feels like it’s just a matter of time before Steam gets almost all the features enabled — outside of the haptic feedback, which would need in-game programming to work.
The updates close two security holes. One of them can be used to remotely inject and execute code.
Citrix has released security updates for its virtualization products XenDesktop and XenApp. They close two vulnerabilities, one of which (CVE-2020-8270) can be used to inject and execute code remotely (Remote Code Execution, RCE). To exploit it, however, the affected system must offer Windows file shares that the attacker can access.
The manufacturer provides further information and download links for hotfixes in CTX285059: Citrix Virtual Apps and Desktops Security Update.
(ju)