Google Assistant is now available on newer Samsung smart TVs in the US, the companies announced. The AI helper will be available on Samsung 2020 smart TV models, including the 2020 8K and 4K QLED models, the 2020 Crystal UHD TVs, 2020 Frame and Serif sets, and 2020 Sero and Terrace models. The update will be available in the US first and roll out to more countries soon, according to Google.
Pressing down on the TV’s remote control mic will activate Assistant, and users will be able to switch channels, open apps, and adjust the TV’s volume with voice commands. The TVs already had voice command settings available that could be controlled with the mic on the remote, but you’ll have to tell it you want to use Google Assistant rather than Samsung’s default Bixby assistant. And you can control other smart home devices you have connected to Google Assistant, too. After installing an OTA update, users can enable Assistant on their Samsung TV by going to Settings > General > Voice > Voice assistant.
There were hints that Samsung was considering more uses for Google Assistant, with reports over the summer that the companies were negotiating to feature Assistant more prominently on Samsung devices. It’s not totally clear what this means for Bixby, which Samsung introduced in 2017, but the company has reportedly started phasing out some Bixby augmented reality features.
(Pocket-lint) – Samsung’s main focus may be on 8K tellies these days, but given the current lack of content at that resolution, many buyers are more interested in a 4K alternative. The Q95T represents the top of the company’s QLED 4K line-up, while boasting many of the features found on the higher-end 8K models.
These features include a direct full-array backlight with local dimming, the latest AI-enhanced upscaling, an anti-reflection screen, Object Tracking Sound (OTS), gaming features, a One Connect box to keep cables out of sight, and the full Tizen OS with a comprehensive choice of streaming services.
Design, connections and control
One Connect Box
Wi-Fi; Bluetooth; AirPlay 2
Single fibre optic cable that includes power
4x HDMI inputs (one with eARC); 3x USB; Ethernet
The Samsung Q95T sports the company’s ‘boundless’ design, with a 2mm wide frame around the top and side edges of the chassis, and a slightly wider frame along the bottom. It’s impressively sleek, although not quite as cool as the 8K ‘infinity’ screen, where the image goes out to the edge.
Overall the Q95T is a stunning piece of industrial design, with a brushed metal outer edge, a textured rear panel, and an anti-reflection filter on the front. The build quality is excellent, and the whole set measures only 35mm deep, despite containing a direct full array LED backlight and six speakers.
The solid stand mirrors the panel’s sleek styling, and is angled forward to create the illusion the image is elegantly floating in space. The stand is finished in carbon silver, measures 300 x 280mm, and there’s 110mm of clearance if you’re planning to use a soundbar.
If you’d rather wall-mount the Q95T, there’s the optional ‘No-Gap’ bracket, while the included One Connect box allows the screen to be connected by a single thin cable, making installation mess-free. The TV comes with a nearly-invisible 10m cable, but an optional 15m version is also available.
The One Connect box houses four HDMI inputs, one of which (HDMI 3) supports eARC (which you can read more about here). One of the HDMI inputs is capable of handling 4K/120Hz, VRR (variable refresh rate) and ALLM (auto low-latency mode), making this TV a great choice for gamers – unless you’re planning on buying both competing next-gen consoles, as only one input supports these features.
There are also two USB 2.0 inputs, twin tuners for terrestrial and satellite broadcasts, a CI slot, an optical digital output, an Ex-Link 3.5mm jack, and an Ethernet port. In terms of wireless connections, there’s Wi-Fi, Bluetooth and support for Apple AirPlay 2.
There are two remotes: a basic black version and a sleek metal zapper with an ergonomic shape that makes it comfortable to hold and easy to use with one hand. There’s a microphone for voice control, and direct access buttons for Netflix, Amazon and Rakuten TV.
Features: AI-enhanced imaging and next-gen support
High Dynamic Range (HDR) support: HDR10, HLG, HDR10+
Processing engine: Quantum 4K Processor with AI
Claimed 100% of DCI-P3 coverage and 2,000 nits peak brightness
Anti-reflection screen and Ultra Viewing Angle
The Samsung Q95T boasts the 2020 iteration of the company’s Quantum Processor 4K. This uses AI-enhanced machine learning to deliver superior upscaling and image processing, as well as an Adaptive Picture mode to adjust the image on-the-fly – although the latter mode is best avoided in our opinion.
The Quantum Processor also analyses the audio signal and the TV’s location in the room to optimise the sonic performance. There’s Adaptive Sound+ (which Samsung describes as a real-time scene-by-scene analysis to identify sound type), along with Active Voice Amplifier for clearer dialogue, and Adaptive Volume that adjusts the volume based on content.
For 2020 Samsung has added Object Tracking Sound (OTS), which uses six speakers and 60W of amplification for a more immersive sonic experience. Along with the left and right speakers at the bottom, there are a pair of subs and additional speakers at the top for greater immersion.
There’s support for high dynamic range, specifically HDR10 and HDR10+, with dynamic metadata designed to tailor the performance on a scene-by-scene basis. The Q95T also supports HLG (hybrid log-gamma), which is the new broadcast standard used by the BBC and others.
Samsung claims a peak brightness of 2,000 nits and 100 per cent coverage of DCI-P3 colour, but in our actual measurements the peak brightness topped out at 1,700 nits and the colour gamut reached 94 per cent of DCI-P3 – which is still really good, just not as claimed.
The Q95T includes a couple of features first introduced in 2019: an anti-reflection screen and Ultra Viewing Angle technology. The former is designed to reduce reflections from ambient light in the room, and works well, making this an effective TV for daytime viewing.
The Ultra Viewing Angle tech addresses an inherent limitation in the VA panels that Samsung uses in its QLED TVs. This innovation significantly reduces the colour and contrast drop-off experienced at more extreme viewing angles, and proves highly effective.
The Q95T uses a direct LED backlight with Quantum Dot filters that produce brighter and purer colours, thus expanding the colour gamut. There’s also local dimming, and while Samsung’s algorithm is highly effective, we counted only 120 independent zones, fewer than on the previous model.
Samsung has also added a Filmmaker Mode, which is designed to deliver an image that represents the content creator’s original intentions. It uses brightness and colour settings to match the industry standards, and turns off any unnecessary processing or frame interpolation.
Picture quality: An impressive all-rounder
The Samsung Q95T might not be quite as well-specified as the previous Q90R, but the performance remains impressive. The TV delivers a clean and detailed image regardless of whether you’re watching standard dynamic range or HDR content, the black level delivery is solid, highlights are suitably bright, and colours pleasingly saturated.
The AI-enhanced image processing and upscaling is highly effective, even making standard definition content watchable. There are no obvious artefacts, although the lack of sharpness and clarity is obvious once you switch to the same programme in high definition. Working with higher resolution content, the processor is able to squeeze out every last pixel of detail.
When you feed the Q95T a 4K signal it’s able to really shine, revealing a beautiful presentation that’s bursting with fine detail, nuanced colours, and inky blacks. The local dimming is applied with remarkable skill, despite the reduced number of zones. As a result a dark show like The Haunting of Hill House remains clearly defined, and never descends into a morass of smudged greys.
Given its peak brightness this TV doesn’t need to tone map 1,000 nits content at all, but does an excellent job of mapping 4,000 nits content, ensuring there is detail in both the shadows and the highlights. The local dimming algorithm also delivers bright highlights without any noticeable blooming, which enhances the overall experience.
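To illustrate what tone mapping is doing here, below is a minimal, hypothetical sketch in Python: content graded within the panel’s measured 1,700-nit peak passes through untouched, while a brighter 4,000-nit master is compressed above a knee so highlight detail survives rather than clipping. The knee position and the linear roll-off are our illustrative assumptions; Samsung’s actual algorithm is proprietary.

```python
# Minimal, illustrative HDR tone-mapping sketch (not Samsung's algorithm).
# Content mastered within the panel's peak passes through unchanged; brighter
# masters are compressed above a knee so highlights aren't simply clipped.

def tone_map(nits_in: float, content_peak: float, panel_peak: float = 1700.0) -> float:
    if content_peak <= panel_peak:
        return nits_in                   # e.g. 1,000-nit content needs no mapping
    knee = panel_peak * 0.75             # assumed knee point, for illustration only
    if nits_in <= knee:
        return nits_in                   # shadows and midtones left untouched
    # Compress everything above the knee into the panel's remaining headroom.
    t = (nits_in - knee) / (content_peak - knee)
    return knee + t * (panel_peak - knee)

print(tone_map(800, 1000))    # 800.0  -> passed through
print(tone_map(4000, 4000))   # 1700.0 -> content peak lands on panel peak
print(tone_map(2000, 4000))   # ~1388  -> compressed, but detail retained
```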
The use of Quantum Dots allows for a superior colour volume, even at high brightness, which allows the Q95T to take full advantage of the wider colour gamut used in HDR. A great example is Guardians of the Galaxy Vol.2, which is a riot of saturated colours that the Samsung renders with wonderful levels of detail.
The Q95T also impresses when it comes to motion handling, and with Picture Clarity settings turned off any 24p content looks smooth, with no judder or unwanted artefacts. The motion settings apply frame interpolation, resulting in smoothing, which can be useful when viewing sports; while the LED Clear Motion setting uses black frame insertion, darkening the image, but improving the motion.
The built-in apps produce some superb 4K and HDR images, and in the case of Amazon Prime there’s the added enhancement of HDR10+. The levels of detail and contrast are equally impressive when watching Netflix, Apple TV+ and Disney+, and given all three of these apps use Dolby Vision, it’s disappointing that the Q95T doesn’t support the format.
The Q95T offers numerous features aimed at the PS5 and Xbox Series X next-gen consoles. These include VRR (variable refresh rate) for syncing the TV’s refresh rate with the console’s frame rate, thus reducing tearing, along with support for 4K at 120Hz. Samsung’s QLED TVs also support AMD Freesync, but the company is still in the process of certifying Nvidia G-Sync.
There’s ALLM (auto low-latency mode) for automatically detecting a console and selecting the Game mode, resulting in a blink-and-you-miss-it input lag of just 9.4ms. However, this can result in a degree of flicker, so Game Motion Plus is designed to smooth out motion. It’s effective, but it does increase the lag to 22.5ms – although that’s still a very fast response time.
Finally there’s a multi-view mode that allows you to watch two different sources simultaneously. You can adjust the size of the two picture-in-picture screens, change their relative position and choose which has audio priority. This feature isn’t necessarily game-specific, but it’s useful for gaming while watching YouTube tutorials.
Smart features: Comprehensive platform
Tizen OS
Alexa/Google Assistant/Bixby built-in
Samsung’s smart platform is based around the Tizen operating system, and remains a slick, intuitive and easy-to-navigate interface. There’s a launcher bar along the bottom and a second layer that provides faster access to the video streaming services.
When it comes to streaming apps there’s a comprehensive choice, with Netflix, Amazon, Now TV, Rakuten, YouTube, and all the UK TV catch-up services. Samsung also supports Disney+ and Apple TV+, giving its platform a full house of video-related apps.
The Universal Guide helps you keep track of all this content by presenting it all via a user-friendly interface. It then uses AI machine learning to analyse your viewing habits and create a single ‘For You’ page with personalised content to suit your tastes.
New for 2020 is the Digital Butler, which allows for quick and easy connection by automatically scanning for nearby devices and then representing all of them in an easy-to-understand graphical fashion.
The Q95T also offers the benefits of built-in smart assistants, with Samsung currently offering its own Bixby, along with Amazon Alexa. The company plans to add Google Assistant at some point, and you can even access Siri via Apple’s AirPlay 2.
The only annoying aspect of Samsung’s smart system is the requirement that you create a Samsung account when first setting up the TV (assuming you don’t already have one). This blatant attempt to tie you into the company’s ecosystem is frustrating and time-consuming.
Sound quality: Object tracking sound
Object Tracking Sound (OTS)
Adaptive Sound+
Q Symphony
The Q95T’s six speakers are seamlessly integrated, hidden behind a pattern of tiny holes in the outer edge of the TV cabinet. The sound quality is also impressive, with an open soundstage and plenty of power in the amplification.
When you first install the TV there’s a sound optimisation feature that sends out test tones and measures them using built-in microphones. This allows you to optimise the sonic performance depending on whether the TV is stand- or wall-mounted.
Object Tracking Sound doesn’t just involve more speakers, it also analyses the audio signal and uses sophisticated processing to align sounds with the location of specific images on the screen. It really works, creating an engaging experience with improved directionality and immersion.
There’s no on-board Dolby Atmos decoding, but the Q95T can send Atmos back via ARC from its internal apps to a supporting soundbar or AV receiver. Since it also supports eARC, the Q95T can even pass lossless audio back via HDMI to a supporting soundbar or AV receiver.
If you have a 2020 Samsung soundbar, you can also benefit from Q-Symphony which provides audio synergy with the TV. This feature enables the speakers in the soundbar to work in conjunction with the top speakers on the Q95T, resulting in an increased sense of immersion.
Anything missing?
As with all of Samsung’s TVs the Q95T doesn’t support Dolby Vision – an HDR format that uses dynamic metadata to deliver a more refined experience. There is support for HDR10+, which is similar, but there’s significantly more content available in Dolby Vision.
Unlike its predecessor, the Q95T uses an 8-bit+FRC panel, which simulates 10-bit gradation through rapid temporal dithering, and it would appear Samsung is only using native 10-bit panels on its 8K line-up in 2020. While it’s a shame that a high-end 4K model isn’t 10-bit, the reality is you almost certainly won’t notice any difference with actual viewing material.
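As a rough illustration of how an 8-bit+FRC panel approximates 10-bit gradation, the sketch below alternates between the two nearest 8-bit values across successive frames so the time-averaged output lands on the 10-bit target. The 4-frame cycle and simple ordered pattern are our assumptions; real panels use more sophisticated spatio-temporal dithering.

```python
# Sketch of frame rate control (FRC): an 8-bit panel approximates a 10-bit
# level by alternating between its two nearest 8-bit values over successive
# frames, so the time-averaged brightness is close to the 10-bit target.

def frc_frames(level_10bit: int, num_frames: int = 4) -> list:
    """Return the 8-bit values displayed over `num_frames` frames."""
    base, frac = divmod(level_10bit, 4)   # 10-bit has 4x the steps of 8-bit
    # Show the higher 8-bit value in `frac` out of every 4 frames.
    return [min(base + 1, 255) if (i % 4) < frac else base
            for i in range(num_frames)]

frames = frc_frames(514)                  # a level between 8-bit 128 and 129
print(frames)                             # [129, 129, 128, 128]
print(sum(frames) / len(frames) * 4)      # 514.0 -> averages to the 10-bit target
```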
While we’re on the subject of Dolby, the Q95T also doesn’t include on-board Atmos object-based audio decoding. It’s a shame because with all those extra speakers for Object Tracking Sound, the results would be impressive. At least the Q95T can pass lossless Atmos back via eARC.
Samsung’s smart platform is undeniably comprehensive, but it doesn’t include Freeview Play. This isn’t really an issue because all the UK TV catch-up services are present and correct, but it does mean these aren’t integrated within the EPG (electronic programme guide).
Verdict
The Samsung Q95T proves to be an excellent all-round 4K performer, offering the benefits of QLED brightness and colour and many of the features found in the higher-end 8K models. This TV delivers a fantastic picture with both standard dynamic range and HDR, and thanks to a host of gaming features and an incredibly low input lag, it’s also ideal for hard-core gamers.
The inclusion of six speakers and Object Tracking Sound results in a TV that sounds really good, making for a nice change, while there are also useful features like eARC. In addition the smart platform is slick and responsive, plus it boasts every app you’re ever likely to need. In fact the only real complaint is Samsung’s continued refusal to embrace Dolby Vision.
The Q95T is a cost-effective choice for those not yet convinced by 8K. It is a downgrade on 2019’s 4K flagship, but it still offers many of the features found in the higher-end 8K range, while delivering cracking pictures and including support for the next-gen consoles – making it a great choice for gamers.
Also consider
LG CX
This 4K OLED is similarly priced and offers an equally slick smart platform and comprehensive choice of apps. The picture quality is impressive, with effective AI upscaling, and while not as bright with HDR, the black levels are superior. The LG is also a gamer’s dream, and while there’s no HDR10+ support there is Dolby Vision, which is preferable.
(Pocket-lint) – The final version of the Galaxy S20 to launch in 2020 is designed to offer the fans what they want. That’s behind the name – Samsung Galaxy S20 FE, or Fan Edition – while strategically, it gives Samsung another roll of the S20 dice, to try and attract those who didn’t take a bite of the apple the first time around.
Make no mistake, this is a flagship phone that’s been pulled back slightly to land at a much more appealing price – and that makes this a winning Samsung Galaxy phone. Indeed, it makes it a winning flagship choice – whether you’re a Samsung fan or new to the brand.
Let’s talk about the ‘Glasstic’ design
Six colours: Cloud Navy / Lavender / Mint / Red / White / Orange
Dimensions: 74.5 x 159.8 x 8.4mm / Weight: 190g
Plastic back (AKA ‘Glasstic’)
IP68 waterproof
A lot is said about the switch from glass to plastic – what Samsung calls Glasstic. There’s a seam of criticism that’s outspoken on social media about what constitutes premium and what doesn’t. For the Galaxy Note 20, a lot of people – presumably who had never seen the phone, let alone held it – dismissed it as wrong because it was plastic, not glass.
That’s a hard position to take, in reality, because the durability of plastic is so good. Cutting the number of glass surfaces in half means there’s less to smash for starters. And having now lived with the Samsung Galaxy S20 FE for a couple of weeks, it’s easy to see the materials debate as a storm in a teacup.
The Glasstic back is warmer to the touch, it feels softer, it has a matte finish, and it’s a lot easier to keep clean. There’s barely a smudge or smear on the thing, unlike the glass back of so many other phones. It’s cheaper, sure, but the phone is cheaper too (unlike the Samsung Note 20, which is where this controversy started).
But importantly the fit, finish, look and feel of the Galaxy S20 FE isn’t cheap. In fact, we prefer it, along with that assurance that if we do drop it, we’re unlikely to see a catastrophic shattering of the back of the phone. Yes, we’ve broken plenty of glass phones in our time.
The other big thing you’ll notice from a design point of view is that the display is flat. That makes the bezels more visible, but they’re actually very neat. It’s slightly strange that the punch hole camera seems to have a reflective ring around it; it doesn’t look deliberate, but it certainly catches the eye and that means it doesn’t blend in quite as well as some do. In reality, that doesn’t matter – we’ve not noticed it much when using the phone day-to-day, so we’re happy to set it aside as a peculiar quirk.
Otherwise the Galaxy S20 FE is very much an execution of Samsung design, wrapped in a package with IP68 protection – meaning it’s waterproof to 1.5m deep for up to half an hour. It’s great to have protection at this level, as it’s usually the preserve of the most expensive phones.
Display
6.5-inch AMOLED panel, 120Hz refresh rate, 2400 x 1080 resolution
Under display fingerprint sensor
Samsung is known for its display panels and the S20 FE doesn’t disappoint. With the curved edges gone, there’s perhaps slightly less wow factor to this phone. But at the same time it avoids the downsides of curved screens – namely that they can introduce phantom touches around the edges or that the extreme edges are less responsive.
That’s something we’ve noticed in games in the past – and having put the S20 FE through many hours of Call of Duty Mobile, we’re actually very happy with it. The size, too, is great, because it feels like this is the sweet spot for us, remaining manageable while giving content enough space to shine.
It’s a Full HD+ display with a 120Hz refresh rate, designed to reduce blur in fast motion. That can be particularly important in games and there’s a growing list of titles that support faster refresh rates, from the likes of Real Racing 3, through to the forthcoming Call of Duty Mobile update.
Whether that makes a difference to you will depend on a lot of things – how much time you spend gaming, how keen your eyes are, how much you spend fast scrolling through information-dense apps – but in many cases, the celebration of faster refresh rates is a little misguided, with many confusing software optimisation and hardware performance. Bad software or a slow user interface (UI) is still just as bad at 120Hz.
Look for that 120Hz smoothness and you’ll find it, but Samsung still gives you the option to switch back to 60Hz, which potentially reduces battery drain as the display isn’t working as hard. For this review we’ve left the phone at 120Hz because the drain on the battery hasn’t been a concern.
As with all Samsung devices, the brightness and pop of the display really makes a difference. Moving to this phone from a mid-range device – albeit a good one, the Nokia 8.3 – there’s still a clear quality difference between Samsung’s panels and some alternatives.
The vibrancy of the colours is a delight, bringing a richness to your content so things just look better than on other devices. You can dial down the saturation if you find it too rich, and we also really like that Samsung’s One UI lets you toggle adaptive brightness from the slider, so it’s easy to switch auto-brightness off when you don’t want it – such as when gaming – and back on again when it gets dark.
The richness of the display can crush some darker tones though. While the deep blacks are impressive – a signature look of AMOLED displays – that can mean shadow detail in movies vanishes, so you might have to pump the brightness up a little.
There’s also one minor thing that might affect those with polarised sunglasses: the screen dims radically in portrait orientation. But it’s not a complete blackout, so it shouldn’t be a huge issue come summertime.
Hardware and specs
Qualcomm Snapdragon 865 processor
6GB RAM, 128GB storage
4500mAh battery
5G connectivity
Before we talk about the hardware, let’s talk about the fingerprint scanner. This is an optical scanner rather than the ultrasonic scanner that Samsung uses in its higher-spec S20 devices. And we can tell the difference. We’ve found it less reliable and slower to unlock than some other implementations, and it feels like a step down from the other S20 models.
Having used some old-school rear-mounted fingerprint scanners recently – which are blisteringly fast – this under-display scanner feels a little slower. After a few failed attempts we recorded the same thumbprint again, which seemed to make it a little better. It’s not a huge issue, but it’s just not the slickest out there.
What does stand out is that this is a Qualcomm Snapdragon 865 Samsung phone – something that’s a rarity in the UK and Europe for Samsung Galaxy devices, which are usually Exynos powered. That was part of the reasoning behind this FE device, as globally, the 5G version is Snapdragon 865. It’s paired with 6GB RAM, slightly less than the Galaxy S20 models, but so far we’ve not noticed that make any real difference.
That’s a key thing to look out for on this phone, however, as the 4G version (where available – and it is available in Europe) uses the Exynos 990 instead. Just make sure you know what you’re buying and what you want from it. The Qualcomm vs Exynos debate is one that’s been raging amongst Samsung fans for many years. In reality, the difference – from relative performance to a purported faster draining of battery on Exynos – is likely only slight, and in most situations you’d have no idea what was powering your phone unless someone told you.
That said, having closely followed Qualcomm’s developments over the past few years and used a good number of Snapdragon 865 devices in 2020, we’ve been universally impressed by the S20 FE’s performance. Importantly, it gives you a more direct comparison to other devices you might be looking at. Companies like OnePlus and Xiaomi offer aggressively-priced Snapdragon 865 devices – and now Samsung feels a little more competitive in this space too.
The performance really speaks for itself. There’s speed and snap to everything and we’ve found this a great phone for gaming, because it can cope with the demands that some modern games are starting to put on phones. We’d seen a little slowdown in the last generation of flagship devices when playing games like Call of Duty Mobile (especially Shipment 1944, which can be really busy) and Snapdragon 865 copes with this better – so the Samsung Galaxy S20 FE copes better.
The 128GB storage is pretty standard and the support for up to 1TB extra via microSD means that future expansion is open to you – storage basically won’t be a worry. There’s no sense that Samsung is trying to leverage storage to pump up the price – as you’ll find on the iPhone.
The S20 FE’s battery capacity, at 4,500mAh, is reasonable – the same as the Galaxy S20+ that this phone directly rivals – and we’ve found it to get through days without a problem. Throw in a couple of hours of hardcore gaming and you’ll be needing that charger towards the end of the day, but overall, this phone feels like it benefits from the hardware changes made for the FE to give better endurance.
We also need to give a word over to sound quality. There’s no 3.5mm headphone socket, and no USB-C to 3.5mm adapter in the box either, but the phone is equipped with excellent speakers. These support Dolby Atmos, in that you can toggle on a virtualised wider soundstage to make things sound better, and there’s enough volume to really make the phone sing. We’ve even had people across the room ask with surprise if it’s the phone producing that sound. It’s great for YouTube, it’s great for gaming – it just sounds great thanks to the AKG-tuned stereo speakers.
Cameras
What we like about the camera arrangement on this phone is that there are few gimmicks. There’s no unnecessary macro or depth sensor, just three usable, good quality lenses.
The main camera is the same 12-megapixel sensor that we saw on the Galaxy S20 and we were impressed with the performance – which seems to be much the same on the S20 FE. It has the same great high dynamic range (HDR) treatment, meaning photos really have some pop and balance to them.
The main camera is generally a good performer, with plenty of options thrown in, including some smart suggestions to help you get better pictures. If it’s dark it will suggest night mode (just as the Google Pixel does), although it will attempt to take pictures in lower light anyway. These are passable, but the night mode is a lot more accomplished, thanks to longer exposure times, so you’ll get a cleaner shot. Night mode is also available across cameras – it can be used on the ultra-wide (although the results are poor), when zooming (although it’s limited to 10x digital zoom), as well as on the front camera.
The main camera is joined by an ultra-wide camera and a telephoto, both very usable, but spec spotters will notice that the telephoto here is just 8-megapixels, rather than 64-megapixels as on the S20 models. That’s where the cost savings come in. On those Galaxy models launched earlier, the 64-megapixel sensor was also used to capture 8K video, which the S20 FE can’t offer. It sticks to 4K, captured through the main camera instead, with 4K/60fps the top option. There’s also an experimental HDR10+ video capture mode, although that’s only available up to 4K/30fps.
The ultra-wide camera is always popular, able to capture shots that squeeze lots into the frame easily, which is ideal for the great outdoors. This can be a little blurry towards the edges of the frame, but generally it’s a fun lens, especially if you have good bright conditions.
The telephoto lens offers Samsung’s 30x Space Zoom, wanting to keep up with the other S20 models. Here it has a 3x optical lens that then gives 10x digital zoom to hit that headline figure. It’s stabilised and you can get usable shots from it, but the quality drops off rapidly. The 3x shots are pretty good, but at 10x zoom it’s losing grip – and by the time you get to 30x zoom it’s more like a watercolour painting.
Samsung plays a little trick in lower light, using the main camera for zoom up to 10x to try and deliver better quality from the larger sensor and wider aperture, before switching to the telephoto for the end of the range (which isn’t great). If it’s really dark, i.e. night time, it doesn’t use the telephoto lens at all – it just stops at 10x digital zoom.
The front camera is reasonable too, but not hugely impressive. In daylight conditions you’ll get a good selfie from it, and there are fun options for bokeh with effects, but as soon as the light dips you’ll find image noise is more present. Indoor shots will see shadow noise and here it can’t compete with some rivals – the Google Pixel beats it in our opinion (although admittedly there’s a harsher styling to Google’s selfies) – although supporting night mode is also useful and can make a noticeable difference.
We question the use of a 32-megapixel sensor, which is used for pixel combining for better lower-resolution quality, and we suspect this is behind the fairly poor dynamic range of the selfie camera. But it does have one trick: switching views when it detects a larger group. This means it can take a wider selfie – and the files are actually different, moving from a 6.5-megapixel (average) to 10-megapixel for the wide – suggesting the pixel combining it’s using is different. You can also take 32-megapixel selfies if you want – although why you’d want to, we can’t imagine.
Verdict
Back when the Galaxy S10 Lite launched, it raised the question of what Samsung was trying to achieve with its sub-tier flagship devices. That phone was anything but “lite” and very respectable. The S20 Fan Edition marks a change of direction that’s easier to understand: a device that retains the flagship essentials, while making a few calculated moves to reduce the price.
The S20 FE’s overall spec – ignoring the storm-in-a-teacup Glasstic rear – stands the phone in good stead against many rivals. Sure, there are cheaper phones using the lower-spec Snapdragon 765 platform that are very good, but the Galaxy feels like a phone designed to draw you towards something a little higher-end. We’re drawn in, so it succeeds on that front.
So will the Fan Edition cannibalise the full-fat S20 models? There’s a good chance that it will. It’s free from gimmicks, it appears to hit all the right notes, and for us it’s the pick of the bunch.
Also consider
Poco F2 Pro
Poco has always been about offering what fans want – and that’s power in a package that’s affordable. There are some drawbacks – the cameras aren’t great and the software isn’t that refined – but you get a lot of power for your money.
OnePlus 8
OnePlus has always been popular thanks to affordability and software optimisation. It’s a little cheaper than the Samsung Galaxy S20 FE – but Samsung offers a better camera experience.
The ASUS TUF Gaming GeForce RTX 3080 OC is the company’s biggest bet on the TUF Gaming brand since its transformation, a few years ago, from a hyper-durable motherboard label into a gaming-centric brand targeted at gamers looking for durable, value-conscious products. ASUS typically positions TUF Gaming below the coveted ROG brand. Over the past few years, we’ve seen the company design TUF Gaming graphics cards based on entry-level through mainstream GPUs, but this is the first outing with an enthusiast-segment GPU, NVIDIA’s flagship GeForce RTX 3080 “Ampere,” which debuted earlier this month. ASUS directed considerable engineering effort into making this a premium product with several practical features.
The GeForce RTX 3080 by NVIDIA, based on the Ampere graphics architecture, is the green team’s first new consumer graphics technology in two years, and an exercise in making real-time raytracing meet next-generation performance. NVIDIA took the bold step of introducing real-time raytracing to the gaming segment in 2018 with RTX—a technology that combines real-time raytraced elements, such as lighting, shadows, reflections, ambient occlusion, and global illumination, with conventional raster 3D scenes to significantly increase realism. With Ampere, NVIDIA is introducing its 2nd generation of RTX, with even more effects at better performance. The RTX 3080 is designed to make AAA gaming with raytracing at 4K UHD possible at 60 Hz, or at high refresh rates at lower resolutions.
The GeForce Ampere architecture introduces a new-generation double-throughput CUDA core that can perform concurrent FP32+INT32 math. The new 2nd generation RT core doubles the BVH traversal and intersection performance over the previous generation and introduces new fixed-function hardware that enables newer RTX effects, such as raytraced motion blur. The 3rd generation Tensor core shares many design elements with its HPC cousin powering the A100 Tensor Core processor NVIDIA launched this spring, and leverages the sparsity phenomenon in deep-learning neural networks to double AI inference performance. NVIDIA heavily uses AI in its raytracing pipeline, and in image-quality features such as DLSS. The new DLSS 8K feature takes a stab at 8K gaming by taking advantage of AI.
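As a rough sketch of that sparsity idea: Ampere’s tensor cores exploit a 2:4 structured pattern, in which two of every four weights are zero, halving the math per matrix operation. The toy pruning below illustrates the pattern only; real workflows fine-tune the network afterwards to recover accuracy.

```python
import numpy as np

# Toy illustration of 2:4 structured sparsity: in every group of four
# weights, zero the two smallest-magnitude entries, so the hardware can
# skip half the multiplies while keeping the most significant weights.

def prune_2_of_4(weights: np.ndarray) -> np.ndarray:
    w = weights.reshape(-1, 4).copy()
    smallest = np.argsort(np.abs(w), axis=1)[:, :2]   # two smallest per group
    np.put_along_axis(w, smallest, 0.0, axis=1)
    return w.reshape(weights.shape)

w = np.array([0.9, -0.1, 0.05, 0.7, -0.3, 0.2, 0.8, -0.05])
print(prune_2_of_4(w))   # [ 0.9  0.   0.   0.7 -0.3  0.   0.8  0. ]
```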
The new GeForce RTX 3080 features more than double the CUDA core count of the previous generation, with 8,704 CUDA cores, 68 RT cores, 272 tensor cores, 272 TMUs, and 96 ROPs. All this compute muscle is fed data by an updated memory setup consisting of 10 GB of 19 Gbps GDDR6X memory across a 320-bit wide memory interface, which works out to 760 GB/s of bandwidth; that’s 70% higher than the previous generation. There are several other “next generation” bits, such as support for the PCI-Express 4.0 x16 bus, the latest HDMI 2.1 and DisplayPort 1.4a connectivity, and AV1 video acceleration. NVIDIA built the GA102 on the 8 nm FFN process Samsung designed specially for NVIDIA.
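The quoted bandwidth follows directly from the data rate and the bus width, as a quick back-of-the-envelope check shows (the RTX 2080’s 14 Gbps, 256-bit figures below come from NVIDIA’s published specs rather than this article):

```python
# Memory bandwidth = per-pin data rate (Gbps) x bus width (bits) / 8 bits-per-byte.

def bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

print(bandwidth_gb_s(19.0, 320))   # RTX 3080: 760.0 GB/s
print(bandwidth_gb_s(14.0, 256))   # RTX 2080 (previous gen): 448.0 GB/s
print(760 / 448 - 1)               # ~0.70 -> roughly 70% higher, as quoted
```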
The ASUS TUF Gaming GeForce RTX 3080 OC retains the military-grade metal aesthetic characteristic of the brand. It uses a meaty cooling solution that features multiple fin stacks which are ventilated by a trio of ASUS’s new-generation Axial-Tech fans. There’s lavish use of metal on both the cooler shroud and the backplate. The cooler is longer than the PCB underneath. Much like NVIDIA’s Dual Axial Flow-Through cooling solution on the Founders Edition card, ASUS used the extra length of the cooler to vent air through the card. ASUS gave the card factory-overclocked speeds of 1785 MHz GPU Boost (vs. 1710 MHz reference). The card draws power from a pair of conventional 8-pin PCIe power inputs. ASUS is asking $730 for the RTX 3080 TUF Gaming OC, a tiny $30 premium over the reference design. In this review, we pit the card against a large selection of graphics cards, across an equally large selection of games.
Zotac today releases its GeForce RTX 3080 Trinity graphics card, a premium custom-design offering based on the GeForce RTX 3080 “Ampere” by NVIDIA. Zotac’s custom design focuses on meeting all the expectations of the DIY premium gaming segment, including a low-noise design, decent overclocking headroom, and a fair bit of ARGB bling to brighten up high-end gaming PC builds. This card sees the introduction of Zotac’s new and smarter IceStorm 2 cooling solution with idle fan-stop, and aesthetic enhancements from the Spectra 2 ARGB illumination package. A conventional triple-fan setup tames the next-generation beast underneath.
The GeForce RTX 3080 by NVIDIA is the company’s new flagship graphics card, designed to make 4K UHD gaming with RTX on possible at a price that bought you a 1440p-segment graphics card in the previous generation. The new Ampere graphics architecture introduces NVIDIA’s 2nd generation RTX real-time raytracing technology. NVIDIA innovated a means of combining raster 3D graphics with real-time raytraced elements, such as lighting, shadows, reflections, ambient occlusion, and global illumination, to make hybrid raster+raytraced 3D scenes that are as true to life as possible, a generational leap over conventional DirectX 12 rasterization.
As mentioned earlier, the Zotac RTX 3080 Trinity is one of the company’s premium custom-design RTX 3080 offerings, though like most other board partners, it has even more bleeding-edge designs in the works. The IceStorm 2 cooling solution uses a trio of fans with alternating spin directions (although the same airflow direction) to minimize the lateral vortex formation that leads to noise. Since this is the Trinity SKU (and not Trinity OC), the card ticks at NVIDIA-reference clock speeds of 1710 MHz GPU Boost and 19 Gbps memory. Zotac sells the card at the NVIDIA MSRP of $700—no price increase.
The ASUS ROG STRIX GeForce RTX 3090 OC is the company’s flagship implementation of the RTX 3090 “Ampere.” The GeForce RTX 3090 by NVIDIA represents the pinnacle of the GeForce “Ampere” architecture for the client segment. ASUS builds on this technology with a product targeted at premium gaming PC builds and overclockers, as the card offers many conveniences for both kinds of users. The RTX 3090 transcends the barrier between the gaming and professional visualization markets, much like the TITAN brand of graphics cards, especially when combined with NVIDIA’s highly capable Studio drivers. This begins to explain why NVIDIA referred to last week’s RTX 3080 as its “flagship” gaming product.
NVIDIA reshaped the top end of its GeForce product stack with the RTX 30-series. The RTX 3080 is designed to offer premium 4K UHD gaming with raytracing on, which should cover almost every gaming use case, hence the “flagship” badge. It’s been extensively compared to the RTX 2080 Ti. The RTX 3090, on the other hand, is being compared by NVIDIA to the $2,500 TITAN RTX, a “halo” product based on Turing. What’s also new is that both the RTX 3080 and RTX 3090 share a common silicon under the hood, the largest based on the GeForce “Ampere” architecture.
The GeForce Ampere architecture uses the 2nd generation of NVIDIA’s path-breaking RTX technology, which introduced real-time raytracing to the gaming segment. NVIDIA perfected a way to combine conventional raster 3D rendering with certain real-time raytraced elements, such as lighting, shadows, reflections, ambient occlusion, and global illumination. Processing even these few elements required NVIDIA to develop fixed-function hardware. Ampere combines a new-generation CUDA core that doubles FP32 throughput over Turing and performs concurrent FP32+INT32 operations, the company’s 2nd generation RT core, which has double the performance of the previous generation, and hardware to pull off raytraced motion blur. The 3rd generation tensor core accelerates AI and leverages the sparsity phenomenon in deep-learning neural nets to double AI inference performance. NVIDIA leverages AI for de-noising its raytraced elements and for features such as DLSS. Find more details about the architecture in our NVIDIA Ampere Architecture article.
The RTX 3090 isn’t strictly a creator’s card; NVIDIA is also making a bold stab at the 8K frontier. 8K is four times the pixels of 4K and sixteen times those of 1080p, which puts an enormous performance and memory demand on any GPU. NVIDIA believes its DLSS 8K feature can unlock this resolution for gamers by leveraging AI deep learning to render the game at 1440p and upscale it to 8K with an AI-based 9X super-sampling algorithm that draws on ground-truth data NVIDIA has rendered at 16K on its render farms. 8K gaming monitors are still a fair bit away, but TVs with this resolution are hitting shelves. Helping things is NVIDIA’s implementation of HDMI 2.1, which enables 8K HDR 60 Hz using a single cable. Besides 8K gaming, the RTX 3090 also has the muscle to drive e-sports games at insane refresh rates of 360 Hz at lower resolutions, with NVIDIA Reflex technology working to reduce whole-system latency to give professional e-sports gamers the edge they need.
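Those 4x/16x/9X figures are straightforward pixel arithmetic, which a quick check confirms:

```python
# Pixel counts behind the DLSS 8K claims.
render_1440p = 2560 * 1440     # internal render resolution
target_8k = 7680 * 4320        # 8K UHD output

print(target_8k / render_1440p)    # 9.0  -> the "9X" super-sampling factor
print(target_8k / (3840 * 2160))   # 4.0  -> 8K is four times the pixels of 4K
print(target_8k / (1920 * 1080))   # 16.0 -> and sixteen times those of 1080p
```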
As we mentioned earlier, the RTX 3090 is based on the same silicon as the RTX 3080—the 8 nm “GA102.” The RTX 3090 nearly maxes this chip out by enabling all but one TPC (two SMs), resulting in a CUDA core count of 10,496, along with 328 Tensor cores, 82 RT cores, 328 TMUs, and 112 ROPs. The RTX 3090 maxes out the 384-bit wide GDDR6X memory interface of the “GA102” and comes with 24 GB of memory. This memory ticks at the highest frequency in NVIDIA’s product stack, 19.5 Gbps, which works out to an astounding 940 GB/s of memory bandwidth, a figure that previously required expensive HBM2E solutions.
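The core count falls out of that TPC arithmetic, given Ampere’s layout of two SMs per TPC and 128 FP32 CUDA cores per SM; note that the exact bandwidth figure works out to 936 GB/s, commonly rounded to 940:

```python
# GA102 TPC math: the full die carries 42 TPCs; the RTX 3090 enables all but one.
tpcs_active = 42 - 1
sms = tpcs_active * 2          # 2 SMs per TPC -> 82 SMs
cuda_cores = sms * 128         # 128 FP32 CUDA cores per Ampere SM
print(sms, cuda_cores)         # 82 10496

# Bandwidth from the quoted memory figures: 19.5 Gbps per pin x 384-bit bus.
print(19.5 * 384 / 8)          # 936.0 GB/s
```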
The GeForce RTX 3090 starts at a price of $1,500, which is exactly 50% more than the RTX 2080 Ti, but $1,000 less than the launch price of the TITAN RTX, which probably explains the positioning. Unlike the TITAN series, the RTX 3090 can be made into custom-design cards by NVIDIA’s board partners. ASUS flexed its vast engineering muscle to come up with its latest-generation ROG STRIX design scheme. Matte black surfaces make way for brushed aluminium and a vast fin-stack cooler, and a swanky new VRM solution keeps the feisty 350-watt GPU fed through three 8-pin power inputs. But wait, there’s more: ASUS is including a manual power limit setting of up to 480 W! This is much higher than any other RTX 3090 card out there. We did a special round of testing at the maximum power limit to see what additional performance can be gained. ASUS has also given the RTX 3090 a factory overclock of 1860 MHz, while the memory is left untouched at 19.5 Gbps. The ROG Strix RTX 3090 OC is priced at $1,800. In this review, we take a look at the gaming and overclocking credentials of the ASUS flagship, with impressive results.
The MSI GeForce RTX 3090 Gaming X Trio is the king of the hill for the company, based on the top-dog NVIDIA RTX 3090 “Ampere” GPU with an astounding 24 GB of GDDR6X memory. The Gaming X brand represents MSI’s finest combination of performance, cooling, and noise optimization, along with the right aesthetics for DIY high-end gaming PC builds. In this MSI GeForce RTX 3090 Gaming X Trio review we’ll check how well it can unleash the RTX 3090 out of its “stock” comfort zone by letting you tap into higher power limits. The Gaming X board design also sees the company’s latest triple-slot, triple-fan cooling solution. With this generation of the MSI design, many of the RGB embellishments have been relocated to the backplate and towards the top edge of the card. MSI figured out that you’re more likely to look at your graphics card from this angle.
NVIDIA changed its approach to the higher-end of its product stack with the GeForce RTX 30-series “Ampere.” The RTX 3080 launched last week has already been proclaimed the new “flagship” product by NVIDIA, in marketing material that also contains the RTX 3090. This means the company intends for the RTX 3090 to address a different market—halo premium. While the RTX 3080 was shown beating the previous-generation flagship, the RTX 2080 Ti, the RTX 3090 was extensively compared with the TITAN RTX—a $2,500 halo product based on Turing. Much like the TITAN, the RTX 3090 is designed to transcend market segment barriers between gaming and professional visualization. Helping matters here are NVIDIA’s highly capable Studio drivers. The 24 GB memory amount, NVIDIA believes, helps in various creator use cases.
This doesn’t necessarily mean the RTX 3090 doesn’t bring anything to the table for gaming. On the contrary, this is the SKU with which NVIDIA wants to take a stab at the 8K gaming frontier. 8K is no joke; it’s four times the pixels of 4K, and sixteen times those of Full HD. To accomplish this, NVIDIA developed the new DLSS 8K feature, which leverages AI and deep learning to perform a bold 9X AI upscaling of 1440p rendering on supported games. 8K gaming monitors haven’t yet arrived, but target buyers for the RTX 3090 could be early adopters of 8K TVs. Helping matters here is NVIDIA’s implementation of HDMI 2.1 and DisplayPort 1.4a, which enables 8K HDR 60 Hz with a single cable.
NVIDIA leveraged a common silicon for the RTX 3090 and RTX 3080, so only a single kind of ASIC has to be built for this relatively low volume market segment. The 8 nm “GA102” is a mammoth piece of silicon with over 28 billion transistors, which is almost maxed out on the RTX 3090 by enabling all but one TPC (two SM) on the silicon, resulting in 10,496 CUDA cores, 328 tensor cores, 82 RT cores, and a 384-bit wide GDDR6X memory interface holding 24 GB of memory that ticks at a blistering 19.5 Gbps—940 GB/s of bandwidth.
The 2nd generation RTX technology with Ampere sees NVIDIA introduce a new double-throughput CUDA core that can process concurrent FP32+INT32 operations; the 2nd generation RT core has fixed-function hardware to process temporal elements of raytracing, enabling new RTX effects, such as raytraced motion blur, an effect that was until now post-processed and inaccurate; and the new 3rd generation Tensor core shares much of its design with the heavy-duty Tensor cores of the A100 Tensor Core AI HPC processor NVIDIA launched this spring, leveraging the sparsity phenomenon in deep-learning neural nets to double AI inference performance.
The MSI GeForce RTX 3090 Gaming X Trio is designed to unleash the RTX 3090 GPU, which even in its reference avatar comes with 350 W typical board power. As you’ll see on the next page, the card introduces the Tri Frozr 2 cooling solution with many segment-first features. The card also ships with factory-overclocked speeds of 1785 MHz GPU Boost (vs. 1695 MHz reference). MSI is pricing the RTX 3090 Gaming X Trio at $1,590, a $90 premium over the $1,500 baseline pricing for the RTX 3090. In this MSI RTX 3090 Gaming X review, we put the card through its paces against our vast selection of graphics cards and games, and test the card’s overclocking capabilities.
ZOTAC today launched its GeForce RTX 3090 Trinity Ampere graphics card. The card combines NVIDIA’s top-dog RTX 3090 GPU with 24 GB of GDDR6X memory, the latest IceStorm 2.0 cooling solution, and Spectra 2 ARGB illumination. ZOTAC also offers an industry-leading 5-year warranty subject to product registration. Unlike NVIDIA’s RTX 3090 Founders Edition with its Dual-Axial Flow-Through cooler, ZOTAC takes a more conventional approach with its IceStorm 2.0 thermal solution. All three fans are where you’d expect them. The cooler is longer than the PCB, so some of the airflow from the third fan flows through, out the backplate.
NVIDIA has taken an unconventional approach to the enthusiast segment with its GeForce RTX 30-series Ampere family. The $700 RTX 3080 launched last week has been labeled “flagship” by NVIDIA, and has been extensively shown beating not just its predecessor, the RTX 2080, by a high double-digit percentage, but also the previous-gen flagship RTX 2080 Ti by a fair margin, while being at least $500 cheaper. The new RTX 3090, on the other hand, is being launched as a “halo segment” product and extensively compared to the Turing-based TITAN RTX, which launched at $2,500.
What’s also unconventional about the GeForce Ampere series is NVIDIA’s use of a common silicon between the RTX 3080 and RTX 3090—the 8 nm GA102 graphics processor. With the previous generation, the RTX 2080 and its refresh, the RTX 2080 Super, were based on the smaller TU104 silicon, while the RTX 2080 Ti and TITAN RTX were built using the larger TU102 die. Between the RTX 3080 and RTX 3090, NVIDIA left itself plenty of headroom for future product segmentation.
With the RTX 3080 already capable of 4K UHD gaming with raytracing, the RTX 3090 has an interesting market position at its $1,499 starting price, which is about 50% higher than the launch price of the RTX 2080 Ti, but $1,000 lower than the TITAN RTX. Besides enabling all but two streaming multiprocessors on the GA102 silicon, the RTX 3090 enjoys the full 384-bit wide memory interface of the die—no 352-bit business this time around. NVIDIA took things a notch further by arming the RTX 3090 with a staggering 24 GB of GDDR6X memory clocked at 19.5 Gbps—an astounding 940 GB/s memory bandwidth.
The comparisons to the TITAN RTX begin to explain the main application of the RTX 3090 to consumers as it offers the highest possible performance from the Ampere generation, with 4K UHD gameplay at higher refresh rates than the RTX 3080 can handle, 8K gameplay leveraging DLSS 8K, and “TITAN-class creator performance,” which probably underscores NVIDIA’s decision to give it 24 GB of memory.
NVIDIA carved the RTX 3090 out of the same GA102 silicon the RTX 3080 is based on by disabling just 1 of the 42 TPCs present on the silicon. With 41 TPCs (82 SM), the RTX 3090 enjoys a jaw-dropping 10,496 CUDA cores, 328 Tensor cores, 82 RT cores, 328 TMUs, and 112 ROPs. The GPU Boost frequency goes up to 1695 MHz. NVIDIA leveraged the new 8 nanometer 8FFN silicon fabrication node by Samsung to build the GA102. Ampere represents the 2nd generation of NVIDIA’s path-breaking RTX architecture that introduces real-time raytracing to the consumer segment by combining conventional raster 3D graphics with real-time raytraced elements, such as lighting, shadows, reflections, ambient-occlusion, and global illumination. The 2nd generation also introduces raytraced motion-blur and even has fixed-function hardware just to pull this otherwise difficult effect off. Find more details about the architecture in our NVIDIA Ampere Architecture article.
Unlike the TITAN RTX, which only comes in the reference-design Founders Edition version, the RTX 3090 can be built by partners, who have the freedom to implement their latest premium board designs with the chip. As we mentioned earlier, the Zotac RTX 3090 Trinity in this review comes with the company’s IceStorm 2 cooler that features a long series of aluminium fin-stack heatsinks held together by copper heat pipes, ventilated by three fans that each spin at a speed independent of the others. The card sticks to the reference 1695 MHz GPU Boost frequency, and its memory ticks at 19.5 Gbps (GDDR6X-effective). In this review, we take the card for a spin across our exhaustive list of game tests and compare it to our vast selection of high-end graphics cards to tell you if you should start saving for one.
GeForce RTX 3090 Market Segment Analysis

| Card | Price | Shader Units | ROPs | Core Clock | Boost Clock | Memory Clock | GPU | Transistors | Memory |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| GTX 1080 Ti | $650 | 3584 | 88 | 1481 MHz | 1582 MHz | 1376 MHz | GP102 | 12000M | 11 GB, GDDR5X, 352-bit |
| RX 5700 XT | $370 | 2560 | 64 | 1605 MHz | 1755 MHz | 1750 MHz | Navi 10 | 10300M | 8 GB, GDDR6, 256-bit |
| RTX 2070 | $340 | 2304 | 64 | 1410 MHz | 1620 MHz | 1750 MHz | TU106 | 10800M | 8 GB, GDDR6, 256-bit |
| RTX 2070 Super | $450 | 2560 | 64 | 1605 MHz | 1770 MHz | 1750 MHz | TU104 | 13600M | 8 GB, GDDR6, 256-bit |
| Radeon VII | $680 | 3840 | 64 | 1802 MHz | N/A | 1000 MHz | Vega 20 | 13230M | 16 GB, HBM2, 4096-bit |
| RTX 2080 | $600 | 2944 | 64 | 1515 MHz | 1710 MHz | 1750 MHz | TU104 | 13600M | 8 GB, GDDR6, 256-bit |
| RTX 2080 Super | $690 | 3072 | 64 | 1650 MHz | 1815 MHz | 1940 MHz | TU104 | 13600M | 8 GB, GDDR6, 256-bit |
| RTX 2080 Ti | $1000 | 4352 | 88 | 1350 MHz | 1545 MHz | 1750 MHz | TU102 | 18600M | 11 GB, GDDR6, 352-bit |
| RTX 3070 | $500 | 5888 | 64 | 1500 MHz | 1725 MHz | 1750 MHz | GA104 | 17400M | 8 GB, GDDR6, 256-bit |
| RTX 3080 | $700 | 8704 | 96 | 1440 MHz | 1710 MHz | 1188 MHz | GA102 | 28000M | 10 GB, GDDR6X, 320-bit |
| RTX 3090 | $1500 | 10496 | 112 | 1395 MHz | 1695 MHz | 1219 MHz | GA102 | 28000M | 24 GB, GDDR6X, 384-bit |
| Zotac RTX 3090 Trinity | $1500 | 10496 | 112 | 1395 MHz | 1695 MHz | 1219 MHz | GA102 | 28000M | 24 GB, GDDR6X, 384-bit |