When Call of Duty: Modern Warfare was released in 2019, the first-person shooter caused a stir not only with its controversial story but also with the hard drive space it demanded. At a full 150 GB, the 16th installment of the Call of Duty series proved extremely storage-hungry. With the upcoming Call of Duty: Black Ops Cold War, which launches on 13 November 2020, Activision is again not exactly modest about the resources required on the hard drive. As the published system requirements show, the latest installment in the series demands at least 250 GB of free space for the Ultra RTX specification.
Anyone who dials the settings down gets by with 175 GB. If you only want to play the new Call of Duty's multiplayer at the minimum requirements, just 50 GB of free space is enough. On the CPU side, an Intel Core i3-4340 or an AMD FX-6300 will do. As for graphics cards, shooter fans should have at least an NVIDIA GeForce GTX 670 or GTX 1650, or a Radeon HD 7950. The required 8 GB of RAM should hardly be a problem. As for the operating system, both Windows 7 and Windows 10 – each in the 64-bit version – are supported.
If the demands are higher and gamers want to use ray tracing, an NVIDIA GeForce RTX 3070 is recommended. Since that card is currently very rare and difficult or impossible to find at retail, at least a GeForce GTX 970 or better should be installed in your computer; otherwise, high frame rates can hardly be achieved on displays with a high refresh rate. For the supreme discipline, "Ultra RTX", an RTX 3080 is joined by an Intel Core i9-9900K or an AMD Ryzen 7 3700X.
System requirements
Minimum:
Windows 7 64-bit (SP1) or Windows 10 64-bit (version 1803 or higher)
Intel Core i3-4340 or AMD FX-6300
8 GB RAM
50 GB (multiplayer only), 175 GB (all game modes)
NVIDIA GeForce GTX 670 / GeForce GTX 1650 or Radeon HD 7950
Recommended:
Windows 10 64-bit (latest update)
Intel Core i5-2500K or AMD Ryzen 5 1600X
12 GB RAM
NVIDIA GeForce GTX 970 / GeForce GTX 1660 Super or Radeon R9 390 / RX 580
Godfall is coming next week for both the PlayStation 5 and PC. Given that this is a PS5 console launch exclusive, we’d expect some beefy PC hardware requirements. Today, we get to see exactly how demanding the game will be on PC.
This week, Gearbox and Counterplay announced the PC minimum and recommended hardware requirements for Godfall. It is a flashy looking game and is set to be a graphical showcase for Unreal Engine. So what does it take to run such a game on PC?
Here are the system requirements for Godfall on PC:
Memory requirements are particularly big here. There are also indications that running this game at 4K with Ultra HD textures will eat up as much as 12GB of VRAM.
KitGuru Says: I’ll be interested in seeing user benchmarks when this game comes out. Are any of you planning on playing Godfall next week?
In summary: The Swift 5 is Acer's first laptop with an Intel Tiger Lake processor, and it shows excellent performance, especially during moderate use; under sustained load, a CPU with more cores is faster. The housing is made of sturdy metal and yet very light, weighing about one kilogram. Battery life is about fourteen hours while browsing, but under heavy load only about four hours remain. The screen has good brightness and contrast, but it could be better calibrated, and we also miss a card reader. The Swift 5 is one of the first laptops with a Tiger Lake processor, but you seem to pay for that too, as the price has gone up compared to its predecessor.
When Intel launched its new 15 W chips for laptops, we actually already knew a lot about them. Some specifications and benchmarks had already leaked, and back in June Acer announced its new Swift 5 series with an 'Intel processor of the next generation'. That could really only mean one thing, and now Acer is one of the first manufacturers to have a laptop with Intel's eleventh-generation Core processor, also known as Tiger Lake, on the shelves. How big is the step from Ice Lake to Tiger Lake, and should AMD start to worry? In this review, we take a look at the Swift 5 with the Core i5-1135G7 and i7-1165G7 processors.
The laptop in question is an Acer Swift 5, and the Swift series comprises the lighter and thinner alternatives to the brand's well-known Aspire range. The Swift 5 is characterized, among other things, by the fact that it weighs no more than a kilogram. The latest Swift 5 can be recognized by the '55' code in the type name and is therefore the successor to the '54'. According to Acer, the key features of the 55 are its low weight, Intel Evo certification and antimicrobial display. Let's start with that screen and the antimicrobial coating. That sounds very interesting during a pandemic, and it is, to some extent. The coating contains silver ions, which can kill microbes, including bacteria. That is not a new technique, by the way; Samsung used the same coating on its NC10 netbook in 2008. On the Swift 5, the coating is on the touchscreen, and certainly if several users share the laptop, you spread fewer microbes than on a normal touchscreen. However, SARS-CoV-2 is a virus and not a bacterium. Although there are indications that the silver ions are also effective against virus particles, Corning and Acer give no guarantees that they work against the coronavirus, so the touchscreen is no 'corona killer'.
With the versions being delivered at the time of writing, it stops there, but there is also a Swift 5 with type number 55TA, where the entire case, keyboard and touchpad have the antimicrobial coating. It is not yet clear whether this version will also be sold in the Netherlands and Belgium.
The antimicrobial glass plate on the screen is made by Corning, known for its sturdy Gorilla Glass, which is also used on this laptop. That is not unique either, because manufacturers such as HP, Dell and Lenovo also use the tougher glass on their laptops and tablets. Whether it is down to the Gorilla Glass or not, the screen assembly of the Swift 5 feels quite sturdy, which is not always the case with laptops with narrow screen bezels. At the top of the screen sits the webcam, with a resolution of 1280 x 720 pixels and without facial recognition. Biometric login is possible, but via a fingerprint scanner located below the keyboard.
The back of the screen and the rest of the housing are made of an alloy of aluminum, lithium and magnesium; Acer already used these metals in earlier Swift 5 models. The advantage is that it is very light: the Swift 5 weighs just over a kilogram on our scales. In addition, the housing feels sturdy, which is not the case with every laptop. If you really hammer on the keys you will see the housing flex, but in normal use that will not happen. The housing of the Swift 5 is, incidentally, 'mist green', a subtle shade of green. In low light the case appears gray, but in the right light it turns green, and while there is no accounting for taste, we think green looks good on the Swift. The combination with the yellow-orange print on the keys is less successful, but it is nice that Acer is trying a different color on the housing.
There is also an Intel Evo sticker on the case, which means that the Swift 5 meets a number of Intel requirements. Currently, Evo certification is reserved for laptops with an Intel 11th-generation processor, at least 8 GB of memory and a 256 GB SSD. Intel also sets requirements for battery life, the speed with which the laptop wakes from standby, the quality of the microphone and even the connections. An Evo laptop must have Thunderbolt 4, and that connector sits on the left side of the Swift 5's enclosure. In addition, there is a regular USB-A port at 5 Gbit/s and an HDMI 2.0 port. You can charge the laptop via USB-C, but Acer supplies a conventional charger that plugs in next to the HDMI port. On the right side is a second USB-A port, also at 5 Gbit/s, with a headphone jack next to it. Acer already omitted the card reader on the previous generation of the Swift 5, and unfortunately it has not returned on the latest version.
Keyboard and touchpad
The Swift 5's keys are flat plastic and have a backlight with two modes: on and off. Still, the lighting is not too bright in a dark environment, so you do not miss a dimmer setting. In terms of typing comfort, the keyboard is not the kind that offers a lot of travel, although we have seen worse on thin laptops.
Compared with other 14-inch laptops, the actuation point of the keystroke is in roughly the same place, and we arrive at a total key travel of 1.3 mm. To put that travel in perspective: the most travel we have measured so far was on the ThinkPad E490, while the least was on the MacBook Pro with the Butterfly keyboard, so the Swift 5 sits somewhere in between and is quite average in that respect.
The touchpad does its job well, but feels somewhat cheap. The surface appears to be plastic rather than glass, as on more expensive laptops, and the mouse button feels as if Acer used the same switch as in 300-euro Aspire laptops.
Announced at The Game Awards 2019, Godfall is an action RPG by Counterplay Games, created under the aegis of Gearbox Software, well known to fans of the Borderlands and Brothers in Arms series. The developer pitches the title as a so-called looter slasher: a mix of a classic third-person slasher like God of War, Devil May Cry or Darksiders with elements characteristic of the aforementioned Borderlands series, only instead of shooting, the action focuses on hand-to-hand combat. Don't worry – Godfall offers more than just co-op for up to four players; the story campaign can also be completed solo. The game premieres next week, so Counterplay Games has finally revealed the hardware requirements for the PC version.
We now know the hardware requirements for the Godfall slasher, which will be released on PS5 and PC (Epic Games Store) on 12 November 2020.
From the very beginning, Godfall has been promoted as one of the first games developed exclusively for next-gen hardware. That is indeed the case – the creators are skipping the current generation of consoles and focusing only on PlayStation 5 (and PC). Should we therefore expect hardware requirements worthy of a "new generation" game? Not necessarily, because although the graphics in Godfall are not bad, it is hard to speak of any noticeable qualitative leap. Here are the hardware requirements for the PC version (note that the game requires a permanent internet connection):
Minimum: Intel Core i5-6600, NVIDIA GeForce GTX 1060 6 GB or AMD Radeon RX 580
Recommended: AMD Ryzen 5 3600 or Intel Core i7-8700, NVIDIA GeForce GTX 1080 Ti or AMD Radeon RX 5700 XT
Disk space: no data
The minimum requirements list only a quad-core, quad-threaded Intel Core i5-6600, whose heyday is long past. In the recommended requirements, however, the developer lists clearly more powerful CPUs: a Ryzen 5 3600 or an Intel Core i7-8700. As for graphics cards, the minimum is a GeForce GTX 1060 6 GB or Radeon RX 580; for higher settings, a GeForce GTX 1080 Ti or Radeon RX 5700 XT. The developers implemented ray tracing (DXR) together with AMD. Ray tracing will also be available on PS5, where the effect will be limited to shadows only; the Sony console is set to deliver 4K gameplay in Godfall at 60 FPS. The game premieres on 12 November 2020 on PC (Epic Games Store) and PS5, and both releases are timed exclusives (probably for one year).
In the last week of October we heard about the Acer Swift 3x laptop with Intel Iris Xe Max graphics and were at a bit of a loss, as this was a new name that Intel had not yet revealed. A few days later we had a call with Intel that made things clearer and now, on 1st November, you will find a handful of laptops that use Iris Xe Max. Leo took the opportunity to deliver eight minutes of video addressing the question 'What the Heck is Intel Iris Xe Max?'
Watch the video via our VIMEO Channel (Below) or over on YouTube at 2160p HERE
Matthew covers the news about Intel Iris Xe Max HERE. The essence is that Intel is supplying the graphics portion of an 11th Gen Tiger Lake laptop chip as an add-in graphics chip for laptop manufacturers such as Acer, Asus and Dell. We are used to the idea of a laptop with a Core i5 plus an RTX 2060 or MX250, but have never come across a laptop with two graphics cores from the same company and with very similar specifications. The short answer to our big question is that Intel Iris Xe Max is Intel Iris Xe-LP running 300MHz faster and with 4GB of dedicated memory.
The unanswered question is, what happens when the two graphics cores work together in Additive Gaming? We asked Intel this question and they refused to answer – yet! We have to assume that Additive Gaming will produce some benefit, but have no idea whether it will be ten percent or somewhere close to 100 percent. Time will tell on that front and you can be certain we are keen to check it out.
KitGuru Says: We expect to hear new graphics news from AMD and Nvidia but this time it is Intel that is breaking new ground with Iris Xe Max.
A few weeks ago, Intel announced its 11th-generation Tiger Lake processors for laptops, based on the Willow Cove architecture and the 10 nm SuperFin process. With the new cores came the integrated Intel Iris Xe Graphics, the first product to take advantage of the rebuilt Xe architecture. For the manufacturer it is clearly the apple of its eye, especially since the plans for this architecture are very extensive and cover not only laptops but also desktops, servers and even supercomputers. In this way, Intel wants to enter the game as the third major manufacturer of dedicated graphics cards after years of absence from this market. The first combat test, however, will be the more modest Intel Iris Xe MAX Graphics card, which is meant to take on NVIDIA's GeForce MX chips in notebooks. As the first editorial team in Poland and one of the few in the world, we invite you to a test of a laptop with the dedicated Iris Xe MAX Graphics card.
Author: Damian Marusiak
The Acer Swift 3X is based on 11th-generation Intel Tiger Lake-U processors, with a choice of the 4-core, 8-thread Intel Core i5-1135G7 and Intel Core i7-1165G7. An even more interesting element of the design is the presence of a dedicated graphics card – not an NVIDIA card and not an AMD one, but Intel's first dedicated graphics card in twenty years, in the form of the Iris Xe MAX Graphics chip. These are the manufacturer's first steps toward an attractive GPU line-up that can compete with both AMD and NVIDIA. However, the full potential of the Xe architecture will not be seen until next year, when much more advanced designs based on the Xe-HP architecture with hardware ray tracing acceleration arrive. The Acer Swift 3X carries the first card, which until recently was known as Intel DG1 – Discrete Graphics 1.
As the only editorial team in Poland, we invite you to a test of the Acer Swift 3X laptop with the dedicated Intel Iris Xe MAX Graphics card. Will it be able to break NVIDIA's hegemony?
Perhaps to the surprise of many, the dedicated Intel Iris Xe MAX Graphics card is, in terms of specification, very similar to the integrated Iris Xe Graphics chip. Like the iGPU, it has 96 execution units (EUs), which translates into 768 shading units. The two differences are the amount of dedicated memory and the core clock. The Intel Iris Xe MAX Graphics gets 4 GB of LPDDR4X memory clocked at 4267 MHz, which translates into a bandwidth of around 68 GB/s. The core clock is also noticeably higher, reaching a maximum of 1650 MHz. Because it is a dedicated card, it does not share a common power limit with the processor. This means that in games, for example, the processor receives the entire CPU power budget and can run at a higher clock, while the Iris Xe MAX Graphics card has its own budget and can thus, independently of the processor, reach a higher boost clock.
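The quoted ~68 GB/s follows directly from the memory configuration. Here is a minimal sketch of the arithmetic in Python, assuming the 128-bit memory interface commonly reported for the DG1 chip (the bus width is not stated in the article):

```python
# Rough memory-bandwidth estimate for Intel Iris Xe MAX (assumed values marked below).
data_rate_mtps = 4267   # LPDDR4X effective transfer rate in MT/s, as quoted above
bus_width_bits = 128    # assumed interface width for the DG1 / Iris Xe MAX package

bandwidth_gbs = data_rate_mtps * (bus_width_bits / 8) / 1000  # bytes per transfer -> GB/s
print(f"Theoretical bandwidth: {bandwidth_gbs:.1f} GB/s")     # ~68.3 GB/s, matching the ~68 GB/s figure
```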
Another notable feature of the Acer Swift 3X is the use of Intel Tiger Lake processors (the 11th generation of Intel Core chips). There was a lot of talk about these processors long before their final debut (September 2). First of all, they use the redesigned Willow Cove architecture, based on last year's Sunny Cove. Mainly the cache subsystem has been changed, with larger L1, L2 and L3 caches. Intel also uses a noticeably improved 10 nm process, named 10 nm SuperFin. The name refers primarily to the new SuperFin transistors, which are meant to enable higher clocks at lower voltages. The tested Acer has an Intel Core i7-1165G7, equipped with 4 cores and 8 threads. The base clock of the processor is 2.8 GHz, with Turbo Boost 2.0 raising it to a maximum of 4.7 GHz, and the default TDP is 28 W. The processor not only supports the Thunderbolt 4 platform but is also the first to bring PCIe 4.0 to notebooks. It also supports DDR4 3200 MHz, LPDDR4X 4267 MHz and LPDDR5 5400 MHz – the latter, however, will not appear in laptops until next year.
Nvidia is offering Call of Duty: Black Ops Cold War bundled with GeForce RTX 3080 and 3090 video cards until 10 December. The promotion, however, risks satisfying few fans, since the cards are practically impossible to find.
by Manolo De Agostini, published 31 October 2020
Nvidia announced that, for a limited time, purchasing a GeForce RTX 3080 or 3090 video card, or a desktop system equipped with one of these GPUs, gets you Call of Duty: Black Ops Cold War for free. Information and participating partners can be found on this page. The promotion runs until 10 December and game redemption is possible until 11 January 2021. In addition to the standard edition of the title, buyers of the two Nvidia cards will receive the Woods Operator Pack and the Confrontation Weapons Pack.
The catch is that this bundle, although interesting and sensible given that the new CoD supports ray tracing and Nvidia's DLSS, risks ending up in the hands of only a lucky few, as the cards are practically unavailable and deliveries for those who have pre-ordered are expected to take several weeks, in several cases not arriving before 2021.
In recent days the launch of the RTX 3070 went the same way as the previous releases, with the cards gone in minutes and the websites of several stores crashing. Nvidia, contacted about the matter, has not told us anything new compared to its previous statements, so we are left with the words of the company's CEO, who has predicted demand exceeding production for the rest of the year. The company thus stands by its thesis of "unprecedented demand".
Meanwhile, Activision has released the requirements for Call of Duty: Black Ops Cold War, which you can see by clicking on the image above. You need at least a PC with an Intel Core i3-4340 or AMD FX-6300 CPU, 8 GB of RAM and an Nvidia GeForce GTX 670 / GTX 1650 or AMD Radeon HD 7950 video card to run the game. The recommended requirements call for a Core i5-2500K or AMD Ryzen 5 1600X CPU, 12 GB of RAM and a GeForce GTX 970 / GTX 1660 SUPER or AMD Radeon R9 390 / RX 580 video card.
If you want to enable ray tracing, Activision suggests an Intel Core i7-8700K or AMD Ryzen 7 1800X processor, 16 GB of RAM and an Nvidia GeForce RTX 3070 video card. For competitive players, the publisher states that even a GTX 1080 / RX Vega 64 will do.
Finally, if you want to play with ray tracing in 4K at high detail settings, you need a Core i9-9900K or Ryzen 7 3700X CPU, 16 GB of RAM and a GeForce RTX 3080 video card. Note that it is possible to install only 50 GB of the 175 GB if you just want to play multiplayer. Also, the disk space required to play in "Ultra RTX" rises to 250 GB, as higher-quality textures need to be installed.
We’re closing in on the release date for Call of Duty: Black Ops Cold War, which means it is time to take a look at the game’s PC system requirements. One thing immediately jumps out here – a requirement for up to 250GB of storage space.
Call of Duty: Modern Warfare has caused players a lot of issues over the last year due to its ever-growing file size. Unfortunately, it looks like Black Ops Cold War won’t be fixing that problem, with the full game calling for 175GB of storage space, with the top-end PC spec calling for as much as 250GB.
The minimum system requirements call for an Intel Core i3-4340 or AMD FX-6300 CPU, 8GB of RAM and an Nvidia GTX 670/GTX 1650 or AMD Radeon HD 7950 GPU. Meanwhile the recommended requirements call for an Intel Core i5-2500K or Ryzen 5 1600X CPU, 12GB of RAM and an Nvidia GTX 970 or Radeon R9 390.
If you want to play with ray-tracing, then you are going to need an Intel Core i7-8700K or AMD Ryzen 7 1800X, 16GB of RAM and a GeForce RTX 3070. Finally, if you want to run with 'Ultra RTX' settings, you'll want an Intel Core i9-9900K or AMD Ryzen 7 3700X, 16GB of RAM and an Nvidia GeForce RTX 3080.
In all cases, you are going to need 175GB of storage space for all game modes at launch. Alternatively, you can just install the multiplayer mode, which is 50GB. The spec sheet does say that the game can eat up to 250GB of space though and as we’ve seen from Modern Warfare this year, that seems like a very likely outcome in the months to come.
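If you want to check up front whether a drive has room for the 50GB multiplayer-only install, the full 175GB, or the 250GB 'Ultra RTX' footprint, a few lines of Python will do; the drive path here is just an example, not part of the official requirements:

```python
import shutil

# Install sizes quoted for Black Ops Cold War, in GB.
INSTALL_SIZES_GB = {"multiplayer only": 50, "all game modes": 175, "Ultra RTX": 250}

target_drive = "C:\\"  # example install target; change this to the drive you actually use
free_gb = shutil.disk_usage(target_drive).free / 1024**3

for mode, needed_gb in INSTALL_SIZES_GB.items():
    verdict = "fits" if free_gb >= needed_gb else "does NOT fit"
    print(f"{mode}: needs {needed_gb} GB, {free_gb:.0f} GB free -> {verdict}")
```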
KitGuru Says: The fact that Call of Duty is continuing to require so much storage space is a little crazy. Are any of you planning on getting Black Ops Cold War?
We have an RTX 3090 for our comparison of graphics cards. In the update to this article we show why the card is too fast for our system and what buyers need to know beforehand.
For gamers, the graphics card is the most important component in the PC. It largely determines how realistically the game world is presented, even more so than the CPU or RAM. In this comparison, we show how much performance the different graphics card families from AMD and Nvidia deliver. Since the beginning of our test series in 2018, we have run more than 18 different graphics cards from 14 GPU families through our test setup. We compare these results and give tips for every budget.
The article is part of our graphics cards topic, where you can also find all the individual tests of specific cards. We have also written other guides, for example on eGPUs that plug into Thunderbolt ports. We also took a look at how well Minecraft RTX runs on the cheapest RTX cards, and in our gaming topic we show how to build a VR-capable gaming PC for under 650 euros.
Some of the graphics cards in the test: GeForce RTX 3080, RTX 3090, RTX 3070, RTX 3060. Brand new in this update are the values for the Nvidia GeForce RTX 3090. We will keep adding more cards; until then, we collect all the details on prices, release dates and benchmark results from our colleagues in the article "Market overview Nvidia Ampere: All information about the RTX 3080, 3090 & Co."
Preface: the RTX 3090 shows the limits of our test system
Before the comparison, an important caveat: the RTX 3090 is probably the fastest graphics card we have tested so far. It is so fast that the other components of our test system become a bottleneck, so the RTX 3090 cannot really show its full performance.
We use a different approach for our tests than colleagues at other magazines. In 2018 we deliberately decided to use an affordable gaming platform instead of a high-end test system. For example, we use a Core i5-8400. The CPU came onto the market in 2017, has six cores and boosts up to 4 GHz. It currently costs around 200 euros, significantly less than Intel's high-end CPUs. Added to that are a mainboard with the Z370 chipset and 16 GB of RAM. Back then it was a reasonable platform for mainstream gamers, and two years later it still is, but in combination with the high-end graphics cards it reaches its limits. We notice that both with the RTX 3080 and, to an extreme degree, with the RTX 3090.
What does this mean for our tests? They are still valid, at least for anyone with a similar setup. They nicely show when the graphics card pushes the rest of the system to its limits: if the CPU isn't fast enough, the GPU just twiddles its thumbs. In other words, there is little point in plugging a 1,500-euro graphics card into a 500-euro PC.
That is why we have not (yet) updated the system. The year 2020 brought several innovations to the PC market, including both the new, fast AMD processors and PCI Express 4.0, a standard that doubles the bandwidth of the PCI Express slots. In addition to the RTX 3090, the upcoming AMD Radeon RX 6000 cards also support this interface. However, matching motherboards have so far been in short supply, although that is currently changing. For 2021 we have planned a complete upgrade of our test platform, with a better CPU and PCIe 4.0. The disadvantage is that the old values will no longer correspond 1:1 to the new tests. We will then retest the most important cards, but that may take a while.
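To put the "doubled bandwidth" claim into rough numbers, here is a short sketch using the standard per-lane PCIe transfer rates; these figures come from the PCIe specification, not from the article:

```python
# Approximate usable bandwidth of a x16 slot (including 128b/130b encoding overhead).
def x16_bandwidth_gbs(gigatransfers_per_lane: float) -> float:
    lanes = 16
    payload_ratio = 128 / 130                                  # encoding efficiency for PCIe 3.0/4.0
    return gigatransfers_per_lane * payload_ratio * lanes / 8  # GT/s -> GB/s across the slot

print(f"PCIe 3.0 x16: ~{x16_bandwidth_gbs(8.0):.1f} GB/s")     # ~15.8 GB/s
print(f"PCIe 4.0 x16: ~{x16_bandwidth_gbs(16.0):.1f} GB/s")    # ~31.5 GB/s, i.e. double
```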
Test procedure
For the reasons given above, the figures for the Zotac GeForce RTX 3080 available to us end up very close to the values of the most recently tested RTX 3070. We therefore want to point out transparently, in advance, that many of our current benchmarks simply no longer fully load the card. This becomes particularly clear in the Crysis: Remastered benchmark, where the RTX 3080 should sit much further ahead of the RTX 3070.
All of our graphics card tests take place on the same test system: a Windows 10 machine with 16 GB of RAM, an Intel Core i5-8400 CPU and a mainboard based on the Z370 chipset. This is no longer the most up-to-date hardware, but it is still current enough to be relevant.
All graphics cards then have to go through several benchmarks. These include the 3DMark scenarios Time Spy and Time Spy Extreme and, where supported, Port Royal, as well as VR Mark, which measures how well the GPUs are suited to VR headsets. To get an impression of real games in addition to the synthetic benchmarks, we also use several test routines built into games. We currently rely on the benchmarks in Far Cry 5, Metro: Exodus and Borderlands 3. The latter is our newest benchmark; it is meant to replace Far Cry 5, which is apparently slowly reaching its limits at Full HD. From this post onward we therefore sort the Far Cry 5 benchmark lists by performance at 2560 × 1440 pixels. In addition, we measure all newer cards with the HD texture pack for Far Cry 5: it not only looks better, it also provides more realistic values for the game.
No mercy: the RTX 3080 outclasses all other graphics cards in the 3DMark Time Spy and Time Spy Extreme benchmarks, and the RTX 3090 takes first place just ahead of it. That the difference is not bigger is probably down to our test system. Metro: Exodus is the most detailed benchmark and probably the one that stresses the graphics cards the most, but you have to know that actual in-game performance is usually a bit above the benchmark values. This clearly shows the misery of benchmarks: there is no single test that classifies every GPU perfectly, which is why we use several, so that a good cross-section results. Unless stated otherwise, we turn all options up to the limit or use the highest preset in each case. We measure at three resolutions: 1920 × 1080 pixels (1080p), 2560 × 1440 pixels (1440p) and 3840 × 2160 pixels (UHD). With these three resolutions we cover pretty much the entire gaming market, apart from multi-screen setups.
In Borderlands the CPU limit strikes, but the benchmark results in UHD (3840 × 2160 pixels) are enough to show where the cards rank. CPU limitation: although the graphics card handles the main load, a CPU that is too slow can prove to be a bottleneck in the medium term. This is especially noticeable when playing at Full HD resolution, which is probably why our Far Cry 5 values at 1920 × 1080 pixels are not as meaningful as those at higher resolutions. What does that mean in practice? First of all, we are leaving the setup as it is, because the CPU limit is also an important factor when measuring the system as a whole. The fact is that many users buy on a budget and do not go for a monster CPU – simply because a Core i5 or Ryzen 5 is good enough for most applications. Our values give a good insight into how the GPUs behave in such systems and whether an upgrade is worthwhile for these users.
You can see this limitation very clearly in Crysis Remastered. We ran the integrated benchmark for two cards, an RTX 2070 Super from Zotac and the RTX 3080 from MSI. At Full HD both graphics cards are close together; at 1440p, on the other hand, the difference is already enormous. So it can be worthwhile to turn the resolution up before buying a new CPU. Our c't colleagues have covered the subject of CPU limitation in depth in the article "Games as core concerns".
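A simple way to spot this effect in your own numbers is to compare how much fps you lose when stepping up the resolution: if the drop is small, the CPU rather than the GPU was the limiting factor at the lower resolution. The following is only an illustrative sketch with made-up fps values, not our actual measurement tooling:

```python
# Hypothetical fps readings for one card at three resolutions (illustrative values only).
results = {"1920x1080": 142.0, "2560x1440": 138.0, "3840x2160": 96.0}

def likely_cpu_limited(fps_low_res: float, fps_high_res: float, threshold: float = 0.10) -> bool:
    """If raising the resolution costs less than ~10% fps, the GPU probably wasn't the bottleneck."""
    return (fps_low_res - fps_high_res) / fps_low_res < threshold

print(likely_cpu_limited(results["1920x1080"], results["2560x1440"]))  # True  -> CPU-bound at 1080p
print(likely_cpu_limited(results["2560x1440"], results["3840x2160"]))  # False -> the GPU is the limit at UHD
```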
CPU throttling? In the Crysis: Remastered benchmark, the cards are significantly closer together at Full HD than at 1440p or UHD; the RTX 3090 can deliver significantly more performance at UHD.
Chipset versus end product
It is a bit difficult to get from a graphics card design, such as an RTX 2060 or an RX 5700 XT, to the concrete product of a specific manufacturer, such as a KFA2 GeForce RTX (1-Click OC) (test report) or a Gigabyte Radeon RX 5600 XT Gaming OC 8G (test report), or to compare several products from different manufacturers. To clarify how big the differences between products within a family are, we sent three RTX 2060 Super graphics cards from different manufacturers and in different designs through the same tests in the article "RTX 2060 Super: graphics cards with ray tracing in comparison".
Our result was that the different graphics cards deliver different values, but are so close to each other that one result can stand as a valid statement for the entire family – at least for users who have no special requirements such as overclocking, water cooling, extra-quiet operation or an extra-short design. So our tip here is that, if you are generally looking for a new graphics card, you can go for your favorite brand or simply go by price.
The results in the ray tracing benchmark Port Royal: the RTX 3090 just takes first place.
Frames per second: more is better
Most benchmarks deliver their results in fps, frames per second. This value indicates how many images the graphics card can calculate and display per second; basically, the more fps, the better the impression for the player. That said, you can argue about this: for a long time, 24 fps was considered a minimum target or even the perfect frame rate, supposedly making games appear "more cinematic". The reason was probably that films used to run at 24 fps – but that is because 24 fps is the absolute lower limit that most viewers still perceive as fluid; below that, it stutters massively.
Today, PC gaming is used to a better experience. Roughly speaking, 30 to 40 fps is the playable lower limit, and the majority of the game should run at more than 40 fps. But it also depends on the game: in a fast shooter such as Doom, you notice a low frame rate much more quickly than in a slow game like Minecraft or a turn-based game like Civilization VI.
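Frame rate and frame time are two views of the same number, and converting between them makes these limits more tangible. A quick sketch:

```python
# Convert frames per second into the time budget per frame, in milliseconds.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (24, 30, 40, 60, 144):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):5.1f} ms per frame")
# 24 fps leaves ~41.7 ms per frame, 60 fps ~16.7 ms, 144 fps ~6.9 ms:
# the higher the target frame rate, the less time the GPU has to render each image.
```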
What can you do here? The simplest solution is to play around with the details, the resolution and the additional options. For our tests, we turn every option up to the limit in order to put the greatest possible load on the graphics cards. If, for example, you dial the shadows back a little, simplify the water reflections, lower the anti-aliasing or generally bring the details down a level, you can usually gain a lot of performance without really noticing the difference. There are tweak guides on the net for practically every game that give an overview of what you can turn down where. In our opinion, this discussion about fps and adjusting the settings accordingly is part of PC gaming. If you don't feel like it, you are better off with a console such as the Xbox One (test report), the Switch (test report) or the PlayStation 4 Pro (test report).
The results of VR Mark, sorted by points in the Blue Room, show that the RTX 3090 is sometimes slower than other cards – but this is primarily down to the test system, not the card.
Saving tip: graphics cards up to 250 euros
Where do reasonable graphics cards currently begin? In our opinion, the sensible entry-level class sits between 130 and 250 euros. For that you get cards for Full HD gaming that are also ready for the latest VR headsets. The price breakers here are AMD's graphics cards: bargain hunters can grab an RX 580 or RX 590, and these GPUs are available at very competitive prices. We still use this card in our build guide for a cheap VR PC, and privately as well. Yes, they get warm quite quickly, have a comparatively high power draw and you can hear the fans. In return, you get this 2017 generation of graphics cards for far below 200 euros, and they are absolutely usable even with current games. An alternative are the AMD graphics cards of the RX 5500 XT family, AMD's current entry-level class; these cards do a little better in the benchmarks.
An RX 580. In this price range, Nvidia fans have the choice between the GTX 1650, the GTX 1650 Super and the GTX 1660 Super. The GTX 1650 is not really a good tip in our opinion: it is almost on par with the RX 5500 XT but costs significantly more. The only advantage is that in the range between 230 and 250 euros you get graphics cards with more VRAM, i.e. memory sitting directly on the GPU. This is particularly relevant when you want to play at high resolutions, such as UHD, and with high graphics detail. In the tests, however, it clearly shows that these cards are at home at 1440p at most.
Our tip: clearly, if you are looking for a bargain, go for the RX 580 or the RX 5500 XT, which offers much more VRAM. Both cards have enough power for 1080p gaming and VR in the current generation; an Oculus Rift S (test report) or HTC Vive Cosmos (test report) can be used with an RX 580. However, the end is foreseeable, as newer games are slowly pushing the card to its limit. Anyone who buys this card will likely have to upgrade again in a few years.
Middle class: plenty of performance up to 500 euros
The largest segment is the middle class from just under 300 to 500 euros. Both AMD and Nvidia are represented here with numerous good products. In this class you can expect Full HD gaming with all the details and high fps figures, as well as good to very good performance in 1440p games. Just above 300 euros, Nvidia's RTX 2060 class begins. RTX stands for graphics cards with hardware ray tracing, which in games that support the feature should ensure significantly better and more realistic lighting.
The KFA2 GeForce RTX (1-Click OC). A prime example of this is the Minecraft RTX beta (guide). The technology is not a must, but it will be used in more and more games. Accordingly, we would advise all buyers to get at least an RTX 2060 card; in our tests it sits above the GTX 1660 Ti. Ray tracing costs a bit of performance, but the tests show that the drop is comparatively small. And even if you don't need it, you still get fast cards.
If you don't want an RTX card, AMD is currently the better choice from a price-performance point of view. Both the RX 5600 XT and the RX 5700 XT are very good cards for under 400 euros with which you can play smoothly at 1440p or even UHD. The jump from the RX 5500 XT to the RX 5600 XT alone is enormous; the gap to the RX 5700 XT is somewhat smaller, so with price-performance in mind we would advise most buyers to go for a GPU with the RX 5600 XT.
Our tip: there will be a lot of movement in this price range over the next few months, for several reasons. First, the RTX 3070 arrives in October 2020, with an RRP of 500 euros for the Founders Edition. This means that all its predecessors, such as the RTX 2060 and RTX 2070 Super, should drop significantly in price. In addition, we could see a price drop for the cards of the RTX 2080 family, because they currently cost almost as much as the RTX 3080 while offering clearly lower performance.
Finally, AMD also wants a say. The successors to the current RX 5000 cards are to be presented in October. With the price pressure from Nvidia, AMD will hopefully have a reasonable counter-offer, otherwise things do not look good here. So if you currently want to spend around 500 euros on a graphics card, we strongly advise waiting another month or two; you will probably get a lot more for your money then.
Upper class: everything for gaming from just under 510 euros
In our opinion, the upper class of graphics cards starts here, because from around 510 euros you get cards from the RTX 2070 Super family. These score very well in our tests, both at 1440p and at UHD. Whether Borderlands 3 or Metro: Exodus, these graphics cards simply deliver very good performance without you having to compromise much on the details. They also beat the RX 5700 XT, so we would recommend reaching for Nvidia if you can spend that much money. The advantage is that you get ray tracing on board and should be set for the next few years.
The MSI GeForce RTX 3080 Gaming X Trio: it's huge, heavy and wide and comes with three fans, but the card heats up noticeably during operation.
For around 800 euros you get the first graphics cards with the RTX 3080. These are currently the best graphics cards you can put in your PC; as the results show, they outclass the competition.
Our tip: in the high-end segment, all components have to work together for the graphics card to really show its strengths. This is especially noticeable with the RTX 3070, where the gain is less dramatic than with the RTX 3080.
But if you still have a little patience, you should wait for the new graphics cards from AMD. The company has presented the Radeon RX 6800, RX 6800 XT and RX 6900 XT, which deliver very good initial figures. More on this in our guide to the Radeon RX 6000.
Conclusion
The RTX 3090 benchmarks show when the remaining components start to matter: at some point the GPU can no longer carry the system on its own. So anyone who puts such a graphics card into their PC needs the right environment, otherwise the performance gain is simply too small. If you don't already have a high-end system (or are firmly planning an upgrade), you don't have to spend that much. But we are curious to see how the GPU performs once our new test environment is up and running.
As always, the budget defines what you get. The good news for everyone who wants to play at Full HD or below: 180 euros is enough. AMD is so strong in this price segment that Nvidia simply can't get a foothold. If you are looking for a simple upgrade for an existing system or want to put together a gaming rig that is as inexpensive as possible, you are very well served with an RX 580 or an RX 5500 XT.
The greatest movement is in the middle and upper class. Not only is it worth waiting for the RTX 3070; the price point of the RTX 3080 also puts all other graphics cards in a different light. For high-end gaming we would currently recommend planning on the 800 euros for the RTX 3080 – or waiting a while until the prices of the remaining cards continue to fall.
Fancy an upgrade? In addition to this article, we also recommend taking a look at our motherboard advice for AMD CPUs and at Intel mainboards for PC hobbyists. We provide an overview of suitable processors in the article "Power, Penunzen, Processors: Price-Performance Guide CPU". And if you are looking for a suitable Full HD monitor, try our comparison test of four Full HD monitors for gamers.
Permalink: https://techstage.de/-628008
(Pocket-lint) – If you’re a serious gamer or just cannot get enough of gaming goodness, then you’ve no doubt contemplated splashing out some serious cash on a nifty monitor to either get the edge over the competition or just further immerse yourself in the gaming world.
There’s a lot of choice out there though and you might be struggling to work out what the right monitor is for your needs and your budget. Not to worry, we’ve got you covered. We’ve been gaming with all manner of screens to bring you a list of our favourites and the very best gaming monitors currently available.
The best monitor: Top 4K, Full HD and Quad HD options for creatives
Best 27-inch fast refresh rate monitor
MSI Optix MAG272CQR
squirrel_widget_247168
27-inch 2560 x 1440 (WQHD) VA panel
165Hz refresh rate, 1ms response time
1500R Curvature
300 nits brightness, 3000:1 contrast ratio
100% sRGB, HDR Ready, AMD FreeSync
1x DP (1.2a), 2x HDMI (2.0b), 1x USB Type C (DisplayPort Alternate)
Serious gamers swear by 27-inch monitors. Compact, fast refresh rate, responsive panels and more lead to a great gaming experience that’s perfect for fast-paced shooters or competitive gaming sessions.
This MSI monitor seemingly packs a wealth of awesome features and specs into a sleek and affordable package. 1440p resolution, 165Hz refresh rate, 1ms response time, HDR and more all make this monitor highly appealing on paper.
In the flesh too, it’s just as pleasing. Narrow bezels, brilliant colour accuracy and some serious gaming prowess make the MAG272CQR a joy to game on.
Of course, you can tweak the settings dabbling with everything from eye-care settings for during the working day to HDR, FreeSync and faster response rates for gaming. But even out of the box we were impressed with how good this monitor looked and how nice it was to use.
Other highlights include software that allows you to tweak settings in Windows via an app, rather than faffing about with a button-led menu. There's also a special easy-access button on the left side that can be programmed with a macro that activates specific modes or settings with a single press rather than having to mess about with menu options. For example, you can set it to activate "night vision" mode that's designed to give you the edge in nighttime scenes in a game and can be switched on and off at will.
There’s a lot to like about this monitor and the affordable pricetag is almost certainly an added bonus.
Most affordable ultrawide
AOC CU34G2X 34-inch ultrawide
squirrel_widget_188727
Ultrawide 3440 x 1440
AMD FreeSync
144Hz refresh rate/1ms response time
VA based panel
WLED backlight
3000:1 contrast ratio
For the price, the screen on this thing is impressive. It’s a 34-inch panel that sports a 3440 x 1440 resolution with 21:9 aspect ratio. All of this combines to create a screen that’s brilliant for those who want a really immersive gaming experience.
For gamers, other bonus features include the AMD FreeSync technology, with 144Hz refresh rate and 1ms response time, meaning you get fast, fluid response without the tearing and aliasing you would get from lesser displays. Assuming you have a compatible AMD graphics card of course.
This screen is packed full of controls and options too. There are all sorts of settings to help you fine-tune elements, like brightness, refresh rate, contrast, colours and so on. These settings include things like:
Low Blue Light mode (reading, office, internet, multimedia)
Eco mode
HDR modes (display, game, picture and movie)
Colour temperature
Game mode (FPS, RTS, Racing, Gamer 1, Gamer 2 and Gamer 3)
Overdrive settings (to adjust response time)
Motion Blur Reduction
We found we were tweaking a lot to get it looking nice and that varied depending on the games we were playing too (HDR, for example, makes the most sense in games that support it). But it’s nice to have a range of controls that are useful not only for gaming but working too. Eco mode and the low blue light settings, for example, are great for making the screen easier on the eye when surfing, working or otherwise engaging in non-gaming activities.
As for VA display technology, that generally means you get more vibrant colours and deeper contrast than IPS/LCD based, but the viewing angles aren’t quite as good. Still, with a screen this big, set to the right height and with its curved design, viewing angles aren’t really an issue at all.
This AOC monitor was certainly a pleasure to use. It’s rich in colour, sports a suitable curve and thin bezels too. Other small highlights include a stand that’s not too imposing, meaning it’s easy to fit and move around on your desk. As well as plenty of options in terms of height and tilt too.
Multiple HDMI and DisplayPort connection options also mean you can take advantage of the screen’s picture-in-picture mode too, if you’re really feeling crazy. In short, the CU34G2X offers a lot of bang for your buck. It’s feature-rich, fun to use and full of gaming goodness too.
Huge and splendid 4K gaming
Acer Predator CG7 gaming monitor
3840 x 2160 @ 120 Hz, 16:9 aspect ratio, 1 ms VRB response time
VA based panel
4,000:1 contrast ratio
G-sync compatible
HDR1000, DCI-P3 90 per cent wide colour gamut
If you believe that bigger is always better, then there’s good news in the form of the Acer Predator CG7. This is a bit of a monster screen. It’s huge, imposing and packed full of specs that aren’t to be sniffed at.
This is a 43-inch display that has up to 144Hz refresh rate, 1ms VRB response time, G-sync compatibility and is VESA Certified DISPLAYHDR 1000. All that means you get a glorious, bright, colourful and impressive viewing experience when playing games.
Sure, there are a few niggles – the bezels are massive for example – but the CG7 is undeniably fun to play on.
We used it to play Red Dead Redemption 2, Call of Duty Modern Warfare, Gravel, Wreckfest, Kingdom Come Deliverance and more, with mixed results. As you might expect, the highest settings on Red Dead really tax your system even if you have a monster machine. With 4K, HDR and ultra settings turned on you get a paltry 40FPS, but it does look glorious. Lesser games fare better, but if you want to make the most of the high refresh rate you’ll need to consider lower visual settings.
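To show why 4K is such a heavy ask, here is a purely illustrative comparison of how many pixels the GPU has to shade per frame and per second. Only the 40fps figure at 4K comes from our Red Dead Redemption 2 result above; the 1080p/1440p entries are hypothetical comparison points and the 120Hz figure is the panel's listed refresh ceiling.

```python
# Illustrative pixel-throughput comparison. Real GPU load also depends on settings,
# but raw pixel count is a reasonable first-order proxy for why 4K is so demanding.
modes = {
    "1080p @ 144 fps (hypothetical)": (1920, 1080, 144),
    "1440p @ 144 fps (hypothetical)": (2560, 1440, 144),
    "4K @ 40 fps (RDR2, ultra + HDR)": (3840, 2160, 40),
    "4K @ 120 fps (panel's listed max)": (3840, 2160, 120),
}

for name, (w, h, fps) in modes.items():
    per_frame = w * h                      # pixels rendered per frame
    per_second = per_frame * fps           # pixels rendered per second
    print(f"{name:<34} {per_frame / 1e6:5.2f} Mpx/frame  {per_second / 1e9:5.2f} Gpx/s")
```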
That said, the Acer Predator CG7 is a real eye-pleaser. Colours are rich, visuals are stunning and the sheer amount of space you have to game on is great too.
Other highlights of this monitor include a multitude of connection options with three HDMI and two DisplayPort connections allowing you to connect several devices. USB passthrough means you can plug in peripherals with ease too. The menu system allows you to easily switch between various gaming visual settings including racing, action, sports, eco, HDR and more. You can also adjust to filter blue light, tweak HDR settings and more here as well.
The Acer Predator CG7 has built-in speakers, but we found them to be a bit tinny and would recommend opting for a dedicated speaker system or gaming headset instead.
That said, this is one heck of a gaming screen, if you have the space and the cash then it’s well worth considering. It also has the added bonus of being great for video editing, watching films and more. Though we did find working on it gave us neck ache.
Affordable 4K/G-Sync monitor for gamers
AOC AG271UG 4K IPS monitor
4K UHD (3840 x 2160) 16:9
G-Sync
WLED backlit IPS
4ms response time/60Hz refresh rate
Looking at a spec list like that on the AOC AG271UG, and you’d probably assume an eye-watering price tag. But, because it’s AOC, you don’t get one. It’s not a cheap monitor ($699/£579), but at the same time, it’s way more affordable than similarly-specced big brand computer displays.
AOC is a brand that’s built its reputation on offering great specs and features for a fraction of the price of its big-name competitors. For those gamers looking for high-resolution images over stupid-fast refresh rates, the AOC could be perfect.
At 60Hz and with its 4ms response time, it’s still no slouch, and when you add that to the Nvidia G-Sync capabilities to minimise lag and tearing, you do still get a swift, smooth performance, providing you have a PC powerful enough to handle gaming at that resolution. You can switch between a handful of gaming modes, which include racing, FPS, RTS and “gamer”.
Being IPS and 4K UHD also means it’s a great panel for editing video, photos and general all-round media consumption too. Details are sharp, and the colours are well balanced and vibrant without being overly saturated. We did find at times that it over-sharpened a little, but not so much that it tarnished the experience. Viewing angles are superb too, with very little in the way of colour shift when you change your angle of view.
As with pretty much any anti-glare matte-finish display, there is an ever so slightly fuzzy, almost rainbow-like overlay to everything, but it’s so subtle that it only really becomes visible when looking at plain white visuals. It’s not there at all during gaming, and it’s very easy to ignore.
Of course, there are plenty of customisation options, like the blue light filter, for those who want to go on marathon gaming sessions with minimal eye strain. Controls are easy enough to use as well, thanks to having well-indicated positions on the bottom bezel. What’s more, there’s a whole host of ports on the back. You get four USB 3.0 ports, an HDMI 1.4 port, one DisplayPort 1.2 and a 3.5mm headphone jack.
Unlike some of the other monitors on this list, there is some assembly involved in setting up the AOC monitor, and the build quality isn’t quite as high, but it’s still one of the most ergonomically versatile. It has an impressive 130mm of height adjustment, to help you get it to a comfortable eye level, and it can pivot on its base. There’s a decent amount of tilt too, between -3.5 and 21.5 degrees.
As if all of that isn’t enough, it has two built-in 2W speakers, although we did find the audio left a lot to be desired. It was a little weak, especially in comparison to dedicated speakers.
Still, if you’re after a versatile monitor with a really high resolution that can cope with your Nvidia GPU-powered gaming, this is a really great option. We really enjoyed our time with it.
Ultrawide, ultra-fast, and advanced eye-tracking
Acer Predator Z301CT with Tobii eye-tracking
Ultrawide Full HD – 2560 x 1080
G-sync
200Hz refresh rate/4ms response
1800R curved VA panel
Tobii eye-tracking built-in
Like many other gaming-focused monitors, the Acer Predator Z301Ct uses a VA panel, which means lots of contrast and saturated colours. Of course, that also means colour accuracy isn’t the best, and the 2560 x 1080 resolution isn’t the sharpest either. But with that said, the sacrifice in pixels is well worth it to get all the other features this monitor offers for $799 or $719, especially if you’re into high framerate FPS style games.
Starting with the basics, the 29.5-inch ultrawide Predator has a 4ms response time and impressive 200Hz refresh rate. That means the sky is virtually the limit in regards to high frame rates if your PC supports them. Our test PC runs a GTX 1060 with an Intel Core i5 processor and SSD for game play. With this, and games running at the full 2560×1080 resolution with maximum rendering quality enabled, the monitor ran consistently – almost flawlessly – at 60fps.
The games we played were capped at 60fps, but our experience suggests this monitor is more than capable of going well beyond that. It stuck like glue to 60fps the entire time we played, except for two brief dips to 57fps. With Nvidia G-Sync built in, that also meant a really clean, stutter- and tear-free experience.
You get plenty of calibration options as well as a handful of preset modes custom-tuned to suit different game types. All of this controllable using a nifty little directional joystick on the back of the monitor.
Perhaps the monitor’s biggest unique selling point is the built-in Tobii eye-tracking bar. With drivers installed and the monitor connected via a USB cable, it works in tandem with FPS games that require quick movement: games that would normally require you to move your field of view with a mouse or right stick no longer need that manual input. The Tobii bar on the bottom of the monitor detects when your eyes change direction and automatically moves your focus point on screen.
Moving on to the design and ports, the Acer shines here too. The stand – although rather ostentatious – is among the most articulate available. You can tilt the screen -5 to 25 degrees, adjust the height up to 120mm and pivot the screen, ensuring you can get the angle perfect with a little manipulation.
What’s more, it’s not exactly short on ports and other hardware features either. As well as the additional Tobii eye-tracking bar, it has two speakers built-in (which aren’t great, but they work). It also has HDMI, DisplayPort and USB 3.0 ports as well as a 3.5mm headphone jack.
If you can live without the higher resolution offered by QHD or 4K monitors, this is a fantastically fast monitor. Combined with the Tobii eye-tracking technology built-in as standard, and all the other features combined, one could almost describe it as great value for money despite the current price tag.
When Ultrawide isn’t wide enough
Samsung CRG9 super ultrawide curved gaming monitor
32:9 Ultrawide 49-inch – 5120 x 1440 QLED Panel
120Hz refresh rate, HDR1000, 95 per cent DCI-P3,
AMD FreeSync 2
Various gaming picture modes and settings
Picture-by-Picture display capable
If you like the idea of ultra-wide gaming and really want to go all out with your purchase then look no further than the gargantuan Samsung CRG9. This thing is a glorious 49-inch curved gaming monitor that’s similar to putting two 27-inch screens side-by-side, except without all the nonsense of bezels in the way.
This model is a step up from the previous massive ultrawide monitor from Samsung in a number of ways, not least of which is an increase in resolution which now gives you 5120 x 1440 pixels to play with. HDR1000, 1,000 nits of brightness, AMD FreeSync 2, 120Hz refresh and more result in a magnificent viewing experience.
For work, this screen offers enough space for multiple windows side-by-side making it a multi-tasking marvel. It also has multiple connection options including two DisplayPort 1.4 and one HDMI 2.0 ports. Combine this with the monitor’s Picture-by-Picture display technology and you can actually view two different video sources on the screen at the same time with a 16:9 display ratio to boot!
We found that for general use you have to move your eyes around a lot to make the most of this screen, but you get so much space to work with that it’s a multitasker’s dream. Eye saver mode also helps take the edge off harsh backlighting during the day, meaning you can save your eyes for gaming at night.
And it’s with gaming that the CRG9 shines. The curved QLED panel and large 32:9 aspect ratio gives you an utterly thrilling immersive gaming experience. We used it to play games like Red Dead Redemption 2 and found we were utterly wrapped up in the gaming visuals in a thoroughly eye-pleasing way. It is worth noting though, that you do need a beast of a gaming machine to power that 5120 x 1440 on ultra settings and still get a decent FPS.
There are multiple settings profiles built into the menu that you can switch between on the fly with three quick-access buttons underneath, so you can program the monitor to react the way you want to specific games. Game settings include visual modes such as FPS, RTS, RPG, AOS, High Brightness, sRGB and Cinema. The result is a crisp, dynamic and satisfying gaming experience that’s as smooth as it is joyful.
This is one monitor we were sad to see leave the office and one we’re seriously considering purchasing ourselves too.
With the Samsung Odyssey G9, Samsung took the CRG9 and improved it further to result in a seriously incredible gaming panel. The specs of this monster include a 49-inch QLED display with 1,000R curve, 32:9 aspect ratio, 5120 x 1440 resolution, HDR 1000, 1ms response time and much more besides. Upgrades include a 240Hz refresh rate and G-sync compatibility which makes it even more pleasing to game on with a beautiful and wide view of the gaming world.
If you need it, you can also use picture-by-picture mode to convert the G9 into two 27-inch displays meaning you can use it at 16:9 for streaming on Twitch (for example) with ease or with two different machines at once.
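As a quick sanity check on those numbers, the panel's 5120 x 1440 resolution splits exactly into two 2560 x 1440 (16:9) halves for picture-by-picture, and the full panel actually pushes slightly fewer pixels than a single 4K screen. A small, purely illustrative calculation:

```python
# Illustrative: the 32:9 super-ultrawide versus a 16:9 4K panel, and the PBP split.
superwide = (5120, 1440)
uhd = (3840, 2160)

half = (superwide[0] // 2, superwide[1])
print("PBP half:", half, "aspect ratio:", round(half[0] / half[1], 2))  # (2560, 1440), ~1.78 = 16:9
print("5120 x 1440 pixels:", superwide[0] * superwide[1])               # 7,372,800
print("3840 x 2160 pixels:", uhd[0] * uhd[1])                           # 8,294,400
```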
The Samsung Odyssey G9 is an absolute joy to game or work on, with a wonderfully immersive wraparound display and masses of features that make it worth every penny.
Lenovo Legion Y44w-10 WLED curved gaming monitor
43.4-inch ultra-wide – 3840 x 1200
144Hz refresh rate, 4ms response time
NearEdgeless 1800R curved panel
sRGB, BT.709, DCI-P3 colour gamut
AMD FreeSync 2, VESA certified DisplayHDR 400
Detachable Harman Kardon certified speaker
2 x HDMI 2.0; 1 x DP1.4; 1 x USB 3.1 Type-C Gen2(DisplayPort 1.4 Alt Mode); 1 x USB 3.1 Type-C Gen1(DisplayPort 1.2 Alt Mode); 1 x Audio Out
They say size isn’t everything, well Lenovo is certainly showing that size can be something with this ultra-wide monitor. This is a curved gaming monitor with some impressive specs that include a 3840 x 1200 resolution, 144Hz refresh rate, 4ms response time and AMD FreeSync 2.
We liked just how ridiculously easy the Legion Y44w-10 was to set up. Pop on the mount, slide the backing on, screw it in and you’re away. This screen then went on to please in a number of other ways, and not just through the sheer splendour of all that screen real estate.
This is an HDR certified display, meaning you can drool over the visuals offered by your favourite games (if they support it). It also has multiple display settings that are easily accessible from front panel buttons and include different profiles depending on the style of games you’re playing.
A bonus addition is the blue light filter that can be applied to make this monitor easier on the eye when you need it to be.
We love plenty of other things about this monitor too – like the multitude of connection options that include USB passthrough so you can connect peripherals directly into the monitor from a little front panel that drops down from below. There’s also an RGB backlit, Harman Kardon speaker that sits in the base (and can be removed if you don’t want it) which adds some surprisingly good sounds to go along with the visual delights.
The Lenovo Legion Y44w-10 might be pricey, but you get plenty of screen for your money and plenty of fun too! Multi-task in Windows during the day, then get lost in your games at night with this cracking piece of kit.
Activision has revealed the full PC specifications for Call of Duty: Black Ops Cold War, and the storage requirements are hefty. If you’re planning to play the game in 4K with ultra settings enabled and ray tracing, you’ll need to free up 250GB of space. That’s the top end of the specs, but anything else still requires 175GB of space. That’s a lot of storage, and for some, it could take up an entire 250GB SSD or get very close to it.
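As a rough illustration of why a nominal 250GB SSD is a problem here: drives are marketed in decimal gigabytes while operating systems report binary gibibytes, so a "250GB" drive exposes only around 232 GiB before Windows or anything else takes a share. A back-of-the-envelope check, assuming the quoted install sizes are also decimal gigabytes:

```python
# Illustrative only: marketed (decimal GB) capacities versus what the OS reports (binary GiB).
def to_gib(gb: float) -> float:
    """Convert decimal gigabytes (drive marketing) to binary gibibytes (OS-reported)."""
    return gb * 1e9 / 2**30

for label, gb in [("250GB SSD", 250), ("Ultra RTX install", 250), ("standard install", 175)]:
    print(f"{label:<18} {gb} GB ~= {to_gib(gb):6.1f} GiB")
# A 250GB drive shows roughly 232.8 GiB empty, so the full Ultra RTX install
# effectively claims the entire drive, and the 175GB install is not far behind.
```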
Thankfully, if you’re only planning to play the multiplayer mode, then the space requirement drops to just 50GB.
Aside from storage requirements, Activision is also recommending Nvidia’s latest RTX 3080 graphics card for 4K ultra settings and ray tracing or the RTX 3070 for ray tracing in general. The recommended specs for the full game include a GTX 970 or equivalent, Intel Core i5-2500K or equivalent, and 12GB of RAM. At the very minimum, for the multiplayer mode, you’ll need a GTX 670, 8GB of RAM, and an Intel Core i3-4340.
Activision hasn’t revealed how big Call of Duty: Black Ops Cold War will be on existing consoles or even next-gen ones like the Xbox Series X and PS5. It’s unlikely to take up 250GB of space on consoles, but given the existing storage requirements for Call Of Duty: Modern Warfare, these PC specifications don’t look encouraging.
Call Of Duty: Modern Warfare has been criticized for its high storage requirements, and a recent patch reduced the game from more than 220GB to less than 170GB. Hopefully we’ll get some similar reductions for Call of Duty: Black Ops Cold War at some point in the future. The latest Call of Duty installment will launch on November 13th.
Be quiet! is no stranger to closed-loop liquid CPU coolers. In fact, I still use the company’s previous all-in-one cooler in one of my own systems. The Silent Loop AIO from be quiet! was supplied by Alphacool, so it was one of the very few closed-loop coolers to utilise a copper radiator. Be quiet! has continued its trend of standing out from the crowd with its latest AIO design, as the new Pure Loop features a pump system attached to the tubing rather than being integrated into the CPU block. How does this approach affect thermal performance? Let’s find out.
Watch via our Vimeo channel (below) or over on YouTube at 2160p HERE
The new be quiet! Pure Loop is probably the most interesting closed-loop all-in-one CPU cooler to be launched this year, with its method of coolant delivery via an inline pump mounted in the tubing. KitGuru has never tested an AIO with this coolant delivery system, so we are very eager to find out what its benefits are and whether it improves or hinders performance.
We were initially unsure about the OEM supplier of the Pure Loop. However, after a little research of other AIOs with similar inline pump arrangements, one particular product stood out.
Components of the Enermax LIQFusion, launched in 2018, look extremely similar to those of the Pure Loop: the radiator and CPU block base plate are almost identical, and both coolers share the inline pump design. be quiet! later confirmed that the Pure Loop series is in fact supplied by KD Industrial, an OEM that also supplies coolers to Enermax.
As usual with be quiet! products, there is a focus on noise levels with the Pure Loop AIO, the cooler is equipped with 120mm or 140mm Pure Wings 2 fans depending on radiator size. These fans have been around for a while and we are very familiar with their low noise levels. This review will focus on the Pure Loop 360mm which is equipped with three 120mm Pure Wings 2 high-speed fans, with PWM control up to 2000 RPM.
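Because the 360mm model relies on three PWM-controlled fans topping out around 2000 RPM, the noise level you actually hear will come down to the fan curve set in the BIOS or fan software. The sketch below is a generic, hypothetical linear curve rather than anything be quiet! ships; the temperature breakpoints, duty floor and the assumption that RPM scales roughly linearly with PWM duty are all ours, purely for illustration.

```python
# Hypothetical fan curve: map CPU temperature to PWM duty, then estimate fan speed,
# assuming RPM scales roughly linearly up to the fans' 2000 RPM maximum.
MAX_RPM = 2000
MIN_DUTY, MAX_DUTY = 0.30, 1.00    # illustrative duty floor and ceiling
T_QUIET, T_FULL = 45.0, 80.0       # illustrative temperature breakpoints in degrees C

def pwm_duty(temp_c: float) -> float:
    """Return PWM duty for a given CPU temperature on a simple linear ramp."""
    if temp_c <= T_QUIET:
        return MIN_DUTY
    if temp_c >= T_FULL:
        return MAX_DUTY
    span = (temp_c - T_QUIET) / (T_FULL - T_QUIET)
    return MIN_DUTY + span * (MAX_DUTY - MIN_DUTY)

for t in (40, 55, 70, 85):
    duty = pwm_duty(t)
    print(f"{t} degC -> {duty:.0%} duty, ~{duty * MAX_RPM:.0f} RPM")
```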
Having the pump integrated into the tubing potentially allows other areas of this cooler to be designed differently to the majority of other AIO coolers currently available, such as the size of the CPU block. Without having the pump mounted to the CPU block the dimensions of the CPU block can be reduced, which in my opinion should improve the appearance. There will also be less wiring at the CPU block which again will tidy up the look of the installation.
The CPU block features simple white LED illumination, so there is just a single 3-pin power cable attached to the block; any additional wiring is attached to the pump, which sits close to the radiator. Logically, this should make cable management easier, which again will improve the final appearance once everything is installed.
Another interesting feature of the Pure Loop AIO series is that the system is refillable. When be quiet! originally introduced the Pure Loop, the company was keen to emphasise its opinion on AIO refilling: be quiet! claims that closed-loop AIO coolers should be checked and refilled every two years. There is a refill port integrated into the radiator, and be quiet! includes a bottle of coolant for periodic topping up, which the company claims guarantees a long lifespan.
The Pure Loop AIO series is part of the be quiet! essential range which means it is competitively priced without compromising on quality. The range starts with a suggested retail price of £82.99 for the 120mm version, increasing to £114.99 for the flagship 360mm model. This is by no means the cheapest AIO cooler on the market, but it is priced lower than 360mm offerings from some other premium brands.
In terms of CPU socket compatibility, the be quiet! Pure Loop series supports all current mainstream desktop platforms from Intel and AMD including Intel socket LGA 1200 / 2066 / 1150 / 1151 / 1155 / 2011(-3) square ILM and AMD socket AM4 / AM3(+).
be quiet! doesn’t believe in labelling CPU coolers with TDP ratings. Instead, the company suggests that the 120mm version is suitable for use with Intel Core i3, AMD Ryzen 3 (or lower) systems, the 240mm with Intel Core i5, AMD Ryzen 5 and the 280mm/360mm versions should offer the thermal performance to cope with Intel Core i5/i7, AMD Ryzen 5/7 and Intel Core i7/i9, AMD Ryzen 7/9 CPUs respectively.
The Watch Dogs series will be remembered as a set of sandboxes with great ambitions which, despite loud announcements and interesting ideas, never threatened the position of Grand Theft Auto, but did find a group of faithful admirers. Each instalment has arrived with relatively high hardware requirements, and the newest one adds ray tracing and DLSS on top of the standard graphics options. The impact of these extras will of course be covered in our Watch Dogs Legion performance test, although results without the features dedicated to GeForce RTX 2000/3000 cards will also be included. So what hardware do you need to calmly wander the streets of London and fight for freedom?
Author: Sebastian Oktaba
Watch Dogs has undergone a serious metamorphosis over its history, because each subsequent edition has shifted to a noticeably looser atmosphere. The first was a dark story about the cold revenge of a wronged man, the second told of the subversive actions of a homegrown hacker group, while the third shows the fight against a repressive system in an even more chaotic and exaggerated form. The title Legion is a metaphorical reference to a disobedient society whose representatives, belonging to various professions, social classes, races and religions, organise a militant underground against the vile organisation ruling the city. Literally anyone can become a subversive, because in the new incarnation of Watch Dogs the single hero has been replaced with a whole crowd of not very expressive NPCs. This approach will not suit everyone, nor will the greater emphasis on developing and equipping agents, but the rest of the mechanics remain roughly as before: bypassing security, moving freely around the city, manipulating urban infrastructure, and so on.
Watch Dogs Legion brings not only a change of environment, gameplay rules and story building, but also introduces new graphics options dedicated to GeForce RTX 2000 / 3000 – ray tracing and DLSS.
Ubisoft has prepared detailed hardware requirements, divided into RTX OFF and RTX ON (with DLSS active) categories. Because the developer lists only NVIDIA products for the graphics card, I have chosen the AMD equivalents myself. A basic computer for Watch Dogs Legion, capable of High settings at 1920 x 1080, should have an AMD Ryzen 5 1600 / Intel Core i7-4790 and a graphics card along the lines of an NVIDIA GeForce GTX 1060 / AMD Radeon RX 580. The requirements for 2560 x 1440 High rise to an AMD Ryzen 5 3600 / Intel Core i7-9700K and a GeForce RTX 2060 SUPER / AMD Radeon RX 5700. To play at 3840 x 2160 you need an AMD Ryzen 7 3700X / Intel Core i7-9700K and a GeForce RTX 2080 Ti / AMD Radeon VII. If you want to enable ray tracing and DLSS, the requirements for processors rise again, but above all for the graphics card, where, depending on the resolution, you need a GeForce RTX 2060 (1920 x 1080), GeForce RTX 3070 (2560 x 1440) or GeForce RTX 3080 (3840 x 2160).
1920 x 1080 High, RTX OFF:
AMD processor: Ryzen 5 1600
Intel processor: Core i7-4790
NVIDIA graphics card: GeForce GTX 1060
AMD graphics card: Radeon RX 580
Memory: 8 GB
Disk space: 45 GB
Operating system: Windows 10 64-bit

2560 x 1440 High, RTX OFF:
AMD processor: Ryzen 5 3600
Intel processor: Core i7-9700
NVIDIA graphics card: GeForce RTX 2060 SUPER
AMD graphics card: Radeon RX 5700
Memory: 16 GB
Disk space: 45 GB
Operating system: Windows 10 64-bit

3840 x 2160 Ultra, RTX OFF:
AMD processor: Ryzen 7 3700X
Intel processor: Core i7-9700K
NVIDIA graphics card: GeForce RTX 2080 Ti
AMD graphics card: Radeon VII
Memory: 16 GB
Disk space: 65 GB
Operating system: Windows 10 64-bit

1920 x 1080 High, RTX ON + DLSS:
AMD processor: Ryzen 5 2600
Intel processor: Core i5-8400
NVIDIA graphics card: GeForce RTX 2060
AMD graphics card: –
Memory: 16 GB
Disk space: 45 GB
Operating system: Windows 10 64-bit

2560 x 1440 Very High, RTX ON + DLSS:
AMD processor: Ryzen 5 3600
Intel processor: Core i7-8400
NVIDIA graphics card: GeForce RTX 3070
AMD graphics card: –
Memory: 16 GB
Disk space: 45 GB
Operating system: Windows 10 64-bit

3840 x 2160 Ultra, RTX ON + DLSS:
AMD processor: Ryzen 7 3700X
Intel processor: Core i9-9900K
NVIDIA graphics card: GeForce RTX 3080
AMD graphics card: –
Memory: 16 GB
Disk space: 65 GB
Operating system: Windows 10 64-bit
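For readers who want to check their own parts against these tiers, the lists above can be boiled down to a small lookup structure. This is only a convenience sketch built from the RTX OFF tiers as reconstructed here, not an official Ubisoft tool, and the comparison is deliberately naive: it matches GPU names exactly rather than judging real-world equivalence.

```python
# Naive self-check against the (reconstructed) Watch Dogs Legion RTX OFF requirement tiers.
TIERS = {
    "1080p High, RTX off": {"gpu": {"GeForce GTX 1060", "Radeon RX 580"}, "ram_gb": 8, "disk_gb": 45},
    "1440p High, RTX off": {"gpu": {"GeForce RTX 2060 SUPER", "Radeon RX 5700"}, "ram_gb": 16, "disk_gb": 45},
    "4K Ultra, RTX off": {"gpu": {"GeForce RTX 2080 Ti", "Radeon VII"}, "ram_gb": 16, "disk_gb": 65},
}

def tiers_met(gpu: str, ram_gb: int, disk_gb: int) -> list[str]:
    """Return the tiers whose listed GPU matches exactly and whose RAM/disk requirements fit."""
    return [name for name, req in TIERS.items()
            if gpu in req["gpu"] and ram_gb >= req["ram_gb"] and disk_gb >= req["disk_gb"]]

print(tiers_met("Radeon RX 5700", ram_gb=16, disk_gb=100))  # ['1440p High, RTX off']
```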
Official materials from the publisher do not name the graphics engine on which Watch Dogs Legion was built, so it can only be assumed that it is a modified version of its predecessor's engine (Disrupt). On the other hand, some distinctive options have disappeared, which could point to a closer relationship with Anvil Next, the engine driving the Assassin's Creed franchise. In any case, DirectX 12 is finally in use, which should be quite useful for a large open-world production that likes to push a lot of data through the CPU. The newer API has also made it possible to implement ray tracing, responsible for realistic real-time reflections visible in, among other things, puddles, windows and wet surfaces. Together with NVIDIA, the developer has also introduced the DLSS technique, available in four variants. Both of the above goodies are reserved for owners of NVIDIA GeForce RTX 2000 and RTX 3000 series graphics cards, and Watch Dogs Legion is being bundled for free with the latter. All GeForce models used in this test were provided by KFA2.
In early September, Intel introduced its 11th generation Tiger Lake processors; since then, other manufacturers have been revealing new products that will go on sale in the coming months. Another company to unveil upcoming laptops is Dynabook, which took over the notebook segment from Toshiba. The manufacturer has disclosed details of the Portégé X30L-J and Portégé X40-J, which will soon be available in Poland. The new X30L-J and X40-J join a growing group of Dynabook models that meet Microsoft's stringent requirements for Secured-core devices, designed to resist current and future cyber threats. Dynabook's proprietary BIOS, hardware, software and identity protection keep you safe from the first time you start the laptop. Both devices offer enterprise-class encryption and efficient face and fingerprint recognition.
Dynabook has announced the introduction of the new Portégé X30L-J and Portégé X40-J laptops in Poland, featuring 11th generation Intel Tiger Lake-U processors.
The two new Dynabook laptops, the Portégé X30L-J and Portégé X40-J, will go on sale in the coming weeks. The first is very light, weighing only 906 grams. The business ultrabooks will be equipped with Intel Tiger Lake-U processors: the Core i3-1115G4, Core i5-1135G7 and Core i7-1165G7, paired with integrated Intel UHD Graphics or Intel Iris Xe Graphics based on the new Xe-LP architecture. The default TDP of all processors is set to 28 W, a configuration the systems are meant to sustain regardless of load. Both laptops will also support up to 32 GB of DDR4 RAM clocked at 3200 MHz, and there will be a single M.2 PCIe SSD with a capacity of 1 TB on board.
These are the first devices built to a new design, with refreshed colours and a housing with a slim three-sided bezel that increases the screen area. Both laptops offer different screen options, including an FHD in-cell touch panel and a privacy filter. The Portégé X30L-J can be equipped with a matte 13.3-inch Sharp IGZO screen with Full HD resolution, brightness up to 470 nits and low energy consumption. The new devices meet the MIL-STD-810G standard, so they are ready to work in harsher-than-standard conditions. The Dynabook Portégé X30L-J has a light and durable magnesium alloy housing that provides a good balance between strength and flexibility. A separate Airflow cooling system keeps both units at optimal temperatures, and the rubber base lifts the laptop for an airy, stable and comfortable feel. Both notebooks will also be equipped with a 4-cell, 53 Wh battery, good for up to 14 hours of operation away from a power outlet.
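The quoted battery life implies a very low average power draw, which is plausible for a Tiger Lake-U ultrabook under light, browsing-style loads. A quick back-of-the-envelope check of the claim, using only the figures above and simple division with no efficiency losses modelled:

```python
# Illustrative: average system power implied by the battery capacity and runtime claim.
battery_wh = 53.0
claimed_hours = 14.0

avg_power_w = battery_wh / claimed_hours
print(f"~{avg_power_w:.1f} W average draw needed to reach {claimed_hours:.0f} h on a {battery_wh:.0f} Wh battery")
# Roughly 3.8 W: achievable while browsing at modest brightness, and far below the 28 W CPU TDP.
```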
With a wide range of wireless and wired connectivity options, the duo adapts easily to any working environment. Each laptop is armed with two new USB Type-C ports supporting the Thunderbolt 4 interface, so mobile workers can charge the device, transfer data and connect to the network simultaneously. Connectivity with peripheral devices is provided by a full-size HDMI 2.0 port (HDMI 1.4 in the X40-J), two USB 3.1 Gen 1 Type-A ports and a single 3.5mm audio jack. In addition, the laptops have a dedicated microSD card reader, and the Portégé X30L-J also has an RJ-45 Ethernet socket. Both devices are equipped with the latest Intel 802.11ax (Wi-Fi 6) module and Bluetooth 5.1. At the beginning of 2021, the Dynabook Portégé X30L-J will be