Psychonauts 2 proves more games need to be made for both kids and adults

OPINION: This week I had the pleasure of reviewing Psychonauts 2 for Trusted Reviews. For me this was a big deal, as I’m actually old enough to have fallen in love with the first game when it launched on the original Xbox all the way back in 2005.

To catch our younger readers up, Psychonauts was one of the best platformers of its generation. It’s a charming game that sees you step into the shoes of wanna-be psychic superhero Razputin “Raz” Aquato as he attempts to join the Psychonauts – an elite group of the world’s most powerful psychics.

The sequel picks up mere days after the events of the first game, despite fans having waited 16 years for an answer to the original’s cliffhanger ending. To be clear, waiting just under half your lifetime for a sequel is not fun, which is why I was so happy to find Psychonauts 2 does its much-loved source material justice.

You can get a proper breakdown of all the reasons why in my in-depth Psychonauts 2 review. But here I want to focus on one of the biggest and most surprising reasons: how fun it is for both kids and adults.

As I noted in the main review, Psychonauts 2’s biggest selling point is its story.

“Not enough games manage to get this tone right. As we saw with Biomutant, many oversimplify things, creating a narrative and experience that is too simplistic for anyone older than 8 to engage with. By comparison, Psychonauts offers players a good balance, delivering a narrative that’s nuanced and full of enough hidden jokes to keep adults entertained, while remaining child friendly.”

You may be rolling your eyes, but this is a key element I think a lot of game designers and writers miss. Outside of Nintendo, they tend to think a game’s either for kids or for adults. There’s no in-between, and that’s a crying shame.

My first games console was a Sega Master System. I’m not sure my dad’s decision to let me loose on Secret Command, Ghouls ’n Ghosts and other similarly violent games was the wisest, but I remember those games genuinely helping us bond early in my life.

This was in part because he also loved playing the games, so on more than one occasion, when we faced a truly challenging game we’d both take turns, tagging in and out, trying to figure out how to beat a particularly nasty boss, or make an impossibly hard jump.

I think I may have nearly gotten my parents divorced one evening when my dad let me stay up past 10pm to help him finish Castle of Illusion Starring Mickey Mouse…

The thing is, outside of Nintendo, there aren’t many games that are enjoyable for adults and kids the same way these classics were. This is why it felt outright wonderful to find this experience being replicated while I was reviewing Psychonauts 2, but this time with me playing the older, sadly not wiser, gamer in the team.

This happened when my niece walked past while I was playing the game’s opening cutscene, which catches you up on what happened in the first game. Within seconds the words “can I have a go” came out of her mouth, which anyone knows is code for “give me the controller or I will punish you”.

From there we spent the first few hours playing through the opening level together. This actually meant her having plenty of “that was practice” extra lives and me offering backseat suggestions.

This is the first time we’ve played any game other than Worms or Mario Kart (which I am banned from winning) together and I’ve already had multiple requests to play it again.

The experience made me realise just how rare games with stories and experiences that work for both adults and kids are at the moment. I loved reviewing Psychonauts 2, and I wish there were more games like it that adults and kids can enjoy together without being put in competition.


Sound and Vision: Will Samsung’s QD Display technology take OLED to higher peaks?

OPINION: Recently, Samsung lifted the lid on its QD-OLED TVs – or QD Display – revealing some insight into what we can expect from these next-gen TVs.

Samsung’s relationship with OLED TVs has not been the rosiest. Back when OLED was still relatively unheralded, the South Korean manufacturer pumped out its sole effort – the curved KN55S9C. While other manufacturers jumped aboard the OLED train and advanced it further, Samsung took a step back and developed its QLED range, embroiling itself in a sometimes-heated rivalry with LG, the main backer of OLED displays. You can get a breakdown of how that fight has gone in our OLED vs QLED guide.

Samsung has long bemoaned OLED’s lack of high brightness as one reason for not supporting the technology; another is OLED’s potential for image retention and permanent burn-in. That’s an issue that doesn’t affect Samsung’s QLEDs, which are also able to hit much brighter peaks, giving HDR content a punchier, more colourful look.

And yet, despite these advantages, you sense Samsung has been rather annoyed that OLED continues to stick around. OLED TVs are now available in a wider range of sizes, they’re getting cheaper each year, and their peak brightness – though nowhere near troubling QLED’s brightest – is improving.

And so, it would seem Samsung has decided to get back into the fray. But as usual, they’re going about it their own way.

What QD-OLED – sorry – QD Display intends to do is take the best attributes of OLED and combine them with the Quantum Dot technology Samsung has developed. This return of sorts to OLED wasn’t sudden and hasn’t been the smoothest either. There have been many reports of internal resistance from Samsung Visual Display, despite Samsung Electronics investing a huge sum of money in a new production facility that would be home to these new large-screen QD Displays.

Judging from what Samsung Display detailed on its site, there’s mention of the new panel technology reaching 1000 nits – something of a holy grail for OLED, as Panasonic’s JZ2000 hovers just below it. Samsung has always pursued high brightness for its TVs, and while 1000 nits sounds rather paltry by its standards – a recent 8K TV we tested from the firm managed to hit a scorching 4000 nits – it’s a level of performance OLED infrequently reaches.

So, while Samsung seems to believe it can achieve this level of performance, will it be able to? Reports from South Korea in the first half of 2021 suggested prototypes weren’t all that bright to begin with – perhaps a breakthrough has been made since – but considering this is a first effort at a next-gen panel, we likely won’t see the best of these QD Displays just yet.


Nevertheless, it’ll be intriguing to see what Samsung brings to market with its take. It’s unlikely to match OLED’s current affordability, and unless there’s a sudden about-turn, it’s unlikely to support Dolby Vision like every other OLED does. When it comes to features – especially for gaming – LG has arguably been more inclusive.

That means there’s ground for Samsung to lose if QD Displays don’t match the initial hype. They won’t have the high ground of significantly higher brightness, nor are they likely to have completely solved image retention and burn-in – an issue Samsung’s marketing has taken plenty of pot-shots at the likes of LG over.

Perhaps Samsung’s QD Displays on their own won’t take OLED to higher peaks at the first attempt, but perhaps the presence of another challenger will spur those already in the mix to greater heights. Considering how great this year’s OLEDs are turning out, it’s rather exciting to see how much further the technology can be taken.

Ctrl+Alt+Delete: Intel’s learned from AMD’s mistakes with XeSS

This week Intel lifted the lid on a load of details around its new Alder Lake CPU and Arc graphics cards’ Xe graphics at a digital architecture day event attended by Trusted Reviews.

And boy was it a busy event. During it, the firm made some pretty bold claims, like suggesting the incoming Alder Lake CPUs will be “19% faster” than previous generations and that it is set to conquer the graphics market currently dominated by Nvidia and AMD.

For those that missed it, Intel’s been making gradual moves to release its own series of dedicated and discrete GPUs to take on AMD Radeon and Nvidia GeForce for years now. And this is hardly the first time it’s been a little bolshy with its claims. Raja Koduri, Intel chief architect and senior vice president, went so far as to suggest the firm planned to make most games run on integrated graphics rather than dedicated GPUs at a previous event I attended a couple of years ago.

This week it also took a key step forward, confirming the name of its GPU series and the codenames for its first generations of cards. Specifically, we learned the series will be called Intel Arc, with the first cards codenamed Alchemist and future generations dubbed Battlemage, Celestial and Druid.

While I’m still on the fence about whether the mysterious Intel Arc graphics cards will actually be any good – there’s little to no concrete detail on their specs or design – the event did show signs the firm is at least moving in the right direction.

This is because, not only did it confirm ray tracing support, it also lifted the lid on its new XeSS (Xe Super Sampling) technology. Ray tracing is an advanced rendering technique that lets graphics cards produce significantly more realistic lighting effects – highlights include real-time reflections and dynamic, more realistic shadows. It’s very much in vogue, running on Nvidia’s 20- and 30-series GPUs and AMD’s latest RX 6000-series cards, as well as the PS5 and Xbox Series X/S consoles.

Trust me when I say the added realism is awesome and makes for a significantly better gaming experience. But the flipside is that, as demonstrated by the AMD Radeon RX 6600 XT I reviewed earlier this month, the tech can outright kill frame rates without a solution like Nvidia DLSS or AMD FSR.

These are two technologies that aim to reduce the performance impact of ray tracing, though they use different techniques to do so. DLSS (Deep Learning Super Sampling) is an Nvidia RTX feature that boosts a game’s framerate by rendering the image at a lower resolution and then using artificial intelligence to add additional pixels, upscaling it to the desired resolution.

AMD FSR (FidelityFX Super Resolution) has a similar goal, but unlike DLSS it doesn’t require machine learning to work. Instead, it uses simpler spatial upscaling, a technique that raises the resolution by referencing adjacent pixels while scaling the image up.
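To illustrate the basic idea, here’s a toy spatial upscaler in Python. To be clear, this is not AMD’s actual FSR algorithm – FSR uses an edge-adaptive upscale plus a sharpening pass – it just shows how new pixels can be produced by blending adjacent ones, with no machine learning involved.

```python
# Toy illustration of spatial upscaling: each new pixel is blended
# from its nearest neighbours in the low-res image.

def bilinear_upscale(pixels, scale):
    """Upscale a 2D grid of grayscale values by an integer factor."""
    h, w = len(pixels), len(pixels[0])
    out_h, out_w = h * scale, w * scale
    result = []
    for y in range(out_h):
        src_y = y / scale
        y0 = min(int(src_y), h - 1)
        y1 = min(y0 + 1, h - 1)
        fy = src_y - y0
        row = []
        for x in range(out_w):
            src_x = x / scale
            x0 = min(int(src_x), w - 1)
            x1 = min(x0 + 1, w - 1)
            fx = src_x - x0
            # Blend the four surrounding low-res pixels.
            top = pixels[y0][x0] * (1 - fx) + pixels[y0][x1] * fx
            bottom = pixels[y1][x0] * (1 - fx) + pixels[y1][x1] * fx
            row.append(top * (1 - fy) + bottom * fy)
        result.append(row)
    return result

low_res = [[0, 100],
           [100, 200]]
high_res = bilinear_upscale(low_res, 2)  # 4x4 grid blended from the 2x2 input
```

The appeal for GPU makers is that this kind of scaling is cheap: every output pixel is a handful of multiplies, rather than a neural network inference.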

From what we know Intel’s XeSS works like a combination of the two, with a spokesperson explaining to Trusted Reviews:

“XeSS or Xe Super Sampling is a novel upscaling technology that enables high performance and high fidelity visuals. It uses deep learning to synthesize images that are very close to the quality of native high-res rendering. It works by reconstructing subpixel details from neighbouring pixels, as well as motion-compensated previous frames. This reconstruction is performed by a neural network trained to deliver high performance and great quality, with up to a 2x performance boost.”

But for me, that’s not what’s interesting. What’s interesting is that Intel has made two clever decisions with XeSS. First, like AMD FSR, it’s been made open source, so anyone – regardless of whether they’re an approved games developer – can use it without having to ask. Second, Intel has already secured partnerships, which it claims means there “should” be a wealth of games featuring XeSS support when the Arc cards launch at the end of the year.

This sounds small – and I’ll remain slightly sceptical of the “widespread” support claim until I have the cards in our lab being benchmarked with the feature active – but it’s a positive sign that shows Intel’s at least trying to avoid the pitfall that’s hindered AMD’s GPU aspirations over the last 12 months.

Benchmarking AMD’s opening wave of RX 6000 cards, we found one constant theme: while they were excellent for general gaming and could match Nvidia’s cards in most instances with ray tracing off, the lack of FSR support at launch meant they struggled to compete with the setting on. This was a key reason our best graphics card buying guide was dominated by Nvidia 30-series cards last year.

Though there are still far too many unknowns to say whether Intel will successfully carve out a share of the GPU market any time soon, by making XeSS open source and investing in launch support, Intel could at least come out of the gate with a fighting chance. That would also be good for the market in general, as added competition always gives companies extra incentive to up their game and make better products.


Winners and losers: Intel Xe-cites while Pixel 5a fans get snubbed

It has been a busy week in the world of tech, with everything from emails detailing a failed-to-launch iPhone nano to an Intel data dump hitting the headlines.

But, as ever, for the team of experts at Trusted Reviews there has been a very clear winner and loser in the world of shiny things over the last seven days. Here’s who they are.

Winner: Intel with its new Arc and Alder Lake silicon

It’s no secret, we’ve not been Intel’s biggest fans over the last couple of years.

Since AMD came out swinging with Ryzen CPUs, generation-on-generation we’ve found it harder and harder to recommend the firm’s top end desktop chips, with the performance gains growing smaller by the day – even in gaming. This is why the Intel Core i9-11900K we reviewed earlier this year scored a slightly disappointing 3/5 stars.

It’s also why we were delighted to see the firm come out swinging at its latest Architecture Day earlier this week. At the event, Intel’s architects treated us to an outright barrage of announcements, including fresh details about the firm’s Alder Lake CPU architecture, Arc graphics cards and data centre supercomputing plans.

For us, the graphics cards in particular stood out, with Intel dropping some seriously impressive performance claims and promising that, as well as ray tracing support, the cards will launch with XeSS later this year. This is a key step for Intel: if the stars align and the cards have competitive specs and performance, it means the Arc GPUs could be competitive at launch.

XeSS is Intel’s version of Nvidia DLSS and AMD FSR. The short version is that it aims to let GPUs improve frame rates when running demanding processes, such as ray-traced lighting effects. In 2020, despite offering otherwise competitive performance, the lack of FSR (AMD’s equivalent) at launch made the RX 6000-series cards hard to recommend compared to their DLSS-ready Nvidia rivals.

Loser: Pixel 5a fans outside of the US

We’ve never made any secret of our love of Google’s Pixel a-series brand. Key phones, such as the Pixel 4a and Pixel 4a 5G have been consistent entrants in our best affordable phone and best mid-range phone guides.

This is because they manage to offer a number of key features traditionally seen on flagship phones at an affordable price. Highlights include Google’s top end camera AI processing and a completely untouched version of Android that’s guaranteed to receive software updates longer than the competition.

This legacy looks set to continue with the new Pixel 5a, whose spec sheet includes a solid Snapdragon 765G CPU, 5G connectivity, a larger battery and the same stellar camera tech Google’s now famous for. Which is why this week we well and truly had a toys-out-of-the-pram moment when the firm revealed the phone will only be launching in the US and Japan.

Based on the Twitter reactions to Google’s tweet confirming the news, we’re not alone in our disappointment – hence this week’s loser.


Wear OS finally meets its potential in the Galaxy Watch 4

OPINION: In my tenure as a tech journalist, I’ve tested so many Wear OS smartwatches that I’ve long since lost count of the total.

As much as I’ve enjoyed the flair that some of the fashion-led smartwatches have brought with them, Wear OS itself has always felt a bit clunky compared to the competition – that is, until now.

I’ve spent the last few days testing out the new Samsung Galaxy Watch 4, the first device to pack Google’s revamped Wear OS 3 (made in collaboration with Samsung, no less). With this update, it’s clear that Wear OS has finally evolved into the fluid watchOS alternative it always wanted to be.

From more colourful app icons to third-party tiles that aren’t afraid to envelop the screen with eye-catching prompts, Wear OS powered by Samsung feels like a world away from the Wear OS watches of old. The digital rotating bezel on the Galaxy Watch 4 also makes skipping through menus a breeze, much like the Apple Watch’s digital crown.


It also helps that the Galaxy Watch 4 hasn’t skipped over one of the most crucial parts of a smartwatch – the watch faces.

As much as I applaud a company like Mobvoi for making use of the Snapdragon Wear 4100 chipset, its watch faces have always felt rather dull. The Galaxy Watch 4, by contrast, has a surprising amount of variety, and no two watch faces feel the same. If you want something casual to suit a lazy Sunday, you have plenty of options, just as there are several gorgeous two-hand designs that would go perfectly with a finely pressed suit.

The only problem here is that we don’t yet know whether these facets are part of the new Wear OS, or a part of the One UI overlay that is exclusive to the Galaxy Watch 4. Because we have yet to see another smartwatch using this version of Wear OS, Samsung’s wearable is our only frame of reference.

The latest rumours suggest that Fossil’s next mainline wearable could be released in September, but even then there will be a significant delay until it’s updated with the new OS. There’s still a chance that Google could be the next company to market with its long-gestating Pixel Watch, but analysts have been wrong before about an impending release.

Until we know more, the Galaxy Watch 4 is shaping up to be the Wear OS watch to beat, and that’s no bad thing. Samsung may have finally given Android users what they’ve been waiting for: a smartwatch that can compete with the Apple Watch.

Phones are too big and Apple could have fixed this

OPINION: With the recent reveal that Apple was actually planning an iPhone nano back in the day, I feel that the trend of phones growing bigger each year is an issue for small-handed people everywhere.

I don’t think it’s that controversial to say that phones are getting bigger, not smaller.

Apple isn’t actually the worst for this, with many of the best Android phones around passing the 6.7-inch mark.

But it seems that no matter the phone, they’re just too big. The iPhone 12 Pro’s screen is 6.06 inches diagonally, which means I can never comfortably fit the phone in one hand – and I’d be surprised if anyone who wasn’t an adult man could.

Of course, it’s easy to see why phones are getting bigger. More screen means more to see, more to engage with, and the possibility for better quality visuals and videos that aren’t squished down to a tiny screen. It also just generally looks better on the spec sheet.

But is it too much to ask that phone manufacturers take into account how big people’s hands actually are? The average length of a man’s hand is 7.6 inches, and the average for a woman is 6.8 inches – and some phone screens now measure close to, or beyond, the latter.

Knowing that, and seeing that a lot of smartphone screens sit around the 6-inch mark, it makes sense that I can’t use Twitter without feeling like I’m developing carpal tunnel – the phone really isn’t made with my hands in mind.

Apple has been making its way back into my good books since the release of the iPhone 12 Mini, which sits at a reasonable 5.4 inches, just a tad smaller than the iPhone X. But I also think it’s come a little too late, as this is the first Apple phone that’s both as good as the newest flagship and smaller, so you don’t have to sacrifice quality for size.

I know I’m talking about Apple a lot here, and I’ll be upfront and say that’s because I’ve always had Apple phones. But the brand doesn’t seem to matter: whenever I get the chance to play with an Android phone, I’m overwhelmed by the sense that I’d never be able to send a speedy one-handed text, or quickly check Twitter while my other hand was occupied, as it really feels like a two-handed operation just to get from the top of the screen to the bottom.

Phones don’t all need to be tiny, either. I understand some people love a big phone, and I don’t begrudge anyone who wants to carry a brick around with them, though I still don’t understand why you’d want to.

Instead, the simple solution is just what Apple has done: mobile manufacturers should start making ‘Mini’ versions of new phones, so more people can actually use their own smartphones without straining their wrists.

But really, that’s why I blame Apple for this. If the company had made the iPhone nano back in 2011, it’s more than likely the trend would have caught on, and more phones would be available in smaller sizes.

Instead, the trend went the opposite way, and phone sizes just keep going up. Compare the iPhone 6 – the last phone I felt I could comfortably hold in one hand – with the iPhone 12 Mini: the 6 is 4.7 inches and the 12 Mini is 5.4 inches, so even the smallest recent phone from Apple is still a fair bit bigger than what I personally find comfortable.

For all the pedants, I will mention that the height and width of the iPhone 6 are actually bigger than the 12 Mini’s. However, I’m discounting that, as the screen is the part of the phone you actually interact with, and an extra 0.7 inches of it does make a difference.
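The pedants’ point generalises: a diagonal alone only tells you so much, because the aspect ratio decides how it splits into height and width. A quick sketch – assuming the publicly listed aspect ratios of 16:9 for the iPhone 6 and 19.5:9 for the 12 Mini – shows the 12 Mini’s bigger screen is mostly extra height, not width:

```python
import math

def screen_dimensions(diagonal, aspect_long, aspect_short):
    """Split a screen diagonal into its long and short edge lengths
    for a given aspect ratio, via basic right-triangle geometry."""
    hypotenuse = math.hypot(aspect_long, aspect_short)
    return (diagonal * aspect_long / hypotenuse,
            diagonal * aspect_short / hypotenuse)

iphone_6 = screen_dimensions(4.7, 16, 9)          # 16:9 panel
iphone_12_mini = screen_dimensions(5.4, 19.5, 9)  # 19.5:9 panel

# The 12 Mini's screen works out taller (~4.9in vs ~4.1in) but
# actually slightly narrower (~2.26in vs ~2.30in) than the iPhone 6's.
```

So a modern 5.4-inch screen can still be narrower in the hand than an old 4.7-inch one – which is exactly why diagonal-only comparisons mislead.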

Foldables were also promising: there was nothing stopping companies from making foldable phones that become smaller and slimmer when closed – easier to hold and easier to slip into your pocket. But we didn’t get that. Instead, most mobile manufacturers took it as a chance to make phones even bigger, doubling the number of screens and making it impossible for someone of my size to use them one-handed.

So, my pitch ends with a plea to Apple, and other phone companies: consider revisiting the nano concept, and make versions of new phones that aren’t ridiculously big but are perfect to hold in just one hand.


Fast Charge: Apple’s the only one with a shot at making foldables mainstream

OPINION: This week, Trusted Reviews’ test unit of the Galaxy Z Flip 3 finally arrived at our labs. And while deputy and mobile editor Max Parker has found plenty to like about the new folding phone, I’m still not convinced it’ll be the breakthrough device that makes the category mainstream.

To catch you up, the Galaxy Z Flip 3 is the more affordable option in Samsung’s 2021 foldables line, sitting below the Galaxy Z Fold 3. It features a Motorola Razr-style hinge that lets you fold the inner screen in on itself, like an old-school clamshell phone from the days of yore. Once closed, there’s a secondary front screen for quick alerts and the like.

The second screen is actually the part Max liked best, with him excitedly telling me “you can have a cat on the front, need I say more,” moments after unboxing it.

And yes, I like cats as much as the next person. But, as I’ve said many times before, I simply can’t get excited by any foldable phone at the moment for a variety of reasons, the biggest of which is their current software.

The fact is, Android simply isn’t set up for foldables in its current state. This is because the OS, and pretty much every application on it, isn’t optimised to run on foldable phone screens’ aspect ratios. They’re firmly optimised for traditional ones, like the Galaxy S21’s 20:9 or older 16:9 form factors, which are a far cry from the Z Flip 3 main screen’s taller 22:9 aspect ratio.

This means, as we’ve seen on all previous foldables, nearly every app looks a little odd on the screen and doesn’t display correctly: Netflix shows have visible black boundaries, as do games, and when loading web pages all manner of strange things happen to the formatting of news sites and their ilk.
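Those black boundaries are simple geometry: fit 16:9 video inside a 22:9 panel and the leftover pixels become bars. A rough sketch, using the Z Flip 3’s published 2640 x 1080 panel resolution, of how big those bars actually get:

```python
def letterbox_bars(screen_w, screen_h, content_w, content_h):
    """Fit content inside a screen while preserving the content's
    aspect ratio; return the black-bar size on each axis (in pixels)."""
    scale = min(screen_w / content_w, screen_h / content_h)
    fitted_w = content_w * scale
    fitted_h = content_h * scale
    side_bars = (screen_w - fitted_w) / 2   # left/right pillarboxing
    top_bars = (screen_h - fitted_h) / 2    # top/bottom letterboxing
    return side_bars, top_bars

# Galaxy Z Flip 3 panel held in landscape: 2640 x 1080 (22:9),
# showing a standard 16:9 video.
side, top = letterbox_bars(2640, 1080, 16, 9)
```

With those numbers, a 16:9 video fills the panel’s height and leaves a 360-pixel black bar on each side – over a quarter of the screen’s width doing nothing.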

To be fair to Samsung, this isn’t its fault, and it has adapted as much as it can. It’s a consequence of Android’s open-source nature, which forces developers to hedge their bets by prioritising the most common aspect ratios when creating their wares. I mean, why would they do loads of expensive UX work redesigning their product for a single phone?

This is simply the nature of Android being open source. It’s also the primary reason, as much as I hate to say it, I can only see one company having the drive and ability to make folding phones mainstream: Apple.

This isn’t because I think Apple’s better at hardware than Samsung – I’m actually very impressed by how much more durable the firm has made its folding screen tech since its first-generation foldables launched. It’s because Apple has always been a world leader in optimising software for new form factors.

Think about the iPad. Despite what Apple claims, there were numerous Windows tablets, like the slates from Samsung, before Apple even considered the form factor. What Apple did well was create a holistic offering, with software bespoke-designed to work on a touchscreen tablet. This made the entire experience far smoother for general consumers than anything that came before. Trust me, trying to use Windows 7 or Vista on a touchscreen is not a pleasant experience; I’ve tried.

It’s since built on this, creating with iPadOS what is in my mind the most holistic and intuitive ecosystem available – one that is, being blunt, so much better than anything Android has to offer that Google threw in the towel years ago, confirming it had given up on tablets.

This is because Apple controls every aspect of its devices and only releases a limited number of them, with locked-down specs, each year. Android, by comparison, lets firms do whatever they want with the OS and use it on a huge variety of form factors and designs, like foldables.

For foldables to work, they’re going to need an OS that is completely optimised to run on their specific form factor, where every app displays correctly and there are no “compromised” experiences. On a device that costs over $1000/£1000 consumers won’t accept anything less.

In the current climate, unless you count the Surface Duo, which has a twin screen rather than foldable design, Apple is in my mind the only company sensibly placed to do this. This is why, as much as it pains me to say it, I can’t see the form factor becoming mainstream until we see it on an iPhone.


The Galaxy Z Flip 3 makes a great first impression

OPINION: Ever since I used the first iteration of the Galaxy Fold I’ve been of the opinion foldables exist simply to offer something a bit different.

I have used a number of other folding phones since that initial launch and my view has rarely changed. There have been some impressive devices no doubt, but nothing that really made me think they were ready to go mainstream.

And then I swapped my SIM card over to the Samsung Galaxy Z Flip 3.

Even though I have only been using this £949/$999 phone for a short time, I already feel far more at home with it than any other foldable phone. This is because it just works.

Now, of course, this isn’t the first version of the Flip. But there are a number of big changes here that help it feel more complete and less like a tech demo.


The outer screen, for example, is much larger (1.9 inches compared to 1.1 inches) and is actually now useful. Instead of being able to see the time and a few lines of a notification, you can now scroll through your latest messages, play or pause music, or even quickly set a timer. I’ve already found myself keeping the Flip closed when I’m flipping through songs or just triaging Slack messages.

Samsung’s focus on durability also makes itself known immediately. The stronger aluminium sides have a great feel while the IPX8 rating gives me added reassurance that I’m not going to ruin the phone with an accidental drop in the bath. These might sound like little things, but they combine for a meaty upgrade.

Whereas the Galaxy Fold uses its foldable panel to turn a phone into a small tablet, the Z Flip 3 uses the same tech to give you a regular-sized Android phone that folds down into something far more pocketable.

When folded, the Z Flip 3 is a joy. It slips into a back pocket or a small handbag with ease, and by not having a screen on show I find it easier to ignore when I’m trying to stay away from my phone. A quick tap on the outer display checks any notifications, and I’m not distracted by anything else.

The original Moto Razr was my first phone and I have huge amounts of nostalgia for the whole flip phone style. Samsung has recreated that here, but in a modern format that stands apart from the rest of the Android market.

Of course, I will need to spend much longer with the Z Flip 3 than I have done so far to render a full verdict. I’m still unsure how good the battery life will be, or how the two outward-facing 12-megapixel cameras compare to other phones at this price.

But, for now, I can say with ease the Z Flip 3 is the most consumer-friendly foldable phone to date – by some distance.

Fast Charge: The Galaxy Z Fold 3 just made me want the Note 21 more

This week Samsung went all-in with foldable phones, unveiling its new Galaxy Z Fold 3 and Galaxy Z Flip 3 alongside the Galaxy Watch 4 and Galaxy Buds 2.

But through all the clamouring attempts to show just how great foldable devices are, I only had one thought running through my head: how is this any better than a new Galaxy Note?

Don’t get me wrong, I’m a well-known foldable sceptic, but this thought was properly drilled home at the end of the Fold’s launch, when Samsung started talking about the new phone’s S Pen stylus support.

To catch up anyone that missed the initial news, the Fold 3 is a phone that does what it says on the tin. It has a custom folding screen that lets you use it as a regular phone or tablet. The new version is the first to feature S Pen support, with Samsung selling two optional versions of the stylus as extras.

Specifically, there is a larger Pro version that supports Bluetooth and works across all supported Galaxy phones and then a smaller Z Fold 3-exclusive one that doesn’t support Bluetooth.

On paper, this sounds like a match made in heaven. I can personally attest to the productivity benefits of having a stylus on a tablet, having used multiple Surface devices and Apple iPads with Pencils over the years.

Even the Galaxy Note phones, which I’ll concede were too small to utilise the stylus fully, had some great uses for it. All too often I’d use the S Pen to manage Excel sheets, or jot down shopping lists when out and about reviewing one of the phones. But, for me, the Fold doesn’t have the chops to really make the most of the S Pen, for two reasons.

First, whichever one you get feels superfluous, as the Fold 3 doesn’t have any way to dock the pens without investing in a separate case, which will make the already chunky phone even thicker and more cumbersome to use. I can also guarantee, based on my experience with the Apple iPad and Pencil, that the S Pen will go walkies fairly fast.

Numerous times I’ve grabbed my iPad thinking I’d spend some time sketching in Procreate, only to realise I have no clue where I’ve left the stylus. The same game of check under the sofa cushions to locate the missing pen will inevitably happen with the Fold 3.

Then there’s the screen. Sure it’s bigger, which will likely make it better in some instances. And yes it has impressive specs, featuring a variable refresh rate 120Hz OLED panel. But there’s one big problem: the aspect ratio makes it distinctly square.

This is problematic for a variety of reasons, chief of which is that most Android apps aren’t optimised correctly for the Fold’s screen. On top of that most digital painting and photo editing work is done on rectangular canvases, so a square isn’t exactly ideal.

I also can’t see Krita or Adobe tweaking their services to run properly on the aspect ratio just for the Galaxy Fold.

This is why, despite feeling Samsung never 100% nailed the Note, I couldn’t help but feel a twinge of sorrow at the Galaxy Note 21’s absence during the event.

Here’s hoping the line returns in 2022 with a new bigger and better Note that finally does the S Pen justice.

Ctrl+Alt+Delete: Don't count on higher battery life with OLED laptops

It looks like the next big trend in laptops could be OLED screens, with Dell recently embracing the technology for the XPS 13 ultrabook. 

OLED screens offer many benefits, achieving better contrast and colour accuracy than most standard LCD displays. This is because every individual pixel can produce its own light rather than relying on a backlight – a pixel can even be turned off to create authentic on-screen blacks. 

It’s also widely known that OLED screens can technically help devices save on battery life, conserving energy when multiple pixels are turned off for long stretches of time. It makes sense in theory, but our Mobile Phone expert, Max Parker, says he rarely notices a difference in battery life with OLED smartphones. But what about laptops?

I was hoping the Dell XPS 13 OLED would see a boost to the battery life compared to the previous model. But interestingly, the battery life was actually worse than that of the Dell XPS 13 LCD version. 

| | Dell XPS 13 OLED | Dell XPS 13 LCD |
|---|---|---|
| Processor | Intel Core i7-1185G7 | Intel Core i7-1165G7 |
| Resolution | 3456 x 2160 | 3840 x 2400 |
| Brightness | 150 nits | 150 nits |
| Battery life (PCMark 10 test) | 7 hours and 22 minutes | 9 hours and 52 minutes |

Both laptops have a similar processor and were tested with the same 150-nit brightness level, yet the OLED still saw a worse battery-life figure. The Dell XPS 13 LCD even has a higher resolution, which makes things even more peculiar since the pixel count can have a big impact on battery. 
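To put the gap in perspective, here’s a quick back-of-the-envelope calculation using the two PCMark 10 figures quoted above (the times come from our own test results; the script itself is just illustrative):

```python
# Rough comparison of the PCMark 10 battery results quoted above.
oled_minutes = 7 * 60 + 22   # Dell XPS 13 OLED: 7h 22m
lcd_minutes = 9 * 60 + 52    # Dell XPS 13 LCD: 9h 52m

shortfall = lcd_minutes - oled_minutes
pct_longer = 100 * shortfall / oled_minutes

print(f"LCD outlasts OLED by {shortfall} minutes (~{pct_longer:.0f}% longer)")
```

That works out to the LCD model lasting two and a half hours longer, roughly a third more runtime, which is far too big a gap to dismiss as test noise.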

After doing a quick look around the web, I found that other publications such as Wired have seen similar results. But why is this the case? 

Well, the first thing to consider is that the battery benchmark test simulates a number of video chats, Word documents and spreadsheets, all of which feature lots of white space. OLED screens only save power when displaying black (or very dark) pixels, but there are a lot of bright colours when browsing the web, whether you’re on news websites, flicking through photos on Instagram or watching a YouTube video.

Of course, you could set all of your web browsers and apps to dark mode to counteract this, but as LaptopMag points out, that seemingly only improves the battery life by 10 minutes, which is hardly worth the effort.

There could be many other factors in play here. Perhaps the OLED panel simply isn’t as efficient as an LCD screen when displaying bright colours. After all, OLED technology is still in its infancy in the laptop industry, while the likes of Dell have been using LCD panels for ages. Manufacturers may just need more time to get to grips with OLED in order to make them just as efficient. 

On left: Razer Blade Stealth. On right: Dell XPS 13 OLED

Of course, this is the result of testing just one laptop, so I don’t want to start suggesting OLED screens are worse than their LCD counterparts when it comes to laptop battery life. But I think it’s safe to say that OLED screens don’t provide the battery-boosting perks we all hoped for, at least not to an extent that makes a worthwhile difference.

In my experience, the resolution of a laptop is far more influential on the battery life, and an OLED panel is not capable of counteracting the battery loss when upgrading from Full HD to 4K. 

Let’s be clear though, OLED screens are still fantastic. The improvements they bring to the likes of contrast and colour accuracy are remarkable, making an even greater impact on picture quality than a boost in resolution. So if you’re looking to buy an OLED laptop for the sole purpose of improving the picture quality, then go for it – just don’t expect it to boost the battery life.

Ctrl+Alt+Delete is our weekly computing-focussed opinion column where we delve deeper into the world of computers, laptops, components, peripherals and more. Find it on Trusted Reviews every Saturday afternoon.