Gadget

Samsung Galaxy AI gives
Star Trek power to S24

Sometimes, a new product faces a moment of truth, when it lives or dies by the extent to which it meets the hype that precedes it. That is especially the case when grandiose claims are made, dramatic new features are introduced, or there is a major shift in a company’s entire branding message.

All three applied ahead of the launch of the new Samsung Galaxy S24 range in San Jose, in the heart of Silicon Valley, on Wednesday morning. Along with the unveiling of three new handsets – the S24, S24 Plus and S24 Ultra – Samsung announced a new overarching brand: “Galaxy AI”. It is intended to send several messages, not least Samsung’s intention to own thought leadership in the use of AI in smartphones, and the company’s shift in focus from hardware specs to software capabilities.

It’s a dangerous strategy, but also a powerful one, if it can be pulled off. Did Samsung pull it off?

Ahead of the launch, I had the chance to test the claims for the S24’s AI capabilities at a demonstration hosted by Samsung.

The first set of tests revolved around the AI capabilities of the cameras across the range. First, it’s worth noting that only the top of the range, the S24 Ultra, contains a processor optimised for AI, namely the new Qualcomm Snapdragon 8 Gen 3 chip. The other devices run on less advanced Exynos chips. In principle, this means the Ultra can run AI processes on the phone itself, rather than depending on generative AI in the cloud.

The Galaxy S24 Ultra on display before the launch in San Jose in California this week. Pic: ARTHUR GOLDSTUCK

It turns out, however, that even the more basic phones are AI-capable, resorting to the cloud only for specific exceptions. We tried the S24 Plus, first for AI-generated slow-mo. This meant selecting a brief action moment in a video for a slow-motion effect. The phone used AI to create additional in-between frames that were never captured by the camera, interpolating them from the existing frames to produce the slow-mo effect.

The next feature was Photo Assist, which analyses an image and identifies, for example, reflections resulting from taking a photo through a window. One clicks on the AI button, and the software offers suggestions on how to enhance the photo. In one test, it offered to erase the reflection, resulting in a near-pristine image. In another, it erased a shadow on a face. The difference between the before-and-after images was dramatic. For the average user who is not used to advanced photo-editing software, it will be jaw-dropping.

The final test was editing an action photo to move an element of the image from near a road to mid-air, using generative AI. One starts by clicking on the Galaxy AI button, then highlighting the element of the image one wants to move. Hold it down, drag it and drop it, and you have the most extensive editing of a smartphone image yet achieved with the tools natively available on the device.

Yet that was not even the moment of truth: all of this is already possible using online AI tools, even if a usability barrier remains for typical consumers.

My own jaw-drop moment came when I tested the live translation ability of the device. The first test involved phoning a restaurant to make a reservation – almost a cliché now for testing AI chat. I called a Spanish restaurant and spoke in English, then listened as the phone translated my words into Spanish for the listener. He responded in Spanish, and the phone translated his words back into English. That feels like a more advanced version of the variety of online translation tools now available, so it merely gave a basic sense of satisfaction that the feature worked as advertised.

The Galaxy S24 on display before the launch in San Jose in California this week. Pic: ARTHUR GOLDSTUCK

However, it was when I tested a function called Interpreter, built into the settings of the devices, that the moment of truth arrived. I held a live conversation, face to face, with a Korean speaker (see the YouTube video), and watched as my words were translated, on the display screen, from English into Korean, while the translation was read aloud. My counterpart then responded in Korean, and I watched the words appear in English on the screen, simultaneously with the words being spoken by the phone.

This was a live translation, in both sound and text, of two languages using different alphabets, and it was near instantaneous. It was also astonishingly accurate, give or take a couple of translation glitches.

The most startling thing about this conversation was not that it took place on an S24 Plus – and so was not using the AI-optimised chip – but that it took place in Airplane Mode. In other words, it was not using generative AI in the cloud, nor any remote processing. The language packs shipped with the phone – 13 languages are included at launch – along with on-device AI software, turned the 1960s science fiction vision of the Star Trek series into a reality in 2024.

* Arthur Goldstuck is founder of World Wide Worx, editor-in-chief of Gadget.co.za, and author of “The Hitchhiker’s Guide to AI”. Follow him on Twitter and Instagram on @art2gee.
