Samsung has announced its AI service – Bixby. The cloud-based AI interface is embedded in all the phone’s applications and will soon be seen on all of its devices. CRAIGE FLEISCHER of Samsung South Africa tells us what we can expect.
Technology is supposed to make life easier, but as the capabilities of machines such as smartphones, PCs, home appliances and IoT devices become more diverse, the interfaces on these devices are becoming too complicated for users to take advantage of many of these functions conveniently.
User interface designers have to make trade-off decisions to cram many functions into a small screen or bury them deep in layers of menu trees. Ultimately, users are at the mercy of the designers, facing an increasingly steep learning curve that makes mastering a new device difficult. This is the fundamental limitation of the current human-to-machine interface. Since Samsung makes millions of devices, this problem impacts the core of our business.
Samsung has a conceptually new approach to the problem: instead of humans learning how the machine interacts with the world (a reflection of the abilities of designers), it is the machine that needs to learn and adapt to us. The interface must be natural and intuitive enough to flatten the learning curve regardless of the number of functions being added. With this new approach, Samsung has applied artificial intelligence and deep learning concepts to the core of our user interface designs. Bixby is the ongoing result of this effort.
Bixby will be a new intelligent interface on our devices. Fundamentally different from other voice agents or assistants in the market, Bixby offers a deeper experience thanks to proficiency in these three properties:
- Completeness
When an application becomes Bixby-enabled, Bixby will be able to support almost every task that the application can perform through the conventional interface (i.e. touch commands). Most existing agents currently support only a few selected tasks for an application, and therefore confuse users about what does or does not work by voice command. The completeness property of Bixby will simplify user education on the capability of the agent, making the behaviour of the agent much more predictable.
- Context Awareness
When using a Bixby-enabled application, users will be able to call upon Bixby at any time. It will understand the current context and state of the application, allowing users to continue their work in progress without interruption. Bixby will let users weave together various interaction modes, including touch and voice, in any context of the application, whichever they feel is most comfortable and intuitive. Most existing agents completely dictate the interaction modality and, when switching among modes, may either start the entire task over again, losing all the work in progress, or simply not understand the user’s intention.
- Cognitive Tolerance
As the number of supported voice commands grows, most users are cognitively challenged to remember their exact form. Most agents require users to state commands in a set of fixed forms. Bixby will be smart enough to understand commands with incomplete information and execute the commanded task to the best of its knowledge, then prompt users to provide more information and complete the task piecemeal. This makes the interface much more natural and easier to use.
We know that adopting new ways to interact with your devices will require a change in user behaviour. The inconvenience of learning a new interface can cause friction and force users to revert to old habits (e.g. the touch interface). At the same time, we believe the key to success for a new voice interface is to design a scheme that reduces friction and makes the experience significantly more rewarding than the existing interface. So at its core, Bixby will help remove friction. It will simplify user education with new voice interfaces and will make using your phone even more seamless and intuitive.
Another example of removing friction will be the dedicated Bixby button that will be located on the side of our next device. Confusion around activating a voice interface is a barrier we have removed to make it feel easier and more comfortable to give commands. For example, instead of taking multiple steps to make a call – turning on and unlocking the phone, looking for the phone application, clicking on the contact bar to search for the person that you’re trying to call and pressing the phone icon to start dialling – you will be able to do all these steps with one push of the Bixby button and a simple command.
There has been a lot of excitement and speculation about what we will deliver with the launch of the Galaxy S8, especially due to the advancements in artificial intelligence. We do have a bold vision of revolutionising the human-to-machine interface, but that vision won’t be realised overnight. Ambition takes time.
Bixby will be our first step on a journey to completely open up new ways of interacting with your phone. At the launch of the Galaxy S8, a subset of preinstalled applications will be Bixby-enabled. This set will continue to expand over time. Our plan is to eventually release a software development kit (SDK) to enable third-party developers to make their applications and services Bixby-enabled easily.
Starting with our smartphones, Bixby will gradually be applied to all our appliances. In the future, you will be able to control your air conditioner or TV through Bixby. Since Bixby will be implemented in the cloud, any device with an internet connection and the simple circuitry needed to receive voice inputs will be able to connect with Bixby. As the Bixby ecosystem grows, we believe Bixby will evolve from a smartphone interface into an interface for your life.
Bixby is at the heart of our software and services evolution as a company. We are fundamentally and conceptually changing our attitude toward software and services and working hard on innovation throughout all aspects of our mobile ecosystem. Our investment in engineering resources speaks for itself – we have thousands of software developers supporting this effort. This is something that I’m very excited about. Innovating in software and services enables opportunities for creativity and the ability to build new experiences from the ground up. With the continued investment from Samsung on artificial intelligence, the possibility of what Bixby can become is endless.
- Craige Fleischer, Director Integrated Mobility at Samsung South Africa
Millennials turning 40: NOW will you stop targeting them?
It’s one of the most overused terms in youth marketing, and probably the most inaccurate, writes ARTHUR GOLDSTUCK
One of the most irritating buzzwords embraced by marketers in recent years is the term “millennial”. Most are clueless about its true meaning, and use it as a supposedly cool synonym for “young adults”. The flaw in this targeting – and the word “flaw” here is like calling the Grand Canyon a trench – is that it utterly ignores the meaning of the term. “Millennials” are formally defined as anyone born from 1980 to 2000, meaning they have typically come of age after the dawn of the millennium, or during the 21st century.
Think about that for a moment. Next year, millennials will be formally defined as anyone aged from 20 to 40. So here you have an entire advertising, marketing and public relations industry hanging onto a cool definition, while in effect arguing that 40-year-olds are youths who want the same things as newly-minted university graduates or job entrants.
When the communications industry discovers just how embarrassing its glib use of the term really is, it will no doubt pivot – millennial-speak for “changing your business model when it proves to be a disaster, but you still appear to be cool” – to the next big thing in generational theory.
That next big thing is currently Generation Z, or people born after the turn of the century. It’s very convenient to lump them all together and claim they have a different set of values and expectations to those who went before. Allegedly, they are engaged in a quest for experience, compared to millennials – the 19-year-olds and 39-year-olds alike – supposedly all on a quest for relevance.
In reality, all are part of Generation #, latching onto the latest hashtag trend that sweeps social media, desperate to go viral if they are producers of social content, desperate to have caught onto the trend before their peers.
The irony is that marketers’ quest for cutting-edge target markets is, in reality, a hangover from the days when there was no such thing as generational theory, and marketing was all about clearly defined target markets. In the era of big data and mass personalisation, that idea seems rather quaint.
Indeed, according to Grant Lapping, managing director of DataCore Media, it no longer matters who brands think their target market is.
“The reason for this is simple: with the technology and data digital marketers have access to today, we no longer need to limit our potential target audience to a set of personas or segments derived through customer research. While this type of customer segmentation was – and remains – important for engagements across traditional above-the-line engagements in mass media, digital marketing gives us the tools we need to target customers on a far more granular and personalised level.
“Where customer research gives us an indication of who the audience is, data can tell us exactly what they want and how they may behave.”
Netflix, he points out, is an example of a company that is changing its industry by avoiding audience segmentation, once the holy grail of entertainment.
In other words, it understands that 20-year-olds and 40-year-olds are very different – but so is everyone in between.
* Arthur Goldstuck is founder of World Wide Worx and editor-in-chief of Gadget.co.za. Follow him on Twitter and Instagram on @art2gee
Robots coming to IFA
Robotics is no longer about mechanical humanoids, but rather about becoming an interface between man and machine. That is a key message being delivered at next month’s IFA consumer electronics expo in Berlin. An entire hall will be devoted to IFA Next, which will not only offer a look into the future, but also show what form it will take.
The concepts are as varied as the exhibitors themselves. However, there are similarities in the various products, some more human than others, in the fascinating ways in which they establish a link between fun, learning and programming. In many cases, they are aimed at children and young people.
The following will be among the exhibitors making Hall 26 a must-visit:
Leju Robotics (Stand 115) from China is featuring what we all imagine a robot to be. The bipedal Aelos 1s can walk, dance and play football. And in carrying out all these actions it responds to spoken commands. But it also challenges young researchers to apply their creativity in programming it and teaching it new actions. And conversely, it also imparts scholastic knowledge.
Cubroid (Stand 231, KIRIA) from Korea starts off by promoting an independent approach to the way it deals with tasks. Multi-functional cubes, glowing as they play music, or equipped with a tiny rotating motor, join together like Lego pieces. Configuration and programming are thus combined, providing a basic idea of what constitutes artificial intelligence.
Spain is represented by Ebotics (Stand 218). This company is presenting an entire portfolio of building components, including the “Mint” educational program. The modular system explains about modern construction, programming and the entire field of robotics.
Elematec Corporation (Stand 208) from Japan is presenting the two-armed SCARA, which is intended not to take over tasks entirely, but rather to assist people with their work.
Everybot (Stand 231, KIRIA) from Korea approaches the concept of robotics by introducing an autonomous floor-cleaning machine, similar to a robot vacuum cleaner.
And Segway (Stand 222) is using a number of products to explain the modern approach to battery-powered locomotion.
IFA will take place at the Berlin Exhibition Grounds (ExpoCenter City) from 6 to 11 September 2019. For more information, visit www.ifa-berlin.com