Samsung has announced its AI service – Bixby. The cloud-based AI interface is embedded in all the phone’s applications and will soon appear across all of its devices. CRAIGE FLEISCHER of Samsung South Africa tells us what we can expect.
Technology is supposed to make life easier, but as the capabilities of machines such as smartphones, PCs, home appliances and IoT devices become more diverse, the interfaces on these devices are becoming too complicated for users to take advantage of many of these functions conveniently.
User interface designers have to make trade-off decisions: cram many functions into a small screen, or bury them deep in layers of menu trees. Ultimately, users are at the mercy of the designers, facing an increasingly steep learning curve that makes mastering a new device difficult. This is the fundamental limitation of the current human-to-machine interface. Since Samsung makes millions of devices, this problem strikes at the core of our business.
Samsung has a conceptually new approach to the problem: instead of humans learning how the machine interacts with the world (a reflection of the abilities of its designers), it is the machine that needs to learn and adapt to us. The interface must be natural and intuitive enough to flatten the learning curve regardless of the number of functions being added. With this new approach, Samsung has brought artificial intelligence, reinforced by deep learning, to the core of our user interface designs. Bixby is the ongoing result of this effort.
Bixby will be a new intelligent interface on our devices. Fundamentally different from other voice agents or assistants in the market, Bixby offers a deeper experience thanks to proficiency in these three properties:
- Completeness

When an application becomes Bixby-enabled, Bixby will be able to support almost every task that the application can perform through the conventional interface (i.e. touch commands). Most existing agents support only a few selected tasks for an application, leaving users confused about what does or doesn’t work by voice command. The completeness property of Bixby will simplify user education on the agent’s capability, making its behaviour much more predictable.
- Context Awareness
When using a Bixby-enabled application, users will be able to call upon Bixby at any time, and it will understand the current context and state of the application, allowing users to carry on with their work in progress. Bixby will let users weave together different modes of interaction, including touch and voice, in any context of the application, whichever feels most comfortable and intuitive. Most existing agents completely dictate the interaction modality and, when switching between modes, may either start the entire task over, losing all work in progress, or simply fail to understand the user’s intention.
- Cognitive Tolerance
As the number of supported voice commands grows, most users struggle to remember the exact form of each command. Most agents require users to state commands in a set of fixed forms. Bixby will be smart enough to understand commands with incomplete information, execute the task to the best of its knowledge, and then prompt users for more information, completing the task piecemeal. This makes the interface much more natural and easier to use.
We know that adopting new ways to interact with your devices will require a change in user behaviour. The inconvenience of learning a new interface can cause friction and force users to revert to old habits (e.g. the touch interface). At the same time, we believe the key to success for a new voice interface is to design a scheme that reduces friction and makes the experience significantly more rewarding than the existing interface. So at its core, Bixby will help remove friction. It will simplify user education on new voice interfaces and make using your phone even more seamless and intuitive.
Another example of removing friction will be the dedicated Bixby button that will be located on the side of our next device. Confusion around activating a voice interface is a barrier we have removed to make it feel easier and more comfortable to give commands. For example, instead of taking multiple steps to make a call – turning on and unlocking the phone, looking for the phone application, clicking on the contact bar to search for the person that you’re trying to call and pressing the phone icon to start dialling – you will be able to do all these steps with one push of the Bixby button and a simple command.
There has been a lot of excitement and speculation about what we will deliver with the launch of the Galaxy S8, especially due to the advancements in artificial intelligence. We do have a bold vision of revolutionising the human-to-machine interface, but that vision won’t be realised overnight. Ambition takes time.
Bixby will be our first step on a journey to completely open up new ways of interacting with your phone. At the launch of the Galaxy S8, a subset of preinstalled applications will be Bixby-enabled. This set will continue to expand over time. Our plan is to eventually release a tool (in SDK) to enable third-party developers to make their applications and services Bixby-enabled easily.
Starting with our smartphones, Bixby will gradually be applied to all our appliances. In the future, you will be able to control your air conditioner or TV through Bixby. Since Bixby will be implemented in the cloud, any device with an internet connection and simple circuitry to receive voice inputs will be able to connect with Bixby. As the Bixby ecosystem grows, we believe Bixby will evolve from a smartphone interface into an interface for your life.
Bixby is at the heart of our software and services evolution as a company. We are fundamentally and conceptually changing our attitude toward software and services and working hard on innovation throughout all aspects of our mobile ecosystem. Our investment in engineering resources speaks for itself – we have thousands of software developers supporting this effort. This is something that I’m very excited about. Innovating in software and services enables opportunities for creativity and the ability to build new experiences from the ground up. With the continued investment from Samsung on artificial intelligence, the possibility of what Bixby can become is endless.
- Craige Fleischer, Director Integrated Mobility at Samsung South Africa
Welcome to the world of 2099
The world of 2099 will be unrecognisable from the world of today, but it can be predicted, says one visionary. ARTHUR GOLDSTUCK met him in Singapore.
Futuristic structures tower over the landscape. Giant, alien-looking trees light up with dazzling colours amid the hundreds of plant species that grow up their trunks. Cosmetic stores sell their wares via public touch-screens, with products delivered instantly in drawers below the screens.
This is not a vision of the future. It is a sample of Singapore today. But it is also an inkling of the world we may all experience in the future.
Singapore was the venue, last week, of the World Cities Summit, where engineers, politicians, investors and visionaries rubbed shoulders as they talked about the strategies and policies that would enhance urban living in the future.
As part of the Summit, global payment technologies leader Mastercard hosted a small media briefing by one of Singapore’s leading thinkers about the future, Dr Damian Tan, managing director of Vickers Venture Partners. The company’s slogan, “We invest in the extraordinary,” offers a small clue to Tan’s perspective.
“We look as far forward as 2099 because, as a venture capital firm, we invest in the long term,” he tells a group of journalists from Africa and the Middle East. “Companies explode in growth because there is value in the future. If there is no growth, they won’t explode.”
The big question that the World Cities Summit and Mastercard are trying to help answer is: what will cities look like in the year 2099? Tan can’t give an exact answer, but he offers a framework that helps one approach the question.
“If you want to look 81 years into the future, and understand the change that will come, you need to double that amount and look into the past. That takes us to 1856. The difference between then and now is the difference you can expect between now and 2099.”
- Arthur Goldstuck is founder of World Wide Worx and editor-in-chief of Gadget.co.za. Follow him on Twitter on @art2gee and on YouTube
Street art goes electric
Kaspersky Lab and British street artist D*Face have unveiled the first-ever “art helmet” design at the Formula E finale for electric cars in New York.
The “Save The World” helmets will be raced by DS Virgin Racing’s drivers, Sam Bird and Alex Lynn, as they traverse the New York street circuit during the final races of the Formula E season.
The announcement signals the first art helmet by a Formula E team, continuing the heritage of art in motorsport and the cybersecurity brand’s commitment to contemporary art, creativity and innovation. D*Face took inspiration from Kaspersky Lab’s tagline, “A Company To Save The World”, and hopes that his colourful work will inspire people to take positive action.
D*Face will announce his first-ever art car design with a custom-made livery for the DS Virgin Racing Team. Its design will be released at the “Art Goes Green” event after Saturday’s race. The helmets and art car are the latest installations in the “Save the World” collection, following a major permanent public mural that was installed in Brooklyn, New York, in May.
D*Face, whose real name is Dean Stockton, said: “It is exciting to work with Kaspersky Lab on this project and create art with a real message of hope for a better future. After all, this is our world and we need to look after it. It will take every one of us to make a real lasting, impactful change. I love the mentality of the DS Virgin Racing Team and that of Formula E by showcasing sport in a way that doesn’t harm the environment, but is still just as exhilarating and fun.
“It is time for us all to stand together and make a change… be that stopping data steals, climate change, plastic waste or using damaging fuels. I want everyone to make a pledge to do one thing that will help make a change.”
As a sponsor of DS Virgin Racing Team, Kaspersky Lab is responsible for protecting the team’s devices against cyber threats. The company sees the technical environment in the global sport of Formula E as the next frontier in furthering its research and development of new technologies to keep vehicles secure in the digital world.
Sylvain Filippi, Managing Director at DS Virgin Racing, said: “The whole team fully supports this great initiative and our thanks go to Kaspersky and D*Face for their collaboration. It’s an honour to have such an innovative artist bring his talents to bear in our team ahead of the season finale; the car, drivers’ crash helmets and mural all look amazing.”
Aldo Fucelli Pessot del Bo, Head of Global Partnerships and Sponsorships at Kaspersky Lab added: “There is a need for innovation on a global scale, both in contemporary art and in the fast-growing sport of Formula E. Now, for the first time ever, Kaspersky Lab is proudly bringing together the two sectors in an effort to Save the World and unleash creativity, encourage freedom of expression and further innovation.”