Where next for humans?

With many worried about their jobs being taken over by artificial intelligence and machines, SIMON CARPENTER, Chief Technology Adviser at SAP Africa, asks what the future has in store for humankind.

Perhaps it’s because I have less runway ahead of me than behind me, or perhaps it’s because I have spent three and a half decades in the IT industry, but I find myself marvelling at, and sometimes bewildered by, the exponentially accelerating pace and scope of advances in science and technology. Recently, in response to alarming headlines about job destruction and Artificial Intelligence (AI) getting away from us, I’ve been wondering where it’s all heading for the apex primate – humankind.

A brief history of mankind

When you look at the mammal that is Homo sapiens in the context of geological time, the 200,000 years we have been around is a tiny, tiny amount of time – a mere 0.004% of Earth’s existence. And yet here we are, living in the Anthropocene epoch (recently named for us by climatologists and geologists). This epoch is so named because, for the first time in the 4.1-billion-year history of life on Earth, we humans, as a species, are changing what happens to and on the planet, rather than simply being the observers and subjects of natural forces.

In the 200,000 years since Homo sapiens first emerged in Africa and spread across the planet, we have evolved to become a “reasonably smart” apex primate at the top of the food chain in a closed system called planet Earth (although we have already made our presence felt in other parts of the solar system).

You could argue that we’re only “reasonably smart” because, whilst we are sentient and have consciousness, self-awareness, intellectual capacity, language, moral reasoning and the ability to create, we haven’t yet figured out how to live without degrading our own environment through pollution, over-population, over-exploitation of natural resources and species extinction. Only “reasonably smart” because, whilst we create great art, music, literature, food, science, technologies and new industries, we haven’t yet figured out how to stop warring with each other, to transcend tribalism and racism, or to curb the greed and corruption whereby the few prey upon the many. Only “reasonably smart” because, although we’ve made great strides in medicine and healthcare, we have not yet figured out how to cure dread diseases, prevent obesity, build equitable, inclusive economies or provide universal healthcare.

The greatest show in the universe

These “reasonable smarts” come to us courtesy of arguably the most amazingly complex “thing” in the universe – the human brain. With its estimated 86 billion to 100 billion neurons and 3.4 to 4 quadrillion synapses, it accounts for roughly 2% of our body mass yet consumes around 20% of the oxygen and energy we take in. It is this human brain and its astonishing capacities that keep us alive on a daily basis, that enable us to dominate other animals, and that account for all human progress and the massive impact we have had on the planet despite having been here for only 0.004% of the Earth’s history. And it is this amazing brain that has helped us to develop and master the various technologies that have brought us this far; from fire to fission and everything in between.

Yet, despite this awesomeness, the human brain may not be sufficient to ensure our survival as a species. Whilst there is still much to discover about how the brain works, we do know that it is trapped in a physically constrained space – the skull – and is subject to metabolic limitations. The prefrontal cortex, where we do most of our reasoning, appears able to process no more than five to seven discrete pieces of information at any one time, and the myth of multitasking is just that – a myth. Our individual brains, in other words, are ill-equipped to deal with either the size or the dynamism of some of the challenges we now face. That’s one of the reasons we find it so hard to execute the dictum that “No problem can be solved from the same level of consciousness that created it” (commonly attributed to Einstein) – it’s hard to change consciousness when its seat doesn’t change. The world in which our current brains evolved no longer exists and, per evolutionary science, any significant change in our innate capabilities would take somewhere in the region of a million years. So we will need to look elsewhere for solutions to many pressing problems: how to feed an additional two billion people on shrinking amounts of arable land, how to manage traffic congestion and safety in rapidly urbanising societies, how to provide sufficient energy for economic development, how to manage epidemics, and how to maximise corporate profits without harming society.

The digital brain

Help is at hand. We now stand at the dawn of a Digital Revolution, one which promises socio-economic upheaval as profound as that which followed previous agricultural and industrial revolutions. Whereas the plough, the steam engine and the production technologies of yesteryear augmented our physical capabilities, this new Digital Revolution, with its data generation, information processing and communication technologies, is about augmenting our mental capabilities. Pre-eminent among the multi-faceted technologies that underpin the Digital Revolution is Artificial Intelligence.

As we embed sensors in more and more things in the world (including ourselves), this cyber-physical world creates unprecedented volumes and velocities of data – far more than our brains can deal with – and we must use AI to make sense of it.

Unlike our brains, and courtesy of Moore’s law, silicon-based capabilities can be scaled up in a largely unrestricted fashion – they are not bound by the physical limitations of the human skull or by the metabolic need for sleep – and this is enabling us to deliver AI capabilities that were the stuff of science fiction only a few years ago. There is much debate as to when (and whether) AI will exceed human intelligence, with futurists such as Ray Kurzweil (who claims an 86% accuracy rate for the 147 predictions he has made since the 1990s) positing 2045 as the year of the Singularity – the point in time at which AI leads to machines that are smarter than human beings.

But, you may say, AI is not new; for every success there have been many failures, and it’s nowhere close to matching human general intelligence. So, what is different this time? Well, you would be right on all counts, but the answer to your question is “plenty”.

Advances in technology

The last few years have seen spectacular improvements in capability and affordability on several fronts that feed into Artificial Intelligence. The pace of this Digital Revolution has no historical precedent; advances in science and technology are combinatorial and exponential in nature, intersecting in ways we battle to anticipate, and disrupting jobs, industries and societies as they do.

There have been improvements in computing technologies, especially Graphics Processing Units (GPUs), to the point where super-computing and massive memory are affordable and therefore widely available; significant improvements in algorithms (not least Deep Learning); and massive sets of data on which to train new AI models using techniques like machine learning and deep learning. All of this is set to accelerate as the Internet of Things (IoT) takes hold, allowing us to create and “feed” real-time data into Digital Twins that represent all sorts of artefacts from the real world (including humans). The possibilities are endless and limited only by our imaginations and ethical considerations, the ramifications can be scary, and the process is unstoppable.
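
To make the Digital Twin idea concrete, here is a minimal, hypothetical sketch in Python: a software object that mirrors one physical asset (a pump, in this illustration) and is kept up to date by a stream of IoT sensor readings. All names and thresholds here are invented for illustration, not drawn from any particular product.

# A hypothetical "digital twin": a software mirror of one physical asset,
# updated in (near) real time from IoT sensor readings.
from typing import List, Optional

class PumpTwin:
    def __init__(self, asset_id: str) -> None:
        self.asset_id = asset_id
        self.temperature_c: Optional[float] = None
        self.vibration_mm_s: Optional[float] = None
        self.history: List[dict] = []

    def ingest(self, reading: dict) -> None:
        """Update the twin's state from one sensor reading sent from the field."""
        self.temperature_c = reading["temperature_c"]
        self.vibration_mm_s = reading["vibration_mm_s"]
        self.history.append(reading)

    def needs_attention(self) -> bool:
        """Illustrative rule: ask the twin, rather than the physical pump,
        whether the real asset is drifting out of its safe operating range."""
        return (self.temperature_c or 0.0) > 80.0 or (self.vibration_mm_s or 0.0) > 7.0

# Usage: feed the twin readings as they stream in, then query it for decisions.
twin = PumpTwin("pump-031")
twin.ingest({"temperature_c": 83.2, "vibration_mm_s": 4.5})
if twin.needs_attention():
    print(twin.asset_id, "- schedule an inspection")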

It is now up to us as individuals, workers, parents, managers, leaders, companies, governments and societies to understand and evaluate these trends and technologies, and to ask: how can we ensure this new technology serves us? How will we apply it to help make the world run better and improve people’s lives?

The (narrow) usefulness of AI

The answer lies in understanding that today’s AIs are very narrow – they can do certain specific tasks, but only those tasks, astonishingly well. So it’s about picking the most valuable use cases, understanding that for tasks where efficiency, repeatability, neutrality, speed and big sets of data are the norm, the narrow intelligence of today’s AI is often superior to humans at getting the job done. It’s also about having a mindset of embedding these new tools into both existing and new business processes and models in such a way that we take the “work out of work” and free our people up to do the things that only humans can do: imagining, empathising, relating, creating and solving complex problems. This last point is worth emphasising: AI cannot envision, it cannot innovate, it does not empathise, and it cannot synthesise new solutions to complex problems. What it can do is tackle the routine, mundane and dangerous activities that make work a less-than-stellar experience for millions of people – so that those people can bring their talents to bear on the world’s challenges in a more engaging fashion.

It is about taking tools such as machine learning and applying them to the data you already have (or will generate through new capabilities such as social listening, IoT or visual processing) to generate new insights, to make life and work safer, easier and more productive, and to design innovative competitive capabilities.
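
As a purely illustrative sketch of that point – in Python, using the widely available scikit-learn library and made-up data – this is one shape that “applying machine learning to the data you already have” can take: an anomaly detector trained on historical sensor readings, flagging unusual new readings for a human to investigate.

# Illustrative only: train an anomaly detector on historical sensor data,
# then flag unusual new readings for a person to review.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Stand-in history: 1,000 past readings of [temperature, vibration].
history = rng.normal(loc=[65.0, 3.0], scale=[2.0, 0.5], size=(1000, 2))

model = IsolationForest(contamination=0.01, random_state=0)
model.fit(history)

# New readings arriving from the field (the last one is deliberately odd).
new_readings = np.array([[66.1, 3.2], [64.8, 2.9], [88.0, 9.5]])
flags = model.predict(new_readings)  # -1 means anomaly, 1 means normal

for reading, flag in zip(new_readings, flags):
    if flag == -1:
        print("Unusual reading", reading, "- route to an engineer")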

As we stand at the beginning of a new age for humanity, one where we can use Artificial Intelligence for good, it is up to us to explore ways to make sure technology serves us well. We don’t yet know where it will take us but we do know that we must get started.

Have you asked yourself how your organisation is using Artificial Intelligence today?

IoT at tipping point

We have long been in the hype phase of IoT, but it is finally taking on a more concrete form, illustrating its benefits to business and the public at large, says PAUL RUINAARD, Country Manager at Nutanix Sub-Saharan Africa.

People have become comfortable with talking to their smartphones and tasking these mini-computers to find the closest restaurants, schedule appointments, and even switch on their connected washing machines while they are stuck in traffic.

This is considerable progress from those expensive (and dated) robotic vacuum cleaners that drew some interest a few years ago. Yes, being able to automate cleaning the carpets held promise, but the reality failed to deliver on those expectations.

However, people’s growing comfort when it comes to talking to machines and letting them complete menial tasks is not what the long-anticipated Internet of Things (IoT) is about. It really entails taking connectedness a step further by getting machines to talk to one another in an increasingly digital world filled with smart cities, devices, and ways of doing things.

We have long been in the hype phase of IoT, but it is finally taking on a more concrete form, illustrating its benefits to business and the public at large. The GSM Association predicts that Africa will account for nearly 60 percent of the anticipated 30 billion connected IoT devices by 2020.

Use cases across the continent hold much promise. In agriculture, for example, placing sensors in soil enables farmers to track acidity levels, temperature, and other variables to assist in improving crop yields. In some hotels, infrared sensors are being used to detect body heat so cleaning staff know when they can enter a room. In South Africa, connected cars (think telematics) are nothing new, and many local insurers use the data generated to reward good driving behaviour and penalise bad behaviour with higher premiums.

Data management

The proliferation of IoT also means huge opportunity for businesses. According to IDC, the market opportunity for IoT in South Africa will grow to $1.7 billion by 2021. And with research from Statista showing that retail IoT spending in the country is expected to grow to $60 million by the end of this year (up from $41 million in 2016), there is significant potential for connected devices once organisations start to unlock the value of the data being generated.

But before we get a real sense of what our newly-connected world will look like and the full picture of the business opportunities IoT will create, we need to put the right resources in place to manage it. With IoT comes data, more than we can realistically imagine, and we are already creating more data than ever before.

Processing data is something usually left to ‘the IT person’. However, if business leaders want to join the IoT game, then it is something they must start thinking about. Sure, there are several ways to process data, but they all link back to a data centre – that room or piece of equipment in the office, or the public data centre down the road. Most know it is there but little else, other than that it has something to do with data and computers.

Data centres are the less interesting but essential tools behind all things technology. They run the show, and without them we would not be able to do something as simple as send an email, let alone create an intricate system of connected devices that constantly communicate with each other.

Traditionally, data centres have been large, expensive and clunky machines. But like everything in technology, they have been modernised over the years and have become smaller, more powerful, and more practical for the digital demands of today.

Computing on the edge

Imagine real-time face scanning being used at the Currie Cup final or the Chiefs and Pirates derby. Just imagine more than a thousand cameras in action, working in real time, scanning tens of thousands of faces from different angles, creating data all along the way and integrating with other technology such as police radios and in-stadium services.

As South Africans, we know all too well that the bandwidth available to push such a large amount of data through traditional networks is simply not sufficient for this to work efficiently. And while the processing could be done in a large core or public data centre, the likelihood of one of those being close to the stadium is minimal. Delays – or ‘latency and lag time’ – are not an option in this scenario; it must work in real time or not at all.

So, what can be done? The answer lies in edge computing, where computing is brought closer to the devices being used. The edge refers to the devices that communicate with each other: think of all those connected things the IoT has become known for – mobile devices, sensors, fitness trackers, laptops, and so on. Essentially, anything ‘remote’ that links to the Web or to other devices falls under this umbrella. For the most part, edge computing refers to smaller data centres (at the edge) that can process the data required for things like large-scale facial recognition.
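
A simplified, hypothetical sketch in Python of the pattern being described: the heavy analysis runs on an edge node at the venue, next to the cameras, and only small, already-processed results are forwarded to central systems over the wide-area network. Every function name and figure here is illustrative; a real deployment would use dedicated video-streaming and inference infrastructure.

# Illustration of the edge-computing pattern: process raw camera frames
# locally, send only lightweight summaries onward to the cloud.

def analyse_frame_on_edge(frame_bytes: bytes) -> dict:
    """Runs in the edge data centre at the stadium; a stand-in for a real
    face-recognition model operating on one video frame."""
    return {"faces_detected": 42, "matches": []}  # hypothetical output

def send_to_cloud(summary: dict) -> None:
    """Only a few hundred bytes of metadata cross the network, rather than
    the full high-resolution video stream."""
    print("Forwarding summary to central systems:", summary)

def handle_camera_feed(frames) -> None:
    for frame in frames:
        summary = analyse_frame_on_edge(frame)  # low latency: processed locally
        if summary["faces_detected"]:
            send_to_cloud(summary)              # low bandwidth: results only

# A raw high-definition stream can run to several megabits per second per
# camera; a thousand cameras would swamp a typical uplink, which is why the
# raw frames never leave the venue in this pattern.
handle_camera_feed([b"<frame-1>", b"<frame-2>"])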

At some point in the future, there could be an edge data centre at Newlands or The Calabash that processes the data in real time. It would, of course, also be connected to other resources such as a public or private cloud environment, but the ‘heavy lifting’ is done where the action is taking place.

Unfortunately, there are not enough of these edge resources in place to match our grand IoT ambitions. Clearly, this must change if we are to continue much further down the IoT path.

Admittedly, edge computing is not the most exciting part of the IoT revolution, but it is perhaps the most necessary component of it if there is to be a revolution at all.

Don’t panic! Future of work is still human

The digital age, and the new technologies it has brought with it – blockchain, artificial intelligence (AI), robotics, augmented reality and virtual reality – is seen by many as a threat to life as we know it. What if my job gets automated? How will I stay relevant? How do we adapt to the need for new skills to manage customer expectations and the flood of data that’s washing over us?

The bad news is that the nature of work has already changed irrevocably. Everything that can be automated, will be. We already live in an age of “robot restaurants”, where you order on a touch screen and machines cook and serve your food. Did you notice the difference? Amazon Go is providing shopping without checkout lines. In the US alone, an estimated 3.4 million drivers – truck drivers, taxi drivers and bus drivers among them – could be replaced by self-driving vehicles within 10 years.

We’re not immune from this phenomenon in Africa. In fact, the World Economic Forum (WEF) predicts that 41% of all work activities in South Africa are susceptible to automation, compared to 44% in Ethiopia, 46% in Nigeria and 52% in Kenya. This doesn’t mean millions of jobs on the continent will be automated overnight, but it’s a clear indicator of the future direction we’re taking.

The good news is that we don’t need to panic. What’s important for us in South Africa, and the continent, is to realise that there is plenty of work that only humans can do. This is particularly relevant to the African context, as the working-age population rises to 600 million in 2030 from 370 million in 2010. We have a groundswell of young people who need jobs – and the digital age has the ability to provide them, if we start working now.

Make no mistake, there’s no doubt that this so-called “Fourth Industrial Revolution” is going to disrupt many occupations. This is perfectly natural: every Industrial Revolution has made some jobs redundant. At the same time, these Revolutions have created vast new opportunities that have taken us forward exponentially.

Between 2012 and 2017, for example, it’s estimated that the demand for data analysts globally grew by 372%, and the demand for data visualisation skills by more than 2,000%. For businesses, this means not only creating new jobs in areas like data science and analytics, but also reskilling existing workforces to deal with the digital revolution and its new demands.

So, while bus drivers and data clerks are looking over their shoulders nervously right now, we’re seeing a vast range of new jobs being created in fields such as STEM (Science, Technology, Engineering and Mathematics), data analysis, computer science and engineering.

This is a challenge for Sub-Saharan Africa, where our levels of STEM education are still not where they should be. That doesn’t mean there are no opportunities to be had. In the region, for example, we have a real opportunity to create a new generation of home-grown African digital creators, designers and makers, not just “digital deliverers”. People who understand African nuances and stories, and who not only speak local languages, but are fluent in digital.

This ability to bridge the digital and physical worlds, as it were, will be the new gold for Africa. We need more business operations data analysts, who combine deep knowledge of their industry with the latest analytical tools to adapt business strategies. There will also be more demand for user interface experts, who can facilitate seamless human-machine interaction.

Of course, in the longer term, we in Africa are going to have to make some fundamental decisions about how we educate people if we’re going to be a part of this brave new world. Governments, big business and civil society will all have roles to play in creating more future-ready education systems, including expanded access to early-childhood education, more skilled teachers, investments in digital fluency and ICT literacy skills, and robust technical and vocational education and training (TVET). This will take significant intent from a policy point of view, as well as the financial means to fund it.

None of this will happen overnight. So what can we, as individuals and businesspeople, do in the meantime? A good start would be to realise that the old models of learning and work are broken. Jenny Dearborn, SAP’s Global Head of Learning, talks about how the old approach to learning and work was generally a three-stage life that consisted largely of learn-work-retire.

Today, we live in what Ms Dearborn calls the multi-stage life, which includes numerous phases of learn-work-change-learn-work. And where before, the learning was often by rote, because information was finite, learning now is all about critical thinking, complex problem-solving, creativity and innovation and even the ability to un-learn what you have learned before.

Helping instil this culture of lifelong learning, including the provision of adult training and upskilling infrastructure, is something that all companies can do, starting now. The research is clear: even where jobs are stable or growing, their skills profiles are changing substantially. The WEF’s Future of Jobs analysis found that, in South Africa alone, 39% of core skills required across all occupations will be different by 2020 compared to what was needed to perform those roles in 2015.

This is a huge wake-up call to companies to invest meaningfully in on-the-job training to keep their people – and themselves – relevant in this new digital age. There’s no doubt that more learning will need to take place in the workplace, and greater private sector involvement is needed. As employers, we should therefore start working closely with schools, universities and even non-formal education providers to create learning opportunities for our workers.

We can also drive a far stronger focus on the so-called “soft skills”, which is often used as a slightly dismissive term in the workplace. The core skills needed in today’s workplace are active listening, speaking, and critical thinking. A quick look at the WEF’s “21st Century Skills Required For The Future Of Work” chart bears this out: as much as we need literacy, numeracy and IT skills to make sense of the modern world of work, we also need innately human skills like communication and collaboration. The good news is that not only can these be taught – but they can be taught within the work environment.

It sounds almost counter-intuitive, but to be successful in the Digital Age, businesses are going to have to go back to what has always made them strong: their people. Everyone can buy AI, build data warehouses, and automate every process in sight. The companies that will stand out will be those that focus on the things that can’t be duplicated by AI or machine learning – uniquely human skills.

I have no doubt that the future will not be humans OR robots: it will be humans AND robots, working side by side. For us, as businesspeople and children of the African continent, we’re on the brink of a major opportunity. We just have to grasp it.
