Artificial Intelligence
AI gives, even as it takes away jobs
The hype around AI destroying jobs is giving way to a new narrative, Red Hat regional GM Dion Harvey tells ARTHUR GOLDSTUCK.
A year ago, as the hype around generative artificial intelligence (AI) was reaching a crescendo, investment bank Goldman Sachs predicted it would replace 300-million existing jobs. That was well up from the World Economic Forum’s “The Future of Jobs Report 2020,” which had said AI would replace 85-million jobs within five years. However, it tempered this forecast with the expectation of 97-million new jobs.
Clearly, such predictions are a thumb-suck, and have rightfully been derided. So, by the time the WEF produced the 2024 version, it had come up with a different and more defensible measure: how many jobs will be affected by AI? It found that almost 40% of employment globally is exposed to AI, rising to 60% in advanced economies.
The report also introduced the concept of “complementarity”, which reflects an occupation’s likely degree of shielding from AI-driven job displacement when paired with high AI exposure.
“For example, because of advances in textual analysis, judges are highly exposed to AI, but they are also highly shielded from displacement because society is currently unlikely to delegate judicial rulings to unsupervised AI,” the report finds. “Consequently, AI will likely complement judges, increasing their productivity rather than replacing them.”
This is the flip side of another employment coin: the extent to which AI will fill the gaps left by skills shortages in information technology. A range of studies, from surveys of skills shortages among chief information officers in the United Kingdom to analyses of job postings in the United States, reveals a desperate need for people skilled in cybersecurity, cloud computing architecture, technical support, networking infrastructure and, of course, AI.
In South Africa, a new study by OfferZen, titled “State of the Software Developer Nation”, finds that developers skilled in specific programming languages are in short supply, despite layoffs across the sector.
Last month, at a media briefing by open source software developer Red Hat, held at the Cradle of Humankind north of Johannesburg, the penny dropped.
Dion Harvey, regional general manager of Red Hat for sub-Saharan Africa, outlined how the company was modernising cloud architectures, both to simplify them and to make them easier for companies to use, in the context of a global skills shortage. A key to both simplification and modernisation would be the use of AI to help plug the skills gap.
In other words, a new role is emerging for AI: filling the skills gap, as opposed to the prevailing narrative of AI replacing existing skills.
“Both narratives are going to end up being the reality,” Harvey told Business Times at the event. “In terms of the skills shortage, the priority is having technology that allows us to reprioritise what little skills we have, to be able to do higher-value, more important tasks.
“If I can do even 50% of all the menial tasks, which in IT terms could be spending hours patching servers, if you can transition those limited hours that you have, by automating and intelligently managing infrastructure for example, then you can direct those hours to more meaningful work.”
Harvey gave the example of developer productivity at banks, where “developers are probably spending 80% of their time doing code maintenance”. “Imagine if you can free up half of that to get developers to do new things? Can we use AI as a tool to handle those tasks? We will see that. The promise is higher value tasks being done by humans – and AI dealing with the things we all know we’d rather not have to spend our time on.
“We know we have a skills shortage. But do we really want to train people how to patch servers, or do we want to build skills in AI?”
* Arthur Goldstuck is founder of World Wide Worx and editor-in-chief of Gadget.co.za. Follow him on Twitter and Instagram at @art2gee.