Inventor of Web has hope for Artificial Intelligence

The man who invented the World Wide Web, Tim Berners-Lee, argues that the evils of artificial intelligence have been exaggerated. ARTHUR GOLDSTUCK reports from Las Vegas.

The man who invented the World Wide Web could be mistaken for a schoolteacher, or perhaps a university professor. A slight build, spectacles and thinning brown hair combine with an almost humble demeanour that is difficult to associate with a legacy as great as any of the giants of the technology world.

Tim Berners-Lee was looking for an easier way to connect information when he first came up with the concept of the World Wide Web in the 1980s, while working as a physicist at the CERN laboratory in Switzerland. Today he is the director of the World Wide Web Consortium (W3C), which in effect sets the technical rules for how the Web operates. But he remains an academic, and is a senior researcher holding a founders chair at the MIT Computer Science and Artificial Intelligence Laboratory.

It is little wonder, then, that he is as preoccupied with artificial intelligence (AI) as he is with the Web. The latter remains his baby, however, and he has far more to say about it than any other topic. In March 2017, he issued an open letter warning that we have lost control of our personal data, that it’s too easy for misinformation to spread on the web, and that political advertising online needs transparency.

Clearly, he is not one to gloss over the perils of progress. So, when it was announced that he was to offer his insights into the dangers of AI at the Dell EMC 2017 conference in Las Vegas, it became the must-attend talk of an already-intensive convention.

The conference represented the first joint convention by two giants of the computer world, following computer manufacturer Dell’s purchase of storage leaders EMC for $67-billion – the biggest IT acquisition ever. EMC and its subsidiary VMware are responsible, respectively, for the storage and the management platforms of a large proportion of the world’s cloud computing infrastructure. The cloud, in turn, is going to be integral to AI, hosting and processing the massive amounts of data that will allow AI to help human beings make and act on decisions.

Photo by John S. and James L. Knight Foundation [CC BY-SA 2.0 (http://creativecommons.org/licenses/by-sa/2.0)], via Wikimedia Commons

It was no surprise that a record 13 500 delegates attended Dell EMC World. And it was no surprise that the lines to get into Berners-Lee’s “AI in Perspective” talk were almost as long as those for the conference’s opening keynote by Dell founder Michael Dell.

The Web founder did not disappoint.

While he speaks with a rapid-fire energy that sometimes appears to run ahead of his thoughts, he delivers his perspective on the future with both authority and empathy.

“The promise of AI is really exciting, but you still have to look at it as the thing which makes a lot of people concerned,” he says by way of introduction. “We have to look at not just the hopes, but also the fears.”

The promise, he says, is that of almost all computer projects:

“We’re trying to get machines to do things we don’t want to do, like filling out a form. A lot of the progress in computing starts off with simple things, like doing accounts and taxes. Translating languages has always been just about to happen, but is now starting to become functional.

“You can train a machine to beat a game. Instead of training it by looking at lots of people playing a game, you just teach it to play against itself and it becomes better than a human.

“You can grab yourself some cloud storage and some cloud computation, find some open data produced by government, scientific or enterprise sources, find lots of data, find the latest algorithm, and create something that has added value, and put out signal where there wasn’t signal before. Like enabling you to decide where to invest.

“More and more, computers are starting to tick off all those things we were told computers just couldn’t do. So these are very exciting times.”

There are two key problems in this Utopian vision, however. The first seems easily solved:

“There is a huge dearth of people who know how to do this stuff, but there will be more and more. The stuff is out there and you can teach yourself. The promise is huge.”

The second issue is that most difficult of challenges: public perception. Berners-Lee talks about an AI spring, when the world was full of hope, turning into an AI winter: “The world turned on them and said: ‘You were supposed to give us robots by now, what happened? This sucks’.”

At the same time, the fear is building that AI will take away jobs.

“Suddenly it’s no longer AI. Now it’s Natural Language Processing, now it’s self-driving cars. But they won’t call it AI.”

The three fears of AI

Berners-Lee addresses the fears of AI in “three pieces”:

“Let’s talk about robots replacing jobs. The first take you get on AI: are robots going to take my job or all the jobs of my people? Is an autonomous vehicle going to take my job? Autonomous vehicles are coming. A lot of people, when they arrive somewhere as immigrants, or people between jobs, start out with Uber or a cab company, where driving is one of the things they can do. If that goes away, there is going to be a big shift and we have to be responsible about how we do these things.”

The second big category of AI fears lies in its ability to generate fake news. However, Berners-Lee sees AI as the solution rather than the problem: “AI can be a frontline defence against things which can be proven to be false. There’s no way of really objecting to the decisions of AI.”

The third and most famous category, he says, is the Singularity – when AI surpasses human intelligence.

“Is all this getting out of control? As a kid I read Asimov books, Arthur C Clarke books. Asimov imagined robots would become just as powerful as us and therefore they’d have to be controlled. Ask people who make robots about the problem of robots becoming smarter than you, and they say: ‘Do you know how difficult it is to build a robot? Do you know how long it’s going to take to get smarter than us?’ Don’t worry about it.”

He mocks the current trend in movies of showing future robots as not only smart but humanoid: beautiful, blonde and blue-eyed, with female voices.

“By the time they are smarter than us, they won’t look like us. A lot of the intelligence already out there is sitting in the cloud; it doesn’t have blue eyes, but it does play a part in our society.

“The funny thing is, we’re worrying about robots, but we’re not worrying about the companies that program them. People are not good at stopping bad things.”

In short, he says real people, rather than artificial intelligence, are the bigger threat.

  • Arthur Goldstuck is founder of World Wide Worx and editor-in-chief of Gadget.co.za. Follow him on Twitter and Instagram on @art2gee

Money talks and electronic gaming evolves

Computer gaming has evolved dramatically in the last two years, as it follows the money, writes ARTHUR GOLDSTUCK in the second of a two-part series.

The clue that gaming has become big business in South Africa was delivered by a non-gaming brand. When Comic Con, an American popular culture convention that has become a mecca for comics enthusiasts, was hosted in South Africa for the first time last month, it used gaming as the major drawcard. More than 45 000 people attended.

The event and its attendance were expected to be a major dampener for the annual rAge gaming expo, which took place just weeks later. Instead, rAge saw only a marginal fall in visitor numbers. No fewer than 34 000 people descended on the Ticketpro Dome for the chaos of cosplay, LAN gaming, virtual reality, board gaming and new video games.

It proved not only that there was room for more than one major gaming event, but also that a massive market exists for the sector in South Africa. And with a large market come numerous gaming niches, some newly emerged and some enduring over the years. One of these, LAN (Local Area Network) gaming, which sees hordes of players camping out at the venue for three days to play each other on elaborate computer rigs, was back as strong as ever at rAge.

MWeb provided an 8Gbps line to the expo to connect all these gamers, and recorded 120TB in downloads and 15TB in uploads – a total that would have used up the entire country’s bandwidth a few years ago.
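
As a rough sanity check on that claim – assuming a three-day expo and the figures quoted above – the short calculation below estimates how much of the line’s theoretical capacity those downloads represent.

```python
# Assumptions: a three-day expo and the figures quoted above (8Gbps line, 120TB down).
line_gbps = 8
expo_seconds = 3 * 24 * 3600                        # three days of LAN gaming
capacity_tb = line_gbps / 8 * expo_seconds / 1000   # Gbps -> GB/s, then total GB -> TB

downloads_tb = 120
print(f"Theoretical maximum over three days: {capacity_tb:.0f} TB")              # ~259 TB
print(f"Share of capacity used by downloads: {downloads_tb / capacity_tb:.0%}")  # ~46%
```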

“LANs are supposed to be a thing of the past, yet we buck the trend each year,” says Michael James, senior project manager and owner of rAge. “It is more of a spectacle than a simple LAN, so I can understand.”

New phenomena, often associated with the flavour of the moment, also emerge every year.

“Fortnite is a good example this year of how we evolve,” says James. “It’s a crazy huge phenomenon and nobody was servicing the demand from a tournament point of view. So rAge and Xbox created a casual LAN tournament that anyone could enter and win a prize. I think the top 10 people got something each round.”

Read on to see how esports is starting to make an impact in gaming.

Blockchain unpacked

Blockchain is generally associated with Bitcoin and other cryptocurrencies, but these are just the tip of the iceberg, says ESET Southern Africa.

This technology was originally conceived in 1991, when Stuart Haber and W. Scott Stornetta described their first work on a chain of cryptographically secured blocks, but it only gained prominence in 2008, with the arrival of Bitcoin. It is now in growing demand for other commercial applications, and its annual growth is expected to reach 51% by 2022 in numerous markets, such as financial institutions and the Internet of Things (IoT), according to MarketWatch.

What is blockchain?

A blockchain is a unique, consensual record that is distributed over multiple network nodes. In the case of cryptocurrencies, think of it as the accounting ledger where each transaction is recorded.

A blockchain transaction is complex and can be difficult to understand if you delve into the inner details of how it works, but the basic idea is simple to follow.

Each block stores:

– A number of valid records or transactions.
– Information referring to that block.
– A link to the previous block and the next block, through each block’s hash – a unique code that can be thought of as the block’s fingerprint.

Accordingly, each block has a specific and immovable place within the chain, since each block contains information from the hash of the previous block. The entire chain is stored in each network node that makes up the blockchain, so an exact copy of the chain is stored in all network participants.
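
To make that linking concrete, here is a minimal sketch in Python – an illustrative example only, not any particular blockchain implementation – in which each block’s hash is computed over its own records plus the previous block’s hash, so its place in the chain is fixed.

```python
import hashlib
import json
from dataclasses import dataclass


@dataclass
class Block:
    index: int            # position of the block within the chain
    records: list         # the valid records or transactions this block stores
    previous_hash: str    # fingerprint of the block that comes before it

    def hash(self) -> str:
        """The block's own fingerprint, derived from everything it contains."""
        payload = json.dumps(
            {"index": self.index, "records": self.records,
             "previous_hash": self.previous_hash},
            sort_keys=True,
        )
        return hashlib.sha256(payload.encode()).hexdigest()


# Each new block stores the hash of the one before it, fixing its place in the chain.
genesis = Block(index=0, records=["genesis"], previous_hash="0" * 64)
block_1 = Block(index=1, records=["Alice pays Bob 5"], previous_hash=genesis.hash())
block_2 = Block(index=2, records=["Bob pays Carol 2"], previous_hash=block_1.hash())
chain = [genesis, block_1, block_2]

print(block_2.previous_hash == block_1.hash())  # True: the blocks are linked by hash
```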

As new records are created, they are first verified and validated by the network nodes and then added to a new block that is linked to the chain.

How is blockchain so secure?

Because it is a distributed technology in which each network node stores an exact copy of the chain, the availability of the information is guaranteed at all times. If an attacker wanted to mount a denial-of-service attack, they would have to take down every network node, since the information remains available as long as even a single node stays operative.

Besides that, because every record is agreed by consensus and all nodes hold the same information, it is almost impossible to alter, which ensures its integrity. If an attacker wanted to modify the information in a blockchain, they would have to modify the entire chain on at least 51% of the nodes.
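
The integrity check itself is simple to illustrate. In the hypothetical sketch below, a node validates its copy of the chain by recomputing each fingerprint; a copy in which any record has been altered no longer matches the stored links and would be rejected.

```python
import hashlib
import json


def fingerprint(block: dict) -> str:
    """Hash of a block's full contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()


def chain_is_valid(chain: list) -> bool:
    """Every block must store the hash of the block immediately before it."""
    return all(
        current["previous_hash"] == fingerprint(previous)
        for previous, current in zip(chain, chain[1:])
    )


genesis = {"records": ["genesis"], "previous_hash": "0" * 64}
block_1 = {"records": ["Alice pays Bob 5"], "previous_hash": fingerprint(genesis)}

honest_copy = [genesis, block_1]
tampered_copy = [dict(genesis, records=["Alice pays Bob 500"]), block_1]

print(chain_is_valid(honest_copy))    # True
print(chain_is_valid(tampered_copy))  # False: the altered block breaks the stored link
```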

In blockchain, data is distributed across all network nodes. With no central node, all nodes participate equally, storing and validating all of the information. It is a very powerful tool for transmitting and storing information in a reliable way: a decentralised model in which the information belongs to us, since we do not need a company to provide the service.

What else can blockchain be used for?

Essentially, blockchain can be used to store any type of information that must be kept intact and remain available, in a way that is secure, decentralised and cheaper than going through intermediaries. Moreover, since the stored information is encrypted, its confidentiality can be guaranteed, as only those who hold the encryption key can access it.
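
As an illustration of that confidentiality point – a hypothetical sketch using the third-party Python cryptography package, not a feature of any specific blockchain – a record can be encrypted before it is written to the chain, so every node stores only ciphertext while key holders can still read the original.

```python
# Requires the third-party "cryptography" package: pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # held only by the parties allowed to read the record
cipher = Fernet(key)

record = b'{"owner": "Jane Doe", "asset": "title deed 1234"}'
stored_on_chain = cipher.encrypt(record)   # every node stores only this ciphertext

# Only someone holding the key can recover the original record.
print(cipher.decrypt(stored_on_chain).decode())
```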

Use of blockchain in healthcare

Health records could be consolidated and stored in a blockchain, for instance. This would mean that the medical history of each patient would be safe and, at the same time, available to every authorised doctor, regardless of the health centre where the patient was treated. Even the pharmaceutical industry could use this technology to verify medicines and prevent counterfeiting.

Use of blockchain for documents

Blockchain would also be very useful for managing digital assets and documentation. Until now, the problem with digital assets has been that everything is easy to copy, but blockchain allows you to record purchases, deeds, documents or any other type of online asset in a way that prevents falsification.

Other blockchain uses

This technology could also revolutionise the Internet of Things (IoT) market, where the challenge lies in the millions of internet-connected devices that supplier companies must manage. In a few years’ time, the centralised model won’t be able to support so many devices, not to mention the fact that many of them are not secure enough. With blockchain, devices can communicate through the network directly, safely and reliably, with no need for intermediaries.

Blockchain allows you to verify, validate, track, and store all types of information, from digital certificates, democratic voting systems, logistics and messaging services, to intelligent contracts and, of course, money and financial transactions.

Without doubt, blockchain has turned the immutable and decentralised layer the internet has always dreamed about into a reality. This technology takes reliance on trust out of the equation and replaces it with mathematical fact.
