IBM opens Quantum Computing

IBM Research has announced that for the first time ever it is making quantum computing available to members of the public, who can access and run experiments on IBM’s quantum processor.

IBM scientists have built a quantum processor that users can access through a first-of-a-kind quantum computing platform delivered via the IBM Cloud onto any desktop or mobile device. IBM believes quantum computing is the future of computing and has the potential to solve certain problems that are impossible to solve on today’s supercomputers.

The cloud-enabled quantum computing platform, called IBM Quantum Experience, will allow users to run algorithms and experiments on IBM’s quantum processor, work with the individual quantum bits (qubits), and explore tutorials and simulations around what might be possible with quantum computing.

The quantum processor is composed of five superconducting qubits and is housed at the IBM T.J. Watson Research Center in New York. The five-qubit processor represents the latest advancement in IBM’s quantum architecture, which can scale to larger quantum systems and is the leading approach towards building a universal quantum computer.

A universal quantum computer can be programmed to perform any computing task and will be exponentially faster than classical computers for a number of important applications for science and business.

A universal quantum computer does not exist today, but IBM envisions medium-sized quantum processors of 50-100 qubits becoming possible in the next decade. None of today’s TOP500 supercomputers could successfully emulate a quantum computer built of just 50 qubits, which reflects the tremendous potential of this technology. The community of quantum computer scientists and theorists is working to harness this power, and applications in optimization and chemistry will likely be the first to demonstrate quantum speed-up.
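
To see why emulation becomes infeasible, consider a rough back-of-the-envelope sketch (illustrative only, not an IBM figure): simulating n qubits on a classical machine requires storing 2^n complex amplitudes, and at 16 bytes per amplitude a 50-qubit state already needs on the order of 18 petabytes of memory.

```python
# Back-of-the-envelope estimate (illustrative only): memory needed to hold
# the full state vector of an n-qubit register on a classical computer.
# An n-qubit state is described by 2**n complex amplitudes; storing each
# one as a double-precision complex number takes 16 bytes.

def state_vector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (5, 30, 50):
    size = state_vector_bytes(n)
    print(f"{n:2d} qubits: {size:,} bytes (~{size / 1e15:.2g} petabytes)")

# 50 qubits comes to roughly 1.8e16 bytes, i.e. about 18 petabytes --
# far more memory than any single supercomputer offers today.
```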

“Quantum computers are very different from today’s computers, not only in what they look like and are made of, but more importantly in what they can do. Quantum computing is becoming a reality and it will extend computation far beyond what is imaginable with today’s computers,” said Arvind Krishna, senior vice president and director, IBM Research. “This moment represents the birth of quantum cloud computing. By giving hands-on access to IBM’s experimental quantum systems, the IBM Quantum Experience will make it easier for researchers and the scientific community to accelerate innovations in the quantum field, and help discover new applications for this technology.”

With Moore’s Law running out of steam, quantum computing will be among the technologies that could usher in a new era of innovation across industries. This leap forward in computing could lead to the discovery of new pharmaceutical drugs and help safeguard cloud computing systems. It could also unlock new facets of artificial intelligence (potentially leading to more powerful future Watson technologies), enable new materials science to transform industries, and make it practical to search very large volumes of data.

IBM Quantum Experience

Quantum information is very fragile and needs to be protected from any errors that can result from heat and electromagnetic radiation. Signals are sent in and out of a cryogenic dilution refrigerator to measure operations on the quantum processor.

The IBM team has made a number of robust engineering advances both at the device level and in the electronic controls to give IBM Quantum Experience users unprecedented and reliably high-quality performance in this five-qubit processor.

Coupled with software expertise from the IBM Research ecosystem, the team has built a dynamic user interface on the IBM Cloud platform that allows users to easily connect to the quantum hardware via the cloud. The team sees the public introduction of this complete quantum computing framework as just the start of a new user community that embraces the quantum world and how it works.

In the future, users will have the opportunity to contribute and review their results in the community hosted on the IBM Quantum Experience, and IBM scientists will be directly engaged to offer more research and insights on new advances. IBM plans to add more qubits and different processor arrangements to the IBM Quantum Experience over time, so users can expand their experiments and help uncover new applications for the technology.

Quantum computing – a different way of thinking

We live in a world where classical physics defines our experiences and our intuition, and ultimately how we process information. However, nature at the atomic level is governed by a different set of rules known as quantum mechanics. It is beyond the reach of classical computers to solve problems that exist in nature in which quantum mechanics plays a role, for example, understanding how molecules behave.

To overcome this, in 1981, Richard Feynman proposed to build computers based on the laws of quantum mechanics. Over three decades later, IBM is helping to make this a reality.

Quantum computing works fundamentally differently from today’s computers. A classical computer makes use of bits to process information, where each bit represents either a one or a zero. In contrast, a qubit can represent a one, a zero, or both at once, which is known as superposition. This property, along with other quantum effects, enables quantum computers to perform certain calculations vastly faster than is possible with classical computers.
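
To make superposition a little more concrete, here is a toy simulation in Python/NumPy (an illustrative sketch only, not the IBM Quantum Experience or its interface): a single qubit starts in the classical-like state |0>, a Hadamard gate places it in an equal superposition, and repeated measurements then yield 0 or 1 with roughly equal probability.

```python
import numpy as np

# Toy single-qubit simulation (illustrative sketch, not IBM's platform).
# A qubit is a 2-component complex state vector; |0> is (1, 0).
ket0 = np.array([1.0, 0.0], dtype=complex)

# The Hadamard gate puts |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
state = H @ ket0

# Born rule: measurement probabilities are the squared amplitude magnitudes.
probs = np.abs(state) ** 2
print("P(0), P(1):", probs)                        # -> roughly 0.5 each

# Each simulated measurement collapses the qubit to a definite 0 or 1.
shots = np.random.choice([0, 1], size=1000, p=probs)
print("counts of 0 and 1:", np.bincount(shots))    # roughly 500 / 500
```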

Most of today’s quantum computing research in academia and industry is focused on building a universal quantum computer. The major challenges include creating high-quality qubits and packaging them together in a scalable fashion so that they can perform complex calculations in a controllable way.

IBM employs superconducting qubits that are made with superconducting metals on a silicon chip and can be designed and manufactured using standard silicon fabrication techniques. Last year, IBM scientists demonstrated critical breakthroughs in detecting quantum errors by combining superconducting qubits in lattice arrangements, a quantum circuit design that is the only physical architecture able to scale to larger dimensions.

Now, IBM scientists have achieved a further advance by combining five qubits in the lattice architecture, which demonstrates a key operation known as a parity measurement – the basis of many quantum error correction protocols. The road towards universal quantum computing hinges upon the achievement of quantum error correction, and the IBM team has taken another important step down this challenging path.
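
The idea behind a parity measurement can be sketched with a small NumPy example (a toy model, not IBM’s five-qubit lattice implementation): the Z⊗Z parity of two qubits reveals whether they agree or disagree without revealing their individual values, so a bit-flip error shows up as a change in parity – the basic mechanism exploited by many error correction protocols.

```python
import numpy as np

# Toy sketch of a parity (Z⊗Z) measurement (not IBM's lattice hardware).
# The ZZ parity reports whether two qubits agree (+1) or disagree (-1)
# without revealing either qubit's individual value.
Z = np.array([[1, 0], [0, -1]], dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
I = np.eye(2, dtype=complex)
ZZ = np.kron(Z, Z)

# Entangled state (|00> + |11>) / sqrt(2): the qubits agree, so parity = +1.
bell = np.zeros(4, dtype=complex)
bell[0] = bell[3] = 1 / np.sqrt(2)
print("parity before error:", np.real(bell.conj() @ ZZ @ bell))        # +1.0

# A bit-flip error on the second qubit makes them disagree: parity = -1,
# flagging the error without directly measuring either data qubit.
flipped = np.kron(I, X) @ bell
print("parity after error: ", np.real(flipped.conj() @ ZZ @ flipped))  # -1.0
```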

New frontiers for quantum computing

There has been tremendous progress and interest in the field of quantum computing in recent years. Giving users access to the IBM Quantum Experience will help businesses and organizations begin to understand the technology’s potential, allow universities to grow their teaching programs in quantum computing and related subjects, and make students aware of promising new career paths.

“It is a beautiful challenge to pursue the path to build the first universal quantum computer, but it requires us to change how we think about the world. Access to early quantum computing prototypes will be key in imagining and developing future applications,” said Dario Gil, vice president of science and solutions, IBM Research. “If you want to understand what a true quantum computer will do for you and how it works, this is the place to do it. You won’t experience it anywhere else.”

IBM’s quantum computing platform is a core initiative within the newly formed IBM Research Frontiers Institute. The Frontiers Institute is a consortium that develops and shares ground-breaking computing technologies to spur world-changing innovations. Companies from diverse industries can leverage IBM’s research talent and cutting-edge infrastructure to explore what the future of quantum computing may mean for their organization and business. Founding members of the Frontiers Institute include Samsung, JSR, and Honda.

To access the IBM Quantum Experience and for more information on IBM’s quantum computing research, please visit www.ibm.com/quantumcomputing. To learn more about the IBM Research Frontiers Institute, please visit www.ibm.com/frontiers.

Note to journalists and bloggers: You can view and download b-roll on IBM’s quantum computing efforts at http://www.thenewsmarket.com/ibm. The video is available in HD, standard definition broadcast and streaming quality.

VoD cuts the cord in SA

Some 20% of South Africans who sign up for a subscription video on demand (SVOD) service such as Netflix or Showmax do so with the intention of cancelling their pay television subscription.

That’s according to GfK’s international ViewScape survey*, which this year covers Africa (South Africa, Kenya and Nigeria) for the first time.

The study – which surveyed 1,250 people representative of urban South African adults with Internet access – shows that 90% of the country’s online adults today use at least one online video service and that just over half are paying to view digital online content. The average user spends around seven hours and two minutes a day consuming video content, with broadcast television accounting for just 42% of the time South Africans spend in front of a screen.

Consumers in South Africa spend nearly as much of their daily viewing time – 39% of the total – watching free digital video sources such as YouTube and Facebook as they do watching linear television. People aged 18 to 24 spend more than eight hours a day watching video content, as they tend to spend more time with free digital video than older viewers do.

Says Benjamin Ballensiefen, managing director for Sub Sahara Africa at GfK: “The media industry is experiencing a revolution as digital platforms transform viewers’ video consumption behaviour. The GfK ViewScape study is one of the first to not only examine broadcast television consumption in Kenya, Nigeria and South Africa, but also to quantify how linear and online forms of content distribution fit together in the dynamic world of video consumption.”

The study finds that just over a third of South African adults use SVOD services, with only 16% of SVOD users subscribing to multiple services. Around 23% use pay-per-view platforms such as DSTV Box Office, while about 10% download pirated content from the Internet. Around 82% still sometimes watch content on disc-based media.

“Linear and non-linear television both play significant roles in South Africa’s video landscape, though disruption from digital players poses a growing threat to the incumbents,” says Molemo Moahloli, general manager for media research & regional business development at GfK Sub Sahara Africa. “Among most demographics, usage of paid online content is incremental to consumption of linear television, but there are signs that younger consumers are beginning to substitute SVOD for pay-television subscriptions.”

New data rules raise business trust challenges

When the General Data Protection Regulation comes into effect on May 25th, financial services firms will face a new potential threat to their on-going challenges with building strong customer relationships, writes DARREL ORSMOND, Financial Services Industry Head at SAP Africa.

The regulation – dubbed GDPR for short – is aimed at giving European citizens back control over their personal data. Any firm that creates, stores, manages or transfers the personal information of an EU citizen can be held liable under the new regulation. Non-compliance is not an option: the fines are steep, with a maximum penalty for transgressors of €20-million – or nearly R300-million – or up to 4% of global annual turnover, whichever is greater.

GDPR marks a step toward strengthening individual rights relative to large corporates and states, preventing the latter from using and abusing personal information at their discretion. Considering the prevailing trust deficit – one global EY survey found that 60% of global consumers worry about hacking of bank accounts or bank cards, and 58% worry about the amount of personal and private data organisations have about them – the new regulation comes at an opportune time. But it is almost certain to disrupt normal business practices when implemented, and therein lies both a threat and an opportunity.

The fundamentals of trust

GDPR is set to disrupt two fundamental factors underpinning the implicit trust between financial services providers and their customers. Firstly, customers will suddenly be challenged to validate that what they thought companies were already doing – storing and managing their personal data in a manner that respects their privacy – is actually happening. Secondly, the steady stream of stories about companies mistreating customer data or exposing customers through security breaches will make it more likely that customers seek tangible reassurance from their providers that their data is stored correctly.

The recent news of Facebook’s indiscriminate sharing of the personal data of 50 million of its members with an outside firm has not only led to public outcry, but could also cost the company up to $2-trillion in fines should the Federal Trade Commission choose to pursue the matter to its fullest extent. The matter of trust also extends beyond personal data: in EY’s 2016 Global Consumer Banking Survey, less than a third of respondents had complete trust that their banks were being transparent about fees and charges.

This is forcing companies to reconsider their role in building and maintaining trust with their customers. In any customer relationship, much is done on the basis of implicit trust. A personal banking customer will enjoy a measure of familiarity that often provides them with some latitude – for example when applying for access to a new service or an overdraft facility – and that can save them a lot of time and energy. Under GDPR and South Africa’s POPI Act, this process is drastically complicated: banks may now be obliged to obtain permission to share customer data between different business units (for example because those units are part of different legal entities and have not expressly received permission). A customer may now allow a bank to use their personal data in risk scoring models, but prevent it from determining whether they qualify for private banking services.

What used to happen naturally within standard banking processes may be suddenly constrained by regulation, directly affecting the bank’s relationship with its customers, as well as its ability to upsell to existing customers.

The risk of compliance

Are we moving to an overly bureaucratic world where even the simplest action is subject to a string of onerous processes? Compliance officers are already embedded within every function of a typical financial services institution, as well as at management level. Often the reporting of risk processes sits outside formal line functions and ends up going straight to the board. This can have a stifling effect on innovation, with potentially negative consequences for customer service.

A typical banking environment is already creaking under the weight of close to 100 acts, which makes it difficult to take the calculated risks needed to develop and launch innovative new banking products. Entire new industries could now emerge, focusing purely on the matter of compliance and associated litigation. GDPR already requires the services of Data Protection Officers, but the growing complexity of regulatory compliance could add a swathe of new job functions and disciplines. None of this points to the type of innovation that the modern titans of business are renowned for.

A three-step plan of action

So how must banks and other financial services firms respond? I would argue there are three main elements to successfully navigating the immediate impact of the new regulations:

Firstly, ensuring that the technologies used to secure, manage and store personal data are sufficiently robust. Modern financial services providers have a wealth of customer data at their disposal, including unstructured data from non-traditional sources such as social media. The tools they use to process and safeguard this data need to be able to withstand the threats posed by potential data breaches and malicious attacks.

Secondly, rethinking the core organisational processes governing interactions with customers. This includes the internal measures for setting terms and conditions, how customers are informed of the firm’s intention to use their data, and how risk is assessed. A customer applying for medical insurance will disclose deeply personal information to the insurance provider: it is imperative that the insurer reassures the customer that their data will be treated respectfully, with discretion, and only with their express permission.

Thirdly, financial services firms need to define a core set of principles for how they treat customers and what constitutes fair treatment. This should be an extension of a broader organisational focus on treating customers fairly, and it can go some way towards repairing the trust deficit between the financial services industry and the customers it serves.
