How AI will fix noisy data

The potential impact of Artificial Intelligence (AI) has never been greater — but we’ll only be successful if AI can deliver smarter and more intuitive answers, writes DR. MICHAEL MAYBERRY.

A key barrier to AI today is that natural data fed to a computer is largely unstructured and “noisy.”

Humans sort through natural data easily. For example, if you are driving down a residential street and see a ball roll into the road ahead, you stop, assuming a small child is probably not far behind it. Today's computers don't reason this way; they are built to assist humans with precise productivity tasks. Making computers efficient at dealing with probabilities at scale is central to transforming current systems and applications from advanced computational aids into intelligent partners for understanding and decision-making.

This is why probabilistic computing is a key component of AI and central to addressing these challenges. Probabilistic computing will allow future systems to comprehend and compute with the uncertainties inherent in natural data, enabling us to build computers capable of understanding, prediction and decision-making.
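As a concrete illustration, consider a minimal Bayesian update for the rolling-ball scenario above. The sketch below uses Python with made-up prior and likelihood values; it is meant only to show how observing noisy evidence can shift a belief, and does not describe any Intel system.

```python
# Illustrative sketch only: a toy Bayesian update for the rolling-ball
# scenario. All probabilities here are assumed numbers chosen for
# illustration, not data from any real system.

# Prior belief that a child is nearby on a residential street.
p_child = 0.01

# Likelihood of seeing a ball roll into the road, with and without
# a child nearby.
p_ball_given_child = 0.60
p_ball_given_no_child = 0.001

# Bayes' rule: update the belief after observing the ball.
p_ball = (p_ball_given_child * p_child
          + p_ball_given_no_child * (1 - p_child))
p_child_given_ball = p_ball_given_child * p_child / p_ball

print(f"P(child nearby | ball seen) = {p_child_given_ball:.2f}")
# Even with a 1% prior, the posterior jumps to roughly 0.86,
# which is why a cautious driver brakes.
```

A single uncertain observation, combined with prior knowledge, is enough to change the right decision; computing with such probabilities at scale is the point of the paragraph above.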

Today at Intel, we are observing unprecedented growth in applications that rely on the analysis of noisy natural data: information that is incomplete, ambiguous and even contradictory. Such applications aim to give humans a higher level of intelligence and awareness about the environments in which they operate. Cutting through this noise is central to our ability to transform computers into intelligent partners that can understand and act on information with human-like fidelity.
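One textbook way to reconcile conflicting noisy readings is inverse-variance weighting of Gaussian estimates. The sketch below is illustrative only, with assumed sensor values, and does not describe any Intel product.

```python
# Illustrative sketch only: fusing two noisy, conflicting sensor
# estimates with inverse-variance weighting (a standard Gaussian
# fusion rule). Sensor values are assumed for illustration.

def fuse(mu_a: float, var_a: float, mu_b: float, var_b: float):
    """Combine two Gaussian estimates of the same quantity."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    mu = (w_a * mu_a + w_b * mu_b) / (w_a + w_b)
    var = 1.0 / (w_a + w_b)
    return mu, var

# Two sensors disagree about the distance to an obstacle (meters).
mu, var = fuse(mu_a=4.0, var_a=0.25, mu_b=5.0, var_b=1.0)
print(f"fused estimate: {mu:.2f} m, variance {var:.2f}")
# The fused estimate (4.20 m) leans toward the more confident
# sensor, and its variance (0.20) is lower than either input.
```

Rather than discarding one of the conflicting readings, a probabilistic system weighs each by its reliability and ends up more certain than either source alone.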

Probabilistic computing is not a new area of study, but improvements in high-performance computing and deep learning algorithms may carry it into a new era. In the next few years, we expect research in probabilistic computing to yield significant improvements in the reliability, security, serviceability and performance of AI systems, including hardware designed specifically for probabilistic computing. These advancements are critical to deploying applications into the real world, from smart homes to smart cities.

To accelerate this work, Intel is increasing its research investment in probabilistic computing and partnering with others to pursue this goal.

Establishing the Intel Strategic Research Alliance for Probabilistic Computing

Realizing the full potential of probabilistic computing requires holistic integration across multiple levels of computing technology. Today, Intel underscored its commitment to emerging computing architectures and a sound ecosystem-enablement strategy by issuing a call to the academic and start-up communities to partner with us in advancing probabilistic computing from the lab to reality across four vectors: benchmark applications, adversarial-attack mitigation, probabilistic frameworks, and software and hardware optimization.

An Eye on What’s Next

We are eager to see proposals for advancing probabilistic computing and to continue research that could raise the bar for what AI can help us achieve.

We began this journey with research into neuromorphic computing, which draws on our understanding of the human brain and its computational processes. The neuromorphic research community announced on March 1 is on track, and we plan to continue scaling up cloud access to our Loihi research chip so researchers can work with cutting-edge hardware. We see a path to reaching 100 billion synapses on a single system in 2019.
