Data literacy to the fore in 2017

Last year we saw an explosion of data but, unfortunately, there aren’t enough people with the expertise to handle the increasing levels of data and computing. DAN SOMMER, Senior Director at Qlik, predicts that 2017 will demand a culture-wide change.

Over the past twelve months we’ve seen an explosion of data, a surge in our capacity to process it, and a move towards information activism. As a result, employees who can actively work with – and master – the huge amounts of information available, such as data scientists, application developers, and business analysts, have become a valuable commodity.

However, there still aren’t enough people with the expertise to handle the ever-increasing volumes of data and computing. You would assume, with all the information currently being produced and held by businesses, that 2017 would see us enter a new digital era of facts. But without the right number of specialists to consume and analyse it, there’s a resource gap: data is, unfortunately, growing faster than our ability to make use of it.

For many business leaders, this means relying on gut instinct to make even the most important decisions. Unable to home in on the most relevant insights, they’re presented with multiple – and sometimes conflicting – data points, so even the most important ones seem unreliable.

The situation needs to change. Yes, that will mean upskilling more data scientists in 2017, but there will be a greater focus on empowering more people more broadly. That will go beyond information activists and towards providing more people with the tools and training to increase data literacy. Just as reading and writing skills needed to move beyond scholars 100 years ago, data literacy will become one of the most important business skills for any member of staff.

So, what will change to see culture-wide data literacy become a reality? Here are my predictions:

1.    Combinations of data – Big data will become less about size and more about combinations. With data increasingly fragmented and most of it created externally in the cloud, there will be a cost to hoarding data without a clear purpose. That means we’ll move towards a model where businesses quickly combine their big data with small data, gaining the insights and context needed to extract value as soon as possible. Combining data will also shine a light on false information more easily, improving data accuracy as well as understanding.

2.    Hybrid thinking – In 2017, hybrid cloud and multi-platform will emerge as the primary model for data analytics. Because of where data is generated, the ease of getting started, and the ability to scale, we’re now seeing an accelerated move to the cloud. But one cloud is not enough, because data and workloads won’t sit on a single platform. Data gravity also means that on-premise infrastructure has long staying power. Hybrid, multi-environment set-ups will emerge as the dominant model, meaning workloads and publishing will happen across cloud and on-premise alike.

3.    Self-service for all – Freemium is the new normal, so 2017 will be the year users have easier access to their analytics. More and more data visualisation tools are available at low cost, or even for free, so some form of analytics will become accessible across the workforce. With more people beginning their analytics journey, data literacy rates will naturally increase — more people will know what they’re looking at and what it means for their organisation. That means information activism will rise too.

4.    Scale-up – Largely as a result of its own success, the user-driven data discovery of two years ago has become today’s enterprise-wide BI. In 2017, this will evolve to replace archaic reporting-first platforms. As modern BI becomes the new reference architecture, it will open up self-service data analysis to more people. It also puts different requirements on the back end for scale, performance, governance, and security.

5.    Advancing analytics – In 2017, the focus will shift from “advanced analytics” to “advancing analytics.” Advanced analytics is critical, but the creation, governance, and curation of the models depend on highly skilled experts. Many more people should be able to benefit from those models once they are created, which means bringing them into self-service tools. Analytics can also be advanced by embedding greater intelligence into software, removing complexity and chaperoning insights. But the analytical journey shouldn’t be a black box or too prescriptive. There is a lot of hype around “artificial intelligence,” but it will often serve best as an augmentation of, rather than a replacement for, human analysis, because asking the right questions is just as important as providing the answers.

6.    Visualisation as a concept will move from analysis-only to the whole information supply chain – Visualisation will become a strong component in unified hubs that take a visual approach to information asset management, as well as visual self-service data preparation, underpinning the actual visual analysis. Furthermore, progress will be made in having visualisation as a means to communicate our findings. The net effect of this is increased numbers of users doing more in the data supply chain.

7.    Focus will shift to custom analytic apps and analytics in the app – Not everyone will – or can – be both a producer and a consumer of apps, but everyone should be able to explore their own data. Data literacy will therefore benefit from analytics meeting people where they are, with applications developed to support them in their own context and situation, as well as from the analytics tools we use when setting out to do data analysis. As such, open, extensible tools that can easily be customised and contextualised by application and web developers will make further headway.

These trends lay the foundation for increased levels of not just information activism, but also data literacy. After all, new platforms and technologies that can catch “the other half” (i.e., less skilled information workers and operational workers on the go) will help usher us into an era where the right data becomes connected with people and their ideas — that’s going to close the chasm between the levels of data we have available and our ability to garner insights from it. Which, let’s face it, is what we need to put us on the path toward a more enlightened, information-driven, and fact-based era.

Liquid, IS partner for 5G roll-out to corporate SA

Liquid Telecom has teamed up with Internet Solutions to develop an ultra-fast wholesale connectivity service for enterprises – including telcos

Liquid Telecom South Africa has partnered with Internet Solutions (IS) to provide wholesale 5G connectivity targeted at delivering enterprise services to their existing and potential new customer bases.  

The 5G service will provide operators and internet service providers with faster speeds, lower latency and greater capacity, ultimately enabling businesses to deliver richer experiences to their customers.

“Providing IS with 5G wholesale services as an alternative to fibre connectivity, Liquid Telecom South Africa is highlighting how we are delivering on our commitment to the market to continue being the best business network in South Africa,” says Reshaad Sha, CEO of Liquid Telecom South Africa. “Local businesses are adopting technologies like SD-WAN, IoT, and cloud computing. However, these technologies need network connectivity that provides high quality, increased capacity, and greater reliability to ensure optimum performance.”

IS managing executive Dr Setumo Mohapi says the company has evolved its networking model to provide a high-performance hybrid network that aggregates multiple WAN transport services.

“This enables clients to fully utilise all available bandwidth for high availability and total application performance,” he says. “The innovation, flexibility and range of 5G use cases that this offers for different industries such as agriculture, retail, manufacturing, and logistics is boundless. 5G is a core component of our hybrid network and we are extremely excited about the extended capability this partnership with Liquid enables us to offer our clients.”

Liquid Telecom is the first to launch a 5G wholesale network service, which it says will “accelerate the building of Africa’s digital future and the digital revolution in South Africa”.

Liquid Telecom is a leading communications solutions provider across 13 countries, primarily in Eastern, Southern and South Africa. It serves mobile operators, carriers, enterprises, and media and content companies, as well as retail customers, with high-speed, reliable connectivity, hosting and co-location, and digital services. This means it can provide the basis for its clients to offer 5G services to end-users.

Liquid has built Africa’s largest independent fibre network, approaching 70,000km, and operates state-of-the-art data centres in Johannesburg, Cape Town and Nairobi.

IS, which pioneered Internet connectivity in South Africa, is a subsidiary of the Dimension Data Group and part of Japanese telecoms giant NTT. It now leverages its infrastructure and global footprint to support organisations with the rapid deployment of emerging technologies. Still headquartered in South Africa, it has operating offices in Mozambique, Uganda, Ghana, Kenya and Nigeria. It has 82 Points of Presence (PoPs) in 19 African countries and four international PoPs in London, Germany, Hong Kong and Singapore. The company has over 10 000 square metres of data centre space across Africa.

So you think you need a Blockchain?

By CAYLE SHARROCK, Head of Engineering at Tari Labs

It’s 2020, and we’re still in hype overdrive about blockchain. If conventional wisdom is to be believed, blockchain is going to revolutionise and disrupt every industry known to humankind.

But does every industry actually need a blockchain? Let’s take an objective look at two of the most aggressively touted use cases for Blockchain to see if it’s all it’s cracked up to be.

Before we do this, let’s remind ourselves about the four pillars of Blockchain technology and what they give you: tamper-evident logs (the blockchain); cryptographic proof of ownership (digital signatures); public accountability (the distributed public ledger); and corruption resistance (proof of work).
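As a rough illustration of the first of those pillars, here is a minimal Python sketch of a tamper-evident log (the record fields and values are invented for the example): each record’s hash folds in the hash of the record before it, so quietly editing history breaks every link that follows.

import hashlib
import json

def entry_hash(entry, prev_hash):
    # Hash the entry together with the previous hash, forming the chain.
    payload = json.dumps(entry, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

def append(log, entry):
    prev = log[-1]["hash"] if log else "0" * 64
    log.append({"entry": entry, "hash": entry_hash(entry, prev)})

def verify(log):
    # Recompute every link; an edit to any earlier entry invalidates all later hashes.
    prev = "0" * 64
    for block in log:
        if block["hash"] != entry_hash(block["entry"], prev):
            return False
        prev = block["hash"]
    return True

log = []
append(log, {"from": "alice", "to": "bob", "amount": 10})
append(log, {"from": "bob", "to": "carol", "amount": 4})
assert verify(log)
log[0]["entry"]["amount"] = 1000   # tamper with history...
assert not verify(log)             # ...and verification fails

A real blockchain layers the other three pillars on top: signatures to prove who appended each entry, replication so everyone holds a copy of the ledger, and proof of work to make rewriting the chain prohibitively expensive.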

If we use these four features as a checklist, we can evaluate any proposed use case of blockchain technology and decide whether the potential is genuine, or whether it’s just buzzword bingo.

Banking

There have been hundreds of headlines over the past four years proclaiming how Bank Y will use Blockchain to disrupt the industry. Usually, what they claim is that they can perform interbank settlements at a fraction of the cost of what the incumbent monopoly, SWIFT, provides.

So does Blockchain work for the banking sector? Clearly, tamper detection of the transaction history is a must-have here. What about digital signatures and proof of ownership? Without a doubt. Multiple signatures? The more the merrier.

Bitcoin was conceived as trustless money – but with banks, we have a fairly small, heavily regulated community whose members do actually trust each other to some degree. Essentially, banks use governments’ big stick instead of proof-of-work to keep everyone honest. This works most of the time. Except when it doesn’t. The 2008 crisis and the 2012 Cypriot haircuts are just two examples.

How about Public Accountability from distributed public records? No, public accountability has never been the banking sector’s strong suit. That means the banks’ ideal “blockchain” is just tamper detection, plus digital signatures. This sounds like a bunch of databases that have tightly controlled access along with strong cryptographic signatures.

The banks actually gave this non-Blockchain blockchain a name: Distributed Ledger Technology. And it’s pretty much what SWIFT already does.
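In code terms, the banks’ wish list reduces to something like this minimal sketch, which uses Ed25519 signatures from the third-party Python cryptography package (the bank names and amounts are invented for the example): one party signs a settlement record, and a counterparty verifies it.

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Each bank holds a private signing key; counterparties hold the public halves.
bank_key = Ed25519PrivateKey.generate()
bank_pub = bank_key.public_key()

record = b'{"from": "Bank A", "to": "Bank B", "amount": 1000000}'
signature = bank_key.sign(record)

# A counterparty (or regulator) can check that the record is unaltered and
# really came from Bank A -- tamper detection plus proof of origin, no mining.
try:
    bank_pub.verify(signature, record)
    print("settlement record verified")
except InvalidSignature:
    print("settlement record rejected")

Notice what’s missing: no public ledger, no proof of work. Access-controlled databases plus strong signatures cover everything the banks asked for.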

Verdict: Do banks need Blockchain? Nah. They want a cheaper alternative to SWIFT.

Supply-chain management

Blockchain technology is going to revolutionise the supply-chain management (SCM) industry, we’re told. BHP Billiton was one of the first large companies to announce in 2016 that they were implementing Blockchain for their core sample supply chain. We’ve heard similar stories about the diamond industry.

Whether you think a proof-of-work Blockchain makes sense for SCM is really secondary to the Oracle problem: blockchains are brilliant at letting you know when data in the system has been compromised, but they have zero sense of whether that data is true.

The Oracle problem arises whenever you need to bring the concept of truth, or provenance, from the real world into a trustless system like Blockchain. How does the core sample data get onto the blockchain ledger? Does a guy type it in? Does he never make mistakes? Can he be bribed to type in something else? If it’s a totally automated system, can it fail? Be hacked?

Maybe we solve this by having two systems running and we compare the results. Or three. Or four. Now we have the problem of having to ship our samples to different labs around the world and be sure they weren’t tampered with in transit. If only we had a blockchain-based SCM system to secure our blockchain-based SCM system …

Verdict: The Oracle problem is really hard, and it torpedoes a lot of blockchain proposals based on tangible goods.

So, back to our original question: do you need a blockchain? Ultimately, the future of blockchain applications (beyond money) lies in whether the benefits of a decentralised, public record secured by proof-of-work outweigh the costs. There are plenty of really encouraging use cases emerging – think ticketing, for example, or trading in any digital asset. But for most industries, the jury’s still out.
