Cloud technology is gaining traction as more internet-enabled devices become available. But the key to a successful cloud deployment lies in how it is managed, and in deciding which processes are hosted in the cloud and which are kept locally, writes JOHAN SCHEEPERS of CommVault.
While cloud technology gains traction through the big data boom, IT leaders are questioning how they can maximise value from cloud computing while still maintaining data security and control. Data continues to grow in volume and complexity, particularly as it streams in from millions of new internet-enabled devices, virtual machines and cloud-enabled business-critical applications.
Although the cloud has been around for a few years now, organisations are only recently starting to understand what level of cloud adoption makes sense for their business needs. The cloud has also gone through significant developments, with hybrid models becoming the favoured approach, enabling organisations to benefit from the agility offered by public clouds while maintaining control of sensitive data on-premises.
A hybrid approach to cloud
Planning a journey to the cloud, whether private, public or both, is daunting for any organisation. There is the promise of greater business agility and low upfront investment, but if the move is not handled systematically and driven by insights gleaned from your data, it can actually increase cost and complexity. Some organisations are experiencing issues ranging from egress costs to wasteful utilisation to complex, siloed management. By starting with insights from your data, you can better understand which workloads and applications are most appropriate for public cloud, private cloud or on-premises hosting, and deploy a successful hybrid model.
Having on-premises, private infrastructure directly accessible means not having to go via the public internet for everything, which can greatly reduce access time and latency compared with public cloud services. The hybrid cloud model gives organisations on-premises compute and storage infrastructure for processing data that requires extra speed or high availability, combined with the benefits of the public cloud when a workload exceeds the computational power of the private cloud component.
Expanding the private component of a hybrid cloud also allows for flexibility in virtual server design. Organisations can automate the entire virtual machine lifecycle, from provisioning through to archiving older VMs to the cloud.
Another benefit of the hybrid model is the increased connectivity and collaboration offered to employees – which can often be a challenge in today’s digital world. The ability for teams to easily and securely share files should be coupled with the integration of remote workers into core business processes, such as internal messaging, scheduling, edge protection (laptops, tablets, etc), business intelligence and analytics.
Although the benefits of adopting a hybrid approach are clear, it can still be difficult to know where to start. CIOs need to look at how they can introduce a hybrid model that delivers deeply integrated cloud automation and orchestration tools, ensuring compatibility across cloud solutions and on-premises infrastructure. It is recommended that organisations look for a low-risk, high-value first step to the cloud, and disaster recovery fits that bill. Particularly in India, our service provider partners are seeing strong demand for Disaster-Recovery-as-a-Service and Backup-as-a-Service as clear entry points into the cloud for businesses.
The hybrid environment is fast emerging as the norm for many CIOs. However, the key to successfully deploying a hybrid cloud model is understanding which workloads and applications are most appropriate for which hosting, and leveraging a single integrated console with an enterprise-wide view of data across these infrastructures. IT leaders can then better control where data is processed and maximise cost savings by identifying reasonable spend in relation to the value that data offers to the business.
The spending shift – from Capex to Opex
While cloud computing promised cost savings, increasingly we are seeing headlines like the Wall Street Journal’s “The Hidden Waste and Expense of Cloud Computing” or CFO Mag’s “Cloud Computing’s Wasteland”. So what’s actually happening?
Due to a lack of controls to help track and manage utilisation, businesses are being hit with unexpected costs, typically an unusually large bill from their cloud provider after instances are left running. In the traditional capital expense (CapEx) model we are all used to, we invest heavily upfront in hardware and software. With the cloud subscription model, we can build a data centre with a credit card in a predictable operational expense (OpEx) model – which is wonderful in theory, until the bill shows up. As organisations mainstream public cloud, they are exposing holes in the maturity of their management processes and controls: developers deploy VMs at will and do not take down workloads when they are finished.
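A minimal sketch of the missing control might look like the following. The instance records, hourly rate and idle threshold are all illustrative assumptions, not any provider's real billing API:

```python
# Flag cloud instances that are still running but effectively idle,
# and estimate the spend they represent. Numbers are illustrative.
HOURLY_RATE = 0.50          # assumed cost per instance-hour, in USD
IDLE_CPU_THRESHOLD = 5.0    # average CPU % below which an instance counts as idle

instances = [
    {"id": "dev-build-7", "state": "running", "avg_cpu": 1.2,  "hours_up": 720},
    {"id": "prod-db-1",   "state": "running", "avg_cpu": 63.0, "hours_up": 720},
    {"id": "test-rig-3",  "state": "stopped", "avg_cpu": 0.0,  "hours_up": 0},
]

idle = [i for i in instances
        if i["state"] == "running" and i["avg_cpu"] < IDLE_CPU_THRESHOLD]
wasted = sum(i["hours_up"] * HOURLY_RATE for i in idle)

for i in idle:
    print(f"{i['id']}: ~${i['hours_up'] * HOURLY_RATE:.2f} spent while idle")
print(f"total avoidable spend this month: ~${wasted:.2f}")
```

Even a crude report like this, run daily against the provider's usage data, surfaces the forgotten dev VM before it turns into the end-of-month bill shock described above.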
To address this growing concern, IT leaders need a data and information management strategy that enables them to capture a workload at the point of creation and attach data management services at that point. To support hybrid models, those services need to stay with the workload as it moves between on-premises, hosted private cloud, hybrid and public cloud environments.
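As a rough illustration of policy-at-creation, here is a minimal sketch in which a policy object is attached when the workload is created and simply travels with it between environments. The Workload and Policy shapes are hypothetical, not a CommVault API:

```python
from dataclasses import dataclass

@dataclass
class Policy:
    backup_schedule: str
    retention_days: int

@dataclass
class Workload:
    name: str
    location: str
    policy: Policy  # attached at the point of creation, never detached

    def move(self, new_location):
        # Only the location changes; the policy object moves with the workload.
        self.location = new_location
        return self

wl = Workload("payroll-app", "on-premises", Policy("daily", 365))
wl.move("hosted-private-cloud").move("public-cloud")
print(wl.location, wl.policy.retention_days)  # same policy, wherever it runs
```

The design point is that the policy is a property of the workload rather than of the environment, so a migration never leaves data unprotected.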
Lastly, data is only useful when we are able to gain value from it, whether in the cloud or on-premises. Starting with backup and recovery, organisations can fast-track into more advanced use cases such as dev/test solutions. Here emerges the hybrid data analytics strategy: ‘analytics with purpose’ will be a guiding principle for businesses moving forward. Whether the aim is to introduce a business intelligence project or take an advanced analytics strategy to the next level, organisations leveraging a hybrid cloud model will have the opportunity to make more intelligent choices about the structured and unstructured data in their environment. They will be able to quickly mitigate the risk of compliance-related issues and regain valuable storage space, freeing up budgets to pursue opportunities that can power business growth.
* Johan Scheepers, Principal Systems Engineer at CommVault
When will we stop calling them phones?
If you don’t remember when phones were only used to talk to people, you may wonder why we still use this term for handsets, writes ARTHUR GOLDSTUCK, on the eve of the 10th birthday of the app.
Do you remember when handsets were called phones because, well, we used them to phone people?
It took 120 years from the invention of the telephone to the use of phones to send text.
Between Alexander Graham Bell coining the term “telephone” in 1876 and Finland’s two main mobile operators allowing SMS messages between consumers in 1995, only science fiction writers and movie-makers imagined instant communication evolving much beyond voice. Even when BlackBerry shook the business world with email on a phone at the end of the last century, most consumers were adamant they would stick to voice.
It’s hard to imagine today that the smartphone as we know it has been with us for less than 10 years. Apple introduced the iPhone, the world’s first mass-market touchscreen phone, in June 2007, but it is arguable that it was the advent of the app store in July the following year that changed our relationship with phones forever.
That was the moment when the revolution in our hands truly began, when it became possible for a “phone” to carry any service that had previously existed on the World Wide Web.
Today, most activity carried out by most people on their mobile devices would probably follow the order of social media in first place – Facebook, Twitter, Instagram and LinkedIn all jostling for attention – and instant messaging in close second, thanks to WhatsApp, Messenger, Snapchat and the like. Phone calls – using voice, that is – probably don’t even take third place, but play fourth or fifth fiddle to mapping and navigation, driven by Google Maps and Waze, and transport, thanks to Uber, Taxify, and other support services in South Africa like MyCiti, Admyt and Kaching.
Despite the high cost of data, free public Wi-Fi is fuelling an explosion in the use of streaming video – whether YouTube, Netflix, Showmax or GETblack – and streaming music, particularly with the arrival of Spotify to compete with Simfy Africa.
Who has time for phone calls?
The changing of the phone guard in South Africa was officially signalled last week with the announcement of Vodacom’s annual results. Voice revenue for the 2018 financial year ending 31 March had fallen by 4.6%, to make up 40.6% of Vodacom’s revenue. Total revenue had grown by 8.1%, which meant voice seriously underperformed the group, and had fallen by 4 percentage points as a share of revenue, from 2017’s 44.6%.
The reason? Data had not only outperformed the group, increasing revenue by 12.8%, but had also risen from 39.7% to 42.8% of group revenue.
This means that data has not only outperformed voice for the first time – as had been predicted by World Wide Worx a year ago – but it has also become Vodacom’s biggest contributor to revenue.
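A back-of-envelope check with the reported growth rates bears this out. The modelled shares below differ slightly from the reported 2018 shares, presumably because of rounding and movements in other revenue lines, but the crossover is clear:

```python
# Index 2017 group revenue at 100 and apply the reported growth rates.
total_2017 = 100.0
voice_2017 = 44.6                      # voice share of 2017 revenue (%)
data_2017 = 39.7                       # data share of 2017 revenue (%)

total_2018 = total_2017 * 1.081        # group revenue grew 8.1%
voice_2018 = voice_2017 * (1 - 0.046)  # voice revenue fell 4.6%
data_2018 = data_2017 * 1.128          # data revenue grew 12.8%

voice_share = 100 * voice_2018 / total_2018
data_share = 100 * data_2018 / total_2018
print(f"voice: {voice_share:.1f}% of revenue, data: {data_share:.1f}%")
```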
That scenario is being played out across all mobile network operators. In the same way, instant messaging began destroying SMS revenues as far back as five years ago – to the extent that SMS barely gets a mention in annual reports.
Data overtaking voice revenues signals the demise of voice as the main service and key selling point of mobile network operators. It also points to mobile phones – let’s call them handsets – shifting their primary focus. Voice quality will remain important, but now as a subset of audio quality rather than of connectivity. Sound quality will become a major differentiator as these devices become primary platforms for movies and music.
Contact management, privacy and security will become critical features as the handset becomes the storage device for one’s entire personal life.
Integration with accessories like smartwatches and activity monitors, earphones and earbuds, virtual home assistants and virtual car assistants, will become central to the functionality of these devices. Why? Because the handsets will control everything else? Hardly.
More likely, these gadgets will become an extension of who we are, what we do and where we are. As a result, they must be context aware, and also context compatible. This means they must hand over appropriate functions to appropriate devices at the appropriate time.
I need to communicate only using my earpiece? The handset must make it so. I have to use gesture control, and therefore some kind of sensor placed on my glasses, collar or wrist? The handset must instantly surrender its centrality.
There are numerous other scenarios and technology examples, many out of the pages of science fiction, that point to the changing role of the “phone”. The one thing that’s obvious is that it will be silly to call it a phone for much longer.
MTN 5G test gets 520Mbps
MTN and Huawei have launched Africa’s first 5G field trial with an end-to-end Huawei 5G solution.
The field trial demonstrated a 5G Fixed-Wireless Access (FWA) use case with Huawei’s 5G 28GHz mmWave Customer Premises Equipment (CPE) in a real-world environment in Hatfield, Pretoria, South Africa. Downlink speeds of 520Mbps and uplink speeds of 77Mbps were attained throughout the trial.
“These 5G trials provide us with an opportunity to future proof our network and prepare it for the evolution of these new generation networks. We have gleaned invaluable insights about the modifications that we need to do on our core, radio and transmission network from these pilots. It is important to note that the transition to 5G is not just a flick of a switch, but it’s a roadmap that requires technical modifications and network architecture changes to ensure that we meet the standards that this technology requires. We are pleased that we are laying the groundwork that will lead to the full realisation of the boundless opportunities that are inherent in the digital world,” says Babak Fouladi, Group Chief Technology & Information Systems Officer at MTN Group.
Giovanni Chiarelli, Chief Technology and Information Officer for MTN SA said: “Next generation services such as virtual and augmented reality, ultra-high definition video streaming, and cloud gaming require massive capacity and higher user data rates. The use of millimeter-wave spectrum bands is one of the key 5G enabling technologies to deliver the required capacity and massive data rates required for 5G’s Enhanced Mobile Broadband use cases. MTN and Huawei’s joint field trial of the first 5G mmWave Fixed-Wireless Access solution in Africa will also pave the way for a fixed-wireless access solution that is capable of replacing conventional fixed access technologies, such as fibre.”
“Huawei is continuing to invest heavily in innovative 5G technologies”, said Edward Deng, President of Wireless Network Product Line of Huawei. “5G mmWave technology can achieve unprecedented fiber-like speed for mobile broadband access. This trial has shown the capabilities of 5G technology to deliver exceptional user experience for Enhanced Mobile Broadband applications. With customer-centric innovation in mind, Huawei will continue to partner with MTN to deliver best-in-class advanced wireless solutions.”
“We are excited about the potential the technology will bring as well as the potential advancements we will see in the fields of medicine, entertainment and education. MTN has been investing heavily to further improve our network, with the recent ‘Best in Test’ and MyBroadband best network recognition affirming this. With our focus on providing South Africans with the best customer experience, speedy allocation of spectrum can help bring more of these technologies to our customers,” says Chiarelli.