Cloud technology is gaining traction as more internet-enabled devices become available. But the key to a successful cloud deployment lies in how it is managed, and in which processes are hosted in the cloud and which are kept locally, writes JOHAN SCHEEPERS of CommVault.
While cloud technology gains traction through the big data boom, IT leaders are questioning how they can maximise value from cloud computing while still maintaining data security and control. The data growth we are experiencing continues to escalate in volume and complexity, particularly as data streams in from millions of new internet-enabled devices, virtualised machines and cloud-enabled business-critical applications.
Although the cloud has been around for a few years now, organisations are only recently starting to understand what level of cloud adoption makes sense for their business needs. The cloud has also gone through significant developments, with hybrid models becoming the favoured approach, enabling organisations to benefit from the agility offered by public clouds while maintaining control of sensitive data on-premises.
A hybrid approach to cloud
Planning a journey to the cloud, whether private, public or both, is daunting for any organisation. There is the promise of greater business agility and low upfront investment; however, if the move is not handled systematically and driven by insights gleaned from your data, it can actually increase cost and complexity. Some organisations are experiencing issues ranging from egress costs to wasteful utilisation to complex, siloed management. By starting with insights from your data, you can better understand which workloads and applications are most appropriate for public cloud, private cloud or on-premises hosting, and deploy a successful hybrid model.
Having an on-premises, private infrastructure directly accessible means not having to go via the public internet for everything, which can greatly reduce access times and latency compared with public cloud services. The hybrid cloud model gives organisations on-premises computational and storage infrastructure for processing data that requires extra speed or high availability for the business. This is combined with the benefits of the public cloud where a workload may exceed the computational power of the private cloud component.
Expanding the private component of a hybrid cloud also allows for flexibility in virtual server design. Organisations can automate the entire virtual machine lifecycle, from provisioning through to archiving older VMs to the cloud.
Another benefit of the hybrid model is the increased connectivity and collaboration offered to employees – which can often be a challenge in today’s digital world. The ability for teams to easily and securely share files should be coupled with the integration of remote workers into core business processes, such as internal messaging, scheduling, edge protection (laptops, tablets, etc), business intelligence and analytics.
Although the benefits of adopting a hybrid approach are clear, it can still be difficult to know where to start. CIOs need to look at how they can introduce a hybrid model that delivers deeply integrated cloud automation and orchestration tools, ensuring compatibility across cloud solutions and on-premises infrastructure. It is recommended that organisations look towards a low-risk, high-value first step to the cloud through disaster recovery. In India particularly, our service provider partners are seeing strong demand for Disaster-Recovery-as-a-Service and Backup-as-a-Service as clear entry points into the cloud for businesses.
The hybrid environment is fast emerging as the norm for many CIOs. However, the key to successfully deploying a hybrid cloud model is understanding which workloads and applications are most appropriate for which hosting, and leveraging a single integrated console with an enterprise-wide view of data across these infrastructures. IT leaders can then better control where data is processed and maximise cost savings by weighing spend against the value that data offers to the business.
The spending shift – from Capex to Opex
While cloud computing offered promises of cost savings, increasingly we are seeing headlines such as the Wall Street Journal’s “The Hidden Waste and Expense of Cloud Computing” or CFO magazine’s “Cloud Computing’s Wasteland”. So what’s actually happening?
Due to a lack of controls to help track and manage utilisation, businesses are being hit with unexpected costs, typically an unusually large bill from their cloud provider after cloud instances are left running. In the traditional capital expenditure (CapEx) model, which we are all used to, we invest heavily upfront in hardware and software. With the cloud subscription model, by contrast, we can build a data centre with a credit card under a predictable operational expenditure (OpEx) model – which is wonderful in theory, until the bill shows up. As organisations mainstream public cloud, they are exposing holes in the maturity of their management processes and controls: developers deploy VMs at will and do not take down workloads when they are finished.
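The scale of the problem is easy to see with some simple arithmetic. The sketch below, using a hypothetical instance inventory (in practice this data would come from a provider's billing or inventory API), flags instances that have sat idle past a threshold and estimates the spend they have accrued since last use:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical inventory of cloud instances. The IDs, rates and
# timestamps are illustrative, not from any specific provider.
instances = [
    {"id": "vm-dev-01", "hourly_rate": 0.45,
     "last_used": datetime.now(timezone.utc) - timedelta(days=30)},
    {"id": "vm-prod-02", "hourly_rate": 1.20,
     "last_used": datetime.now(timezone.utc) - timedelta(hours=2)},
]

def idle_spend(instances, idle_after_days=7):
    """Flag instances idle longer than the threshold and estimate
    the spend accrued since they were last used."""
    now = datetime.now(timezone.utc)
    report = []
    for inst in instances:
        idle = now - inst["last_used"]
        if idle > timedelta(days=idle_after_days):
            wasted = idle.total_seconds() / 3600 * inst["hourly_rate"]
            report.append((inst["id"], round(wasted, 2)))
    return report

# Only the forgotten dev VM is flagged; a month at $0.45/hour is
# already several hundred dollars of spend nobody asked for.
print(idle_spend(instances))
```

Even this toy check illustrates why utilisation controls matter: a single forgotten development VM quietly accumulates a bill that dwarfs its useful life.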
To address this growing concern, IT leaders need a data and information management strategy that enables them to capture a workload at the point of creation and attach data management services at that point. To support hybrid models, those services need to stay with the workload as it moves from on-premises to hosted private cloud to hybrid and public clouds.
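The idea of attaching policy at the point of creation can be sketched as follows. This is a minimal illustration, not any vendor's API; the tier names and policy fields are assumptions:

```python
# Data management policies, attached when a workload is created so the
# policy travels with the workload wherever it is hosted.
POLICIES = {
    "gold": {"backup_frequency_hours": 4, "retention_days": 365},
    "bronze": {"backup_frequency_hours": 24, "retention_days": 30},
}

def create_workload(name, location, policy_tier):
    """Provision a workload record with its management policy attached
    at the point of creation."""
    return {
        "name": name,
        "location": location,  # e.g. "on-prem", "private-cloud", "public-cloud"
        "policy": POLICIES[policy_tier],
    }

def migrate(workload, new_location):
    """Move the workload between infrastructures; the attached policy
    is untouched, so protection follows the data."""
    workload["location"] = new_location
    return workload

wl = create_workload("erp-db", "on-prem", "gold")
wl = migrate(wl, "public-cloud")
# The gold-tier policy followed the workload into the public cloud.
```

The design point is that protection is a property of the workload, not of the infrastructure it happens to be running on at the moment.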
Lastly, data is only useful when we are able to gain value from it, whether in the cloud or on-premises. Starting with backup and recovery, organisations can fast-track into more advanced use cases such as dev/test solutions – and from here the hybrid data analytics strategy emerges. ‘Analytics with purpose’ will be a guiding principle for businesses moving forward. Whether the aim is to introduce a business intelligence project or to take an advanced analytics strategy to the next level, organisations leveraging a hybrid cloud model will have the opportunity to make more intelligent choices about the structured and unstructured data in their environment. They will be able to quickly mitigate the risk of compliance-related issues and regain valuable storage space, freeing up budgets to pursue opportunities that can power business growth.
* Johan Scheepers, Principal Systems Engineer at CommVault
Legion gets a pro makeover
Lenovo’s latest Legion gaming laptop, the Y530, pulls out all the stops to deliver a sleek looking computer at a lower price point, writes BRYAN TURNER
Gaming laptops have become synonymous with thick bodies, loud fans, and rainbow lights. Lenovo’s latest gaming laptop is here to change that.
The unit we reviewed housed an Intel Core i7-8750H, with an Nvidia GeForce GTX 1060 GPU. It featured dual storage, one bay fitted with a Samsung 256GB NVMe SSD and the other with a 1TB HDD.
The latest addition to the Legion lineup is far more professional-looking than the previous-generation Y520. This trend is becoming more prevalent in the gaming laptop market and appeals to those who want to use a single device for work and play. Instead of sporting flashy colours, Lenovo has opted for an all-black computer body and a monochromatic, white light scheme.
The laptop features an all-metal body with sharp edges and comes in at just under 24mm thick. Lenovo opted to make the Y530’s screen lid a little shorter than the bottom half of the laptop, which allowed for more goodies to be packed in the unit while still keeping it thin. The lid of the laptop features Legion branding that’s subtly engraved in the metal and aligned to the side. It also features a white light in the O of Legion that glows when the computer is in use.
The extra depth of the laptop body facilitates better cooling, and Lenovo has upgraded its Legion fan system from the previous generation. Passive cooling – cooling that relies on the body’s build rather than the fans – handles regular office use without the fans spinning up at all. A gaming laptop with good passive cooling is rare, and Lenovo has shown it can be achieved with a good build.
The internal fans start when gaming, as one would expect. They are about as loud as other gaming laptops, but this won’t be a problem for gamers who use headsets.
Serious about security? Time to talk ISO 20000
By EDWARD CARBUTT, executive director at Marval Africa
The looming Protection of Personal Information (PoPI) Act in South Africa and the introduction of the General Data Protection Regulation (GDPR) in the European Union (EU) have brought information security to the fore for many organisations. This, in addition to the ISO 27001 standard for protecting information, has caused organisations to scramble to ensure their information security measures are in line with regulatory requirements.
However, few businesses know or realise that if they are already ISO 20000 certified and follow the Information Technology Infrastructure Library’s (ITIL) best practices, they are effectively well positioned for related standards such as ISO 27001. In doing so, organisations can decrease the effort and time taken to adhere to the policies of this security standard.
ISO 20000, ITSM and ITIL – Where does ISO 27001 fit in?
ISO 20000 is the international standard for IT service management (ITSM) and reflects a business’s ability to adhere to best practice guidelines contained within the ITIL frameworks.
Because ISO 20000 is process-based, it tackles many of the same topics as ISO 27001, such as incident management, problem management, change control and risk management. It is therefore clear that if security forms part of ITSM’s outcomes, it should already be taken care of. So why aren’t more businesses looking to ISO 20000 to assist them in becoming ISO 27001 compliant?
The link to information security compliance
Information security management is a process that runs across the ITIL service life cycle interacting with all other processes in the framework. It is one of the key aspects of the ‘warranty of the service’, managed within the Service Level Agreement (SLA). The focus is ensuring that the quality of services produces the desired business value.
So, how are these standards different?
Even though ISO 20000 and ISO 27001 have many similarities and elements in common, there are still significant differences. Organisations should take cognisance that ISO 20000 considers risk one of the building blocks of ITSM, but the standard remains service-based. Conversely, ISO 27001 has risk management at its foundation, whereas ISO 20000 encompasses much more than risk alone.
Why ISO 20000?
Organisations should ask themselves how they will derive value from ISO 20000. In short, the ISO 20000 certification gives ITIL ‘teeth’. ITIL is not prescriptive, and it is difficult to maintain momentum without adequate governance controls – ISO 20000 provides them. ITIL does not insist on continual service improvement – ISO 20000 does. ITIL does not insist on evidence to prove quality and progress – ISO 20000 does. Business is not demanding ITIL as such; it is demanding governance controls, auditability and agility. This certification verifies an organisation’s ability to deliver ITSM in line with ITIL best practice.
Ensuring ISO 20000 compliance provides peace of mind and shortens the journey to achieving other certifications, such as ISO 27001 compliance.