A few years ago organizations around the world were discussing their migration to the cloud. Now that many of them have moved over, what is next? ERIK ZANDBOER, Advisory Specialist EMEA at Dell EMC, shares his thoughts on where the cloud market is headed.
Go back a few years and cloud was very different. It mainly existed in name, hanging on the lips of vendors and the IT channel. Today that has flipped nearly 180 degrees…
“When we started talking about cloud, we spoke about the journey to the cloud. Everyone wanted to get there and nobody knew how. There were all kinds of definitions for cloud. Many people built things they called cloud, but it wasn’t cloud at all. Nowadays we see a shift. It’s the complete opposite. Customers are actually pulling for features. Now we hear ‘Why can’t you support X, because that’s simple.’ Initially we dragged customers, now they are dragging us!”
So says Erik Zandboer, Advisory Specialist EMEA at Dell EMC. While visiting South Africa, he shared his views on where the cloud market stands and is headed.
Cloud is quickly becoming the baseline for current and future technology investment, and the reason is simple: cloud represents the rapid commoditisation of IT infrastructure. The combination of distributed computing and high-speed connectivity is drastically reducing the cost of raw bit-crunching power, diminishing barriers to entry to such a degree that participating on a cloud platform is the equivalent of owning a personal (and affordable) supercomputer. Cloud is to industrial-scale computing what the smartphone is to the desktop computer. As a result, anyone who wishes to remain relevant is building their applications and solutions in the cloud.
Fortunately, businesses are not ignorant of this shift, and many large companies are already exploring the next stages of cloud adoption:
“They are looking at this cloud native stuff. It looks very promising and interesting. It’s way easier to deploy anywhere. You can deploy services to multiple clouds and just connect them together. As long as the microservices can find each other over the network, the application will work. That’s a whole other mode of operation and a lot of companies are willing to go in that direction. There are a lot of questions around OpenStack, cloud native, DevOps and such things.”
Companies are starting to take ownership of this new methodology, jumping between their own exclusive private clouds and robust public clouds as project requirements change:
“We see companies that do development in their private data centres, and when they need to scale it out, they go to a public provider. We also see other companies do the exact opposite: developing in Amazon or similar, because it is so easy and flexible, then running their production on private cloud because most of the time it’s cheaper.”
Eventually workloads – the live versions of apps and data – will dynamically shuttle between various clouds, finding the best and most cost-effective platform for the job. Zandboer says this is already happening with VMware solutions:
“We see that with vCloud Air. You move your workload with very limited downtime from on-premise to off-premise and the other way around. There are complications: you need a low-latency, high-bandwidth network. The moment you move your workload, it needs to work. So there are a lot of implications. But VMware is making great strides there.”
Zandboer is confident that in a few years this type of automation will be widespread. Companies will finally get rid of the headache of IT infrastructure they don’t need: “That would be very cool: to have a cloud marketplace and your workloads bound to SLAs, and a system looking at the SLA and the app, assigning the cloud that matches and is cheapest. That’s the ultimate dream for many.”
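The "ultimate dream" Zandboer describes can be sketched in a few lines of code. Everything below – the offer attributes, the SLA fields, the provider names – is hypothetical, purely to illustrate the idea of matching a workload's SLA against competing cloud offers and taking the cheapest fit:

```python
# Hypothetical sketch: pick the cheapest cloud offer that satisfies a workload's SLA.
from dataclasses import dataclass
from typing import Optional

@dataclass
class CloudOffer:
    name: str
    uptime_pct: float      # guaranteed availability, e.g. 99.95
    max_latency_ms: int    # guaranteed network latency
    cost_per_hour: float   # price in arbitrary currency units

@dataclass
class WorkloadSLA:
    min_uptime_pct: float  # the uptime the application demands
    max_latency_ms: int    # the latency ceiling the application tolerates

def cheapest_matching_cloud(sla: WorkloadSLA,
                            offers: list) -> Optional[CloudOffer]:
    """Return the lowest-cost offer that meets the SLA, or None if none qualifies."""
    candidates = [o for o in offers
                  if o.uptime_pct >= sla.min_uptime_pct
                  and o.max_latency_ms <= sla.max_latency_ms]
    return min(candidates, key=lambda o: o.cost_per_hour, default=None)

# Illustrative marketplace with made-up providers and numbers:
offers = [
    CloudOffer("CloudA", 99.99, 20, 1.20),
    CloudOffer("CloudB", 99.95, 40, 0.80),
    CloudOffer("CloudC", 99.50, 15, 0.40),
]
best = cheapest_matching_cloud(WorkloadSLA(min_uptime_pct=99.9, max_latency_ms=50), offers)
print(best.name)  # CloudB – the cheapest offer that still meets 99.9% uptime
```

In a real marketplace the selection would of course weigh far more than two SLA terms, but the core loop – filter by SLA, then minimise cost – is the automation Zandboer is pointing at.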
We aren’t there yet, which is why companies such as VMware and Dell EMC focus on creating seamless hardware and software environments. But that is the future of cloud: a world where infrastructure is irrelevant and the performance of business applications is all that matters.
Legion gets a pro makeover
Lenovo’s latest Legion gaming laptop, the Y530, pulls out all the stops to deliver a sleek looking computer at a lower price point, writes BRYAN TURNER
Gaming laptops have become synonymous with thick bodies, loud fans, and rainbow lights. Lenovo’s latest gaming laptop is here to change that.
The unit we reviewed housed an Intel Core i7-8750H, with an Nvidia GeForce GTX 1060 GPU. It featured dual storage, one bay fitted with a Samsung 256GB NVMe SSD and the other with a 1TB HDD.
The latest addition to the Legion lineup has become far more professional-looking, compared to the previous generation Y520. This trend is becoming more prevalent in the gaming laptop market and appeals to those who want to use a single device for work and play. Instead of sporting flashy colours, Lenovo has opted for an all-black computer body and a monochromatic, white light scheme.
The laptop features an all-metal body with sharp edges and comes in at just under 24mm thick. Lenovo opted to make the Y530’s screen lid a little shorter than the bottom half of the laptop, which allowed for more goodies to be packed in the unit while still keeping it thin. The lid of the laptop features Legion branding that’s subtly engraved in the metal and aligned to the side. It also features a white light in the O of Legion that glows when the computer is in use.
The extra depth of the laptop body facilitates better cooling, and Lenovo has upgraded its Legion fan system from the previous generation. Passive cooling – which relies on the chassis design rather than the fans – handles regular office use without the fans spinning up at all. A gaming laptop with good passive cooling is rare to find, and Lenovo has shown that it can be achieved with a good build.
The internal fans start when gaming, as one would expect. They are about as loud as other gaming laptops, but this won’t be a problem for gamers who use headsets.
Serious about security? Time to talk ISO 20000
By EDWARD CARBUTT, executive director at Marval Africa
The looming Protection of Personal Information (PoPI) Act in South Africa and the introduction of the General Data Protection Regulation (GDPR) in the European Union (EU) have brought information security to the fore for many organisations. This, in addition to the ISO 27001 standard that must be adhered to in order to help protect information, has caused organisations to scramble to ensure their information security measures are in line with regulatory requirements.
However, few businesses know or realise that if they are already ISO 20000 certified and follow the Information Technology Infrastructure Library’s (ITIL) best practices, they are effectively positioned to meet other standards such as ISO 27001. In doing so, organisations can decrease the effort and time needed to adhere to the policies of this security standard.
ISO 20000, ITSM and ITIL – Where does ISO 27001 fit in?
ISO 20000 is the international standard for IT service management (ITSM) and reflects a business’s ability to adhere to best practice guidelines contained within the ITIL frameworks.
ISO 20000 is process-based and tackles many of the same topics as ISO 27001, such as incident management, problem management, change control and risk management. It’s therefore clear that if security forms part of ITSM’s outcomes, it should already be taken care of… So, why aren’t more businesses looking towards ISO 20000 to assist them in becoming ISO 27001 compliant?
The link to information security compliance
Information security management is a process that runs across the ITIL service life cycle interacting with all other processes in the framework. It is one of the key aspects of the ‘warranty of the service’, managed within the Service Level Agreement (SLA). The focus is ensuring that the quality of services produces the desired business value.
So, how are these standards different?
Even though ISO 20000 and ISO 27001 have many similarities and elements in common, there are still many differences. Organisations should take cognisance that ISO 20000 considers risk as one of the building elements of ITSM, but the standard itself remains service-based. Conversely, ISO 27001 has risk management at its very foundation, whereas ISO 20000 encompasses much more than risk alone.
Why ISO 20000?
Organisations should ask themselves how they will derive value from ISO 20000. In short, ISO 20000 certification gives ITIL ‘teeth’. ITIL is not prescriptive – ISO 20000 is, and without adequate governance controls it is difficult to maintain momentum. ITIL does not insist on continual service improvement – ISO 20000 does. In addition, ITIL does not insist on evidence to prove quality and progress – ISO 20000 does. Business is not demanding ITIL itself – it is demanding governance controls, auditability and agility. This certification verifies an organisation’s ability to deliver ITSM in line with ITIL best practice.
Ensuring ISO 20000 compliance provides peace of mind and shortens the journey to achieving other certifications, such as ISO 27001 compliance.