AI gives birth to next generation of data centres
By Ian Jansen van Rensburg, VMware EMEA Senior Systems Engineer
The Rise of AI
Artificial intelligence has rapidly become a leading driver of innovation, creating competitive advantages and new business opportunities. The proliferation of data is enabling breakthroughs across disparate industries, from transportation and healthcare to energy and communications. However, one of the most profound AI-mediated transformations will occur within the world of enterprise technology.
Based on our extensive experience within the enterprise tech stack, we see three core factors that have created the perfect storm fueling today’s AI innovation:
1. Compute (The Need for Speed): Computing resources, from CPUs and GPUs to FPGAs and ASICs, have advanced enormously in the past few years, allowing us to process data more quickly, more broadly, and more deeply than ever before. In addition, new deployment channels (such as public cloud GPUs/ASICs) let customers balance Capex against Opex in their AI initiatives.
2. Algorithms (The Modern-Day Equation): Algorithms are the theoretical foundation of machine learning and AI, from simple neural networks to more complex recurrent and convolutional architectures. Many of these algorithms trace back decades, yet have only recently led to applied breakthroughs. This is partly due to the advances in Compute, but even more so because of…
3. Data (Data Is the New Oil): Machine learning techniques are famously data-inefficient compared to humans. Headline AI systems such as AlphaGo are trained on data sets far larger than any human could absorb in a lifetime, and without sufficient training volume, machine learning techniques fail to reach acceptable performance levels. Yet as recently as a decade ago, the quantity of enterprise data available for machine learning, from logs and metrics to traces and configuration events, was a tiny fraction of what is available today.
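To make the interplay of these three factors concrete, here is a minimal sketch in Python (NumPy is the only dependency) of a decades-old algorithm: a two-layer neural network trained by plain gradient descent. The synthetic dataset and every parameter value are illustrative assumptions, not anything from a real workload; the point is simply that this 1980s-era architecture does nothing useful without enough data to train on and enough compute to train with.

```python
import numpy as np

# Synthetic binary classification data: more samples -> better fit,
# illustrating how data volume gates the performance of old algorithms.
rng = np.random.default_rng(0)
n_samples, n_features, n_hidden = 10_000, 20, 32
X = rng.normal(size=(n_samples, n_features))
y = (X[:, 0] * X[:, 1] > 0).astype(float).reshape(-1, 1)  # nonlinear target

# Two-layer network: an architecture known since the 1980s.
W1 = rng.normal(scale=0.1, size=(n_features, n_hidden))
W2 = rng.normal(scale=0.1, size=(n_hidden, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for step in range(2_000):           # more compute -> more training steps
    h = np.tanh(X @ W1)             # hidden layer
    p = sigmoid(h @ W2)             # predicted probability
    grad_out = (p - y) / n_samples  # cross-entropy gradient at the output
    W2 -= lr * h.T @ grad_out
    W1 -= lr * X.T @ ((grad_out @ W2.T) * (1 - h**2))  # backprop through tanh

print(f"training accuracy: {((p > 0.5) == y).mean():.3f}")
```

Shrink the sample count or the step budget and the same algorithm stalls, which is the history of the field in miniature: the equations waited decades for the data and the compute.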
The AI Opportunity: Self-Optimizing Data Centers
This explosion of operational data is both a blessing and a curse. In today's world of data center and cloud operations, companies are desperately trying to keep up with the flood of raw information, and falling further behind each year. The volume of data has outpaced the available tools and platforms, placing an ever-growing burden on human operators, and even feature developers, to keep up.
In fact, a recent EMA report finds that developers spend an average of 30-40% of their time on production deployment, configuration, testing, debugging, and support challenges rather than on feature development (Source: EMA blog – 3 Key Lessons from DockerCon 2018: Strategic Analysis of the Container Market Place). This operational 'tax' is unacceptable for firms in competitive industries where feature velocity is a key driver.
AI will allow companies to transform this operational burden into a strategic advantage. It will enable firms to move to a global operations model where they leverage the deep value of their data, producing real-time insights that drive business value. AI will fill the gap between operational complexity and operational capability. Common ways companies can apply AI to their data centers include improving operational efficiency, balancing cost against performance in real time, strengthening security, and even optimizing business metrics.
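As one hedged illustration of the operational-efficiency use case, the sketch below flags anomalous readings in a stream of data-center metrics using a rolling z-score, one of the simplest techniques in this family. The window size, threshold, and metric values are assumptions chosen for the example, not recommendations or part of any product.

```python
from collections import deque
import statistics

def make_anomaly_detector(window: int = 60, threshold: float = 3.0):
    """Flag readings more than `threshold` standard deviations away
    from the rolling mean of the last `window` samples."""
    history = deque(maxlen=window)

    def check(value: float) -> bool:
        anomalous = False
        if len(history) >= 10:  # need a minimal baseline first
            mean = statistics.fmean(history)
            stdev = statistics.pstdev(history)
            if stdev > 0 and abs(value - mean) / stdev > threshold:
                anomalous = True
        history.append(value)
        return anomalous

    return check

# Hypothetical usage against a CPU-utilization series (values invented).
check_cpu = make_anomaly_detector()
for reading in [41.0, 43.2, 40.8, 42.5, 39.9, 41.7,
                40.2, 42.0, 41.1, 40.6, 97.3]:
    if check_cpu(reading):
        print(f"anomaly detected: cpu={reading}%")
```

A production system would apply far richer models across thousands of metrics at once; the value comes from running this kind of judgment continuously at machine scale, where no human team can.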
Don’t fight the data deluge – embrace it.
The increase in operational complexity won’t be slowing down anytime soon. The gap between human scale and machine scale continues to grow. Organizations that are not able to augment their data center, cloud, and edge computing strategies by adopting AI technologies will risk falling further behind. Conversely, organizations that are able to effectively leverage machine learning will build a significant competitive edge.
We envision a hybrid data center, cloud, and edge that is self-healing and self-optimizing, greatly reducing administrative overload and allowing firms to focus on strategic innovation and customer experience. We also believe that an AI-enabled infrastructure will deliver degrees of self-regulation well beyond today’s policy-based capabilities.
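Self-regulation of this kind ultimately reduces to an observe-decide-act reconciliation loop. The following sketch shows that skeleton under stated assumptions: `get_replica_count` and `scale_to` are hypothetical stand-ins for whatever inventory and orchestration APIs a given environment exposes, and the simulated failure is contrived for the demo.

```python
import logging
import random
import time

logging.basicConfig(level=logging.INFO, format="%(message)s")

DESIRED_REPLICAS = 3  # declared desired state (illustrative value)

def get_replica_count(service: str) -> int:
    """Hypothetical stand-in for an orchestrator's inventory API.
    Simulates occasional failures by sometimes reporting a lost replica."""
    return random.choice([DESIRED_REPLICAS, DESIRED_REPLICAS - 1])

def scale_to(service: str, replicas: int) -> None:
    """Hypothetical stand-in for an orchestrator's scaling API."""
    logging.info("scaling %s to %d replicas", service, replicas)

def reconcile(service: str) -> None:
    """One observe-decide-act pass: converge actual state to desired state."""
    actual = get_replica_count(service)      # observe
    if actual != DESIRED_REPLICAS:           # decide
        scale_to(service, DESIRED_REPLICAS)  # act

for _ in range(5):   # a real control loop would run indefinitely
    reconcile("web-frontend")
    time.sleep(1)    # fixed interval; an AI-driven loop could adapt it
```

Today's policy engines run loops like this with fixed, hand-written rules; an AI-enabled infrastructure would learn the decision step from operational data instead of having it hard-coded.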
What Next?
Preparing your data center for the modern world is not a trivial task. Start by future-proofing it as groundwork for the AI-driven approaches that are just around the corner. On the control surface, that means adopting infrastructure-as-code (IaC) patterns and fronting legacy manual processes with APIs wherever possible. On the data side, consider implementing edge computing, which lets data gathering and analytics happen near the source of the data. Companies should also invest in software-defined infrastructure as a key enabler of this process. Last, but certainly not least, explore a multi-cloud strategy to give your IT infrastructure the greatest agility and flexibility as you prepare for the high-velocity, machine-learning-driven future.
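On the infrastructure-as-code point, the essence of the pattern is that desired state lives in version control as plain data, and tooling computes and applies the difference against actual state. The sketch below mimics that diff-and-apply flow in Python; it is a stand-in for real IaC tools such as Terraform, and the resource schema and inventory contents are invented for illustration.

```python
# Minimal diff-and-apply sketch of the infrastructure-as-code pattern:
# desired state is versionable data; a tool plans and applies the delta.

desired = {  # would normally be parsed from a committed YAML/JSON file
    "vm-web-01": {"cpus": 4, "memory_gb": 16},
    "vm-web-02": {"cpus": 4, "memory_gb": 16},
}

actual = {  # would normally be read from the platform's inventory API
    "vm-web-01": {"cpus": 2, "memory_gb": 16},
    "vm-db-01": {"cpus": 8, "memory_gb": 64},
}

def plan(desired: dict, actual: dict) -> list:
    """Compute the changes needed to converge actual state to desired."""
    changes = []
    for name, spec in desired.items():
        if name not in actual:
            changes.append(("create", name))
        elif actual[name] != spec:
            changes.append(("update", name))
    for name in actual:
        if name not in desired:
            changes.append(("delete", name))
    return changes

for action, resource in plan(desired, actual):
    print(f"{action}: {resource}")  # a real tool would call platform APIs here
```

Once every change flows through a declarative, API-driven path like this, an AI layer has something it can safely observe, reason over, and eventually drive.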