
Time to prioritise apps

App complexity has deepened as businesses move to a multi-cloud world. But there are practical ways to tackle it, writes IAN JANSEN VAN RENSBURG, Senior Manager: Systems Engineering at VMware Southern Africa

When discussing the impact of technology on the organisation, we’ve typically done so in terms of platforms and infrastructure: on-premise, off-premise, cloud, data centres, networks, the Edge. And we might measure value and effectiveness in terms of cost optimisation, agility, speed to market, security, compliance, control and choice. What this focus overlooks is what’s actually driving business decisions today – something that, until a few years ago, most people outside the IT department didn’t really think about – applications.

Everything changed when we, ‘the consumer’, got our hands on the iPhone and its App Store. Now, only a handful of years later, and with application marketplaces for every operating system, enterprises are thinking ‘app first’. But not all applications are created equal, and each app’s value must be measured by how core it is to the business.

So, what’s mission-critical, what’s business-critical and what’s customer-facing? It’s this prioritisation of applications that ultimately informs IT decisions, whether for a mission-critical app that must deliver complete security without compromising performance, or a consumer-facing service, such as a retailer’s mobile commerce offering, that needs the scalability to manage major spikes in use without constantly consuming vast amounts of resources. The type of application is also a major factor: if a bespoke app has sat at the core of your business for many years, like an automated pricing tool for a logistics company, simply lifting and shifting it to the cloud will not work. With access to its data so critical, the decision may be made to keep it in its existing environment for the time being.

These are all factors that influence the criteria for choosing the right platform. The challenge is that with each application requiring different operating systems and platforms, and no single platform yet able to offer every benefit without being prohibitively expensive, many organisations find themselves with a multitude of infrastructures and a complex application estate hosted in all sorts of places. Unfortunately, many of these applications cannot move easily across platforms and clouds to where they would be best located and used. Respondents to a recent VMware survey highlighted significant challenges with this situation: integrating legacy systems (57%) and understanding new technologies (54%) were two of the biggest obstacles organisations needed to overcome to get the best performance out of this mix of infrastructures. But is there a way of managing this complex landscape with more ease?

Delivering a better experience across multiple platforms

Having a clear strategy and defined approach is key. Take a retail bank, for example. With physical branches as well as mobile applications and online banking services, its infrastructure will mostly be a mix of on-premise and private cloud. With security, regulatory compliance and governance so critical, the unwieldy nature of these systems means that going with tried and trusted approaches is usually more straightforward. However, with new entrants and digital-native disruptors using public cloud providers, unencumbered by legacy systems, established players need to find a way to respond quickly. Banks such as Capital One and the World Bank are deploying public cloud computing for development and testing. In this way, they enjoy the benefits of flexibility, scalability and agility without significant investment, whilst experimenting or using applications that do not draw on legacy data.

For instance, trialling the use of blockchain to streamline letters of credit could require significant resources. As it is a pilot, however, the bank may be less keen to commit to the investment of a fully private cloud environment. Deploying to a public cloud becomes attractive: it provides the necessary infrastructure, the pilot can be run, and if it is deemed a success the application can be moved over to a private cloud environment. In doing so, the bank has been able to develop, deploy and test quickly, turning around results that allow a decision to be made and, potentially, a new product to be released to the market. If the pilot has not been a success, no investment in permanent resources has been lost.

Another opportunity for a clearly defined approach and strategy is the opening up of banking. Driven by the likes of the Open Banking initiative in the UK and the EU’s revised Payment Services Directive (PSD2), more financial institutions are giving API access to third-party developers to build applications and services that consumers or businesses can use to manage their finances across multiple providers. The aim is to provide greater transparency and flexibility to customers, ultimately delivering a better experience. What it means for banks and other financial service providers is having the infrastructure in place to share relevant data easily and securely – again, a mix of private and public cloud environments can support the development of third-party apps without exposing core data or mission-critical services to security risks or non-compliance.
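The data-sharing model behind these initiatives can be sketched in a few lines. The JSON below loosely follows the shape of an open banking balances response, but the field names and values are illustrative assumptions rather than the exact specification:

```python
import json

# Illustrative response from an open-banking-style balances endpoint.
# The structure is a simplified assumption, not the exact Open Banking spec.
sample_response = """
{
  "Data": {
    "Balance": [
      {"AccountId": "acc-001",
       "Amount": {"Amount": "1250.00", "Currency": "GBP"},
       "Type": "InterimAvailable"}
    ]
  }
}
"""

def available_balance(raw: str) -> tuple[float, str]:
    """Extract the available balance from a balances payload."""
    payload = json.loads(raw)
    for bal in payload["Data"]["Balance"]:
        if bal["Type"] == "InterimAvailable":
            amt = bal["Amount"]
            return float(amt["Amount"]), amt["Currency"]
    raise ValueError("no available balance found")

print(available_balance(sample_response))  # (1250.0, 'GBP')
```

The point of the sketch is that a third-party app only ever sees the structured payload the bank chooses to expose, never the core systems behind it.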

Managing talent and avoiding silos

But what does this mean for the bank’s technology team? For starters, it raises the possibility of requiring teams with multiple skillsets or, more likely, separate teams focused on separate platforms. That public cloud might be from AWS, for example, which requires a different skillset from the one needed to operate the private cloud, which again might not be relevant for the team managing the legacy infrastructure. IT has long been plagued by silos of teams working on individual, proprietary technology, and left unchecked, this issue will be exacerbated further by the demands of multi-platform infrastructure. The whole point of a multi-cloud environment – being able to securely move applications from one environment to another depending on requirements at the time – becomes much harder to realise if siloed teams struggle to work together.

And these demands are only going to increase. As more and more enterprises accelerate their digital transformation agendas, they are faced with the challenge of repurposing their sprawling application estates to meet their digital requirements without compromising security. Many are already harnessing multi-cloud environments to enable transformation. The same VMware survey found that 80% of respondents believed one of the benefits of multi-cloud was improved innovation – and it makes sense: being able to get the best out of multiple types of environment sounds like exactly what most enterprises need to do to unlock the opportunities of digitisation.

Understanding what you need to achieve 

For a multi-cloud deployment to work, enterprises need to understand what they fundamentally require and have the hybrid cloud infrastructure to run and manage those requirements across all environments and devices. The environments used are ultimately the support, the enabler, not the objective itself; that lies with the applications. 

Yet this should also be in a constant state of evolution. As enterprises continue to digitally transform, they need to be continually reviewing and reforming their application estate. It is the ongoing process of choosing which applications are redundant, which need to be retrofitted, which can be completely transformed into cloud-native apps, and which need to be kept in legacy environments for a bit longer, all whilst being able to manage and move workloads as required. By following this approach, and by working with partners with the experience and skills required to deliver infrastructure that can efficiently run different platforms, enterprises can deliver an effective app-first approach, across any number of environments, to drive their digital business goals forward. 
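The ongoing triage described above can be sketched as a simple set of rules. The categories and criteria here are illustrative assumptions, not a formal methodology:

```python
# Illustrative triage of an application estate. The dispositions and
# rules are simplified assumptions for the sketch.

def triage(app: dict) -> str:
    """Assign a disposition to one application record."""
    if not app["still_used"]:
        return "retire"        # redundant
    if app["legacy_data_coupling"]:
        return "retain"        # keep in its legacy environment for now
    if app["cloud_ready"]:
        return "refactor"      # candidate for a cloud-native rebuild
    return "retrofit"          # modernise in place, then revisit

estate = [
    {"name": "pricing-tool", "still_used": True,
     "legacy_data_coupling": True, "cloud_ready": False},
    {"name": "old-reporting", "still_used": False,
     "legacy_data_coupling": False, "cloud_ready": False},
    {"name": "mobile-api", "still_used": True,
     "legacy_data_coupling": False, "cloud_ready": True},
]

for app in estate:
    print(app["name"], "->", triage(app))
```

In practice the inputs would come from an application inventory and the rules would be far richer, but the shape of the exercise – attributes in, disposition out, repeated continually – is the same.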

Now IBM’s Watson joins IoT revolution in agriculture

Global expansion of the Watson Decision Platform taps into AI, weather and IoT data to boost production

IBM has announced the global expansion of Watson Decision Platform for Agriculture, with AI technology tailored for new crops and specific regions to help feed a growing population. For the first time, IBM is providing a global agriculture solution that combines predictive technology with data from The Weather Company, an IBM Business, and IoT data to help give farmers around the world greater insights about planning, ploughing, planting, spraying and harvesting.

By 2050, the world will need to feed two billion more people without an increase in arable land [1]. IBM is combining powerful weather data – including historical, current and forecast data and weather prediction models from The Weather Company – with crop models to help improve yield forecast accuracy, generate value, and increase both farm production and profitability.

Roric Paulman, owner/operator of Paulman Farms in Southwest Nebraska, said: “As a farmer, the wild card is always weather. IBM overlays weather details with my own data and historical information to help me apply, verify, and make decisions. For example, our farm is in a highly restricted water basin, so the ability to better anticipate rain not only saves me money but also helps me save precious natural resources.”

New crop models include corn, wheat, soy, cotton, sorghum, barley, sugar cane and potato, with more coming soon. These models will now be available in Africa, the U.S., Canada, Mexico and Brazil, as well as new markets across Europe and Australia.

Kristen Lauria, general manager of Watson Media and Weather Solutions at IBM, said: “These days farmers don’t just farm food, they also cultivate data – from drones flying over fields to smart irrigation systems, and IoT sensors affixed to combines, seeders, sprayers and other equipment. Most of the time, this data is left on the vine — never analysed or used to derive insights. Watson Decision Platform for Agriculture aims to change that by offering tools and solutions to help growers make more informed decisions about their crops.” 

The average farm generates an estimated 500,000 data points per day, which will grow to 4 million data points by 2036 [2]. Applying AI and analysis to aggregated field, machine and environmental data can help improve shared insights between growers and enterprises across the agriculture ecosystem. With a better view of the fields, growers can see what’s working on certain farms and share best practices with other farmers. The platform assesses data in an electronic field record to identify and communicate crop management patterns and insights. Enterprise businesses such as food companies, grain processors, or produce distributors can then work with farmers to leverage those insights. It helps track crop yield as well as the environmental, weather and plant biologic conditions that go into a good or bad yield, such as irrigation management, pest and disease risk analysis and cohort analysis for comparing similar subsets of fields.
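The cohort analysis mentioned here can be illustrated with a minimal sketch. The field records and yield figures below are made-up sample data, and the grouping logic is a simplification of what such a platform would do:

```python
from statistics import mean

# Illustrative cohort analysis: average yield across subsets of fields
# grouped by irrigation practice. All records are made-up sample data.
fields = [
    {"field": "A1", "irrigation": "drip",  "yield_t_per_ha": 9.8},
    {"field": "A2", "irrigation": "drip",  "yield_t_per_ha": 10.4},
    {"field": "B1", "irrigation": "pivot", "yield_t_per_ha": 8.1},
    {"field": "B2", "irrigation": "pivot", "yield_t_per_ha": 8.7},
]

def cohort_means(records, key):
    """Average yield for each cohort defined by `key`."""
    cohorts = {}
    for r in records:
        cohorts.setdefault(r[key], []).append(r["yield_t_per_ha"])
    return {k: round(mean(v), 2) for k, v in cohorts.items()}

print(cohort_means(fields, "irrigation"))  # {'drip': 10.1, 'pivot': 8.4}
```

Comparing similar subsets of fields this way is what lets growers see which practices are working and share them across the ecosystem.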

The result isn’t just more productive farmers. Watson Decision Platform for Agriculture could help a livestock company eliminate a certain mold or fungus from feed supply grains or help identify the best crop irrigation practices for farmers to use in drought-stricken areas like California. It could help deliver the perfect French fry for a fast food chain that needs longer – not fatter – potatoes from its network of growers. Or it could help a beer distributor produce a more affordable premium beer by growing higher quality barley that meets the standard required to become malting barley.

Watson Decision Platform for Agriculture is built on IBM PAIRS Geoscope from IBM Research, which quickly processes massive, complex geospatial and time-based datasets collected by satellites, drones, aerial flights, millions of IoT sensors and weather models. It crunches large, complex data and creates insights quickly and easily so farmers and food companies can focus on growing crops for global communities.

IBM and The Weather Company help the agriculture industry find value in weather insights. IBM Research collaborates with start-up Hello Tractor to integrate The Weather Company data, remote sensing data (e.g., satellite) and IoT data from tractors. IBM also works with crop nutrition leader Yara to include hyperlocal weather forecasts in its digital platform for real-time recommendations, tailored to specific fields or crops. IBM acquired The Weather Company in 2016 and has since been helping clients better understand and mitigate the cost of weather on their businesses. The global expansion of Watson Decision Platform for Agriculture is the latest innovation in IBM’s efforts to make weather a more predictable business consideration. Also just announced, Weather Signals is a new AI-based tool that merges The Weather Company data with a company’s own operations data to reveal how minor fluctuations in weather affect business.

The combination of rich weather forecast data from The Weather Company and IBM’s AI and Cloud technologies is designed to provide a unique capability, which is being leveraged by agriculture, energy and utility companies, airlines, retailers and many others to make informed business decisions.

[1] The UN Department of Economic and Social Affairs, “World Population Prospects: The 2017 Revision”

[2] Business Insider Intelligence, 2016 report: https://www.businessinsider.com/internet-of-things-smart-agriculture-2016-10




What if Amazon used AI to take on factories?

By ANTONY BOURNE, IFS Global Industry Director for Manufacturing

Amazon recently announced record profits of $3.03bn, breaking its own record for the third consecutive time. However, Amazon appears to be at a crossroads as to where it heads next. Beyond pouring additional energy into Amazon Prime, many have wondered whether the company may decide to enter an entirely new sector, such as manufacturing, to drive future growth. After all, it seems a logical step for a company with its finger in so many pies.

At this point, it is unclear whether Amazon would truly ‘get its hands dirty’ by manufacturing its own products on a grand scale. But what if it did? It is a scenario worth exploring: what if Amazon did decide to move into manufacturing, a sector dominated by traditional firms and one yet to see an explosive tech rival enter? After all, many similarly positioned tech giants have stuck to providing data analytics services or consulting to these firms rather than genuinely engaging with manufacturing techniques directly.

If Amazon did factories

If Amazon decided to take a step into manufacturing, it could use the Echo range as a template of what AI can achieve. In recent years, Amazon gained expertise designing its Echo home speaker range, which features Alexa, an artificial intelligence and IoT-based digital assistant. Amazon could replicate a similar formula by deploying AI and Industrial IoT (IIoT) to create an autonomously run smart manufacturing plant. Such a plant could feature IIoT sensors that make the machinery remotely operable and self-aware, managing external inputs and outputs such as supply deliveries and the shipping of finished goods. Just-in-time logistics would remove the need for warehousing, while other machines could be placed in charge of maintenance using AI and remote access. Through this, Amazon could radically reduce the need for human labour in manufacturing, as the use of AI, IIoT and data analytics would leave humans only the roles of monitoring and strategic evaluation. Amazon has been using autonomous robots in its logistics and distribution centres since 2017. As demonstrated with the Echo range, this technology is available now, and the full capabilities of blockchain and 5G, soon to be realised, will allow an exponentially greater amount of data to be received, processed and communicated.
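A minimal sketch of what ‘self-aware’ machinery could look like: each machine checks its own telemetry against limits and requests maintenance autonomously. The telemetry fields and thresholds below are illustrative assumptions, not any real system:

```python
# Illustrative self-monitoring for a single machine. Thresholds and
# readings are made-up assumptions for the sketch.
MAX_VIBRATION_MM_S = 7.0   # vibration limit, illustrative
MAX_TEMP_C = 85.0          # bearing temperature limit, illustrative

def maintenance_needed(telemetry: dict) -> list[str]:
    """Return the list of faults that should trigger a maintenance job."""
    faults = []
    if telemetry["vibration_mm_s"] > MAX_VIBRATION_MM_S:
        faults.append("excess vibration")
    if telemetry["temp_c"] > MAX_TEMP_C:
        faults.append("overheating")
    return faults

reading = {"machine": "press-04", "vibration_mm_s": 9.2, "temp_c": 78.0}
print(maintenance_needed(reading))  # ['excess vibration']
```

In a real plant the thresholds would come from condition-monitoring models rather than fixed constants, but the loop – sense, compare, act without a human in the path – is the essence of the idea.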

Manufacturing with knowledge

Theorising about what Amazon’s manufacturing debut would look like provides a stark learning opportunity for traditional manufacturers. After all, whenever Amazon has entered the fray in other traditional industries, such as retail and logistics, the sector has never remained the same. The key takeaway for manufacturers is that now is the time to start leveraging the sorts of technologies and approaches to data management that Amazon already applies in its current operations. When thinking about how to implement AI and new technologies in existing environments, specific end-business goals and targets must be considered, or the end result will fail to live up to expectations. As with any goal, the more targeted your objectives, the more competitive and transformative your results. Once specific targets and deliverables have been set, the resources and methods of implementation must also be considered. As Amazon did with the early automation of its distribution and logistics centres, manufacturers need to implement change gradually, focusing on small, incremental results that will generate wider momentum and the appetite for more expansive changes.

In implementing newer technologies, manufacturers need to bear in mind the two fundamental aspects of implementation: software and hardware. Enterprise Resource Planning (ERP) software, increasingly bolstered by AI, will enable manufacturers to leverage data from connected IoT devices, sensors and automated systems on the factory floor and across the wider business. ERP software will be key to making strategic decisions and executing routine operational tasks more efficiently, allowing manufacturers to keep on top of trends, deliver real-time forecasting and spot potential problems before they impact the wider business.
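The kind of real-time forecasting and early-warning logic described here can be sketched simply. Exponential smoothing stands in for the AI layer, and the parameters and readings are illustrative assumptions:

```python
# Illustrative forecasting over a factory sensor feed: exponential
# smoothing plus a simple deviation alert. Data and parameters are
# made-up assumptions for the sketch.

def smooth(readings, alpha=0.5):
    """Exponentially smoothed forecast of the next reading."""
    forecast = readings[0]
    for r in readings[1:]:
        forecast = alpha * r + (1 - alpha) * forecast
    return forecast

def is_anomaly(readings, new_reading, tolerance=0.25):
    """Flag a reading that deviates more than `tolerance` from the forecast."""
    forecast = smooth(readings)
    return abs(new_reading - forecast) / forecast > tolerance

hourly_output = [100, 102, 98, 101, 99]   # units produced per hour
print(is_anomaly(hourly_output, 60))      # True: output has dropped sharply
print(is_anomaly(hourly_output, 100))     # False: within normal range
```

A production ERP system would use far richer models, but the principle of comparing live readings against a rolling forecast is how problems get spotted before they spread.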

As for the hardware, stock-management drones and sensor-embedded equipment will be the eyes through which manufacturers view the impact emerging technologies have on their operations. Unlike manual stock audits and counting, drones with AI capabilities can monitor stock intelligently around production so that operations are not disrupted or halted. Manufacturers will be able to see what is working, what is going wrong, and where there is potential for further improvement and change.

Knowledge for manufacturing

Many traditional manufacturers may see Amazon as a looming threat, and smart-factory technologies such as AI and Robotic Process Automation (RPA) as a far-off utopia. However, 2019 presents a perfect opportunity for manufacturers themselves to determine how the tech giants and emerging technologies will affect the industry. Technologies such as AI and IoT are available today, and their full benefits will only deepen as they are implemented alongside maturing technologies such as 5G and blockchain over the next three to five years. Manufacturers need to analyse the needs these technologies can address and produce a proper plan for gradually implementing them against specific targets and deliverables. AI-based software and hardware solutions will fundamentally revolutionise manufacturing; for 2019, manufacturers just have to be willing to take the first steps in modernisation.

Copyright © 2019 World Wide Worx