Decoding tech buzzwords

Every day, we are bombarded with jargon and buzzwords. Each industry has its fair share of jargon, but none more so than the IT industry, writes DOUG CRAWFORD, Manager of Service Delivery at Entelect.

Sometimes a buzzword emerges out of necessity – there simply is no other way to describe a new phenomenon or trend. In many cases, however, it is a shift in thinking or a reincarnation of an old concept that provides the marketing fraternity with a fresh opportunity to create a ‘buzz’.

Instead of fostering a real understanding, the use of buzzwords and jargon can create misconceptions that ultimately slow down the decision-making process. At best, people waste time trying to understand what is essentially something very simple. At worst, they miss an opportunity. Either way, new terms can be confusing, so I have decoded some of the IT industry’s up-and-coming jargon and buzzwords.

1. Big Data: Big Data refers to the large amounts of data, typically collected from events triggered by particular actions or from devices monitoring some phenomenon, and often stored in a loosely structured format. Traditional techniques for processing these large data sets are ineffective, and new approaches are necessary to collect, retrieve, summarise and visualise the data – turning it into something more meaningful.

The generally accepted defining properties of Big Data are known as the Three Vs, a term originally coined by Gartner analyst Doug Laney:

· Volume – the amount of data stored

· Velocity – the rate at which data is generated and processed

· Variety – the type and source of the data.

If each of these properties is increasing significantly, the information can be considered to be Big Data.
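
To make the Three Vs concrete, here is a minimal sketch (in Python, with invented event data and field names) that computes rough volume, velocity and variety measures for a small batch of loosely structured events; it does not reflect any particular Big Data product.

```python
# Illustrative only: rough "Three Vs" measures over a batch of loosely
# structured event records (hypothetical field names and values).
import json
from datetime import datetime

def three_vs(events):
    """events: list of JSON strings, each with at least 'timestamp' and 'source'."""
    records = [json.loads(e) for e in events]
    volume_bytes = sum(len(e.encode("utf-8")) for e in events)            # Volume
    times = sorted(datetime.fromisoformat(r["timestamp"]) for r in records)
    span = (times[-1] - times[0]).total_seconds() or 1.0
    velocity = len(records) / span                                        # Velocity: events per second
    variety = len({(r.get("source"), tuple(sorted(r))) for r in records}) # Variety: distinct source/shape combinations
    return {"volume_bytes": volume_bytes, "events_per_sec": velocity, "distinct_shapes": variety}

sample = [
    '{"timestamp": "2019-03-01T08:00:00", "source": "till", "amount": 129.99}',
    '{"timestamp": "2019-03-01T08:00:02", "source": "sensor", "temp_c": 21.4}',
]
print(three_vs(sample))
```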

Aside from the fact that companies are collecting vast amounts of information on customers’ movements, behaviours and buying habits, why is Big Data important from a business perspective?

The old adage of ‘knowledge is power’ holds true. The more equipped people are to make decisions, the better the outcome for their business. What is relevant in the case of Big Data, however, is making sense of the information (separating the noise from the meaningful), the timing of the information, and how to use it effectively to improve a product or service.

The current movement in Big Data aims to address these issues and is reshaping our understanding of how to process information on a much larger scale.

2. Prescriptive Analytics: Making sense of the information leads us to the field of data analytics – the tools and techniques that are used to extract meaning from huge volumes of data. Analytics efforts can be broadly classified into one of three main categories – descriptive, predictive and prescriptive.

Descriptive analytics tells us what has happened and possibly why it has happened. It usually involves reporting on historical data to provide insights into various business metrics. Predictive analytics attempts to tell us what may happen in the future, taking historical data into account and applying algorithms and various statistical models.

Prescriptive analytics, the third and most recent focus area of data analytics, takes it to the next level by recommending a course of action and presenting the likely outcomes of choosing such action, incorporating any constraints of the current environment (financial or regulatory, for example). An actual person still has to make the decisions but prescriptive analytics can provide valuable input into scenario planning and optimisation exercises, by combining business rules, statistical models and machine learning to quantify the impact of future decisions.
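
As a toy illustration of the three levels, the sketch below uses made-up monthly sales figures: the descriptive step summarises history, the predictive step fits a naive trend, and the prescriptive step recommends an action subject to a hypothetical capacity constraint. It sketches the idea only, not a real analytics engine.

```python
# Toy illustration of the three levels of analytics on made-up monthly sales.
history = [100, 110, 125, 140, 160]          # units sold per month (hypothetical)

# Descriptive: what happened?
average = sum(history) / len(history)

# Predictive: what may happen next? (naive linear trend)
growth = (history[-1] - history[0]) / (len(history) - 1)
forecast = history[-1] + growth

# Prescriptive: what should we do, given a constraint?
# Pick the production option closest to the forecast without exceeding
# a (hypothetical) capacity constraint of 150 units.
capacity = 150
options = [120, 140, 160, 180]
recommended = min(
    (o for o in options if o <= capacity),
    key=lambda o: abs(o - forecast),
)

print(f"average={average:.1f}, forecast={forecast:.1f}, recommended production={recommended}")
```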

Many organisations have invested significant effort in descriptive analytics and reporting solutions to provide insight into historical data, and many are starting to explore the opportunities that predictive analytics has to offer. Both are necessary precursors to prescriptive analytics, which requires, at a minimum, the capability to capture and summarise large data sets efficiently. The data can then be used as input to prescriptive analytics engines.

3. Software-defined infrastructure (SDI): Software-defined infrastructure builds on the capabilities of virtualisation and cloud-based services to define IT infrastructure requirements – computing power, network capacity and storage – at the software level. SDI allows application developers to describe their expectations of infrastructure in a standard and systematic way, turning computing resources into logical components that can be provisioned on the fly without human intervention.

Take today’s scenario of having to configure each element of infrastructure to support an application – machines and images, storage and mount points, firewalls and load balancers to name a few – and replace it with the simple action of identifying an SDI-enabled data centre and clicking ‘deploy’. Each resource is automatically configured as required and, more importantly, can reconfigure itself as the application and usage changes.

Defining these requirements based on policies and expected usage patterns at the software level, and incorporating them into the deployable artefacts, means that IT organisations can respond more quickly to peaks and troughs in throughput, and achieve repeatable and reliable application deployments by automating many infrastructure-related activities.
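
There is no single standard API for SDI yet, but declaring infrastructure expectations alongside the application might look something like the sketch below. The InfrastructureSpec class and its fields are hypothetical, not a real vendor interface.

```python
# Hypothetical sketch: declaring infrastructure expectations in code so an
# SDI-enabled data centre can provision (and re-provision) them automatically.
# None of these class or field names refer to a real product API.
from dataclasses import dataclass, field

@dataclass
class InfrastructureSpec:
    app_name: str
    cpu_cores: int
    memory_gb: int
    storage_gb: int
    min_instances: int = 1
    max_instances: int = 1
    firewall_ports: list = field(default_factory=list)

    def scale_rule(self, metric: str, threshold: float) -> dict:
        # e.g. add instances when average CPU exceeds the threshold
        return {"app": self.app_name, "metric": metric, "threshold": threshold,
                "range": (self.min_instances, self.max_instances)}

spec = InfrastructureSpec(
    app_name="orders-api",
    cpu_cores=2, memory_gb=4, storage_gb=50,
    min_instances=2, max_instances=10,
    firewall_ports=[443],
)
print(spec.scale_rule("cpu_percent", 70.0))
```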

Furthermore, SDI-enabled data centres can optimise resource usage, which will drive down the cost of infrastructure. Specialists can focus on optimising specific elements of the infrastructure, such as network or storage, rather than reactively wiring and rewiring configurations to support evolving application requirements.

As was the case with Java and the standardised APIs (Application Programming Interfaces) that make up the Java Enterprise Edition framework, SDI will require a concerted effort to ensure interoperability between the tools, platforms and processes that make up the virtual data centres of the future. As with cloud services, there is a vendor battle brewing to capture the lion’s share of what is likely to be a significant market for SDI-capable services. Those vendors who actively drive and support open interfaces and APIs will have the advantage in the long term.

4. DevOps: The term DevOps has been around for some time now, and the concept even longer. However, only in recent years has it started gaining widespread acceptance as standard practice in the development community.

DevOps is to development and operations teams as Agile is to development teams and business. Where the Agile movement promotes increased collaboration between development teams and business users, DevOps looks at integrating operations activities much earlier and more tightly into the software-development life cycle. Both have the same goal of enabling software projects to respond more effectively to change.

In reality, DevOps is an extension of Agile as we know it today, one that includes operations and support functions. The authors of the Agile Manifesto certainly never explicitly excluded operations and support from their widely accepted list of values and principles, but in the experience of many, the focus of Agile projects has always been biased towards improving collaboration between business users and the development team, rather than between the development team and the operations team.

Yes, it is true that the operations team is implicitly included in the concept of cross-functional development teams (see below), but, in reality, IT operations in many organisations are still very much an isolated function, which is exactly the barrier that DevOps is trying to eliminate.
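
One small, concrete expression of the DevOps idea is treating a routine operations task – deploy, then verify – as shared, versioned code rather than a manual runbook owned by a separate team. The sketch below is illustrative only; the package name, command and health-check URL are placeholders, not a real environment.

```python
# Illustrative only: a deploy-and-verify step that developers and operations
# own together, kept in version control next to the application.
# The command and health-check URL are placeholders.
import subprocess
import urllib.request

def deploy_and_verify(package: str, health_url: str) -> bool:
    # Deploy step (placeholder command; in practice this would call your real tooling)
    subprocess.run(["echo", f"deploying {package}"], check=True)

    # Verify step: the release only counts as done when the service answers
    try:
        with urllib.request.urlopen(health_url, timeout=5) as response:
            return response.status == 200
    except OSError:
        return False

if __name__ == "__main__":
    ok = deploy_and_verify("orders-api-1.4.2.zip", "http://localhost:8080/health")
    print("healthy" if ok else "rollback needed")
```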

5. Cross-functional teams: The concept of a cross-functional team is simple. The development team has all the skills necessary to deliver a piece of working software into production, which may include activities such as user experience design, database design, server configuration and, of course, writing code. Where product development teams are concerned, businesses adopting Agile practices should be assembling cross-functional teams.

This is not an excuse for hiring fewer individuals and expecting them to be Jacks of all trades: specialisation is important and a necessity when solving complex problems that require focus and experience. Having a single, co-located team that can take something from concept to reality eliminates the external dependencies that plague many of today’s software development teams, especially in large organisations.

Aside from efficiency and knowledge sharing, the argument for isolated teams defined by skill or technology is the degree of control over standards and governance within a particular domain. This argument is valid, but only for operational and ‘commoditised’ services such as desktop support and hardware infrastructure. As soon as product development enters the mix, the effectiveness of the team becomes more important than its efficiency. Assuming differentiation is one of the main objectives, product development teams should be optimised for effectiveness rather than efficiency, since development in this scenario is a creative process, one that should not be constrained by red tape and corporate IT governance.

If companies want to increase their chances of creating a product that delights their customers, they should include specialists and designers in the team as full-time members until their services are no longer deemed critical, which will probably only be after several production releases. If you want to minimise your IT costs at the expense of rapid innovation, create a dedicated team that outsources its services to several internal development teams.

New ‘buzzwords’ will keep emerging in the ICT space, so it is crucial that decision-makers ensure a practical, simplified understanding before making any kind of investment on behalf of their organisation. Often designed to excite and compel, these buzzwords frequently do not describe the actual function or benefits of a particular concept.

We encourage business leaders to screen potential IT suppliers not by the terminology and complicated jargon they offer, but rather by how simply and understandably they are able to communicate their solutions.

Now IBM’s Watson joins IoT revolution in agriculture

Global expansion of the Watson Decision Platform taps into AI, weather and IoT data to boost production

IBM has announced the global expansion of Watson Decision Platform for Agriculture, with AI technology tailored for new crops and specific regions to help feed a growing population. For the first time, IBM is providing a global agriculture solution that combines predictive technology with data from The Weather Company, an IBM Business, and IoT data to help give farmers around the world greater insights about planning, ploughing, planting, spraying and harvesting.

By 2050, the world will need to feed two billion more people without an increase in arable land [1]. IBM is combining powerful weather data – including historical, current and forecast data and weather prediction models from The Weather Company – with crop models to help improve yield forecast accuracy, generate value, and increase both farm production and profitability.
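
As a toy example of the idea of combining weather forecasts with crop models, the sketch below adjusts a baseline yield estimate using forecast rainfall. The baseline, rainfall figures and response curve are invented for illustration and do not represent IBM’s models.

```python
# Toy illustration: adjusting a baseline yield estimate using forecast rainfall.
# The baseline, rainfall figures and response curve are invented.

def yield_forecast(baseline_tonnes_per_ha: float, forecast_rain_mm: list) -> float:
    total_rain = sum(forecast_rain_mm)
    optimal_rain = 300.0                     # assumed optimum for the season (mm)
    # Simple response: yield falls off linearly as rainfall departs from the optimum
    factor = max(0.0, 1.0 - abs(total_rain - optimal_rain) / optimal_rain)
    return baseline_tonnes_per_ha * factor

weekly_rain = [20, 35, 10, 40, 25, 30, 45, 20, 15, 30]   # mm per week (hypothetical)
print(f"estimated yield: {yield_forecast(8.0, weekly_rain):.1f} t/ha")
```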

Roric Paulman, owner/operator of Paulman Farms in Southwest Nebraska, said: “As a farmer, the wild card is always weather. IBM overlays weather details with my own data and historical information to help me apply, verify, and make decisions. For example, our farm is in a highly restricted water basin, so the ability to better anticipate rain not only saves me money but also helps me save precious natural resources.”

New crop models include corn, wheat, soy, cotton, sorghum, barley, sugar cane and potato, with more coming soon. These models will now be available in Africa, the U.S., Canada, Mexico and Brazil, as well as in new markets across Europe and Australia.

Kristen Lauria, general manager of Watson Media and Weather Solutions at IBM, said: “These days farmers don’t just farm food, they also cultivate data – from drones flying over fields to smart irrigation systems, and IoT sensors affixed to combines, seeders, sprayers and other equipment. Most of the time, this data is left on the vine — never analysed or used to derive insights. Watson Decision Platform for Agriculture aims to change that by offering tools and solutions to help growers make more informed decisions about their crops.” 

The average farm generates an estimated 500,000 data points per day, which will grow to 4 million data points by 2036 [2]. Applying AI and analysis to aggregated field, machine and environmental data can help improve shared insights between growers and enterprises across the agriculture ecosystem. With a better view of the fields, growers can see what’s working on certain farms and share best practices with other farmers. The platform assesses data in an electronic field record to identify and communicate crop management patterns and insights. Enterprise businesses such as food companies, grain processors, or produce distributors can then work with farmers to leverage those insights. It helps track crop yield as well as the environmental, weather and plant biologic conditions that go into a good or bad yield, such as irrigation management, pest and disease risk analysis and cohort analysis for comparing similar subsets of fields.

The result isn’t just more productive farmers. Watson Decision Platform for Agriculture could help a livestock company eliminate a certain mold or fungus from feed supply grains or help identify the best crop irrigation practices for farmers to use in drought-stricken areas like California. It could help deliver the perfect French fry for a fast food chain that needs longer – not fatter – potatoes from its network of growers. Or it could help a beer distributor produce a more affordable premium beer by growing higher quality barley that meets the standard required to become malting barley.

Watson Decision Platform for Agriculture is built on IBM PAIRS Geoscope from IBM Research, which quickly processes massive, complex geospatial and time-based datasets collected by satellites, drones, aerial flights, millions of IoT sensors and weather models. It crunches large, complex data and creates insights quickly and easily so farmers and food companies can focus on growing crops for global communities.
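
PAIRS Geoscope itself is proprietary, but the underlying idea – aligning readings from many sources onto a common spatial and temporal grid before analysing them – can be sketched in a few lines. The grid size and readings below are purely illustrative.

```python
# Illustrative only: bucketing sensor readings onto a coarse spatial/temporal grid,
# the kind of alignment a geospatial platform performs before analysis at scale.
from collections import defaultdict

def grid_key(lat: float, lon: float, day: str, cell_deg: float = 0.1):
    # Map a coordinate to a grid cell of roughly cell_deg degrees per side
    return (round(lat / cell_deg), round(lon / cell_deg), day)

readings = [   # (lat, lon, ISO date, soil moisture %) - hypothetical values
    (-26.71, 27.10, "2019-03-01", 22.5),
    (-26.72, 27.11, "2019-03-01", 24.0),
    (-26.71, 27.10, "2019-03-02", 19.8),
]

cells = defaultdict(list)
for lat, lon, day, moisture in readings:
    cells[grid_key(lat, lon, day)].append(moisture)

for key, values in cells.items():
    print(key, sum(values) / len(values))   # average moisture per cell per day
```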

IBM and The Weather Company help the agriculture industry find value in weather insights. IBM Research collaborates with start-up Hello Tractor to integrate The Weather Company data, remote sensing data (e.g., satellite), and IoT data from tractors. IBM also works with crop nutrition leader Yara to include hyperlocal weather forecasts in its digital platform for real-time recommendations, tailored to specific fields or crops. IBM acquired The Weather Company in 2016 and has since been helping clients better understand and mitigate the cost of weather on their businesses. The global expansion of Watson Decision Platform for Agriculture is the latest innovation in IBM’s efforts to make weather a more predictable business consideration. Also just announced, Weather Signals is a new AI-based tool that merges The Weather Company data with a company’s own operations data to reveal how minor fluctuations in weather affect business.

The combination of rich weather forecast data from The Weather Company and IBM’s AI and Cloud technologies is designed to provide a unique capability, which is being leveraged by agriculture, energy and utility companies, airlines, retailers and many others to make informed business decisions.

[1] The UN Department of Economic and Social Affairs, “World Population Prospects: The 2017 Revision”

[2] Business Insider Intelligence, 2016 report: https://www.businessinsider.com/internet-of-things-smart-agriculture-2016-10


What if Amazon used AI to take on factories?

By ANTONY BOURNE, IFS Global Industry Director for Manufacturing

Amazon recently announced record profits of $3.03bn, breaking its own record for the third consecutive time. However, Amazon appears to be at a crossroads as to where it heads next. Beyond pouring additional energy into Amazon Prime, many have wondered whether the company may decide to enter an entirely new sector, such as manufacturing, to drive future growth. After all, it seems a logical step for a company with its finger in so many pies.

At this point, it is unclear whether Amazon would truly ‘get its hands dirty’ by manufacturing its own products on a grand scale. But what if it did decide to move into manufacturing, a sector dominated by traditional firms and one yet to see an explosive tech rival enter? After all, many similarly positioned tech giants have stuck to providing data analytics services or consulting to these firms rather than genuinely engaging with and analysing manufacturing techniques directly.

If Amazon did factories

If Amazon decided to take a step into manufacturing, it is likely that it would use the Echo range as a template of what AI can achieve. In recent years, Amazon gained expertise designing its Echo home speaker range, which features Alexa, an artificial intelligence and IoT-based digital assistant. Amazon could replicate a similar approach by deploying AI and Industrial IoT (IIoT) to create an autonomously run smart manufacturing plant. Such a plant could feature IIoT sensors to enable the machinery to be run remotely and to be self-aware, managing external inputs and outputs such as supply deliveries and the shipping of finished goods. Just-in-time logistics would remove the need for warehousing, while other machines could be placed in charge of maintenance using AI and remote access. Through this, Amazon could radically reduce the need for human labour and interaction in manufacturing, as the use of AI, IIoT and data analytics would leave humans only the role of monitoring and strategic evaluation. Amazon has been using autonomous robots in its logistics and distribution centres since 2017. As demonstrated with the Echo range, this technology is available now, with the full capabilities of Blockchain and 5G soon to be realised, allowing an exponentially increased amount of data to be received, processed and communicated.
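
A stripped-down sketch of the pattern described above: machines report readings over IIoT, and software, rather than a person, decides when to schedule maintenance or reorder supplies. The machine names, thresholds and readings are invented for illustration.

```python
# Illustrative only: software acting on IIoT sensor readings without human
# intervention. Machine names, thresholds and readings are invented.

MACHINES = {
    "press-01":  {"vibration_limit": 5.0, "parts_per_hour": 120},
    "welder-02": {"vibration_limit": 3.5, "parts_per_hour": 80},
}

def evaluate(machine: str, vibration: float, parts_in_stock: int) -> list:
    actions = []
    # Self-aware maintenance: flag the machine when vibration exceeds its limit
    if vibration > MACHINES[machine]["vibration_limit"]:
        actions.append(f"schedule maintenance for {machine}")
    # Just-in-time logistics: reorder only enough input material for the next hour
    if parts_in_stock < MACHINES[machine]["parts_per_hour"]:
        shortfall = MACHINES[machine]["parts_per_hour"] - parts_in_stock
        actions.append(f"order {shortfall} parts")
    return actions

print(evaluate("press-01", vibration=6.2, parts_in_stock=90))
```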

Manufacturing with knowledge

Theorising what Amazon’s manufacturing debut would look like provides a stark learning opportunity for traditional manufacturers. After all, whenever Amazon has entered the fray in other traditional industries such as retail and logistics, the sector has never remained the same again. The key takeaway for manufacturers is that now is the time to start leveraging the sorts of technologies and approaches to data management that Amazon is already using in its current operations. When thinking about how to implement AI and new technologies in existing environments, specific end-business goals and targets must be considered, or else the end result will fail to live up to the most optimistic of expectations. As with any target and goal, the more targeted your objectives, the more competitive and transformative your results. Once specific targets and deliverables have been considered, the resources and methods of implementation must also be considered. As Amazon did with the early automation of its distribution and logistics centres, manufacturers need to implement change gradually, focusing on small, incremental results that will generate wider momentum and the appetite for more expansive changes.

In implementing newer technologies, manufacturers need to bear in mind two fundamental aspects of implementation: software and hardware solutions. Enterprise Resource Planning (ERP) software, which is increasingly bolstered by AI, will enable manufacturers to leverage the data from connected IoT devices, sensors and automated systems on the factory floor and across the wider business. ERP software will be the key to making strategic decisions and executing routine operational tasks more efficiently, allowing manufacturers to keep on top of trends, deliver real-time forecasting and spot potential problems before they impact the wider business.
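
The kind of ‘spot problems before they impact the wider business’ check described above can be as simple as comparing incoming readings with a rolling baseline. The sketch below, with invented throughput figures, shows the idea only and does not represent any particular ERP product.

```python
# Illustrative only: flagging a production line whose throughput drifts away
# from its recent rolling average. Figures are invented; no ERP product is implied.
from collections import deque

def monitor(readings, window: int = 5, tolerance: float = 0.15):
    recent = deque(maxlen=window)
    alerts = []
    for hour, value in enumerate(readings):
        if len(recent) == window:
            baseline = sum(recent) / window
            if abs(value - baseline) / baseline > tolerance:
                alerts.append((hour, value, round(baseline, 1)))
        recent.append(value)
    return alerts

throughput = [200, 205, 198, 202, 199, 201, 160, 158, 200, 203]   # units/hour
print(monitor(throughput))   # hours where output deviated >15% from the baseline
```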

As for the hardware, stock management drones and sensor-embedded hardware will be the eyes through which manufacturers view the impact emerging technologies bring to their operations. Unlike manual stock audits and counting, drones with AI capabilities can monitor stock intelligently around production so that operations are not disrupted or halted. Manufacturers will be able to see what is working, what is going wrong, and where there is potential for further improvement and change.

Knowledge for manufacturing

Many traditional manufacturers may see Amazon as a looming threat, and smart-factory technologies such as AI and Robotic Process Automation (RPA) as a far-off utopia. However, 2019 presents a perfect opportunity for manufacturers themselves to really determine how the tech giants and emerging technologies will affect the industry. Technologies such as AI and IoT are available today, and their full benefits will only deepen as they are implemented alongside the maturing of other emerging technologies such as 5G and Blockchain in the next three to five years. Manufacturers need to analyse the needs these technologies can address and produce a proper plan for gradually implementing them against specific targets and deliverables. AI-based software and hardware solutions will fundamentally revolutionise manufacturing, yet in 2019 manufacturers just have to be willing to take the first steps in modernisation.
