
Decoding tech buzzwords


Every day, we are bombarded with jargon and buzzwords. Each industry has its fair share of jargon, but none more so than the IT industry, writes DOUG CRAWFORD, Manager of Service Delivery at Entelect.

Sometimes a buzzword emerges out of necessity – there simply is no other way to describe a new phenomenon or trend. In many cases, however, it is a shift in thinking or a reincarnation of an old concept that provides the marketing fraternity with a fresh opportunity to create a ‘buzz’.

Instead of fostering a real understanding, the use of buzzwords and jargon can create misconceptions that ultimately slow down the decision-making process. At best, people waste time trying to understand what is essentially something very simple. At worst, they miss an opportunity. Either way, new terms can be confusing, so I have decoded some of the IT industry’s up-and-coming jargon and buzzwords.

1. Big Data: Big Data refers to the large volumes of data typically collected from events triggered by particular actions, or from devices monitoring some phenomenon, and often stored in a loosely structured format. Traditional techniques for processing data sets of this size are ineffective, and new approaches are needed to collect, retrieve, summarise and visualise the data – turning it into something more meaningful.

The generally accepted defining properties of Big Data are known as the Three Vs, a term originally coined by Gartner analyst Doug Laney:

• Volume – the amount of data stored

• Velocity – the rate at which data is generated and processed

• Variety – the type and source of the data.

If each of these properties is increasing significantly, the information can be considered Big Data.

Aside from the fact that companies are collecting vast amounts of information on customers’ movements, behaviours and buying habits, why is Big Data important from a business perspective?

The old adage that ‘knowledge is power’ holds true. The more equipped people are to make decisions, the better the outcome for their business. What is relevant in the case of Big Data, however, is making sense of the information (separating the noise from the meaningful), the timing of the information, and how to use the information effectively to improve a product or service.

The current movement in Big Data aims to address these issues and is reshaping our understanding of how to process information on a much larger scale.
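
To make the volume and velocity problems concrete, here is a minimal Python sketch of one common approach: summarising an event stream incrementally rather than loading it all into memory. The event structure and device names are invented for illustration.

```python
from collections import defaultdict

def summarise_stream(events):
    """Reduce a stream of events to a compact per-device summary.

    Each event is assumed to look like {"device": "sensor-1", "value": 20.5}.
    Because the stream is processed one event at a time, memory use stays
    roughly constant no matter how large the data set grows.
    """
    counts = defaultdict(int)
    totals = defaultdict(float)
    for event in events:  # 'events' can be a generator fed from disk or a queue
        counts[event["device"]] += 1
        totals[event["device"]] += event["value"]
    return {d: {"events": counts[d], "mean": totals[d] / counts[d]} for d in counts}

# A handful of sample readings; a real feed would deliver millions of these.
readings = [
    {"device": "sensor-1", "value": 20.5},
    {"device": "sensor-1", "value": 21.1},
    {"device": "sensor-2", "value": 19.8},
]
print(summarise_stream(readings))
```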

2. Prescriptive Analytics: Making sense of the information leads us to the field of data analytics – the tools and techniques that are used to extract meaning from huge volumes of data. Analytics efforts can be broadly classified into one of three main categories – descriptive, predictive and prescriptive.

Descriptive analytics tells us what has happened and possibly why it happened. It usually involves reporting on historical data to provide insights into various business metrics. Predictive analytics attempts to tell us what may happen next, applying algorithms and various statistical models to historical data.

Prescriptive analytics, the third and most recent focus area of data analytics, takes this to the next level by recommending a course of action and presenting the likely outcomes of choosing that action, incorporating any constraints of the current environment (financial or regulatory, for example). An actual person still has to make the decisions, but prescriptive analytics can provide valuable input into scenario planning and optimisation exercises by combining business rules, statistical models and machine learning to quantify the impact of future decisions.

Many organisations have invested significant effort in descriptive analytics and reporting solutions to provide insight into historical data, and many are starting to explore the opportunities that predictive analytics has to offer. Both are necessary precursors to prescriptive analytics, which requires, at a minimum, the capability to capture and summarise large data sets efficiently. The data can then be used as input to prescriptive analytics engines.
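
A toy example helps separate the three categories. The sales figures, the naive trend line standing in for a real statistical model, and the budget constraint below are all invented for illustration:

```python
# Toy weekly demand history (units sold) - the descriptive layer reports on it.
history = [120, 135, 150, 160, 175]

# Descriptive: what happened?
average_demand = sum(history) / len(history)

# Predictive: a naive linear trend as a stand-in for a real statistical model.
growth_per_week = (history[-1] - history[0]) / (len(history) - 1)
predicted_demand = history[-1] + growth_per_week

# Prescriptive: recommend an order quantity, respecting a budget constraint.
unit_cost, budget = 10.0, 1800.0
candidates = range(0, 300, 10)  # possible order quantities to evaluate
feasible = [q for q in candidates if q * unit_cost <= budget]
# Pick the feasible quantity closest to the predicted demand.
recommended_order = min(feasible, key=lambda q: abs(q - predicted_demand))

print(f"average={average_demand:.0f}, predicted={predicted_demand:.0f}, "
      f"order={recommended_order}")
```

Descriptive analytics reports the average, predictive analytics projects the trend forward, and the prescriptive step recommends a concrete action while honouring the constraint.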

3. Software-defined infrastructure (SDI): Software-defined infrastructure builds on the capabilities of virtualisation and cloud-based services to define IT infrastructure requirements (computing power, network and storage capacity) at the software level. SDI allows application developers to describe their expectations of infrastructure in a standard and systematic way, turning computing resources into logical components that can be provisioned on the fly without human intervention.

Take today’s scenario of having to configure each element of infrastructure to support an application – machines and images, storage and mount points, firewalls and load balancers to name a few – and replace it with the simple action of identifying an SDI-enabled data centre and clicking ‘deploy’. Each resource is automatically configured as required and, more importantly, can reconfigure itself as the application and usage changes.

Defining these requirements based on policies and expected usage patterns at the software level, and incorporating them into the deployable artefacts, means that IT organisations can respond more quickly to peaks and troughs in throughput, and achieve repeatable and reliable application deployments by automating many infrastructure-related activities.

Furthermore, SDI-enabled data centres can optimise resource usage, which will drive down the cost of infrastructure. Specialists can focus on optimising specific elements of the infrastructure, such as network or storage, rather than reactively wiring and rewiring configurations to support evolving application requirements.

As was the case with Java and the standardised APIs (Application Programming Interfaces) that make up the Java Enterprise Edition framework, SDI will require a concerted effort to ensure interoperability between the tools, platforms and processes that make up the virtual data centres of the future. As with cloud services, there is a vendor battle brewing to capture the lion’s share of what is likely to be a significant market for SDI-capable services. Those vendors who actively drive and support open interfaces and APIs will have the advantage in the long term.
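
No single SDI standard has emerged yet, so any concrete syntax is speculative, but the idea of describing expectations of infrastructure declaratively can be sketched as follows. Everything here (field names, values and the pretend provisioner) is hypothetical:

```python
# A hypothetical declarative specification: the application states what it
# expects from the infrastructure, and an SDI-enabled platform provisions
# (and later reconfigures) resources to match.
app_spec = {
    "compute": {"instances": 2, "cpus": 4, "memory_gb": 8},
    "storage": {"size_gb": 100, "mount": "/data"},
    "network": {"load_balancer": True, "open_ports": [443]},
    "scaling": {"max_instances": 10, "cpu_threshold_pct": 75},
}

def provision(spec):
    """Stand-in provisioner: a real SDI platform would translate each section
    of the spec into API calls against virtualised compute, storage and
    network resources, with no manual wiring required."""
    for layer, requirements in spec.items():
        print(f"provisioning {layer}: {requirements}")

provision(app_spec)
```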

4. DevOps: The term DevOps has been around for some time now, and the concept even longer. However, only in recent years has it started gaining widespread acceptance as standard practice in the development community.

DevOps is to development and operations teams as Agile is to development teams and business. Where the Agile movement promotes increased collaboration between development teams and business users, DevOps looks at integrating operations activities much earlier and more tightly into the software-development life cycle. Both have the same goal of enabling software projects to respond more effectively to change.

In reality, DevOps is an extension of Agile as we know it today, one that includes operations and support functions. The authors of the Agile Manifesto certainly never explicitly excluded operations and support from their widely accepted list of values and principles, but in the experience of many, the focus on Agile projects has been biased towards improving collaboration between business users and the development team, rather than between the development team and the operations team.

Yes, it is true that the operations team is implicitly included in the concept of cross-functional development teams (see below), but, in reality, IT operations in many organisations are still very much an isolated function, which is exactly the barrier that DevOps is trying to eliminate.

5. Cross-functional teams: The concept of a cross-functional team is simple. The development team has all the skills necessary to deliver a piece of working software into production, which may include activities such as user experience design, database design, server configuration and, of course, writing code. Where product development teams are concerned, businesses adopting Agile practices should be assembling cross-functional teams.

This is not an excuse for hiring fewer individuals and expecting them to be jacks of all trades: specialisation is important and a necessity when solving complex problems that require focus and experience. Having a single, co-located team that can take something from concept to reality, however, eliminates the external dependencies that plague many of today’s software development teams, especially in large organisations.

Aside from efficiency and knowledge sharing, the main argument for isolated teams defined by skill or technology is the degree of control over standards and governance within a particular domain. This argument is valid, but only for operational and ‘commoditised’ services such as desktop support and hardware infrastructure. As soon as product development enters the mix, the effectiveness of the team becomes more important than its efficiency. Assuming differentiation is one of the main objectives, product development teams should be optimised for effectiveness rather than efficiency, since development in this scenario is a creative process, one that should not be constrained by red tape and corporate IT governance.

If companies want to increase their chances of creating a product that delights their customers, they should include specialists and designers in the team as full-time members until their services are no longer deemed critical, which will probably only be after several production releases. If the goal is instead to minimise IT costs at the expense of rapid innovation, create a dedicated team that outsources its services to several internal development teams.

While new ‘buzzwords’ will keep appearing in the ICT space, it is crucial that decision makers ensure a practical and simplified understanding before making any kind of investment on behalf of their organisation. Designed to excite and compel, these buzzwords often do not describe the actual function or benefits of a particular concept.

We encourage business leaders to screen potential IT suppliers not by the terminology and complicated jargon they offer, but rather by how simply and understandably they are able to communicate their solutions.


Smart home arrives in SA

The smart home is no longer a distant vision confined to advanced economies, writes ARTHUR GOLDSTUCK.


The smart home is a wonderful vision for controlling every aspect of one’s living environment via remote control, apps and sensors. But, because it is both complex and expensive, there has been little appetite for it in South Africa.

The two main routes for smart home installation are both fraught with peril – financial and technical.

The first is to call on a specialist installation company. Surprisingly, there are many in South Africa. Google “smart home” +”South Africa”, and thousands of results appear. The problem is that, because the industry is so new, few have built up solid track records and reputations. Costs vary wildly, few standards exist, and the cost of after-sales service will turn out to be more important than the upfront price.

The second route is to assemble the components of a smart home, and attempt self-installation. For the non-technical, this is often a non-starter. Not only does one need a fairly good knowledge of Wi-Fi configuration, but also a broad understanding of the Internet of Things (IoT) – the ability for devices to sense their environment, connect to each other, and share information.

The good news, though, is that it is getting easier and more cost-effective all the time.

My first efforts in this direction started a few years ago with finding smart plugs on Amazon.com. These are power adaptors that turn regular sockets into “smart sockets” by adding Wi-Fi and an on-off switch, among other features. A smart lightbulb was sourced from Gearbest in China. At the time, these were the cheapest and most basic elements for a starter smart home environment.

Via a smartphone app, the light could be switched on from the other side of the world. It sounds trivial and silly, but on such basic functions the future is slowly built.
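
Under the hood, that remote switch is usually just a small network message. The sketch below assumes a hypothetical plug that accepts a JSON command over HTTP on the local network; real devices speak vendor-specific protocols (often MQTT or a cloud API), so the address and payload here are placeholders:

```python
import json
import urllib.request

# Placeholder address: substitute whatever API your plug actually exposes.
PLUG_URL = "http://192.168.1.50/api/switch"

def set_plug(on: bool) -> None:
    """Send an on/off command to a Wi-Fi smart plug on the local network."""
    payload = json.dumps({"state": "on" if on else "off"}).encode("utf-8")
    request = urllib.request.Request(
        PLUG_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(request) as response:
        print("plug replied:", response.status)

set_plug(True)  # lights on, whether from the couch or another continent
```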

Fast forward a year or two, and these components are available from hundreds of outlets, they have plummeted in cost, and the range of options is bewildering. That, of course, makes the quest even more daunting. Who can be trusted for quality, fulfilment and after-sales support? Which products will be obsolete in the next year or two as technology advances even more rapidly?

These are some of the challenges that a leading South African technology distributor, Syntech, decided to address in adding smart home products to its portfolio. It selected LifeSmart, a global brand with proven expertise in both IoT and smart home products.

Equally significantly, LifeSmart combines IoT with artificial intelligence and machine learning, meaning that the devices “learn” the best ways of connecting, sharing and integrating new elements. Because they all fall under the same brand, they are designed to integrate with the LifeSmart app, which is available for Android and iOS phones, as well as Android TV.



Matrics must prepare for AI


By Vian Chinner, data scientist, CEO and founder of Xineoh, a machine learning company specialising in consumer behaviour prediction.

Many in the matric class of 2018 are currently weighing up their options for the future. With the country’s high unemployment rate casting a shadow over their opportunities, these future jobseekers have been encouraged to look into which skills are required by the market, tailoring their occupational training to align with demand and thereby improving their chances of finding a job.

With rapid innovation and development in the field of artificial intelligence (AI), all careers – including high-demand professions like engineering, teaching and electrical work – will look significantly different in the years to come.

Notably, the third wave of internet connectivity, in which our physical world begins to merge with the internet, is upon us. This is evident in how widely AI is being implemented across industries, as well as in our homes with the use of automation solutions and bots like Siri, Google Assistant, Alexa and Microsoft’s Cortana. Vast amounts of data are collected from the physical world every day, and AI makes sense of it all.

Not only do new technologies like AI open new career paths, such as those specialising in data science, but they will also modify careers that already exist.

So, what should matriculants be considering when deciding what route to take?

For highly academic individuals who are exceptionally strong in mathematics, data science is definitely the way to go. There is, and will continue to be, massive demand internationally as well as locally: Element-AI has noted that there are only between 0 and 100 data scientists in South Africa, with the true number probably closer to 0.

To get a foot in the door as a data scientist, practical experience working with an AI-focused business is essential. Students should consider getting an internship while they are studying, or going straight into an internship, learning on the job and taking specialist online courses from institutions like Stanford University and MIT as they go.

This career path is, however, limited to the highly academic and mathematically gifted. But the technology is inevitably going to overlap with all other professions, so those looking to begin their careers should take note of which skills will be in demand in future, and which will be made redundant by AI.

In the next few years, technicians who are able to install and maintain new technology will be highly sought after. On the other hand, many entry-level jobs will likely be taken over by AI – from the slicing and dicing currently done by assistant chefs to the laying of bricks by labourers in the building sector.

As a rule, students should look at the skills required for the job one step up from an entry-level position and work towards developing these. Those training to be journalists, for instance, should work towards the skill level of an editor, and a bookkeeping trainee towards the role of a financial consultant.

This also means that new workforce entrants should be prepared to walk into a more demanding role, with more responsibility, than perhaps previously anticipated, and that the country’s education and training system should adapt to the shift in required skills.

The matric class of 2018 has completed its schooling in the information age, and we should be equipping its members, and future generations, for the future market – AI is central to this.


Copyright © 2018 World Wide Worx