Every day, we are bombarded with jargon and buzzwords. Each industry has its fair share of jargon, but none so much as the IT industry, writes DOUG CRAWFORD, Manager of Service Delivery at Entelect.
Sometimes a buzzword emerges out of necessity – there simply is no other way to describe a new phenomenon or trend. In many cases, however, it is a shift in thinking or a reincarnation of an old concept that provides the marketing fraternity with a fresh opportunity to create a ‘buzz’.
Instead of fostering a real understanding, the use of buzzwords and jargon can create misconceptions that ultimately slow down the decision-making process. At best, people waste time trying to understand what is essentially something very simple. At worst, they miss an opportunity. Either way, new terms can be confusing, so I have decoded some of the IT industry’s up-and-coming jargon and buzzwords.
1. Big Data: Big Data refers to the large amounts of data that are typically collected from events triggered by particular actions or devices monitoring some phenomena, often stored in a loosely structured format. Traditional techniques for processing these large data sets are ineffective, and new approaches are necessary to collect, retrieve, summarise and visualise it – turning the data into something more meaningful.
The generally accepted defining properties of Big Data are known as the Three Vs, which Gartner analyst Doug Laney originally coined:
· Volume – the amount of data stored
· Velocity – the rate at which data is generated and processed
· Variety – the type and source of the data.
If each of these properties is increasing significantly, the information can be considered to be Big Data.
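To make that concrete, here is a minimal sketch in Python of the streaming approach such data demands. It assumes a hypothetical newline-delimited JSON event log called events.log, with made-up ‘device’ and ‘reading’ fields; the point is that the file is summarised one event at a time rather than loaded into memory whole:

```python
import json
from collections import Counter

# Minimal sketch: summarise a large, loosely structured event log
# without ever holding more than one event in memory. The file name
# and the "device"/"reading" fields are hypothetical.
def summarise(path):
    counts = Counter()      # events per device (volume and variety)
    total, n = 0.0, 0
    with open(path) as f:
        for line in f:      # stream events as they arrive (velocity)
            event = json.loads(line)
            counts[event.get("device", "unknown")] += 1
            if "reading" in event:  # tolerate missing fields (variety)
                total += event["reading"]
                n += 1
    average = total / n if n else None
    return counts.most_common(5), average

top_devices, average_reading = summarise("events.log")
print(top_devices, average_reading)
```

Real Big Data tooling distributes this same pattern across many machines, but the principle – summarise in one pass and keep only the aggregates – is the same.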
Aside from the fact that companies are collecting vast amounts of information on customers’ movements, behaviours and buying habits, why is Big Data important from a business perspective?
The old adage of ‘knowledge is power’ holds true. The more equipped people are to make decisions, the better the outcome for their business. What is relevant in the case of Big Data, however, is making sense of the information (separating the noise from the meaningful), the timing of the information, and how to use the information effectively to improve a product or service.
The current movement in Big Data aims to address these issues and is reshaping our understanding of how to process information on a much larger scale.
2. Prescriptive Analytics: Making sense of the information leads us to the field of data analytics – the tools and techniques that are used to extract meaning from huge volumes of data. Analytics efforts can be broadly classified into one of three main categories – descriptive, predictive and prescriptive.
Descriptive analytics tells us what has happened and possibly why it happened. It usually involves reporting on historical data to provide insights into various business metrics. Predictive analytics attempts to tell us what may happen next, applying algorithms and various statistical models to historical data.
Prescriptive analytics, the third and most recent focus area of data analytics, takes this a step further by recommending a course of action and presenting the likely outcomes of taking it, incorporating any constraints of the current environment (financial or regulatory, for example). A person still has to make the decisions, but prescriptive analytics can provide valuable input into scenario planning and optimisation exercises by combining business rules, statistical models and machine learning to quantify the impact of future decisions.
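A toy example helps keep the three categories apart. The sketch below, in Python with entirely hypothetical figures, treats historical sales as the descriptive layer, fits a simple trend to forecast next month (predictive), then applies a budget constraint to recommend an order quantity, which is the prescriptive step:

```python
# All figures and names are hypothetical.
monthly_sales = [100, 110, 125, 138, 150, 165]  # descriptive: what happened

# Predictive: fit a least-squares linear trend and project the next month.
n = len(monthly_sales)
x_mean = (n - 1) / 2
y_mean = sum(monthly_sales) / n
slope = sum((x - x_mean) * (y - y_mean)
            for x, y in enumerate(monthly_sales))
slope /= sum((x - x_mean) ** 2 for x in range(n))
forecast = y_mean + slope * (n - x_mean)

# Prescriptive: recommend a stock order covering the forecast plus a
# 10% buffer, subject to a budget constraint (the business rule).
unit_cost, budget = 8.0, 1300.0
recommended = min(int(forecast * 1.1), int(budget // unit_cost))
print(f"forecast: {forecast:.0f} units, recommended order: {recommended}")
```

A production-grade prescriptive engine would swap the trend line for a proper statistical model and the single constraint for an optimiser, but the flow – predict, constrain, recommend – is the same.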
Many organisations have invested significant effort in descriptive analytics and reporting solutions to provide insight into historical data, and many are starting to explore the opportunities that predictive analytics has to offer. Both are necessary precursors to prescriptive analytics, which requires, at a minimum, the capability to capture and summarise large data sets efficiently. The data can then be used as input to prescriptive analytics engines.
3. Software-defined infrastructure (SDI): Software-defined infrastructure builds on the capabilities of virtualisation and cloud-based services to define IT infrastructure requirements – computing power, network capacity and storage capacity – at the software level. SDI allows application developers to describe their expectations of infrastructure in a standard and systematic way, turning computing resources into logical components that can be provisioned on the fly without human intervention.
Take today’s scenario of having to configure each element of infrastructure to support an application – machines and images, storage and mount points, firewalls and load balancers to name a few – and replace it with the simple action of identifying an SDI-enabled data centre and clicking ‘deploy’. Each resource is automatically configured as required and, more importantly, can reconfigure itself as the application and usage changes.
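The sketch below, in Python with entirely hypothetical spec fields and no real vendor API, shows the shape of that idea: the application declares what it needs, and a provisioning routine, rather than a person, walks through the declaration:

```python
# Hypothetical, minimal illustration of software-defined infrastructure:
# the requirements live in code, not in a manual configuration run-book.
app_spec = {
    "compute": {"instances": 2, "cpus": 4, "memory_gb": 16},
    "storage": {"volumes": [{"mount": "/data", "size_gb": 500}]},
    "network": {"load_balancer": True, "open_ports": [443]},
    "scaling": {"max_instances": 10, "cpu_threshold": 0.75},
}

def provision(spec):
    """Walk the declared resources and configure each one in turn."""
    for resource, config in spec.items():
        # A real SDI platform would call out to its orchestration layer
        # here; this sketch simply reports what would be created.
        print(f"provisioning {resource}: {config}")

provision(app_spec)  # the 'deploy' click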
Defining these requirements based on policies and expected usage patterns at the software level, and incorporating them into the deployable artefacts, means that IT organisations can respond more quickly to peaks and troughs in throughput, and achieve repeatable and reliable application deployments by automating many infrastructure-related activities.
Furthermore, SDI-enabled data centres can optimise resource usage, which will drive down the cost of infrastructure. Specialists can focus on optimising specific elements of the infrastructure, such as network or storage, rather than reactively wiring and rewiring configurations to support evolving application requirements.
As was the case with Java and the standardised APIs (Application Programming Interfaces) that make up the Java Enterprise Edition framework, SDI will require a concerted effort to ensure interoperability between the tools, platforms and processes that make up the virtual data centres of the future. As with cloud services, there is a vendor battle brewing to capture the lion’s share of what is likely to be a significant market for SDI-capable services. Those vendors who actively drive and support open interfaces and APIs will have the advantage in the long term.
4. DevOps: The term DevOps has been around for some time now, and the concept even longer. However, only in recent years has it started gaining widespread acceptance as standard practice in the development community.
DevOps is to development and operations teams as Agile is to development teams and business. Where the Agile movement promotes increased collaboration between development teams and business users, DevOps looks at integrating operations activities much earlier and more tightly into the software-development life cycle. Both have the same goal of enabling software projects to respond more effectively to change.
In reality, DevOps is an extension of Agile as we know it today, one that includes operations and support functions. The authors of the Agile Manifesto certainly never explicitly excluded operations and support from their widely accepted list of values and principles, but in the experience of many, the focus of Agile projects has always been biased towards improving collaboration between business users and the development team, rather than between the development team and the operations team.
Yes, it is true that the operations team is implicitly included in the concept of cross-functional development teams (see below), but, in reality, IT operations in many organisations remains very much an isolated function, which is exactly the barrier that DevOps is trying to eliminate.
5. Cross-functional teams: The concept of a cross-functional team is simple: the team has all the skills necessary to deliver a piece of working software into production, which may include activities such as user experience design, database design, server configuration and, of course, writing code. Where product development is concerned, businesses adopting Agile practices should be assembling cross-functional teams.
This is not an excuse for hiring fewer individuals and expecting them to be Jacks of all trades: specialisation is important and a necessity when solving complex problems that require focus and experience. Having a single, co-located team that can take something from concept to reality eliminates the external dependencies that plague many of today’s software development teams, especially in large organisations.
Aside from efficiency and knowledge sharing, the main argument for isolated teams defined by skill or technology is the degree of control over standards and governance within a particular domain. This argument is valid, but only for operational and ‘commoditised’ services such as desktop support and hardware infrastructure. As soon as product development enters the mix, the effectiveness of the team becomes more important than its efficiency. Assuming differentiation is one of the main objectives, product development teams should be optimised for effectiveness, since development in this scenario is a creative process, one that should not be constrained by red tape and corporate IT governance.
If companies want to increase their chances of creating a product that delights their customers, they should include specialists and designers in the team as full-time members until their services are no longer deemed critical, which will probably only be after several production releases. If you want to minimise your IT costs at the expense of rapid innovation, create a dedicated specialist team that offers its services to several internal development teams.
While new ‘buzzwords’ will keep appearing in the ICT space, it is crucial that decision makers ensure a practical and simplified understanding before making any kind of investment on behalf of their organisation. Designed to excite and compel, these buzzwords often do not describe the actual function or benefits of a particular concept.
We encourage business leaders to screen potential IT suppliers not by the terminology and complicated jargon they offer, but rather by how simply and understandably they are able to communicate their solutions.
When will we stop calling them phones?
If you don’t remember when phones were only used to talk to people, you may wonder why we still use this term for handsets, writes ARTHUR GOLDSTUCK, on the eve of the 10th birthday of the app.
Do you remember when handsets were called phones because, well, we used them to phone people?
It took nearly 120 years from the invention of the telephone to the use of phones to send text.
Between Alexander Graham Bell coining the term “telephone” in 1876 and Finland’s two main mobile operators allowing SMS messages between consumers in 1995, only science fiction writers and movie-makers imagined instant communication evolving much beyond voice. Even when BlackBerry shook the business world with email on a phone at the end of the last century, most consumers were adamant they would stick to voice.
It’s hard to imagine today that the smartphone as we know it has been with us for less than 10 years. Apple introduced the iPhone, the world’s first mass-market touchscreen phone, in June 2007, but it is arguable that it was the advent of the app store in July the following year that changed our relationship with phones forever.
That was the moment when the revolution in our hands truly began, when it became possible for a “phone” to carry any service that had previously existed on the World Wide Web.
Today, most activity carried out by most people on their mobile devices would probably follow the order of social media in first place – Facebook, Twitter, Instagram and LinkedIn all jostling for attention – and instant messaging in close second, thanks to WhatsApp, Messenger, Snapchat and the like. Phone calls – using voice, that is – probably don’t even take third place, but play fourth or fifth fiddle to mapping and navigation, driven by Google Maps and Waze, and transport, thanks to Uber, Taxify, and other support services in South Africa like MyCiti, Admyt and Kaching.
Despite the high cost of data, free public Wi-Fi is also driving an explosion in the use of streaming video – whether YouTube, Netflix, Showmax, or GETblack – and streaming music, particularly with the arrival of Spotify to compete with Simfy Africa.
Who has time for phone calls?
The changing of the phone guard in South Africa was officially signalled last week with the announcement of Vodacom’s annual results. Voice revenue for the 2018 financial year ending 31 March had fallen by 4.6%, to make up 40.6% of Vodacom’s revenue. Total revenue had grown by 8.1%, which meant voice seriously underperformed the group, and had fallen by four percentage points as a share of revenue, from 2017’s 44.6%.
The reason? Data had not only outperformed the group, increasing revenue by 12.8%, but it had also risen from 39.7% to 42.8% of group revenue.
This means that data has not only outperformed voice for the first time – as had been predicted by World Wide Worx a year ago – but it has also become Vodacom’s biggest contributor to revenue.
That scenario is being played out across all mobile network operators. In the same way, instant messaging began destroying SMS revenues as far back as five years ago – to the extent that SMS barely gets a mention in annual reports.
Data overtaking voice revenues signals the demise of voice as the main service and key selling point of mobile network operators. It also points to mobile phones – let’s call them handsets – shifting their primary focus. Voice quality will remain important, but now more as a subset of audio quality than of connectivity. Sound quality will become a major differentiator as these devices become primary platforms for movies and music.
Contact management, privacy and security will become critical features as the handset becomes the storage device for one’s entire personal life.
Integration with accessories like smartwatches and activity monitors, earphones and earbuds, virtual home assistants and virtual car assistants, will become central to the functionality of these devices. Why? Because the handsets will control everything else? Hardly.
More likely, these gadgets will become an extension of who we are, what we do and where we are. As a result, they must be context aware, and also context compatible. This means they must hand over appropriate functions to appropriate devices at the appropriate time.
I need to communicate only using my earpiece? The handset must make it so. I have to use gesture control, and therefore some kind of sensor placed on my glasses, collar or wrist? The handset must instantly surrender its centrality.
There are numerous other scenarios and technology examples, many out of the pages of science fiction, that point to the changing role of the “phone”. The one thing that’s obvious is that it will be silly to call it a phone for much longer.
MTN 5G test gets 520Mbps
MTN and Huawei have launched Africa’s first 5G field trial with an end-to-end Huawei 5G solution.
The field trial demonstrated a 5G Fixed-Wireless Access (FWA) use case with Huawei’s 5G 28GHz mmWave Customer Premises Equipment (CPE) in a real-world environment in Hatfield, Pretoria, South Africa. Downlink speeds of 520Mbps and uplink speeds of 77Mbps were attained throughout the trial.
“These 5G trials provide us with an opportunity to future-proof our network and prepare it for the evolution of these new generation networks. We have gleaned invaluable insights about the modifications that we need to make to our core, radio and transmission network from these pilots. It is important to note that the transition to 5G is not just a flick of a switch, but a roadmap that requires technical modifications and network architecture changes to ensure that we meet the standards that this technology requires. We are pleased that we are laying the groundwork that will lead to the full realisation of the boundless opportunities that are inherent in the digital world,” says Babak Fouladi, Group Chief Technology & Information Systems Officer at MTN Group.
Giovanni Chiarelli, Chief Technology and Information Officer for MTN SA, said: “Next generation services such as virtual and augmented reality, ultra-high definition video streaming, and cloud gaming require massive capacity and higher user data rates. The use of millimetre-wave spectrum bands is one of the key 5G enabling technologies to deliver the capacity and massive data rates required for 5G’s Enhanced Mobile Broadband use cases. MTN and Huawei’s joint field trial of the first 5G mmWave Fixed-Wireless Access solution in Africa will also pave the way for a fixed-wireless access solution that is capable of replacing conventional fixed access technologies, such as fibre.”
“Huawei is continuing to invest heavily in innovative 5G technologies,” said Edward Deng, President of Wireless Network Product Line of Huawei. “5G mmWave technology can achieve unprecedented fibre-like speed for mobile broadband access. This trial has shown the capabilities of 5G technology to deliver an exceptional user experience for Enhanced Mobile Broadband applications. With customer-centric innovation in mind, Huawei will continue to partner with MTN to deliver best-in-class advanced wireless solutions.”
“We are excited about the potential the technology will bring, as well as the potential advancements we will see in the fields of medicine, entertainment and education. MTN has been investing heavily to further improve our network, with the recent ‘Best in Test’ and MyBroadband best network recognition affirming this. With our focus on providing South Africans with the best customer experience, speedy allocation of spectrum can help bring more of these technologies to our customers,” says Chiarelli.