Flu season is here again, and many South Africans will be turning to their trusted dose of antibiotics in the hope of shaking off the illness as quickly as possible. Despite this entrenched routine, South Africa is facing a serious problem.
A study led by Julia Gasson of the Western Cape Department of Health has revealed that local clinics are ignoring the guidelines on prescribing antibiotics, with formal procedures followed only 45% of the time.
Another global study, by the Center for Disease Dynamics, Economics & Policy (CDDEP), Princeton University, ETH Zurich and the University of Antwerp, analysed human antibiotic consumption in 76 countries and found it has increased worldwide from 11.3 to 15.7 defined daily doses (DDDs) per 1,000 inhabitants per day between 2000 and 2015.
The results of these actions are far-reaching. Antibiotics are losing their effectiveness as bacteria develop resistance to them. And the problem is more complicated than first thought: in one case, patients with antibiotic-resistant infections caused by E. coli and Klebsiella pneumoniae were found to carry a specific resistance gene that originated on Chinese pig farms.
A crucial step in combatting antibiotic resistance lies in technological evolution
As a solution, scientists are now looking to the field of bacterial genomics to tackle the global issue of antibiotic resistance. Genomics is the branch of molecular biology concerned with the structure, function, evolution and mapping of genomes. It can provide clarity around resistance mechanisms and even the evolution of various strains of disease. Genomics has also become highly automated, accelerated by a combination of parallel processing and advances in data management.
Due to these advancements, we are now entering a world of personalised medicine, in which individual patients can be sequenced and comparative genomic analysis can provide vital information on the progression of resistant strains of disease. Within this context, technology plays a pivotal role in allowing bioinformaticians to work with data and transfer it to clinicians efficiently and in good time.
NetApp is helping to achieve this through our ONTAP Cloud storage software, which protects genomic data while adding the flexibility to simplify use of the public cloud. We are seeing this process in action through our work with PetaGene, a team of Cambridge University PhDs who required a novel approach to the problem of storing genomic data. Additionally, we are deploying the new and improved NetApp StorageGRID, which now automates tamper-proof retention of critical personal data. With an increased focus on data analytics, StorageGRID customers can retain and manage an unlimited amount of rich media, which is particularly useful in the field of genomics.
The key benefits of applying technology to scientific research
Beyond what generic data reduction techniques offer, applying this approach to data management in the pursuit of tackling antibiotic resistance brings a few key benefits:
- Increased collaborative efficiency, with smaller, more portable files transferable over the NetApp data fabric
- Reduced storage capacity requirements and lower costs
- The flexibility of the cloud: with the NetApp data fabric, files can be moved seamlessly and securely to and from the cloud
- Interoperability with existing workflows and formats
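The storage saving behind the first two benefits comes down to compression of highly repetitive sequencing data. The sketch below is a generic illustration using standard gzip on a synthetic FASTQ-style record; it is not PetaGene's or NetApp's actual method, and the data is invented for the example:

```python
import gzip

# A small synthetic FASTQ-style record standing in for real sequencing output.
# Real genomic files run to gigabytes; this toy string just shows the idea.
record = "@read1\nGATTACAGATTACAGATTACA\n+\nIIIIIIIIIIIIIIIIIIIII\n"
raw = (record * 1000).encode("utf-8")

# Repetitive sequence and quality strings compress extremely well.
compressed = gzip.compress(raw)

print(f"raw: {len(raw)} bytes, compressed: {len(compressed)} bytes")
```

Specialised genomic compressors exploit domain knowledge (read structure, quality-score distributions) to go well beyond what a general-purpose tool like gzip achieves, which is what makes the files smaller and more portable for collaboration.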
The field of genomics, underpinned by efficient data management, is the way forward in combatting the global issue of antibiotic resistance. This must, of course, be coupled with the necessary behavioural changes, with prescription guidelines followed to a tee. In South Africa, a country marred by drug-resistant infectious diseases ranging from HIV to malaria, the need to simplify collaboration in genomics through improved data management has never been more crucial.
The future of the book… and of reading
Many fear that the days of the printed book are numbered. In truth, it is not so much the book that is evolving, but the very act of reading, argues ARTHUR GOLDSTUCK.
Let’s talk about a revolutionary technology. One that has already changed the course of civilisation. It is also a dangerous technology, one that is spreading previously hidden knowledge among people who may misuse and abuse the technology in ways we cannot imagine.
Everyone reading this is a link in a chain of this dangerous and subversive technology.
I’m talking, of course, about the printed book.
To understand how the book has changed society, though, we must also understand how the book has changed reading. That, in turn, will help us understand the future of the book.
Because the future of the book is in fact the future of reading.
Let’s go back to a time some may remember as their carefree youth. The year 400.
Wearables enter enterprise
Regardless of whether wearables lack the mobility or security capabilities to fully support the ways in which we now work, organisations remain keen and willing to unlock the potential of such devices, says RONALD RAVEL, Director B2B South Africa, Toshiba South Africa.
The idea of integrating wearable technology into enterprise IT infrastructure is one which, while mooted for several years now, has yet to take off in earnest. The reasons behind previous false dawns vary. What is evident, however, is that – regardless of whether wearables to date have lacked the mobility or security capabilities to fully support the ways in which we now work – organisations remain keen and willing to unlock the potential of such devices. According to ABI Research, global wearable device shipments will reach 154 million by 2021 – a significant jump from approximately 34 million in 2016.
This projected increase demonstrates a confidence among CIOs which perhaps belies the lack of success in the market to date, but at the same time reflects a ripening of conditions that could make 2018 the year in which wearables finally take off in the enterprise. A maturing IoT market, advances in the development of Augmented Reality (AR), and the impending arrival of 5G – estimated to reach a subscription base of half a billion by 2022 – are all factors that will drive the capabilities of wearable devices.
Perhaps the most significant catalyst behind wearables is the rise of Edge Computing. As the IoT market continues to thrive, so too must IT managers be able to securely and efficiently address the vast amounts of data generated by it. Edge Computing helps organisations to resolve this challenge, while at the same time enabling new methods of gathering, analysing and redistributing data and derived intelligence. Processing data at the edge reduces strain on the cloud so users can be more selective of the data they send to the network core. Such an approach also makes it easier for cyber-attacks to be identified at an early stage and restricted to a device at the edge. Data can then be scanned and encrypted before it is sent to the core.
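The edge pattern described above – filter data locally, then protect and forward only the selected readings to the network core – can be sketched in a few lines. This is a generic illustration using only the Python standard library; the sensor names, field names and threshold are invented for the example, and the SHA-256 digest merely stands in for the scanning/encryption step, which in practice would use a vetted cryptographic channel such as TLS:

```python
import hashlib
import json

# Simulated sensor readings arriving at an edge device (names invented).
readings = [
    {"sensor": "temp-01", "value": 21.5},
    {"sensor": "temp-01", "value": 87.2},  # anomalous spike worth forwarding
    {"sensor": "temp-02", "value": 22.1},
]

ALERT_THRESHOLD = 60.0  # illustrative cut-off for "interesting" data

def process_at_edge(batch):
    """Keep only readings worth sending on, easing the load on the core."""
    selected = [r for r in batch if r["value"] > ALERT_THRESHOLD]
    payload = json.dumps(selected).encode("utf-8")
    # Integrity digest as a stand-in for the scan-and-encrypt step.
    digest = hashlib.sha256(payload).hexdigest()
    return payload, digest

payload, digest = process_at_edge(readings)
print(f"forwarding {len(payload)} bytes to core, sha256={digest[:12]}")
```

The design point is simply that the bulk of the data never leaves the edge: only the one anomalous reading is serialised and forwarded, which is what reduces strain on the cloud and narrows the attack surface the core has to inspect.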
As more and more wearable devices and applications are developed with business efficiency and enablement in mind, Edge Computing’s role will become increasingly valuable – helping organisations to achieve $2 trillion in extra benefits over the next five years, according to Equinix and IDC research.
Where will wearables have an impact?
At the same time as these technological developments are aiding the rise of wearables, CIOs across various sectors are recognising how best to use these devices to enhance mobile productivity within their organisations – another factor helping to solidify the market. In particular, it is industries with a heavy reliance on frontline and field workers – such as logistics, manufacturing, warehousing and healthcare – that are adopting solutions like AR smart glasses. The use case for each is specific to the sector, or even to the organisation itself, but this flexibility is often what makes such devices so appealing. While wearables for the more traditional office worker may offer a different, though no more efficient, way to carry out everyday tasks such as checking emails and answering phone calls, for frontline and field workers they are being tailored to meet unique demands and enhance the ability to perform specific tasks.
Take for example boiler engineers conducting an annual service, who could potentially use AR smart glasses to overlay the schematics of the boiler to enable a hands-free view of service procedures – meaning that when a fault becomes a barrier to repair, the engineer is able to use collaboration software to call for assistance from a remote expert. Elsewhere, in the healthcare sector smart eyewear may support clinicians with hands-free identification of patient records, medical procedures and information on medicines and results.
Such examples demonstrate the immediate and diverse potential of wearables across different verticals. With enterprise IT infrastructure now in the position to embrace such technologies, it is this ability to deliver bespoke functionality to mobile workers which will be the catalyst for continued uptake throughout 2018 and beyond.