By remaining compliant with new data protection legislation, companies can generate even greater value from their data, says CLEO BECKER, Regional Counsel Sub-Saharan Africa, Middle East, Pakistan, Turkey and Israel for Hitachi Data Systems.
The conversation around data has become increasingly complex. With multiple pieces of data-focused legislation in play, companies need to know not only how to unlock the value in their data, but also how to make sure they remain compliant.
With the General Data Protection Regulation (GDPR) coming into effect on 25 May 2018, it’s important for South African businesses that operate in the EU to understand exactly how they will be affected. According to the legislation, any company that processes the personal data of EU residents in connection with the offering of goods or services, or monitors the behaviour of those residents, may need to comply.
GDPR will affect SA businesses
There are a number of key requirements set out in Article 5 of the GDPR. These include the responsibility for companies to process personal data lawfully, fairly and in a transparent manner; to ensure that personal data is kept accurate and up to date; and to retain it only for as long as is necessary to achieve the purposes for which it was collected.
There are further requirements stipulated in the legislation of which companies need to take note. One of the most topical of these may be the obligation for personal data to be processed in a manner that ensures appropriate security of that data, including protection against unauthorised or unlawful processing and against accidental loss, destruction or damage. This is particularly the case due to the growing threat of cyberattacks which target personal data.
These requirements make it essential for companies to know what kind of personal data they hold and where it is stored.
How POPI fits into the picture
To complicate matters, South African companies also need to comply with the Protection of Personal Information Act (POPI).
Luckily, the provisions across the two pieces of data protection legislation are so similar (save for naming conventions) that a company which complies with the GDPR should find POPI compliance smooth sailing. For example, both POPI and the GDPR require compliance with certain principles when processing personal data; both require that the regulator be notified in the case of a privacy breach (although notification time periods differ); both call for a data protection officer to be appointed; and both place restrictions on and requirements for what personal data can be sent outside of the EU (in the case of the GDPR) and South Africa (in the case of POPI).
Unlike the GDPR, we don’t know when POPI will come into effect. What we do know is that there will be a one-year transitional period for companies to become compliant once the date of implementation is announced.
Make sure you’re ready
Both POPI and the GDPR require companies to identify all the personal data they hold, keep that personal data up to date and accurate, set retention policies around each piece of personal data and put appropriate security safeguards in place to prevent unauthorised access, loss, damage, modification or destruction of that data. This means businesses need to make sure they employ industry best practice when it comes to their technology, IT processes and security, ensuring they have clear policies in place; that their staff are properly trained; and that there is adequate protection in their supplier contracts.
To meet these security requirements, companies may also wish to consider technology functionality such as encryption, and ensure that they back up or replicate their data in accordance with best practices to avoid losses.
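The retention requirement described above can be sketched in code. This is a minimal illustration only — the record structure, purposes and retention periods are hypothetical examples, not drawn from either piece of legislation:

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Hypothetical record and policy structures -- illustrative only.
@dataclass
class PersonalRecord:
    subject_id: str
    purpose: str          # the purpose for which the data was collected
    collected_on: date

# Retention periods per purpose, in days (example values).
RETENTION_POLICY = {
    "marketing": 365,
    "billing": 5 * 365,
}

def records_due_for_deletion(records, today=None):
    """Return records held longer than their stated purpose allows."""
    today = today or date.today()
    expired = []
    for rec in records:
        limit = RETENTION_POLICY.get(rec.purpose)
        if limit is not None and today - rec.collected_on > timedelta(days=limit):
            expired.append(rec)
    return expired

records = [
    PersonalRecord("A1", "marketing", date(2016, 1, 1)),
    PersonalRecord("B2", "billing", date(2017, 6, 1)),
]
print([r.subject_id for r in records_due_for_deletion(records, today=date(2018, 5, 25))])
# → ['A1']
```

In practice such a check would run against a data inventory rather than in-memory records, but the principle is the same: every piece of personal data carries a purpose, and the purpose determines how long it may be kept.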
How tech can help
Technology will play a big role in efficient compliance with GDPR and POPI as large amounts of data need to be clearly identified and stored for certain periods.
Technology can help companies make sense of their data and increase efficiencies through automation. For example, it can assist in responding to requests from both data subjects and regulators in a timely manner by making the data easily searchable. Once personal data is identified, technology can be used to set further controls around who accesses that data and how long it needs to be retained. Service providers like Hitachi can assist with the compliance journey by identifying what data the company holds, where that data is located (on premises or in the cloud) and assessing whether it includes personal data or sensitive personal data – particularly as different rules apply to each.
Once the personal data is identified, Hitachi makes use of the Hitachi Content Platform to store the data. This platform makes use of object storage, which allows companies to further enrich the metadata on their files to make them more easily searchable, independent of applications.
Hitachi Content Intelligence can then be used to search for and set controls on files within the Hitachi Content Platform. For example, a company could locate all of its files which contain a credit card or identity number and then set controls on who can access those files, and alerts as to when those files need to be deleted.
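The kind of scan described above can be illustrated with a simple pattern search. This is a hedged sketch, not the Hitachi Content Intelligence API — the patterns are deliberately simplified (a real scanner would also validate check digits and handle more formats):

```python
import re

# Illustrative patterns only. A South African identity number is
# 13 digits; a payment card number is typically grouped in fours.
PATTERNS = {
    "sa_id_number": re.compile(r"\b\d{13}\b"),
    "card_number": re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b"),
}

def classify_document(text):
    """Return which personal-data patterns appear in a document."""
    return sorted(name for name, pat in PATTERNS.items() if pat.search(text))

doc = "Invoice for client 8001015009087, card 4111 1111 1111 1111."
print(classify_document(doc))
# → ['card_number', 'sa_id_number']
```

Once a document is classified this way, access controls and deletion alerts can be attached to it, which is exactly the workflow the platform automates at scale.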
It’s no secret that data is increasingly becoming the lifeblood of organisations – gaining greater insight into that data not only assists with regulatory compliance, but also with identifying and uncovering new revenue opportunities.
Data gives coaches new eyes in sports
Collecting and analysing data is entering a new era as it transforms both coaching and strategy across sports ranging from rugby to Formula 1, writes ARTHUR GOLDSTUCK
Coaches and managers have always been among the stars of any sport. They become household names as much as the sports heroes who populate their teams. Now, thanks to the power of data collection and analysis, they are about to raise their game to unprecedented levels.
The evolution of data for fine-tuning sports performance has already been experienced in Formula 1 racing, baseball and American football. All are known for the massive amounts of statistics they produce. Typically, however, these were jealously guarded by coaches trying to get an edge over their rivals. Thanks to the science of “big data”, that has changed dramatically.
“American baseball has the most sophisticated data science analytics of any sport in the world because baseball has this long history of stats,” said Ariel Kelman, vice president of worldwide marketing at Amazon Web Services (AWS), the cloud computing giant that is working closely with sports teams and leagues around the world. “It’s an incredibly opaque world. I’ve tried for many years to try and get the teams to talk about it, but it’s their secret sauce and some of these teams have eight, nine or ten data scientists.”
In an interview during the AWS Re:Invent conference in Las Vegas last week, Kelman said that this statistical advantage was not lost on other sports, where forward-thinking coaches fully understood the benefits. In particular, American football, through the National Football League there, was coming on board in a big way.
“The reason they were behind is they didn’t have the player tracking data until recently in the NFL. They only got the player tracking data three years ago. Now the teams are really investing in it. We did an announcement with the Seattle Seahawks earlier this week; they chose us as their machine learning, data science and cloud provider to do this kind of analysis to help figure out their game strategy.
“They are building models predicting the other teams and looking at players and also evaluating all their practices. They are setting up computer vision systems so that they can track the performance of the players during their practices and have that inform some of the game strategies. The teams then even talk about using it for player evaluation, for example trying to figure out how much should we pay this player.”
Illustrating the trend, during Re:Invent, Kelman hosted a panel discussion featuring Rob Smedley, a technical consultant to Formula 1, Cris Collinsworth, a former professional footballer in the NFL and now a renowned broadcaster, and Jason Healey, performance analytics manager at New Zealand Rugby.
Healey in particular represents the extent to which data analysis has crossed sporting codes. He has spent four years with the All Blacks, after 10 years with the New Zealand Olympic Committee, helping athletes prepare for the Olympic Games.
“The game of rugby is chaos,” he told the audience. “There’s a lot of things going on. There’s a lot of trauma and violence and it can be difficult to work out the load management of each player. So data collection is a big piece of the technical understanding of the game.
“A problem for us in rugby is the ability to recall what happened. We have to identify what’s situational and what’s systemic. The situational thing that happens, which is very unlikely to be replicated, gets a lot of attention in rugby. That’s the sensational big moment in the game that gets talked about. But it’s the systemic plays and the systemic actions of players that lie underneath the performance. That’s where the big data starts to really provide some powerful answers.
“Coaches have to move away from those sensational and situational moments. We’re trying to get them to learn what is happening at that systemic level, what is actually happening in the game. How do we adjust? How do we make our decisions? What technical and defensive strategies need to change according to the data?”
Healey said AWS was providing platforms for tracking players and analysing patterns, but the challenge was to bring people along on this technology journey.
“We’re asking our coaching staff to change the way they have traditionally worked, by realising that this data does give insights into how they make their decisions.”
Kelman agreed this was an obstacle, not just in sport, but in all sectors.
“Across all of our customers, in all industries, one of the things that’s often underestimated the most is that getting the technology working is only the first step. You have to figure out how to integrate it with the processes that we humans, who dislike change, work with. The vast majority of it is about building knowledge. There are ways to transfer that learning to performance.”
Of course, data analytics does not assure any side of victory, as the All Blacks discovered during the recent Rugby World Cup, when they were knocked out in the semi-finals, and South Africa went on to win. We asked Healey how the data-poor South Africans succeeded where the data-rich All Blacks couldn’t.
“You have to look at how analytics and insights and all these technologies are available to all the coaches these days,” he said. “The piece that often gets missed is the people piece. It’s the transformation of learning that goes into the player’s actual performance on the field. We’re providing them with a platform and the information, but the players have to make the decisions. We can’t say that this particular piece of technology played a role in winning or losing. It’s simply just a tool.”
The same challenge faces motor racing, which generates massive amounts of data through numerous sensors and cameras mounted in vehicles. Rob Smedley, who spent 25 years working in engineering roles for Formula 1 teams, quipped that his sport had a “big data” problem before the phrase was invented.
“We’ve always been very obsessive about data. Take car telemetry, where we’ve got something like 200 to 300 sensors on the car itself. And that goes into something like two to three thousand data channels. So we’re talking about around 600 gigabytes of data generated every single lap, per car.
“On top of that, we’ve also got all the timing data and GPS data. The teams are using it for performance advantage. We’re into such marginal gains now because there are no bad teams in Formula 1 anymore. Data analytics provides those marginal gains.”
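Smedley’s figures imply striking per-channel data rates. A back-of-envelope check, assuming a roughly 90-second lap (the lap time is an assumption, not from the interview):

```python
# Back-of-envelope arithmetic on the figures quoted above.
data_per_lap_gb = 600      # "around 600 gigabytes ... every single lap, per car"
channels = 2500            # midpoint of "two to three thousand data channels"
lap_seconds = 90           # assumed typical lap time

per_channel_mb = data_per_lap_gb * 1024 / channels   # MB per channel per lap
per_channel_rate = per_channel_mb / lap_seconds      # MB/s per channel

print(f"{per_channel_mb:.0f} MB per channel per lap, ~{per_channel_rate:.1f} MB/s")
# → 246 MB per channel per lap, ~2.7 MB/s
```

Even with generous rounding, each channel would be streaming megabytes per second, which is why the teams lean on cloud platforms rather than trackside hardware to analyse it.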
• Arthur Goldstuck is founder of World Wide Worx and editor-in-chief of Gadget.co.za. Follow him on Twitter and Instagram on @art2gee
IoT faces 5-year gap
In five years, the world will have more than 40 billion connected devices. Locally, IoT specialist Eseye says that South African CIOs are recognising IoT (Internet of Things) and M2M (Machine to Machine) technologies as strategic imperatives, but the journey is still in its infancy.
“As legacy systems start to reach end of life, digital shifts will become inevitable. This, coupled with an increasing demand for improved bottom line results from existing and new markets, makes IoT a more viable option over the next five years. This is particularly prevalent in manufacturing, especially where time to market and product diversification have become necessary for business survival,” says Jeremy Potgieter, Regional Director – Africa, Eseye.
He says that within this sector one thing matters – output: “Fulfilling the product to market lifecycle is what makes a manufacturer successful. Addressing this functionality and production optimisation through technology is becoming more critical as they focus on increasing output and reducing downtime. By monitoring machinery and components in the production line, any concerns that arise, which impacts both the manufacturer and consumers alike, will be more efficiently dealt with by using an IoT approach.”
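The production-line monitoring idea above can be sketched as a simple threshold check. The sensor names and limits here are hypothetical examples, not an Eseye product or API:

```python
# Hypothetical sensor thresholds for a production line -- illustrative only.
THRESHOLDS = {
    "motor_temp_c": 85.0,      # alert above this temperature
    "vibration_mm_s": 7.1,     # example vibration limit
}

def check_reading(sensor, value):
    """Return an alert string if a reading breaches its threshold."""
    limit = THRESHOLDS.get(sensor)
    if limit is not None and value > limit:
        return f"ALERT: {sensor} = {value} exceeds {limit}"
    return None

print(check_reading("motor_temp_c", 92.5))
# → ALERT: motor_temp_c = 92.5 exceeds 85.0
```

In a real deployment the readings would arrive over the connected SIMs Eseye describes, and alerts would feed a managed monitoring service rather than a print statement, but the logic of catching concerns before they cause downtime is the same.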
Potgieter says that there is also a growing strategic approach to increase the bottom line through new markets. As manufacturers seek new revenue streams, Eseye is encouraging the use of rapid IoT-enabled device product development: “By addressing the connectivity aspects required at deployment, manufacturers are immediately diversifying their portfolios. Eseye, as an enabler, assists by providing market ready SIMs, which can be embedded into IoT connected devices at OEM level, connecting them to a plethora of services (as designed for) upon entry to market, anywhere in the world.”
In addition, Potgieter says that organisations are increasingly looking towards IoT connectivity managed services to capitalise on specialist expertise and ensure that devices are proactively monitored and managed for maximum uptime, while reducing data costs.
Undoubtedly impacting IoT adoption, though, is the network infrastructure required. Potgieter says that this varies significantly and will depend on criteria such as sensor types and corresponding measurements, the overall communication protocols, data volume, response time, and analytics required: “While the majority of IoT implementations can be enabled using cloud-based IoT platform solutions, the infrastructure required still remains important. A cloud platform will simplify infrastructure design and enable easy scaling capability, while also reducing security and data analytics implementation issues.”