Knowledge grazing must change way we educate

The information revolution has shifted learning away from ordered hierarchies toward a much messier, self-directed paradigm. But has this disrupted education yet? ANGELA SCHAERER, Teacher Engagement Lead for Microsoft SA, takes a look.

Look at the world we live in – the ways in which we access information have changed beyond all measure. If I think back to those awful events of September 11, 2001, we watched the unfolding drama on news channels, morbidly glued to our TV sets as the tragedy played out across our screens. Fast forward a few short years to 2009 and we watched Captain Sullenberger land his Airbus A320 on the Hudson River on our screens too, but those screens had dramatically decreased in size.

These events were brought to the world first, not by 24-hour news channels, but by Twitter and YouTube, and we watched the first reports not on our TV screens but on our smart mobile devices. Of course the TV stations quickly caught up, but the story broke across social media first. And that’s the way it is now. We are used to hearing or reading about the big news stories of the day on Twitter rather than in the morning papers or on TV. All the media businesses, be they television or newspaper, have Twitter accounts. The Chinese government first learned about the 2008 Sichuan earthquake from Twitter rather than from its own news agency. And remember the Arab Spring of 2011, when news of revolution racing across a continent broke via images from mobile phones and live conversations on social media?

‘Digital information can be altered, mashed, changed or trashed in minutes’

Our world is evolving. New ideas spread the whole way around the world in less than 24 hours. That’s the power of the YouTube video clip! It’s even quicker on Twitter.

Social media is global and ubiquitous. And today, in the middle of the second decade of this 21st century information age, we are now all reporters, sharing, creating, changing and critiquing the news as it happens.

This is evolution, but not as we’ve previously understood the word. Now the term is used to describe changes that occur much more rapidly than Darwin could ever have dreamed about. Digital information can be altered, mashed, changed or even trashed in minutes, in ways previously impossible. Digital textbooks will rarely be out of date in the way their printed versions are. And it seems our brains might be changing as well. Brain plasticity is a well-documented phenomenon.

Some people have written about how our brains may have come to work differently over the past few years, as information arrives at our consciousness in short, sharp, simultaneous bursts. And changes in media have come hand in hand with changes in the ways we consume it. The biggest box-office successes nowadays all rely on the “flash, bang, wallop” effect. It seems we need instant gratification: fast-paced action full of dazzling special effects and noise appears to trump the great narratives and plot lines of the past. Neural pathways do change. But is this change not to be embraced? After all, it’s how the brains of most of our learners work. In fact, their brains probably know no other way of working. The world has changed, and there is no going back.

‘It’s not what you know but what you do with what you know’

All educators need to do is set the parameters, then work individually with students – helping, providing advice and, yes, even teaching them that it’s not just OK to recycle and mash up knowledge; the real goal is to reboot it, make it work, and truly own it. By this I mean evaluating what is discovered and commenting on how relevant it might be to the project, benchmarking it against the set parameters.

The days of old-style factual regurgitation are long gone, left behind by the post-industrial information age. We should be in the business of helping learners to become consummate knowledge rebooters and problem-seekers. It’s not what you know, but how you use it and how you bring it to bear on global challenges. Bloom’s hierarchical, level-upon-level paradigm of learning is well and truly disrupted by this knowledge-grazing paradigm.

Educational institutions and governments all around the world are latching on to this knowledge grazing and are making their learning resources freely available online at an incredibly rapid rate. Some of these courses, known as Massive Open Online Courses or MOOCs, attract thousands of eager learners each, and many thousands more graze on these fantastic learning artefacts, using, recycling, mashing and rebooting them.

And so this self-directed learning leads to increased confidence to mess about with what we discover. John Seely Brown calls it “tinkering”. He believes that this tinkering brings thought and action together in a very magical way. It’s what we do when things won’t work and we get over the fear of getting it wrong. If we get in there and tinker – to try and sort it out – we generally manage to get things going. And yes, this is how our kids play computer games: failure is just one step on the way to really powerful learning.

This “getting things going” strikes right to the heart of what learning really means. We learn when we engage with whatever we discover. It’s our level of engagement which leads to depth of retention and, therefore, true learning. The world-wide education establishment is waking up to this new paradigm. It can’t come a moment too soon.

IoT at starting gate

South Africa is already past the Internet of Things (IoT) hype cycle and well into the mainstream, writes MARK WALKER, associate vice president of Sub-Saharan Africa at International Data Corporation (IDC).

Projects and pilots are already becoming a commercial reality, tying neatly into IDC’s 2017 prediction that 2018 would be the year the local market took IoT mainstream. Over the next 12-18 months, IoT implementations are anticipated to continue rising in both scope and popularity. Already 23% of these are in full deployment, with 39% in the pilot phase. The value of IoT has been systematically proven, and yet its reputation remains tenuous – more than 5% of companies are reluctant to put their money where the trend is – thanks to the shifting sands of IoT perception and success rates.

There are several reasons why IoT implementations fail. The biggest is that organisations don’t know where to start. They know that IoT is something they can harness today and that it can be used to shift outdated modalities and operations. They are aware of the benefits and the case studies. What they don’t know is how to apply this knowledge to their own journey so that their IoT story isn’t one of overbearing complexity and rising costs.

Another stumbling block is perception. Yes, there is the futuristic potential of the talking fridge and the intelligent desk, but this is not where the real value lies. Organisations are overlooking the challenges that can be solved by realistic IoT: the banal and boring solutions that leverage systems to deliver on business priorities. IoT’s potential sits within its ability to get the best out of assets and production efficiencies, solving problems in automation, security, and the environment.

In addition to this, there is a lack of clarity around return on investment, uncertainty around the benefits, a lack of executive leadership, and concerns around security and the complexities of regulation. Because IoT is an emerging technology, there remains limited awareness of the true extent of its value proposition, and yet 66% of organisations are confident that this value exists.

This percentage poses both a problem and an opportunity. On one hand, it showcases the local shift in thinking towards IoT as a technology worth investing in. On the other, many companies are seeing the competition invest and are leaping blindly in the wrong direction. Stop. IoT is not the same for every business.

It is essential that every company makes its own case for IoT based on its needs and outcomes. Does agriculture have the same challenges as mining? Does one mining company have the same challenges as another? The answer is no. Organisations that want their IoT investment to succeed must reject the idea that they can pick up where another has left off. IoT must be relevant to the business outcome that it needs to achieve. While some use cases may apply to most industries based on specific circumstances, there are different realities and priorities that will demand a different approach and starting point.

Ask – what is the business problem right now and how can technology be leveraged to resolve it?

In the agriculture space, there is a need to improve crop yields and livestock management, improve farm productivity, and implement environmental monitoring. In construction and mining, safety and emergency response are priorities alongside workforce and production management. Education shifts the lens towards improving the delivery and quality of education, access to advanced learning methods, and reducing the costs of learning. Smart cities want to improve traffic and deliver public services efficiently, while healthcare is focusing on wellness, reducing hospital admissions, and the security of assets and inventory.

The technology and solutions selected must speak to these specific challenges.

If there are no insights used to create an IoT solution, it’s the equivalent of having the fastest Ferrari on Rivonia Road in peak traffic. It makes a fantastic noise, but it isn’t going to move any faster than the broken-down sedan in the next lane. Everyone will be impressed with the Ferrari, but the amount of power and the size of the investment mean nothing. It’s in the wrong place.

What differentiates the IoT successes is how a company leverages data to deliver meaningful, value-added predictions and actions for personalised efficiencies, convenience, and improved industry processes. To move forward, organisations need to focus on the business outcomes and not just the technology. They need to localise and adapt by applying context to the problem being solved, and to explore innovation through partnerships and experimentation.

ERP underpins food tracking

The food traceability market is expected to reach almost $20 billion by 2022 as increased consumer awareness, strict governance requirements, and advances in technology result in growing standardisation of the segment, says STUART SCANLON, managing director of epic ERP.

As with any data-driven environment, one of the biggest enablers of this is an integrated enterprise resource planning (ERP) solution.

As the name suggests, traceability is the ability to track something through all stages of production, processing, and distribution. When it comes to the food industry, traceability must also enable stakeholders to identify the source of all food inputs, which can include anything from raw materials and additives to ingredients and packaging.

Considering the wealth of data that all these facets generate, it is hardly surprising that systems and processes need to be put in place to manage, analyse, and provide actionable insights. With traceability enabling corrective measures to be taken (think product recalls), having an efficient system is often the difference between life and death when it comes to public health risks.
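To make this concrete, here is a minimal sketch, in Python with hypothetical names, of the lot-level linkage that makes recalls workable: every batch records which input lots went into it, so a contaminated input can be traced forward to every affected product, and a product traced back to its sources. A production ERP system persists these links and layers suppliers, timestamps, quantities and compliance data on top; this is only an illustration of the underlying structure.

```python
# A minimal, illustrative traceability model (hypothetical names, in-memory
# only). Real ERP systems persist these links and add far more detail.
from dataclasses import dataclass, field

@dataclass
class Lot:
    lot_id: str
    description: str
    inputs: list["Lot"] = field(default_factory=list)   # upstream lots
    outputs: list["Lot"] = field(default_factory=list)  # downstream lots

def link(source: Lot, product: Lot) -> None:
    """Record that `source` was used to produce `product`."""
    product.inputs.append(source)
    source.outputs.append(product)

def trace_back(lot: Lot) -> set[str]:
    """'Cradle' direction: every upstream lot that went into `lot`."""
    found: set[str] = set()
    stack = list(lot.inputs)
    while stack:
        current = stack.pop()
        if current.lot_id not in found:
            found.add(current.lot_id)
            stack.extend(current.inputs)
    return found

def trace_forward(lot: Lot) -> set[str]:
    """'Grave' direction: every downstream lot affected by `lot`,
    i.e. the products to pull in a recall."""
    found: set[str] = set()
    stack = list(lot.outputs)
    while stack:
        current = stack.pop()
        if current.lot_id not in found:
            found.add(current.lot_id)
            stack.extend(current.outputs)
    return found

# Usage: a contaminated ingredient lot is traced to every product containing it.
spice = Lot("RM-042", "spice blend, supplier X")
batch = Lot("WIP-007", "polony batch")
pack = Lot("FG-310", "retail pack")
link(spice, batch)
link(batch, pack)
print(trace_forward(spice))  # {'WIP-007', 'FG-310'}
```

The same two traversals answer both regulatory questions in the article: where did this product's inputs come from, and where did this input end up.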

Expansive solutions

Sceptics argue that traceability done correctly simply requires an extensive data warehouse, but the reality is quite different. Yes, there are standard data records to be managed, but the real value lies in how all these components are tied together.

ERP provides the digital glue to enable this. With each stakeholder audience requiring different aspects of traceability (and compliance), it is essential for the producer, the distributor, and every other organisation in the supply chain to manage this effectively in a standardised manner.

With so many different companies involved in the food cycle, many using their own proprietary systems, just consider the complexity of trying to manage traceability. Organisations must contend not only with local challenges but with global ones as well, since the import and export of food are big business drivers.

So, even though traceability is vital to keep track of everything in this complex cycle, it is also imperative to monitor the ingredients and the factories where items are produced. Expansive solutions that track the entire process from ‘cradle to grave’ are therefore essential. Not only is this vital from a safety perspective, but from cost and reputational management perspectives as well. Just think of the recent listeriosis outbreak in South Africa and the impact it has had on all parties in that supply chain.

Efficiency improvements

Thanks to increasing digital transformation efforts by companies in the food industry, traceability is becoming a more effective process. It is no longer a case of using on-premises solutions that can be compromised, but of using hosted ones that provide more effective fail-safes.

In a market segment that requires strict compliance and regulatory requirements to be met, cloud-based solutions can provide everyone in the supply chain with a more secure (and tamper-resistant) solution than many of the legacy approaches of old.

This is not to say ERP requires choosing one or the other. Instead, there needs to be a transition between the two scenarios that empowers those in the food supply chain to maximise the insights (and benefits) derived from traceability.

Now, more than ever, traceability is a business priority. Having the correct foundation through effective ERP is essential if a business is to manage its growth and meet legislative requirements into the future.
