
Higher ed must adapt to AI chatbots

Artificial intelligence offers both challenges and opportunity for the education sector, but it must act urgently, writes education leader DR GILLIAN MOONEY.

The recent release of OpenAI’s ChatGPT has taken the world by storm, raising questions about its impact and implications across various industries. One sector that urgently needs to consider questions around the opportunities and challenges surrounding ChatGPT in particular, but artificial intelligence in general, is higher education.

Universities must consider this new factor and its impact on teaching and learning, as well as on academic integrity, without delay, and implement appropriate guidelines and policies.

And while ChatGPT is South Africa’s first widescale encounter with AI, technologies are now advancing at such a rapid pace that the education sector must remain vigilant and resilient in response to new developments.

The technology has already had an impact on higher education, though so far this impact has largely taken the form of sharply divided reactions to its release, as we are still in the early days.

On the one hand, there are camps thinking about how to leverage the technology in their teaching and learning practices. On the other, there are discussions that assume students will cheat using the platform, and that ways must be found to stop this.

On the opportunity side, academics are taking the following questions into consideration at this stage:

  • How to use this technology to provide students with a strategic advantage in the current academic, and future work, contexts;
  • How to ensure that students are not disadvantaged in any way, given that a ChatGPT subscription, which provides preferential access, currently costs US$20; and
  • What the implications of this technology are for knowledge generation, given that its responses are based on the information to which the platform has access. Does this include peer-reviewed journal articles, which are typically available only via subscription? And what will be the impact of researchers having made their work more readily available in recent years?

Academics at good universities should already be considering how to embrace this new technology, and asking how it can be leveraged in their teaching and learning to provide students with a strategic advantage in the academic and future work contexts. However, not surprisingly, there are significant concerns about academic integrity and the validity of degrees which, if not addressed, could have a devastating impact on the reputation of a university and its qualifications going forward.


Currently, the most extreme version of the response to AI developments is that all assessments will have to be written by hand in a standardised and invigilated venue to ensure students are not submitting AI-generated responses.

However, such thinking reflects a wider, pre-existing problem in education.

For instance, a basic Google search for “assessment help in South Africa” returns 553 000 results in 0.44 seconds. What ChatGPT is likely to do is collate this information for the student. In recent years, there has been an explosion of so-called cheating websites, which operate on various models: from paying an “expert tutor” to write your specific assessment response, to sites that mimic social media platforms, where students upload their assessments, other students download them, and each user accumulates a matrix of ratings and likes.

So how can institutions of learning now ensure that the degree certificate that a graduate holds reflects a valid set of knowledge and skills that a graduate can demonstrate, and not the knowledge and skills of a paid-for expert, fellow student, or a piece of technology? There are certain levers in this regard.

Firstly, there is the “stick” approach. Students who cheat must fail the assessment and face disciplinary consequences. Students must be prevented from accessing such platforms via the campus WiFi, and lawyers’ letters can be sent to such websites requiring them to remove the institution’s content. However, this alone is not enough, and it is not aligned with the imperative of developing students.

Of greater importance is the “carrot” approach.

The nub of the issue is the ethical practices of a socially responsible citizen in the worlds of academia and work. Here we focus on the ethical identity of the student, and seek to see this ethical behaviour reflected in a range of everyday activities. To put it simply, we help students understand that cheating is stealing, and that both behaviours are ethically unacceptable.

We should be encouraging our students to want to be socially responsible, to retain the integrity of their hard-earned qualification in the workplace, and to make a contribution to the economy and to society. While all of this may sound idealistic, and a mammoth challenge in a society in which accusations of corruption are widespread, we must own this idealism and take our responsibility as educators of future generations seriously.

Ultimately, the response to these rapid advances in AI technology will require a two-pronged approach that includes elements of both the carrot and the stick.

* Dr Gillian Mooney is Dean of academic development and support at the Independent Institute of Education (IIE), and a member of the Accreditation Committee of the British Accreditation Council (BAC).
