Transforming the future of university education with artificial intelligence
When ChatGPT was released in November 2022, everything changed; some things for the better, and some for the worse. What does the future hold for this incredible technology?
Unless you’ve been living under a rock for the past few months, you’ll have heard all about ChatGPT, an incredible piece of artificial intelligence (AI) software.
Type a question into its interface and you get a response. Engage with it, ask it questions, prompt it to synthesise relatively complex topics, and ChatGPT will do its best to provide you with an answer.
It’s not just another search engine like Google; it’s a new disruptor in the IT space. Software like this won’t just do to established industries what Uber did to taxis; it will go far, far beyond.
Applications of ChatGPT and artificial intelligence
Among the numerous and multifaceted applications of ChatGPT, and indeed any piece of AI software, is the impact it’ll have on the world around us. For readers of PandemicTeaching.com, that mostly means the emerging role it will play, and the challenges it will pose, in the tertiary education sector.
ChatGPT has the power to revolutionise the way students learn at universities, providing personalised, real-time information, instant feedback, and affordable access to resources (it’s free at the time of writing, but it wouldn’t surprise me if it becomes a pay-to-use resource in the future).
As with any new technology, though, there are potential drawbacks, and it’s important to consider the consequences for academic integrity and ethics.
One of the biggest concerns within the teaching profession is that some students may rely too heavily on ChatGPT in their scholarly pursuits, asking it to write assignments and syntheses, or to answer exam questions.
While AI models like ChatGPT can provide valuable information and assistance, it’s crucial that students learn to think critically and independently, rather than simply regurgitating information from a machine.
So how do universities mitigate this? I suspect it’ll be a substantial challenge, but one that starts with ensuring that students understand the limitations of AI and the importance of academic integrity.
It’s not perfect (yet!)
Artificial intelligence software like ChatGPT is trained on massive datasets, which may include biases and inaccuracies. It’s crucial that universities take steps to ensure that students receive balanced and accurate information, and that they understand the limitations and potential biases of AI.
I spent some time recently ‘talking’ to ChatGPT, asking it to write a few paragraphs about the drumming on The Beatles’ eponymous 1968 album, ‘The Beatles’ (colloquially known as the ‘White Album’). I don’t teach anything about music, but I’m a massive fan of the band and know a little about their history.
I asked ChatGPT who drummed on the opening songs of the album. It answered ‘Ringo Starr’, which sounds correct, but it was actually Paul McCartney (Ringo had temporarily left the band, rejoining for later tracks on the album).
I then entered into a dialogue with the software and told it that it was wrong; it agreed that it had erred and changed its answer to Jimmie Nicol. Jimmie did sub for Ringo, but only for a handful of shows on the band’s 1964 world tour when Ringo was unwell; he had nothing to do with the 1968 album. I then challenged ChatGPT again, and it reverted its answer to Ringo, then Jimmie, then Ringo…
And thus ChatGPT and I entered a loop where we just kept repeating ourselves.
Even though it was consistently wrong, I was impressed that it kept changing its answer and tried to provide more justification each time to support its new response. It was almost as if it was trying to learn from its previous errors.
The thing is, though, unless you happen to know titbits of information like these, you, or perhaps your students, would probably rely on whatever ChatGPT spits out. That would be a mistake.
I then tested ChatGPT with some more rigorous academic questions – ones drawn directly from past exam papers that I’d written – and it struggled to provide anything accurate. I think part of the issue is that many of my exams are written to truly test a student’s critical thinking and ability to synthesise information, rather than just recall facts.
These are still the early days of AI, though. While students might try to use ChatGPT to ‘beat the system’, it’s going to take a while for AI to get there. Still, with the speed at which this technology is evolving, it might not be that far away.
Some final thoughts
The increasing use of AI in education raises questions about the role of human teachers and tutors. While ChatGPT can provide instant information and assistance, it cannot replace the human connection and personalised attention that students receive from their teachers and tutors.
Indeed, that’s one of the most important things about tertiary education: students have the opportunity to learn not only from subject matter experts, but from the leaders in their fields and the knowledge generators that drive society forward.
To me, that is something pretty special, and something unlikely to ever be taken over by AI.
By ensuring that students understand the limitations of AI, promoting academic integrity, combating biases, and valuing the role of human teachers and tutors, universities are in an outstanding position to harness the power of AI software like ChatGPT to enhance, not undermine, the quality of education.