Following in the footsteps of New South Wales and Queensland, Tasmanian public school students heading back into the classroom will be unable to access a new AI learning tool on school systems.
ChatGPT was launched in November, and its use is already being restricted in Tasmanian public schools
Educators say it’s not time to panic but to adapt, as conversations about the tool get underway
Some predict the technology will particularly aid students who struggle with writing
ChatGPT (in full, the Chat Generative Pre-trained Transformer) is a chatbot launched by artificial intelligence research company OpenAI late last year.
Users can type in questions and receive human-like responses on a wide range of topics, such as 'what actually is AI?' or 'what should I buy my cousin for his birthday next month?', with follow-up questions encouraged.
With it comes the risk of students using the tool to cheat on assignments, but educators say there's no need to panic.
Department ‘restricting access’
In a statement provided to ABC Radio Hobart, Tasmania’s Department of Education, Children and Young People said it was “aware of the current discourse” around how online artificial intelligence and chatbot tools could potentially “be used inappropriately by students in completing assessment tasks”.
“ChatGPT requires users to be 18 years or older and able to form a binding contract with OpenAI to use the services. On these grounds, DECYP will be restricting student access to ChatGPT on its systems, which is consistent with approaches taken in other jurisdictions,” they said.
They said DECYP would continue to “review and update policies” related to technology changes as needed, with teachers “well-versed in being able to identify language consistent with a student’s other pieces of work”.
“A robust and diverse mix of assessment tasks across a school year also helps teachers identify any concerns around authenticity.”
‘You’re not going to stop it’
For Wendy Ruback, principal of Eastside Lutheran College in the state's south, the opportunities offered by this and similar AI learning tools should not be lost in the discussion.
She said, just like when computers first became a learning tool, this was simply another adjustment period as technology developed.
“It’s similar in that we have to make certain that we use the AI rather than trying to stop it – because you’re not going to stop it,” Ms Ruback said.
“So then we need to look at how AI can actually be helpful.”
She said at her school, some of those conversations were starting to take place among staff, with more to come about how tasks could be designed to make use of programs like ChatGPT, while bringing in the necessary elements of critical thinking and analysis.
“I actually think that’s a great thing, because that’s what all assignments should have. If they’re not showing that critical thinking, then that’s saying they didn’t do the assignment.”
Ms Ruback said tools like this would be a real asset, particularly for students with special needs or who find writing difficult, and that it was too simplistic to reduce the discussion to how AI might help some students cut corners in their studies.
“I actually think it’s going to be the opposite. I think we’re going to be requiring kids to show higher-order thinking and critical thinking – and we’re going to be needing to make certain they understand what they have to show,” she said.
“I certainly don’t think the kids are going to get out of it lightly.
“From our point of view, when you’re looking at kids who are dyslexic, who’ve got the ideas but often can’t write, it’ll be a real godsend for them to actually [be able] to show what they know.
“At the moment, we have things like speech to text, and we have C-Pens and everything else for them, but this, to me, takes it to another level for them because they’ll be able to give us their ideas.”
AI already around us
Margaret Bearman, professor with the Centre for Research in Assessment and Digital Learning at Deakin University, echoed Ms Ruback’s sentiments about not jumping the gun on how AI learning tools would be used.
“It’s really interesting that ChatGPT, in some ways, appears like a game-changer but in other ways is a continuation of a long series of conversations that have already been occurring,” Professor Bearman said.
“Google is actually, in essence, powered by AI, and so therefore, the impact of AI in how we teach and how we learn has already been transformed, even without us knowing,” she said.
“What we’re seeing is the reality starting to hit the classroom, the lecture theatre … some of the principles and ideas underpinning how a lot of people have been thinking about AI are becoming more obvious now we have this new and very powerful technology.”
She said when it came to the role of artificial intelligence tools in learning – be that in schools or higher education – it was important to note that it wouldn’t remove the need for critical thinking, or automatically adhere to the same standards we expect from work submitted.
“One of the big problems with AI is it doesn’t know whether what it’s done is any good or not,” Professor Bearman said.
“All it does is it generates something in response to a text, to a prompt. And how good is it is very dependent on a whole range of factors – none of which involves the AI making a judgement about whether it’s done something good or bad.”
She said that was where the role of educators came in: encouraging students to analyse, for example, what actually makes a good essay, or how we know if a statement is accurate or lacking context.
“By focusing on these ideas, I think we move away a lot quicker from content generations, which these AIs are extremely good at, and thinking about quality content, which in my view remains a very human capability.”