Fear of new technology in schools is exaggerated
We shouldn’t be too worried that conversational agents such as ChatGPT might be making cheats of our pupils. Schools should be empowering them to try out new technologies.
As a researcher in learning technologies in schools, and myself a former teacher at an upper secondary school, I have been following the debate surrounding the use of artificial intelligence and conversational agents such as ChatGPT with great interest. In my opinion, the anxieties being voiced on this topic are exaggerated.
Many people in the schools and education sector have voiced concerns about the impact of pupils being allowed to use conversational agents such as ChatGPT to help them write their essays. But pupils have never lacked ways to cheat.
Many opportunities to cheat
The aim in subjects such as Norwegian, English and social studies is that pupils should learn to compose independent texts. They must demonstrate their ability to express themselves well and to reflect on the issues at hand.
So, of course it is cheating if pupils let ChatGPT write their essays for them. But it is also cheating to submit texts that have been written by a family member or purchased online. There is in fact a vigorous market offering completed essays, and texts can also be ordered from others who have the time and skills to write them.
In terms of opportunities for cheating, there is nothing new about ChatGPT.
Trying out technology in a safe setting
It is difficult to control the aids that pupils may have access to outside the classroom – something that teachers always have to take into account when they assign exercises that have to be completed at home. In the classroom, online tools can be blocked if the teacher so wishes.
In my opinion, the fear that conversational agents will turn schools into breeding grounds for cheats is exaggerated. It is far better that pupils are empowered to learn about and try out new technologies in the safe setting of their schools, rather than trusting entirely in the experiences they obtain outside the classroom.
Moreover, the main part of the Norwegian national curriculum states that while “technological development can contribute towards problem solving, it can also generate new problems. Knowledge about technology entails an understanding of the dilemmas that can arise when using technologies and how these can be managed”.
We can in fact contribute to effective learning by assigning exercises in which pupils try out a conversational agent, followed by class discussions about the ethics and the dilemmas involved. This is entirely in keeping with the emphasis the 2020 National Curriculum (Kunnskapsløftet) places on pupils’ ability to understand the opportunities and challenges linked to digital technology.
The Norwegian Directorate for Education and Training has designed a poster about algorithmic thinking on which it states that: “Algorithmic thinking involves evaluating the steps necessary to solve a problem and applying technological competency to enable a computer to solve all or parts of the problem. In this also lies an understanding of the distinction between the types of problems/exercises that can be solved by the use of technology and those that require human intervention”.
The use of conversational agents is a way of enabling pupils to combine theory and practice when it comes to using algorithms.
However, before schools let these robots loose in the classroom, they have to be aware of the personal data that the robots may be gathering. I recommend that personal privacy terms and conditions should be discussed with pupils. They should be made fully aware of the digital tracks they leave behind when they interact with ChatGPT and other apps.
Some apps may be free to use now but can be paid for later in other ways once personal data has been shared.
A need for national guidelines?
In the debate surrounding ChatGPT, a number of teachers have been loud in their demands for national regulations governing its use. This is all rather strange, coming from a profession that is seldom enthusiastic about regulation from on high.
I agree that national guidelines are needed for centrally administered examinations so that exam conditions are as equal as possible for all pupils. But this need only apply in subjects in which pupils are being assessed on their ability to express themselves in correct and precise language when analysing and interpreting specific texts.
In such cases, it will be appropriate both to block online resources such as ChatGPT that are able to complete exercises for pupils and, at the same time, to send a clear message that the use of such tools will be regarded as cheating. In some subjects, open access to online resources will probably not play a major role. The authorities will have to rule on this in consultation with the teaching professions – something which in some subjects has already been accomplished.
Micromanagement is not the responsibility of the state
However, the micromanagement of what happens in the classroom and during homework exercises should not be the responsibility of the authorities. The municipalities or the schools themselves should be working to establish appropriate practices.
Anyway, media interest in ChatGPT is already in decline. No doubt there will soon be another exciting app that EVERYONE will simply have to try out. The best approach is to continue working on pupils’ attitudes, rather than worrying about permanently blocking every potential ‘threat’ from the internet.
As long as we take precautions, the use of aids such as ChatGPT can be both exciting and instructive. And school is exactly the right place for pupils to experience new technologies and learn what they can from them.
And just in case you’re wondering – this article was not written with the help of ChatGPT.
This feature article was first published in the newspaper VG on 17 February 2023 and is reproduced here with the permission of the paper.