Are we teaching young university students to pass a reversed Turing test?
Yesterday I was teaching 1st year undergraduates at the University of Sydney, in the Faculty of Science, in a faculty-wide unit run by Philosophy of Science. We talked about GenAI, the Senate hearing, regulation and power, & of course, ChatGPT in edu. I was surprised by what I learned, so I'm sharing it here.
Many students use ChatGPT and similar as a scratchpad or notepad to get their thoughts in order.
Mostly they seemed fully aware that the outputs are not good enough to submit. They seemed more aware of the limitations than most of the public. These are 17-20 yr old digital natives. They were clear that the outputs couldn't be relied on, but felt they should be able to use them as an assist, much like a calculator.
They expressed concerns that they might lose writing skills, but they wanted to both stay good writers and become skilled at using these techs.
More concerningly, they told me how they are experiencing uni now that some academics are using "GenAI detection tools". Many of them had received zero on submitted work even though it was their own. I believed them; they looked genuinely concerned. They said they got the marks changed no problem, but this has led to a new phenomenon for some: checking their work against multiple "GAI detection tools" available online before they submit, to make sure their work sounds "human enough" to pass.
This sounds unnecessarily time-consuming, and some told me how stressed they were about this extra step. It has also rocked their confidence. They now see the task as writing in a way that a machine won't proclaim to be a machine.
Remember, these are 1st year Sci students. There are only so many ways you can simply describe photosynthesis or the fundamental rules of physics. Apparently, their answers frequently trigger these GAI detection tools. It seems to me that rote learning exercises are the problem.
Are we training these students to gamify their way through their degrees by proving to machines that they are humans in some black mirror version of the Turing test?
This isn't speculative fiction; it is happening RIGHT NOW, and the students are looking for answers. By the time these students graduate with an undergrad degree and a master's, it will be 2028. Who knows what the hell GenAI is going to look like by then. We need to prepare them, and teaching them how to pass a reverse Turing test is not the world we want to see. We need to adjust teaching styles and assessment formats ASAP.
I think it's tough for Higher Ed teachers, who are often over-worked and under-paid, and who battled through Zoom teaching during COVID only to face ChatGPT in edu.
My suggestion: the uni funds better cross-collaboration between teachers, course designers, AND students. This is so new that we MUST listen to the experiences of the students when designing our teaching response.
Can you pass the reverse Turing test?