“But there is a difference between recognising AI use and proving its use. So I tried an experiment. … I received 122 paper submissions. Of those, the Trojan horse easily identified 33 AI-generated papers. I sent these stats to all the students and gave them the opportunity to admit to using AI before they were locked into failing the class. Another 14 outed themselves. In other words, nearly 39% of the submissions were at least partially written by AI.”
Article archived: https://web.archive.org/web/20251125225915/https://www.huffingtonpost.co.uk/entry/set-trap-to-catch-students-cheating-ai_uk_691f20d1e4b00ed8a94f4c01


He runs right up to the actual problem but side-steps it:
“A college degree is not just about a job afterwards – you have to be able to think, solve problems and apply those solutions, regardless of the field.”
Problem: With generative AI, this is the LAST thing employers want. If you’re out there working right now, particularly in tech? It’s all about “leveraging” AI to “be more efficient.” They don’t want you thinking and solving problems on your own; they want you regurgitating solutions they presume are pre-vetted by AI.
I’ve had these discussions at my own job… “But, but, Generative AI makes it so easy to make and place Facebook ads!” - Agreed, and that’s not my job. “But, but, you can analyze data and generate reports!” - Yes, also not my job.
But the push by business to use it is HUGE, and in that environment, some student using it to cheat in a history class will probably, in the end, get more out of that experience in the “real” world than out of taking the history class in the first place.
Then again, my plan is to John Henry the shit out of this until I’m dead.
For those who never took a folklore class:
https://www.americanfolklore.net/john-henry-the-steel-driving-man/
What the business wants and what they say they want are two completely different things. They still want people to be able to think and solve their own problems, even if they end up assigning the praise for your hard work to the AI you only pretended to use.
This is a new pattern that I am seeing. The C-suite is required to launch AI initiatives because of market expectations, while tech staff are expected to launder their work through Claude.
I think that most of all they want you to feel like AI could replace you if you misbehave.