
My son recently told me his teacher held him up to the class as an example of someone who had written a good piece himself. The teacher accused all the other students of using AI.

He also told me that he had in fact used AI, but had asked it several times to simplify the text, and he submitted the simplified version. He liked the first version best, but knew his teacher would consider it written by AI.

Guess the teachers have already lost...



There's an easy fix: all coursework must be completed in class, without tools.


Exactly this. It really is that easy. You have the full class period to write an essay on the economic causes of the Civil War, or on gender roles in Pride and Prejudice, or on the similarities and differences in morality between Stoic ideals and Christianity in the Roman Empire. Kind of like most of my '90s-era college experience.


Giving a teacher judge/jury powers over administrative punishment just creates a tyranny of the classroom.

Bart Simpson, we need you.


Further, what's to stop anyone from pumping text into an LLM and then rewriting its output to match their own style?


Nothing. Word is getting around about how to do this. I anticipate that in another couple of years it'll have diffused to everyone, except the constant crop of new younglings who have to be told about it by their older siblings and such.

"AI detection" wasn't even a solution in the short term and it won't be going forward. Take-home essays are dead, the teachers are collectively just hoping some superhero will swoop in and somehow save them. Sometimes such a thing is possible, but it isn't going to happen this time.


Yes. The ship has sailed and in fact it sailed away many, many years ago. Modernity now has to reckon with Brandolini's law at the scale of these AI systems, which, depending on the system you're inside of, can vary from "this is easy to refute" to "don't bother, assume it's all bullshit."


I wonder if doing this is actually a step closer to learning (compared with not doing anything at all). To put it in your own style, you're forced to read the output and probably understand the basic concepts of what the LLM provides.


Probably so, assuming that what it spits out is actually real and not some hallucination, but that's not at all a given. I also assume that the people most inclined to regurgitate what an LLM spits out overlap heavily with the people least likely to verify that the information is correct, check primary sources, or even think to ask for sources in the first place.




