AI Software May Incorrectly Identify Student Plagiarism
Image caption: A modular steel workbench with an ESD worksurface to protect sensitive microelectronics.
In this Formaspace executive report, we investigate the choices facing academic institutions seeking to navigate the new opportunities and pitfalls posed by AI.
To address the issues of using AI in the classroom, we should first take a moment to put ourselves in the shoes of the students – many of whom are bombarded with seductive, persuasive marketing messages from AI companies advertising on YouTube and other social media sites. These companies promise to help students improve their writing, come up with topics, and even create documents from scratch.
For today’s digital native students – many of whom have been encouraged to use AI-powered computer learning software as part of their learning experience (in some cases while using computers issued by the school system!) – the siren song of AI-powered tools that offer a ‘shortcut’ to better grades may prove too seductive to resist.
Educators need to quickly come up to speed on the ethical implications of using AI technologies in the classroom and deliver a unified message to students about what is expected and what is not. (In many states, legislators are already stepping into the void, enacting new laws regulating AI in the classroom.)
This may be easier said than done, however, given the rapid pace at which AI features are being added to commonly used software products, from Microsoft Office and web browsers to Google Apps and writing plug-ins such as Grammarly.
Educators also need to think about any ‘double standard’ messages they might be sending to their students. For example, classroom teachers may be actively encouraging students to use AI-based tutoring programs that help them learn better, or adopting AI-based tools themselves to speed up the time-consuming process of reading and grading student work.
In the minds of students, these AI use cases may create a ‘what’s good for the goose is good for the gander’ situation that justifies using AI in their own work.
What Can Academics Do To Control AI Plagiarism? The First Reaction Is Often To “Fight Fire With Fire”
The initial response of many teachers confronted with students presenting AI-created work as their own is to fight fire with fire – checking whether a student’s work is authentic by running it through one of the new generation of plagiarism detection tools (such as TurnItIn, Small SEO Tools, Paraphraser, or GPTZero) that also claim to identify AI-generated writing.
Unfortunately, these detection algorithms are not foolproof and, in many cases, have created false positives, e.g. accusing students of AI-based plagiarism when they were, in fact, innocent.
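To see why false positives are hard to avoid, consider how a simple statistical detector might work. The Python sketch below scores text by its perplexity under a language model and flags anything that looks ‘too predictable’ as machine-generated; the GPT-2 model, the Hugging Face transformers library, and the cutoff value are illustrative assumptions, not the actual method used by TurnItIn, GPTZero, or any other vendor. The weakness is built in: concise, formulaic human writing can easily fall under the cutoff and be flagged as AI-generated.

# Illustrative sketch only: a naive perplexity-threshold "AI text" check.
# The model and the cutoff are assumptions for demonstration, not any
# detection vendor's actual method.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def perplexity(text: str) -> float:
    # Measure how "predictable" the text is to the language model
    # (lower perplexity = more predictable).
    enc = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        out = model(enc.input_ids, labels=enc.input_ids)
    return torch.exp(out.loss).item()

PERPLEXITY_CUTOFF = 40.0  # arbitrary threshold chosen for illustration

def looks_ai_generated(text: str) -> bool:
    # Plain, formulaic human prose can also score below the cutoff,
    # which is exactly how a false positive happens.
    return perplexity(text) < PERPLEXITY_CUTOFF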
This is an untenable situation that has led many academic institutions (including the University of Alabama, UC Berkeley, Missouri Northwestern, and SMU) to pause the use of TurnItIn AI-detection software (and others) pending the resolution of these false-positive issues, among other ethical concerns.
Will Widespread Adoption Of AI Detection Software Result In Falsely Accused Students Having To “Prove” They Didn’t Use AI?
There is an escalating technology “war” brewing between AI detection systems trying to identify works created by AI and the newest editions of generative AI software, which are becoming better at creating more natural, human-like output that’s also harder to detect.
This never-ending battle may result in a perverse reality – one where the burden falls on students to prove their work is original and created without relying on input from AI software.
There are instances where this is already happening.
Some educators are encouraging students to write their assignments in software such as Google Docs, which keeps a running record of the document as it is written.
Because Google Docs records each edit during the writing process, it is possible to review the detailed history of a student’s writing, including flagging large sections of text that were pasted into the document (and which could have been unethically ‘lifted’ from another source, such as a ChatGPT-style AI assistant).
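As a rough illustration of how such a review could be automated, the Python sketch below compares successive snapshots of a document’s revision history and flags any revision that inserts a large block of text in a single step. It assumes the revision texts are already available as plain strings in chronological order (for example, exported from the document’s version history); the 500-character cutoff and the function names are hypothetical.

# Hedged sketch: flag revisions that insert a large block of text at once,
# which may indicate pasted content. Assumes revision snapshots are already
# available as plain-text strings in chronological order.
import difflib

PASTE_THRESHOLD = 500  # characters added in one revision (illustrative cutoff)

def flag_large_insertions(snapshots: list[str]) -> list[tuple[int, int]]:
    """Return (revision_index, characters_inserted) for suspicious revisions."""
    flagged = []
    for i in range(1, len(snapshots)):
        matcher = difflib.SequenceMatcher(None, snapshots[i - 1], snapshots[i])
        inserted = sum(
            j2 - j1
            for tag, _a1, _a2, j1, j2 in matcher.get_opcodes()
            if tag in ("insert", "replace")
        )
        if inserted >= PASTE_THRESHOLD:
            flagged.append((i, inserted))
    return flagged

# Example: the third snapshot jumps from a short draft to a much longer text,
# so revision index 2 is flagged.
history = ["Intro.", "Intro. First paragraph.", "Intro. First paragraph. " + "x" * 600]
print(flag_large_insertions(history))

Even so, a flagged revision would still need a human reviewer – a large paste could just as easily be a student moving in their own text from another document.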
The Luddite Response: Ban Computers And Smartphones From The Classroom Entirely
Is banning technology from the classroom the solution?
This might be an attractive “kill two birds with one stone” option for some educators, many of whom are already fed up with having to compete for their students’ attention with handheld screens during class.
Read more...
Julia Solodovnikova
Formaspace
+1 800-251-1505
email us here