Is Using AI for Academic Writing Cheating?

Does AI really mean the death of the college essay? Here’s how AI writing can be detected.

From films written and directed by ChatGPT to Kindle authors producing books with artificial intelligence (AI), coverage of AI writing tools is everywhere these days. While writing with an AI may seem harmless, and may even improve efficiency, the use of these tools in academic writing has generated serious concern. Teachers worry that using AI writing tools amounts to plagiarism or cheating.

If you are an avid reader of our content, you will know that AI tools are trained on the work of humans: a coding AI on the work of other coders, an AI artist on the work of other artists, and so on. But just because an AI writer is trained on the work of other writers, does that mean its content isn’t original? Let’s discuss the concerns surrounding AI plagiarism and the ways in which AI writing can be distinguished from human work.

Why are AI writing tools a problem for academia?

To understand why AI tools are being called the death of academic writing, we first have to understand what plagiarism is and whether AI-generated text falls under that definition.

What counts as plagiarism? 

According to the University of Oxford, plagiarism is “presenting someone else’s work or ideas as your own, with or without their consent, by incorporating it into your work without full acknowledgment.” So, if a student takes content from three different sources and copies it directly into their essay, that counts as plagiarism. Teachers use plagiarism prevention tools, like Turnitin, to check whether any copying has taken place. These tools compare a student’s essay against preexisting works and look for similarities between the two.
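The core idea behind such similarity checking can be illustrated with a toy sketch. This is an assumed, simplified mechanism for illustration only, not Turnitin’s actual algorithm: break the submission and a candidate source into overlapping word triples (“shingles”), then report what share of the submission’s triples also appear in the source.

```python
def shingles(text, n=3):
    """Return the set of overlapping word n-grams in `text`."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(submission, source):
    """Share of the submission's word triples that also occur in the source.
    0.0 = no overlap; values near 1.0 suggest heavy copying."""
    a, b = shingles(submission), shingles(source)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a)

source = "the industrial revolution transformed european society in profound ways"
copied = "historians agree the industrial revolution transformed european society deeply"
print(similarity(copied, source))  # high overlap despite added framing words
```

Real systems index millions of documents and use far more robust fingerprinting, but the principle is the same: copied passages leave matching word sequences behind, while genuinely original text does not.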

If we judge the work of an AI against Oxford’s definition, it wouldn’t fall under the category of plagiarism. The AI is generating new text rather than copying an existing source, so there is nothing for tools like Turnitin to match it against.

Okay, so if it’s not plagiarism, then what’s the issue?

The biggest problem with using AI tools in academic writing is that it is harmful to the learning process. Academic writing is meant to help you understand concepts by studying the work of others, analyzing it and coming to your own conclusions. All of this isn’t possible if an AI is doing all the legwork for you. 

Moreover, while the content created by AI isn’t plagiarized, the same can’t be said of students who claim this content as their own without crediting the AI tool they used. If the AI tool isn’t listed as a co-author and given “full acknowledgment” (as per Oxford’s definition), that would be considered academic dishonesty at best and plagiarism at worst.

Telling the AI writer and the human apart 

The good news is that, even though AI can be used to skimp on the effort a student would otherwise put into academic writing, the shortcut isn’t foolproof.

AI tends to make mistakes 

The first issue with AI writing is that it is largely reliant on information from the web, which can be inaccurate. So, if a student were to just take an essay written by an AI and send it to their professors as is, chances are that it would be riddled with mistakes and inaccuracies. 

There is also a real risk that the essay would carry racist or sexist undertones, a known problem with AI tools trained on unfiltered web data. Such content would make it easy for a teacher to suspect that an AI tool was used.

AI writing is predictable

According to the Vice President at Turnitin, Andrew Wang, AI writing lacks the human touch. AI tends to rely on a limited set of common, predictable words, whereas each student has their own style and vocabulary. These statistical fingerprints make it possible for detection tools to tell what has been written by an AI and what has been written by a human.

Recently, a Princeton student, Edward Tian, created a tool called GPTZero to check for AI-written text. GPTZero scores a passage on properties such as perplexity, a measure of how unpredictable the text is to a language model. If the tool is perplexed by a text’s writing style, the text is likely human-written; if the text is uniformly predictable, it is most likely AI-generated.
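The perplexity idea can be demonstrated with a minimal sketch. This is a toy unigram model for illustration, not GPTZero’s actual method: build word frequencies from a reference corpus, then score how “surprising” a new text is under those frequencies. Predictable text scores low; unusual text scores high.

```python
import math
from collections import Counter

def perplexity(text, corpus):
    """Perplexity of `text` under a smoothed unigram model built from `corpus`.
    Lower values mean the text is more predictable (more 'machine-like'
    under this toy model); higher values mean it is more surprising."""
    counts = Counter(corpus.lower().split())
    total = sum(counts.values())
    vocab = len(counts) + 1  # +1 slot for unseen words
    log_prob = 0.0
    words = text.lower().split()
    for w in words:
        # Laplace smoothing: unseen words get a small nonzero probability
        p = (counts[w] + 1) / (total + vocab)
        log_prob += math.log(p)
    return math.exp(-log_prob / len(words))

corpus = "the cat sat on the mat the dog sat on the rug"
print(perplexity("the cat sat on the mat", corpus))       # low: predictable
print(perplexity("quantum zebras warble sideways", corpus))  # high: surprising
```

Real detectors use large neural language models rather than word counts, but the intuition carries over: AI-generated prose tends to sit in the model’s comfort zone, while human prose takes more unexpected turns.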

While detection tools can distinguish between humans and AI for now, that might not hold if AI continues to develop at its current rate. To address the issue of AI plagiarism, OpenAI is currently working on a way to watermark the text generated by ChatGPT.

Besides technical solutions, some experts suggest that educators worried about students cheating with AI can simply change the assessment. Replacing written submissions with group presentations or oral reports narrows the scope for cheating and ensures the student puts in the work.

Overall, while AI can make things easier for students, it certainly isn’t a death knell for academics. There are ways to overcome the challenges posed by AI writing tools, and open discussion about the use of AI helps bring these to the forefront.

Header image courtesy of Freepik

