On Nov. 30, 2022, a new and more advanced form of artificial intelligence entered the educational sphere. ChatGPT, software capable of endless tasks such as writing emails, poetry, interview questions and more, was created by the startup OpenAI.

Recently, however, the technology has become better known for troubling reasons, as the AI has been used to produce essays and other reports submitted under students’ names. Since it is offered as a free tool to anyone with access to a computer or phone, some in academia are taking advantage of its writing capabilities.

Multiple publications, including the New York Post and Indiatimes, have released articles reporting that a growing number of professors have learned to recognize the system’s writing patterns and can tell when a student has submitted AI-created work. My question is: why?

I have never quite understood why students resort to cheating on schoolwork when the cost so severely outweighs the reward. Of course, you are striving for that “A” to boost your grade point average and look favorable to a future employer. But how desirable will you look to your dream company when you are expelled? I’d personally rather take a zero on an assignment because I failed to study properly than get a zero for trying to pass off someone else’s work as my own.

ChatGPT has the potential to aid students who genuinely use the tool with good intentions. Maybe someone is struggling to structure a paper or find talking points. Those are valid concerns for many students who aren’t proficient writers.

But when you simply enter your paper topic, let the system run and submit an essay written entirely by a program, I think it’s ridiculous. I often question why people pay thousands of dollars for a course (or courses) just to have a computer do their work for them.

Furthermore, it seems the system has its own kinks anyway. Because ChatGPT compiles information found across the internet, it cannot tell whether what it produces is factual. According to The New York Times, “ChatGPT [can] write an essay about a classic book […] But if others published a faulty analysis of the book on the web, that may also show up in your essay,” which means it could spread misinformation if that work were published again.

So, why run the risk? ChatGPT itself even attempts to steer students away from making this poor decision. As reported in Fortune, when a user asks if it’s okay to use the system to write academic papers, ChatGPT responds with the message: “As a general rule, it is not appropriate to use ChatGPT or any other automated writing tool for school papers, as it is considered cheating and does not benefit the student in the long run.”

If used ethically, I think AI could prove to be a valuable contribution to academia. You can use it to spell-check your own work, ask questions when studying difficult topics, create an exam outline or simply treat it as an example of how to set up an essay.

If it is abused, however (i.e., copying and pasting work that is not your own into Blackboard), that is where I feel universities and professors should draw the line.

About The Author

-- Senior | Executive Editor | English Creative Writing & Digital Journalism --

Brooke is a senior English Creative Writing and Digital Journalism major, with minors in Film, Television & Media and Editing & Publishing. She plans to pursue a career in screenwriting after graduation.
