ChatGPT, AI and Student Learning

AI graphic
Image by Mike Mackenzie Marketing

By Jade Pearce

Artificial Intelligence, or AI, has gained momentum in the news lately. Depending on whom you ask, AI tools will either save humanity or destroy it, while many others are trying to navigate a middle ground. In November 2022, AI technology reached a new frontier when the company OpenAI launched ChatGPT. What sets this AI apart from others (such as Siri, Alexa or a Google search) is the sophistication of its content and its interface with the end user. In its own words, ChatGPT is a free “state-of-the-art natural language processing model”, which is “specifically designed for conversational language and is capable of generating responses to input in a way that mimic human conversation.”

While this description may seem vague at first, the tool’s capabilities are staggering. Based on a user request, ChatGPT can produce a myriad of products, from an eloquent poem to a comprehensive and thoughtful lesson plan at any grade level. Trained on a massive dataset of text, including books, articles, and websites, it is able to synthesize data related to the request and transform its answer into consolidated, user-friendly “complete paragraphs.” Having amassed over a million users in the first five days after its launch, ChatGPT has become a powerhouse of sorts, able to provide its end user highly personal answers rather than requiring them to sift through various links within a search engine. With the click of a button, ChatGPT seemingly gives you the answer you need in seconds.

The Benefits

Proponents of this powerful chatbot point to its versatility; ChatGPT can be leveraged in multiple ways and across various fields. Academia is no exception, and faculty nationwide are debating the pros and cons of this tool. Those in favor point to ChatGPT’s extensive capabilities, which range from developing bibliographies to modeling written arguments, making it an invaluable tool within the classroom. When properly utilized, ChatGPT could become a “new baseline for student essays” and “set a new bar for creativity.” Used as a supplementary teaching aid, it gives students the opportunity to understand ChatGPT’s strengths and weaknesses as well as further develop their skills as writers and critical thinkers. Using the tool in this way can not only build writing skills but can also aid student learning where writing resources are unevenly distributed or scarce. Taking it one step further, this type of practice can prepare students for life outside of college, where employers may expect them to have a broad range of AI-related skills.

Further, and possibly most appealing, is the amount of time saved by a tool such as ChatGPT. Students with multiple obligations (such as non-traditional students) may be able to reclaim minutes or hours of their day, easing the struggle of balancing a heavy workload. For students with disabilities, a tool such as ChatGPT could be a powerful aid for understanding the assignment at hand, or for developing writing skills where they previously were unable to do so. Additionally, resources available for these non-traditional students vary nationwide, and scarce resources often create an environment of inequity that can put unnecessary strain on the student. ChatGPT could help level the playing field and thus create a more equitable environment for students within a university setting.

The Challenges

Despite these “ready-made promises” of ChatGPT, it has quickly raised ethical questions and concerns. Dr. Todd Faubion, Department of Global Health Undergraduate Program director and lecturer, initially didn’t fully grasp the power or sophistication of ChatGPT until he read a New York Times article about its pervasive use. After discovering a student’s suspicious answers and confirming that cheating had occurred, it became clear to him that something had to be implemented to regulate ChatGPT and other AI tools within the school. Some view ChatGPT as a plague whose full force educators nationwide have yet to feel.

“ChatGPT and AI software generally make it increasingly challenging to identify and reward students who truly and thoughtfully engage with course content,” replied Faubion via email when asked about the subject. He suggests this tool does a disservice in ways both practical and conceptual: practically, it inadvertently adds to the already heavy workload of many faculty members, potentially leading to increased headaches, stress and burnout; conceptually, it raises existential questions. Should faculty concern themselves with individuals who prefer using this technology versus those who submit independently authored, critically engaged work? How does assessment need to change given this new reality, which isn’t going away? What is a University of Washington student who submits work authored by an AI software hoping to gain from their college education? How does one encourage information literacy in the context of AI software?

Faubion points to other concerns surrounding writing and critical thinking: “AI software inherently devalues writing because a machine is doing it, absolving people of the need to think critically and learn to communicate effectively.” This issue not only affects the quality of student writing but can also undermine faculty-student trust over whether a student is producing original work. Taken a step further, it can even shake confidence in the quality of higher learning in general. What does it say about the rigor and prestige of a school if unregulated use of AI tools is permitted? University tuition is increasing every year; is a student getting the quality education they are paying for if they rely on AI? According to Faubion, trust is essential for collective growth, vulnerability and compassion, traits essential for human development. Without that foundational trust, instructors may become suspicious of the authenticity and originality of a student’s work instead of focusing on effectively teaching their course.

What’s more, even the developers of ChatGPT readily acknowledge its limitations, which are clearly stated on its website. The program “may occasionally generate incorrect information, may occasionally produce harmful instructions or biased content and has limited knowledge of the world and events after 2021”. These limitations mean it “can generate responses that are inaccurate, offensive, or nonsensical.” Heavy reliance on this tool therefore quickly becomes problematic. AI tools are just that: tools. They can only produce what is given to them; they cannot develop an original idea, nor do they know all of the answers. Finally, not only can the information be inaccurate or unreliable at times, but there is still information it cannot access, such as firewalled or subscription-only services. These tend to be the most rigorous and robust sources, and they are often favored or required citations in academic papers and exams.

Ultimately, the frequent use of AI tools is creating a messy space to navigate and leaving many members of the academic institution skeptical at best and nervous at worst.

What Can Faculty Do Now?

While this is uncharted territory, leaders within the school are already developing ways to address ChatGPT.

Liz Kirk, the SPH Associate Dean for Education, has developed a resource document explaining what ChatGPT is. It includes UW-specific resources related to the tool, suggested syllabus language for addressing it, and a possible announcement that instructors can share with their students during the quarter and moving forward.

Faubion has already held conversations with the Global Health 101 teaching team, where they developed a process for discouraging the use of this software and identified tools for detecting its use in the classroom.

Additionally, the Curriculum and Education Policy Committee (CEPC) is forming a ChatGPT “workgroup” and is in the process of recruiting its members. The group will focus on understanding how students are using ChatGPT, testing tools that detect whether ChatGPT or other AI tools were used in a student’s work, and developing school-wide policies on AI use. If you are interested in joining the workgroup, please contact Jenn Slyker at jslyker@uw.edu.

Below are other resources faculty can access to navigate the use of ChatGPT and other AI tools.

https://libraryhelp.sfcc.edu/Chat-GPT/faq

https://libraryhelp.sfcc.edu/Chat-GPT/detectors

Conclusion

Despite the heated discussions on and off campus, ChatGPT and other AI tools are not going away. It is therefore necessary to start these conversations as soon as possible, focusing on intentional use of AI and how best to regulate it within an academic setting. In ChatGPT’s own words, “it is important to use ChatGPT responsibly and to carefully review its output to ensure that it is appropriate for the intended audience.” According to Faubion, these conversations must be incorporated into the curriculum while simultaneously creating clear parameters for use and policies for violations. Ultimately, the university is a bastion for learning and for developing the skills essential to critical thinking, decision making and insight. AI tools such as ChatGPT can help students along this path, but without conversations on how to use them responsibly, UW’s mission could be severely compromised.