Anissa Graham, Senior Lecturer in English, University of North Alabama
With its wide release in November 2022, ChatGPT has dramatically altered the way courses are presented and student work is assessed. If generative AI is to become a helpful tool for writing instruction, faculty, staff, administrators, and students will need to exercise restraint, curiosity, and patience as they adopt it. Here are some thoughts so far.
Transparency is vital for the ethical use of the technology. Transparency here applies not only to how the technology is used but also to what we mean when we say we are using AI to help us write. Students and faculty need to spend time talking about what the terms generative artificial intelligence (Gen AI) and large language model (LLM) mean; this search for shared meaning is productive on at least two levels. First, it opens a conversation about source ethos. Many sources define these terms in similar ways, but tech companies have different goals and audiences in mind when they discuss them than academics might. Students and faculty need to consider the motivations behind any definition in addition to how well it clarifies the concept. Second, it surfaces our emotional responses to the acronym “AI.” AI frequently evokes images of either the helpful, friendly Alexa and Siri or the malicious, corrupt HAL 9000 and GLaDOS. Because students think they already know what AI is and can be, they may be tempted to accept or reject such tools without evaluating how the technology matches, or fails to match, their preconceived notions. Encouraging critical inquiry into the use of AI in the classroom helps students carry that skill into their lives outside it.
Any use of generative AI should be intentional. While we all have specific goals and lessons in mind when we assign student writing, those goals are not always communicated to the LLM when students enter a prompt. In “Guide, Don’t Hide: Maximizing Course Assignments with ChatGPT Integration,” Lauren E. Burrow notes that when given specific instructions on how to use AI to complete their work, students do not use the tool to cheat but instead “[are] conservative and conscientious in their use of the AI tool”; in Burrow’s case, students used it to craft a fairy tale as part of a larger project. Erin Margarella and Rebecca Stobaugh offer practical advice in “The Potential of AI and ChatGPT: Empowering Learning and Communication in the Digital Age” for using AI in writing assignments like the one Burrow’s students completed. Their five steps for successful use of ChatGPT should sound familiar: (1) “Be clear and specific” when prompting the AI for text and feedback, (2) “Use context” to help establish the purpose of the text to be created and to offer background for the conversation to come, (3) “Provide examples” of what the final text should look like, (4) “Test and evaluate” by reviewing the produced text for accuracy and interest, and (5) “Comply with ethical guidelines.” These are the same strategies we encourage students to employ when taking their work to writing centers or tutors for assistance (1, 2, and 4) and when we provide instruction on genre and modality for a writing project (3 and 5). When used with a clear and focused purpose, ChatGPT can aid students in creating text; however, a living reader is essential for that text to have meaning.
Go slow to go fast. Just as the last-minute paper written hours or even minutes before a deadline can suffer from a lack of depth and from predictable prose, so too can a text generated by an LLM with minimal interaction from the student. If we want students to use AI in productive ways that focus on their growth not only as writers but as thinkers, faculty need to start small and be willing to give students time to build the critical digital literacy skills to distinguish flat writing from meaningful writing. Much of the commentary on generative AI mentions the speed at which a text can be produced, but just like text created by a person, interesting, thoughtful text created with the help of AI needs time to be reviewed. AI can help writers at all levels, accomplished and struggling, native and non-native speakers alike, craft clear and concise prose, but we need to emphasize for students that their lived experiences and perceptions are what add interest and value to their writing.
These observations are not the only takeaways from the discussions I have read and heard; instead, they are the ones that seem to echo from source to source. For those interested in diving further into the research, Anna Mills is curating resources for teaching at “AI Text Generators and Teaching Writing: Starting Points for Inquiry”; the site was last updated in November 2023. “Statement on Artificial Intelligence Writing Tools in Writing Across the Curriculum Settings: A Statement from AWAC Executive Committee” and “MLA-CCCC Joint Task Force on Writing and AI Working Paper: Overview of the Issues, Statement of Principles, and Recommendations” both offer starting points for conversations with colleagues, administrators, and students.
Works Cited
Burrow, Lauren E. “Guide, Don’t Hide: Maximizing Course Assignments with ChatGPT Integration.” Faculty Focus | Higher Ed Teaching & Learning, 9 Nov. 2023, www.facultyfocus.com/articles/teaching-with-technology-articles/guide-dont-hide-maximizing-course-assignments-with-chatgpt-integration/.
Margarella, Erin, and Rebecca Stobaugh. “The Potential of AI and ChatGPT: Empowering Learning and Communication in the Digital Age.” Faculty Focus | Higher Ed Teaching & Learning, 26 Sept. 2023, www.facultyfocus.com/articles/teaching-with-technology-articles/the-potential-of-ai-and-chatgpt-empowering-learning-and-communication-in-the-digital-age/.