Focused Inquiry Faculty Embrace ChatGPT, Incorporate It Into Curriculum
The emergence of a new tool can bring anxiety about the changes it will usher in, but that has not been the case in University College's Department of Focused Inquiry.
Focused Inquiry faculty have already started adapting to and integrating ChatGPT, an artificial intelligence chatbot, along with other artificial intelligence technologies, into their classrooms in an effort to utilize these tools and encourage their use for writing and editing.
A handful of faculty who teach UNIV 111, 112 and 200, where writing, critical thinking and analysis are core pillars of the curriculum, have introduced ChatGPT to students as a means to add to the classroom experience.
Below, Focused Inquiry faculty members discuss how they are choosing to utilize the new technology in their courses.
What made you want to integrate ChatGPT into your courses, rather than try to prohibit its use?
Chris Martiniano: It’s not that simple. Of course, we all “prohibit” its use for completing assignments. That’s academic dishonesty. But like it or not, this is our new reality, and it would be a disservice to our students to ignore it. The foremost fear for educators is that large language models are capable of writing a (good) essay. That’s simply not true, and after working with the chatbot, my students realize that now. Yet this technology is already being heavily integrated into the professional world to automate certain tasks like emails and reports. It seems important to make students comfortable with the technology, its limitations and its opportunities, to advance their professional success.
Ryan Cales: We should embrace it rather than ignore or reject it. The biggest concern for me is figuring out how to embrace it. I still think everyone is trying to win the race of how to use this new shiny thing and then market whatever comes of it as superior, and we just aren’t there. What we do know is that this technology isn’t going away, and I think we have an opportunity, while it’s free, to figure out how we can utilize it in the classroom. This is how we treat other technologies: for instance, we help students understand what “fair use” means and the kinds of media they can appropriately incorporate into their work without violating copyright. I don’t really see the difference in how we should approach ChatGPT.
Michael Abelson: As I told my students, ChatGPT and other generative AI programs will likely prove to be more than just flashes in the pan. I don’t think it is hyperbole to say that these technologies will transform the worlds of work and communications in fundamental ways that we can’t even begin to imagine yet. If we view these programs as tools, we need to learn what they can do and what they can’t. This is a very challenging proposition, as they are evolving rapidly; we might as well get started getting to know one another. Mastery of ChatGPT will be a skill like others we cultivate, and it is closely in line with other skills we prioritize in Focused Inquiry, such as asking good questions, critical analysis of the information we encounter, creative approaches to problem-solving, effective communication and an ability to ascertain the ethical dimensions of complex problems.
How are you using ChatGPT in your classes?
CM: In my class, we integrated ChatGPT into our multimodal project, creating a time capsule. After students decided on the purpose and contents of their capsule, they asked ChatGPT what it would include in a time capsule if it were their age and gender identity, living in their region, along with any other demographic data they wanted to add about themselves. They then wrote a reflection, accompanying their video presentation of the time capsule, that assessed ChatGPT’s response. They thought critically about how its suggestions were overly generalized, how it sometimes pointed to objects they hadn’t considered (that were often irrelevant), and the importance of specificity in writing the prompt.
In my sections of UNIV 111, we use ChatGPT almost every class to generate outputs on themes and issues that our readings provoke. In particular, their synthesis essay analyzes, assesses and synthesizes the “Turing Test” using Turing’s original, landmark “Computing Machinery and Intelligence” (1950), J. McCarthy’s “A Proposal for the Dartmouth Summer Research Project on Artificial Intelligence” (1955), James H. Moor’s 1976 essay “An Analysis of the Turing Test” and their choice of one of five current (2021-23) critiques of the Test. While ChatGPT is “supposedly” good at summary, it’s actually not, according to my students: it’s often overly general, sometimes factually incorrect and incapable of analysis. We also use it to see how generative AI platforms essentially follow the original schema laid out in Turing’s essay, looking at the OpenAI, Google LaMDA and IBM API charters. Lastly, students will attempt their own Turing Test with ChatGPT to see if its “imitation” deceives them.
RC: I’ve asked students to use it “in some way” in their research and writing processes for their upcoming essay assignment, which is intentionally vague. I’ve given them examples of how they might use it (help with topic generation, refining research questions and developing theses, to name a few), though I stress that (1) this is an experiment, (2) I don’t have the answers or a real agenda for its use, and (3) they will likely come up with ways to use the technology individually that are better for them than anything I could think of.
MA: I have decided to embrace ChatGPT in an experimental way. Students are required to use ChatGPT for their normal weekly reading response assignments and for larger essays, but they must indicate, in colored text, material drawn directly from ChatGPT. They must also include a second, reflective section in all assignments where they assess the quality of the material drawn from ChatGPT. In this second section, they should indicate areas where ChatGPT fell short and why, as well as areas where it performed well.
What has the reaction from students been like when you’ve introduced ChatGPT as a tool within the course?
CM: I was surprised by how few students were familiar with ChatGPT, DALL-E, Google Bard, Midjourney or any other generative AI, given all the press they’ve received. Those who were familiar knew it well and knew how it was being used to cheat on TopHat, exams, papers and the like; my art students who were familiar were upset about the threat to their craft. Since we’ve been working with it, the two primary reactions have been frustration with the tools not generating satisfactory outputs, but also a general sense of hope that, as OpenAI claims, this technology will benefit humanity, despite the many ethical concerns it brings with it.
RC: Like Chris, I was also surprised how few students were familiar with the AI available. Also like Chris, my art students have been the most vocal when discussing generative AI because of the potential ramifications within the art world. Specifically for our classes, students range from being interested and excited about using it to not really caring much at all. Some have voiced concern about what they can and can’t do with it for class, which seems to stem from it being seen by many (at least how it’s being talked about in education) as a tool to cheat. Throughout everything I’ve stressed that this is ultimately an experiment, and that we’re just figuring out how it might be used for good, which they seem to understand and accept.
MA: Students were initially quite excited, and maybe a bit surprised, to so openly engage with something that had been widely viewed as a tool for cheating. Thus far, everyone has enjoyed having the opportunity to experiment with ChatGPT to see what it can and can’t do. Interestingly, many students seem to be concluding that ChatGPT might be a good tool for brainstorming ideas and possibly for drafting, but that at present the quality of its output is not sufficient for complete or finalized work.
Is there anything else you would like to add?
CM: In all my presentations, workshops, symposia and conversations with fellow faculty members and the university community at large since January, the one thing I stress is that the entire goal of generative AI is to “imitate” human language and conversation, not to generate legitimate content or knowledge. ChatGPT’s linguistic, syntactical and grammatical idiosyncrasies are easy enough to detect once you are familiar with the technology, so much so that I rarely need to run student work through an AI detector like GPTZero. Most importantly, ChatGPT’s engine will most likely be incorporated into Office products in the near future, as it already is with Bing. Be ready.
RC: I share Chris’s “be ready” sentiment. Honestly, I’m more worried about AI detection and policing technology (GPTZero, for instance) than about students using ChatGPT to cheat. Using such technology to vet student essays for authenticity has the potential to open up a new level of student profiling and discrimination, which would clearly be harmful.
MA: I was in a session the other day sponsored by VCU Libraries, where Oscar Keyes, who works for the VCU Libraries system, suggested that we might start thinking of ChatGPT as an assistive technology that can help people do more with the tools and abilities they have available to them. I really like this way of thinking because it allows us to move past reductive conceptions of this technology that frame it as cheating or as a diminishment of what we do as humans. Rather, it might prove to be a powerful leveling tool, so long as all people have access to it and know how to use it well. Ensuring equitable access will be one of the biggest challenges as we move into the next chapter of the digital divide. In terms of education, I look forward to having institutional access to these programs so that students, faculty and staff can explore and utilize these tools in new ways.
This interview has been edited for length and clarity.